When suppressing duplicates, the result count returned is much higher than expected.
RMS only suppresses duplicates within the first set of results returned. The size of this set is configurable but relatively small; by default it is 100. Because only the duplicates in this first set are removed, rather than all duplicates across the entire result set, the reported count is only accurate when the result set is smaller than that value.
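RMS internals are not public, so the following is only a hypothetical sketch of the behavior described above: duplicates are removed from a fixed-size window at the head of the results, while everything beyond the window passes through untouched, which is why the total count can overstate the number of unique results.

```python
def suppress_duplicates(results, suppress_window=100):
    """Remove duplicates from the first `suppress_window` results only.

    Hypothetical illustration; the real RMS logic may differ.
    """
    seen = set()
    deduped_head = []
    for r in results[:suppress_window]:
        if r not in seen:
            seen.add(r)
            deduped_head.append(r)
    # Results beyond the window are passed through untouched,
    # duplicates and all.
    return deduped_head + results[suppress_window:]

# 150 results where every value appears twice -> 75 unique results.
results = [i // 2 for i in range(150)]
deduped = suppress_duplicates(results, suppress_window=100)
print(len(deduped))        # 100: only the first window was deduplicated
print(len(set(results)))   # 75: the true unique count
```

With a window of 100, the 50 duplicate pairs beyond the window survive, so the reported count (100) exceeds the true unique count (75).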
There is no solution to this problem.
You don't need to suppress duplicates beyond how far the user is going to page, and users rarely page past 100 results. If they do, you can increase the value to 200. The larger the value, the greater the system resources used.
If the value is set much above the default, you will begin to notice performance degradation; experiment in your own environment. The impact depends on the number of concurrent users, because IIS must loop through the results and remove duplicates for each search.