In 2020, the European Union published Regulation (EU) 2020/741, establishing minimum requirements for water reuse in agriculture. The regulation differentiates between several water quality classes. For the highest water quality class (Class A), it mandates analytical validation of the treatment performance of new water reuse treatment plants (WRTPs) with respect to the removal of microbial indicators for viral, bacterial, and parasitic pathogens. While the regulation clearly defines the numeric targets for the required log10 reduction values (LRVs), it provides little to no guidance on the necessary sample sizes and statistical evaluation approaches. The main requirement is that at least 90 % of the validation samples must meet the LRV targets. However, the interpretation of this 90 % validation target strongly affects the required sample size, the associated monitoring effort, and the risk of misclassifying WRTPs in practice. The present study compares different statistical evaluation approaches that might be considered applicable for LRV validation monitoring. Special emphasis is placed on tolerance intervals, which combine percentile estimation with sample size-dependent uncertainty and confidence bounds. Tolerance interval-based approaches are compared with alternative methods, including (a) a binomial evaluation and (b) the calculation of empirical percentiles. The latter are already used in existing European and U.S. regulations for bathing water and irrigation water quality.
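To make the three evaluation approaches mentioned above concrete, the following Python sketch contrasts them on a single hypothetical validation data set. The data, the target LRV of 6.0 log10, the 90 % coverage level, and the 95 % confidence level are illustrative assumptions, not values taken from Regulation (EU) 2020/741 or from the study itself; the tolerance bound assumes approximately normally distributed LRVs.

```python
import numpy as np
from scipy import stats

def lower_tolerance_bound(x, coverage=0.90, confidence=0.95):
    """One-sided lower normal tolerance bound: with the stated confidence,
    at least `coverage` of the LRV population lies above the returned value.
    Sketch only; assumes approximately normal LRV data."""
    n = len(x)
    z_p = stats.norm.ppf(coverage)
    # Exact one-sided tolerance factor via the noncentral t distribution.
    k = stats.nct.ppf(confidence, df=n - 1, nc=z_p * np.sqrt(n)) / np.sqrt(n)
    return np.mean(x) - k * np.std(x, ddof=1)

def binomial_pass(x, target, prop=0.90):
    """Simple reading of the 90 % requirement: pass if at least `prop`
    of the individual validation samples meet the LRV target."""
    return np.mean(np.asarray(x) >= target) >= prop

def empirical_percentile_pass(x, target, prop=0.90):
    """Pass if the empirical (1 - prop) percentile of the LRVs is at or
    above the target, i.e. 90 % of observed values meet it."""
    return np.percentile(x, 100 * (1 - prop)) >= target

# Hypothetical validation campaign: 24 measured log10 reductions.
rng = np.random.default_rng(1)
lrv = rng.normal(loc=6.3, scale=0.4, size=24)
target_lrv = 6.0  # illustrative target, not a value from the regulation

print("tolerance-bound pass:     ", lower_tolerance_bound(lrv) >= target_lrv)
print("binomial (>=90 %) pass:   ", binomial_pass(lrv, target_lrv))
print("empirical percentile pass:", empirical_percentile_pass(lrv, target_lrv))
```

The key practical difference is that the tolerance-bound criterion becomes stricter as the sample size shrinks (the factor k grows), whereas the binomial and empirical-percentile criteria ignore sampling uncertainty and can therefore pass a plant on very few samples.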