To assess the performance of a new method, the diagnostic sensitivity (DSn) and specificity (DSp) should be determined in reference samples with a known history and infection status [8, 9, 22]. However, such samples may not reflect the actual population for which the method is intended, and the accuracy of the test may therefore vary. In the present study, the DSn and DSp of the ELISA applied to samples obtained from the target population differed from the previously established values. To cope with the problem of an imperfect gold standard, Bayesian estimation was applied, as recommended by the OIE. Bayesian estimation uses a latent class model that neither assumes the previously used method to be perfect (i.e. a gold standard) nor requires the sensitivity and specificity of the reference test to be known.
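The latent class approach can be sketched as a small Gibbs sampler for two tests applied to one population. The counts, starting values and flat Beta(1,1) priors below are illustrative only, not those of the study; in practice the one-population, two-test model is not fully identifiable, so informative priors or data from several populations (the Hui-Walter design) are used.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical cross-classified counts for two tests on the same animals:
# cells ordered [T1+T2+, T1+T2-, T1-T2+, T1-T2-]
n = np.array([48, 12, 9, 131])
N = n.sum()
t1 = np.array([1, 1, 0, 0])  # test 1 result in each cell
t2 = np.array([1, 0, 1, 0])  # test 2 result in each cell

# starting values for prevalence and the two tests' Se/Sp
prev, se1, sp1, se2, sp2 = 0.3, 0.8, 0.9, 0.8, 0.9
draws = []
for it in range(6000):
    # P(truly infected | test pattern) for each of the four cells
    pa = prev * se1**t1 * (1 - se1)**(1 - t1) * se2**t2 * (1 - se2)**(1 - t2)
    pb = (1 - prev) * (1 - sp1)**t1 * sp1**(1 - t1) * (1 - sp2)**t2 * sp2**(1 - t2)
    p = pa / (pa + pb)
    y = rng.binomial(n, p)          # latent infected count per cell
    yi = y.sum()
    # conjugate updates under flat Beta(1,1) priors
    prev = rng.beta(1 + yi, 1 + N - yi)
    se1 = rng.beta(1 + (y * t1).sum(), 1 + (y * (1 - t1)).sum())
    sp1 = rng.beta(1 + ((n - y) * (1 - t1)).sum(), 1 + ((n - y) * t1).sum())
    se2 = rng.beta(1 + (y * t2).sum(), 1 + (y * (1 - t2)).sum())
    sp2 = rng.beta(1 + ((n - y) * (1 - t2)).sum(), 1 + ((n - y) * t2).sum())
    if it >= 1000:                  # discard burn-in
        draws.append((prev, se1, sp1, se2, sp2))

post = np.array(draws).mean(axis=0)
print("posterior means (prev, Se1, Sp1, Se2, Sp2):", np.round(post, 2))
```

Because no test is treated as a gold standard, the posterior for each test's Se and Sp is informed jointly by the agreement pattern between the tests.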
To further illustrate the importance of evaluating the method in the target population, with its "natural" distribution of positive and negative animals, reference samples from two populations with known disease status were analysed by both methods and compared in a 2 × 2 table with the IFAT as gold standard. The reference populations comprised animals confirmed to be infected with L. intracellularis by PCR on faecal samples, and animals from a high health herd in which L. intracellularis had not been demonstrated. This comparison yielded 100% sensitivity and specificity for both methods (data not shown).
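The 2 × 2 comparison reduces to the standard DSn/DSp calculations; the counts below are hypothetical, but with perfect agreement, as in the reference panels described above, both measures come out at 100%.

```python
# 2x2 table of a candidate test read against a gold standard.
# Counts are hypothetical illustrations, not data from the study.
tp, fn = 25, 0   # gold-standard positives: test positive / test negative
fp, tn = 0, 20   # gold-standard negatives: test positive / test negative

dsn = tp / (tp + fn) * 100   # diagnostic sensitivity
dsp = tn / (tn + fp) * 100   # diagnostic specificity
print(f"DSn = {dsn:.0f}%, DSp = {dsp:.0f}%")  # DSn = 100%, DSp = 100%
```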
Several cross-sectional studies on the seroprevalence of L. intracellularis in Europe and Asia have been conducted using the blocking ELISA, but very few studies have attempted to validate the method. Keller et al. (2006) reported good reproducibility between different laboratories. Further, in comparison with the IFAT, the ELISA provided a higher sensitivity and more unambiguous results. By comparison with an in-house IFA, the DSn and DSp were estimated at 92 and 98%, respectively. However, these results have only been presented in conference proceedings. In previous validations of the IFAT, the DSn varied from 58 to 90% and the DSp from 92 to 100% [5, 21, 24].
The results partly reflect the difficulty of clinically assessing the stage of infection. Based on previous results, >40% of pigs aged 9 to 15 weeks with clinical signs indicative of proliferative enteropathy were expected to shed the microbe in faeces [25, 26]. Experimentally, pigs developed diarrhoea 7-14 days post inoculation in a dose-dependent manner, and circulating L. intracellularis-specific antibodies were first detected 2 weeks after challenge. The purpose of study A was to determine the primary cause of diarrhoea in growing pigs, and pigs were therefore sampled immediately at the onset of clinical signs. Hence, only a few pigs were expected to have seroconverted. However, most pigs turned out to be seropositive, and slowly progressing infections will probably remain subclinical for some time before being detected. Study B targeted growing pigs with diarrhoea irrespective of the onset of clinical signs, and most pigs were therefore expected to have seroconverted. Furthermore, some animals will not seroconvert in response to infection [3, 26].
Several other methods are reported to perform well, with high DSn and DSp [8, 9, 22]. However, these methods use various gold standards and target populations, and the results given in previous papers can therefore not be used for comparison. Further, they are presently not commercially available. Hence, sera must be submitted to these particular laboratories, and factors such as geographical variation and local variations in infectious load may not be accounted for. Only the two methods employed in the present study are commercially available and may be adapted to various laboratories. The IFAT has been widely used, but the method is laborious and time-consuming, and discrepancies in the interpretation of results between laboratories have been reported.
On the other hand, several studies report a high prevalence (90-100%) of seropositive finisher pigs close to slaughter (25-27 weeks of age), even though shedding of L. intracellularis and chronic intestinal lesions are rarely reported [9, 21]. This may be caused by booster infections, but since shedding in growing pigs may occur at short intervals [21, 26], and circulating antibodies may be detectable for only 5 weeks [3, 21], the consistently high antibody levels found in older pigs remain to be explained. The blocking ELISA is based on monoclonal antibodies for capture and blocking to ensure a defined analytical sensitivity and specificity and, presumably, an increased diagnostic specificity. However, if antibodies are formed towards other antigens with similar epitopes, non-specific reactions or cross-reactions may occur. This was, however, not investigated in the present study.
In conclusion, the diagnostic sensitivity of the blocking ELISA was 72% and the diagnostic specificity was 93%, as evaluated by Bayesian statistical techniques. This approach allows a diagnostic method to be validated even in the absence of a true gold standard. When the analysis was applied to samples from growing pigs aged 8-12 weeks, the positive predictive value was 0.82 and the negative predictive value was 0.89. The sensitivity and specificity demonstrated in the present study differed from those previously reported, emphasising the importance of validating new methods with respect to the target population.
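The dependence of predictive values on prevalence can be illustrated with the standard Bayes' rule formulas. The 30% prevalence used below is an assumed illustrative value, not a figure taken from the study.

```python
# Predictive values from sensitivity, specificity and prevalence.
# Se and Sp are the estimates reported above; prevalence is illustrative.
se, sp, prev = 0.72, 0.93, 0.30

ppv = se * prev / (se * prev + (1 - sp) * (1 - prev))
npv = sp * (1 - prev) / (sp * (1 - prev) + (1 - se) * prev)
print(f"PPV = {ppv:.2f}, NPV = {npv:.2f}")  # PPV = 0.82, NPV = 0.89
```

At lower prevalence the PPV of the same test falls while the NPV rises, which is why predictive values are only meaningful for a stated target population.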