PUBMED FOR HANDHELDS

  • Title: Reliability of an expert rating procedure for retrospective assessment of occupational exposures in community-based case-control studies.
    Author: Siemiatycki J, Fritschi L, Nadon L, Gérin M.
    Journal: Am J Ind Med; 1997 Mar; 31(3):280-6. PubMed ID: 9055950.
    Abstract:
    The most daunting problem in community-based studies of occupational cancer is retrospective exposure assessment. To avoid the error involved in using job title or self-reported exposure as the exposure variable, our team developed an approach based on expert judgment applied to job descriptions obtained by interviewers. A population-based case-control study of cancer and occupation was carried out in Montreal between 1979 and 1986; the job histories of over 4,000 subjects were evaluated, by consensus, by a team of chemists/hygienists for evidence of exposure to a list of 294 workplace chemicals. To evaluate the reliability of this exposure assessment procedure, four years after the rating was completed we selected 50 job histories at random and had two members of the expert team carry out the same type of coding, blind to the original ratings for these jobs. For 25 job histories, comprising 94 distinct jobs, the pair worked as a consensus panel; for the other 25, comprising 92 distinct jobs, they worked independently. Statistical comparisons were made between the new ratings and the old. Among those rated by consensus, the marginal distribution of exposure prevalence was almost identical between old and new, and the weighted kappa for agreement was 0.80. Among items for which both ratings agreed that there had been exposure, there was good agreement on the frequency, concentration, and route of contact. When the two raters worked independently, the levels of agreement between them, and between each of them and the original rating, were good (kappas around 0.70), though not as high as when they worked together. It is concluded that high levels of reliability are attainable for retrospective exposure assessment by experts.
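The agreement statistic reported in the abstract is a weighted kappa. The abstract does not state which weighting scheme the authors used; as an illustration only, the sketch below computes a linearly weighted Cohen's kappa for two raters scoring the same items on an ordinal scale. The function name, category labels, and linear weights are assumptions for the example, not details from the study.

```python
from collections import Counter

def weighted_kappa(ratings1, ratings2, categories):
    """Linearly weighted Cohen's kappa for two raters' ordinal ratings.

    ratings1, ratings2: equal-length lists of labels drawn from `categories`.
    categories: the ordinal scale, listed from lowest to highest.
    """
    k = len(categories)
    idx = {c: i for i, c in enumerate(categories)}
    n = len(ratings1)

    # Observed joint counts: obs[i][j] = items rater 1 put in category i
    # and rater 2 put in category j.
    obs = [[0.0] * k for _ in range(k)]
    for a, b in zip(ratings1, ratings2):
        obs[idx[a]][idx[b]] += 1

    # Marginal counts, used for the chance-expected counts m1[i]*m2[j]/n.
    m1 = Counter(idx[a] for a in ratings1)
    m2 = Counter(idx[b] for b in ratings2)

    # Linear disagreement weights: 0 on the diagonal, 1 at the extremes.
    w = [[abs(i - j) / (k - 1) for j in range(k)] for i in range(k)]

    observed = sum(w[i][j] * obs[i][j] for i in range(k) for j in range(k))
    expected = sum(w[i][j] * m1[i] * m2[j] / n
                   for i in range(k) for j in range(k))
    return 1.0 - observed / expected
```

Perfect agreement yields 1.0, chance-level agreement yields 0.0, and near-misses on an ordinal scale (e.g. "low" vs. "medium" exposure) are penalized less than extreme disagreements, which is why a weighted kappa suits ordinal exposure ratings like frequency or concentration.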