  • Title: Agreement of experts and non-experts in a desktop exercise evaluating exposure to asthmagens in the cotton and textile, and other industries.
    Author: Robinson C, Money A, Agius R, de Vocht F.
    Journal: Ann Occup Hyg; 2015 Mar; 59(2):200-9. PubMed ID: 25324562.
    Abstract:
    In the absence of personal exposure measurements, expert assessment, generally on a case-by-case basis, is often used to estimate exposures. However, the decision processes of individual experts when making assessments are unknown, making it difficult to assess the quality of these assessments or to compare different assessments to each other. We conducted a study, primarily in the textile and cotton industries but also in the baking, metal work, and agriculture industries, in which we assessed agreement between experts rating intensity and probability of exposure in the absence of exposure measurements, and compared their performance with levels of agreement reported in the literature for non-desktop-based exercises. In addition, agreement was compared with that of non-experts undertaking the same exercise, and results were further stratified to assess the impact of factors expected to affect the assessments. Intraclass correlation coefficients of absolute agreement (ICC1) and consistency (ICC3) between raters were calculated. Sensitivity and specificity were estimated using a probabilistic simulation methodology developed previously. Fourteen occupational hygienists and exposure assessors with complete data for all 48 job descriptions and 8 non-experts participated. Confidence intervals for differences between correlation coefficients are not reported; the individual intervals were so broad that no statistically significant comparisons could be made. Nevertheless, preliminary observations suggested by the computed means are presented here. Absolute agreement between expert raters was fair to good, but was somewhat better for intensity (ICC1 = 0.61) than for probability (ICC1 = 0.44) of exposure, and was better for experts than for non-experts. Estimated sensitivity and specificity were 0.95 and 0.82 for intensity, and 0.91 and 0.78 for probability of exposure, respectively. Stratification by factors hypothesized to affect agreement did not show statistically significant differences, but consistent patterns in the point estimates indicated that agreement between raters (both experts and non-experts) dropped for medium levels of information compared with little or extensive information. Inclusion of a photo or video generally improved agreement between experts but not between non-experts, whereas the year of the job description had no influence on the assessments. These data indicate that the desktop exposure assessment exercise produced agreement of similar quality to previously reported exercises. Agreement between experts' assessments was independent of the time period of the job and could be improved by the inclusion of visual material. Agreement between experts, as well as between non-experts, did not increase with the level of detail of the job information provided.
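
The abstract reports intraclass correlation coefficients for absolute agreement (ICC1, one-way random effects) and consistency (ICC3, two-way mixed effects) but, being an abstract, does not show how they are computed. The following minimal Python sketch implements the standard Shrout and Fleiss (1979) formulas for an n-targets-by-k-raters rating matrix; the simulated data, noise level, and function name are illustrative assumptions, not the study's data or code. The matrix shape (48 job descriptions by 14 expert raters) simply mirrors the study design described above.

    import numpy as np

    def icc1_icc3(ratings):
        """ICC(1) (one-way random, absolute agreement) and ICC(3,1)
        (two-way mixed, consistency) for an n x k ratings matrix
        (n targets, k raters), per Shrout & Fleiss (1979)."""
        x = np.asarray(ratings, dtype=float)
        n, k = x.shape
        grand = x.mean()
        row_means = x.mean(axis=1)   # per-target (job) means
        col_means = x.mean(axis=0)   # per-rater means

        # ANOVA sums of squares
        ss_total = ((x - grand) ** 2).sum()
        ss_rows = k * ((row_means - grand) ** 2).sum()  # between targets
        ss_cols = n * ((col_means - grand) ** 2).sum()  # between raters
        ss_within = ss_total - ss_rows                  # one-way residual
        ss_error = ss_total - ss_rows - ss_cols         # two-way residual

        # Mean squares
        ms_rows = ss_rows / (n - 1)
        ms_within = ss_within / (n * (k - 1))
        ms_error = ss_error / ((n - 1) * (k - 1))

        icc1 = (ms_rows - ms_within) / (ms_rows + (k - 1) * ms_within)
        icc3 = (ms_rows - ms_error) / (ms_rows + (k - 1) * ms_error)
        return icc1, icc3

    # Illustration only: 48 simulated job descriptions rated by 14 raters.
    rng = np.random.default_rng(0)
    true_exposure = rng.normal(size=(48, 1))                   # latent job-level score
    ratings = true_exposure + 0.5 * rng.normal(size=(48, 14))  # rater noise
    icc1, icc3 = icc1_icc3(ratings)
    print(f"ICC1 = {icc1:.2f}, ICC3 = {icc3:.2f}")

The probabilistic simulation the authors used to estimate sensitivity and specificity was developed in earlier work and is not reproduced here.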