Original Contributions | Volume 57, Issue 4, P469-477, October 2019

Quality, Trustworthiness, Readability, and Accuracy of Medical Information Regarding Common Pediatric Emergency Medicine-Related Complaints on the Web

Open Access | Published: September 24, 2019 | DOI: https://doi.org/10.1016/j.jemermed.2019.06.043

      Abstract

      Background

      The Internet is a universal source of information for parents of children with acute complaints.

      Objectives

      We sought to analyze information directed at parents regarding common acute pediatric complaints.

      Methods

Authors searched three search engines for four complaints (child + fever, vomiting, cough, stomach pain), assessing the first 20 results for each query. Readability was evaluated using the Flesch-Kincaid Grade Level, Gunning Fog Index, Simple Measure of Gobbledygook, and Coleman-Liau Index. Two reviewers independently evaluated Journal of the American Medical Association (JAMA) Benchmark Criteria and National Library of Medicine (NLM) Trustworthy scores. Two physicians (emergency medicine [EM] and pediatric EM) analyzed text accuracy (number of correct facts divided by total number of facts). Disagreements were settled by a third physician. Accuracy was defined as ≥ 95% correct statements, readability as an 8th-grade reading level or lower, high quality as meeting at least three JAMA criteria, and trustworthiness as an NLM score ≥ 3. Accurate and inaccurate websites were compared using chi-squared analysis and the Mann-Whitney U test.

      Results

Eighty-seven websites (60%) were accurate (kappa [k] = 0.94). Sixty (42%) of 144 evaluable websites were readable, 38 (26%) met high-quality JAMA criteria (k = 0.68), and 44 (31%) met the NLM trustworthiness threshold (k = 0.66). Accurate websites were more frequently published by professional medical organizations (hospitals, academic societies, government agencies) than inaccurate websites (63% vs. 33%, p < 0.01). There was no association between accuracy and physician authorship, search rank, quality, trustworthiness, or readability.

      Conclusion

      Many studied websites had inadequate accuracy, quality, trustworthiness, and readability. Measures should be taken to improve web-based information related to acute pediatric complaints.


      Article Summary

        1. Why is this topic important?

      • The Internet is a widely used source of information for parents of children with acute complaints. Little is known about the adequacy of this web-based information.

        2. What does the study attempt to show?

      • We analyzed websites directed at parents with acutely ill children for accuracy, readability, quality, and trustworthiness of information.

        3. What are the key findings?

      • This study found that only 60% of websites provided accurate information. The majority of websites were not written at a readable level for the average parent (8th-grade level or lower). Most websites did not meet JAMA Benchmark quality criteria or National Library of Medicine trustworthiness thresholds.

        4. How is patient care impacted?

      • Inadequate information has the potential to influence decisions made by parents when their child is acutely ill.

      Introduction

As of 2018, 4 billion people were using the Internet worldwide, there were 1.9 billion websites, and users performed nearly 2 trillion searches per year (Internet Live Stats). In 2016, health-related topics were the second most common category searched on Google (Cocco et al.). With its widespread adoption, the Internet has become an important health information resource for parents. Up to 56% of parents perform online searches before bringing their child to the emergency department (ED), with 68% of queries containing symptoms and 51% containing treatment options (Cocco et al.). Nearly one in eight parents use the Internet to search for health information immediately before visiting an ED (Shroff et al.).
Concerns have arisen about the quality and accuracy of this consumer-oriented information on the Internet (Eysenbach et al.; Fahy et al.). Incorrect or inadequate information has the potential to influence parents into making inappropriate health-related decisions for their children. Ignoring or inappropriately treating potentially serious disorders at home might lead to a delay in care or harm. Alternatively, incorrect information might lead to overutilization of emergency services when no serious illness is present. Because of its potential importance, we chose to study the adequacy of information on the Internet directed at parents of children with acute medical complaints. Our primary objective was to analyze the accuracy of websites directed at consumers regarding pediatric emergency medicine complaints. Secondary objectives were to analyze the readability, quality, and trustworthiness of these websites and their association with website accuracy.

      Methods

We performed Internet searches on the three most popular U.S. search engines (Google, Bing, Yahoo) for four of the top five most common nontrauma complaints in infants and children presenting to EDs (Alexa; Gorelick et al.; McDermott et al.). Search terms were created by two parents/study authors who were board certified in emergency medicine and pediatric emergency medicine. Search terms consisted of the word “child” plus each of the following: “fever,” “vomiting,” “cough,” and “stomach pain.” Before performing searches, the search engines were set to their default settings of English language, safe search mode, and 10 results per page. Websites for the top 20 results listed by each search engine were analyzed for accuracy. Websites were also analyzed for the secondary outcomes of readability, quality, and trustworthiness. Websites were included if they were addressed to parents/caregivers (e.g., the term “your child” is used), addressed to patients (e.g., the term “your cough” is used), or detailed instructions for laypeople on when to seek medical care with a physician or hospital (e.g., the term “when to see a doctor” is used). Websites were excluded from analysis if they were written for doctors or health professionals, were duplicates, isolated videos, or newspaper reports, were restricted by subscriptions or fees, were not written in English, or comprised conference presentations or lectures.
      We categorized websites as professional organizations if they were sponsored by a government, academic or medical specialty society, or hospital/hospital system. Sites not meeting these criteria were categorized as nonprofessional organizations/individuals. Authors were categorized as physicians if the website article was written or reviewed by a physician (MD or DO) or if the article consisted entirely of an interview of a physician. All other authors were categorized as nonphysicians. If no author was listed, the author was categorized as nonphysician.
      Prior to initiating the study, a 2-h training session took place with all study authors that emphasized definitions, uniform website review, and coding of website information. Initially, all reviewers/raters simultaneously analyzed 10 websites with pediatric complaints unrelated to this study. After review of every 48 websites, data abstraction, data entry, and coding rules were re-reviewed with reviewers by the principal investigator. The principal investigator arbitrated all coding questions on an ongoing basis.
We analyzed website domain ages using an online database (Small SEO Tools Domain Age Checker).
      Between December 1 and December 5, 2018, the principal author connected a computer to the Internet and entered each set of search terms into each search engine. Retrieved web pages were saved to a spreadsheet and ordered (ranked) depending on how they initially appeared in each search engine. At that time, domain age, web page age, page categorization, article length, and author status were recorded for each website. At a later date, between December 15, 2018 and January 19, 2019, two physician study authors independently analyzed saved websites for information accuracy, with a third physician arbitrating accuracy disagreements. During the same time period (between December 15, 2018 and January 19, 2019), two study authors independently conducted trustworthy and quality reviews for each saved website. All information was entered directly into a spreadsheet and authors used print versions of quality and trustworthy criteria to calculate scores (Supplementary Tables 1 and 2).

      Accuracy

Two physician authors initially independently assessed the accuracy of each website. Before assessment, the text from each website was copied into a word processing document and converted to the same font (Times New Roman), text size, and single spacing (McNally et al.). Website identifiers, affiliations, sponsors, advertisements, videos, nonessential pictures, and author names were removed before evaluation to allow blinded assessment. Raters were instructed to identify correct and incorrect statements within the text. Inaccurate information was defined as information that contradicted guidelines or published information from the American Academy of Pediatrics, the Canadian Paediatric Society, the American College of Emergency Physicians, and current major textbooks in pediatrics, emergency medicine, and pediatric emergency medicine at the time of the study. In addition to these publications, reviewers were allowed to search the National Library of Medicine (PubMed) for original articles and the Cochrane Database to analyze the accuracy of text within webpages. After independent review by two authors, all statements graded differently by the two initial reviewers were re-reviewed by a third physician, and disagreements regarding accuracy were settled by consensus. For each site, a ratio of correct statements divided by the total of correct plus incorrect medical statements was calculated (McNally et al.). According to McNally et al., a cutoff of ≥ 95% vs. < 95% correct statements identifies a website as accurate vs. inaccurate; they described this cutoff as a “minimum for providing patients, health care providers, and the public a reasonably high level of confidence that the information posted was accurate, irrespective and independent of the amount or diversity of information provided” (McNally et al.).
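For illustration only, the Python sketch below (not the authors' software) applies the accuracy ratio and the ≥ 95% cutoff described above to hypothetical statement counts.

```python
# Minimal sketch of the accuracy metric described above (hypothetical counts,
# not study data): accuracy = correct / (correct + incorrect), and a site is
# labeled "accurate" when the ratio is >= 0.95 (the McNally et al. cutoff).

def accuracy_ratio(correct: int, incorrect: int) -> float:
    """Proportion of correct medical statements on one website."""
    total = correct + incorrect
    if total == 0:
        raise ValueError("at least one rated statement is required")
    return correct / total

def classify_website(correct: int, incorrect: int, cutoff: float = 0.95) -> str:
    """Label a website 'accurate' or 'inaccurate' using the >= 95% cutoff."""
    return "accurate" if accuracy_ratio(correct, incorrect) >= cutoff else "inaccurate"

# A page with 40 correct and 1 incorrect statement: 40/41 ~ 0.976 -> "accurate".
print(classify_website(correct=40, incorrect=1))
```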

      Readability

Readability was analyzed using the Flesch-Kincaid Grade Level (FKGL), Gunning Fog Index (FOG), Coleman-Liau Index (CLI), and Simple Measure of Gobbledygook (SMOG) score for each website (Memom et al.; Sharma et al.; Walsh and Volsko; Coleman and Liau). All informational text from each website, excluding authors, advertisements, links, pictures, copyright notices, disclaimers, acknowledgements, citations, and videos, was entered into an online tool to calculate scores (McNally et al.). Each formula analyzes text in a different manner: FKGL uses sentence length and syllable counts, SMOG uses complex-word density, FOG uses sentence number/length and word complexity, and CLI uses characters per word and words per sentence. The calculation for each measure is listed in Table 1 (Memom et al.; Sharma et al.; Walsh and Volsko; Coleman and Liau).
Table 1. Readability Measures and Formulas

Flesch-Kincaid Grade Level (FKGL): 0.39 × (words per sentence) + 11.8 × (syllables per word) − 15.59; the result is expressed as a U.S. grade level.
Gunning Fog Index (FOG): Count the words per sentence and the number of words with 3 or more syllables (complex words). The average number of years of formal education required to read the text is 0.4 × [(words/sentences) + 100 × (complex words/total words)].
Coleman-Liau Index (CLI): 0.0588 × (average number of letters per 100 words) − 0.296 × (average number of sentences per 100 words) − 15.8.
Simple Measure of Gobbledygook (SMOG): Take 10 consecutive sentences from the beginning, middle, and end of the webpage (30 sentences), count the words with 3 or more syllables in these sentences, and convert to a grade level using 1.043 × √(number of polysyllables × 30/number of sentences) + 3.1291.
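As a rough illustration of the formulas in Table 1, the Python sketch below computes all four scores for a text sample. It is not the online calculator used in the study, and its vowel-group syllable counter is only an approximation, so its output may differ slightly from the study's tool.

```python
# Rough implementations of the Table 1 readability formulas (illustrative only).
import re
from math import sqrt

def _syllables(word: str) -> int:
    # Crude heuristic: count groups of vowels (minimum of one syllable per word).
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def readability(text: str) -> dict:
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    n_sent, n_words = len(sentences), len(words)
    n_syll = sum(_syllables(w) for w in words)
    n_complex = sum(1 for w in words if _syllables(w) >= 3)   # "polysyllables"
    n_letters = sum(len(w) for w in words)

    fkgl = 0.39 * (n_words / n_sent) + 11.8 * (n_syll / n_words) - 15.59
    fog = 0.4 * ((n_words / n_sent) + 100 * (n_complex / n_words))
    cli = 0.0588 * (100 * n_letters / n_words) - 0.296 * (100 * n_sent / n_words) - 15.8
    # SMOG generalized to any sentence count (the 30-sentence sampling is approximated).
    smog = 1.043 * sqrt(n_complex * 30 / n_sent) + 3.1291
    return {"FKGL": fkgl, "FOG": fog, "CLI": cli, "SMOG": smog}

print(readability("Give your child plenty of fluids. Call the doctor if the fever "
                  "lasts more than three days or your child seems unusually sleepy."))
```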
Use of these scores is recommended by the National Institutes of Health, the U.S. National Library of Medicine, and the Centers for Medicare & Medicaid Services for analyzing the readability of health information published for the public (Centers for Medicare & Medicaid Services; U.S. National Library of Medicine; National Institutes of Health). The FKGL was developed by the U.S. Navy for reading technical manuals and was validated by the U.S. Department of Defense (Ley and Florio). The FOG and SMOG indexes have been validated using standard McCall-Crabbs grade-level reading lessons (Ley and Florio). Although no validation studies for the CLI could be found, this measure has been shown to have a moderate to strong correlation with the FKGL (Azer).
Using previously described methodology, multiple readability formulae were used to improve the reliability of the measures, with scores averaged to form a composite readability score for each website (Ley and Florio; Minoughan et al.; Oliffe et al.; Purdy et al.; Rosenberg et al.; Schumaier et al.; Vargas et al.; Yeung et al.; Zhou et al.). A cutoff of an 8th-grade or lower reading level (≤ 8.9 grade level) was chosen to define readability.
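A brief sketch of the composite score and the ≤ 8.9 cutoff just described, using illustrative (non-study) per-formula grade levels:

```python
# Sketch of the composite readability score: average the four grade levels and
# call the site "readable" when the composite is at or below the 8th-grade level.

def composite_grade(scores: dict) -> float:
    """Average the FKGL, FOG, CLI, and SMOG grade levels for one website."""
    return sum(scores.values()) / len(scores)

def is_readable(scores: dict, cutoff: float = 8.9) -> bool:
    return composite_grade(scores) <= cutoff

# Illustrative website scores (not study data):
site = {"FKGL": 8.7, "FOG": 10.9, "CLI": 9.0, "SMOG": 8.2}
print(round(composite_grade(site), 1), is_readable(site))  # 9.2 False
```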

      Quality and Trustworthiness

Two study authors independently evaluated the quality of websites using the JAMA benchmark criteria (Supplementary Table 1) (Silberg et al.). These criteria comprise 1) a description of the website material's author (name, affiliation, and credentials), 2) attribution or references for the content, 3) currency, with dates of posts and updates provided, and 4) disclosure of any potential conflicts of interest. Each criterion fulfilled received 1 point, for a total of 4 points. Raters' scores were averaged to calculate a single score for each site. Using previously published definitions, sites meeting ≥ 3 JAMA criteria were classified as high quality, and sites meeting < 3 criteria were classified as low quality (Barker et al.; Meric et al.; Hess).
Two study authors independently evaluated sites using the National Library of Medicine (NLM) criteria (National Network of Libraries of Medicine). The NLM criteria rate three features of website trustworthiness, each scored 0, 1, or 2, based on currency or timeliness, the publisher's authority, and the accuracy of cited sources, for a maximum total of 6 points (Supplementary Table 2). Raters' scores were averaged to calculate a single score for each site. To be considered trustworthy, websites needed ≥ 3 points, with no individual criterion receiving an average of 0.
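The sketch below illustrates, with hypothetical two-rater scores, how the JAMA benchmark and NLM rules described above can be applied; it is an illustration under those stated rules, not the authors' scoring instrument.

```python
# JAMA benchmark: 1 point per criterion (authorship, attribution, currency,
# disclosure); raters' totals are averaged and >= 3 counts as "high quality".
# NLM: three criteria scored 0-2; averaged total >= 3 with no criterion
# averaging 0 counts as "trustworthy".

def jama_high_quality(rater1: dict, rater2: dict) -> bool:
    averaged_total = (sum(rater1.values()) + sum(rater2.values())) / 2
    return averaged_total >= 3

def nlm_trustworthy(rater1: dict, rater2: dict) -> bool:
    avg = {c: (rater1[c] + rater2[c]) / 2 for c in rater1}
    return sum(avg.values()) >= 3 and all(v > 0 for v in avg.values())

# Hypothetical ratings for one website (each JAMA criterion scored 0 or 1):
jama_a = {"authorship": 1, "attribution": 0, "currency": 1, "disclosure": 1}
jama_b = {"authorship": 1, "attribution": 1, "currency": 1, "disclosure": 0}
nlm_a = {"currency": 2, "authority": 1, "accuracy": 1}
nlm_b = {"currency": 2, "authority": 2, "accuracy": 0}
print(jama_high_quality(jama_a, jama_b))  # True (averaged score 3.0)
print(nlm_trustworthy(nlm_a, nlm_b))      # True (total 4.0, no criterion averages 0)
```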
The presence or absence of Health on the Net Foundation (HON) code of conduct certification was recorded for each website. HON is a Swiss not-for-profit organization originating from a September 1995 conference entitled “The Use of the Internet and World-Wide Web for Telematics in Health Care” (Boyer et al.; Ball). It is affiliated with the University Hospital of Geneva and the Swiss Institute of Bioinformatics. HON certification (HONcode) requires a health or medical website to request a review. For submitted sites, an expert panel of medical professionals performs an annual review of content based on eight principles. To obtain HON certification, websites must conform to each of the principles of authority (author qualifications), complementarity (information supports, rather than replaces, the patient-physician relationship), confidentiality (of site users and visitors), attribution (citing sources and dates of medical information), justification (claims are justified in a balanced and objective manner), transparency (accessible, valid contact details), financial disclosure, and advertising policy (advertising is clearly distinguished from editorial commentary) (Boyer et al.; Ball).

      Statistics

Prior studies found that 39–50% of websites provide accurate medical advice related to children's health, and that 21.5% (20–23%) of websites about pediatric disorders meet high-quality JAMA benchmark criteria (Joury et al.; Nassiri et al.; Scullard et al.). Assuming 45% of studied websites were accurate, a study with at least 114 websites would be needed to detect a 25% absolute difference (21.5% vs. 46.5%) in high-quality JAMA benchmark criteria rates between accurate and inaccurate sites. It was estimated that half of the initially retrieved websites would be excluded (e.g., duplicates, physician oriented, paid sites, copies of newspaper articles). Thus, at least 228 websites would need to be included in the initial search.
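For readers who want to see how such a two-proportion sample-size calculation can be set up, the sketch below uses open-source tools. The authors do not report the alpha, power, or allocation ratio they assumed, so the parameters below (alpha = 0.05, power = 0.80, equal groups) are assumptions, and the result will not necessarily reproduce the 114-website figure.

```python
# Two-proportion sample-size sketch under assumed alpha/power (not necessarily
# the authors' assumptions).
from statsmodels.stats.proportion import proportion_effectsize
from statsmodels.stats.power import NormalIndPower

effect = proportion_effectsize(0.465, 0.215)   # Cohen's h for 46.5% vs. 21.5%
n_per_group = NormalIndPower().solve_power(
    effect_size=effect, alpha=0.05, power=0.80, ratio=1.0, alternative="two-sided"
)
print(round(n_per_group))  # websites needed per group under these assumptions
```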
All data were treated as nonparametric. Categorical data were compared between accurate (≥ 95% correct) and inaccurate websites using chi-squared analysis or Fisher's exact test. Pairwise comparisons of continuous and ordinal data were made using the Mann-Whitney U test. p Values were adjusted for multiple comparisons using the Benjamini-Hochberg method (McDonald). Spearman rank-order correlation was used to assess the association between website accuracy and rank order during the initial search.
Interrater agreement for initial website accuracy, JAMA benchmark criteria, and NLM trustworthy scores was calculated using Cohen's kappa. Kappa coefficients were interpreted as almost perfect agreement at 0.81–1.0, substantial or good agreement at 0.61–0.80, moderate agreement at 0.41–0.60, fair agreement at 0.21–0.40, slight agreement at 0.01–0.20, and less than chance agreement at < 0.
      Data were analyzed using MedCalc statistical software (v18.2.1; MedCalc Software bvba, Ostend, Belgium, 2018).
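For illustration, the sketch below strings together open-source equivalents (SciPy, statsmodels, scikit-learn) of the tests named in this section; the study itself used MedCalc, and the arrays here are invented placeholders rather than study data.

```python
# Open-source sketch of the comparisons described above (placeholder data).
import numpy as np
from scipy.stats import chi2_contingency, fisher_exact, mannwhitneyu, spearmanr
from statsmodels.stats.multitest import multipletests
from sklearn.metrics import cohen_kappa_score

accurate = np.array([1, 1, 0, 1, 0, 1, 1, 0])      # 1 = accurate website
prof_org = np.array([1, 1, 0, 1, 0, 0, 1, 0])      # 1 = professional organization sponsor
domain_age = np.array([17.2, 20.1, 19.5, 9.9, 21.0, 12.6, 15.0, 18.3])
search_rank = np.array([1, 2, 3, 4, 5, 6, 7, 8])

# Categorical comparison: chi-squared (Fisher's exact for sparse 2 x 2 tables)
table = np.array([[np.sum((accurate == a) & (prof_org == p)) for p in (1, 0)]
                  for a in (1, 0)])
_, p_cat, _, _ = chi2_contingency(table)
_, p_exact = fisher_exact(table)

# Continuous/ordinal comparison between accurate and inaccurate sites
_, p_cont = mannwhitneyu(domain_age[accurate == 1], domain_age[accurate == 0])

# Benjamini-Hochberg adjustment across the family of comparisons
_, p_adjusted, _, _ = multipletests([p_cat, p_cont], method="fdr_bh")

# Spearman correlation of accuracy with search rank; Cohen's kappa between raters
rho, p_rho = spearmanr(accurate, search_rank)
kappa = cohen_kappa_score([1, 1, 0, 1, 0, 1, 1, 0], [1, 1, 0, 1, 0, 1, 0, 0])
print(p_adjusted, round(rho, 3), round(kappa, 2))
```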

      Results

Each search engine was queried between December 1, 2018 and December 5, 2018, yielding 240 initial websites. Analysis and rating of websites took place between December 15, 2018 and January 19, 2019. During review, one website was no longer available and was analyzed using an archived copy from a service that periodically archives webpages throughout the year (Internet Archive Wayback Machine). Of the original 240 websites, 96 were excluded: 94 duplicates and two websites written for medical professionals (one that addressed American Gastroenterological Association guidelines for nausea and vomiting, and one that stated “This article is for Medical Professionals”). The remaining 144 websites were identified as directed at parents, patients, or caregivers in the following manner: 130 used the term “your child” with diagnostic or management recommendations for parents/caregivers, eight instructed caregivers on when to seek medical care, and six directly addressed a patient's symptoms (e.g., “if you feel sick,” “if you have a cough”). Website details are listed in Table 2.
Table 2. Website Information

Website domain age, years* | 18.4 (10.4–20.8)
Webpage age, years* | 1 (0.4–2)
Site sponsor – professional medical organization | 74 (51%)
 Hospital/hospital system | 45
 Government agency | 15
 Academic medical society | 14
Site sponsor – nonprofessional medical organization | 70 (49%)
 Nonmedical professional | 54
 Health care professional | 16
Physician author | 52 (36%)
No physician author | 92 (64%)
 Nonphysician author | 17
 No author listed | 75
Unique websites for each query
 Child + fever | 32 (22%)
 Child + cough | 40 (28%)
 Child + vomiting | 34 (24%)
 Child + stomach pain | 38 (26%)

IQR = interquartile range.
* Age in years, median (IQR).
Median physician-rated accuracy of all websites was 96.4% (interquartile range [IQR] 90–100). Interrater agreement for the initial two-physician assessment of accurate vs. inaccurate websites was almost perfect (k = 0.94, 95% confidence interval [CI] 0.88–1.0). Five representative incorrect medical statements are listed for each complaint (Supplementary Table 3). Accurate websites (≥ 95% correct) were more frequently published by professional medical organizations than inaccurate websites (63% vs. 33%, p = 0.007). Readability (reading grade level), domain age, article/page age, article length (number of words), physician authorship, website query/complaint, HON certification, JAMA Benchmark Criteria, and NLM Trustworthy scores did not differ between accurate and inaccurate sites (Table 3).
Table 3. Comparison of Accurate vs. Inaccurate Websites

Feature* | Accurate (n = 87) | Inaccurate (n = 57) | p Value†
Domain age, years | 17.2 (9.9–20) | 19.5 (12.6–21.1) | 0.390
Web article age, years‡ | 0.9 (0.3–2.3) | 1.1 (0.7–1.9) | 0.598
Web article length, words | 917 (694–1302) | 964 (695–1423) | 0.652
Professional medical organization (hospital, academic medical society, government agency) | 55 (63.2%) | 19 (33.3%) | 0.007
Physician author or interview | 28 (32.2%) | 24 (42.1%) | 0.454
Composite reading grade, years | 9.2 (8.4–11) | 9.1 (8–10.5) | 0.639
 FKGL | 8.7 (7.7–10.4) | 8.6 (7.3–10) | 0.778
 SMOG | 8.3 (7.4–9.7) | 8.1 (7.2–9.3) | 0.639
 FOG | 11.1 (10.1–13.1) | 10.9 (9.6–12.5) | 0.598
 CLI | 10 (9–11) | 9 (8–10) | 0.390
HON Foundation certified | 27 (31%) | 22 (38.6%) | 0.598
JAMA Benchmark high quality | 17 (19.5%) | 21 (36.8%) | 0.154
NLM trustworthy | 22 (25.3%) | 22 (38.6%) | 0.329
Website medical query/complaint§ | | | 0.329
 Fever | 14 (9.7%) | 18 (12.5%) |
 Cough | 23 (16%) | 17 (11.8%) |
 Vomiting | 23 (16%) | 11 (7.6%) |
 Stomach pain | 27 (18.8%) | 11 (7.6%) |

FKGL = Flesch-Kincaid Grade Level; SMOG = Simple Measure of Gobbledygook; FOG = Gunning Fog Index; CLI = Coleman-Liau Index; HON = Health on the Net; JAMA = Journal of the American Medical Association; NLM = National Library of Medicine.
* Values are median (interquartile range) for continuous and ordinal data.
† p Values corrected for multiple comparisons using the Benjamini-Hochberg adjustment (McDonald).
‡ Based on the most recent date the page was published or updated; 27 accurate and 16 inaccurate sites had no publication or update date and were excluded from this comparison.
§ Percentages are the number of websites with each medical complaint and accuracy classification divided by the total number of websites (n = 144).
      The median readability grade levels for all websites were as follows: Composite of all sites 9.2 years (IQR 8.3–10.7), FKGL 8.6 years (IQR 7.6–10.2), SMOG 8.2 years (IQR 7.3–9.7), FOG 10.9 years (IQR 9.9–12.6), and CLI 9 years (IQR 8.3–10.7). Sixty websites (42%) were written at or below an 8th-grade level, and nine (6%) were written at or below a 6th-grade level.
      Thirty-eight websites (26.4%) had high-quality JAMA Benchmark criteria. Forty-four websites (30.6%) were graded as trustworthy (NLM trustworthy score). Interrater agreement was good for determination of high- vs. low-quality JAMA Benchmark Criteria (k = 0.68, 95% CI 0.55–0.81) and good for determination of NLM trustworthiness (k = 0.66, 95% CI 0.51–0.81).
      Forty-nine (34%) websites were HONCode certified. HON-certified websites more frequently had high-quality JAMA Benchmark Criteria (50% vs. 28.3%, p = 0.02) and NLM Trustworthy scores (63.6% vs. 21%, p < 0.01) than sites without HON certification.
Overall, seven websites (5%) were readable and met each of the definitions for accuracy, NLM trustworthiness, and JAMA high quality.
There was no correlation between accuracy and website rank order during the initial search (Spearman rho = −0.046, 95% CI −0.208 to 0.118, p = 0.49).

      Discussion

To the best of our knowledge, our study is the first to evaluate the adequacy of parent-directed health information regarding pediatric emergency complaints on the Internet. Our results indicate that many such websites provide inadequate information. Although overall median website accuracy was 96%, 40% of websites were inaccurate based on our predefined cutoff of 95% correct items. Only publication by a professional organization (government, hospital/hospital system, or academic society) was associated with website accuracy. The majority of websites did not meet criteria for high quality and trustworthiness, and most websites were written at a reading level above that of the average parent. Only seven websites (5%) met each of our definitions for readability, accuracy, and quality/trustworthiness, indicating the potential need for improved oversight, rating, or regulation of parent- and consumer-oriented health information websites.
Given its widespread use, experts have expressed concern about the quality of health information on the web (Eysenbach et al.; Fahy et al.). In 2002, Eysenbach et al. evaluated 55 prior studies of health information on the Internet; 70% of the analyzed studies indicated that information quality was a problem (Eysenbach et al.). Ideally, search engine results should directly answer a specific medical question in the most trustworthy, accurate, and understandable fashion. However, the algorithms used by search engines are proprietary and involve more than these features. Popularity, number of links, link traffic, importance of links, advertisements, and other unknown features play a role in the ranking of search engine results (Shields et al.). The role that accuracy, quality, trustworthiness, and readability of health information play in a search engine's results is unknown.
A parent's decision to visit their doctor or an ED is multifactorial and includes convenience, referrals, insurance status, time of day, concern about the severity of an illness, and the need for a second opinion (Butun et al.). Of parents visiting clinics and EDs, 94–99% have Internet access, and 80–88% have smartphones (Shields et al.; Drent et al.; Katz et al.; Saidinejad et al.; Pehora et al.). There is evidence that Internet use influences a parent's decision about whether or not to seek care in an ED. Shroff et al. found that 29% of parents were more certain and 19% were less certain that they needed to visit an ED after searching the Internet (Shroff et al.). Experts have noted that parents regard online information as more up to date, easy to access, and often trustworthy, especially if that information comes from a hospital or university website (Walsh and Volsko; Drent et al.; Pehora et al.; Jones et al.). The fact that websites are frequently accessed and often trusted has important implications for the dissemination of health care information and for the actions of parents of children with acute medical complaints.
Experts have found that parents with low health literacy are less likely to access the Internet compared with those with high health literacy (Drent et al.; Mackert et al.). It is possible that this important group of parents would not be affected by information on the Internet. Moreover, readers with lower overall literacy have poorer short-term memory and more difficulty understanding text on websites. Individuals with limited literacy read more slowly, often reread text, and may miss sections on webpages (Agency for Healthcare Research and Quality). Independent of reading grade level, formatting (font, text size, line spacing) and length of text (number of words) may influence the readability and comprehension of information on websites. Other than length of text, these website features were not assessed in our study.
      Higher-quality JAMA benchmark criteria and higher NLM trustworthy scores in HON-certified sites in our study indicate that this certification may be a useful indicator of quality. However, HON certification is voluntary, and most health information websites have not undergone a HON review. For this reason, most consumers of health information cannot rely solely on this measure. Although HON certification addresses quality, transparency, and trustworthiness of information, it does not measure accuracy of information.
Multiple potential solutions exist to address the types of inadequate health information found in our study. One solution would be to compel Internet and social media companies to censor incorrect information. Lawmakers have been partially successful in calling on Internet and social media companies such as Amazon, Google, Facebook, Instagram, Pinterest, and Twitter to eliminate incorrect information regarding vaccines (Frier; Johnson; Telford). However, these efforts involved years of discussion and concerned only a single issue: the safety of childhood vaccines. Another approach might be to improve the rating of websites, replacing or augmenting HON certification with American College of Emergency Physicians or American Academy of Pediatrics designees who assess the accuracy of websites, while adding website requirements for an appropriate readability level and specific NLM trustworthy and JAMA quality criteria. From a practical standpoint, this might prove difficult, as each search query in this study returned over a million results. Alternatively, specialty societies could create accurate, readable, high-quality online sources of health-related information for individual medical complaints. This information, in turn, could be given the highest “rank” by search engines so that it appears first during search queries.

      Limitations

This study did not directly evaluate the websites any particular parent accessed before visiting a physician's office or ED. Measuring previsit Internet use would require surveying parents on arrival at an office or ED. This measurement is problematic because it would require parents to remember their exact query terms, the rank order of the results, and the exact sites visited within a specified period before the visit. One survey of parents who had accessed the Internet in the 24 h before an ED visit found that over one-third could not remember any visited website (Shroff et al.). Thus, relying on a parent's memory to analyze visited websites might be inaccurate. Without directly monitoring and measuring Internet activity and parent behavior, it is not possible to know which websites were viewed, what text within those sites was read, what text was understood, and what effect the information on these sites had on parent behavior.
Only the first 20 results were analyzed for each search engine query in this study. This number was chosen because the default number of results per page for common search engines (Google, Yahoo, Bing) is 10, and online surveys indicate that 80% of Internet searches end within the first two pages of results (iProspect). We chose to study four of the five most common nontrauma complaints (59.3% of all nontrauma complaints) in children presenting to EDs (Gorelick et al.). These four complaints are also among the top 10 reasons parents search the Internet for health care information (Pehora et al.). It is uncertain whether alternative complaints or search terms would have yielded different results.
The majority of parents visiting a pediatric clinic or ED read at or below an 8th-grade level (Moon et al.). For this reason, we chose an 8th-grade level as the cutoff for readability. Independent of text readability, other factors not analyzed in this study may influence a website's reading ease and understandability, including the organization of information, page design, graphics, typesetting, pictures, and cultural relevance.
We chose a previously described criterion, ≥ 95% correct information, to define websites as accurate vs. inaccurate, based on a single study using this cutoff (McNally et al.). It can be argued that this cutoff is arbitrary and that any cutoff < 100% might place patients at risk of harm. Realistically, an accurate website (≥ 95% correct) with a single incorrect important fact (e.g., use of a temperature cutoff of 38.5°C/101.3°F to define neonatal fever) might lead to patient harm, whereas an inaccurate website (< 95% correct) with multiple minor, unimportant incorrect facts (e.g., an incorrect appendix size on computed tomography) might cause no harm. Our study weighted information equally regardless of importance. Consequently, websites with the same level of accuracy may have different potentials to lead to parent actions that could harm a child's health.
Readability scores were originally developed for analyzing technical manuals and educational material, not necessarily health-related information (Ley and Florio; Zhou et al.). Prior authors have commented that medical terminology can inflate the grade level of text (Thompson and Graydon; Krass et al.). However, Berland et al. found that removing medical terminology lowered the reading level of health information on studied websites by only 0.3 grade levels (Berland et al.). Others have found that the readability of patient education material can be improved by up to three grade levels by substituting multisyllabic words, adapting sentence structure, and shortening sentences (Betschart et al.; Horner et al.; Sheppard et al.). Future work on the readability of parent education materials should consider incorporating this editing process while simultaneously testing whether parents comprehend the prepared material.

      Conclusion

In our study, only 60% of websites directed at parents with acutely ill children were categorized as accurate. The majority of websites were of low quality, had low trustworthiness scores, and were written at a grade level too difficult for most parents to understand. Inadequate and inaccurate information has the potential to adversely influence medical decisions made by parents. Measures should be taken to ensure that information on the Internet related to acute pediatric complaints is accurate, readable, high quality, and trustworthy.

      Acknowledgments

      The authors would like to thank Joe Pagane, MD for assistance with arbitrating website physician accuracy questions.

      Supplementary Data

      References

        • Internet Live Stats
        Trends & more (statistics).
        http://www.internetlivestats.com/statistics/
        Date accessed: January 30, 2019
        • Cocco A.M.
        • Zordan R.
        • Taylor D.
        • et al.
        Dr Google in the ED: searching for online health information by adult emergency department patients.
Med J Aust. 2018; 209: 342-347
        • Shroff P.L.
        • Hayes R.W.
        • Padmanabhan P.
        • Stevenson M.D.
        Internet usage by parents prior to seeking care at a pediatric emergency department: observation study.
        Interact J Med Res. 2017; 28: e17
        • Eysenbach G.
        • Powell J.
        • Kuss O.
        Empirical studies assessing the quality of health information for consumers on the world wide web: a systematic review.
        JAMA. 2002; 287: 2691-2700
        • Fahy E.
        • Hardikar R.
        • Fox A.
        • Mackay S.
        Quality of patient health information on the internet: reviewing a complex and evolving landscape.
        Australas Med J. 2014; 7: 24-28
        • Alexa
        The top 500 sites on the web.
        • Gorelick M.H.
        • Alpern E.R.
        • Alessandrini E.A.
        A system for grouping presenting complaints: the pediatric emergency reason for visit clusters.
        Acad Emerg Med. 2005; 12: 723-731
        • McDermott K.W.
        • Stocks C.
        • Freeman W.J.
        Overview of pediatric emergency department visits, 2015. Statistical brief #242. Healthcare Cost and Utilization Project (HCUP).
        Agency for Healthcare Research and Quality, Rockville, MD2018
        • Small SEO Tools
        Domain age checker.
        https://smallseotools.com/domain-age-checker/
        Date accessed: January 19, 2019
        • McNally S.L.
        • Donohue M.C.
        • Kagnoff M.F.
        Can consumers trust web-based information about celiac disease? Accuracy, comprehensiveness, transparency, and readability of information on the internet.
        Interact J Med Res. 2012; 1: e1
        • Memom M.
        • Ginsbery L.
        • Simunovic N.
        • et al.
        Quality of web-based information for the 10 most common fractures.
        Interact J Med Res. 2016; 17: e19
        • Sharma N.
        • Tridimas
        • Fitzsimmons P.R.
        A readability assessment of online stroke information.
        J Stroke Cerebrovasc Dis. 2014; 23: 1362-1367
        • Walsh T.M.
        • Volsko T.A.
        Readability assessment of Internet-based consumer health information.
        Respir Care. 2008; 53: 1310-1315
        • Coleman M.
        • Liau T.L.
        A computer readability formula designed for machine scoring.
        J Appl Psychol. 1975; 60: 283-284
        • Centers for Medicare & Medicaid Services
        TOOLkit for making written material clear and effective: section 4 – Special topics for writing and design. Part 7 – using readability formulas: a cautionary note.
        • U.S. National Library of Medicine. Medline Plus
        How to write easy-to-read health materials.
        https://medlineplus.gov/etr.html
        Date accessed: April 29, 2019
        • National Institutes of Health – National Cancer Institute
        Making health communications programs work.
        • Ley P.
        • Florio T.
        The use of readability formulas in health care.
        Psych Health Med. 1996; 1: 7-28
        • Azer S.A.
        Is Wikipedia a reliable learning resource for medical students? Evaluating respiratory topics.
        Adv Physiol Edu. 2015; 39: 5-14
        • Azer S.A.
        Inflammatory bowel disease: an evaluation of health information on the internet.
        World J Gastroenterol. 2017; 23: 1676-1696
        • Minoughan C.
        • Schumaier A.
        • Grawe B.
• Kakazu R.
Readability of sports injury and prevention patient education materials from the American Academy of Orthopaedic Surgeons website.
        J Am Acad Orthop Surg Glob Res Rev. 2018; 2: e002
        • Oliffe M.
        • Thompson E.
        • Johnston J.
        • et al.
        Assessing the readability and patient comprehension of rheumatology medicine information sheets: a cross-sectional health literacy study.
        BMJ Open. 2019; 9: e024582
        • Purdy A.C.
        • Idriss A.
        • Ahern S.
        • Lin E.
        • Elfenbein D.M.
        Dr. Google: the readability and accuracy of patient education websites for Graves’ disease treatment.
        Surgery. 2017; 162: 1148-1154
        • Rosenberg S.A.
        • Francis D.
        • Hullett C.R.
        • et al.
        Readability of online patient educational resources found on NCI-designated cancer center web sites.
        J Natl Compr Canc Netw. 2016; 14: 735-740
        • Schumaier A.P.
        • Kakazu R.
        • Minoughan C.E.
        • Grawe B.M.
        Readability assessment of American shoulder and elbow surgeons patient brochures with suggestions for improvement.
        JSES Open Access. 2018; 2: 150-154
        • Vargas C.R.
        • Koolen P.G.L.
        • Chuang D.J.
        • Ganor O.
        • Lee B.T.
        Online patient resources for breast reconstruction: an analysis of readability.
        Plast Reconstr Surg. 2014; 134: 406-413
        • Yeung A.W.K.
        • Goto T.K.
        • Leung W.K.
        Readability of the 100 most-cited neuroimaging papers assessed by common readability formulae.
        Front Hum Neurosci. 2018; 12: 308
        • Zhou S.
        • Jeong H.
        • Green P.A.
How consistent are the best-known readability equations in estimating the readability of design standards?
        IEEE Trans Prof Commun. 2017; 60: 97-111
        • Silberg W.M.
        • Lundberg G.D.
• Musacchio R.A.
        Assessing, controlling, and assuring the quality of medical information on the internet. Caveat lector et viewor – let the reader and viewer beware.
        JAMA. 1997; 277: 1244-1245
        • Barker S.
        • Charlton N.P.
        • Holstege C.P.
        Accuracy of internet recommendations for prehospital care of venomous snake bites.
        Wilderness Environ Med. 2010; 21: 298-302
        • Meric F.
        • Bernstam E.V.
        • Mirza N.Q.
        • et al.
        Breast cancer on the World Wide Web: cross sectional survey of quality of information and popularity of websites.
        BMJ. 2002; 324: 577-581
        • Hess D.R.
        Information retrieval in respiratory care: tips to locate what you need to know.
        Respir Care. 2004; 49: 389-399
        • National Network of Libraries of Medicine
        Evaluating health websites.
        • Boyer C.
        • Appel R.D.
        • Ball M.J.
        • et al.
        Health on the net’s 20 years of transparent and reliable information.
        Stud Health Technol Inform. 2016; 228: 700-704
        • Ball M.J.
        Twenty years of health on the net: committed to reliable information.
        Stud Health Technol Inform. 2016; 225: 738-740
        • Joury A.
        • Joraid A.
        • Algahtani F.
        • et al.
        The variation in quality and content of patient-focused health information on the internet for otitis media.
        Child Care Health Dev. 2018; 44: 221-226
        • Nassiri M.
        • Bruce-Brand R.A.
        • O’Neill F.
        • et al.
        Perthes disease: the quality and reliability of information on the internet.
        J Pediatr Orthop. 2015; 35: 530-535
        • Scullard P.
        • Peacock C.
        • Davies P.
        Googling children’s health: reliability of medical advice on the internet.
        Arch Dis Child. 2010; 95: 580-582
        • McDonald J.H.
        Multiple comparisons.
        Handbook of biological statistics. 3rd edn. Sparky House Publications, Baltimore, MD2014: 257-263
• Internet Archive. Wayback Machine.
        https://archive.org/web/
        Date accessed: January 19, 2019
        • Shields W.C.
        • Omaki E.
        • McDonald E.M.
        • et al.
        Cell phone and computer use among parents visiting an urban pediatric emergency department.
        Pediatr Emerg Care. 2018; 34: 878-882
        • Butun A.
        • Linden M.
        • Lynn F.
        • McGaughey J.
        Exploring parents’ reasons for attending the emergency department for children with minor illnesses: a mixed methods systematic review.
        Emerg Med J. 2019; 36: 39-46
        • Drent A.M.
        • Brousseau D.C.
        • Morrison A.K.
        Health information preferences of parents in a pediatric emergency department.
        Clin Pediatr. 2018; 57: 519-527
        • Katz V.S.
        • Gonzalez C.
        • Clark K.
        Digital inequality and developmental trajectories of low-income, immigrant, and minority children.
        Pediatrics. 2017; 140: S132-S136
        • Saidinejad M.
        • Teach S.J.
        • Chamberlain J.M.
        Internet access and electronic communication among families in an urban pediatric emergency department.
        Pediatr Emerg Care. 2012; 28: 553-557
        • Pehora C.
        • Gajaria N.
        • Stoute M.
        • Fracassa S.
        • Serebale-O'Sullivan R.
        • Matava C.T.
        Are parents getting it right? A survey of parents’ internet use for children’s health care information.
        Interact J Med Res. 2015; 4: e12
        • Jones C.H.
        • Neill S.
        • Lakhanpaul M.
        • Roland D.
        • Singlehurst-Mooney H.
        • Thompson M.
        Information needs of parents for acute childhood illness: determining ‘what, how, where and when’ of safety netting using a qualitative exploration with parents and clinicians.
        BMJ Open. 2014; 4: e003874
        • Mackert M.
        • Mabry-Flynn A.
        • Pounders K.
        Health literacy and health information technology adoption: the potential for a new digital divide.
        J Med Internet Res. 2016; 18: e264
        • Agency for Healthcare Research and Quality, U.S
        Department of Health and Human Services. Tip 6. Be cautious about using readability formulas.
        • Frier S.
        Facebook, facing lawmaker questions, says it may remove anti-vaccine recommendations. Bloomberg News.
        • Johnson S.R.
        AMA urges Amazon, Facebook, Google and Twitter to do more to limit false anti-vaccine claims. Modern Healthcare.
        • Telford T.
        Pinterest is blocking search results about vaccines to protect users from misinformation. Washington Post.
• iProspect. Search engine user behavior study.
        • Moon R.Y.
        • Cheng T.L.
        • Patel K.
        • et al.
        Parental literacy level and understanding of medical information.
        Pediatrics. 1998; 102: e25
        • Thompson A.E.
        • Graydon S.L.
        Patient-oriented methotrexate information sites on the Internet: a review of completeness, accuracy, format, reliability, credibility, and readability.
        J Rheumatol. 2009; 36: 41-49
        • Krass I.
        • Svarstad B.L.
        • Bultman D.
        Using alternative methodologies for evaluating medication leaflets.
        Patient Educ Couns. 2002; 47: 29-35
        • Berland G.K.
        • Elliott M.N.
        • Morales L.S.
        • et al.
        Health information on the internet. Accessibility, quality, and readability in English and Spanish.
        JAMA. 2001; 285: 2612-2621
        • Betschart P.
        • Zumstein V.
        • Bentivoglio
        • et al.
        Readability assessment of online patient education materials provided by the European Association of Urology.
        Int Urol Nephrol. 2017; 49: 2111-2117
        • Horner S.D.
        • Surratt D.
        • Juliusson S.
        Improving readability of patient education materials.
        J Commun Health Nurs. 2000; 17: 15-23
        • Sheppard E.D.
        • Hyde Z.
        • Florence M.N.
        • et al.
        Improving the readability of online foot and ankle patient education materials.
        Foot Ankle Int. 2014; 35: 1282-1286
