Abstract
Background
The Internet is a universal source of information for parents of children with acute complaints.
Objectives
We sought to analyze information directed at parents regarding common acute pediatric complaints.
Methods
Authors searched three search engines for four complaints (child + fever, vomiting, cough, stomach pain), assessing the first 20 results for each query. Readability was evaluated using: Flesch-Kincaid Grade Level, Gunning Fog, Simple Measure of Gobbledygook, and the Coleman-Liau Index. Two reviewers independently evaluated Journal of the American Medical Association (JAMA) Benchmark Criteria and National Library of Medicine (NLM) Trustworthy scores. Two physicians (emergency medicine/EM, pediatric EM) analyzed text accuracy (number correct divided by total number of facts). Disagreements were settled by a third physician. Accuracy was defined as ≥ 95% correct, readability as an 8th-grade reading level, high quality as at least three JAMA criteria, and trustworthiness as an NLM score ≥ 3. Accurate and inaccurate websites were compared using chi-squared analysis and Mann-Whitney U test.
Results
Eighty-seven websites (60%) were accurate (k = 0.94). Sixty (42%) of 144 evaluable websites were readable, 38 (26%) had high-quality JAMA criteria (k = 0.68), and 44 (31%) had reliable NLM trustworthy scores (k = 0.66). Accurate websites were more frequently published by professional medical organizations (hospitals, academic societies, governments) than inaccurate websites (63% vs. 33%, p < 0.01). There was no association between accuracy and physician authorship, search rank, quality, trustworthiness, or readability.
Conclusion
Many studied websites had inadequate accuracy, quality, trustworthiness, and readability. Measures should be taken to improve web-based information related to acute pediatric complaints.
Article Summary
1. Why is this topic important?
- The Internet is a widely used source of information for parents of children with acute complaints. Little is known about the adequacy of this web-based information.
2. What does the study attempt to show?
- We analyzed websites directed at parents with acutely ill children for accuracy, readability, quality, and trustworthiness of information.
3. What are the key findings?
- This study found that only 60% of websites provided accurate information. The majority of websites were not written at a readable level for the average parent (8th-grade level or lower). Most websites had poor-quality JAMA Benchmark criteria and poor National Library of Medicine trustworthiness scores.
4. How is patient care impacted?
- Inadequate information has the potential to influence decisions made by parents when their child is acutely ill.
Introduction
As of 2018, 4 billion people were using the Internet worldwide, there were 1.9 billion different websites, and users performed nearly 2 trillion searches per day (1). In 2016, health-related topics were the second most common category searched in Google (2). With its widespread adoption, the Internet has become an important health information resource for parents. Up to 56% of parents perform online searches prior to bringing their child to the emergency department (ED), with 68% of queries containing symptoms and 51% containing treatment options (2). Nearly one in eight parents use the Internet to search for health information immediately prior to visiting an ED (3).

Concerns have arisen about the quality and accuracy of this consumer-oriented information on the Internet (4, 5). Incorrect or inadequate information has the potential to influence parents into making inappropriate health-related decisions for their children. Ignoring or inappropriately treating potentially serious disorders at home might lead to a delay in care or harm. Alternately, incorrect information might lead to overutilization of emergency services when no serious illness is present. Because of its potential importance, we chose to study the adequacy of information on the Internet directed at parents of children with acute medical complaints. Our primary objective was to analyze the accuracy of websites directed at consumers regarding pediatric emergency medicine complaints. Secondary objectives were to analyze the readability, quality, and trustworthiness of websites and their association with website accuracy.

Methods
We performed Internet searches on the three most popular U.S. search engines (Google, Bing, Yahoo) for four of the top five most common nontrauma complaints in infants and children presenting to EDs (6, 7, 8). Search terms were created by two parents/study authors who were board certified in emergency medicine and pediatric emergency medicine. Search terms included the word “child” plus each of the following: “fever,” “vomiting,” “cough,” and “stomach pain.” Prior to performing searches, websites were set to their default settings of English language, safe search mode, and 10 results per page. Websites for the top 20 results listed for each search engine were analyzed for accuracy. Websites were also analyzed for the secondary outcomes of readability, quality, and trustworthiness. Websites were included if they were addressed to parents/caregivers (e.g., the term “your child” is used), addressed to patients (e.g., the term “your cough” is used), or detailed instructions for lay people on when to seek medical care from a physician or hospital (e.g., the term “when to see a doctor” is used). Websites were excluded from analysis if they were written for doctors or health professionals, duplicates, isolated videos, newspaper reports, restricted by subscriptions or fees, not written in English, presentations at conferences, or comprised of lectures.
We categorized websites as professional organizations if they were sponsored by a government, academic or medical specialty society, or hospital/hospital system. Sites not meeting these criteria were categorized as nonprofessional organizations/individuals. Authors were categorized as physicians if the website article was written or reviewed by a physician (MD or DO) or if the article consisted entirely of an interview of a physician. All other authors were categorized as nonphysicians. If no author was listed, the author was categorized as nonphysician.
Prior to initiating the study, a 2-h training session took place with all study authors that emphasized definitions, uniform website review, and coding of website information. Initially, all reviewers/raters simultaneously analyzed 10 websites with pediatric complaints unrelated to this study. After review of every 48 websites, data abstraction, data entry, and coding rules were re-reviewed with reviewers by the principal investigator. The principal investigator arbitrated all coding questions on an ongoing basis.
We analyzed website domain ages using an online database (9).

Between December 1 and December 5, 2018, the principal author connected a computer to the Internet and entered each set of search terms into each search engine. Retrieved web pages were saved to a spreadsheet and ordered (ranked) according to how they initially appeared in each search engine. At that time, domain age, web page age, page categorization, article length, and author status were recorded for each website. At a later date, between December 15, 2018 and January 19, 2019, two physician study authors independently analyzed saved websites for information accuracy, with a third physician arbitrating accuracy disagreements. During the same period, two study authors independently conducted trustworthiness and quality reviews for each saved website. All information was entered directly into a spreadsheet, and authors used print versions of the quality and trustworthiness criteria to calculate scores (Supplementary Tables 1 and 2).
Accuracy
Two physician authors initially independently assessed the accuracy of each website. Prior to assessment, text from each website was copied into a word processing document and converted to the same font (Times New Roman), text size, and single spacing (10). Website identifiers, affiliations, sponsors, advertisements, videos, nonessential pictures, and author names were removed prior to evaluation to allow blind assessment. Raters were instructed to identify correct and incorrect statements within the text. Inaccurate information was defined as information that contradicted guidelines or published information from the American Academy of Pediatrics, the Canadian Paediatric Society, the American College of Emergency Physicians, and current major textbooks in pediatrics, emergency medicine, and pediatric emergency medicine at the time of the study. In addition to these publications, reviewers were allowed to search the National Library of Medicine (PubMed) for original articles and the Cochrane Database to analyze the accuracy of text within webpages. After independent review by two authors, all statements that were graded differently by the two initial reviewers were re-reviewed by a third physician, and disagreements regarding accuracy were settled by consensus. A ratio of correct statements divided by total correct plus incorrect medical statements was calculated for each site (10). According to McNally et al., a cutoff of ≥ 95% vs. < 95% correct statements identifies a website as accurate vs. inaccurate (10). McNally et al. described this cut-off as a “minimum for providing patients, health care providers, and the public a reasonably high level of confidence that the information posted was accurate, irrespective and independent of the amount or diversity of information provided” (10).

Readability
Readability was analyzed using the Flesch Kincaid Grade Level (FKGL), Gunning Fog Index (FOG), Coleman-Liau Index (CLI), and the Simple Measure of Gobbledygook (SMOG) score for each website (11, 12, 13, 14). All informational text from each website, excluding authors, advertisements, links, pictures, copyright notices, disclaimers, acknowledgements, citations, and videos, was entered into an online tool to calculate scores (10). Each measure analyzes text in a different manner: FKGL, sentence length and syllables; SMOG, complex word density; FOG, sentence number/length and complexity; CLI, characters per word and words per sentence. Calculations for each of these measures are listed in Table 1 (11, 12, 13, 14).

Table 1. Readability Measures and Scores
Readability Measure | Calculation-Formula |
---|---|
Flesch Kincaid Grade Level (FKGL) | Multiply 0.39 × (words per sentence) + 11.8 × (syllables/words) – 15.59. The result is converted into a U.S. grade level. |
Gunning Fog Index (FOG) | Count the number of words per sentence, and the number of words with 3 or more syllables or complex words. The average number of years of formal education required to read the text is calculated using the formula: 0.4 × [(words/sentences) + 100 × (complex words/total words)]. |
Coleman-Liau Index (CLI) | Multiply 0.0588 × (average number of letters per 100 words) – 0.296 × (average number of sentences per 100 words) – 15.8. |
Simple Measure of Gobbledygook (SMOG) | Counts 10 consecutive sentences in beginning, middle and end of webpages (30 sentences). Then, count the number of words with 3 or more syllables in each of these sentences. Then, convert this into a grade level using formula 1.043 × the square root of (number of polysyllables × 30/number of sentences) + 3.1291. |
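The formulas in Table 1 can be sketched in code. The following Python sketch is illustrative only: the syllable counter is a crude vowel-group heuristic (an assumption on our part; published calculators typically use dictionary-based counting), and the SMOG sampling of 30 sentences is skipped in favor of using all sentences, so scores will differ slightly from the online tool used in the study.

```python
import re

def words(text):
    return re.findall(r"[A-Za-z']+", text)

def sentences(text):
    return [s for s in re.split(r"[.!?]+", text) if s.strip()]

def syllables(word):
    # Crude heuristic: count groups of consecutive vowels (assumption;
    # real readability tools use dictionary or rule-based syllabification).
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def fkgl(text):
    w, s = words(text), sentences(text)
    return 0.39 * len(w) / len(s) + 11.8 * sum(map(syllables, w)) / len(w) - 15.59

def fog(text):
    w, s = words(text), sentences(text)
    complex_words = [x for x in w if syllables(x) >= 3]
    return 0.4 * (len(w) / len(s) + 100 * len(complex_words) / len(w))

def cli(text):
    w, s = words(text), sentences(text)
    letters = sum(len(x) for x in w)
    return 0.0588 * (letters / len(w) * 100) - 0.296 * (len(s) / len(w) * 100) - 15.8

def smog(text):
    # Simplification: uses all sentences rather than the 30-sentence sample.
    w, s = words(text), sentences(text)
    poly = sum(1 for x in w if syllables(x) >= 3)
    return 1.043 * (poly * 30 / len(s)) ** 0.5 + 3.1291

def composite(text):
    # Study-style composite: average of the four grade-level estimates.
    return sum(f(text) for f in (fkgl, fog, cli, smog)) / 4
```

A simple sentence such as “The cat sat on the mat.” scores well below an 8th-grade level on each measure, illustrating how short sentences with few polysyllabic words drive the grade estimates down.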
Use of these scores is recommended by the National Institutes of Health, the U.S. National Library of Medicine, and the Centers for Medicare and Medicaid Services to analyze the readability of health information published for the public (15, 16, 17). The FKGL was developed by the U.S. Navy for reading technical manuals and was validated by the U.S. Department of Defense (18). The FOG and SMOG indexes have been validated using standard McCall-Crabbs test grade-level reading lessons (18). Although no validation studies for the CLI could be found, this measure has been shown to have a moderate to strong correlation with the FKGL (19, 20).

Using previously described methodology, multiple readability formulae were used to improve the reliability of measures, with scores averaged to form a composite readability score for individual websites (18, 21, 22, 23, 24, 25, 26, 27, 28). A cutoff of an 8th-grade or lower reading level (≤ 8.9 grade level) was chosen to define readability.

Quality and Trustworthiness
Two study authors independently evaluated the quality of websites using JAMA Benchmark Criteria (Supplementary Table 1) (29). These criteria comprise: 1) a description of the website material's author (name, affiliation, and credentials); 2) attribution or references for the content; 3) currency, with dates of posts and updates provided; and 4) disclosure of any potential conflicts of interest. Each criterion fulfilled received 1 point, for a total of 4 points. Raters' scores were averaged to calculate a single score for each site. Using previously published definitions, sites with ≥ 3 JAMA criteria were classified as high quality, and sites with < 3 of these criteria were classified as low quality (30, 31, 32).

Two study authors independently evaluated sites using the National Library of Medicine (NLM) criteria (33). NLM criteria rate three features of website trustworthiness as 0, 1, or 2, based on currency or timeliness, publisher's authority, and accuracy of cited sources, for a maximum total of 6 points (Supplementary Table 2). Raters' scores were averaged to calculate a single score for each site. To be classified as trustworthy, websites needed ≥ 3 points, with no individual criterion receiving an average of 0.
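As a concrete illustration, the JAMA and NLM scoring rules described above can be sketched as follows (a minimal sketch; the function names and input format are hypothetical, not the study's actual instrument):

```python
def jama_quality(rater_scores):
    """Average raters' JAMA Benchmark scores (count of criteria met, 0-4 each).

    Per the definitions above, sites averaging >= 3 criteria are 'high quality';
    all others are 'low quality'."""
    avg = sum(rater_scores) / len(rater_scores)
    return avg, ("high quality" if avg >= 3 else "low quality")

def nlm_trustworthy(rater_criteria):
    """rater_criteria: one [currency, authority, accuracy] list per rater (0-2 each).

    A site is trustworthy if the averaged total is >= 3 points and no single
    criterion averages 0."""
    n = len(rater_criteria)
    per_criterion = [sum(r[i] for r in rater_criteria) / n for i in range(3)]
    total = sum(per_criterion)
    return total, (total >= 3 and all(c > 0 for c in per_criterion))
```

Note how a site can reach the 3-point NLM total yet fail the trustworthiness definition if any single criterion averages 0 across raters.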
The presence or absence of Health on the Net Foundation (HON) code of conduct certification was recorded for each website. HON is a Swiss not-for-profit organization originating from a September 1995 conference entitled “The Use of the Internet and World-Wide Web for Telematics in Health Care” (34, 35). It is affiliated with the University Hospital of Geneva and the Swiss Institute of Bioinformatics. HON certification (HONcode) requires a health or medical website to request a review. For submitted sites, an annual review of content based on eight principles is performed by an expert panel of medical professionals. To obtain HON certification, websites must conform to each of the principles of authority (author qualifications), complementarity (information supports and does not replace the physician-patient relationship), confidentiality (of site users and visitors), attribution (citing sources and dates of medical information), justification (justifying claims in a balanced and objective manner), transparency (accessible, valid contact details), financial disclosure, and advertising policy (clearly distinguishing advertising from editorial content) (34, 35).

Statistics
Prior studies found that 39–50% of websites provide accurate medical advice related to children's health, and that 21.5% (20–23%) of websites about pediatric disorders have high-quality JAMA Benchmark Criteria (36, 37, 38). Assuming 45% of studied websites were accurate, a study with at least 114 websites would be needed to detect a 25% difference (21.5% vs. 46.5%) in high-quality JAMA Benchmark Criteria rates between accurate and inaccurate sites. It was estimated that half of initially searched websites would be excluded (e.g., duplicates, physician oriented, paid sites, copies of newspaper articles). Thus, at least 228 websites would need to be included in an initial search.

All data were treated as nonparametric. Categorical data were compared between accurate (≥ 95% correct) and inaccurate websites using chi-squared analysis or Fisher's exact test. Pairwise comparisons of continuous and ordinal data were made using the Mann-Whitney U test. p Values were adjusted for multiple comparisons using the Benjamini-Hochberg method (39). Spearman rank order correlation was used to assess the relationship between website accuracy and rank order during the initial search.

Interrater agreement for initial website accuracy, JAMA Benchmark Criteria, and NLM Trustworthy scores was calculated using Cohen's kappa. A kappa coefficient was considered almost perfect at 0.81–1, substantial or good at 0.61–0.80, moderate at 0.41–0.60, fair at 0.21–0.40, slight at 0.01–0.20, and less than chance at < 0.
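For reference, Cohen's kappa for two raters' categorical labels, together with the interpretation bands described above, can be sketched as follows (an illustrative implementation, not the study's statistical software):

```python
from collections import Counter

def cohens_kappa(rater1, rater2):
    """Cohen's kappa: chance-corrected agreement between two raters."""
    n = len(rater1)
    observed = sum(a == b for a, b in zip(rater1, rater2)) / n
    c1, c2 = Counter(rater1), Counter(rater2)
    # Expected agreement if raters assigned labels independently at their
    # observed marginal rates.
    expected = sum(c1[label] * c2[label] for label in c1) / (n * n)
    return (observed - expected) / (1 - expected)

def interpret_kappa(k):
    """Interpretation bands as used in the Methods above."""
    if k >= 0.81: return "almost perfect"
    if k >= 0.61: return "substantial"
    if k >= 0.41: return "moderate"
    if k >= 0.21: return "fair"
    if k >= 0.01: return "slight"
    return "less than chance"
```

For example, two raters agreeing on 9 of 10 accurate/inaccurate labels can still yield a kappa well below 0.9 once chance agreement on the dominant category is subtracted.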
Data were analyzed using MedCalc statistical software (v18.2.1; MedCalc Software bvba, Ostend, Belgium, 2018).
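The Benjamini-Hochberg multiple-comparison adjustment referenced above can be sketched as a step-up procedure over the sorted p-values (illustrative only; MedCalc's implementation may differ in presentation):

```python
def benjamini_hochberg(pvals):
    """Return Benjamini-Hochberg adjusted p-values (step-up, monotone)."""
    m = len(pvals)
    order = sorted(range(m), key=lambda i: pvals[i])  # indices by ascending p
    adjusted = [0.0] * m
    running_min = 1.0
    for rank in range(m, 0, -1):  # walk from the largest p down to the smallest
        i = order[rank - 1]
        # Raw BH value p * m / rank, forced non-increasing from above.
        running_min = min(running_min, pvals[i] * m / rank)
        adjusted[i] = running_min
    return adjusted
```

A p-value of 0.005 among three comparisons, for instance, adjusts to 0.015, which is why several raw p-values in Table 3 remain nonsignificant after correction.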
Results
Each search engine was queried between December 1, 2018 and December 5, 2018, yielding 240 initial websites. Analysis and rating of websites took place between December 15, 2018 and January 19, 2019. During review, one website was no longer available and was analyzed using a website that periodically archives webpages throughout the year (40). Of the original 240 websites, 96 were excluded: 94 were duplicates and two were written for medical professionals (one addressed American Gastroenterological Association guidelines for nausea and vomiting, and one stated “This article is for Medical Professionals”). The remaining 144 websites were identified as being directed at parents, patients, or caregivers in the following manner: 130 used the term “your child” with diagnostic or management recommendations for parents/caregivers, eight instructed caregivers when to seek medical care, and six directly addressed a patient's symptoms (e.g., “if you feel sick,” “if you have a cough”). Website details are listed in Table 2.
Table 2. Website Information

Characteristic | Value |
---|---|
Website domain age∗ | 18.4 (10.4–20.8, IQR) |
Webpage age∗ | 1 (0.4–2, IQR) |
Site sponsor – professional medical organization | 74 (51%) |
 Hospital/hospital system | 45 |
 Government agency | 15 |
 Academic medical society | 14 |
Site sponsor – nonprofessional medical organization | 70 (49%) |
 Nonmedical professional | 54 |
 Health care professional | 16 |
Physician author | 52 (36%) |
No physician author | 92 (64%) |
 Nonphysician author | 17 |
 No author listed | 75 |
Unique websites for each query | |
 Child + fever | 32 (22%) |
 Child + cough | 40 (28%) |
 Child + vomiting | 34 (24%) |
 Child + stomach pain | 38 (26%) |

IQR = interquartile range.
∗ Age in years.
Median physician-rated accuracy of all websites was 96.4% (interquartile range [IQR] 90–100). Interrater agreement for the initial two-physician assessment of accurate vs. inaccurate websites was almost perfect (k = 0.94, 95% confidence interval [CI] 0.88–1.0). Five representative incorrect medical statements are listed for each complaint (Supplementary Table 3). Accurate websites (≥ 95% correct) were more frequently published by professional medical organizations than inaccurate websites (63% vs. 33%, p < 0.01). Readability (reading grade level), domain age, article/page age, article length (number of words), physician authorship, website query/complaint, HON certification, JAMA Benchmark Criteria, and NLM Trustworthy scores did not differ between accurate and inaccurate sites (Table 3).
Table 3. Comparison of Accurate vs. Inaccurate Websites

Features | Accurate n = 87 | Inaccurate n = 57 | p Value† |
---|---|---|---|
Domain age – years∗ | 17.2 (9.9–20) | 19.5 (12.6–21.1) | 0.390 |
Web article age – years∗‡ | 0.9 (0.3–2.3) | 1.1 (0.7–1.9) | 0.598 |
Web article length – words∗ | 917 (694–1302) | 964 (695–1423) | 0.652 |
Professional medical organization (hospital, academic medical society, government agency) | 55 (63.2%) | 19 (33.3%) | 0.007 |
Physician author or interview | 28 (32.2%) | 24 (42.1%) | 0.454 |
Composite reading grade – years∗ | 9.2 (8.4–11) | 9.1 (8–10.5) | 0.639 |
FKGL∗ | 8.7 (7.7–10.4) | 8.6 (7.3–10) | 0.778 |
SMOG∗ | 8.3 (7.4–9.7) | 8.1 (7.2–9.3) | 0.639 |
FOG∗ | 11.1 (10.1–13.1) | 10.9 (9.6–12.5) | 0.598 |
CLI∗ | 10 (9–11) | 9 (8–10) | 0.390 |
HON Foundation Certified | 27 (31%) | 22 (38.6%) | 0.598 |
JAMA Benchmark High Quality | 17 (19.5%) | 21 (36.8%) | 0.154 |
NLM Trustworthy | 22 (25.3%) | 22 (38.6%) | 0.329 |
Website medical query/complaint§ | | | 0.329 |
 Fever | 14 (9.7%) | 18 (12.5%) | |
 Cough | 23 (16%) | 17 (11.8%) | |
 Vomiting | 23 (16%) | 11 (7.6%) | |
 Stomach pain | 27 (18.8%) | 11 (7.6%) | |

FKGL = Flesch Kincaid Grade Level; SMOG = Simple Measure of Gobbledygook; FOG = Gunning Fog Index; CLI = Coleman-Liau Index; HON = Health on the Net; JAMA = Journal of the American Medical Association; NLM = National Library of Medicine.
∗ Median (interquartile range) for continuous and ordinal data.
† p Values corrected for multiple comparisons using the Benjamini-Hochberg adjustment (39).
‡ Includes the most recent date the page was published or updated; 27 accurate sites and 16 inaccurate sites did not have a publication or update date and were not included in this comparison.
§ Percentages signify the number of websites with each medical complaint and accuracy status divided by the total number of websites (n = 144).
The median readability grade levels for all websites were as follows: Composite of all sites 9.2 years (IQR 8.3–10.7), FKGL 8.6 years (IQR 7.6–10.2), SMOG 8.2 years (IQR 7.3–9.7), FOG 10.9 years (IQR 9.9–12.6), and CLI 9 years (IQR 8.3–10.7). Sixty websites (42%) were written at or below an 8th-grade level, and nine (6%) were written at or below a 6th-grade level.
Thirty-eight websites (26.4%) had high-quality JAMA Benchmark criteria. Forty-four websites (30.6%) were graded as trustworthy (NLM trustworthy score). Interrater agreement was good for determination of high- vs. low-quality JAMA Benchmark Criteria (k = 0.68, 95% CI 0.55–0.81) and good for determination of NLM trustworthiness (k = 0.66, 95% CI 0.51–0.81).
Forty-nine websites (34%) were HONcode certified. HON-certified websites more frequently had high-quality JAMA Benchmark Criteria (50% vs. 28.3%, p = 0.02) and NLM Trustworthy scores (63.6% vs. 21%, p < 0.01) than sites without HON certification.
Overall, seven websites (5%) were readable and met each of the definitions for accuracy, NLM trustworthiness, and JAMA high quality.
There was no correlation between accuracy and website rank order during the initial search, Spearman rho −0.046 (95% CI −0.208–0.118, p = 0.49).
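The Spearman rank order correlation reported here can be sketched as follows (a tie-aware illustrative implementation; the study used MedCalc for its analyses):

```python
def spearman_rho(x, y):
    """Spearman rank correlation: Pearson correlation of the average ranks."""
    def ranks(v):
        # Assign ranks 1..n, averaging ranks across tied values.
        sv = sorted(range(len(v)), key=lambda i: v[i])
        r = [0.0] * len(v)
        i = 0
        while i < len(sv):
            j = i
            while j + 1 < len(sv) and v[sv[j + 1]] == v[sv[i]]:
                j += 1
            avg = (i + j) / 2 + 1
            for k in range(i, j + 1):
                r[sv[k]] = avg
            i = j + 1
        return r

    rx, ry = ranks(x), ranks(y)
    n = len(x)
    mx, my = sum(rx) / n, sum(ry) / n
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den
```

A rho near zero, as found here, means a website's position in the search results carried essentially no information about its accuracy.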
Discussion
To the best of our knowledge, our study is the first to evaluate the adequacy of parent-directed health information regarding pediatric emergency complaints on the Internet. Our results indicate that many such websites provide inadequate information. Although overall median website accuracy was 96%, we found that 40% of websites were inaccurate based on our predefined cut-off of 95% correct items. Only publication by a professional organization (government, hospital/hospital system, or academic society) was associated with website accuracy. The majority of websites did not meet criteria for high quality and trustworthiness, and most websites were written at a reading level above that of the average parent. Only seven websites (5%) met each of our definitions for readability, accuracy, and quality/trustworthiness, indicating the potential need for improved oversight, rating, or regulation of parent- and consumer-oriented health information websites.
With its widespread use, experts have expressed concern about the quality of health information on the web (4, 5). In 2002, Eysenbach et al. evaluated 55 prior studies of health information on the Internet; 70% of the analyzed studies indicated that information quality was a problem (4). Ideally, search engine results should directly answer a specific medical question in the most trustworthy, accurate, and understandable fashion. However, the algorithms used by search engines are proprietary and involve more than these features. Popularity, number of links, link traffic, importance of links, advertisements, and other unknown features play a role in the ranking of search engine results (41). The role that accuracy, quality, trustworthiness, and readability of health information play in a search engine's results is unknown.

A parent's decision to visit their doctor or an ED is multifactorial and includes convenience, referrals, insurance status, time of day, concern about the severity of an illness, and the need for a second opinion (42). Of parents visiting clinics and EDs, 94–99% have Internet access, and 80–88% have smartphones (41, 43, 44, 45, 46). There is evidence that Internet use influences a parent's decision about whether or not to seek care in an ED. Shroff et al. found that 29% of parents were more certain and 19% were less certain that they needed to visit an ED after searching the Internet (3). Experts have noted that parents regard online information as more up to date, easy to access, and often trustworthy, especially if that information is from a hospital or university website (13, 43, 46, 47). The fact that websites are frequently accessed and often trusted has important implications regarding health care information dissemination and actions by parents of children with acute medical complaints.

Experts have found that parents with low health literacy are less likely to access the Internet than those with high health literacy (43, 48). It is possible that this important group of parents would not be affected by information on the Internet. Moreover, readers with lower overall literacy have poorer short-term memory and more difficulty understanding text on websites. Individuals with limited literacy read more slowly, often reread text, and may miss sections on webpages (49). Independent of reading grade level, formatting (font, text size, line spacing) and length of text (number of words) may influence the readability and comprehension of information on websites. Other than length of text, these website features were not assessed in our study.
Higher-quality JAMA benchmark criteria and higher NLM trustworthy scores in HON-certified sites in our study indicate that this certification may be a useful indicator of quality. However, HON certification is voluntary, and most health information websites have not undergone a HON review. For this reason, most consumers of health information cannot rely solely on this measure. Although HON certification addresses quality, transparency, and trustworthiness of information, it does not measure accuracy of information.
Multiple potential solutions exist to address the types of inadequate health information found in our study. One solution would be to compel Internet and social media companies to censor incorrect information. Lawmakers have been partially successful in calling on Internet and social media companies such as Amazon, Google, Facebook, Instagram, Pinterest, and Twitter to eliminate incorrect information regarding vaccines (50, 51, 52). However, these efforts involved years of discussion and concerned only a single issue: the safety of childhood vaccines. Another approach might be to improve the rating of websites, replacing or augmenting HON certification with American College of Emergency Physicians or American Academy of Pediatrics designees who assess the accuracy of websites, while adding website requirements for an appropriate readability level and specific NLM trustworthy and JAMA quality criteria. From a practical standpoint, this might prove difficult, as the search for each query in this study returned over a million results. Alternately, specialty societies could create accurate, readable, high-quality online sources of health-related information for individual medical complaints. This information, in turn, could be given the highest “rank” by search engines so that it appears first during search queries.

Limitations
This study did not directly evaluate websites any parent accessed prior to visiting a physician's office or ED. Measuring previsit Internet use would require a survey of parents on arrival to an office or ED. This measurement is problematic because it would require parents to remember their exact query terms, the rank order of results, and the exact sites that were visited within a specified time period prior to their visit. One survey of parents who accessed the Internet 24 h prior to an ED visit found that over one-third of parents could not remember any visited website (
3
). Thus, relying on a parent's memory to analyze visited websites might be inaccurate. Without direct monitoring and measuring of Internet activity and parent behavior, it would not be possible to know which websites were viewed, what text within sites was read, what text was understood, and what effect information on these sites had on parent behavior.Only the first 20 results were analyzed for each search engine query in this study. This number was chosen because the default number of results for common search engines (Google, Yahoo, Bing) is 10, and online surveys indicate that 80% of Internet searches are ended within the first two pages of any search (
53
). We chose to study four of the five most common nontrauma complaints (59.3% of all nontrauma complaints) in children presenting to EDs (iProspect. Search engine user behavior study.
http://district4.extension.ifas.ufl.edu/Tech/TechPubs/WhitePaper_2006_SearchEngineUserBehavior.pdf
Date accessed: January 30, 2019
7
). These four complaints are also within the top 10 reasons parents search the Internet for health care information (46
). It is uncertain whether alternate complaints or search terms would have yielded different results.
The majority of parents visiting a pediatric clinic or ED read at or below an 8th-grade level (54). For this reason, we chose an 8th-grade level as the cutoff for readability. Independent of text readability, other factors not analyzed in this study may influence a website's reading ease and understandability, including the organization of the information, page design, graphics, typesetting, pictures, and cultural relevance.
We chose a previously described criterion, ≥ 95% correct information, to classify websites as accurate vs. inaccurate, based on a single study using this cutoff (10). It can be argued that this cutoff is arbitrary and that any cutoff < 100% might place patients at risk for harm. Realistically, an accurate website (≥ 95% correct) with a single incorrect important fact (e.g., use of a temperature cutoff of 38.5°C/101.3°F to define neonatal fever) might lead to patient harm, whereas an inaccurate website (< 95% correct) with multiple minor, unimportant incorrect facts (e.g., an incorrect appendix size on computed tomography) might cause no harm. Our study weighted facts equally regardless of importance; consequently, websites with the same level of accuracy may differ in their potential to prompt parent actions harmful to a child's health.
Readability scores were originally developed for analyzing technical manuals and educational material, not necessarily health-related information (18, 28). Prior authors have commented that medical terminology can inflate the grade level of text (55, 56). However, Berland et al. found that removing medical terminology lowered the reading level of health information on studied websites by only 0.3 years (57). Others have found that the readability of patient education material can be improved by up to three grade levels by substituting for multisyllabic words, adapting sentence structure, and shortening sentences (58, 59, 60). Future work on the readability of parent education materials should incorporate this editing process while also testing whether parents comprehend the prepared material.
Conclusion
In our study, only 60% of websites directed at parents of acutely ill children were categorized as accurate. The majority of websites were of low quality, had low trustworthiness scores, and were written at a grade level too difficult for most parents to understand. Inadequate and inaccurate information has the potential to adversely influence the medical decisions parents make. Measures should be taken to ensure that Internet information related to acute pediatric complaints is accurate, readable, high quality, and trustworthy.
Acknowledgments
The authors would like to thank Joe Pagane, MD, for assistance with arbitrating questions of website accuracy.
Supplementary Data
- Supplemental Table 1
- Supplemental Table 2
- Supplemental Table 3
References
- Trends & more (statistics). http://www.internetlivestats.com/statistics/ Date accessed: January 30, 2019
- Dr Google in the ED: searching for online health information by adult emergency department patients.Med J Aust. 2018; 209: 342-347
- Internet usage by parents prior to seeking care at a pediatric emergency department: observational study.Interact J Med Res. 2017; 28: e17
- Empirical studies assessing the quality of health information for consumers on the world wide web: a systematic review.JAMA. 2002; 287: 2691-2700
- Quality of patient health information on the internet: reviewing a complex and evolving landscape.Australas Med J. 2014; 7: 24-28
- The top 500 sites on the web. https://www.alexa.com/topsites/category/Computers/Internet/Searching/Search_Engines Date accessed: January 30, 2019
- A system for grouping presenting complaints: the pediatric emergency reason for visit clusters.Acad Emerg Med. 2005; 12: 723-731
- Overview of pediatric emergency department visits, 2015. Statistical brief #242. Healthcare Cost and Utilization Project (HCUP). Agency for Healthcare Research and Quality, Rockville, MD; 2018. https://hcup-us.ahrq.gov/reports/statbriefs/sb242-Pediatric-ED-Visits-2015.jsp Date accessed: January 30, 2019
- Domain age checker. https://smallseotools.com/domain-age-checker/ Date accessed: January 19, 2019
- Can consumers trust web-based information about celiac disease? Accuracy, comprehensiveness, transparency, and readability of information on the internet.Interact J Med Res. 2012; 1: e1
- Quality of web-based information for the 10 most common fractures.Interact J Med Res. 2016; 17: e19
- A readability assessment of online stroke information.J Stroke Cerebrovasc Dis. 2014; 23: 1362-1367
- Readability assessment of Internet-based consumer health information.Respir Care. 2008; 53: 1310-1315
- A computer readability formula designed for machine scoring.J Appl Psychol. 1975; 60: 283-284
- TOOLkit for making written material clear and effective: section 4 – Special topics for writing and design. Part 7 – using readability formulas: a cautionary note.
- How to write easy-to-read health materials. https://medlineplus.gov/etr.html Date accessed: April 29, 2019
- Making health communications programs work. https://www.cancer.gov/publications/health-communication/pink-book.pdf Date accessed: April 29, 2019
- The use of readability formulas in health care.Psychol Health Med. 1996; 1: 7-28
- Is Wikipedia a reliable learning resource for medical students? Evaluating respiratory topics.Adv Physiol Educ. 2015; 39: 5-14
- Inflammatory bowel disease: an evaluation of health information on the internet.World J Gastroenterol. 2017; 23: 1676-1696
- Readability of sports injury and prevention patient education materials from the American Academy of Orthopaedic Surgeons website.J Am Acad Orthop Surg Glob Res Rev. 2018; 2: e002
- Assessing the readability and patient comprehension of rheumatology medicine information sheets: a cross-sectional health literacy study.BMJ Open. 2019; 9: e024582
- Dr. Google: the readability and accuracy of patient education websites for Graves’ disease treatment.Surgery. 2017; 162: 1148-1154
- Readability of online patient educational resources found on NCI-designated cancer center web sites.J Natl Compr Canc Netw. 2016; 14: 735-740
- Readability assessment of American shoulder and elbow surgeons patient brochures with suggestions for improvement.JSES Open Access. 2018; 2: 150-154
- Online patient resources for breast reconstruction: an analysis of readability.Plast Reconstr Surg. 2014; 134: 406-413
- Readability of the 100 most-cited neuroimaging papers assessed by common readability formulae.Front Hum Neurosci. 2018; 12: 308
- How consistent are the best-known readability equations in estimating the readability of design standards?.IEEE Trans Prof Commun. 2017; 60: 97-111
- Assessing, controlling, and assuring the quality of medical information on the internet. Caveat lector et viewor – let the reader and viewer beware.JAMA. 1997; 277: 1244-1245
- Accuracy of internet recommendations for prehospital care of venomous snake bites.Wilderness Environ Med. 2010; 21: 298-302
- Breast cancer on the World Wide Web: cross sectional survey of quality of information and popularity of websites.BMJ. 2002; 324: 577-581
- Information retrieval in respiratory care: tips to locate what you need to know.Respir Care. 2004; 49: 389-399
- Evaluating health websites. https://nnlm.gov/initiatives/topics/health-websites Date accessed: January 30, 2019
- Health on the net’s 20 years of transparent and reliable information.Stud Health Technol Inform. 2016; 228: 700-704
- Twenty years of health on the net: committed to reliable information.Stud Health Technol Inform. 2016; 225: 738-740
- The variation in quality and content of patient-focused health information on the internet for otitis media.Child Care Health Dev. 2018; 44: 221-226
- Perthes disease: the quality and reliability of information on the internet.J Pediatr Orthop. 2015; 35: 530-535
- Googling children’s health: reliability of medical advice on the internet.Arch Dis Child. 2010; 95: 580-582
- Multiple comparisons.Handbook of biological statistics. 3rd edn. Sparky House Publications, Baltimore, MD; 2014: 257-263
- Internet Archive. Wayback Machine. https://archive.org/web/ Date accessed: January 19, 2019
- Cell phone and computer use among parents visiting an urban pediatric emergency department.Pediatr Emerg Care. 2018; 34: 878-882
- Exploring parents’ reasons for attending the emergency department for children with minor illnesses: a mixed methods systematic review.Emerg Med J. 2019; 36: 39-46
- Health information preferences of parents in a pediatric emergency department.Clin Pediatr. 2018; 57: 519-527
- Digital inequality and developmental trajectories of low-income, immigrant, and minority children.Pediatrics. 2017; 140: S132-S136
- Internet access and electronic communication among families in an urban pediatric emergency department.Pediatr Emerg Care. 2012; 28: 553-557
- Are parents getting it right? A survey of parents’ internet use for children’s health care information.Interact J Med Res. 2015; 4: e12
- Information needs of parents for acute childhood illness: determining ‘what, how, where and when’ of safety netting using a qualitative exploration with parents and clinicians.BMJ Open. 2014; 4: e003874
- Health literacy and health information technology adoption: the potential for a new digital divide.J Med Internet Res. 2016; 18: e264
- Department of Health and Human Services. Tip 6. Be cautious about using readability formulas. https://www.ahrq.gov/professionals/quality-patient-safety/talkingquality/resources/writing/tip6.html Date accessed: January 19, 2019
- Facebook, facing lawmaker questions, says it may remove anti-vaccine recommendations. Bloomberg News. 2019. https://www.bloomberg.com/news/articles/2019-02-14/facebook-says-it-may-remove-anti-vaccine-recommendations Date accessed: May 1, 2019
- AMA urges Amazon, Facebook, Google and Twitter to do more to limit false anti-vaccine claims. Modern Healthcare. 2019. https://www.modernhealthcare.com/safety/ama-urges-amazon-facebook-google-and-twitter-do-more-limit-false-anti-vaccine-claims Date accessed: May 1, 2019
- Pinterest is blocking search results about vaccines to protect users from misinformation. Washington Post.
- iProspect. Search engine user behavior study. http://district4.extension.ifas.ufl.edu/Tech/TechPubs/WhitePaper_2006_SearchEngineUserBehavior.pdf Date accessed: January 30, 2019
- Parental literacy level and understanding of medical information.Pediatrics. 1998; 102: e25
- Patient-oriented methotrexate information sites on the Internet: a review of completeness, accuracy, format, reliability, credibility, and readability.J Rheumatol. 2009; 36: 41-49
- Using alternative methodologies for evaluating medication leaflets.Patient Educ Couns. 2002; 47: 29-35
- Health information on the internet. Accessibility, quality, and readability in English and Spanish.JAMA. 2001; 285: 2612-2621
- Readability assessment of online patient education materials provided by the European Association of Urology.Int Urol Nephrol. 2017; 49: 2111-2117
- Improving readability of patient education materials.J Commun Health Nurs. 2000; 17: 15-23
- Improving the readability of online foot and ankle patient education materials.Foot Ankle Int. 2014; 35: 1282-1286
Article info
Publication history
Published online: September 24, 2019
Accepted: June 4, 2019
Received in revised form: May 5, 2019
Received: February 20, 2019
Copyright
© 2019 The Authors. Published by Elsevier Inc.
User license
Creative Commons Attribution – NonCommercial – NoDerivs (CC BY-NC-ND 4.0)
Linked Article
- Search at Your Own Risk: Online Health Queries and Your Patient. Journal of Emergency Medicine. Vol. 57, Issue 4.