Big Data in Context: Addressing the Twin Perils of Data Absenteeism and Chauvinism in the Context of Health Disparities Research
Journal of Medical Internet Research 2020 • View on jmir.org
Statistical principles dictate that the solidity of conclusions arrived at by a clinical investigation depends crucially on the number of participants studied. One of the greatest challenges to a scientist who studies living human beings lies in enrolling enough participants to make the study results significant (unlike a laboratory researcher who may have at their disposal hundreds of rats, thousands of fruit flies, millions of bacteria or billions of molecules). The issue of inadequate numbers of research participants is sometimes addressed by combining the patient data from a number of smaller studies. Since no two investigations are exactly alike, aggregating data from disparate studies often comes down to the equivalent of some fancy juggling of apples and oranges (and sometimes watermelons and grapes). This commentary warns of the hazards of mushing data together, assuming that the input contributed by individual studies is similar enough and of high enough quality to generate valid results. Furthermore, the aggregation process itself can lend a false sense of security that is blind to the risk of leaving out whole segments of the population that big data are supposed to capture.
Social Scientists Battle Bots to Glean Insights Online
Nature 2020 • View on nature.com
If you’re looking to understand the chatter about medical topics that pervades almost all social media, be careful. Bot-driven messages, generated by machines, can swamp postings authored by real people, filling social media platforms with misinformation (about, for example, the safety of e-cigarettes or the danger of vaccination), swaying a frightening number of people to adopt unsafe practices and fooling researchers about what’s really going on in cyberspace. The authors of this piece recommend a quick education in bot detection for anybody who does medical or social science research.
Commercial Influences on Electronic Health Records and Adverse Effects on Clinical Decision Making
JAMA Internal Medicine 2020 • View on jamanetwork.com
In the days before electronic medical records (but after color television) I was surprised to find, at a practice I’d just joined, that the pages on which I hand-wrote notes of patient encounters had pharmaceutical ads running along the bottom. One purveyor of drugs or another was happy to supply us with this paper which, in those days, would have cost a few dollars a ream for blank sheets. As everybody who is online today knows (which is just about everybody) advertising is insinuated into a multitude of electronic nooks and crannies, including in places where it really doesn’t belong. Advertising targeted by computers can be far more persuasive and subtle than those simple messages and logos that I found printed across the bottom of my patient progress notes.
Last year Practice Fusion, an EMR vendor, paid $145 million in criminal and civil fines for accepting kickbacks from a narcotic manufacturer to embed in the record’s automated pain protocol a low threshold for recommending long-acting opioids to treat less-than-severe pain. Over-prescription of narcotics is a perversion of standard medical practice, largely responsible for the huge opioid epidemic the US has suffered for years. Healthcare is no exception to the threat of industry-sponsored misinformation, especially under our profit-driven system.
A global review of publicly available datasets for ophthalmological imaging: barriers to access, usability, and generalizability
Lancet 2021 • View on thelancet.com
The authors of this report scoured the medical literature to find, in 94 online databases, over 500,000 still images of retinas from 122,000 patients. Almost all of the datasets originated in Asia, North America and Europe. There were lots of examples of diabetic retinopathy, macular degeneration and glaucoma, eye diseases most common in the developed world, with underrepresentation of the sorts of ocular problems that afflict citizens of countries with less developed economies. Data on age, sex and ethnicity were often missing in the databases, making it that much harder to assess the applicability of these retinal images to particular populations.
An algorithmic approach to reducing unexplained pain disparities in underserved populations
Nature Medicine 2021 • View on nature.com
An artificial intelligence program was trained to “look at” knee x-rays and predict levels of pain, taking into account the race, income and education level of the patient. The program’s predictions outperformed the old simple checklists that were normed (as so many medical protocols have been) on mostly middle-class white folks. The old system too often led practitioners to underestimate the suffering of people of color or of lesser privilege. We all know that the experience of pain is much more than physical. It depends on culture, context and the individual. In so many realms, not just medical, AI can serve as a thin excuse for treating certain classes of people less well than they deserve. On the other hand, as this report demonstrates, if applied properly, these methods can be powerful tools for taking into account the uniqueness of different groups so as to better tailor their care.