Category Archives: Education

ChatGPT And Your Research Paper

Generative artificial intelligence (AI) is the newest shiny toy. The best-known example, ChatGPT, burst onto the scene in November 2022 and caught most of us off guard. The earliest versions were interesting and showed great promise for a variety of applications.

The easiest way to think about this technology is to compare it to the auto-complete feature in your search engine. When you start to type a query, the search engine will show a list of commonly entered queries that begin the same way.

Generative AI does the same thing, just on a vastly expanded scale. It looks at the prompt the user posed and generates its best guess at a continuation based on the huge amount of data it has trained on. As the algorithms get better and the training data more extensive, the AI continues to improve.
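The auto-complete analogy above can be sketched in a few lines of code. This is only a toy illustration of the predict-the-next-word idea; real generative AI uses neural networks trained on billions of documents, and the tiny corpus and function names here are invented for the example.

```python
from collections import Counter, defaultdict

def train(corpus):
    """Count which word follows each word in a tiny training corpus."""
    model = defaultdict(Counter)
    for sentence in corpus:
        words = sentence.lower().split()
        for current, nxt in zip(words, words[1:]):
            model[current][nxt] += 1
    return model

def predict_next(model, word):
    """Return the most frequently observed word after `word`, if any."""
    followers = model.get(word.lower())
    if not followers:
        return None
    return followers.most_common(1)[0][0]

corpus = [
    "the patient was taken to the operating room",
    "the patient was discharged home",
    "the patient was taken to radiology",
]
model = train(corpus)
print(predict_next(model, "was"))      # "taken" (seen twice vs. once)
print(predict_next(model, "patient"))  # "was"
```

Note that the model simply echoes the most common pattern in its training data. This is also why fluency is no guarantee of accuracy: the output reflects what usually follows, not what is true.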

However, there are drawbacks. Early iterations of ChatGPT demonstrated that the AI could very convincingly generate text that didn’t match reality. There are several very public cases of this. One online sports reporting service used AI extensively and found that nearly 50% of the articles it produced were inaccurate. A lawyer submitted a legal brief prepared by AI without checking it; the judge determined that the cases cited in the brief were entirely fabricated by the algorithm.

One of the most significant controversies facing academics and the research community is how to harness this valuable tool reliably. College students have been known to submit papers partially or fully prepared by AI. There is also concern that it could be used to write parts or all of research papers for publication. Some enterprising startups have developed tools for spotting AI-generated text. And most journals have adopted language in their guidelines requiring authors to disclose their use of these tools.

Here’s some sample language from the Elsevier Guide for Authors on their website:

“Where authors use generative artificial intelligence (AI) and AI-assisted technologies in the writing process, authors should only use these technologies to improve readability and language. Applying the technology should be done with human oversight and control, and authors should carefully review and edit the result, as AI can generate authoritative-sounding output that can be incorrect, incomplete, or biased. AI and AI-assisted technologies should not be listed as an author or co-author, or be cited as an author. Authorship implies responsibilities and tasks that can only be attributed to and performed by humans, as outlined in Elsevier’s AI policy for authors.

Authors should disclose in their manuscript the use of AI and AI-assisted technologies in the writing process by following the instructions below. A statement will appear in the published work. Please note that authors are ultimately responsible and accountable for the contents of the work.

Disclosure instructions
Authors must disclose the use of generative AI and AI-assisted technologies in the writing process by adding a statement at the end of their manuscript in the core manuscript file, before the References list. The statement should be placed in a new section entitled ‘Declaration of Generative AI and AI-assisted technologies in the writing process’.

Statement: During the preparation of this work the author(s) used [NAME TOOL / SERVICE] in order to [REASON]. After using this tool/service, the author(s) reviewed and edited the content as needed and take(s) full responsibility for the content of the publication.”

But apparently, some researchers don’t heed the guidelines. Here is a snippet of text from a case report published in an Elsevier journal earlier this year. I’ve highlighted the suspicious text at the end of the discussion section.

It appears the author tried to use AI to generate this text, but its disclaimer logic was triggered. This represents a failure on many levels:

  • The authors apparently used AI to generate one or more parts of their manuscript.
  • The authors did not thoroughly proofread the final copy before submission to the journal.
  • The authors did not disclose their use of AI as required.
  • The editors and reviewers for the journal did not appear to read it very closely either.
  • And it’s now in the wild for everyone to see.

The big question is, how reliable is the rest of the text? How much is actually just a bunch of stuff pulled together by generative AI, and how accurate is that?

Bottom line: First, use generative AI responsibly. Some excellent tools are available that allow researchers to gather a large quantity of background information and quickly analyze it. However, it ultimately needs to be fact-checked before being used in a paper.

Second, refrain from copying and pasting AI material into your paper. After fact-checking the material, write your own thoughts independent of the AI text.

Finally, if you do use AI, be sure to follow the editorial guidelines for your journal. Failure to do so may result in your being banned from future submissions!

Reference: Successful management of an iatrogenic portal vein and hepatic artery injury in a 4-month-old female patient: a case report and literature review. Radiology Case Reports 19:2106-2111, 2024.

Creating A Virtual RTTDC Course

The Rural Trauma Team Development Course (RTTDC) was introduced by the American College of Surgeons (ACS) to improve the care of trauma patients in rural communities. It is a staple of education for Level III and IV trauma centers in rural areas. Like everything else, most courses were shut down by the COVID-19 pandemic.

Conemaugh Memorial Medical Center in Johnstown, Pennsylvania, polled its local referral hospitals and discovered that the majority felt a significant need for continuing, in-person education that was not being met. This need, coupled with the observation of an increased number of opportunities for improvement in patients transferred to them, led them to consider adapting the RTTDC to a virtual format so the course could continue.

Since RTTDC is a product of the ACS, it is no simple matter to change it in any way. The trauma program worked with the ACS to obtain permission to modify the course. Speakers with expertise in their topics recorded all of the lectures, which contained embedded questions to be answered using Zoom’s polling feature.

The most challenging adaptation was simulation development for the hands-on portions of the course. These were painstakingly recorded on video in a simulation laboratory and incorporated into the lecture material.

Preregistration was brisk, and 41 participants signed up for the course. The format consisted of a lecture with live discussion and participant questions, followed by a simulation video moderated by the course director. All questions were answered before moving on to the next module.

Several positive changes were noted in the months following the course:

  • Many facilities purchased additional equipment, such as traction splints, pelvic binders, and blood warmers.
  • Some hospitals began acquiring tranexamic acid and prothrombin concentrate.
  • One facility modified its radiographic imaging policy.
  • All hospitals tightened their performance improvement processes and began to identify more opportunities for improvement.

Of course, some downsides were also identified:

  • Production of the course was very intensive and administratively challenging.
  • There was the possibility of teleconferencing hardware/software failure.
  • It was difficult for the presenters to “read the audience” because of the Zoom headshot.
  • Truly interactive discussions were difficult to achieve.

Bottom line: This is a creative example of a rural trauma center identifying regional needs and developing an innovative solution during the pandemic. Despite the amount of work needed to pull it off, the results were very positive. Although the course is ideally delivered in person, this may not be feasible in some very remote areas.

Hopefully, the ACS will recognize this work and the need for this format, and create an official virtual version to help spread the word to all rural trauma centers.

Reference: Virtual Rural Trauma Team Development Course: Trying To Zoom In On A Solution. J Trauma Nursing 20(3):186-190, 2023.

You’ve Been Pimped! Origins And How To Survive It!

What exactly is pimping? If you have ever been a medical student or resident in any discipline, you probably already know. It’s ostensibly a form of Socratic teaching in which an attending physician poses a (more or less) pointed question to one or more learners. The learners are then queried (often in order of their status on the seniority “totem pole”) until someone finally gets the answer. But typically, it doesn’t stop there. Frequently, the questioning progresses to the point that only the attending knows the answer.

So how did this time-honored tradition in medical education come about? The first reference in the literature attributes it to none other than William Harvey, who first described the circulatory system in detail. He was disappointed with his students’ apparent lack of interest in learning about his area of expertise. He was quoted as saying “they know nothing of Natural Philosophy, these pin-heads. Drunkards, sloths, their bellies filled with Mead and Ale. O that I might see them pimped!”

Other famous physicians participated in this as well. Robert Koch, the founder of modern bacteriology, actually recorded a series of “pümpfrage” or “pimp questions” that he used on rounds. And in 1916, a visitor at Johns Hopkins noted that he “rounded with Osler today. Riddles house officers with questions. Like a Gatling gun. Welch says students call it ‘pimping.’ Delightful.”

So it’s been around a long time. And yes, it has some problems. It promotes hierarchy, because the attending almost always starts questioning at the bottom of the food chain. So the trainees come to know their standing in the eyes of the attending, and they can also see how their fund of (useful?) knowledge compares with that of their “peers.” It demands quick thinking and can certainly create stress. And a survey published last year showed that 50% of respondents were publicly embarrassed during their clinical rotations. What portion of this might have been due to pimping was not clear.

Does pimping work? Only a few small studies have been done. Most medical students have been involved with and embarrassed by it. But they also responded that they appreciated it as a way to learn. A 2011 study compared pimping (Socratic) methods to slide presentations in radiology education. Interestingly, 93% preferred pimping, stating that they felt their knowledge base improved more when they were actively questioned, regardless of whether they knew the answer.

So here are a few guidelines that will help make this technique a positive experience for all:

For the “pimpers”:

  • Make sure that the difficulty level of questions is reasonable. You are testing your learners’ knowledge, not spotlighting your own mental encyclopedia.
  • Build the level of difficulty from questions that most can answer to one or two that no one knows, then switch to didactic teaching of the esoterica.
  • Don’t let one learner dominate the answers; gently exclude them and solicit answers from others so they get a chance to participate.
  • Provide positive reinforcement for correct answers, but don’t resort to negative reinforcement (insults) when they are wrong.
  • Go Socratic when the answer is not known. Step back and review the basic concepts involved to help your learners arrive at the correct answer.

For the “pimpees”:

  • Read, read, read! You are in this to learn, so study all the clinical material around you.
  • Talk to your seniors to find out your attending’s areas of interest. There’s a lot of stuff to learn, and this may help you focus your rounding preparation a bit. It still doesn’t absolve you from learning about all the other stuff, though.
  • Don’t be “that guy (or gal)” who tries to dominate and answer every question
  • If all else fails, and it’s one of those “percentage” questions, use my “85/15 rule.” If the issue you are being asked about seems pretty likely, answer “85%.” If it seems unlikely, go with “15%.” It’s usually close enough to the real answer to satisfy.

Bottom line: Pimping is a time-honored tradition in medicine, but it should not be considered a rite of passage. Carried out properly, it can make a real difference in attitudes and learning. Even attendings have a thing or two to learn about this!

Reference: The art of pimping. JAMA. 262(1):89-90, 1989.

You’ve Been Pimped!

You know what I’m talking about. It’s a mainstay of medical education for physicians. It starts in medical school and generally never stops. And when you finish your residency, you graduate from being pimped to being the pimper.

How did this all come to be? Is it good for education? Bad? Tune in tomorrow to learn more. In the meantime, enjoy this algorithm on how to get through a pimping session.


Source: Posted by Dr. Fizzy on The Almost Doctor’s Channel

Best Of AAST #14: Trauma Patient Health Literacy

When was the last time this happened to you? You are called to the ED for a trauma activation. The patient was involved in a motorcycle crash and is doing fine, but he has a large midline scar on his abdomen. You inquire about it. He tells you that he was involved in another motorcycle crash about five years ago and needed an operation. When questioned about what his injuries were and what was done, he has no idea.

This is an example of health (il)literacy at its best. An earlier study from the Presley trauma center in Memphis demonstrated that less than half of their trauma patients could correctly recall their injuries or their operations.

This is not really surprising. Have you ever taken a minute to look at the sheaf of papers given to hospital patients when they are discharged? It is usually computer-generated gobbledygook that is not easily understood by any human on this earth. It is hard enough to figure out the discharge medications and follow-up visits. And any diagnosis or surgical procedure information is never in patient-friendly language.

The Memphis group designed a simple discharge information form to provide to their patients.

Here are the factoids:

  • Patients admitted to the trauma service over a 6-month period were studied and surveyed during their first post-discharge clinic visit
  • A total of 153 surveys were distributed, asking about income, education, and patient satisfaction and their understanding of what happened to them; 146 were returned
  • Income levels were low, with about 60% of them less than $25K and 85% less than $50K
  • About 75% had a high school education or less
  • Implementation of the form increased recall of some or all injuries from 55% to 85%, and recall of operations from 43% to 76%
  • The number of patients who could recall any of their providers’ names increased from 11% to 31% (!)
  • Injury understanding, satisfaction with injury understanding, and the overall impact on hospitalization were all significantly positive

The authors concluded that introducing this simple form dramatically improved their patients’ health literacy, and their patients were able to provide more details to providers they visited post-discharge.

Here are my comments: I think the bottom line here is to know your patients! Socioeconomic and education status vary dramatically by geographic location. This certainly has an impact on the understanding and recall of hospital events by our patients. It can help us optimize processes to provide meaningful and important information that they need to know in the future.

The form used in this study was very simple, consisting of a series of blanks to be filled in by a healthcare provider. But who was this provider? All medical professionals tend to use the lingo we learned in training, but our patients have zero understanding of it. Consider the lowly Foley catheter. Tell a patient you are going to insert one, and they will say “uh-huh.” But tell them that you are preparing to stick a big rubber tube in their penis, and the response will be much more vocal. Make sure the language is simple and lingo-free.

The recall of provider names improved only modestly. This may be due to the typical “interchangeable head” model, in which the various healthcare professionals change on a frequent basis. Additionally, patients are seen by a horde of nurses, physicians, APPs, residents, techs, and others during their stay, so it’s easy to forget a name.

Overall, the results were very promising. This is a significant advance in patient health education and literacy. I think the next step is to provide a library of information sheets based on the common injury diagnoses and operations that occur at the trauma center. This, coupled with a more intelligible set of discharge papers in general, will be of great help to our patients.

Here are my questions for the presenter and authors:

  • Why so few surveys? Your center is very busy, and the study data only involved about 25 patients per month. How did you select them, and might information obtained from all the other patients have changed your results?
  • Did you independently review the discharge forms to ensure understandable language? The intelligibility could vary significantly based on the provider filling it out.
  • How did your care model affect the patient recall of their providers? Do your residents or attending surgeons rotate on a frequent basis? What other factors might have influenced this?
  • What next? How has this information changed how you educate your patients now? What additional changes might you make in the future? How will you roll it out to more than just 25 patients per month?

This is excellent work! I’m looking forward to your live presentation later this week.