Research Spotlights: July 2023

Using content analysis to understand the usage of stigmatizing language in recent scientific literature and its harmful effect on people living with HIV

People living with HIV and experts in the field have long advocated for the use of person-first language, which emphasizes that a disorder, disease, condition, or disability is only one aspect of the whole person. Outdated terms such as “HIV-infected” and “AIDS-infected” are negative and dehumanizing, and the latter is also clinically inaccurate. Previous research indicates that HIV-related stigma is associated with reduced adherence to antiretroviral medication therapy, declining mental health, decreased engagement with healthcare services, and increased substance use. Reducing the use of HIV-related stigmatizing language in published scientific works is imperative because policymakers and healthcare professionals read this literature and adopt the same language in their daily conversations, which can reinforce health-related stigma and perpetuate discrimination. In recently published research supported by NIAAA and NIDA, investigators performed a content analysis of HIV-related stigmatizing language in peer-reviewed scientific literature to understand and help address this issue.

The researchers collected data from peer-reviewed scientific literature published from 2010 to 2020, using the 2015 Joint United Nations Programme on HIV/AIDS (UNAIDS) terminology guidelines. After inclusion/exclusion criteria were applied, the literature search identified a total of 26,476 articles in the MEDLINE and CINAHL databases that used variations of the stigmatizing term “HIV/AIDS-infected.” The articles were combined into a single dataset and independently reviewed and coded. Additional article characteristics, such as journal, publication year, and country of affiliation, were classified to identify the fields of study and countries where journal policy changes and guidelines were most needed to reduce stigmatizing language.
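For readers curious what this kind of term screening can look like in practice, the minimal sketch below flags articles whose title or abstract contains a variation of the term. The term pattern, record format, and example articles are illustrative assumptions, not the authors’ actual search strategy or coding scheme.

```python
import re

# Illustrative variations of the stigmatizing term; the authors' actual
# search strings and databases (MEDLINE, CINAHL) may differ.
STIGMATIZING_PATTERN = re.compile(r"\b(HIV|AIDS|HIV/AIDS)[- ]infected\b", re.IGNORECASE)

def flag_article(title: str, abstract: str) -> bool:
    """Return True if the title or abstract contains a flagged term variation."""
    text = f"{title} {abstract}"
    return bool(STIGMATIZING_PATTERN.search(text))

# Hypothetical records standing in for database export rows
articles = [
    {"title": "Outcomes among HIV-infected adults", "abstract": "..."},
    {"title": "Care engagement for people living with HIV", "abstract": "..."},
]

flagged = [a for a in articles if flag_article(a["title"], a["abstract"])]
print(f"{len(flagged)} of {len(articles)} articles contain a flagged term")
```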

Results of this analysis showed that the overall use of HIV-related stigmatizing language decreased by 32% between 2010 and 2020, with the largest decrease occurring between 2018 and 2020. Variations of the stigmatizing term were used most often in journal articles in 2013 (n = 2,805), prior to the 2015 update of the UNAIDS terminology guidelines. Cumulatively, 157 countries were represented in the study, and the United States accounted for 36% of all articles that referred to people with HIV as “HIV/AIDS-infected.” Many of the journals using the stigmatizing language were HIV/AIDS-specific or related to infectious diseases, while the journal with the greatest frequency of the terms of interest was a general science and medicine journal.

The study findings have some limitations. The UNAIDS terminology guidelines list numerous stigmatizing terms, but this investigation examined only the most frequently used term, “HIV/AIDS-infected,” and its variations. Only the titles and abstracts of journal articles were reviewed for inclusion. Additionally, articles published in another language with an English-translated abstract were included but could have originally used non-stigmatizing language prior to translation.

The authors conclude by calling on investigators to be aware of the real-world impact of their research. They recommend that authors and journal editors take action to reduce the use of stigmatizing language in scientific literature, suggesting that journals enact official policies and proofreading processes that promote the use of non-stigmatizing language in publications.

Citation:
Parisi CE, Varas-Rodriguez E, Algarin AB, Richards V, Li W, Cruz Carrillo L, Ibañez GE. A Content Analysis of HIV-Related Stigmatizing Language in the Scientific Literature, From 2010-2020: Findings and Recommendations for Editorial Policy. Health Commun. 2023 May 10:1-9. doi: 10.1080/10410236.2023.2207289. Epub ahead of print. PMID: 37161354.

“Just enough” diet tracking can still support healthy weight loss

Research has shown that lifestyle interventions, such as food/calorie tracking, can be effective for achieving modest weight loss; however, it is unclear how much tracking is necessary to achieve specific weight-loss goals. Standard behavioral weight-loss interventions range from 3 to 24 months in length and often require daily dietary self-monitoring. This type of tracking is burdensome, however, and adherence tends to decline over time. In a recent publication, a multidisciplinary team of researchers funded by the NHLBI sought to determine the optimal number of tracking days needed to achieve clinically significant weight loss.

The researchers performed a single-arm trial and collected self-reported food intake data from 153 participants for six months. They then used receiver operating characteristic (ROC) curve analysis to determine the optimal tracking thresholds for predicting 3%, 5%, and 10% weight loss at the end of the observation period. Previous research, including clinical trials, has typically defined clinically significant weight loss as a 5% to 10% reduction of an individual’s body weight. Participants were, on average, 41 years of age, with an average weight of 90.1 kg and BMI of 31.8 kg/m²; 69.9% identified as female and 57.5% as White.
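As a rough illustration of how an ROC analysis can be used to pick a tracking threshold that best separates participants who do and do not reach a weight-loss goal, the sketch below runs the analysis on synthetic data. The data-generating assumptions and the use of Youden’s J to select the cut point are choices made here for illustration, not the authors’ exact procedure.

```python
import numpy as np
from sklearn.metrics import roc_curve, roc_auc_score

rng = np.random.default_rng(0)

# Synthetic stand-in data: percent of intervention days tracked (predictor)
# and whether each participant achieved >=5% weight loss (outcome).
n = 150
pct_days_tracked = rng.uniform(0, 100, n)
prob_success = 1 / (1 + np.exp(-(pct_days_tracked - 40) / 10))  # assumed relationship
achieved_5pct_loss = rng.random(n) < prob_success

# ROC curve across candidate thresholds of tracking percentage
fpr, tpr, thresholds = roc_curve(achieved_5pct_loss, pct_days_tracked)

# One common way to pick an "optimal" cut point is Youden's J (tpr - fpr);
# the published study may have used a different criterion.
optimal = thresholds[np.argmax(tpr - fpr)]
print(f"AUC = {roc_auc_score(achieved_5pct_loss, pct_days_tracked):.2f}")
print(f"Optimal tracking threshold ≈ {optimal:.1f}% of intervention days")
```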

The results of this study indicated that constant calorie tracking is not required to attain sustainable, healthy weight loss. Individuals needed to track around 28.5% of the intervention days to achieve ≥3% weight loss, 39.4% to achieve ≥5% weight loss, and 67.1% to achieve ≥10% weight loss (converted to approximate day counts in the sketch below). Tracking trajectories over time were also analyzed, and participants were grouped into one of three trajectories. Those who tracked their calories most consistently (in terms of days per week over the six-month period) were “super users” who lost approximately 10% of their weight. Most participants fell into the second trajectory, consisting of individuals whose tracking consistency gradually declined until they were logging their calorie consumption only about one day a week; these individuals still lost approximately 5% of their weight. Those in the low-tracking trajectory, who dropped to 0 days of tracking, lost only about 2% of their body weight.
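To put those percentages in concrete terms, the short sketch below converts them to approximate day counts, assuming the six-month window corresponds to roughly 180 intervention days; the exact denominator used in the study may differ.

```python
# Approximate conversion of tracking-percentage thresholds to days,
# assuming roughly 180 intervention days over the six-month period
# (an assumption; the study's exact denominator may differ).
intervention_days = 180
thresholds = {">=3% weight loss": 0.285, ">=5% weight loss": 0.394, ">=10% weight loss": 0.671}

for goal, fraction in thresholds.items():
    print(f"{goal}: track on ~{round(fraction * intervention_days)} of {intervention_days} days")
```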

The study has some limitations, such as a non-diverse study population, which may limit generalizability. Additionally, although the authors found a strong association between weight loss and food-tracking trajectories, the study did not account for all possible confounding factors and therefore cannot establish causality between tracking behavior and weight loss. Despite these limitations, the findings have implications for digital technology that can be programmed to respond more proactively to participant behavior and to support individual health goals rather than providing a “cookie cutter” approach to calorie tracking.

Citation:
Xu R, Bannor R, Cardel MI, Foster GD, Pagoto S. How much food tracking during a digital weight-management program is enough to produce clinically significant weight loss? Obesity (Silver Spring). 2023 Jul;31(7):1779-1786. doi: 10.1002/oby.23795. Epub 2023 Jun 4. PMID: 37271576.

A mouse model of adolescent heavy drinking indicates long-lasting brain alterations following binge drinking

Binge drinking, defined as a pattern of alcohol consumption that raises blood ethanol concentration above 80 mg/dL within 2 hours, is one of the most dangerous patterns of alcohol misuse and negatively affects individuals of all ages. However, adolescents may be particularly at risk because their brains are not fully mature until approximately 25 years of age. One key brain area that is still maturing in adolescence is the prefrontal cortex, which is essential for executive functioning, risk assessment, and decision-making. In a recent study supported by NIGMS and others, researchers used a mouse model to understand how the brain changes with voluntary binge drinking during adolescence.

The researchers used a mouse model that is well established as an approximation of human binge drinking. The mice (male and female) were given access to alcohol during a 30-day period that roughly corresponds to ages 11 to 18 in humans. Using electrophysiology combined with optogenetic techniques, the researchers assessed neuronal populations throughout the prefrontal cortex to understand how binge drinking affected the underlying neurocircuitry.

The results of this study showed that somatostatin (SST) neurons, which inhibit neurotransmitter release from other brain cell types, were affected long-term in mice exposed to the binge drinking protocol compared with mice that were provided only water throughout this developmental period. Alcohol exposure resulted in increased excitability of the SST neurons that persisted 30 days after alcohol consumption ended. Altered SST neuron function is known to affect GABA transmission as well as the production of SST itself, indicating dysregulated neuronal activity that could impact executive function and decision-making, including social interactions and risk responses.

In summary, this study indicated that heavy alcohol consumption may cause long-term neuronal dysfunction in the adolescent brain, affecting the brain’s ability to signal and communicate, and potentially leading to long-term behavioral changes in humans.

Citation:
Sicher AR, Starnes WD, Griffith KR, Dao NC, Smith GC, Brockway DF, Crowley NA. Adolescent binge drinking leads to long-lasting changes in cortical microcircuits in mice. Neuropharmacology. 2023 Aug 15;234:109561. doi: 10.1016/j.neuropharm.2023.109561. Epub 2023 May 1. PMID: 37137354.