
Child Psychiatrist / Adult Psychiatrist


  • Nutrition Crucial for Child and Adolescent Athletes

    PARIS — A child or adolescent athlete generally engages in school and extracurricular sports practice for more than 10 hours per week. High-level child or adolescent athletes engage in more than 20 hours of practice, excluding competitions. Nutrition affects these athletes' growth, risk for injury, and performance. At the congress of the French Society of Pediatrics, Dr Emmanuelle Ecochard-Dugelay, of the pediatric gastroenterology and nutrition department at Robert Debré Hospital in Paris, reviewed the key points in monitoring and advising these young athletes for the Medscape French edition. Why is it important to be concerned about the nutrition of a child or adolescent athlete? Adequate nutrition in the context of sports practice helps to avoid several problems, including energy deficit, which decreases performance; growth delay; and pubertal delay. In addition, a micronutrient deficiency can increase the risk for injuries and fatigue. Often, adolescent athletes tend to favor high-energy density foods that are low in nutrients, mainly starches, at the expense of fruits and vegetables. This leads to frequent deficiencies in micronutrients such as vitamins and trace elements. Are eating disorders more common in weight category sports? Indeed, in boxing, judo, and wrestling, young athletes may be required to maintain a specific weight for their category, which can compromise their optimal energy intake and growth. In boxing, for example, they aim for the upper limit of a weight category rather than the lower limit of a higher category. Other situations also pose nutritional risks, such as sports with low weight requirements (eg, gymnastics, dance, skating, jockeys) or those with high body representativeness (eg, rugby, weightlifting). Adolescents with exclusion diets (eg, vegans) or chronic illnesses must also adapt their diet and training. In clinical practice, I often see young athletes for issues related to growth, nutrition, or weight. Some seek nutritional follow-up to optimize their sports performance, whereas others may require special attention owing to chronic illnesses or intensive sports practice. High-level athletes can get specialized consultations, such as those offered at France's National Institute of Sport, Expertise, and Performance or within federations. What particularities should be considered in young athletes? Differences in muscle fiber types between children, adolescents, and adults have implications for nutrition. Children have a higher proportion of type I muscle fibers (progressing gradually to a predominance of type II fibers in adults). This specific childhood muscle composition promotes the ability to sustain prolonged efforts. The fibers are more resistant to fatigue and are rich in mitochondria, with a strong capacity for aerobic oxidation and low anaerobic glycolysis. Hence, they have little lactic acid. Monitoring children and adolescents to prevent nutritional deficiencies or energy deficits is essential for optimal muscle development and growth. Does their energy metabolism differ from that of adults? In children, metabolic pathways are preferentially used during physical exercise, with aerobic glycolysis taking precedence over anaerobic glycolysis very quickly during exercise and in higher proportions. As a result, the energy needs of children per unit of weight far exceed those of adults owing to their ongoing growth and development. 
In other words, at an equivalent level of exercise, a child's energy consumption is higher than that of an adult per unit of weight owing to increased metabolism, in addition to growth-related needs. Recommendations for energy intake include the following points: Boys (15 years old): 3640 ± 830 kcal/d; Girls (15 years old): 3100 ± 720 kcal/d; Boys (15-18 years old): 3000-6000 kcal/d; Girls (15-18 years old): 2200-4000 kcal/d. These recommended energy intakes vary according to the sports practiced. Moreover, in children, and to a lesser extent in adolescents, thermoregulation is also more variable. Children have a larger ratio of body surface area to weight than adults do and possibly lower efficiency of sweat glands. Children also have increased sensitivity to hypoglycemia. Regarding macronutrients in adolescent athletes, what should their protein intake be? Several factors influence muscle mass gain in adolescents, and muscle recovery is essential. Consuming protein recovery products alone is not sufficient to build muscle. In fact, from the perspective of recovery, protein recovery products are not necessary for adolescents living in industrialized countries, unless they follow a specific diet (vegan, vegetarian, or celiac). During the growth and tissue-building period, recommended intakes are higher than for the general population: between 1.2 and 1.4 g/kg/d, compared with 0.9 g/kg/d (representing 12%-15% of the daily intake). Sources of good-quality protein, preferably of animal origin, are recommended, because they provide essential amino acids needed for muscle synthesis and growth. Consuming a carbohydrate-protein snack after exercise is advised to promote recovery and muscle growth. What about carbohydrate and lipid intake in these young athletes? Carbohydrate intake should generally represent between 45% and 60% of the total daily energy intake. The recommended quantity varies from 4 to 10 g/kg/d, depending on the intensity of physical activity. Adolescents have a low glycogen synthesis capacity. Their needs are usually met by a normal diet, focusing on low-glycemic index carbohydrates to optimize energy management and avoid gluconeogenesis from protein reserves. On the other hand, lipid intake should represent approximately 35% of the total energy intake. Unsaturated fatty acids, which are found mainly in oily fish, dairy products, and vegetable oils (especially rapeseed), are recommended, as is maintaining a good omega-3/omega-6 balance. Avoiding fried foods is advised to limit saturated and trans fat consumption. What are the potential micronutrient deficiencies? Although micronutrients do not play a direct role in energy production, they are essential for metabolism and most biochemical processes. A deficiency in micronutrients can lead to poor recovery, increase the risk for injuries, and affect sports performance. Among the most common micronutrient deficiencies in adolescents are deficiencies in vitamins D and C and in calcium. Vitamin D deficiency is often detected in people with insufficient sun exposure or limited dietary intake. Calcium and vitamin D play a crucial role in calcium homeostasis, which is essential for muscle contraction and bone formation. Some populations are particularly at risk: girls, who do not benefit from testosterone protection and may be subject to dietary restrictions, as well as athletes who play indoor sports.
Most adolescents receive their calcium intake through dairy products, but it is advisable to ensure that they consume at least four servings per day for adequate calcium intake. Vitamin C is often deficient in people who consume a lot of starches while neglecting to eat fruits and vegetables, which are important sources of this vitamin. If in doubt or if the person's diet does not seem balanced, then these intakes should be checked through a detailed survey. Generally, micronutrient supplementation is not recommended for children who have a balanced and varied diet, because their diet should suffice to meet their needs. However, if the diet is unbalanced, if the practice is intensive, or if symptoms of deficiency appear, supplementation may be necessary to ensure adequate intake. What about the use of dietary supplements? From 20% to 80% of young athletes take dietary supplements. The most common include multivitamins, vitamin C, and caffeine. Often, their use is not medically justified, and there is little rigorous evaluation of the products available for sale. Furthermore, some supplements may contain doping agents. What are your recommendations regarding hydration in young athletes? Hydration should not be overlooked, because a 1% fluid loss can lead to a 10% decrease in performance. Here are the recommendations: During activity, it is recommended to drink 13 mL/kg/h (approximately 500 mL/h for a 40-kg child). After activity, it is recommended to drink around 4 mL/kg for each hour of practice. Regarding flavored solutions containing carbohydrates and salt (NaCl), few specific studies have been done in children. What are your messages to doctors following these children? Briefly, identify risky situations and tailor advice to the type of sport and its intensity. Many children, without being high-level athletes, practice at least 10 hours of training or competition per week in addition to school sports. It is therefore common for a doctor to encounter them among their patients. It is important to ask about their sports practice to understand their specific needs and to track their growth curve and observe their pubertal development. This is not only to minimize the risk for injuries but also to preserve their weight and height status into adulthood, as well as their bone health. I recommend referring them to dietitians or specialized nutritionists, especially those from sports federations, for optimal support in terms of their sports performance and overall health. Ecochard-Dugelay reported having no relevant financial relationships. Note: This article originally appeared on Medscape.
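
The per-kilogram figures quoted above (protein 1.2-1.4 g/kg/d, carbohydrate 4-10 g/kg/d, roughly 13 mL/kg/h of fluid during activity and about 4 mL/kg per hour of practice afterward) translate directly into individual targets. The short Python sketch below only does that arithmetic for a hypothetical athlete; the 40-kg body weight and 2-hour practice are illustrative assumptions, not values from the interview.

```python
# Illustrative arithmetic only: turns the per-kilogram guidance quoted above into
# concrete targets for a hypothetical athlete. Weight and hours are assumptions.

def nutrition_targets(weight_kg: float, practice_hours: float) -> dict:
    return {
        "protein_g_per_day": (1.2 * weight_kg, 1.4 * weight_kg),    # 1.2-1.4 g/kg/d
        "carbohydrate_g_per_day": (4 * weight_kg, 10 * weight_kg),  # 4-10 g/kg/d
        "fluid_during_ml_per_hour": 13 * weight_kg,                 # 13 mL/kg/h during activity
        "fluid_after_ml": 4 * weight_kg * practice_hours,           # ~4 mL/kg per hour practiced
    }

if __name__ == "__main__":
    for name, value in nutrition_targets(weight_kg=40, practice_hours=2).items():
        print(name, value)
```

For a 40-kg athlete this reproduces the article's worked example of roughly 500 mL of fluid per hour during activity.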

  • Anxiety Linked to a Threefold Increased Risk for Dementia

    TOPLINE: Both chronic and new-onset anxiety are linked to a threefold increased risk for dementia onset in later life, new research shows. METHODOLOGY: A total of 2132 participants aged 55-85 years (mean age, 76 years) were recruited from the Hunter Community Study. Of these, 53% were women. Participants were assessed over three different waves, 5 years apart. Demographic and health-related data were captured at wave 1. Researchers used the Kessler Psychological Distress Scale (K10) to measure anxiety at two points: Baseline (wave 1) and first follow-up (wave 2), with a 5-year interval between them. Anxiety was classified as chronic if present during both waves, resolved if only present at wave 1, and new if only appearing at wave 2. The primary outcome, incident all-cause dementia, during the follow-up period (maximum 13 years after baseline) was identified using the International Classification of Disease-10 codes. TAKEAWAY: Out of 2132 cognitively healthy participants, 64 developed dementia, with an average time to diagnosis of 10 years. Chronic anxiety was linked to a 2.8-fold increased risk for dementia, while new onset anxiety was associated with a 3.2-fold increased risk (P = .01). Participants younger than 70 years with chronic anxiety had a 4.6-fold increased risk for dementia (P = .03), and those with new-onset anxiety had a 7.2 times higher risk for dementia (P = .004). There was no significant risk for dementia in participants with anxiety that had resolved. Investigators speculated that individuals with anxiety were more likely to engage in unhealthy lifestyle behaviors, such as poor diet and smoking, which can lead to cardiovascular disease — a condition strongly associated with dementia. IN PRACTICE: "This prospective cohort study used causal inference methods to explore the role of anxiety in promoting the development of dementia," lead author Kay Khaing, MMed, The University of Newcastle, wrote in a press release. "The findings suggest that anxiety may be a new risk factor to target in the prevention of dementia and also indicate that treating anxiety may reduce this risk." LIMITATIONS: Anxiety was measured using K10, which assessed symptoms experienced in the most recent 4 weeks, raising concerns about its accuracy over the entire observation period. The authors acknowledged that despite using a combination of the total K10 score and the anxiety subscale, the overlap of anxiety and depression might not be fully disentangled, leading to residual confounding by depression. Additionally, 33% of participants were lost to follow-up, and those lost had higher anxiety rates at baseline, potentially leading to missing cases of dementia and affecting the effect estimate. Note: This article originally appeared on Medscape .
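
The exposure in this study is defined purely by when anxiety is present across the two assessment waves: chronic if present at both, resolved if present only at wave 1, and new-onset if present only at wave 2. The Python sketch below illustrates that classification rule; the K10 cutoff used here is an assumed placeholder, not the threshold applied by the investigators.

```python
# Sketch of the wave-based anxiety classification described above.
# ANXIETY_CUTOFF is a placeholder, not the study's actual K10 cutoff.
ANXIETY_CUTOFF = 20  # assumed illustrative cutoff on the K10 scale

def classify_anxiety(k10_wave1: int, k10_wave2: int) -> str:
    wave1 = k10_wave1 >= ANXIETY_CUTOFF
    wave2 = k10_wave2 >= ANXIETY_CUTOFF
    if wave1 and wave2:
        return "chronic"
    if wave1:
        return "resolved"
    if wave2:
        return "new-onset"
    return "none"

print(classify_anxiety(24, 25))  # chronic
print(classify_anxiety(24, 12))  # resolved
print(classify_anxiety(12, 25))  # new-onset
```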

  • Promoting Mental Health in Schools

    The declining mental health of children and adolescents is a tremendous concern. In the United States, almost 20% of those aged 3 to 17 years have a mental, emotional, developmental, or behavioral disorder, and suicidal behaviors among high school students increased by more than 40% from 2009 to 2019. Many children are subjected to one or more traumatic experiences that can have long-term adverse effects on their physical and mental health. School can have a major impact on children and adolescents’ mental health. School-based mental health promotion programs aim to support students’ mental health by creating a safe and open environment, focusing on the emotional and social aspects of school, and fostering inclusivity. These programs can cater to a specific student or to an entire school, working to reduce barriers that might prevent students from accessing mental health resources. They often focus on building resilience or addressing trauma. This article describes the effects of trauma in children and adolescents, and how mental health promotion programs in a school setting can optimize students’ well-being. Understanding Trauma in Children and Adolescents Adverse childhood experiences are stressful or traumatic events that one experiences before age 18 years that affect long-term health. These experiences can be categorized into 3 groups: Abuse (psychological, physical, and/or sexual); Neglect (emotional and/or physical); and Household dysfunction (divorce/separation, substance abuse, mental illness, criminal behavior, parental absence, or domestic violence). Other stressful experiences in adolescence that have long-term health effects include home, community, and school problems such as discrimination, bullying, violence, homelessness, parental stress, and poverty or economic hardship. Some of these experiences are common — an estimated 37% of adolescents experience bullying or cyberbullying. School shootings and gun violence are also a source of childhood trauma. Since 1999, approximately 150 students and educators have been killed and 300 injured by a shooting at school, and more than 236,000 students have been exposed to gun violence at their school. While the main concern is for the victims, students who witness these events are at increased risk for mental disorders, including posttraumatic stress disorder, depression, anxiety, acute stress disorder, substance abuse, and panic disorder. After a school shooting, students may lose their sense of security and safety at school, which deters learning and can lead to mental health disorders. Adverse childhood experiences can have negative consequences on a student’s emotional, behavioral, psychological, social, and cognitive abilities, all of which influence their social and academic development and engagement. Globally, the World Health Organization estimated that in 2019, 58 million children and adolescents were living with an anxiety disorder, 23 million were living with depression, and 3 million were experiencing eating disorders. When mental health conditions are not appropriately treated, children and adolescents struggle academically and are at a higher risk of substance use, suicide, incarceration, and dropping out of school. Exposure to trauma during childhood can increase the risk of negative outcomes, including poor physical and mental health, in adulthood. Mental Health Care in Schools While mental health symptoms and disorders are common among children and adolescents, many of those affected do not seek or receive treatment.
Because children and adolescents spend much of their time at school, schools are a logical place to supplement traditional mental health care. School-based psychosocial interventions have been used to improve mental health services for children and adolescents. These interventions can be used to target either internalizing disorders (those with prominent anxiety, depressive, and somatic symptoms) or externalizing disorders (those with prominent impulsive, disruptive conduct, and substance abuse symptoms). More than one-half of school-based psychosocial interventions are offered in schools in low-income communities, and many are administered by teachers who do not have traditional mental health care training. School-based mental health services often focus on building resilience, which is important for preventing and reducing the severity of mental health symptoms and disorders. Although such programs often are designed to target a specific mental health concern in a specific demographic, others take a universal approach, in which the interventions are delivered to all students within a class, grade, or school. Resilience-building programs typically include lectures, demonstrations, role-playing, educational resources, clinical tools, and more. One systematic review found that the most effective school-based resiliency programs were interactive and used a variety of these methods in a supportive environment. They emphasized tools that teach coping, mindfulness, relationship-building techniques, self-efficacy, and emotional regulation. Another approach to promoting mental health in schools is trauma-informed care. Instead of using a clinical approach to target specific symptoms, trauma-informed care entails developing a holistic understanding of how adverse childhood experiences affect an individual’s health and how to respond and provide care without re-traumatizing. The 4 Rs of trauma-informed care are as follows: Realize (understanding the widespread impact of trauma); Recognize (identifying the signs and symptoms of trauma in patients and families); Respond (coordinating health care, community, and educational resources to best support the child and family); and Resist (rethinking the clinical approach to patient care, including support for those who provide the care). The aim of trauma-informed care is to promote resilience in children and adolescents who have been exposed to trauma, and to increase parents’ and caregivers’ knowledge, awareness, and acceptance of childhood trauma. Who Provides Mental Health Programs in Schools? As the mental health needs of children and adolescents have grown, teachers have become more involved in mental health care, usually by delivering school-based psychosocial interventions. Teachers typically deliver these interventions in the classroom, providing social skills training consisting of activities, strategies, and techniques to improve students’ mental health, functional impairment, and well-being. Evidence suggests involving teachers in mental health promotion programs at school can provide benefits in academic achievement, social and emotional skills, behavior, and anxiety and depression. School nurses also can provide school-based mental health services. Students often visit the school nurse for physical symptoms, such as a headache or abdominal pain, that might be related to a mental health disorder, such as anxiety, depression, or an eating disorder.
School nurses’ familiarity with students and their families can make them vital in identifying underlying stressors and mental health concerns. In a review of 14 studies, Kaskoun et al found that school nurses emphasized the need for appropriate training so they could communicate effectively and feel confident in handling their students’ mental health. Specifically, the nurses cited training in motivational interviewing, depression interventions, and identifying bullying and cyberbullying as being necessary to improve their ability and confidence in providing school-based mental health services. In this review, school nurses also reported that earning students’ trust, particularly in regard to confidentiality, is key to helping those with mental health disorders. Many students might not be aware of confidentiality when receiving care from a school nurse and therefore do not seek out treatment; a better understanding of confidentiality could increase the school nurse’s ability to help them. Collaboration and support among teachers, administrative staff, and school nurses are vital to facilitate school-based mental health programs. To effectively promote students’ mental health, teachers and nurses need support and appropriate training in understanding how adverse childhood experiences affect students’ behavior. Interventions that focus on strengthening the teacher-student relationship can improve student behavioral issues and contribute to students’ long-term resilience. Avoiding Burnout Any approach to promoting students’ mental health also has to take into account the mental health of those who provide the interventions. Burnout is a long-term stress reaction characterized by emotional exhaustion, depersonalization, and a lack of sense of personal accomplishment. A review of 18 articles found that school counselors’ risk of burnout was correlated with having non-counselor duties such as clerical tasks, being assigned large caseloads, experiencing a lack of supervision, providing fewer direct student services, and having greater perceived stress. There are no easy solutions for preventing or addressing burnout in those who provide mental health care to children and adolescents. Commonly suggested strategies include identifying symptoms of burnout, focusing on self-care (for example, by exercising, eating a healthy diet, and getting adequate sleep), prioritizing a work/life balance, and advocating for change at an organizational level. Note: This article originally appeared on Psychiatry Advisor.

  • Healing Beyond the Hospital: Improving Post-Op Care for Older Adults

    An 83-year-old veteran of battles with chronic disease who has been a long-time patient of mine recently faced colorectal cancer surgery. His recovery, complicated by the new reality of living with an ostomy bag, has been murky and fraught with challenges. Skilled surgeons have done their part; the wound heals, bowels work. His oncologist, the architect of his chemotherapy, awaits a follow-up in weeks. His primary care physician, though adept with his chronic conditions, has been sidelined by his own cancer treatments and remains unaware of the patient's current state and medications. This gap leaves the man's immediate postsurgical care without a clear steward. As a geriatrician at Harvard Medical School with special expertise in geriatric surgery, I have seen firsthand how critical coordinated postsurgical care is for older adults. My research and clinical work, including nationally recognized leadership in geriatric surgery, often intersects with cases like this patient's, for whom the transition from hospital to home care requires meticulous follow-up. Home from the hospital, the patient navigates through unfamiliarity: pain, poor appetite, and new drugs. His family, supportive yet unprepared, feels anxious about managing his altered state. Barely coping with his heart failure and diabetes amid these changes, and experiencing confusion and decline, he was readmitted to the hospital 10 days after discharge. This readmission highlights the shortcomings of a healthcare system in this country that excels in acute intervention but stumbles in the continuum of care. This scenario is a reality for many older adults, who are vulnerable due to the complexities of aging physiology, multiple comorbidities, and the effects of high-risk medications after major surgery. Inside the hospital, the care may seem well-structured, but it proves inadequate in the unpredictable environment of everyday life after discharge. Our fragmented system fails to address the whole person, leaving a significant gap in care when continuity is critically needed. This disjointed nature of the US healthcare system is costing families and patients dearly, not just in the quality of human life but also financially, with an estimated annual toll of $180 million due to readmissions after colorectal surgeries in older patients alone. Research published recently in JAMA Network Open indicates that nearly 1 in 8 older adults (12%) who undergo surgery are readmitted to the hospital within 30 days, and more than one quarter (28%) are readmitted within 6 months. Research shows up to 80% of patients do not remember the information provided to them at surgical discharge, and among those who do, 50% recall the information incorrectly. This gap is not just a lapse in care but a critical failure that affects the lives of older patients. The risk is even higher in patients who are frail or who have dementia, with frail patients readmitted at a rate of about 37% within 6 months and patients with dementia at a rate of 39%. These readmission rates significantly impact a senior's independence and function, highlighting the need for comprehensive pre- and postsurgical care planning. The US Census Bureau reports there are nearly 60 million people over age 65 years in this country. According to the Centers for Disease Control and Prevention , many of these individuals have multiple chronic conditions and have trouble caring for themselves. The costs and outcomes are even worse when comparing patients across different races and ethnicities. 
For instance, Black and Hispanic older adults often receive less effective care and face more significant barriers in the healthcare system compared with their White counterparts. The Kaiser Family Foundation reports these disparities lead to higher rates of complications and readmissions. These facts are a clarion call for urgent reform. Bridging the chasm between hospital discharge and home recovery in this vulnerable population requires valuing and incentivizing proactive and preventive care as much as cutting-edge technology and acute crisis-driven approaches such as interventional medicine. Preventive care, often neglected, must receive equal recognition and financial incentives. The proactive approach works. The Optimization of Senior Care and Recovery (OSCAR) program, which I developed, brings geriatrics-focused, proactive, multidisciplinary care into routine surgical care following colorectal surgery. This program emphasizes geriatrics care in collaboration with surgeons, through comprehensive assessment and care of high-risk, older patients following colorectal surgery. It focuses on optimizing their medications, comorbidities, nutrition, mobility, and cognition to prevent complications. The OSCAR program showed a 15% decrease in transfers to intensive care units and heart rhythm problems and significantly reduced the risk for confusion in older and sicker patients while saving up to $17,000 per case. Engaging a multidisciplinary team of healthcare professionals — including surgeons, geriatricians, primary care physicians, nurses, social workers, therapists, and nutritionists — as soon as surgery is decided for a patient ensures the goals of postdischarge care will be concrete and prioritizes what matters most to the patient, which differs for every patient based on what they hope to achieve from the surgery. Evaluating a patient's mental, cognitive, mood, functional, and nutritional status before surgery provides a baseline to tailor postdischarge plans and set realistic recovery goals. Integrating geriatrics care into surgical teams addresses unique challenges, such as cognitive dysfunction and malnutrition, that can be preemptively diagnosed and managed before those issues result in any complications like delayed wound healing due to malnutrition. Similarly, understanding a patient's social support system and anticipated postdischarge needs before surgery allows the alignment of resources and local programs to provide necessary support, preventing issues like inadequate self-care and nutrition due to insufficient social support following discharge. Educating patients and caregivers with tailored, easy-to-understand, culturally adapted materials and dedicated sessions for discussing postdischarge care improves outcomes. Keeping the primary care physician informed about the patient's surgical and medical course, medication changes throughout the perioperative period, and potential issues that need follow-up ensures any emerging issues are promptly addressed following discharge, which is crucial given the multiple comorbidities and high-risk medications older patients often have. Coaching and mentoring patients and families clarifies their roles and expectations before, during, and after surgery. Additionally, establishing two-way, functional communication systems between healthcare providers, with standards and checklists through shared electronic health records and scheduled postdischarge follow-ups, could significantly enhance overall care.
Bridging the gap between postsurgery hospital discharge and home recovery has never been more important. With an exponentially growing aging population and rising surgical rates in this vulnerable group, the need for effective, continuous care is more urgent than ever. The success of this endeavor hinges on valuing and incentivizing a proactive approach and preventive care to ensure older adults receive the comprehensive support they need to thrive after surgery. Note: This article originally appeared on Medscape .

  • Alzheimer's Blood Test in Primary Care Could Slash Diagnostic, Treatment Wait Times

    As disease-modifying treatments for Alzheimer's disease (AD) become available, equipping primary care physicians with a highly accurate blood test could significantly reduce diagnostic wait times. Currently, the patient diagnostic journey is often prolonged owing to the limited number of AD specialists, causing concern among healthcare providers and patients alike. Now, a new study suggests that use of high-performing blood tests in primary care could identify potential patients with AD much earlier, possibly reducing wait times for specialist care and receipt of treatment. "We need to triage in primary care and send preferentially the ones that actually could be eligible for treatment, and not those who are just worried because their grandmother reported that she has Alzheimer's," lead researcher Soeren Mattke, MD, DSc, told Medscape Medical News. "By combining a brief cognitive test with an accurate blood test of Alzheimer's pathology in primary care, we can reduce unnecessary referrals, and shorten appointment wait times," said Mattke, director of the Brain Health Observatory at the University of Southern California in Los Angeles. The findings were presented on July 28 at the Alzheimer's Association International Conference (AAIC) 2024. Projected Wait Times of 100 Months by 2033 The investigators used a Markov model to estimate wait times for patients eligible for AD treatment, taking into account constrained capacity for specialist visits. The model included the projected US population of people aged 55 years or older from 2023 to 2032. It assumed that individuals would undergo a brief cognitive assessment in primary care and, if suggestive of early-stage cognitive impairment, be referred to an AD specialist under three scenarios: no blood test, blood test to rule out AD pathology, and blood test to confirm AD pathology. According to the model, without an accurate blood test for AD pathology, projected wait times to see a specialist are about 12 months in 2024 and will increase to more than 100 months in 2033, largely owing to a lack of specialist appointments. In contrast, with the availability of an accurate blood test to rule out AD, average wait times would be just 3 months in 2024 and increase to only about 13 months in 2033, because far fewer patients would need to see a specialist. Availability of a blood test to rule in AD pathology in primary care would have a limited effect on wait times because 50% of patients would still undergo confirmatory testing based on expert assumptions, the model suggests. Prioritizing Resources "Millions of people have mild memory complaints, and if they all start coming to neurologists, it could completely flood the system and create long wait times for everybody," Mattke told Medscape Medical News. The problem, he said, is that brief cognitive tests performed in primary care are not particularly specific for mild cognitive impairment. "They work pretty well for manifest advanced dementia but for mild cognitive impairment, which is a very subtle, symptomatic disease, they are only about 75% accurate. One quarter are false-positives. That's a lot of people," Mattke said. He also noted that although earlier blood tests were about 75% accurate, they are now about 90% accurate, "so we are getting to a level where we can pretty much say with confidence that this is likely Alzheimer's," Mattke said.
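
The projections come from a Markov model in which demand for specialist appointments outstrips a fixed capacity, so a backlog (and therefore a wait) accumulates year over year, and a rule-out blood test shrinks the demand entering the queue. The toy Python sketch below reproduces only that queueing logic; the referral volumes, capacity, and rule-out fraction are invented numbers, not the study's inputs.

```python
# Toy queueing illustration of how a rule-out blood test shortens specialist waits.
# All numbers below are invented for illustration; they are not the study's parameters.

def project_wait_months(annual_referrals: float, annual_capacity: float,
                        rule_out_fraction: float, years: int) -> list[float]:
    """Approximate wait (in months) at the end of each year, given a growing backlog."""
    backlog = 0.0
    waits = []
    for _ in range(years):
        demand = annual_referrals * (1 - rule_out_fraction)  # referrals still needing a specialist
        backlog = max(0.0, backlog + demand - annual_capacity)
        waits.append(12 * backlog / annual_capacity)          # months needed to clear the backlog
    return waits

print(project_wait_months(1200, 1000, rule_out_fraction=0.0, years=10))  # no blood test
print(project_wait_months(1200, 1000, rule_out_fraction=0.5, years=10))  # half ruled out in primary care
```
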
Commenting on this research for Medscape Medical News , Heather Snyder, PhD, vice president of medical and scientific relations at the Alzheimer's Association, said it is clear that blood tests, "once confirmed, could have a significant impact on the wait times" for dementia assessment. "After an initial blood test, we might be able to rule out or rule in individuals who should go to a specialist for further follow-up and testing. This allows us to really ensure that we're prioritizing resources accordingly," said Snyder, who was not involved in the study. Note: This article originally appeared on Medscape .

  • Tau Blood Test Flags Preclinical Alzheimer's Disease

    Plasma phosphorylated (p)-tau217 testing can help identify preclinical Alzheimer's disease (AD), which could aid clinical trial recruitment. Recruiting preclinical AD participants for clinical research is challenging owing to a lack of symptoms and the high cost and invasiveness of cerebrospinal fluid (CSF) tests and brain amyloid PET imaging. Plasma p-tau217 has consistently shown high performance in detecting AD pathology in patients with mild cognitive impairment and dementia, but there has been concern that it may have lower accuracy in cognitively unimpaired adults, said lead investigator Gemma Salvadó, PhD, with the Clinical Memory Research Unit, Lund University, Sweden. However, "our study shows that plasma p-tau217, alone or in combination with invasive tests, can be used accurately to assess amyloid-positivity in cognitively unimpaired participants, to streamline the inclusion of these participants in preventive clinical trials," she told Medscape Medical News. The findings were presented on July 28 at the Alzheimer's Association International Conference (AAIC) 2024 . Correlation to CSF, PET Amyloid Status The investigators assessed the clinical accuracy of plasma p-tau217 as a prescreening method in 2917 cognitively unimpaired adults (mean age, 67 years; 57% women) across 12 independent cohorts who had available plasma p-tau217 and amyloid beta PET imaging or CSF samples. They found that plasma p-tau217 levels correlated with amyloid beta CSF status and PET load. As a standalone test, plasma p-tau217 identified amyloid beta PET–positive cognitively normal adults with a positive predictive value of 80% or greater. The positive predictive value increased to 95% or greater when amyloid beta CSF or PET was used to confirm a positive plasma p-tau217 result. As a first step, plasma p-tau217 could significantly reduce the number of invasive tests performed because only individuals with a positive p-tau217 test would go on to PET imaging or CSF sampling, Salvadó told conference attendees. This may reduce trial recruitment costs and get more patients enrolled. Although the study had a large sample size, "these results should be replicated in independent studies, [in] more heterogenous participants, and coming from the clinical setting instead of observational studies to avoid possible bias," Salvadó added. A New Diagnostic Era Commenting on the research for Medscape Medical News, Heather Snyder, PhD, vice president of medical and scientific relations at the Alzheimer's Association, said what's particularly interesting about this study is that the researchers examined multiple cohorts of cognitively unimpaired individuals and "consistently" found that plasma p-tau217 could identify individuals with amyloid-positive PET and CSF with high accuracy. "This may reduce the need for more expensive and more invasive scans or lumbar punctures to confirm if an individual has the biology," Snyder said. "Blood tests are revolutionizing Alzheimer's detection, diagnosis and ultimately treatment," Howard Fillit, MD, co-founder and chief science officer of the Alzheimer's Drug Discovery Foundation, added in a statement sent to Medscape Medical News . He predicted that blood tests will "soon replace more invasive and costly PET scans as the standard of care and serve as the first line of defense in diagnosing the disease." 
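
The predictive values reported above (at least 80% for plasma p-tau217 alone, 95% or more once a positive result is confirmed by CSF or PET) follow the standard relationship between prevalence, sensitivity, and specificity, with the first test raising the pre-test probability for the second. The sketch below shows that arithmetic only; the sensitivity, specificity, and amyloid prevalence figures are assumptions for illustration, not values measured in the study.

```python
# Standard positive-predictive-value arithmetic. The sensitivity, specificity, and
# prevalence used here are assumptions, not the study's measured values.

def ppv(sensitivity: float, specificity: float, prevalence: float) -> float:
    true_pos = sensitivity * prevalence
    false_pos = (1 - specificity) * (1 - prevalence)
    return true_pos / (true_pos + false_pos)

# Step 1: plasma p-tau217 alone in cognitively unimpaired adults.
step1 = ppv(sensitivity=0.90, specificity=0.92, prevalence=0.30)
print(f"PPV of blood test alone: {step1:.2f}")

# Step 2: confirmatory CSF/PET applied only to blood-test positives,
# whose pre-test probability is now the step-1 PPV.
step2 = ppv(sensitivity=0.95, specificity=0.95, prevalence=step1)
print(f"PPV after confirmation:  {step2:.2f}")
```
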
"After many years of research, the field is in a place where we have novel biomarkers and diagnostics to support a diagnosis," the way cholesterol is used to help detect heart disease, said Fillit. "The diagnostic framework for Alzheimer's — an incredibly complex disease — is constantly evolving. As we usher in the new era of care, we are moving closer to the day when blood tests will be complemented by digital tools to provide precise and timely diagnoses and risk assessments backed by numerous data points, complementing existing cognitive tests," he added. Note: This article originally appeared on Medscape .

  • Ozempic Curbs Hunger – And Not Just for Food

    Welcome to Impact Factor, your weekly dose of commentary on a new medical study. I'm Dr F. Perry Wilson of the Yale School of Medicine. If you've been paying attention only to the headlines, when you think of "Ozempic" you'll think of a few things: a blockbuster weight loss drug or the tip of the spear of a completely new industry — why not? A drug so popular that the people it was invented for (those with diabetes) can't even get it. Ozempic and other GLP-1 receptor agonists are undeniable game changers. Insofar as obesity is the number-one public health risk in the United States, antiobesity drugs hold immense promise even if all they do is reduce obesity. But if you've been looking a bit deeper than the headline-grabbing stories, reading some of the case reports or listening to your patients, you'll start to wonder whether Ozempic is doing something more. In 2023, an article in Scientific Reports presented data suggesting that people on Ozempic might be reducing their alcohol intake, not just their total calories. A 2024 article in Molecular Psychiatry found that the drug might positively impact cannabis use disorder. An article from Brain Sciences suggests that the drug reduces compulsive shopping. A picture is starting to form, a picture that suggests these drugs curb hunger both literally and figuratively. That GLP-1 receptor agonists like Ozempic and Mounjaro are fundamentally anticonsumption drugs. In a society that — some would argue — is plagued by overconsumption, these drugs might be just what the doctor ordered. If only they could stop people from smoking. Oh, wait — they can. At least it seems they can, based on a new study appearing in the Annals of Internal Medicine. Before we get too excited, this is not a randomized trial. There actually was a small randomized trial of exenatide (Byetta), which is in the same class as Ozempic but probably a bit less potent, with promising results for smoking cessation. But Byetta is the weaker drug in this class; the market leader is Ozempic. So how can you figure out whether Ozempic can reduce smoking without doing a huge and expensive randomized trial? You can do what Nora Volkow and colleagues from the National Institute on Drug Abuse did: a target trial emulation study. A target trial emulation study is more or less what it sounds like. First, you decide what your dream randomized controlled trial would be and you plan it all out in great detail. You define the population you would recruit, with all the relevant inclusion and exclusion criteria. You define the intervention and the control, and you define the outcome. But you don't actually do the trial. You could if someone would lend you $10-50 million, but assuming you don't have that lying around, you do the next best thing, which is to dig into a medical record database to find all the people who would be eligible for your imaginary trial. And you analyze them. The authors wanted to study the effect of Ozempic on smoking among people with diabetes; that's why all the comparator agents are antidiabetes drugs. They figured out whether these folks were smoking on the basis of a medical record diagnosis of tobacco use disorder before they started one of the drugs of interest. This code is fairly specific: If a patient has it, you can be pretty sure they are smoking. But it's not very sensitive; not every smoker has this diagnostic code. This is an age-old limitation of using EHR data instead of asking patients, but it's part of the tradeoff for not having to spend $50 million. 
After applying all those inclusion and exclusion criteria, they have a defined population who could be in their dream trial. And, as luck would have it, some of those people really were treated with Ozempic and some really were treated with those other agents. Although decisions about what to prescribe were not randomized, the authors account for this confounding-by-indication using propensity-score matching. You can find a little explainer on propensity-score matching in an earlier column here. It's easy enough, using the EHR, to figure out who has diabetes and who got which drug. But how do you know who quit smoking? Remember, everyone had a diagnosis code for tobacco use disorder prior to starting Ozempic or a comparator drug. The authors decided that if the patient had a medical visit where someone again coded tobacco use disorder, they were still smoking. If someone prescribed smoking cessation meds like a nicotine patch or varenicline, the patient was obviously still smoking. If someone billed for tobacco-cessation counseling, the patient is still smoking. We'll get back to the implications of this outcome definition in a minute. Let's talk about the results, which are pretty intriguing. When Ozempic is compared with insulin among smokers with diabetes, those on Ozempic were about 30% more likely to quit smoking. They were about 18% more likely to quit smoking than those who took metformin. They were even slightly more likely to quit smoking than those on other GLP-1 receptor agonists, though I should note that Mounjaro, which is probably the more potent GLP-1 drug in terms of weight loss, was not among the comparators. This is pretty impressive for a drug that was not designed to be a smoking cessation drug. It speaks to this emerging idea that these drugs do more than curb appetite by slowing down gastric emptying or something. They work in the brain, modulating some of the reward circuitry that keeps us locked into our bad habits. There are, of course, some caveats. As I pointed out, this study captured the idea of "still smoking" through the use of administrative codes in the EHR and prescription of smoking cessation aids. You could see similar results if taking Ozempic makes people less likely to address their smoking at all; maybe they shut down the doctor before they even talk about it, or there is too much to discuss during these visits to even get to the subject of smoking. You could also see results like this if people taking Ozempic had fewer visits overall, but the authors showed that that, at least, was not the case. I'm inclined to believe that this effect is real, simply because we keep seeing signals from multiple sources. If that turns out to be the case, these new "weight loss" drugs may prove to be much more than that; they may turn out to be the drugs that can finally save us from ourselves. Note: This article originally appeared on Medscape.
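
Because treatment choice was not randomized, the comparison leans on propensity-score matching: model each patient's probability of receiving Ozempic from baseline characteristics, pair treated patients with comparator patients who have similar scores, and then compare quit rates within the matched sample. Below is a minimal, generic Python sketch of that idea using scikit-learn; the covariates, simulated data, and greedy matching routine are illustrative placeholders, not the authors' actual pipeline.

```python
# Generic propensity-score matching sketch (not the study's code).
# Covariates, treatment assignment, and outcome here are simulated placeholders.
import numpy as np
import pandas as pd
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 1000
df = pd.DataFrame({
    "age": rng.normal(55, 10, n),
    "hba1c": rng.normal(7.5, 1.0, n),
    "treated": rng.integers(0, 2, n),        # 1 = Ozempic, 0 = comparator (toy assignment)
    "quit_smoking": rng.integers(0, 2, n),   # toy outcome
})

# 1) Model the probability of treatment given baseline covariates.
X = df[["age", "hba1c"]].to_numpy()
df["ps"] = LogisticRegression().fit(X, df["treated"]).predict_proba(X)[:, 1]

# 2) Greedy 1:1 nearest-neighbor matching on the propensity score.
treated = df[df["treated"] == 1]
control = df[df["treated"] == 0].copy()
matches = []
for _, row in treated.iterrows():
    j = (control["ps"] - row["ps"]).abs().idxmin()
    matches.append((row.name, j))
    control = control.drop(index=j)
    if control.empty:
        break

# 3) Compare outcomes within the matched pairs.
t_idx = [t for t, _ in matches]
c_idx = [c for _, c in matches]
diff = df.loc[t_idx, "quit_smoking"].mean() - df.loc[c_idx, "quit_smoking"].mean()
print(f"Matched difference in quit proportion: {diff:.3f}")
```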

  • Red Meat Tied to Increased Dementia Risk

    Higher intake of processed red meat, including bacon, hot dogs, and sausages, is associated with an elevated dementia risk, preliminary research shows. Study participants who consumed 0.25 or more servings of processed meat per day, or roughly two servings per week, had a 15% higher risk for dementia compared with those who consumed less than 0.10 serving per day, which is about three servings per month. "Our study found a higher intake of red meat — particularly processed red meat — was associated with a higher risk of developing dementia, as well as worse cognition," study author Yuhan Li, MHS, research assistant, Channing Division of Network Medicine, Brigham and Women's Hospital, Boston, told Medscape Medical News. However, the study also showed that replacing processed red meat with nuts and legumes could potentially lower this increased risk. The findings were presented on July 31 at the Alzheimer's Association International Conference (AAIC) 2024 . Inconsistent Research Previous studies have shown an inconsistent association between red meat intake and cognitive health. To assess the relationship between diet and dementia, the researchers used data from the Nurses' Health Study, which began recruiting female registered nurses aged 30-55 years in 1976, and the Health Professionals Follow-Up Study, which began recruiting male health professionals aged 40-75 in 1986. They assessed processed red meat intake by validated semi-quantitative food frequency questionnaires administered every 2-4 years. Participants were asked how often they consumed a serving of processed red meat. Investigators also assessed intake of unprocessed red meat, including beef, pork, or lamb as a main dish, in a sandwich or hamburger, or in a mixed dish. The investigators also looked at participants' intake of nuts and legumes. Dementia outcome was a composite endpoint of self-reported dementia and dementia-related death. "Specifically, participants reported a physician diagnosis of Alzheimer's disease or other forms of dementia by questionnaire. Deaths were identified through state vital statistics records, the National Death Index, family reports, and the postal system," said Li. Three Cognitive Outcomes Researchers examined three outcomes: dementia, subjective cognitive decline, and objective cognitive function. For dementia, they ascertained incident cases in 87,424 individuals in the UK's National Health Service database without Parkinson's disease or baseline dementia, stroke, or cancer. They longitudinally collected information on subjective cognitive decline from 33,908 Nurses' Health Study participants and 10,058 participants in the Health Professionals Follow-Up Study. Cognitive function was assessed using the Telephone Interview for Cognitive Status (1995-2008) in a subset of 17,458 Nurses' Health Study participants. Over a follow-up of 38 years (1980-2018), there were 6856 dementia cases in the Nurses' Health Study. Participants with processed red meat intake ≥ 0.25 serving/day, compared with < 0.10 serving/day, had 15% higher risk for dementia (hazard ratio [HR], 1.15; 95% CI, 1.08-1.23; P < .001). In addition to an increased risk for dementia , intake of processed red meat was associated with accelerated cognitive aging in global cognition (1.61 years per 1–serving/day increment; 95% CI, 0.20, 3.03) and verbal memory (1.69 years per 1–serving/day increment; 95% CI, 0.13, 3.25; both P = .03). 
Participants with processed red meat intake ≥ 0.25 serving/day had a 14% higher likelihood of subjective cognitive decline compared with those with intake < 0.10 serving/day (odds ratio [OR], 1.14; 95% CI, 1.04-1.24; P = .004). For unprocessed red meat, consuming ≥ 1.00 serving/day vs < 0.50 serving/day was associated with a 16% higher likelihood of subjective cognitive decline (OR, 1.16; 95% CI, 1.04-1.30; P = .02). Substitution Analysis Researchers modeled the effects of replacing 1 serving/day of processed red meat with 1 serving/day of nuts and legumes on cognitive outcomes. They did this by treating food intakes as continuous variables and calculating the differences in coefficients of the two food items. They found that substituting legumes and nuts was associated with a 23% lower risk for dementia (HR, 0.77; 95% CI, 0.69-0.86), 1.37 fewer years of cognitive aging (95% CI, -2.49 to -0.25), and 20% lower odds of subjective cognitive decline (OR, 0.80; 95% CI, 0.69-0.92). The research cannot determine whether it's the processing method itself or the type of red meat that affects cognition, Li cautioned. "Our study is an epidemiologic study, not a biological mechanism study, but based on our findings, red meat may be related to worse cognition, and processed red meat may add additional risk," she said. She also noted that because the study focused solely on red meats, it cannot determine the potential impact of other processed meats on cognition. Although the study doesn't address a possible mechanism linking processed red meat with cognition, Li said it's possible such meats have high levels of relatively harmful substances, such as nitrites, N-nitroso compounds, and sodium, and that "these carry the additional risk to brain health." There are currently no specific guidelines regarding the "safe" amount of processed meat consumption specifically related to cognition, said Li. The study is important because of its large sample size, long follow-up period, and inclusion of repeated measurements of diet, the investigators noted. In addition, researchers assessed both processed and unprocessed red meat and evaluated multiple cognitive outcomes. The investigators plan to assess the association between other modifiable factors and cognitive health. Experts Weigh In Commenting on the research for Medscape Medical News, Claire Sexton, DPhil, senior director of scientific programs and outreach at the Alzheimer's Association, agreed past studies on the topic have been "mixed," with only some studies reporting links between cognition or dementia and processed red meat. Another unique aspect of the study, said Sexton, was the replacement analysis showing the brain benefits of eating nuts and legumes in place of processed red meat. "So, it's not just suggesting to people what not to do, but also what they can be doing instead." That's why this large study with more than 130,000 adults that tracked individuals for close to 40 years in some cases "is so valuable," she added. In a release from the Science Media Centre in the United Kingdom, several other experts commented on the study. Among them, Kevin McConway, PhD, emeritus professor of applied statistics at the Open University in the UK, said that "it's pretty well impossible to get a clear message from the information that is available so far about this research. It is a conference paper, and all we have seen so far is a press release, a brief summary of the research, and a diagram.
There isn't a detailed, peer-reviewed research report, not yet anyway. Putting out limited information like this isn't the right way to report science." McConway also noted that the observational study recorded participants' diets and dementia diagnoses over several years without assigning specific diets. Those who ate more red processed meat had higher rates of dementia and cognitive decline. However, it's unclear if these differences are due to red meat consumption or other factors, such as diet, age, ethnicity, or location. Researchers typically adjust for these factors, but the available information doesn't specify what adjustments were made or their impact, he noted, and without detailed data, it's impossible to evaluate the study's quality. Although eating more red processed meat might increase dementia risk, more research is needed to confirm this, McConway added. Also commenting, Sebastian Walsh, an NIHR doctoral fellow who researches population-level approaches to dementia risk reduction at University of Cambridge in the UK, said that without seeing the full paper, it's difficult to know exactly what to make of the study's findings. "On the surface, this is a large and long study. But it isn't clear how the analysis was done — specifically what other factors were taken into account when looking at this apparent relationship between red meat and dementia. "Despite a lot of research looking at specific foods and different diseases, the basic public health advice that eating a healthy, balanced diet is good for health is essentially unchanged. Most people know and accept this. What is most important is to find ways of supporting people, particularly those from poorer backgrounds, to follow this advice and address the obesity epidemic," said Walsh. Note: This article originally appeared on Medscape .
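
The substitution estimate described above comes from entering both foods as continuous terms in the same model and taking the difference of their coefficients; on the log-hazard scale, exponentiating that difference gives the hazard ratio for swapping one daily serving of processed red meat for nuts and legumes. The sketch below shows only that arithmetic, using made-up coefficients rather than the fitted values from these cohorts.

```python
# Substitution arithmetic on the log-hazard scale. The coefficients below are
# illustrative assumptions, not fitted values from the study.
import math

beta_processed_meat = 0.14   # assumed log-hazard per 1 serving/day of processed red meat
beta_nuts_legumes   = -0.12  # assumed log-hazard per 1 serving/day of nuts and legumes

# Replacing one daily serving of processed meat with nuts/legumes changes the
# linear predictor by (beta_nuts_legumes - beta_processed_meat).
substitution_hr = math.exp(beta_nuts_legumes - beta_processed_meat)
print(f"Hazard ratio for the 1-serving/day substitution: {substitution_hr:.2f}")
```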

  • Almost 50% of Global Dementia Cases May Be Preventable

    PHILADELPHIA – Nearly half of dementia cases worldwide could theoretically be prevented or delayed by eliminating 14 modifiable risk factors during an individual's lifetime, according to a report from the Lancet Commission on dementia prevention, intervention, and care. The report adds two new modifiable risk factors for dementia — high cholesterol and vision loss — to the 12 risk factors identified in the 2020 Lancet Commission report, which were linked to about 40% of all dementia cases. The original Lancet Commission report, published in 2017, identified nine modifiable risk factors that were estimated to be responsible for one third of dementia cases. "Our new report reveals that there is much more that can and should be done to reduce the risk of dementia. It's never too early or too late to act, with opportunities to make an impact at any stage of life," lead author Gill Livingston, MD, from University College London, UK, said in a statement. The 57-page report was published online July 31 in The Lancet Neurology to coincide with its presentation at the Alzheimer's Association International Conference 2024. 'Compelling' New Evidence The 12 risk factors cited in the 2020 report are lower levels of education, hearing loss, hypertension, smoking, obesity, depression, physical inactivity, diabetes, excessive alcohol consumption, traumatic brain injury (TBI), air pollution, and social isolation. According to the authors of the current report, there is "new compelling evidence" that untreated vision loss and elevated low-density lipoprotein (LDL) cholesterol are also risk factors for dementia. These two added risk factors are associated with 9% of all dementia cases — with an estimated 7% of cases due to high LDL cholesterol from about age 40 years, and 2% of cases due to untreated vision loss in later life, the authors said. Out of all 14 risk factors, those tied to the greatest proportion of dementia in the global population are hearing impairment and high LDL cholesterol (7% each), along with less education in early life, and social isolation in later life (5% each), the report estimates. The new report also outlines 13 recommendations aimed at individuals and governments to help guard against dementia. They include preventing and treating hearing loss, vision loss, and depression; being cognitively active throughout life; using head protection in contact sports; reducing vascular risk factors (high cholesterol, diabetes, obesity, hypertension); improving air quality; and providing supportive community environments to increase social contact. Tara Spires-Jones, PhD, president of the British Neuroscience Association, emphasized that while this research doesn't directly link specific factors to dementia, it supports evidence that a healthy lifestyle — encompassing education, social activities, exercise, cognitive engagement, and avoiding head injuries and harmful factors for heart and lung health — can enhance brain resilience and prevent dementia. In an interview with Medscape Medical News, Heather M. Snyder, PhD, senior vice president of medical and scientific relations, Alzheimer's Association, said, "Our brains are complex and what happens throughout our lives may increase or decrease our risk for dementia as we age. Protecting brain health as we age requires a comprehensive approach that includes discussions on diet, exercise, heart health, hearing, and vision."
Also weighing in on the new report, Shaheen Lakhan, MD, PhD, neurologist and researcher based in Miami, Florida, said the addition of high cholesterol is "particularly noteworthy as it reinforces the intricate connection between vascular health and brain health — a link we've long suspected but can now target more effectively." As for vision loss, "it's not just a matter of seeing clearly; it's a matter of thinking clearly. Untreated vision loss can lead to social isolation, reduced physical activity, and cognitive decline," said Lakhan. Dementia Is Not Inevitable In his view, "the potential to prevent or delay nearly half of dementia cases by addressing these risk factors is nothing short of revolutionary. It shifts our perspective from viewing dementia as an inevitable part of aging to seeing it as a condition we can actively work to prevent," Lakhan added. He said the report's emphasis on health equity is also important. "Dementia risk factors disproportionately affect socioeconomically disadvantaged groups and low- and middle-income countries. Addressing these disparities isn't just a matter of fairness in the fight against dementia, equality in prevention is as important as equality in treatment," Lakhan commented. While the report offers hope, it also presents a challenge, he said. Implementing the recommended preventive measures requires a "coordinated effort from individuals, healthcare systems, and policymakers. The potential benefits, both in terms of quality of life and economic savings, make this effort not just worthwhile but imperative. Preventing dementia is not just a medical imperative — it's an economic and humanitarian one," Lakhan said. Masud Husain, PhD, with University of Oxford, UK, agreed. The conclusions in this report are "very important for all of us, but particularly for health policy makers and government," he told the Science Media Centre. "If we did simple things well such as screening for some of the factors identified in this report, with adequate resources to perform this, we have the potential to prevent dementia on a national scale. This would be far more cost effective than developing high-tech treatments, which so far have been disappointing in their impacts on people with established dementia," Husain said. Note: This article originally appeared on Medscape .
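
Individual population attributable fractions such as those quoted above (7% each for hearing impairment and high LDL cholesterol, 5% each for less education and social isolation) cannot simply be summed, because the risk factors overlap. A common textbook way to combine them multiplies the complements, as in the sketch below; this is a generic illustration, not the Commission's own calculation, which additionally weights for communality between factors.

```python
# Textbook combination of independent population attributable fractions (PAFs).
# Generic illustration only; not the Lancet Commission's exact method.
pafs = {
    "hearing impairment": 0.07,
    "high LDL cholesterol": 0.07,
    "less education": 0.05,
    "social isolation": 0.05,
}

combined_complement = 1.0
for paf in pafs.values():
    combined_complement *= (1 - paf)
combined_paf = 1 - combined_complement
print(f"Combined PAF for these four factors: {combined_paf:.1%}")
```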

  • An Effective Nondrug Approach to Improve Sleep in Dementia, Phase 3 Data Show

    A multicomponent nonpharmaceutical intervention improves sleep in people with dementia living at home, early results of a new phase 3 randomized controlled trial (RCT) show. The benefits of the intervention — called DREAMS-START — were sustained at 8 months and extended to caregivers, the study found. "We're pleased with our results. We think that we were able to deliver it successfully and to a high rate of fidelity," said study investigator Penny Rapaport, PhD, Division of Psychiatry, University College London, United Kingdom. The findings were presented on July 30 at the Alzheimer's Association International Conference (AAIC) 2024.

    Sustained, Long-Term Effect

    Sleep disturbances are very common in dementia. About 26% of people with all types of dementia will experience sleep disturbances, and that rate is higher in certain dementia subtypes, such as dementia with Lewy bodies, said Rapaport. Such disturbances are distressing for people living with dementia as well as for those supporting them, she added. They're "often the thing that will lead to people transitioning and moving into a care home." Rapaport noted there has not been full RCT evidence that any nonpharmacologic interventions or light-based treatments are effective in improving sleep disturbances. Medications such as antipsychotics and benzodiazepines aren't recommended as first-line treatment in people with dementia "because often these can be harmful," she said.

    The study recruited 377 dyads of people living with dementia (mean age, 79.4 years) and their caregivers from 12 national health service sites across England. "We were able to recruit an ethnically diverse sample from a broad socioeconomic background," said Rapaport. Researchers allocated the dyads to the intervention group or to a treatment-as-usual group. About 92% of participants were included in the intention-to-treat analysis at 8 months, which was the primary time point. The intervention consists of six 1-hour interactive sessions that are "personalized and tailored to individual goals and needs," said Rapaport. It was delivered by supervised, trained graduates, not clinicians. The sessions focused on components of sleep hygiene (healthy habits, behaviors, and environments); activity and exercise; a tailored sleep routine; strategies to manage distress; natural and artificial light; and relaxation. A whole session was devoted to supporting the sleep of caregivers.

    The trial included masked outcome assessments, "so the people collecting the data were blinded to the intervention group," said Rapaport. The primary outcome was the Sleep Disorders Inventory (SDI) score. The SDI is a questionnaire, completed by caregivers, about the frequency and severity of sleep-disturbed behaviors; a higher score indicates a worse outcome. The analysis adjusted for baseline SDI score and study site. The adjusted mean difference between groups on the SDI was -4.7 points (95% CI, -7.65 to -1.74; P = .002) at 8 months. The minimal clinically important difference on the SDI is a 4-point change, noted Rapaport. The adjusted mean difference on the SDI at 4 months (a secondary outcome) was -4.4 points (95% CI, -7.3 to -1.5; P = .003). (An illustrative sketch of this kind of baseline-adjusted comparison appears after this article.) Referring to illustrative graphs, Rapaport said that SDI scores decreased at both 4 and 8 months. "You can see statistically, there's a significant difference between groups at both time points," she said. "We saw a sustained effect, so not just immediately after the intervention, but afterwards at 8 months."
    As for other secondary outcomes, the study found a significant reduction in neuropsychiatric symptoms among people with dementia at 8 months in the intervention arm relative to the control arm. In addition, sleep and anxiety significantly improved among caregivers after 8 months. This shows "a picture of things getting better for the person with dementia, and the person who's caring for them," said Rapaport. She noted the good adherence rate, with almost 83% of people in the intervention arm completing four or more sessions. Fidelity to the intervention (ie, the extent to which it is implemented as intended) was also high, "so we feel it was delivered well," said Rapaport. Researchers also carried out a health economics analysis and looked at strategies for implementation of the program, but Rapaport did not discuss those results.

    Encouraging Findings

    Commenting for Medscape Medical News, Alex Bahar-Fuchs, PhD, Faculty of Health, School of Psychology, Deakin University, Victoria, Australia, who co-chaired the session featuring the research, said the findings of this "well-powered" RCT are "encouraging," both for the primary outcome of sleep quality and for some of the secondary outcomes for the care-partner. "The study adds to the growing evidence behind several nonpharmacological treatment approaches for cognitive and neuropsychiatric symptoms of people with dementia," he said. The results "offer some hope for the treatment of a common disturbance in people with dementia which is associated with poorer outcomes and increased caregiver burden," he added. An important area for further work would be to incorporate more objective measures of sleep quality, said Bahar-Fuchs. Because the primary outcome was measured using a self-report questionnaire (the SDI) completed by care-partners, and because the intervention arm could not be blinded, "it remains possible that some detection bias may have affected the study findings," said Bahar-Fuchs. He said he would like to see the research extended to include an active control condition "to be able to better ascertain treatment mechanisms."

    Note: This article originally appeared on Medscape.
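    For readers curious about how an "adjusted mean difference" such as the -4.7-point SDI result above is typically obtained, here is a minimal, hypothetical ANCOVA-style sketch in Python. The data file and column names are invented for illustration; this is not the trial's actual analysis code.

    # Minimal sketch of a baseline-adjusted comparison of the kind reported above:
    # 8-month SDI score regressed on treatment arm, baseline SDI, and study site.
    import pandas as pd
    import statsmodels.formula.api as smf

    df = pd.read_csv("dreams_start_outcomes.csv")  # hypothetical file
    # Expected columns: sdi_8m, sdi_baseline, arm ("intervention"/"control"), site

    model = smf.ols("sdi_8m ~ C(arm) + sdi_baseline + C(site)", data=df).fit()

    # The coefficient on the treatment indicator is the adjusted mean difference
    # between arms; its confidence interval and p-value come from the same fit.
    term = "C(arm)[T.intervention]"
    print(model.params[term])          # adjusted mean difference
    print(model.conf_int().loc[term])  # 95% CI
    print(model.pvalues[term])         # P value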

  • Mental Health and Diabetic Complications: A Two-Way Link

    TOPLINE:

    Mental health disorders increase the likelihood of developing chronic diabetic complications and vice versa across all age groups in patients with type 1 diabetes (T1D) or type 2 diabetes (T2D).

    METHODOLOGY:

    Understanding the relative timing of, and association between, chronic diabetic complications and mental health disorders may aid in improving diabetes screening and care. Researchers used a US national healthcare claims database (data obtained from 2001 to 2018) to analyze individuals with and without T1D or T2D who had no prior mental health disorder or chronic diabetic complication. The onset and presence of chronic diabetic complications and mental health disorders were identified to determine their possible association. Individuals were stratified by age: 0-19, 20-39, 40-59, and ≥ 60 years.

    TAKEAWAY:

    Researchers analyzed 44,735 patients with T1D (47.5% women) and 152,187 with T2D (46.0% women), who were matched with 356,630 individuals without diabetes (51.8% women). The presence of chronic diabetic complications increased the risk for a mental health disorder across all age groups, with the highest risk seen in patients aged ≥ 60 years (hazard ratio [HR], 2.9). Similarly, diagnosis of a mental health disorder increased the risk for chronic diabetic complications across all age groups, with the highest risk seen in patients aged 0-19 years (HR, 2.5). Patients with T2D had a significantly higher risk for a mental health disorder and a lower risk for chronic diabetic complications than those with T1D across all age groups, except those aged ≥ 60 years. The bidirectional association between mental health disorders and chronic diabetic complications was not affected by diabetes type (P > .05 for all interactions).

    IN PRACTICE:

    "Clinicians and healthcare systems likely need to increase their focus on MHDs [mental health disorders], and innovative models of care are required to optimize care for both individuals with type 1 diabetes and those with type 2 diabetes," the authors wrote.

    LIMITATIONS:

    The study relied on International Classification of Diseases, 9th and 10th revision, codes, which might have led to misclassification of mental health conditions, chronic diabetic complications, and diabetes type. The data did not capture symptom onset or severity. The findings may not be generalizable to populations outside the United States.

    DISCLOSURES:

    The study was supported by the Juvenile Diabetes Research Foundation (now Breakthrough T1D). Some authors reported receiving speaker or expert testimony honoraria and research support, and some declared serving on medical or digital advisory boards or as consultants for various pharmaceutical and medical device companies.

    Note: This article originally appeared on Medscape.
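    As a rough illustration of how hazard ratios like those in the summary above are estimated, here is a minimal Cox proportional hazards sketch in Python using the lifelines package. The data file and column names are hypothetical, the exposure is treated as fixed at baseline for simplicity (the study tracked the onset of complications and mental health disorders over time), and this is not the study's analysis code.

    # Minimal sketch: hazard of a mental health disorder (MHD) given a chronic
    # diabetic complication, within one age stratum. Data and columns are hypothetical.
    import pandas as pd
    from lifelines import CoxPHFitter

    df = pd.read_csv("diabetes_cohort.csv")  # hypothetical file
    # Expected columns: followup_years, mhd_event (1/0), complication (1/0), age_group

    subset = df[df["age_group"] == "60_plus"]  # fit within one age stratum
    cph = CoxPHFitter()
    cph.fit(
        subset[["followup_years", "mhd_event", "complication"]],
        duration_col="followup_years",
        event_col="mhd_event",
    )
    print(cph.hazard_ratios_["complication"])  # e.g., a value near 2.9 would match the reported HR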

  • Common Antidepressants Ranked by Potential for Weight Gain

    Eight commonly used antidepressants have been ranked by their weight gain potential. Results of a large observational study showed small differences in short- and long-term weight change in patients prescribed one of eight antidepressants, with bupropion associated with the lowest weight gain and escitalopram, paroxetine, and duloxetine associated with the greatest. Escitalopram, paroxetine, and duloxetine users were 10%-15% more likely to gain at least 5% of their baseline weight compared with those taking sertraline, which was used as a comparator. (An illustrative back-of-the-envelope version of this comparison appears after this article.) Investigators noted that the more clinicians and patients know about how a particular antidepressant may affect patients’ weight, the better informed they can be about which antidepressants to prescribe. “Patients and their clinicians often have several options when starting an antidepressant for the first time. This study provides important real-world evidence regarding the amount of weight gain that should be expected after starting some of the most common antidepressants,” lead author Joshua Petimar, ScD, assistant professor of population medicine in the Harvard Pilgrim Health Care Institute at Harvard Medical School, Boston, said in a press release. The findings were published online in Annals of Internal Medicine.

    Real-World Data

    Though weight gain is a commonly reported side effect of antidepressant use and may lead to medication nonadherence and worse outcomes, there has been a lack of real-world data about weight change across specific medications. Investigators used electronic health records from eight health care systems across the United States spanning 2010 to 2019. The analysis included information on 183,118 adults aged 20-80 years who were new users of one of eight common first-line antidepressants: sertraline, citalopram, escitalopram, fluoxetine, paroxetine, bupropion, duloxetine, or venlafaxine. Investigators measured weight at baseline and at 6, 12, and 24 months after initiation to estimate intention-to-treat (ITT) effects on weight change. The most commonly prescribed antidepressants were sertraline, citalopram, and bupropion. Approximately 36% of participants had a diagnosis of depression, and 39% were diagnosed with anxiety. Among selective serotonin reuptake inhibitors (SSRIs), escitalopram and paroxetine were associated with the greatest 6-month weight gain, whereas bupropion was associated with the least weight gain across all analyses. Using sertraline as a comparator, 6-month weight change was lower for bupropion (difference, 0.22 kg) and higher for escitalopram (difference, 0.41 kg), duloxetine (difference, 0.34 kg), paroxetine (difference, 0.37 kg), and venlafaxine (difference, 0.17 kg). Investigators noted little difference in adherence levels between medications during the study except at 6 months, when adherence was higher for those who took bupropion (41%) than for those taking other antidepressants (28%-36%). The study included data only on prescriptions, and investigators could not verify whether the medications were dispensed or taken as prescribed. Other limitations included missing weight information because most patients did not encounter the health system at exactly 6, 12, and 24 months; only 15%-30% had weight measurements in those months.
    Finally, the low adherence rates made it difficult to attribute relative weight change at the 12- and 24-month time points to the specific medications of interest. “Clinicians and patients could consider these differences when making decisions about specific antidepressants, especially given the complex relationships of obesity and depression with health, quality of life, and stigma,” the authors wrote. The study was funded by the National Institute of Diabetes and Digestive and Kidney Diseases. Disclosures are noted in the original article.

    Note: This article originally appeared on MDedge.
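    To make the "10%-15% more likely to gain at least 5% of baseline weight" comparison concrete, here is a minimal, hypothetical Python sketch. The file and column names are invented, and the published analysis used more sophisticated methods (intention-to-treat estimation in an observational cohort with adjustment for confounding), so this is only a simplified illustration of the calculation.

    # Minimal sketch: proportion of users gaining >= 5% of baseline weight at
    # 6 months, per drug, relative to sertraline. Data and columns are hypothetical.
    import pandas as pd

    df = pd.read_csv("antidepressant_weights.csv")  # hypothetical file
    # Expected columns: drug, weight_baseline_kg, weight_6m_kg

    df["gained_5pct"] = df["weight_6m_kg"] >= 1.05 * df["weight_baseline_kg"]

    risk_by_drug = df.groupby("drug")["gained_5pct"].mean()
    relative_risk = risk_by_drug / risk_by_drug["sertraline"]  # sertraline as the comparator
    print(relative_risk.sort_values())  # values near 1.10-1.15 correspond to "10%-15% more likely"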
