Archives
Meta-Analysis: Long Half-Life Phosphodiesterase Inhibitors Reduce HbA1c in Adults with Elevated Baseline Levels
6 Jan, 2025 | 08:00h | UTC
Background: Phosphodiesterase type 5 (PDE5) inhibitors are traditionally used to treat erectile dysfunction and pulmonary arterial hypertension. Recent evidence suggests that PDE5 inhibitors could also be repurposed to lower hemoglobin A1c (HbA1c) in patients with type 2 diabetes. Given the disparity in half-lives among these agents, this meta-analysis focused on whether longer half-life PDE5 inhibitors (tadalafil, PF-00489791) produce a more sustained HbA1c reduction compared to short half-life PDE5 inhibitors (sildenafil, avanafil).
Objective: To evaluate the effect of PDE5 inhibitors on HbA1c levels in individuals with baseline values above 6%, comparing agents with short and long half-lives to assess differential clinical benefits in glycemic control.
Methods: This systematic review and meta-analysis included only randomized controlled trials (RCTs) in which participants received any PDE5 inhibitor for at least four weeks, compared against placebo or another control. Major databases (Cochrane CENTRAL, PubMed Central, ClinicalTrials.gov, and WHO ICTRP) were searched through September 2024 without language restrictions. Statistical analyses used a random-effects model, reporting mean differences in HbA1c. Secondary outcomes (HOMA-IR, lipid profiles, fasting glucose, and others) were also explored.
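For readers unfamiliar with the pooling step, the sketch below shows how a random-effects (DerSimonian–Laird) meta-analysis combines per-trial mean differences. The trial values are hypothetical placeholders, not the data from this review.

```python
import numpy as np
from scipy import stats

# Hypothetical per-trial mean differences in HbA1c (%) and standard errors;
# illustrative placeholders only, not the trials included in this review.
yi = np.array([-0.55, -0.30, -0.45, -0.20, -0.60])
se = np.array([0.20, 0.15, 0.25, 0.18, 0.22])

wi = 1 / se**2                                   # inverse-variance (fixed-effect) weights
y_fe = np.sum(wi * yi) / np.sum(wi)              # fixed-effect pooled estimate
Q = np.sum(wi * (yi - y_fe) ** 2)                # Cochran's Q heterogeneity statistic
tau2 = max(0.0, (Q - (len(yi) - 1)) / (np.sum(wi) - np.sum(wi**2) / np.sum(wi)))

wi_re = 1 / (se**2 + tau2)                       # random-effects weights
y_re = np.sum(wi_re * yi) / np.sum(wi_re)        # pooled mean difference
se_re = np.sqrt(1 / np.sum(wi_re))
z = stats.norm.ppf(0.975)
print(f"pooled MD = {y_re:.2f}%, 95% CI [{y_re - z*se_re:.2f}, {y_re + z*se_re:.2f}]")
```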
Results: Thirteen RCTs were eligible (N=1083). Long half-life agents showed a significant mean reduction of approximately −0.40% in HbA1c (p=0.002) in the overall analysis, whereas short half-life PDE5 inhibitors exhibited no significant change. In more stringent subgroup analyses (≥8 weeks’ duration, exclusive type 2 diabetes, baseline HbA1c ≥6.5%), long half-life PDE5 inhibitors maintained a significant decrease (−0.50%), while short half-life agents paradoxically showed a slight but significant increase (+0.36%, p=0.03). In trials enrolling patients with poorly controlled diabetes (baseline HbA1c near 10%), tadalafil’s HbA1c reductions were considerably larger, aligning with the efficacy of other standard oral antidiabetic medications.
Conclusions: Long half-life PDE5 inhibitors appear to confer meaningful reductions in HbA1c, comparable to established oral antidiabetic agents, particularly in patients whose HbA1c is inadequately controlled. In contrast, short half-life PDE5 inhibitors did not show a consistent benefit and may paradoxically raise HbA1c in certain subgroups, although further large-scale studies are warranted to confirm these findings.
Implications for Practice: Long half-life PDE5 inhibitors could serve as an adjunctive therapy in type 2 diabetes management, especially in individuals with higher baseline HbA1c. Yet, caution is advised given limited data on adverse events and the short duration of most included trials. Physicians should remain prudent until more robust evidence, especially in populations with markedly elevated HbA1c, becomes available.
Study Strengths and Limitations: Strengths include a direct comparison between short and long half-life PDE5 inhibitors in a clinically relevant population, plus systematic subgroup analyses. Limitations involve heterogeneity in trial designs, relatively low baseline HbA1c in most participants, and a lack of long-term follow-up data or major clinical endpoints.
Future Research: Subsequent trials should target populations with poorly controlled diabetes (HbA1c ≥9.0%) and assess longer durations (≥3 months) to capture the full impact of PDE5 inhibitor therapy. A deeper examination of combination regimens, pharmacokinetic optimization, and clinical outcomes like cardiovascular events would further clarify the role of these agents in diabetes care.
Reference: Kim J, Zhao R, Kleinberg LR, Kim K. (2025) “Effect of long and short half-life PDE5 inhibitors on HbA1c levels: a systematic review and meta-analysis.” eClinicalMedicine, 80, 103035. DOI: http://doi.org/10.1016/j.eclinm.2024.103035
AGA Clinical Practice Update on Managing Portal Vein Thrombosis in Cirrhotic Patients: Expert Review
3 Jan, 2025 | 10:00h | UTC
Introduction: This summary highlights key recommendations from an AGA expert review on portal vein thrombosis (PVT) in cirrhotic patients. PVT is common in cirrhosis, with an estimated five-year incidence of around 11%, and may worsen portal hypertension and elevate mortality. Management is challenging because of limited evidence, the potential complications of both PVT and anticoagulation, and significant heterogeneity regarding clot characteristics, host factors, and cirrhosis severity. This review presents the latest guidance on identifying clinically relevant PVT, selecting anticoagulation, and considering endovascular interventions, including TIPS (transjugular intrahepatic portosystemic shunt).
Key Recommendations:
- No Routine Screening: Asymptomatic patients with compensated cirrhosis do not require regular screening for PVT in the absence of suggestive clinical changes.
- Imaging Confirmation: When Doppler ultrasound reveals suspected PVT, contrast-enhanced CT or MRI is recommended to confirm the diagnosis, exclude malignancy, and characterize clot extent and occlusion.
- Hypercoagulability Testing: Extensive thrombophilia workup is not indicated unless there is family or personal history of thrombotic events, or associated laboratory abnormalities.
- Intestinal Ischemia Management: Patients who develop PVT with evidence of intestinal ischemia should receive prompt anticoagulation and, ideally, multidisciplinary team care involving gastroenterology, hepatology, interventional radiology, hematology, and surgery.
- Observation of Minor or Recent Thrombi: In cirrhotic patients without ischemia, with recent (<6 months) thrombi that are <50% occlusive, close imaging follow-up every three months is a reasonable option to track potential spontaneous clot regression.
- Anticoagulation for Significant PVT: Consider anticoagulation for more extensive or obstructive (>50%) recent PVT, especially if the main portal vein or mesenteric vessels are involved. Candidates for liver transplantation and those with inherited thrombophilia may derive additional benefit.
- Chronic Cavernous PVT: Anticoagulation is generally not advised in patients with long-standing (>6 months) complete occlusion and well-formed collateral channels.
- Variceal Screening: Perform endoscopic screening or ensure prophylaxis for varices. Avoid delays in initiating anticoagulation, as timeliness is essential for better recanalization outcomes.
- Choice of Anticoagulant: Vitamin K antagonists, low-molecular-weight heparin, and direct oral anticoagulants (DOACs) are all viable options in cirrhosis. DOACs may be appropriate in well-compensated (Child-Turcotte-Pugh class A or certain class B) cirrhosis but should be avoided in class C. Treatment selection should consider patient preferences, monitoring feasibility, and risk of bleeding.
- Duration of Therapy: Reassess clot status with cross-sectional imaging every three months. Continue anticoagulation for transplant-eligible individuals who show partial or complete recanalization, and consider discontinuation in nonresponders after six months if futility is evident.
- TIPS Revascularization: Portal vein revascularization using TIPS may be pursued in patients who have other TIPS indications (like refractory ascites or variceal bleeding) or to improve transplant feasibility by recanalizing portal flow.
Conclusion: PVT in cirrhosis remains a complex clinical issue requiring careful evaluation of clot extent, timing, and the potential need for transplantation. The recommendations presented here underscore prompt imaging, timely anticoagulation for high-risk thrombi, and individualized therapy based on Child-Turcotte-Pugh classification and bleeding risk. When necessary, multidisciplinary collaboration is key to achieving optimal patient outcomes. Prospective randomized trials and standardized classifications of PVT will be instrumental in refining future guidelines.
Reference:
Davis JPE, Lim JK, Francis FF, Ahn J. AGA Clinical Practice Update on Management of Portal Vein Thrombosis in Patients With Cirrhosis: Expert Review. Gastroenterology. 2024. DOI: http://doi.org/10.1053/j.gastro.2024.10.038
Cohort Study: Higher Telehealth Use Linked to Lower Rates of Select Low-Value Services in Medicare
3 Jan, 2025 | 09:30h | UTC
Background: Telehealth has rapidly expanded in recent years, potentially transforming how primary care is delivered. However, questions remain regarding its impact on low-value services—tests or procedures that confer minimal benefit and might be wasteful. Previous research raised concerns that virtual encounters could either reduce or increase unnecessary care, but rigorous data on this matter have been limited.
Objective: To assess whether a primary care practice’s adoption of telehealth is associated with changes in the rate of eight established low-value services, comprising office-based procedures, laboratory tests, imaging studies, and mixed-modality interventions.
Methods: This retrospective cohort study used Medicare fee-for-service claims from 2019 through 2022 for 577,928 beneficiaries attributed to 2,552 primary care practices in Michigan. Practices were grouped into low, medium, or high tertiles of telehealth volume in 2022. A difference-in-differences analysis compared changes in annualized low-value service rates between the prepandemic (2019) and postpandemic (2022) periods.
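As a concrete illustration of the difference-in-differences logic, the sketch below contrasts pre/post changes in a high-telehealth group against a low-telehealth comparison group. The cell rates are invented for the sketch; only the resulting estimate is chosen to mirror the reported −2.9 per 1,000 for cervical cancer screening.

```python
# Illustrative 2x2 difference-in-differences on annualized low-value service
# rates (services per 1,000 beneficiaries). Cell values are made up; only the
# resulting -2.9 estimate mirrors the published cervical-screening result.
rate = {
    ("high_telehealth", 2019): 21.0, ("high_telehealth", 2022): 18.0,
    ("low_telehealth", 2019): 20.5, ("low_telehealth", 2022): 20.4,
}
change_high = rate[("high_telehealth", 2022)] - rate[("high_telehealth", 2019)]
change_low = rate[("low_telehealth", 2022)] - rate[("low_telehealth", 2019)]
did = change_high - change_low  # extra change attributable to high telehealth use
print(f"DiD estimate = {did:.1f} services per 1,000 beneficiaries")  # -2.9
```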
Results: Overall, high-telehealth practices demonstrated reduced rates of certain office-based low-value services, specifically cervical cancer screening among older women (−2.9 services per 1000 beneficiaries, 95% CI −5.3 to −0.4). Additionally, high-telehealth practices showed lower rates of select low-value thyroid tests (−40 per 1000 beneficiaries, 95% CI −70 to −9). For five other measures—including imaging for low back pain, imaging for uncomplicated headache, and PSA tests in older men—no significant association was observed between greater telehealth use and low-value service rates. Notably, telehealth volume increased markedly from 2019 to 2022, while in-person visits generally decreased.
Conclusions: These findings suggest that widespread telehealth adoption in Michigan primary care was not associated with elevated low-value service use. In fact, certain office-based low-value tests appeared to decline, possibly owing to fewer face-to-face opportunities to perform unnecessary interventions. Nonetheless, caution is warranted in generalizing these findings, as telehealth’s effects may vary across different clinical contexts.
Implications for Practice: Health care systems should consider structured telehealth protocols that encourage judicious testing and minimize overuse. While telehealth can broaden access, clinicians must remain vigilant to avoid missing necessary care. Clear guidelines, effective triage, and patient education might help balance convenience with quality.
Study Strengths and Limitations: Strengths include a large Medicare population and established low-value service metrics, enhancing the study’s validity. Limitations include a single-state focus (Michigan) and reliance on claims data without detailed clinical information, restricting the scope of outcomes assessed.
Future Research: Further investigation is needed to verify whether these trends extend to other states, different insurance models, and additional low-value services (including medications). Evaluations of telehealth’s role in both low-value and high-value care could offer deeper insights into its broader effects on cost and quality.
Reference: Liu T, Zhu Z, Thompson MP, et al. Primary Care Practice Telehealth Use and Low-Value Care Services. JAMA Network Open. 2024;7(11):e2445436. DOI: http://doi.org/10.1001/jamanetworkopen.2024.45436
RCT: Chlorthalidone Shows No Renal Advantage Over Hydrochlorothiazide Under Equivalent Dosing in Older Adults With Hypertension
3 Jan, 2025 | 09:00h | UTC
Background: Hypertension is a critical factor in chronic kidney disease (CKD) progression and cardiovascular risk. Thiazide-type diuretics, such as chlorthalidone and hydrochlorothiazide, are first-line antihypertensive treatments. However, whether one agent confers stronger renal protection remains contested, especially at doses considered pharmacologically comparable. Prior observational studies suggested potential discrepancies in kidney outcomes and hypokalemia incidence. This secondary analysis of the Diuretic Comparison Project (DCP) further clarifies the comparative effectiveness of chlorthalidone versus hydrochlorothiazide on renal endpoints.
Objective: To evaluate whether chlorthalidone (12.5–25 mg/day) prevents CKD progression more effectively than hydrochlorothiazide (25–50 mg/day) in adults ≥65 years with hypertension and no pre-specified exclusion by renal function.
Methods: The DCP is a pragmatic, open-label randomized clinical trial embedded in Veterans Affairs (VA) facilities across the United States. Between June 1, 2016, and December 31, 2023, patients already receiving hydrochlorothiazide (25 or 50 mg/day) for hypertension were randomized either to continue that medication or switch to chlorthalidone (12.5–25 mg/day), reflecting equivalent potency.
The prespecified primary kidney outcome was a composite of doubling of serum creatinine, a terminal estimated glomerular filtration rate (eGFR) <15 mL/min, or dialysis initiation. Secondary measures included ≥40% eGFR decline, incident CKD (new eGFR <60 mL/min), eGFR slope, and relevant adverse events. Laboratory data were obtained through usual clinical care rather than protocol-driven testing.
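To make the composite definition concrete, here is a minimal sketch of how a per-patient endpoint flag could be derived. The function and field names are hypothetical, not the DCP analysis code.

```python
# Hedged sketch of the composite kidney endpoint described above; names are
# hypothetical and this is not the trial's actual analysis code.
def composite_kidney_endpoint(baseline_cr: float, max_follow_cr: float,
                              final_egfr: float, started_dialysis: bool) -> bool:
    doubled_creatinine = max_follow_cr >= 2 * baseline_cr
    terminal_egfr = final_egfr < 15  # eGFR < 15 mL/min
    return doubled_creatinine or terminal_egfr or started_dialysis

print(composite_kidney_endpoint(1.1, 2.3, 42.0, False))  # True: creatinine doubled
print(composite_kidney_endpoint(1.0, 1.4, 55.0, False))  # False: no criterion met
```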
Results: Among 13,523 randomized participants, 12,265 had analyzable renal data (mean [SD] age, 71 [4] years; 96.8% male). The mean (SD) follow-up was 3.9 (1.3) years. Chlorthalidone did not demonstrate superiority over hydrochlorothiazide for the composite kidney endpoint (6.0% vs 6.4%; hazard ratio, 0.94; 95% CI, 0.81–1.08; P=.37). Additional analyses showed no differences in CKD incidence, ≥40% eGFR decline, or eGFR slope. Hypokalemia occurred more frequently in chlorthalidone users (overall ~2% higher rate of low potassium measurements), and hospitalizations for hypokalemia also trended higher.
Conclusions: Under dosing regimens designed to achieve equivalent antihypertensive potency, chlorthalidone provided no measurable renal benefit over hydrochlorothiazide but posed a modestly elevated risk of hypokalemia. These findings reinforce the clinical interchangeability of both agents for long-term blood pressure management in older adults, provided serum potassium is monitored.
Implications for Practice: Clinicians can confidently employ either chlorthalidone or hydrochlorothiazide in older patients with hypertension, including those with mild or moderate CKD, since renal deterioration rates did not differ significantly. Importantly, the trial used half the milligram amount of chlorthalidone (12.5–25 mg/day) to match the usual doses of hydrochlorothiazide (25–50 mg/day). Recognizing this equivalence helps guide therapy transitions and dosing decisions. Vigilant monitoring of electrolytes remains essential, particularly when prescribing chlorthalidone, given the slightly higher incidence of hypokalemia.
Study Strengths and Limitations: Strengths include the randomized design, broad participant inclusion, and pragmatic structure that mirrors real-world prescribing. Limitations involve potential underestimation or overestimation of renal events due to reliance on routine (rather than scheduled) lab tests. Also, nearly all participants had prior hydrochlorothiazide exposure, which may have influenced tolerance and adherence patterns.
Future Research: Further clinical trials focusing on more advanced CKD stages, distinct comorbidities, or combination regimens (e.g., with potassium-sparing agents) would expand our understanding of how thiazide-type diuretics influence long-term kidney outcomes. Extended follow-up or additional subgroup analyses could also shed light on the interplay of dose-response effects in highly vulnerable populations.
Reference: Ishani A, Hau C, Raju S, et al. “Chlorthalidone vs Hydrochlorothiazide and Kidney Outcomes in Patients With Hypertension: A Secondary Analysis of a Randomized Clinical Trial.” JAMA Netw Open. 2024;7(12):e2449576. DOI: http://doi.org/10.1001/jamanetworkopen.2024.49576
Cohort Study: One in Four Patients Demonstrates Covert Cognition Despite Behavioral Unresponsiveness
3 Jan, 2025 | 08:30h | UTC
Background: Cognitive motor dissociation (CMD) refers to the presence of specific neuroimaging or electrophysiological responses to commands in patients otherwise incapable of voluntary behavioral output. Detecting CMD is clinically relevant because its underdiagnosis may lead to premature decisions regarding goals of care, life-sustaining treatment, and rehabilitation efforts. Although several single-center studies have suggested that CMD may exist in 10–20% of patients with disorders of consciousness, larger multinational data were lacking, particularly using both functional magnetic resonance imaging (fMRI) and electroencephalography (EEG).
Objective: To determine how often CMD occurs in a large, multinational cohort of adults with impaired consciousness and to evaluate the clinical variables potentially associated with this phenomenon.
Methods: This prospective cohort study included 353 adults with disorders of consciousness recruited from six international centers between 2006 and 2023. Enrolled participants had at least one behavioral assessment using the Coma Recovery Scale–Revised (CRS-R) and underwent task-based fMRI, EEG, or both. Sites utilized validated analytic pipelines and automated data processing to minimize false positives. Participants were divided into two groups: those without observable responses to verbal commands (coma, vegetative state, or minimally conscious state–minus) and those with observable responses (minimally conscious state–plus or emerged). CMD was defined as the absence of any observable behavioral response to commands, combined with a positive command-following signal on fMRI or EEG.
Results: Among 241 participants with no overt command-following, 25% showed CMD through either fMRI alone, EEG alone, or both. CMD was more common in younger patients, those assessed later after injury, and those with traumatic brain injury. Interestingly, in 112 participants who did exhibit command-following on bedside exams, only 38% demonstrated confirmatory responses on fMRI or EEG. These findings support the notion that the tasks used for neuroimaging and electrophysiological assessments may require more sustained cognitive engagement than typical bedside evaluations.
Conclusions: CMD was identified in about one in four patients who lacked behavioral command-following. Combining fMRI with EEG likely increases detection rates compared to either modality alone. The results highlight the need for increased awareness of covert cognitive activity in this population, given potential ramifications for prognosis, family counseling, and clinical care.
Implications for Practice: Clinicians should consider the possibility of CMD in patients who appear unresponsive at the bedside. When feasible, employing both fMRI and EEG might reveal hidden cognitive capacities that can guide patient-centered decisions, encourage targeted therapies, and allow healthcare teams to respect potential consciousness and autonomy. However, such technologies remain limited to specialized centers.
Study Strengths and Limitations: Strengths include a diverse sample from multiple international sites and the integration of two complementary neurodiagnostic techniques. Limitations involve heterogeneous recruitment practices, variations in local data acquisition methods, and potential selection biases toward patients who survived until advanced testing was available. Additionally, the absence of standardized paradigms across sites reduced consistency of results.
Future Research: Further large-scale investigations should standardize fMRI and EEG protocols and determine whether earlier and more consistent identification of CMD affects functional outcomes. Efforts to refine and validate automated analytic pipelines could facilitate widespread adoption of these techniques in routine clinical settings.
Reference: Bodien YG, Allanson J, Cardone P, et al. Cognitive Motor Dissociation in Disorders of Consciousness. New England Journal of Medicine. 2024;391:598-608. DOI: http://doi.org/10.1056/NEJMoa2400645
RCT: Discontinuing First-Line DMT in Long-Term Stable Relapsing MS Leads to Recurrence of Disease Activity
3 Jan, 2025 | 08:00h | UTC
Background: Increasing numbers of patients with relapsing-onset multiple sclerosis (MS) are receiving first-line disease-modifying therapies (DMTs) to control inflammatory lesions and reduce disability progression. Yet, extended therapy raises questions regarding overtreatment, adverse effects, and costs, especially in older or clinically stable individuals.
Objective: This randomized, multicenter, rater-blinded clinical trial (DOT-MS) assessed whether discontinuing first-line DMT in adults with MS who had at least five years of clinical and radiological stability is safe in terms of recurrence of significant inflammatory disease activity.
Methods: Eighty-nine participants (median age 54 years, 67% female) were randomly assigned 1:1 to either continue or discontinue their first-line DMT. Inclusion required relapse-onset MS without new, sizeable brain MRI lesions (≤1 new T2 lesion in the previous five years or ≤3 new T2 lesions in the past ten years). Follow-up included clinical evaluations, gadolinium-enhanced brain MRI at baseline and months 3, 6, 12, 18, and 24, and optional unscheduled visits. The primary endpoint was significant inflammatory disease activity, defined as clinical relapse and/or ≥3 new T2 lesions or ≥2 contrast-enhancing lesions. The trial was prematurely terminated by the data safety monitoring board due to higher-than-expected disease activity in the discontinue arm.
Results: After a median follow-up of 15.3 months, none of the 44 participants in the continue group had significant disease activity, versus 8 of 45 (17.8%) in the discontinue group (95% CI for difference, 0.09–0.32). Nearly all events were MRI-detected new or enhancing lesions, though two participants experienced clinical relapses. Most individuals with reactivation were able to regain stability upon DMT reintroduction. Serum neurofilament light levels rose during episodes of inflammatory activity, but neither NfL nor GFAP levels reliably predicted disease recurrence at baseline. No difference in serious adverse events emerged between groups.
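A quick way to reproduce the between-group contrast is a simple risk difference. The sketch below uses the trial's event counts but a plain Wald interval, so it will not exactly match the published CI (0.09–0.32), which was presumably computed with a different method.

```python
import numpy as np
from scipy import stats

# Event counts from the trial: 8/45 with significant disease activity in the
# discontinue arm vs 0/44 in the continue arm. Wald CI for illustration only.
x1, n1 = 8, 45   # discontinue DMT
x2, n2 = 0, 44   # continue DMT
p1, p2 = x1 / n1, x2 / n2
rd = p1 - p2
se = np.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
z = stats.norm.ppf(0.975)
print(f"risk difference = {rd:.3f}, Wald 95% CI [{rd - z*se:.3f}, {rd + z*se:.3f}]")
```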
Conclusions: In this trial, discontinuation of first-line DMT in long-term stable MS was associated with a significant risk (about 20%) of inflammatory reactivation, particularly noted in participants under 55 years old. As a result, close clinical and MRI monitoring should be mandatory if discontinuation is attempted, with early DMT reinitiation if necessary.
Implications for Practice: Clinicians may consider stopping first-line DMT in select patients who exhibit prolonged stability, especially in older adults, but must remain vigilant. Rapid detection of any radiological or clinical sign of reactivation is crucial, as reintroducing therapy may reestablish disease control.
Study Strengths and Limitations: Strengths include its randomized design, systematic imaging schedule, and thorough biomarker sampling. Limitations involve early trial termination, restricting subgroup analyses and definitive noninferiority testing. Routine spinal cord imaging was not performed, potentially underestimating subclinical disease activity.
Future Research: Longer-term observational follow-up is ongoing to clarify how these participants fare over time and to identify biomarkers that may better predict risk of rebound activity. Evaluating cost-effectiveness and long-term clinical outcomes will guide future decisions about DMT discontinuation.
Reference: Coerver EME, et al. Discontinuation of First-Line Disease-Modifying Therapy in Patients With Stable Multiple Sclerosis: The DOT-MS Randomized Clinical Trial. JAMA Neurology. 2024. DOI: http://doi.org/10.1001/jamaneurol.2024.4164
Phase 2 RCT: CRISPR-Based Therapy Reduces Attacks in Hereditary Angioedema
2 Jan, 2025 | 10:00h | UTC
Background: Hereditary angioedema (HAE) is a rare autosomal dominant disorder characterized by unpredictable attacks of angioedema involving cutaneous tissues, the gastrointestinal tract, and, potentially, the larynx, posing a risk of asphyxiation. Current prophylactic treatments require frequent administration, often leading to suboptimal adherence and ongoing disease burden. NTLA-2002 is an in vivo CRISPR-Cas9–based therapy designed to permanently inactivate the KLKB1 gene in hepatocytes, thereby reducing plasma kallikrein levels and, hypothetically, lowering attack frequency in patients with HAE.
Objective: To evaluate whether a single intravenous infusion of NTLA-2002 (25 mg or 50 mg) would safely and effectively decrease HAE attack rates and reduce plasma kallikrein protein levels over a 16-week primary observation period, as compared with placebo.
Methods: This phase 2, randomized, double-blind, placebo-controlled trial included 27 adults with confirmed type 1 or type 2 HAE. Participants were assigned in a 2:2:1 ratio to receive a one-time dose of 25 mg or 50 mg of NTLA-2002 or placebo. The primary endpoint was the investigator-confirmed number of angioedema attacks per month from Week 1 through Week 16. Secondary endpoints included the number of moderate-to-severe attacks, use of on-demand therapy, adverse events, and changes in total plasma kallikrein protein levels (analyzed by immunoassays). Exploratory measures encompassed patient-reported outcomes using the Angioedema Quality of Life (AE-QoL) questionnaire.
Results: During the 16-week period, the mean monthly attack rate decreased by 75% in the 25 mg group and 77% in the 50 mg group relative to placebo (estimated rates of 0.70 vs. 0.65 vs. 2.82 attacks per month, respectively). Notably, 4 of 10 patients (40%) in the 25 mg group and 8 of 11 (73%) in the 50 mg group reported no attacks or further prophylaxis use after dosing. Placebo recipients showed only a 16% reduction from baseline. Adverse events were predominantly mild to moderate; headache, fatigue, and nasopharyngitis were most common. Infusion-related reactions occurred in a few patients but resolved without sequelae. A single transient grade 2 elevation in alanine aminotransferase was recorded in one participant given 25 mg of NTLA-2002. By Week 16, total plasma kallikrein levels decreased by 55% in the 25 mg group and 86% in the 50 mg group, with no meaningful changes in placebo.
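The headline percentage reductions follow directly from the estimated monthly attack rates; a brief check, using only the numbers reported above:

```python
# Relative reduction vs placebo, from the estimated monthly attack rates above.
rate_25mg, rate_50mg, rate_placebo = 0.70, 0.65, 2.82
for label, r in (("25 mg", rate_25mg), ("50 mg", rate_50mg)):
    print(f"{label}: {1 - r / rate_placebo:.0%} lower attack rate than placebo")
# -> 25 mg: 75%, 50 mg: 77%, matching the reported figures
```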
Conclusions: A single intravenous infusion of NTLA-2002 significantly lowered attack frequency and reduced total plasma kallikrein levels in HAE. Most patients treated at 50 mg experienced no attacks, suggesting that long-term prophylaxis might be unnecessary for many. Longer observation supports durability, yet cost and potential long-term effects of gene editing warrant cautious interpretation.
Implications for Practice: If confirmed by larger phase 3 trials, this gene-editing approach could alter the management of HAE, reducing or eliminating the need for continuous prophylaxis. However, clinicians must weigh the high upfront cost, possible unpredictable immune responses, and the novelty of CRISPR-based therapies before integrating them into standard care.
Study Strengths and Limitations: Strengths include a placebo-controlled design, meaningful improvement in patient-reported outcomes, and robust plasma kallikrein protein reduction. Limitations are the small sample size, short primary observation period, and uncertain long-term safety in diverse populations.
Future Research: Ongoing phase 3 studies with larger cohorts and extended follow-up are essential to confirm safety, long-term efficacy, and cost-effectiveness.
Reference: Cohn DM, Gurugama P, Magerl M, et al. CRISPR-Based Therapy for Hereditary Angioedema. New England Journal of Medicine. 2024; DOI: http://doi.org/10.1056/NEJMoa2405734
- Editorial: Musunuru K. A Milestone for Gene-Editing Therapies. New England Journal of Medicine. 2024; DOI: http://doi.org/10.1056/NEJMe2412176
Dose-Response Meta-Analysis: At Least 150 Weekly Minutes of Aerobic Exercise Needed for Significant Waist and Fat Reduction
2 Jan, 2025 | 09:30h | UTC
Background: Elevated body weight and adiposity remain major public health concerns worldwide, with overweight and obesity affecting nearly half of the adult population. Although various guidelines advocate for aerobic exercise as a core strategy in weight management, robust meta-analyses exploring dose-response relationships are scarce.
Objective: To clarify how different doses and intensities of supervised aerobic exercise affect body weight, waist circumference, and body fat in adults with overweight or obesity.
Methods: This systematic review and meta-analysis encompassed 116 randomized clinical trials (RCTs) including a total of 6880 participants (mean [SD] age, 46 [13] years). All studies involved supervised continuous aerobic interventions (e.g., walking or running) for at least 8 weeks. Comparisons were made against sedentary or usual-activity controls. Frequency, duration (minutes per week), and intensity (moderate, vigorous, or combined) of aerobic sessions were extracted.
Results: Across all trials, each additional 30 minutes per week of aerobic exercise was associated with a mean change of −0.52 kg in body weight (95% CI, −0.61 to −0.44), −0.56 cm in waist circumference, and −0.37 percentage points in body fat. Body weight and waist circumference showed largely linear decreases with increasing weekly exercise, whereas body fat percentage displayed a pattern suggesting that at least 150 minutes per week may be required to achieve clinically meaningful reductions (>2% reduction in body fat). Aerobic training was generally well tolerated, although a modest increase in mild musculoskeletal complaints was noted (risk difference, 2 more events per 100 participants).
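Taking the reported slope at face value, a rough projection of expected weight change at common exercise targets looks like this (assuming the linear association holds across the range, as the authors report for body weight):

```python
# Back-of-envelope projection from the reported slope: about -0.52 kg per
# additional 30 min/week of aerobic exercise. Assumes linearity, per the review.
slope_per_30min = -0.52  # kg
for minutes in (90, 150, 300):
    print(f"{minutes} min/week -> ~{slope_per_30min * minutes / 30:.1f} kg")
# 90 -> ~-1.6 kg, 150 -> ~-2.6 kg, 300 -> ~-5.2 kg
```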
Conclusions: Engaging in up to 300 minutes per week of aerobic exercise was associated with progressively greater benefits for weight control, waist circumference, and body fat. While even small doses yielded modest improvements, these findings suggest that at least moderate intensity and a duration of at least 150 minutes per week may be necessary to achieve clinically important reductions in central obesity and fat percentage.
Implications for Practice: Clinicians managing patients with overweight or obesity can recommend a minimum of 150 minutes per week of moderate-to-vigorous aerobic training to achieve significant anthropometric changes. Gradual progression is essential to balance effectiveness and safety, especially in individuals with musculoskeletal constraints.
Study Strengths and Limitations: Strengths include the large number of RCTs, robust dose-response analyses, and consistent directions of effects. However, high heterogeneity, publication bias for certain fat measures, and limited data on medication use and health-related quality of life in longer trials were noted.
Future Research: Further trials should explore additional subgroup analyses (e.g., older adults, individuals with chronic comorbidities), longer durations of follow-up, and the integration of resistance training to optimize cardiometabolic outcomes.
Reference: Jayedi A, Soltani S, Emadi A, et al. Aerobic Exercise and Weight Loss in Adults: A Systematic Review and Dose-Response Meta-Analysis. JAMA Network Open. 2024;7(12):e2452185. DOI: http://doi.org/10.1001/jamanetworkopen.2024.52185
RCT: Pembrolizumab with Preoperative Radiotherapy Shows Potential in Stage III Soft Tissue Sarcoma
2 Jan, 2025 | 09:00h | UTC
Background: Patients with locally advanced, high-grade soft tissue sarcomas of the extremity often face a high risk of metastatic disease, despite curative-intent surgery and radiotherapy. Traditional doxorubicin-based chemotherapy provides variable benefits and can cause considerable toxicity, prompting investigations into alternative strategies. Emerging data from smaller trials hinted that immune checkpoint inhibitors might offer targeted benefit in sarcomas, but robust evidence in the neoadjuvant setting remains limited.
Objective: To assess whether adding neoadjuvant and adjuvant pembrolizumab to preoperative radiotherapy and surgical resection could enhance disease-free survival (DFS) in individuals with resectable grade 2 or 3, stage III undifferentiated pleomorphic sarcoma or liposarcoma of the extremity and limb girdle.
Methods: This open-label, randomized trial (SU2C-SARC032) enrolled 143 participants at 20 academic centers in Australia, Canada, Italy, and the USA. Eligible patients were 12 years or older, presented with primary tumors >5 cm, and did not receive chemotherapy as part of this protocol. Participants were randomized 1:1 to standard preoperative radiotherapy (50 Gy/25 fractions) plus surgery (control) or the same radiotherapy combined with pembrolizumab (200 mg every three weeks) in the neoadjuvant setting, followed by up to 14 adjuvant cycles. The primary endpoint was disease-free survival, analyzed in a modified intention-to-treat cohort of 127 evaluable patients, with a median follow-up of 43 months.
Results: The pembrolizumab group demonstrated a higher 2-year DFS rate (67%) compared with controls (52%), suggesting a favorable hazard ratio (0.61) for recurrence or death. Nonetheless, grade 3 or higher adverse events were more common in the pembrolizumab arm (56% vs 31%). Secondary endpoints, including distant disease-free survival and overall survival, also appeared to favor the pembrolizumab arm, but these comparisons were not powered for definitive conclusions.
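Under a proportional-hazards assumption, the treated survival curve relates to the control curve by S_treat(t) = S_control(t)^HR; the sketch below checks that the reported 2-year figures are internally consistent under that assumption.

```python
# Proportional-hazards consistency check: S_treat(t) = S_control(t) ** HR.
s_control_2y = 0.52  # reported 2-year disease-free survival, control arm
hr = 0.61            # reported hazard ratio for recurrence or death
s_pembro_2y = s_control_2y ** hr
print(f"implied 2-year DFS with pembrolizumab: {s_pembro_2y:.0%}")  # ~67%, as reported
```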
Conclusions: Neoadjuvant and adjuvant pembrolizumab in combination with radiotherapy and surgery demonstrated a DFS advantage in stage III undifferentiated pleomorphic sarcoma or liposarcoma, reinforcing the potential role of immunotherapy in high-risk settings. However, given the increased incidence of adverse events and the relatively short follow-up for overall survival, cautious interpretation is warranted. Further evidence is required to determine long-term benefits and confirm whether these findings extend to other sarcoma subtypes.
Implications for Practice: These data suggest that incorporating pembrolizumab could be considered in selected patients, particularly those with large, high-grade tumors unresponsive to or unsuitable for traditional chemotherapy. Clinicians must balance the incremental risk of immunotherapy-induced side effects against the observed gains in disease-free survival and the ongoing need for extended follow-up.
Study Strengths and Limitations: Strengths include a multicenter randomized design, focus on a high-risk population, and a robust primary endpoint. Limitations encompass a relatively small sample size, inherent to rare cancers, underpowered subgroup analyses, and absence of long-term survival data. Confirmation of these early signals in larger cohorts and over more extended follow-up periods remains necessary.
Future Research: Additional trials should explore optimal radiotherapy fractionation, potential synergies with cytotoxic or targeted agents, and predictive biomarkers of response. Understanding immune correlates, including circulating tumor DNA and tumor microenvironmental factors, may refine treatment selection and enhance therapeutic outcomes.
Reference: Mowery YM, et al. Safety and efficacy of pembrolizumab, radiation therapy, and surgery versus radiation therapy and surgery for stage III soft tissue sarcoma of the extremity (SU2C-SARC032): an open-label, randomised clinical trial. The Lancet. 2024;404(10467). DOI: http://doi.org/10.1016/S0140-6736(24)01812-9
Systematic Review and Bayesian Meta-Analysis: Higher Protein Delivery May Increase Mortality in Critically Ill Patients
2 Jan, 2025 | 08:30h | UTC
Background: Nutritional guidelines often recommend higher protein doses (approximately 1.2–2.0 g/kg/d) to mitigate muscle loss in critically ill patients. However, recent multicenter trials have raised concerns that elevated protein targets might increase mortality and adversely affect patient-centered outcomes. This study applied a Bayesian approach to synthesize current evidence regarding the effect of higher versus lower protein delivery on mortality, infections, mechanical ventilation duration, and health-related quality of life in critically ill adults.
Objective: To estimate the probability of beneficial or harmful effects of increased protein delivery on clinically important outcomes, with emphasis on quantifying the likelihood of mortality benefit versus risk.
Methods: A systematic review and Bayesian meta-analysis were conducted according to a preregistered protocol (PROSPERO CRD42024546387) and PRISMA 2020 guidelines. Twenty-two randomized controlled trials comparing higher (mean 1.5 g/kg/d) versus lower (mean 0.9 g/kg/d) protein delivery in adult ICU patients were included, ensuring similar energy intake in both groups. A hierarchical random-effects Bayesian model was applied, using vague priors to estimate relative risks for mortality and infections, mean differences for ventilator days, and standardized mean differences for quality of life.
Results: A total of 4,164 patients were analyzed. The posterior probability that higher protein intake increases mortality was 56.4%, compared with a 43.6% probability of any mortality benefit. Probabilities for a clinically relevant (≥5%) mortality decrease were low (22.9%), while the probability of at least a 5% increase reached 32.4%. Infections were slightly more likely with higher protein, although the likelihood of a major detrimental effect remained modest. The probability of a clinically meaningful difference in ventilator days was negligible, suggesting near equivalence for that endpoint. Conversely, quality of life might be negatively impacted by higher protein dosing, although few trials measured this outcome.
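The posterior probabilities quoted above are tail areas of the posterior distribution for the relative risk. The sketch below reproduces them approximately from simulated posterior draws; the normal parameters on the log scale are hand-tuned for illustration, not taken from the fitted model.

```python
import numpy as np

rng = np.random.default_rng(0)
# Illustrative posterior draws for the mortality relative risk (higher vs lower
# protein). N(0.013, 0.08) on the log scale is hand-tuned so the tail areas
# roughly match the review's headline probabilities; not the actual posterior.
rr = np.exp(rng.normal(loc=0.013, scale=0.08, size=200_000))

print(f"P(any harm, RR > 1):         {np.mean(rr > 1.00):.1%}")  # ~56%
print(f"P(any benefit, RR < 1):      {np.mean(rr < 1.00):.1%}")  # ~44%
print(f"P(>=5% increase, RR > 1.05): {np.mean(rr > 1.05):.1%}")  # ~33%
print(f"P(>=5% decrease, RR < 0.95): {np.mean(rr < 0.95):.1%}")  # ~21%
```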
Conclusions: Under a Bayesian framework, current evidence suggests that high protein delivery in critically ill patients might pose a meaningful risk of increased mortality. Although a beneficial effect cannot be fully excluded, its probability appears comparatively small. These findings challenge the longstanding assumption that more protein universally translates to better outcomes.
Implications for Practice: Clinicians should exercise caution when aiming for higher protein targets. Individual patient characteristics, such as severity of illness, renal function, and underlying comorbidities, may modulate outcomes. The data support considering a personalized protein prescription rather than routinely pushing intake beyond conventional targets.
Study Strengths and Limitations: Strengths include a robust Bayesian analysis that evaluates probabilities of both benefit and harm across multiple thresholds, as well as the inclusion of recently published large trials. Limitations involve heterogeneity in protein dosing strategies, potential publication bias (indicated by Egger’s test), and limited data on quality of life.
Future Research: Ongoing trials, such as TARGET Protein and REPLENISH, will provide valuable insights into optimal protein dosing, particularly in specific subgroups. Further investigation should explore mechanistic underpinnings of how high protein intake could adversely affect recovery in critically ill patients.
Reference: Heuts S, Lee ZY, Lew CCH, et al. Higher Versus Lower Protein Delivery in Critically Ill Patients: A Systematic Review and Bayesian Meta-Analysis. Critical Care Medicine. December 27, 2024. DOI: http://doi.org/10.1097/CCM.0000000000006562
Scoping Review of RCTs: AI Interventions Show Positive but Varied Impact in Clinical Practice
28 Dec, 2024 | 00:04h | UTC
Background: The rapid expansion of artificial intelligence (AI) in health care has stimulated a growing number of randomized controlled trials (RCTs) intended to validate AI’s clinical utility. However, many AI models previously tested in retrospective or simulated settings lack real-world evidence. Investigating the breadth and depth of these RCTs is key to understanding the current status of AI in clinical practice.
Objective: This scoping review aimed to identify, classify, and evaluate RCTs that integrate modern AI (non-linear computational models, including deep learning) into patient management. The primary goal was to assess geographic distribution, trial design, outcomes measured (diagnostic performance, care management, patient behavior, clinical decision making), and the overall success rate of AI-based interventions.
Methods: The authors systematically searched PubMed, SCOPUS, CENTRAL, and the International Clinical Trials Registry Platform from January 2018 to November 2023. Included studies featured an AI intervention integrated into clinical workflows, with patient outcomes influenced by clinician–AI interactions or standalone AI systems. Exclusions encompassed linear-risk models and non-English publications. After screening 10,484 records, 86 RCTs were ultimately included. Simple descriptive statistics summarized trial characteristics, endpoints, and results.
Results: Most RCTs (63%) were single-center studies with a median sample size of 359 patients. Gastroenterology (43%) and Radiology (13%) were leading specialties, often focusing on deep learning algorithms for endoscopic or imaging tasks. The USA led in overall trial volume, followed by China, with 81% of all trials reporting positive primary outcomes (improvements or non-inferiority). Diagnostic yield or performance metrics predominated (54%), though some studies evaluated patient-centered endpoints such as adherence or symptom reduction. Despite these promising findings, effects on operational time (measured in 60% of trials) were mixed: some trials reported reduced procedural times (p<0.05), while others noted significant increases (p<0.05).
Conclusions: AI-driven interventions generally improved diagnostic measures and care processes, demonstrating potential for augmenting clinical decision making. Nevertheless, the prevalence of single-center designs limits the generalizability of outcomes. Publication bias remains a concern, given that negative or null results may be underreported. More extensive multicenter RCTs, greater demographic transparency, and standardized reporting are critical to fully determine AI’s clinical relevance.
Implications for Practice: AI tools might enhance screening, detection rates, and therapeutic monitoring in areas like gastroenterology, radiology, and cardiology. Clinicians should remain mindful of possible workflow inefficiencies and biases. Thorough validation and robust implementation strategies are essential before widespread adoption can be justified.
Study Strengths and Limitations: Strengths include a timely review capturing diverse RCTs up to late 2023 and strict inclusion criteria requiring true AI integration into patient care. Limitations include the English-only search and reliance on published results, potentially omitting unpublished or negative trials.
Future Research: Further investigations should prioritize multicenter, large-scale RCTs with meaningful clinical endpoints—quality of life, survival, and long-term safety. Enhanced adherence to reporting standards (CONSORT-AI) and recruitment of ethnically diverse populations are necessary steps to advance the field.
Reference: Han R, Acosta JN, Shakeri Z, Ioannidis JPA, Topol EJ, Rajpurkar P, et al. “Randomised controlled trials evaluating artificial intelligence in clinical practice: a scoping review.” The Lancet Digital Health. 2024;6(5). DOI: http://doi.org/10.1016/S2589-7500(24)00047-5
Cohort Study: Higher Telehealth Intensity May Reduce Certain Office-Based Low-Value Services in Medicare Primary Care
2 Jan, 2025 | 08:00h | UTC
Background: The rapid expansion of telehealth has raised concerns about its potential to foster wasteful services, especially in primary care. While telehealth can eliminate certain in-person interventions, it might also increase unnecessary laboratory or imaging requests, given the more limited physical exam. Evaluating how telehealth intensity affects the provision of low-value care is crucial for guiding future policy and clinical practice.
Objective: To determine whether higher telehealth utilization at the practice level is associated with changes in the rates of common low-value services among Medicare fee-for-service beneficiaries in Michigan.
Methods: Using Medicare claims data from January 1, 2019, to December 31, 2022, this retrospective cohort employed a difference-in-differences design. A total of 577,928 beneficiaries attributed to 2,552 primary care practices were included. Practices were stratified into low, medium, or high telehealth tertiles based on the volume of virtual visits per 1,000 beneficiaries in 2022. Eight low-value services relevant to primary care were grouped into four main categories: office-based (e.g., cervical cancer screening in women older than 65), laboratory-based, imaging-based, and mixed-modality services.
Results: Among the 577,928 beneficiaries (332,100 women; mean age, 76 years), practices with high telehealth utilization had a greater reduction in office-based cervical cancer screening (−2.9 [95% CI, −5.3 to −0.4] services per 1,000 beneficiaries) and low-value thyroid testing (−40 [95% CI, −70 to −9] tests per 1,000 beneficiaries), compared with low-utilization practices. No significant association emerged for other laboratory- or imaging-based low-value services, including PSA testing for men over 75 or imaging for uncomplicated low back pain. These findings suggest that while telehealth can lower certain office-based low-value services, it does not appear to substantially increase other types of wasteful care.
Conclusions: High telehealth intensity was linked to reductions in specific low-value procedures delivered in-office, without raising the overall rates of other potentially unnecessary interventions. These data may alleviate some policy concerns that telehealth drives excessive or wasteful care due to its convenience. Instead, substituting certain in-person visits with virtual encounters might curtail opportunities for procedures with minimal clinical benefit.
Implications for Practice: For clinicians and policymakers, these results underscore the possibility that carefully implemented telehealth may reduce some low-value services. Nonetheless, sustained monitoring is needed to confirm whether telehealth encourages or discourages appropriate clinical decision-making across a broader range of interventions.
Study Strengths and Limitations: Strengths include a sizable cohort, a pre- versus post-pandemic time frame, and comprehensive analysis of multiple low-value outcomes. Limitations involve the exclusive focus on beneficiaries in Michigan, the inability to capture prescription-related low-value practices (e.g., antibiotic overuse), and the reliance on claims-based measures, which lack clinical details.
Future Research: Subsequent studies should expand to different geographic areas, assess additional low-value endpoints such as overtreatment with medications, and explore whether demographic or socioeconomic factors modify telehealth’s impact on care quality.
Reference: Liu T, Zhu Z, Thompson MP, et al. Primary Care Practice Telehealth Use and Low-Value Care Services. JAMA Netw Open. 2024;7(11):e2445436. DOI: http://doi.org/10.1001/jamanetworkopen.2024.45436
Meta-analysis: One-Day Low-Residue Diet Achieves Bowel Cleansing Comparable to Multi-Day Regimens
26 Dec, 2024 | 18:21h | UTC
Background: Colorectal cancer remains a leading cause of cancer-related morbidity worldwide, making early detection through colonoscopy essential. Adequate bowel preparation is crucial to maximize mucosal visibility and detect lesions effectively. Although low-residue diets (LRDs) are commonly recommended before colonoscopy, guidelines vary regarding the optimal duration (one day versus multiple days). This systematic review and meta-analysis evaluated whether a one-day LRD regimen is non-inferior to multi-day protocols in achieving satisfactory bowel cleansing and lesion detection.
Objective: To compare the efficacy of 1-day versus >1-day LRD regimens for bowel preparation in adult patients undergoing elective colonoscopy, focusing on bowel cleanliness, polyp detection, and adenoma detection rates.
Methods: A comprehensive search of PubMed, Cochrane Central Register of Controlled Trials, ScienceDirect, Scopus, and ClinicalTrials.gov was conducted for randomized controlled trials (RCTs) comparing 1-day with >1-day LRD regimens. Six RCTs involving 2,469 participants met inclusion criteria. Patients were randomized to either a 1-day LRD (n=1,237) or a multi-day LRD (n=1,232). Adequate bowel preparation was primarily defined by a Boston Bowel Preparation Scale (BBPS) score ≥2 in each segment or total BBPS ≥6. Secondary outcomes included polyp detection rate (PDR), adenoma detection rate (ADR), withdrawal time, cecal intubation rate, and cecal intubation time.
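A minimal sketch of the adequacy rule used as the primary outcome, purely to make the scoring criterion concrete (the helper name is hypothetical):

```python
# Adequate preparation per the definition above: every BBPS segment >= 2,
# or total BBPS >= 6. Illustrative helper only.
def adequate_bbps(right: int, transverse: int, left: int) -> bool:
    segments = (right, transverse, left)
    return all(s >= 2 for s in segments) or sum(segments) >= 6

print(adequate_bbps(2, 3, 2))  # True: all segments >= 2
print(adequate_bbps(1, 2, 2))  # False: one segment < 2 and total only 5
```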
Results: Both groups demonstrated similar rates of adequate bowel preparation (87.2% in the 1-day LRD vs. 87.1% in the multi-day group), with no significant difference (OR=1.03, 95% CI, 0.76–1.41; p=0.84; I²=0%). PDR was likewise comparable (OR=0.91, 95% CI, 0.76–1.09; p=0.29; I²=16%), as was ADR (OR=0.87, 95% CI, 0.71–1.08; p=0.21; I²=0%). Withdrawal time did not differ (MD=–0.01 minutes, 95% CI, –0.25 to 0.24; p=0.97; I²=63%), and cecal intubation parameters were also statistically similar. Across studies, the pooled mean global BBPS revealed minimal difference (MD=0.16, 95% CI, –0.02 to 0.34; p=0.08; I²=15%), confirming the non-inferiority of a shorter LRD protocol.
Conclusions: A one-day LRD achieves bowel cleansing outcomes comparable to those of multi-day LRDs, without compromising polyp or adenoma detection. This shorter regimen may help optimize patient adherence, reduce dietary restriction burden, and simplify procedural logistics, especially for busy endoscopy practices.
Implications for Practice: Adopting a 1-day LRD can streamline preparation, improve patient satisfaction, and maintain high-quality visualization. Clinicians should weigh individual patient factors such as chronic constipation or comorbidities but may generally favor a shorter dietary restriction period to enhance compliance and comfort.
Study Strengths and Limitations: This meta-analysis included only RCTs, strengthening its internal validity. Heterogeneity for primary outcomes was minimal. However, the included trials employed varied dietary protocols and bowel preparation solutions. Additionally, some studies lacked uniform reporting of cecal intubation endpoints, limiting direct comparisons. Future investigations with standardized outcome measures could offer more definitive guidance.
Future Research: Further large-scale RCTs should assess cost-effectiveness, patient-reported outcomes, and LRD composition in specific populations. Identifying optimal dietary instructions for individuals with slower colonic transit or specific nutritional needs would refine colonoscopy preparation guidelines and potentially increase detection of precancerous lesions.
Reference: Putri RD, et al. One-day low-residue diet is equally effective as the multiple-day low-residue diet in achieving adequate bowel cleansing: a meta-analysis of randomized controlled trials. Clinical Endoscopy. 2024. DOI: https://doi.org/10.5946/ce.2024.061
RCT: Daratumumab Monotherapy Prevents Progression in High-Risk Smoldering Multiple Myeloma
26 Dec, 2024 | 15:44h | UTC
Background: Smoldering multiple myeloma (SMM) is an asymptomatic plasma cell disorder that can progress to active multiple myeloma, especially when risk factors place patients in a high-risk subset. Although daratumumab has been approved for multiple myeloma, no treatments have been approved for high-risk SMM. This study (AQUILA) examined whether subcutaneous daratumumab could prevent or significantly delay progression to symptomatic myeloma.
Objective: To evaluate the effectiveness of subcutaneous daratumumab monotherapy versus active monitoring in prolonging time to disease progression (defined by SLiM–CRAB criteria) or death in patients with high-risk SMM.
Methods: In this open-label phase 3 trial, 390 patients with high-risk SMM were randomly assigned (1:1) to either daratumumab (1800 mg subcutaneously) or active monitoring. Daratumumab was administered weekly for cycles 1–2, every two weeks for cycles 3–6, and then every four weeks for up to 39 cycles (36 months) or until confirmed disease progression. Active monitoring followed the same schedule for disease evaluations without any specific therapy. The primary endpoint was progression-free survival (PFS), assessed by an independent committee. Secondary endpoints included overall survival, response rates, and time to subsequent therapy.
Results: After a median follow-up of 65.2 months, disease progression or death occurred in 34.5% of patients in the daratumumab group compared to 50.5% in the active-monitoring group (HR, 0.49; 95% CI, 0.36–0.67; p<0.001). At five years, the PFS rate was 63.1% with daratumumab and 40.8% with active monitoring. Overall survival was also higher in the daratumumab arm: 93.0% versus 86.9% at five years (HR for death, 0.52; 95% CI, 0.27–0.98). Treatment discontinuation due to adverse events was low (5.7%), and no new safety signals emerged. Grade 3 or 4 adverse events, primarily hypertension (5.7% vs. 4.6%), occurred at similar rates in both arms. Infections of grade 3 or 4 were more frequent with daratumumab (16.1% vs. 4.6%), including COVID-19 pneumonia, yet overall tolerability remained acceptable. Patient-reported outcomes, including quality-of-life measures, were largely preserved in both groups during the study.
Conclusions: Subcutaneous daratumumab monotherapy substantially delayed progression to symptomatic multiple myeloma and improved overall survival among patients with high-risk SMM. The safety profile was consistent with prior daratumumab studies, suggesting a favorable risk–benefit balance. Early intervention with daratumumab may thus alter the disease trajectory for select patients, sparing them from end-organ damage and improving long-term clinical outcomes.
Implications for Practice: While active monitoring has been the standard of care for high-risk SMM, these findings support early therapeutic intervention for patients with multiple high-risk features. Clinicians should remain cautious, however, when generalizing across different risk stratification models. Additional research on optimal treatment durations, combination strategies, and real-world outcomes will further refine patient selection and management of high-risk SMM.
Study Strengths and Limitations: This trial featured robust follow-up (median of over five years) and clear outcome definitions. However, the classification of high-risk features has evolved, and certain populations (e.g., Black patients) were underrepresented. These factors may limit the generalizability of the findings in broader clinical settings.
Future Research: Ongoing trials are investigating alternative dosing schedules, combination regimens (e.g., daratumumab-based quadruplets), and the role of minimal residual disease monitoring to optimize patient outcomes. Additional studies will clarify whether more intense or shorter treatments might maintain efficacy with fewer side effects.
Reference: Dimopoulos MA, Voorhees PM, Schjesvold F, Cohen YC, Hungria V, Sandhu I, Lindsay J, et al., for the AQUILA Investigators. Daratumumab or Active Monitoring for High-Risk Smoldering Multiple Myeloma. New England Journal of Medicine. 2024; DOI: http://doi.org/10.1056/NEJMoa2409029
Three Phase 3, Placebo-Controlled Trials Show Rapid Benefits of Oral Atogepant for Migraine Prevention
26 Dec, 2024 | 12:17h | UTC
Background: Preventive therapies for migraine often require long titration and may take weeks to achieve their full effect. This analysis integrates data from three randomized, placebo-controlled Phase 3 trials (ADVANCE, ELEVATE, PROGRESS) assessing atogepant 60 mg once daily (QD) over 12 weeks, focusing on the first four weeks. A key point is that atogepant was compared only to placebo, not to other well-established migraine preventives.
Objective: To determine whether atogepant provides early efficacy in reducing migraine frequency and improving functional outcomes within the initial weeks of therapy, for both episodic and chronic migraine.
Methods: All three studies enrolled participants aged 18–80 years with a ≥1-year history of migraine. ADVANCE and ELEVATE focused on episodic migraine (EM; 4–14 monthly migraine days), while PROGRESS studied chronic migraine (CM; ≥15 monthly headache days, ≥8 of which met migraine criteria). In ELEVATE, participants had previously failed 2–4 classes of oral migraine preventives. Throughout each trial, patients recorded daily migraine-related data and completed validated functional assessments (AIM-D and EQ-5D-5L). For this pooled analysis, only the atogepant 60 mg QD and placebo arms were examined.
Results: A significantly lower proportion of atogepant-treated patients had a migraine day on day 1 in all three trials, suggesting a rapid onset of benefit. Reductions in weekly migraine days (WMDs) emerged as early as week 1 and remained consistently greater than with placebo over the first four weeks. Functional measures improved within this same timeframe, with patients on atogepant reporting reductions in activity impairment and enhanced self-rated health. These positive findings were observed in EM (with or without prior prophylaxis failures) and in CM populations.
Conclusions: Atogepant 60 mg QD was linked to early and significant reductions in migraine days, as well as enhancements in physical functioning and daily activities, across three placebo-controlled studies. The data suggest that atogepant may offer clinically meaningful, rapid-onset prophylactic benefits.
Implications for Practice: Clinicians may consider atogepant for patients seeking a preventive migraine therapy that demonstrates a potentially faster impact on symptom frequency and daily functioning. However, direct comparisons with established active treatments are lacking, and appropriate caution in interpreting the early onset of benefit is recommended.
Study Strengths and Limitations: Major strengths include robust, double-blind methodologies and consistent findings across diverse migraine populations. A key limitation is the exclusive use of placebo as the comparator, so the relative advantage over standard preventives remains unknown. The predominantly female and White study cohorts also restrict generalizability.
Future Research: Further investigations should evaluate atogepant in direct comparisons with existing active migraine preventives, examine long-term outcomes, and recruit more diverse populations. Such efforts could better define the therapy’s place in routine migraine care.
Reference: Lipton RB, et al. Early Improvements With Atogepant for the Preventive Treatment of Migraine: Results From 3 Randomized Phase 3 Trials. Neurology. 2025;104(2). DOI: https://doi.org/10.1212/WNL.0000000000210212
Management of Adult Sepsis in Resource-Limited Settings: A Global Delphi-Based Consensus
26 Dec, 2024 | 02:06h | UTC
Introduction: This summary presents key points from a recent expert consensus on managing adult sepsis under limited-resource conditions, where patients may lack access to an ICU bed, advanced monitoring technologies, or sufficient staffing. The statements were developed through a Delphi process involving an international panel of clinicians, aiming to complement existing sepsis guidelines by focusing on pragmatic approaches and context-specific adaptations. These consensus statements address unique challenges such as limited diagnostic tests, alternative strategies for hemodynamic monitoring, and management of sepsis in areas with tropical infections.
Key Recommendations:
- Location of Care and Transfer
- When an ICU bed is unavailable, care can be provided in a non-ICU setting if minimum monitoring (neurological status, blood pressure, peripheral perfusion) is ensured.
- Before transferring a patient, ensure airway patency, initiate intravenous fluids and antimicrobials, and maintain safe transport conditions.
- Incorporate telemedicine or phone consultation with critical care specialists whenever feasible.
- Diagnostic Considerations
- Employ screening tools (e.g., qSOFA) in areas with limited resources, acknowledging their diagnostic constraints.
- Use clinical parameters like altered mental state, capillary refill time (CRT), and urine output to gauge tissue perfusion when lactate measurement is unavailable.
- Insert an indwelling urinary catheter in septic shock to monitor urine output accurately, balancing infection risks against close monitoring needs.
- Hemodynamic Management
- Rely on clinical indicators (CRT, urine output) to guide fluid resuscitation when serum lactate is not accessible.
- Use fluid responsiveness tests (e.g., passive leg raising, pulse pressure variation) if advanced hemodynamic monitoring is impractical.
- Consider balanced solutions such as Ringer’s lactate or Hartmann’s solution for fluid resuscitation.
- Recognize that patients with tropical infections (e.g., malaria, dengue) may require cautious fluid volumes to avoid overload.
- Initiate epinephrine if norepinephrine or vasopressin is unavailable, and use vasopressors through peripheral lines if central access cannot be established.
- Antimicrobial Therapy
- Administer antibiotics without delay (ideally within one hour) in suspected sepsis or septic shock.
- In severe infections of parasitic origin (e.g., malaria), start antiparasitic agents promptly.
- In settings where laboratory investigations are limited, begin broad-spectrum antimicrobial coverage when infection cannot be ruled out.
- De-escalate or discontinue therapy based on clinical improvement, declining white blood cell counts, and adequate source control.
- Respiratory Support
- For acute hypoxemic respiratory failure in septic patients, noninvasive ventilation (NIV) can be used if high-flow nasal oxygen is not available, provided close monitoring for potential failure is ensured.
Conclusion: These consensus-based statements offer practical guidance for clinicians treating sepsis in resource-limited environments. By adapting globally accepted recommendations and incorporating alternative strategies—such as clinical markers of perfusion, use of peripheral vasopressors, and prioritizing immediate antimicrobial therapy—these principles aim to improve patient outcomes where healthcare resources are scarce. Further research and context-specific adaptations will be essential to address remaining uncertainties and refine these expert recommendations.
Reference:
Thwaites, L., Nasa, P., Abbenbroek, B. et al. Management of adult sepsis in resource-limited settings: global expert consensus statements using a Delphi method. Intensive Care Medicine (2024). https://doi.org/10.1007/s00134-024-07735-7
VisionFM: A Generalist AI Surpasses Single-Modality Models in Ophthalmic Diagnostics
25 Dec, 2024 | 13:41h | UTC
Background: Ophthalmic AI models typically address single diseases or modalities, and this limited generalizability restricts broad clinical application. This study introduces VisionFM, a novel foundation model trained on 3.4 million images from over 500,000 individuals. It covers eight distinct ophthalmic imaging modalities (e.g., fundus photography, OCT, slit-lamp, ultrasound, MRI) and encompasses multiple diseases. Compared with prior single-task or single-modality approaches, VisionFM’s architecture and large-scale pretraining enable diverse tasks such as disease screening, lesion segmentation, prognosis, and prediction of systemic markers.
Objective: To develop and validate a generalist ophthalmic AI framework that can handle multiple imaging modalities, recognize multiple diseases, and adapt to new clinical tasks through efficient fine-tuning, potentially easing the global burden of vision impairment.
Methods: VisionFM employs individual Vision Transformer–based encoders for each of the eight imaging modalities, pretrained with self-supervised learning (iBOT) focused on masked image modeling. After pretraining, various task-specific decoders were fine-tuned for classification, segmentation, and prediction tasks. The model was evaluated on 53 public and 12 private datasets, covering eight disease categories (e.g., diabetic retinopathy, glaucoma, cataract), five imaging modalities (fundus photographs, OCT, etc.), plus additional tasks (e.g., MRI-based orbital tumor segmentation). Performance metrics included AUROCs, Dice similarity coefficients, F1 scores, and comparisons with ophthalmologists of varying clinical experience.
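To make the described training pattern concrete, the sketch below illustrates the general recipe the Methods outline: one encoder per imaging modality pretrained with masked image modeling on unlabeled images, then a lightweight task-specific decoder fine-tuned on labeled data. This is a minimal conceptual illustration in PyTorch, not the authors' code; the class names, dimensions, and toy data are assumptions, and a simplified MAE-style pixel-reconstruction loss stands in for iBOT's self-distillation objective, which is omitted for brevity.

```python
# Minimal sketch: per-modality masked-image-modeling pretraining, then
# fine-tuning a small task decoder. Illustrative only; VisionFM uses iBOT
# (self-distillation with an online tokenizer), simplified here to a
# pixel-reconstruction loss so the example stays short.
import torch
import torch.nn as nn

class PatchEncoder(nn.Module):
    """Tiny ViT-style encoder: patchify, embed, run a transformer."""
    def __init__(self, img_size=64, patch=8, dim=128, depth=2, heads=4):
        super().__init__()
        self.n_patches = (img_size // patch) ** 2
        self.patch_dim = 3 * patch * patch
        self.unfold = nn.Unfold(kernel_size=patch, stride=patch)
        self.embed = nn.Linear(self.patch_dim, dim)
        self.pos = nn.Parameter(torch.zeros(1, self.n_patches, dim))
        layer = nn.TransformerEncoderLayer(dim, heads, dim * 4, batch_first=True)
        self.encoder = nn.TransformerEncoder(layer, depth)

    def forward(self, x):
        patches = self.unfold(x).transpose(1, 2)       # (B, N, patch_dim)
        return self.encoder(self.embed(patches) + self.pos), patches

def mim_loss(encoder, recon_head, images, mask_ratio=0.5):
    """Mask random patches and reconstruct them from the encoded tokens."""
    tokens, patches = encoder(images)
    mask = torch.rand(tokens.shape[:2], device=images.device) < mask_ratio
    pred = recon_head(tokens)                          # (B, N, patch_dim)
    return ((pred - patches)[mask] ** 2).mean()

# One encoder per modality, as the paper describes (names are illustrative).
modalities = ["fundus", "oct", "slit_lamp"]
encoders = {m: PatchEncoder() for m in modalities}
recon_heads = {m: nn.Linear(128, encoders[m].patch_dim) for m in modalities}

# --- Self-supervised pretraining step (toy stand-in for unlabeled scans) ---
for m in modalities:
    opt = torch.optim.AdamW(
        list(encoders[m].parameters()) + list(recon_heads[m].parameters()), lr=1e-4)
    loss = mim_loss(encoders[m], recon_heads[m], torch.randn(8, 3, 64, 64))
    loss.backward(); opt.step(); opt.zero_grad()

# --- Fine-tuning: reuse a frozen encoder, train a small task decoder ---
decoder = nn.Linear(128, 8)                            # e.g., 8 disease categories
tokens, _ = encoders["fundus"](torch.randn(8, 3, 64, 64))
logits = decoder(tokens.mean(dim=1))                   # pooled tokens -> class logits
```

The design point this illustrates is that the expensive pretraining is done once per modality without labels, while each new clinical task only requires fitting a small decoder.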
Results: VisionFM achieved an average AUROC of 0.950 (95% CI, 0.941–0.959) across eight disease categories in internal validation. External validation showed AUROCs of 0.945 (95% CI, 0.934–0.956) for diabetic retinopathy and 0.974 (95% CI, 0.966–0.983) for AMD, surpassing baseline deep learning approaches. In a 12-disease classification test involving 38 ophthalmologists, VisionFM’s accuracy matched intermediate-level specialists. It successfully handled modality shifts (e.g., grading diabetic retinopathy on previously unseen OCTA), with an AUROC of 0.935 (95% CI, 0.902–0.964). VisionFM also predicted glaucoma progression (F1, 72.3%; 95% CI, 55.0–86.3) and flagged possible intracranial tumors (AUROC, 0.986; 95% CI, 0.960–1.00) from fundus images.
Conclusions: VisionFM offers a versatile, scalable platform for comprehensive ophthalmic tasks. Through self-supervised learning and efficient fine-tuning, it extends specialist-level performance to multiple clinical scenarios and imaging modalities. The study demonstrates that large-scale, multimodal pretraining can enable robust generalization to unseen data, potentially reducing data annotation burdens and accelerating AI adoption worldwide.
Implications for Practice: VisionFM may help address global shortages of qualified ophthalmologists and expand care in low-resource settings, though clinical decision-making still requires appropriate human oversight. Further multicenter studies are needed before widespread implementation, especially for higher-risk use cases such as tumor detection.
Study Strengths and Limitations: Strengths include its unique multimodal design, large-scale pretraining, and extensive external validation. Limitations involve demographic bias toward Chinese datasets, the need for larger cohorts in certain applications (e.g., intracranial tumor detection), and the challenges of matching real-world clinical complexity when only image-based data are used.
Future Research: Further validation in diverse populations, integration of new imaging modalities (e.g., widefield imaging, ultrasound variants), and expansion to additional diseases are planned. Hybridization with large language models could facilitate automatic generation of clinical reports.
Reference: Qiu J, Wu J, Wei H, et al. Development and Validation of a Multimodal Multitask Vision Foundation Model for Generalist Ophthalmic Artificial Intelligence. NEJM AI 2024;1(12). DOI: https://doi.org/10.1056/AIoa2300221
RCT: Avoiding Prophylactic Drains Increases Postoperative Invasive Procedures After Gastrectomy
25 Dec, 2024 | 12:47h | UTC
Background: Prophylactic abdominal drainage following gastrectomy for gastric cancer has been debated for decades. While some Enhanced Recovery After Surgery (ERAS) guidelines discourage routine drains, many surgeons still advocate their use to detect and manage intra-abdominal collections before they become severe. Previous trials were small and underpowered, thus failing to provide robust evidence regarding the real need for prophylactic drains.
Objective: To determine whether omitting a prophylactic drain in gastric cancer surgery leads to a higher likelihood of postoperative invasive procedures (reoperation or percutaneous drainage) within 30 days.
Methods: In this multicenter randomized clinical trial, 404 patients from 11 Italian centers were randomly assigned to either prophylactic drain placement or no drain at the end of subtotal or total gastrectomy. Both academic and community hospitals participated. The primary composite outcome was the rate of reoperation or percutaneous drainage within 30 postoperative days, analyzed via a modified intention-to-treat approach. Secondary endpoints included overall morbidity, anastomotic leaks, length of hospital stay, and 90-day mortality. A parallel invited commentary addressed methodological and clinical perspectives.
Results: Among the 390 patients who underwent resection, 196 had a prophylactic drain and 194 did not. By postoperative day 30, 7.7% of patients in the drain group required reoperation or percutaneous drainage, compared with 15% in the no-drain group. This statistically significant difference was driven by a higher reoperation rate in patients without drains. Both groups had similar anastomotic leak rates (approximately 4% overall). However, patients without prophylactic drains had a higher in-hospital mortality (4.6% vs 0.5%) and were more likely to require escalation of care. There were few drain-related complications, indicating a low risk associated with drain placement. Length of hospital stay and readmission rates were comparable between groups.
Conclusions: Omitting prophylactic drains in gastrectomy was associated with an increased need for postoperative invasive interventions, particularly reoperations. While prior guidelines have recommended against routine drain placement, these findings challenge that stance for total and even subtotal gastrectomies. Surgeons may wish to revisit existing protocols, especially in facilities with fewer resources or lower patient volumes, given the potential reduction in reoperation risk associated with prophylactic drainage.
Implications for Practice: Clinicians should carefully balance possible benefits (earlier detection of fluid collections and reduced reoperations) against potential drawbacks of drain usage. Routine placement may be reconsidered, at least in higher-risk cases or in institutions less equipped for complex salvage procedures.
Study Strengths and Limitations: Key strengths include its robust sample size and standardized criteria for complications. Limitations involve the unblinded nature of postoperative management and the lack of drain fluid amylase measurements to guide removal protocols. Additionally, differentiating total from subtotal gastrectomies might refine selection criteria for prophylactic drainage.
Future Research: Further studies could focus on stratified risk profiles for total vs subtotal gastrectomy and on biomarkers in drain fluid to identify subgroups most likely to benefit from prophylactic drainage.
Meta-analysis: Incidence Rate Difference of Adverse Events from Cannabinoids in Middle-Aged and Older Adults
25 Dec, 2024 | 12:19h | UTC
Background: Growing evidence suggests that cannabinoid-based medicines (CBMs) are increasingly prescribed to individuals aged 50 years and above for various clinical conditions. While these agents may offer therapeutic benefits, questions remain about the incidence of adverse events (AEs), particularly in older adults with multiple comorbidities. This systematic review and meta-analysis aims to quantify the incidence rate difference (IRD) of AEs and determine whether weekly doses of delta-9-tetrahydrocannabinol (THC) and cannabidiol (CBD) are associated with any dose-dependent increase in risk.
Objective: To evaluate whether adults aged ≥50 years exposed to CBMs, including THC-alone formulations and THC combined with CBD, experience a higher incidence of AEs than controls, and to assess how variations in weekly THC and CBD doses might affect AE rates.
Methods: Researchers searched MEDLINE, PubMed, EMBASE, CINAHL, PsycINFO, Cochrane Library, and ClinicalTrials.gov from January 1, 1990, to June 12, 2023. Randomized clinical trials involving middle-aged and older adults (mean age ≥50 years) using medicinal CBMs for all indications were included. Data on common and serious AEs, withdrawals, and deaths were extracted and pooled using a random-effects model. Further meta-regression analyses examined THC and CBD weekly doses as predictors of AEs in THC-only and THC:CBD trials.
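As background on how an IRD analysis of this kind is typically pooled, the sketch below implements standard DerSimonian–Laird random-effects pooling of per-trial incidence rate differences. It is a generic illustration with made-up trial counts and person-time, not the authors' analysis code or data.

```python
# Illustrative pooled incidence rate difference (IRD) via a
# DerSimonian-Laird random-effects model. Trial rows are hypothetical.
# IRD = events/person-time (treatment) - events/person-time (control);
# its variance is approximated as e_t/pt_t**2 + e_c/pt_c**2 (Poisson).
import numpy as np

# (events_treat, persontime_treat, events_ctrl, persontime_ctrl)
trials = np.array([
    (12, 80.0, 5, 82.0),
    (30, 150.0, 14, 148.0),
    (7, 60.0, 6, 61.0),
])
e_t, pt_t, e_c, pt_c = trials.T
ird = e_t / pt_t - e_c / pt_c                   # per-trial rate differences
var = e_t / pt_t**2 + e_c / pt_c**2             # per-trial variances

w = 1.0 / var                                   # inverse-variance weights
fixed = np.sum(w * ird) / np.sum(w)             # fixed-effect estimate
q = np.sum(w * (ird - fixed) ** 2)              # Cochran's Q heterogeneity
tau2 = max(0.0, (q - (len(ird) - 1)) / (np.sum(w) - np.sum(w**2) / np.sum(w)))

w_re = 1.0 / (var + tau2)                       # random-effects weights
pooled = np.sum(w_re * ird) / np.sum(w_re)
se = np.sqrt(1.0 / np.sum(w_re))
print(f"pooled IRD = {pooled:.3f} "
      f"(95% CI {pooled - 1.96*se:.3f} to {pooled + 1.96*se:.3f})")
```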
Results: Fifty-eight randomized clinical trials (n=6611) met inclusion criteria, with 3450 participants receiving CBMs. Compared with controls, individuals receiving THC alone experienced a significantly higher incidence of dizziness, somnolence, impaired coordination, dissociative symptoms, and dry mouth, often in a dose-dependent manner. Similarly, THC:CBD combinations increased nausea, vomiting, fatigue, dizziness, and disorientation. The incidence of serious AEs, withdrawals, or mortality did not differ significantly between CBM and control groups, although neurological or psychiatric side effects were more pronounced with higher THC doses.
Conclusions: THC-containing CBMs can provoke dose-related gastrointestinal, neurological, and psychiatric adverse events, posing additional risks in older adults susceptible to falls and cognitive disturbances. However, the meta-analysis found no significant increases in serious AEs or deaths. Clinicians should weigh potential benefits against the likelihood of common side effects, especially when prescribing higher THC doses or combining cannabinoids with other medications frequently used by older patients.
Implications for Practice:
- Physicians should exercise caution when initiating or escalating THC-based therapies in middle-aged and older adults, monitoring for neurological or psychiatric AEs.
- Using lower THC doses, titrating gradually, and adding CBD may mitigate some side effects.
- Though severe AEs are uncommon, vigilance is warranted in individuals with complex medication regimens.
Study Strengths and Limitations:
- Strength: This review merges diverse clinical conditions and provides a comprehensive assessment of THC vs. THC:CBD. Its large pooled population allows for more precise IRD estimates.
- Limitation: Short treatment durations in many trials limit understanding of long-term toxicity, and some studies lacked rigorous reporting of randomization and outcome measures, potentially introducing bias.
Future Research:
- Longer-duration trials focused on older populations are needed to clarify chronic safety profiles.
- Studies exploring drug-drug interactions between CBMs and medications commonly prescribed to older adults will further elucidate real-world tolerability.
Reference: Velayudhan L, Pisani S, Dugonjic M, McGoohan K, Bhattacharyya S. Adverse events caused by cannabinoids in middle aged and older adults for all indications: a meta-analysis of incidence rate difference. Age and Ageing. 2024;53(11):afae261. DOI: https://doi.org/10.1093/ageing/afae261
Bayesian Network Meta-Analysis: Chlorpromazine IV/IM Emerges as a Top Choice for Acute Migraine Relief in the ED
25 Dec, 2024 | 11:18h | UTC
Background: Acute migraine is a prevalent cause of emergency department (ED) visits, necessitating prompt pain control. Although numerous drugs are available, there is debate about the most effective and safest options. Traditional pairwise meta-analyses fail to capture all treatment comparisons in a single framework, making network meta-analyses, particularly Bayesian, an appealing approach to inform clinical decision-making.
Objective: This systematic review and Bayesian network meta-analysis aimed to compare multiple pharmacologic therapies—single agents or combinations—for acute migraine relief in adults presenting to the ED. The goal was to identify those most likely to achieve adequate pain relief, reduce rescue medication use, and minimize significant adverse reactions.
Methods: The authors searched MEDLINE, Embase, and Web of Science from inception to February 9, 2024, for randomized controlled trials comparing any pharmacologic therapy to another or to placebo in ED patients with migraine. Four primary outcomes were analyzed: (1) adequate pain relief at two hours, (2) change in pain intensity at one hour, (3) need for rescue drug at two hours, and (4) significant adverse reaction (eg, sedation, akathisia, hypotension).
Results: Twenty-four to twenty-seven trials contributed to each outcome network. Chlorpromazine IV/IM was ranked highest for adequate pain relief (SUCRA=87.3%) and also significantly reduced the need for rescue medication (SUCRA=93.2%). Ibuprofen IV and valproate IV were among the least effective agents for pain relief, while dexamethasone IV had the highest probability of avoiding significant adverse reactions (SUCRA=79.5%). However, most comparisons were of low or very low certainty, limiting the strength of the findings.
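For readers unfamiliar with the SUCRA values cited above, the short sketch below shows how a SUCRA score is derived from a treatment's posterior rank probabilities in a Bayesian network meta-analysis. The rank-probability matrix is invented for illustration and does not reproduce this study's results.

```python
# SUCRA (surface under the cumulative ranking curve) converts a treatment's
# posterior rank probabilities into one 0-100% score: 100% means it always
# ranks first, 0% always last. The matrix below is hypothetical.
import numpy as np

treatments = ["chlorpromazine IV/IM", "ibuprofen IV", "dexamethasone IV"]
# rank_probs[i, j] = posterior probability that treatment i has rank j+1
rank_probs = np.array([
    [0.70, 0.20, 0.10],
    [0.05, 0.25, 0.70],
    [0.25, 0.55, 0.20],
])
cum = np.cumsum(rank_probs, axis=1)     # P(rank <= j+1) for each treatment
# SUCRA = average of the cumulative ranking curve over the first k-1 ranks
sucra = cum[:, :-1].mean(axis=1)
for name, s in zip(treatments, sucra):
    print(f"{name}: SUCRA = {s:.1%}")   # e.g., 80.0% for the first row
```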
Conclusions: Chlorpromazine IV/IM appears among the most effective single agents for acute migraine in the ED, although it may carry higher risks of sedation or hypotension. Certain analgesics (eg, ibuprofen IV, valproate IV, and possibly ketorolac IV/IM) demonstrated lower efficacy. Due to variability in trial size, dosing, and participant characteristics, the overall certainty of evidence remains limited.
Implications for Practice: Clinicians may consider parenteral chlorpromazine for rapid migraine relief, balancing its adverse event profile with potential efficacy. Dexamethasone’s lower probability of serious side effects could make it a complementary option. The findings highlight the need for individualized treatment, taking into account patient comorbidities and preferences.
Study Strengths and Limitations: This network meta-analysis offers a broad comparative perspective on diverse pharmacologic interventions for ED-based migraine management. Nonetheless, there is notable heterogeneity in study methodologies, small sample sizes, and sparse direct comparisons for many interventions, all of which reduce certainty in the estimates.
Future Research: Larger, more standardized trials are needed to confirm these results and directly compare drugs like chlorpromazine, prochlorperazine, and metoclopramide-NSAID combinations. Rigorous safety reporting is crucial to clarify adverse reaction risks for various agents, especially those with less available evidence.
Reference: deSouza IS, Anthony N, Thode H Jr, et al. Effectiveness and Safety of Pharmacologic Therapies for Migraine in the Emergency Department: A Systematic Review and Bayesian Network Meta-analysis. Annals of Emergency Medicine. DOI: https://doi.org/10.1016/j.annemergmed.2024.11.004
AGA Clinical Practice Update on Screening and Surveillance in High-Risk US Populations for Gastric Cancer: Expert Review
25 Dec, 2024 | 11:02h | UTC
Introduction:
This American Gastroenterological Association (AGA) Clinical Practice Update provides guidance on primary and secondary prevention strategies for gastric cancer (GC) among high-risk groups in the United States. GC disproportionately affects racial and ethnic minorities, certain first-generation immigrants from countries with elevated GC incidence, and individuals with specific hereditary syndromes or family histories of GC. Given ongoing disparities in diagnosis and outcomes, this document outlines best practices for recognizing at-risk individuals, performing high-quality endoscopic screening, and establishing surveillance protocols for gastric precancerous conditions.
Key Recommendations:
- Identify High-Risk Groups: Consider screening among first-generation immigrants from high-incidence regions, people with a family history of GC in a first-degree relative, individuals with hereditary gastrointestinal syndromes, and patients with multiple risk factors (eg, chronic Helicobacter pylori infection, smoking, diets high in salt and processed meats).
- Preferred Screening Modality: Upper endoscopy is considered the best method for detecting precancerous lesions (atrophic gastritis and intestinal metaplasia) and early malignancies. It allows direct visualization of the gastric mucosa, systematic biopsy, and accurate histologic staging.
- High-Quality Endoscopic Examination: Essential elements include high-definition endoscopes, optimal mucosal cleansing and insufflation, adequate inspection time, systematic photodocumentation, and biopsy protocols (such as the updated Sydney System) to detect and characterize precancerous changes or early cancer.
- H. pylori Eradication: Opportunistic screening for H. pylori and its eradication are key adjunctive measures in preventing GC development. Family-based testing—screening adult household members of H. pylori–positive individuals—may further reduce reinfection rates and disease progression.
- Systematic Biopsy Protocols: When atrophic gastritis or intestinal metaplasia is suspected, obtain at least five biopsies (antrum/incisura and corpus in separate containers). Any suspicious lesion should be sampled independently.
- Recognition of Metaplasia and Dysplasia: Endoscopists should be trained to accurately identify visual patterns associated with gastric intestinal metaplasia (GIM) and dysplasia. Artificial intelligence may hold promise, but current data are insufficient to recommend routine use.
- Risk Stratification and Surveillance Intervals: Patients with confirmed GIM or dysplasia, especially those with severe or extensive metaplasia, may require follow-up endoscopy every three years. Individuals with multiple risk factors or severe metaplastic changes could benefit from shorter intervals.
- Management of Dysplasia and Early GC: All dysplasia should be reviewed by an expert gastrointestinal pathologist. Visible high-grade dysplasia or early GC lesions generally warrant endoscopic submucosal dissection (ESD) at specialized centers to achieve en bloc, R0 resection and enable accurate pathology.
- Post-Resection Surveillance: Individuals with successfully resected dysplasia or early cancer need ongoing endoscopic surveillance to detect metachronous lesions. Surveillance intervals vary depending on pathology results and patient-level factors.
- De-Escalation of Screening: Discontinue screening or surveillance when the patient is no longer fit for potential endoscopic or surgical treatment.
- Equity and Sustainability: To reduce GC mortality, it is crucial to address modifiable risk factors, enhance patient access to endoscopy and skilled practitioners, and integrate research advances, especially in noninvasive biomarker development and improved endoscopic technologies.
Conclusion:
An effective US-based GC screening and surveillance program requires robust preprocedural identification of high-risk individuals, intraprocedural adherence to quality endoscopy standards, and consistent postprocedural follow-up to ensure equitable access to treatment. By refining these clinical practices and prioritizing research, meaningful reductions in GC incidence and mortality can be achieved, ultimately improving patient outcomes and addressing healthcare disparities.
Reference:
Shah SC, Wang AY, Wallace MB, Hwang JH. AGA Clinical Practice Update on Screening and Surveillance in Individuals at Increased Risk for Gastric Cancer in the United States: Expert Review. Gastroenterology. Published online December 23, 2024.
https://doi.org/10.1053/j.gastro.2024.11.001
RCT: Levofloxacin for the Prevention of Multidrug-Resistant Tuberculosis in Vietnam
24 Dec, 2024 | 12:53h | UTC
Background:
Multidrug-resistant (MDR) and rifampin-resistant tuberculosis pose significant global health challenges. Preventing active disease among contacts exposed to resistant strains is critical, yet limited evidence exists on targeted chemopreventive interventions. This study investigated whether a six-month course of daily levofloxacin could reduce the incidence of bacteriologically confirmed tuberculosis among household contacts of individuals with confirmed MDR or rifampin-resistant tuberculosis in Vietnam.
Objective:
To assess if levofloxacin prophylaxis decreases the 30-month incidence of active tuberculosis among high-risk contacts. Primary endpoints included bacteriologically confirmed disease, and secondary outcomes encompassed adverse events, mortality, and development of fluoroquinolone-resistant Mycobacterium tuberculosis.
Methods:
Researchers conducted a double-blind, placebo-controlled, randomized trial. Eligible participants were household contacts of persons who had started MDR tuberculosis treatment within the previous three months, had a positive tuberculin skin test or immunosuppressive condition, and showed no clinical or radiographic signs of active disease. Enrolled individuals received weight-based oral levofloxacin (up to 750 mg/day) or an identical placebo for 180 days. Monthly visits supported adherence and monitored adverse events. Participants underwent follow-up visits every six months until 30 months for tuberculosis screening, chest radiography, and sputum testing where indicated.
Results:
Of 2041 randomized contacts, 1995 (97.7%) completed 30 months of follow-up or reached a primary endpoint. Confirmed tuberculosis was diagnosed in 6 participants (0.6%) in the levofloxacin group and 11 (1.1%) in the placebo group (incidence rate ratio, 0.55; 95% CI, 0.19–1.62), a difference that did not achieve statistical significance. Severe (grade 3 or 4) adverse events were infrequent in both groups, while mild adverse events were more common with levofloxacin (31.9% vs. 13.0%). Acquired fluoroquinolone resistance was not detected.
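To show where the reported incidence rate ratio comes from, the following worked calculation reconstructs it from the event counts, under the simplifying assumption (not stated in the summary) that person-time was roughly equal in the two arms of this 1:1 trial:

```latex
% Hedged reconstruction, assuming roughly equal person-time per arm
% (about 1020 contacts per arm, followed for up to 30 months).
\[
\mathrm{IRR}
  = \frac{6/\mathrm{PT}_{\text{levofloxacin}}}{11/\mathrm{PT}_{\text{placebo}}}
  \approx \frac{6}{11}
  \approx 0.55
\]
% The wide 95% CI (0.19--1.62) reflects only 17 confirmed cases in total.
```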
Conclusions:
Daily levofloxacin for six months showed a numerically lower incidence of tuberculosis than placebo, but the difference was not statistically significant due to lower-than-expected case counts. Treatment was generally well tolerated; however, higher discontinuation rates occurred among levofloxacin recipients, often due to mild musculoskeletal complaints. Further studies may clarify the role of fluoroquinolone-based regimens in preventing MDR tuberculosis across diverse epidemiologic contexts.
Implications for Practice:
These findings suggest that levofloxacin prophylaxis could benefit contacts at high risk of MDR tuberculosis, albeit with caution regarding adherence challenges and low-grade side effects. Broader implementation would require diligent screening, consideration of background fluoroquinolone resistance, and strategies to manage mild adverse events that could undermine treatment completion.
Study Strengths and Limitations:
Strengths include a rigorous double-blind, placebo-controlled design, nearly complete follow-up, and thorough exclusion of prevalent tuberculosis at baseline. Limitations involve an unexpectedly low incidence of confirmed disease, limiting statistical power, and a study population with low HIV prevalence, which may reduce generalizability.
Future Research:
Further research is necessary to confirm these findings in diverse settings, explore alternative or shorter regimens (including newer agents like delamanid), and investigate optimal approaches for patients with fluoroquinolone-resistant strains. The long-term impact on transmission dynamics and microbiome shifts also warrants additional investigation.
Guideline: Metformin to Prevent Antipsychotic-Induced Weight Gain
23 Dec, 2024 | 20:55h | UTC
Introduction:
This guideline was developed to address a pressing need for strategies to prevent antipsychotic-induced weight gain (AIWG), a frequent and troubling adverse effect of treatment in individuals with severe mental illness (SMI). Although metformin has shown consistent benefits in mitigating weight gain when initiated alongside antipsychotics, clinical uptake remains limited. The guideline follows the AGREE II framework and synthesizes both randomized and observational research, including Cochrane and meta-analytic data. The primary objective is to outline explicit indications, dosing approaches, and duration for using metformin to avert AIWG.
Key Recommendations:
- Co-initiation With High-Risk Agents: In patients requiring higher-risk antipsychotics (olanzapine, clozapine), start metformin simultaneously. Evidence suggests that concurrent treatment may lessen weight gain by 3 to 5 kg in the early months, potentially yielding greater benefits over time.
- Co-initiation With Medium-Risk Agents: For individuals prescribed quetiapine, paliperidone, or risperidone who have at least one cardiometabolic risk factor (such as diabetes, prediabetes, hypertension, or BMI above 25) or who are 10 to 25 years old, begin metformin at antipsychotic initiation to curb rapid weight changes.
- Initiation During the First Year: If, at any point in the first year of antipsychotic treatment, weight gain exceeds 3% over baseline, consider adding metformin regardless of the antipsychotic being used.
- Titration Schedule and Safety: The guideline advises starting at 500 mg once daily, then moving to 500 mg twice daily after about two weeks, with subsequent increases every two weeks up to 1 g twice daily (2 g/day) as tolerated. Metformin must be discontinued if lactic acidosis is suspected, if BMI falls below 20, or if the antipsychotic is stopped. Avoid metformin in patients with harmful alcohol use.
- Additional Treatment Options: In cases of obesity (BMI ≥30) or comorbid metabolic disorders, clinicians should consider adding glucagon-like peptide-1 (GLP-1) receptor agonists where available. If cost, supply, or access is limited, metformin remains a practical alternative.
Conclusion:
This is the first evidence-based guideline focused on preventing AIWG by starting metformin at the time of antipsychotic initiation or upon early weight gain signs. By reducing the magnitude of weight increase, metformin may alleviate health risks tied to obesity, as well as psychological distress and nonadherence to treatment. Implementing the guideline involves continuous weight monitoring, structured dose adjustments, and shared decision-making. Ensuring clear communication about benefits and potential side effects will be crucial for sustaining adherence and improving patient outcomes.
Reference:
Carolan A, Hynes-Ryan C, Agarwal SM, Bourke R, Cullen W, Gaughran F, Hahn MK, Krivoy A, Lally J, Leucht S, et al. Metformin for the Prevention of Antipsychotic-Induced Weight Gain: Guideline Development and Consensus Validation. Schizophrenia Bulletin. 2024; sbae205.
DOI: https://doi.org/10.1093/schbul/sbae205
Additional Commentaries:
- Psychiatric News Alert: https://alert.psychnews.org/2024/12/new-guideline-advises-metformin-to.html
- Zagorski N. Metformin May Reduce Weight Gain in Youth Taking Antipsychotics. Psychiatric News. 2024; 59(01). https://psychiatryonline.org/doi/full/10.1176/appi.pn.2024.01.1.22
2025 ASA Practice Advisory for the Perioperative Care of Older Adults Undergoing Inpatient Surgery
23 Dec, 2024 | 20:27h | UTC
Introduction: This summary outlines the American Society of Anesthesiologists (ASA) 2025 advisory on optimizing perioperative care for older adults (age 65 years or older) undergoing inpatient surgery. It focuses on preoperative, intraoperative, and postoperative measures to mitigate cognitive complications, especially delirium and longer-term cognitive decline, in a population that is highly vulnerable to functional deterioration and loss of independence. The recommendations are based on systematic reviews and meta-analyses, supplemented by expert consensus where evidence is limited. Although not intended as strict standards of care, these advisory statements provide practical guidance that can be adapted to local contexts and patient-specific needs.
Key Recommendations:
- Expanded Preoperative Evaluation:
- Incorporate frailty assessment, cognitive screening, and psychosocial or nutritional evaluations into routine preoperative workups for older patients.
- Patients identified with frailty or cognitive deficits should receive targeted interventions, such as geriatric co-management, deprescribing when indicated, and early family education about delirium risks.
- Evidence suggests a modest decrease in postoperative delirium when such evaluations are included.
- Choice of Primary Anesthetic (Neuraxial vs. General):
- Current studies do not demonstrate a clear advantage of neuraxial over general anesthesia in reducing postoperative delirium risk.
- Both approaches are acceptable; individualize decisions based on patient factors, surgical requirements, and preference-sensitive discussions.
- Maintenance of General Anesthesia (Total Intravenous vs. Inhaled Agents):
- Data are inconclusive regarding delirium prevention, with no significant difference between total intravenous anesthesia (TIVA) and inhaled volatile agents.
- Some low-level evidence indicates TIVA might reduce short-term cognitive decline, but this effect is inconsistent over longer follow-up.
- Dexmedetomidine for Delirium Prophylaxis:
- Moderate-level evidence supports dexmedetomidine for reducing delirium incidence in older patients, yet its use may increase bradycardia and hypotension.
- Optimal dosing and timing remain uncertain, and baseline patient vulnerability should inform decisions.
- Medications with Potential Central Nervous System Effects:
- Drugs such as benzodiazepines, antipsychotics, anticholinergics, ketamine, and gabapentinoids warrant careful risk-benefit analysis.
- Current findings are inconclusive, suggesting neither a firm endorsement nor outright disapproval; preexisting conditions and polypharmacy should guide individualized treatment plans.
Conclusion: Preserving cognitive function and independence in older adults undergoing inpatient surgery is a growing priority. These recommendations highlight the importance of comprehensive preoperative screenings (frailty, cognition, and psychosocial domains), shared decision-making when choosing anesthetic techniques, and thoughtful use of pharmacologic agents. While dexmedetomidine shows promise in mitigating delirium, vigilance regarding hypotension and bradycardia is essential. Ultimately, these strategies aim to reduce anesthesia-related complications in older patients by addressing the multifaceted determinants of postoperative cognitive outcomes.
Network Meta-Analysis: Triplet and Quadruplet Chemotherapy Regimens for Advanced Pancreatic Cancer
22 Dec, 2024 | 17:38h | UTC
Background: Advanced pancreatic ductal adenocarcinoma (PDAC) is associated with poor survival, as most patients present with metastatic or unresectable disease. While gemcitabine monotherapy was the mainstay for decades, subsequent trials found survival gains with multi-agent regimens such as FOLFIRINOX and gemcitabine plus nab-paclitaxel. However, head-to-head data comparing these regimens are limited. This systematic review and Bayesian network meta-analysis aimed to consolidate available evidence on first-line treatment options, focusing on both efficacy and safety outcomes.
Objective: To compare different chemotherapy regimens for unresectable locally advanced or metastatic PDAC, determining which options confer the greatest benefit in progression-free survival (PFS), overall survival (OS), objective response rate (ORR), and toxicity profiles.
Methods: Phase 2–3 randomized controlled trials published since 2000 were searched in PubMed, Cochrane Central, and Embase, and from oncology conference proceedings. Eligible studies enrolled previously untreated patients with advanced PDAC, examining at least one regimen containing gemcitabine, fluorouracil, oxaliplatin, irinotecan, nab-paclitaxel, or liposomal irinotecan. Hazard ratios (HRs) and odds ratios (ORs) were computed within a Bayesian framework, ranking therapies using surface under the cumulative ranking curves. Risk of bias was assessed with Cochrane’s RoB 2 tool; evidence certainty was appraised using GRADE.
Results: Seventy-nine trials (22,168 patients) were included. PFS analysis showed gemcitabine plus nab-paclitaxel alternating with FOLFOX (HR 0.32), PAXG (0.35), and NALIRIFOX (0.43) offered the highest benefit versus gemcitabine alone, followed by FOLFIRINOX (0.55) and gemcitabine plus nab-paclitaxel (0.62). For OS, the top three regimens were PAXG (HR 0.40), gemcitabine plus nab-paclitaxel alternating with FOLFOX (0.46), and NALIRIFOX (0.56), with FOLFIRINOX (0.66) and gemcitabine plus nab-paclitaxel (0.67) close behind. NALIRIFOX demonstrated relatively lower rates of hematologic toxicity compared with FOLFIRINOX and gemcitabine plus nab-paclitaxel, although diarrhea and neuropathy remained frequent across most multi-drug regimens. Overall, the certainty of evidence was low, largely because of indirect comparisons and high heterogeneity among trials.
Conclusions: Among triplet options, NALIRIFOX and FOLFIRINOX appear most favorable for patients who can tolerate intensified therapy, with gemcitabine plus nab-paclitaxel remaining a viable option for less fit patients. Quadruplet regimens, either administered concurrently or sequentially, show promising efficacy but warrant validation in phase 3 trials.
Implications for Practice: Clinical decision-making should balance potential survival benefits with toxicity, patient performance status, and cost-effectiveness considerations. While more intensive combinations can prolong survival, adverse effects and patient quality of life must be carefully monitored.
Study Strengths and Limitations: This is the largest systematic review and network meta-analysis to date in this setting, synthesizing 79 trials. Nevertheless, the evidence base is limited by phase 2 studies, inconsistent reporting of toxicities, and relatively few direct head-to-head comparisons. Most trials lacked central radiographic review.
Future Research: Prospective phase 3 trials evaluating quadruplet or sequential regimens are needed. Additional biomarker-driven strategies, personalized chemotherapy scheduling, and earlier integration of palliative care may also help enhance survival and preserve quality of life in this population.