Editor's Choice
RCT: Sequential Oral Agents Not Noninferior to Insulin for Gestational Diabetes
8 Jan, 2025 | 11:05h | UTC

Background: Gestational diabetes mellitus (GDM) affects a growing number of pregnant individuals worldwide. While insulin has long been the standard pharmacological treatment, oral glucose-lowering agents (metformin and glyburide) have gained traction.
Objective: This trial investigated whether a sequential oral glucose-lowering regimen—beginning with metformin and adding glyburide as needed—was noninferior to an insulin-based strategy in reducing the risk of infants born large for gestational age (LGA).
Methods: This open-label, randomized, noninferiority trial enrolled 820 participants with singleton pregnancies at 16 to 34 weeks of gestation across 25 Dutch centers. Participants were randomized 1:1 to either (1) sequential oral therapy—metformin initiated at 500 mg once daily and titrated every three days up to 1000 mg twice daily or the highest tolerated dose; glyburide added if needed at 2.5 mg taken 30-60 minutes before each meal and increased to a maximum of 5 mg three times daily; and insulin added (with glyburide discontinued) only if both oral agents failed—or (2) standard insulin therapy. The primary outcome was an LGA infant (birth weight >90th percentile for gestational age and sex).
Results: Among those allocated to oral therapy (n=409), 79% achieved glycemic control without insulin. However, 23.9% of infants in the oral-therapy group were LGA vs 19.9% in the insulin group (absolute risk difference, 4.0%; 95% CI, −1.7% to 9.8%). Because the upper bound of this confidence interval exceeded the predefined 8% absolute risk difference noninferiority margin, noninferiority was not established (P = .09 for noninferiority). Maternal hypoglycemia occurred more often with oral agents (20.9% vs 10.9%; absolute risk difference, 10.0%; 95% CI, 3.7%-21.2%), and neonatal intravenous glucose therapy was administered more frequently in the oral-agent group (6.4% vs 3.2%). An exploratory analysis of participants who required only metformin (no glyburide), not powered for definitive conclusions, showed a somewhat lower LGA rate (19.7%).
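The noninferiority call here turns on the confidence interval, not the 4.0% point estimate. As an illustration of that logic only—using a simple Wald interval and an assumed ~400 infants per arm, not the trial's actual statistical analysis—the reported bounds can be approximated as follows:

```python
from math import sqrt

def risk_difference_ci(p1, p2, n1, n2, z=1.96):
    """Wald 95% CI for the absolute risk difference p1 - p2."""
    rd = p1 - p2
    se = sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
    return rd, rd - z * se, rd + z * se

# LGA rates: 23.9% (oral therapy) vs 19.9% (insulin); ~400 infants/arm assumed
rd, lo, hi = risk_difference_ci(0.239, 0.199, 400, 400)
margin = 0.08  # prespecified noninferiority margin

# Noninferiority requires the entire CI to sit below the margin
noninferior = hi < margin
print(f"RD {rd:.1%} (95% CI {lo:.1%} to {hi:.1%}); noninferior: {noninferior}")
```

With these assumed inputs the interval spans roughly −1.7% to 9.7%, close to the published bounds; because the upper bound crosses 8%, noninferiority is not demonstrated even though the point estimate sits well inside the margin.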
Conclusions: A sequential oral pharmacotherapy strategy—beginning with metformin and adding glyburide if needed—did not meet noninferiority criteria compared with insulin for preventing LGA births in GDM. While oral agents can reduce the overall need for insulin, the higher rate of maternal hypoglycemia, the higher rate of neonatal hypoglycemia requiring intravenous glucose therapy, and the borderline higher LGA incidence underscore the continued importance of insulin-based strategies, particularly as these results add to a larger body of evidence that glyburide is a suboptimal treatment for gestational diabetes. They reinforce that insulin remains the preferred first-line pharmacological treatment for GDM, in line with current guidelines. Although patient satisfaction can be higher with oral agents, clinicians should carefully weigh the risks. Further research is needed to clarify the role of metformin-only approaches in GDM management.
Strengths and Limitations: Strengths include a large multicenter design and a clear noninferiority framework. Limitations include the open-label design, which introduces the possibility of bias in treatment allocation and outcome assessment, the reliance on local clinical protocols for insulin adjustments, and variations in diagnostic criteria.
Future Research: Ongoing trials are examining whether metformin alone might match insulin’s efficacy for GDM. Further studies should address long-term offspring outcomes.
Reference:
Rademaker D, de Wit L, Duijnhoven RG, et al. Oral Glucose-Lowering Agents vs Insulin for Gestational Diabetes: A Randomized Clinical Trial. JAMA. Published online January 6, 2025. DOI: https://doi.org/10.1001/jama.2024.23410
Powe CE. For Gestational Diabetes Pharmacotherapy, Insulin Reigns Supreme (Editorial). JAMA. Published online January 6, 2025. DOI: https://doi.org/10.1001/jama.2024.27148
RCT: Assessing Procalcitonin-Based Antibiotic Management in Critically Ill Patients With Sepsis
7 Jan, 2025 | 14:00h | UTC

Background: Optimal antibiotic duration for sepsis remains uncertain. Procalcitonin (PCT) and C-reactive protein (CRP) are thought to support shorter courses, but prior research was small-scale or at risk of bias. This multicenter, randomized trial (ADAPT-Sepsis) evaluated whether daily PCT- or CRP-guided protocols could reduce antibiotic use without increasing 28-day all-cause mortality in critically ill adults with suspected sepsis.
Objective: To determine if daily biomarker-guided (PCT or CRP) strategies decrease total antibiotic days among critically ill adults while maintaining acceptable 28-day mortality, compared with standard care.
Methods: From 2018 to 2024 (with enrollment paused March–August 2020 due to COVID-19), 2760 adults (≥18 years) on intravenous antibiotics for suspected sepsis (acute organ dysfunction and presumed infection) and likely to continue antibiotics for at least 72 hours were randomized across 41 UK NHS ICUs within 24 hours of antibiotic initiation. They were assigned in a 1:1:1 ratio to (1) daily PCT-guided advice (n=918), (2) daily CRP-guided advice (n=924), or (3) standard care (n=918). Biomarker results were concealed; clinicians received automated daily prompts recommending continuation or discontinuation. The co-primary outcomes were (1) total antibiotic duration (randomization to day 28) and (2) 28-day all-cause mortality. Secondary measures included antibiotic duration for the initial sepsis episode, 90-day mortality, readmissions, and length of stay.
Results: Among 2760 participants (mean age, 60.2 years; 60.3% men; ~50% with septic shock), over 96% provided 28-day data. Patients in the PCT-guided arm had a statistically significant mean reduction in total antibiotic duration vs standard care (9.8 vs 10.7 days; difference, 0.88 days; 95% CI, 0.19–1.58; p=0.01). The PCT strategy met the prespecified 5.4% noninferiority margin for 28-day mortality (20.9% vs 19.4%; absolute difference, 1.57 percentage points; 95% CI, −2.18 to 5.32; p=0.02), implying noninferiority but not fully excluding a small risk of excess mortality. CRP-guided protocols did not shorten total antibiotic use (10.6 vs 10.7 days; p=0.79) and were inconclusive for noninferiority regarding mortality (21.1% vs 19.4%; difference, 1.69 percentage points; 95% CI, −2.07 to 5.45; p=0.03). Notably, 90-day mortality also showed no significant differences. A post-trial commentary (PulmCCM) emphasized that some uncertainty remains with the 5.4% margin and warned that patient-level randomization could subtly discourage earlier antibiotic discontinuation in standard care, which received no explicit “stop” prompts.
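Interpreting the two biomarker arms reduces to comparing each mortality confidence interval against the 5.4% margin. A minimal sketch of that decision rule, plugging in the reported risk-difference bounds (in percentage points):

```python
def noninferiority_verdict(ci_low, ci_high, margin):
    """Classify a noninferiority comparison from the CI of the risk
    difference (biomarker arm minus standard care) and the margin."""
    if ci_high < margin:
        return "noninferior"    # entire CI below the margin
    if ci_low > margin:
        return "inferior"       # entire CI above the margin
    return "inconclusive"       # CI straddles the margin

# Reported 28-day mortality risk differences vs standard care
pct_verdict = noninferiority_verdict(-2.18, 5.32, 5.4)  # PCT arm
crp_verdict = noninferiority_verdict(-2.07, 5.45, 5.4)  # CRP arm
print(pct_verdict, crp_verdict)
```

The PCT interval just clears the margin ("noninferior"), while the CRP interval's upper bound crosses it ("inconclusive")—and this is why a margin this wide draws scrutiny: "noninferior" here still tolerates up to 5.32 percentage points of excess mortality.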
Conclusions: In critically ill patients with suspected sepsis, a PCT-guided antibiotic discontinuation protocol shortened overall antibiotic use by nearly one day without exceeding the predefined noninferiority threshold for 28-day mortality. However, the chosen 5.4% margin allows for the possibility of clinically relevant harm. A CRP-guided protocol did not reduce total antibiotic use and showed inconclusive mortality findings.
Implications for Practice: Adopting PCT-based stewardship may modestly decrease antibiotic exposure without a clear short-term mortality penalty, potentially limiting antibiotic resistance. Clinicians should remain vigilant, recognizing the risk tolerance implied by the 5.4% margin. PCT results should complement, not replace, comprehensive clinical judgment.
Study Strengths and Limitations: Strengths include the large sample size, multi-center design, blinded biomarker allocation, and distinct emphasis on both effectiveness and safety outcomes. Limitations include the acceptance of a 5.4% potential excess mortality as the noninferiority threshold, uncertainty about rare but significant harms, and the possibility of bias introduced by patient-level randomization. Generalizability to lower-resource settings may also be limited.
Future Research: Further randomized trials with lower noninferiority margins or cluster-level allocation are needed to better define the safety and efficacy of PCT-guided strategies for reducing antibiotic duration in sepsis. Additional investigations are needed for long-term patient-centered outcomes, cost-effectiveness, and the role of alternative biomarkers or combined strategies in sepsis care.
Reference:
Dark P, Hossain A, McAuley DF, et al. Biomarker-Guided Antibiotic Duration for Hospitalized Patients With Suspected Sepsis: The ADAPT-Sepsis Randomized Clinical Trial. JAMA. Published online December 9, 2024. DOI: https://doi.org/10.1001/jama.2024.26458
PulmCCM Commentary: “Is procalcitonin ‘safe’ to guide antibiotic use in patients with sepsis? ADAPT-Sepsis tests the strategy in the U.K., with global ambitions.” Jan 02, 2025. https://www.pulmccm.org/p/is-procalcitonin-safe-to-guide-antibiotic
Joint ATS/CDC/ERS/IDSA Guideline Recommends Shorter, All-Oral Regimens for Drug-Susceptible and Drug-Resistant TB
5 Jan, 2025 | 11:30h | UTC

Introduction: This summary outlines new clinical practice guidelines from the American Thoracic Society, U.S. Centers for Disease Control and Prevention, European Respiratory Society, and Infectious Diseases Society of America on updated treatment regimens for tuberculosis (TB) in low-incidence settings. These recommendations build on recent clinical trials and World Health Organization (WHO) guidance and were developed using the GRADE and GRADE-ADOLOPMENT methodology. The guidelines aim to shorten treatment duration, reduce pill burden, and improve patient outcomes for both drug-susceptible (DS) and drug-resistant (DR) TB, and they apply to settings where mycobacterial cultures, molecular and phenotypic drug susceptibility tests, and radiographic studies are routinely available. A separate news release from CIDRAP highlights the significance of these shorter, all-oral regimens for adults and children. Directly observed therapy (DOT) remains the standard of care.
Key Recommendations:
Four-Month Regimen for DS-TB in Adults:
- For people aged 12 years or older with isoniazid- and rifampin-susceptible pulmonary TB, a new four-month regimen of isoniazid, rifapentine, moxifloxacin, and pyrazinamide (2HPZM/2HPM) is conditionally recommended. This shortened course is based on a large, randomized trial (Study 31/A5349) demonstrating noninferior efficacy compared to the standard six-month regimen (84.6% vs 85.4% cure, respectively), no increase in adverse events, and potential benefits in completion rates. Exclusions include TB meningitis and other complicated forms of extrapulmonary TB, and clinicians should obtain rapid fluoroquinolone susceptibility tests before initiating this regimen.
Four-Month Regimen for DS-TB in Children:
- For children and adolescents aged 3 months to 16 years with nonsevere, drug-susceptible pulmonary TB, a four-month regimen of isoniazid, rifampin, pyrazinamide, and ethambutol for the initial phase, followed by isoniazid and rifampin, is strongly recommended. Evidence from the SHINE trial showed high success (97.1% vs 96.9%) and similar safety with the shorter course compared to the 6-month regimen. Nonsevere TB generally excludes extensive cavitary disease, advanced extrapulmonary TB, or complicated forms. Close clinical and radiographic follow-up is important to confirm effective cure.
Six-Month BPaL Regimen for Rifampin-Resistant, Fluoroquinolone-Resistant or Intolerant TB:
- For rifampin-resistant (RR) pulmonary TB with resistance or patient intolerance to fluoroquinolones in adolescents aged 14 and older and adults, a six-month all-oral bedaquiline, pretomanid, and linezolid (BPaL) regimen is strongly recommended, replacing much longer regimens that often included injectables. Clinical trials (Nix-TB, ZeNix) demonstrated higher cure rates and lower toxicity with this regimen compared to longer regimens, though vigilance is needed for linezolid-related adverse events (e.g., neuropathy, myelosuppression). Baseline and monthly lab and ECG checks are advised.
Six-Month BPaLM Regimen for Rifampin-Resistant, Fluoroquinolone-Susceptible TB:
- For RR pulmonary TB that remains fluoroquinolone-susceptible in adolescents aged 14 and older and adults, a six-month bedaquiline, pretomanid, linezolid, and moxifloxacin (BPaLM) regimen is strongly recommended over traditional 15-month or longer regimens in patients with MDR/RR-TB. Data from the TB-PRACTECAL trial showed high success rates and fewer serious adverse events. BPaLM is the first-line recommendation for this group. Close monitoring of cardiac status (QTc prolongation) and blood counts is advised.
Both BPaL and BPaLM regimens require detailed drug susceptibility testing and cautious management of potential drug–drug interactions, particularly for patients with comorbidities or HIV infection. Of note, the certainty of evidence for the outcomes in the DR-TB trials was rated as very low, due to multiple factors including bias, small event numbers, lack of blinding, and inconsistent outcomes.
Conclusion: These new recommendations markedly shorten TB treatment courses for adults and children in low-incidence settings with access to appropriate diagnostic tools, while avoiding injectables and reducing serious toxicities. By replacing older, more complex regimens with all-oral, shorter-duration therapy, and using DOT as the standard of care, the guidelines aim to improve adherence, lessen the burden on healthcare systems, and enhance patient quality of life. Ongoing research will further refine dosing, safety for special populations (e.g., pregnant individuals), and the role of advanced drug susceptibility testing.
Reference:
Jussi J. Saukkonen, Raquel Duarte, Sonal S. Munsiff, et al. “Updates on the Treatment of Drug-Susceptible and Drug-Resistant Tuberculosis: An Official ATS/CDC/ERS/IDSA Clinical Practice Guideline.” American Journal of Respiratory and Critical Care Medicine, (2025). https://doi.org/10.1164/rccm.202410-2096ST
News release commentary: “New guidelines expand recommendations for shorter, all-oral TB treatments” (CIDRAP). https://www.cidrap.umn.edu/tuberculosis/new-guidelines-expand-recommendations-shorter-all-oral-tb-treatments
Meta-analysis: Therapeutic-Dose Heparin Reduces 28-Day Mortality in Patients Hospitalized for COVID-19
6 Jan, 2025 | 12:00h | UTC

Background: High rates of thrombotic events and systemic inflammation among COVID-19 hospitalized patients led researchers to test whether intensified anticoagulation strategies could reduce morbidity and mortality. Previous trials yielded conflicting results, partly due to varying doses of anticoagulants—prophylactic, intermediate, or therapeutic—and heterogeneous patient severity. This comprehensive investigation, conducted by the WHO Rapid Evidence Appraisal for COVID-19 Therapies (REACT) Working Group, aimed to clarify the benefits and risks of escalated anticoagulation dosing in patients hospitalized for COVID-19.
Objective: To estimate whether higher-dose anticoagulation (therapeutic or intermediate) improves 28-day all-cause mortality compared with lower-dose anticoagulation (prophylactic or intermediate), and to evaluate secondary outcomes, including progression to mechanical ventilation, thromboembolic events, and major bleeding.
Methods: This prospective meta-analysis included randomized trials comparing higher- versus lower-dose anticoagulation for hospitalized COVID-19 patients. Investigators collected trial-level summary data, focusing primarily on heparins. Dosing categories—therapeutic, intermediate, and prophylactic—were predefined. The main outcome was 28-day mortality; secondary outcomes included progression to invasive mechanical ventilation (IMV), venous or arterial thrombotic events, and major hemorrhage. Data were analyzed using a fixed-effects model, with odds ratios (ORs) pooled across trials.
Results: Overall, 22 trials (over 11 000 total participants) contributed data, primarily evaluating heparins. For therapeutic versus prophylactic-dose heparin, 28-day mortality was significantly reduced (OR, 0.77; 95% CI, 0.64–0.93), especially among patients requiring low-flow oxygen or no supplemental oxygen. Therapeutic dose reduced thromboembolic events (OR 0.48; 95% CI, 0.36-0.64) but increased major bleeding (OR 1.90; 95% CI, 1.19-3.05) compared to prophylactic dose. In contrast, when therapeutic was compared to intermediate-dose heparin, the summary OR for 28-day mortality was 1.21 (CI, 0.93–1.58), suggesting a potential trend toward higher mortality that did not reach statistical significance. Intermediate versus prophylactic-dose comparisons revealed no conclusive mortality difference (OR, 0.95; CI, 0.76–1.19). Across all higher-dose arms, thromboembolic events decreased, while the risk of major bleeding increased, underscoring the delicate risk–benefit balance. Subgroup analyses by respiratory support level, D-dimer, and baseline severity did not indicate strong interaction effects, although sample sizes were limited in more severe illness subgroups.
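The fixed-effects model used here is, in essence, inverse-variance weighting of log odds ratios. A self-contained sketch with hypothetical trial-level inputs (not the REACT data), recovering each trial's standard error from its reported 95% CI:

```python
from math import exp, log, sqrt

def pool_fixed_effect(ors, ci_los, ci_his):
    """Inverse-variance fixed-effect pooling of odds ratios.
    Each trial's SE is recovered from its 95% CI on the log scale."""
    log_ors = [log(o) for o in ors]
    ses = [(log(hi) - log(lo)) / (2 * 1.96) for lo, hi in zip(ci_los, ci_his)]
    weights = [1 / se**2 for se in ses]
    pooled_log = sum(w * l for w, l in zip(weights, log_ors)) / sum(weights)
    se_pooled = sqrt(1 / sum(weights))
    return (exp(pooled_log),
            exp(pooled_log - 1.96 * se_pooled),
            exp(pooled_log + 1.96 * se_pooled))

# Hypothetical per-trial ORs and 95% CIs, for illustration only
or_pooled, lo, hi = pool_fixed_effect(
    ors=[0.70, 0.85, 0.78],
    ci_los=[0.50, 0.60, 0.55],
    ci_his=[0.98, 1.20, 1.10],
)
print(f"Pooled OR {or_pooled:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

With these toy inputs the pooled OR comes out near 0.77 (95% CI roughly 0.63–0.94); the more precise trials (narrower CIs) dominate the weighted average, which is the defining behavior of a fixed-effects analysis.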
Conclusions: Therapeutic-dose heparin reduces 28-day mortality relative to prophylactic-dose in hospitalized patients with COVID-19, mainly among those not requiring invasive ventilation. Mortality was similar or potentially worse when therapeutic was compared to intermediate-dose. Clinicians must weigh the lower rate of thrombotic complications against the higher bleeding risk, particularly in critically ill patients.
Implications for Practice: Although higher anticoagulant dosing appears beneficial for certain hospitalized COVID-19 patients, especially those with mild to moderate respiratory compromise, individualized assessment remains key. Current guidelines broadly recommend prophylactic dosing for the critically ill and suggest considering higher doses only in carefully selected patients. Evolving viral variants and changes in standard of care further complicate direct application of these findings to present-day hospital settings.
Study Strengths and Limitations: Strengths include prospective planning, collaboration with multiple trials, and a large pooled sample. Limitations encompass heterogeneity in dose definitions, partial reliance on published data where individual-level parameters could not be fully harmonized, and potential temporal changes in COVID-19 clinical profiles. Moreover, bleeding severity beyond major hemorrhage was not universally reported, limiting robust safety assessments.
Future Research: Further studies should focus on individualized anticoagulant strategies that consider biomarkers (for example, D-dimer) and evolving treatment protocols. Investigations examining optimal timing, duration, and post-discharge management will help refine anticoagulation practices.
Reference:
The WHO Rapid Evidence Appraisal for COVID-19 Therapies (REACT) Working Group. Anticoagulation Among Patients Hospitalized for COVID-19: A Systematic Review and Prospective Meta-analysis. Annals of Internal Medicine. DOI: https://doi.org/10.7326/ANNALS-24-00800
Shappell CN, Anesi GL. Anticoagulation for COVID-19: Seeking Clarity and Finding Yet More Gray. Annals of Internal Medicine. DOI: https://doi.org/10.7326/ANNALS-24-03244
Review: Nutritional Support in Critically Ill Patients
6 Jan, 2025 | 11:00h | UTC

Introduction: This summary is derived from a state-of-the-art review on nutritional support in the intensive care unit (ICU) published in The BMJ. Critically ill patients experience metabolic disturbances, inflammation, and profound muscle wasting. Nutritional therapy aims to mitigate these effects, though recent randomized controlled trials (RCTs) challenge the dogma of early, aggressive provision of high-calorie and high-protein diets for all ICU patients. Instead, emerging evidence indicates that moderate energy and protein restriction, particularly during the first week, may enhance recovery and reduce complications such as hospital-acquired infections, muscle weakness, and ICU-acquired morbidity. Nonetheless, identifying ideal feeding strategies remains complex, given the dynamic nature of critical illness and the interplay with other interventions such as sedation and physical rehabilitation.
Key Recommendations:
- Individualized Timing and Dose: Limit caloric and protein loads during the acute phase (roughly the first seven days), especially in patients with hemodynamic instability or shock. Later, as patients transition to recovery, gradually increase macronutrient delivery to meet evolving metabolic needs.
- Preferred Feeding Route: Enteral nutrition is generally recommended when the gastrointestinal tract is functional, particularly after shock resolution. Parenteral nutrition can be reserved for prolonged gut dysfunction or inability to meet needs enterally. Studies comparing enteral versus parenteral feeding have shown no clear outcome differences, but early enteral feeding is often favored for physiological and cost reasons.
- Avoid Overfeeding and Overzealous Protein Provision: Several large RCTs (including EFFORT-Protein, EDEN, and NUTRIREA-3) observed no mortality benefit—and in some instances, worse outcomes—when patients received full or high doses of energy and protein in the first week. Metabolic “resistance” and inhibition of protective processes such as autophagy might explain why restricted early feeding sometimes confers advantages.
- Monitoring and Assessment: Traditional tools (NUTRIC, NRS-2002) and biomarkers (albumin, prealbumin) do not reliably predict who benefits from higher or lower feeding levels. Ultrasound or computed tomography to assess muscle mass may hold promise, but no validated approach exists to guide individualized macronutrient targets.
- Micronutrients and Specialized Formulations: Broad-spectrum pharmaconutrients (glutamine, antioxidants, etc.) have not improved outcomes in well-powered trials. Instead, standard vitamin and trace element supplementation consistent with recommended daily allowances appears sufficient in most cases.
- Long-term Rehabilitation: Combined nutritional support and physical exercise are critical for mitigating long-term impacts of ICU-acquired weakness and functional decline. Evidence increasingly highlights the need for prolonged, structured rehabilitation to optimize muscle recovery and quality of life.
Conclusion: Although nutritional support remains central to critical care, it is most effective when carefully adapted to disease phase, patient comorbidities, and evolving organ dysfunction. Key evidence suggests a more conservative approach to energy and protein during the acute phase, followed by gradual escalation and integration with rehabilitation. Ongoing research seeks to identify physiological markers that distinguish when to intensify nutritional therapy and how best to align macronutrient delivery with other therapies to promote muscle function and reduce complications.
Reference: Reignier J, Rice TW, Arabi YM, Casaer M. Nutritional Support in the ICU. BMJ. 2025;388:e077979. DOI: https://doi.org/10.1136/bmj-2023-077979
Avian Influenza A(H5N1) Outbreak Among US Farm Exposures: Clinical Findings and Early Treatment Outcomes
2 Jan, 2025 | 17:01h | UTC

Background: Highly pathogenic avian influenza A(H5N1) has reemerged in the United States with documented infections in poultry and dairy cows since 2021. From March through October 2024, 46 human cases were identified, most of whom were workers engaged in poultry depopulation or dairy-farm activities where infected or presumably infected animals were present.
Objective: To characterize the clinical presentations, exposure settings, and outcomes of individuals with laboratory-confirmed H5N1 infection and to investigate potential routes of transmission, disease severity, and risk to public health.
Methods: Using a standardized case-report form, data were collected on exposure history, symptom onset, and use of personal protective equipment (PPE). Respiratory and conjunctival swabs from symptomatic persons underwent real-time RT-PCR for H5 subtyping at both state laboratories and the Centers for Disease Control and Prevention (CDC). Genetic sequencing was performed on available samples. Investigators also monitored close household contacts to evaluate the risk of secondary transmission. An additional hospitalized patient with no identifiable exposure source was detected through routine influenza surveillance.
Results: Of the 46 adult case patients, 20 were exposed to infected poultry, 25 to infected or presumably infected dairy cows, and 1 had unknown exposure. Among the 45 occupationally exposed patients, illness was mild, with no hospitalizations or deaths. Conjunctivitis was present in 93% of cases; 49% reported fever, and 36% had respiratory symptoms. Fifteen patients had only conjunctivitis, highlighting the utility of conjunctival specimens for detection. Early antiviral therapy with oseltamivir was common, initiated at a median of two days after symptom onset. No additional cases were found among 97 closely monitored household contacts, indicating no evidence of sustained human-to-human transmission. Genetic analyses revealed clade 2.3.4.4b viruses, with some genotypic differences between poultry-related (D1.1 genotype) and cow-related (B3.13 genotype) infections.
Conclusions: In this observational study, H5N1 infections in US adults were generally mild, self-limited, and predominantly associated with conjunctivitis. The absence of critical illness or fatalities contrasts with historical reports of more severe H5N1 disease. Although no ongoing person-to-person transmission was documented, continued vigilance is warranted, given the virus’s potential for rapid adaptation.
Implications for Practice: Occupational health measures, such as consistent PPE use (especially eye protection), timely surveillance, and prompt antiviral treatment, may reduce the impact of H5N1 infections among exposed workers. Clinicians should consider conjunctival sampling for symptomatic patients with relevant animal contact. Policy efforts should focus on improving biosecurity practices in both poultry and dairy settings.
Study Strengths and Limitations: Strengths include systematic surveillance, robust laboratory testing of both respiratory and conjunctival specimens, and early antiviral administration. Limitations involve possible underreporting of mild or asymptomatic cases, incomplete details on exposure duration, and limited data on specific routes of cow-to-human transmission.
Future Research: Further studies should explore viral evolution in cows, the significance of raw milk as a transmission vehicle, and the potential for more severe infections, as highlighted by sporadic reports of severe H5N1 illness worldwide.
Reference: Garg S, Reinhart K, Couture A, Kniss K, Davis CT, Kirby MK, Murray EL, et al. Highly Pathogenic Avian Influenza A(H5N1) Virus Infections in Humans. New England Journal of Medicine. Published December 31, 2024. Link: https://www.nejm.org/doi/full/10.1056/NEJMoa2414610
Editorial: Ison MG, Marrazzo J. The Emerging Threat of H5N1 to Human Health. New England Journal of Medicine. Published December 31, 2024. Link: https://www.nejm.org/doi/full/10.1056/NEJMe2416323
Meta-Analysis: Long Half-Life Phosphodiesterase Inhibitors Reduce HbA1c in Adults with Elevated Baseline Levels
6 Jan, 2025 | 08:00h | UTC

Background: Phosphodiesterase type 5 (PDE5) inhibitors are traditionally used to treat erectile dysfunction and pulmonary arterial hypertension. Recent evidence suggests that PDE5 inhibitors could also be repurposed to lower hemoglobin A1c (HbA1c) in patients with type 2 diabetes. Given the disparity in half-lives among these agents, this meta-analysis focused on whether longer half-life PDE5 inhibitors (tadalafil, PF-00489791) produce a more sustained HbA1c reduction compared to short half-life PDE5 inhibitors (sildenafil, avanafil).
Objective: To evaluate the effect of PDE5 inhibitors on HbA1c levels in individuals with baseline values above 6%, comparing agents with short and long half-lives to assess differential clinical benefits in glycemic control.
Methods: This systematic review and meta-analysis included only randomized controlled trials (RCTs) in which participants received any PDE5 inhibitor for at least four weeks, with control or placebo for comparison. Major databases (Cochrane CENTRAL, PubMed Central, ClinicalTrials.gov, and WHO ICTRP) were searched through September 2024 without language restrictions. Statistical analyses were performed using a random-effects model, reporting mean differences in HbA1c. Secondary outcomes (HOMA-IR, lipid profiles, fasting glucose, and others) were also explored.
Results: Thirteen RCTs were eligible (N=1083). Long half-life agents showed a significant mean reduction of approximately −0.40% in HbA1c (p=0.002) in the overall analysis, whereas short half-life PDE5 inhibitors exhibited no significant change. In more stringent subgroup analyses (≥8 weeks’ duration, exclusive type 2 diabetes, baseline HbA1c ≥6.5%), long half-life PDE5 inhibitors maintained a significant decrease (−0.50%), while short half-life agents paradoxically showed a slight but significant increase (+0.36%, p=0.03). In trials enrolling patients with poorly controlled diabetes (baseline HbA1c near 10%), tadalafil’s HbA1c reductions were considerably larger, aligning with the efficacy of other standard oral antidiabetic medications.
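Meta-analyses of this kind typically pool mean differences with a random-effects model (commonly DerSimonian–Laird), which widens the confidence interval when trials disagree. A sketch of that method with hypothetical HbA1c mean-difference inputs (not the thirteen trials analyzed here):

```python
from math import sqrt

def dersimonian_laird(effects, ses):
    """Random-effects (DerSimonian-Laird) pooling of mean differences."""
    w = [1 / s**2 for s in ses]
    fixed = sum(wi * e for wi, e in zip(w, effects)) / sum(w)
    q = sum(wi * (e - fixed)**2 for wi, e in zip(w, effects))
    c = sum(w) - sum(wi**2 for wi in w) / sum(w)
    tau2 = max(0.0, (q - (len(effects) - 1)) / c)  # between-trial variance
    w_star = [1 / (s**2 + tau2) for s in ses]
    pooled = sum(wi * e for wi, e in zip(w_star, effects)) / sum(w_star)
    se_pooled = sqrt(1 / sum(w_star))
    return pooled, pooled - 1.96 * se_pooled, pooled + 1.96 * se_pooled

# Hypothetical HbA1c mean differences (%) from three toy trials
pooled, lo, hi = dersimonian_laird(
    effects=[-0.10, -0.70, -0.45], ses=[0.12, 0.15, 0.18]
)
print(f"Pooled mean difference {pooled:.2f}% (95% CI {lo:.2f} to {hi:.2f})")
```

With these deliberately heterogeneous toy inputs, the between-trial variance is nonzero and the pooled difference comes out near −0.41% with a wider CI (roughly −0.78 to −0.03) than a fixed-effect analysis would give—the extra width reflects the disagreement among trials.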
Conclusions: Long half-life PDE5 inhibitors appear to confer meaningful reductions in HbA1c, comparable to established oral antidiabetic agents, particularly in patients whose HbA1c is inadequately controlled. In contrast, short half-life PDE5 inhibitors did not show a consistent benefit and may paradoxically raise HbA1c in certain subgroups, although further large-scale studies are warranted to confirm these findings.
Implications for Practice: Long half-life PDE5 inhibitors could serve as an adjunctive therapy in type 2 diabetes management, especially in individuals with higher baseline HbA1c. Yet, caution is advised given limited data on adverse events and the short duration of most included trials. Physicians should remain prudent until more robust evidence, especially in populations with markedly elevated HbA1c, becomes available.
Study Strengths and Limitations: Strengths include a direct comparison between short and long half-life PDE5 inhibitors in a clinically relevant population, plus systematic subgroup analyses. Limitations involve heterogeneity in trial designs, relatively low baseline HbA1c in most participants, and a lack of long-term follow-up data or major clinical endpoints.
Future Research: Subsequent trials should target populations with poorly controlled diabetes (HbA1c ≥9.0%) and assess longer durations (≥3 months) to capture the full impact of PDE5 inhibitor therapy. A deeper examination of combination regimens, pharmacokinetic optimization, and clinical outcomes like cardiovascular events would further clarify the role of these agents in diabetes care.
Reference: Kim J, Zhao R, Kleinberg LR, Kim K. Effect of long and short half-life PDE5 inhibitors on HbA1c levels: a systematic review and meta-analysis. eClinicalMedicine. 2025;80:103035. DOI: https://doi.org/10.1016/j.eclinm.2024.103035
AGA Clinical Practice Update on Managing Portal Vein Thrombosis in Cirrhotic Patients: Expert Review
3 Jan, 2025 | 10:00h | UTC

Introduction: This summary highlights key recommendations from an AGA expert review on portal vein thrombosis (PVT) in cirrhotic patients. PVT is common in cirrhosis, with an estimated five-year incidence of around 11%, and may worsen portal hypertension and elevate mortality. Management is challenging because of limited evidence, the potential complications of both PVT and anticoagulation, and significant heterogeneity regarding clot characteristics, host factors, and cirrhosis severity. This review presents the latest guidance on identifying clinically relevant PVT, selecting anticoagulation, and considering endovascular interventions, including TIPS (transjugular intrahepatic portosystemic shunt).
Key Recommendations:
- No Routine Screening: Asymptomatic patients with compensated cirrhosis do not require regular screening for PVT in the absence of suggestive clinical changes.
- Imaging Confirmation: When Doppler ultrasound reveals suspected PVT, contrast-enhanced CT or MRI is recommended to confirm the diagnosis, exclude malignancy, and characterize clot extent and occlusion.
- Hypercoagulability Testing: Extensive thrombophilia workup is not indicated unless there is family or personal history of thrombotic events, or associated laboratory abnormalities.
- Intestinal Ischemia Management: Patients who develop PVT with evidence of intestinal ischemia should receive prompt anticoagulation and, ideally, multidisciplinary team care involving gastroenterology, hepatology, interventional radiology, hematology, and surgery.
- Observation of Minor or Recent Thrombi: In cirrhotic patients without ischemia, with recent (<6 months) thrombi that are <50% occlusive, close imaging follow-up every three months is a reasonable option to track potential spontaneous clot regression.
- Anticoagulation for Significant PVT: Consider anticoagulation for more extensive or obstructive (>50%) recent PVT, especially if the main portal vein or mesenteric vessels are involved. Candidates for liver transplantation and those with inherited thrombophilia may derive additional benefit.
- Chronic Cavernous PVT: Anticoagulation is generally not advised in patients with long-standing (>6 months) complete occlusion and well-formed collateral channels.
- Variceal Screening: Perform endoscopic variceal screening or ensure adequate prophylaxis, but do not allow this to delay anticoagulation, as earlier initiation is associated with better recanalization outcomes.
- Choice of Anticoagulant: Vitamin K antagonists, low-molecular-weight heparin, and direct oral anticoagulants (DOACs) are all viable options in cirrhosis. DOACs may be appropriate in well-compensated (Child-Turcotte-Pugh class A or certain class B) cirrhosis but should be avoided in class C. Treatment selection should consider patient preferences, monitoring feasibility, and risk of bleeding.
- Duration of Therapy: Reassess clot status with cross-sectional imaging every three months. Continue anticoagulation for transplant-eligible individuals who show partial or complete recanalization, and consider discontinuation in nonresponders after six months if futility is evident.
- TIPS Revascularization: Portal vein revascularization using TIPS may be pursued in patients who have other TIPS indications (like refractory ascites or variceal bleeding) or to improve transplant feasibility by recanalizing portal flow.
Conclusion: PVT in cirrhosis remains a complex clinical issue requiring careful evaluation of clot extent, timing, and the potential need for transplantation. The recommendations presented here underscore prompt imaging, timely anticoagulation for high-risk thrombi, and individualized therapy based on Child-Turcotte-Pugh classification and bleeding risk. When necessary, multidisciplinary collaboration is key to achieving optimal patient outcomes. Prospective randomized trials and standardized classifications of PVT will be instrumental in refining future guidelines.
Reference:
Davis JPE, Lim JK, Francis FF, Ahn J. AGA Clinical Practice Update on Management of Portal Vein Thrombosis in Patients With Cirrhosis: Expert Review. Gastroenterology. 2024. DOI: http://doi.org/10.1053/j.gastro.2024.10.038
RCT: Chlorthalidone Shows No Renal Advantage Over Hydrochlorothiazide Under Equivalent Dosing in Older Adults With Hypertension
3 Jan, 2025 | 09:00h | UTCBackground: Hypertension is a critical factor in chronic kidney disease (CKD) progression and cardiovascular risk. Thiazide-type diuretics, such as chlorthalidone and hydrochlorothiazide, are first-line antihypertensive treatments. However, whether one agent confers stronger renal protection remains contested, especially at doses considered pharmacologically comparable. Prior observational studies suggested potential discrepancies in kidney outcomes and hypokalemia incidence. This secondary analysis of the Diuretic Comparison Project (DCP) further clarifies the comparative effectiveness of chlorthalidone versus hydrochlorothiazide on renal endpoints.
Objective: To evaluate whether chlorthalidone (12.5–25 mg/day) prevents CKD progression more effectively than hydrochlorothiazide (25–50 mg/day) in adults ≥65 years with hypertension and no pre-specified exclusion by renal function.
Methods: The DCP is a pragmatic, open-label randomized clinical trial embedded in Veterans Affairs (VA) facilities across the United States. Between June 1, 2016, and December 31, 2023, patients already receiving hydrochlorothiazide (25 or 50 mg/day) for hypertension were randomized either to continue that medication or switch to chlorthalidone (12.5–25 mg/day), reflecting equivalent potency.
The prespecified primary kidney outcome was a composite of doubling of serum creatinine, a terminal estimated glomerular filtration rate (eGFR) <15 mL/min, or dialysis initiation. Secondary measures included ≥40% eGFR decline, incident CKD (new eGFR <60 mL/min), eGFR slope, and relevant adverse events. Laboratory data were obtained through usual clinical care rather than protocol-driven testing.
Results: Among 13,523 randomized participants, 12,265 had analyzable renal data (mean [SD] age, 71 [4] years; 96.8% male). The mean (SD) follow-up was 3.9 (1.3) years. Chlorthalidone did not demonstrate superiority over hydrochlorothiazide for the composite kidney endpoint (6.0% vs 6.4%; hazard ratio, 0.94; 95% CI, 0.81–1.08; P=.37). Additional analyses showed no differences in CKD incidence, ≥40% eGFR decline, or eGFR slope. Hypokalemia occurred more frequently in chlorthalidone users (overall ~2% higher rate of low potassium measurements), and hospitalizations for hypokalemia also trended higher.
Conclusions: Under dosing regimens designed to achieve equivalent antihypertensive potency, chlorthalidone provided no measurable renal benefit over hydrochlorothiazide but posed a modestly elevated risk of hypokalemia. These findings reinforce the clinical interchangeability of both agents for long-term blood pressure management in older adults, provided serum potassium is monitored.
Implications for Practice: Clinicians can confidently employ either chlorthalidone or hydrochlorothiazide in older patients with hypertension, including those with mild or moderate CKD, since renal deterioration rates did not differ significantly. Importantly, the trial used half the milligram amount of chlorthalidone (12.5–25 mg/day) to match the usual doses of hydrochlorothiazide (25–50 mg/day). Recognizing this equivalence helps guide therapy transitions and dosing decisions. Vigilant monitoring of electrolytes remains essential, particularly when prescribing chlorthalidone, given the slightly higher incidence of hypokalemia.
Study Strengths and Limitations: Strengths include the randomized design, broad participant inclusion, and pragmatic structure that mirrors real-world prescribing. Limitations involve potential underestimation or overestimation of renal events due to reliance on routine (rather than scheduled) lab tests. Also, nearly all participants had prior hydrochlorothiazide exposure, which may have influenced tolerance and adherence patterns.
Future Research: Further clinical trials focusing on more advanced CKD stages, distinct comorbidities, or combination regimens (e.g., with potassium-sparing agents) would expand our understanding of how thiazide-type diuretics influence long-term kidney outcomes. Extended follow-up or additional subgroup analyses could also shed light on the interplay of dose-response effects in highly vulnerable populations.
Reference: Ishani A, Hau C, Raju S, et al. Chlorthalidone vs Hydrochlorothiazide and Kidney Outcomes in Patients With Hypertension: A Secondary Analysis of a Randomized Clinical Trial. JAMA Netw Open. 2024;7(12):e2449576. DOI: http://doi.org/10.1001/jamanetworkopen.2024.49576
Systematic Review and Bayesian Meta-Analysis: Higher Protein Delivery May Increase Mortality in Critically Ill Patients
2 Jan, 2025 | 08:30h | UTCBackground: Nutritional guidelines often recommend higher protein doses (approximately 1.2–2.0 g/kg/d) to mitigate muscle loss in critically ill patients. However, recent multicenter trials have raised concerns that elevated protein targets might increase mortality and adversely affect patient-centered outcomes. This study applied a Bayesian approach to synthesize current evidence regarding the effect of higher versus lower protein delivery on mortality, infections, mechanical ventilation duration, and health-related quality of life in critically ill adults.
Objective: To estimate the probability of beneficial or harmful effects of increased protein delivery on clinically important outcomes, with emphasis on quantifying the likelihood of mortality benefit versus risk.
Methods: A systematic review and Bayesian meta-analysis were conducted according to a preregistered protocol (PROSPERO CRD42024546387) and PRISMA 2020 guidelines. Twenty-two randomized controlled trials comparing higher (mean 1.5 g/kg/d) versus lower (mean 0.9 g/kg/d) protein delivery in adult ICU patients were included, ensuring similar energy intake in both groups. A hierarchical random-effects Bayesian model was applied, using vague priors to estimate relative risks for mortality and infections, mean differences for ventilator days, and standardized mean differences for quality of life.
Results: A total of 4,164 patients were analyzed. The posterior probability that higher protein intake increases mortality was 56.4%, compared with a 43.6% probability of any mortality benefit. Probabilities for a clinically relevant (≥5%) mortality decrease were low (22.9%), while the probability of at least a 5% increase reached 32.4%. Infections were slightly more likely with higher protein, although the likelihood of a major detrimental effect remained modest. The probability of a clinically meaningful difference in ventilator days was negligible, suggesting near equivalence for that endpoint. Conversely, quality of life might be negatively impacted by higher protein dosing, although few trials measured this outcome.
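The posterior probabilities above come from a hierarchical Bayesian model fitted to trial-level data. The underlying idea can be illustrated with a far simpler conjugate sketch; the event counts below are hypothetical (the pooled data are not reproduced here), so the printed probabilities are illustrative only, not the study's estimates:

```python
import numpy as np

# Hypothetical pooled event counts, NOT the meta-analysis' trial-level data
deaths_high, n_high = 420, 2082   # higher-protein arm (illustrative)
deaths_low,  n_low  = 400, 2082   # lower-protein arm (illustrative)

rng = np.random.default_rng(42)
draws = 100_000

# Conjugate Beta(1, 1) posteriors for each arm's mortality risk
p_high = rng.beta(1 + deaths_high, 1 + n_high - deaths_high, draws)
p_low  = rng.beta(1 + deaths_low,  1 + n_low  - deaths_low,  draws)

rr = p_high / p_low
print(f"P(RR > 1)    = {np.mean(rr > 1):.3f}")     # posterior probability of any harm
print(f"P(RR > 1.05) = {np.mean(rr > 1.05):.3f}")  # probability of >=5% relative harm
```

The key feature mirrored here is that a Bayesian analysis yields direct probabilities of harm or benefit at any chosen threshold, rather than a single p value.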
Conclusions: Under a Bayesian framework, current evidence suggests that high protein delivery in critically ill patients might pose a meaningful risk of increased mortality. Although a beneficial effect cannot be fully excluded, its probability appears comparatively small. These findings challenge the longstanding assumption that more protein universally translates to better outcomes.
Implications for Practice: Clinicians should exercise caution when aiming for higher protein targets. Individual patient characteristics, such as severity of illness, renal function, and underlying comorbidities, may modulate outcomes. The data support considering a personalized protein prescription rather than routinely pushing intake beyond conventional targets.
Study Strengths and Limitations: Strengths include a robust Bayesian analysis that evaluates probabilities of both benefit and harm across multiple thresholds, as well as the inclusion of recently published large trials. Limitations involve heterogeneity in protein dosing strategies, potential publication bias (indicated by Egger’s test), and limited data on quality of life.
Future Research: Ongoing trials, such as TARGET Protein and REPLENISH, will provide valuable insights into optimal protein dosing, particularly in specific subgroups. Further investigation should explore mechanistic underpinnings of how high protein intake could adversely affect recovery in critically ill patients.
Reference: Heuts S, Lee ZY, Lew CCH, et al. Higher Versus Lower Protein Delivery in Critically Ill Patients: A Systematic Review and Bayesian Meta-Analysis. Critical Care Medicine. December 27, 2024. DOI: http://doi.org/10.1097/CCM.0000000000006562
Management of Adult Sepsis in Resource-Limited Settings: A Global Delphi-Based Consensus
26 Dec, 2024 | 02:06h | UTCIntroduction: This summary presents key points from a recent expert consensus on managing adult sepsis under limited-resource conditions, where patients may lack access to an ICU bed, advanced monitoring technologies, or sufficient staffing. The statements were developed through a Delphi process involving an international panel of clinicians, aiming to complement existing sepsis guidelines by focusing on pragmatic approaches and context-specific adaptations. These consensus statements address unique challenges such as limited diagnostic tests, alternative strategies for hemodynamic monitoring, and management of sepsis in areas with tropical infections.
Key Recommendations:
- Location of Care and Transfer
- When an ICU bed is unavailable, care can be provided in a non-ICU setting if minimum monitoring (neurological status, blood pressure, peripheral perfusion) is ensured.
- Before transferring a patient, ensure airway patency, initiate intravenous fluids and antimicrobials, and maintain safe transport conditions.
- Incorporate telemedicine or phone consultation with critical care specialists whenever feasible.
- Diagnostic Considerations
- Employ screening tools (e.g., qSOFA) in areas with limited resources, acknowledging their diagnostic limitations.
- Use clinical parameters like altered mental state, capillary refill time (CRT), and urine output to gauge tissue perfusion when lactate measurement is unavailable.
- Insert an indwelling urinary catheter in septic shock to monitor urine output accurately, balancing infection risks against close monitoring needs.
- Hemodynamic Management
- Rely on clinical indicators (CRT, urine output) to guide fluid resuscitation when serum lactate is not accessible.
- Use fluid responsiveness tests (e.g., passive leg raising, pulse pressure variation) if advanced hemodynamic monitoring is impractical.
- Consider balanced solutions such as Ringer’s lactate or Hartmann’s solution for fluid resuscitation.
- Recognize that patients with tropical infections (e.g., malaria, dengue) may require cautious fluid volumes to avoid overload.
- Initiate epinephrine if norepinephrine or vasopressin is unavailable, and use vasopressors through peripheral lines if central access cannot be established.
- Antimicrobial Therapy
- Administer antibiotics without delay (ideally within one hour) in suspected sepsis or septic shock.
- In severe infections of parasitic origin (e.g., malaria), start antiparasitic agents promptly.
- In settings where laboratory investigations are limited, begin broad-spectrum antimicrobial coverage when infection cannot be ruled out.
- De-escalate or discontinue therapy based on clinical improvement, declining white blood cell counts, and adequate source control.
- Respiratory Support
- For acute hypoxemic respiratory failure in septic patients, noninvasive ventilation (NIV) can be used if high-flow nasal oxygen is not available, provided close monitoring for potential failure is ensured.
Conclusion: These consensus-based statements offer practical guidance for clinicians treating sepsis in resource-limited environments. By adapting globally accepted recommendations and incorporating alternative strategies—such as clinical markers of perfusion, use of peripheral vasopressors, and prioritizing immediate antimicrobial therapy—these principles aim to improve patient outcomes where healthcare resources are scarce. Further research and context-specific adaptations will be essential to address remaining uncertainties and refine these expert recommendations.
Reference:
Thwaites L, Nasa P, Abbenbroek B, et al. Management of adult sepsis in resource-limited settings: global expert consensus statements using a Delphi method. Intensive Care Medicine. 2024. DOI: https://doi.org/10.1007/s00134-024-07735-7
VisionFM: A Generalist AI Surpasses Single-Modality Models in Ophthalmic Diagnostics
25 Dec, 2024 | 13:41h | UTCBackground: Ophthalmic AI models typically address single diseases or modalities. Their limited generalizability restricts broad clinical application. This study introduces VisionFM, a novel foundation model trained on 3.4 million images from over 500,000 individuals. It covers eight distinct ophthalmic imaging modalities (e.g., fundus photography, OCT, slit-lamp, ultrasound, MRI) and encompasses multiple diseases. Compared with prior single-task or single-modality approaches, VisionFM’s architecture and large-scale pretraining enable diverse tasks such as disease screening, lesion segmentation, prognosis, and prediction of systemic markers.
Objective: To develop and validate a generalist ophthalmic AI framework that can handle multiple imaging modalities, recognize multiple diseases, and adapt to new clinical tasks through efficient fine-tuning, potentially easing the global burden of vision impairment.
Methods: VisionFM employs individual Vision Transformer–based encoders for each of the eight imaging modalities, pretrained with self-supervised learning (iBOT) focused on masked image modeling. After pretraining, various task-specific decoders were fine-tuned for classification, segmentation, and prediction tasks. The model was evaluated on 53 public and 12 private datasets, covering eight disease categories (e.g., diabetic retinopathy, glaucoma, cataract), five imaging modalities (fundus photographs, OCT, etc.), plus additional tasks (e.g., MRI-based orbital tumor segmentation). Performance metrics included AUROCs, Dice similarity coefficients, F1 scores, and comparisons with ophthalmologists of varying clinical experience.
Results: VisionFM achieved an average AUROC of 0.950 (95% CI, 0.941–0.959) across eight disease categories in internal validation. External validation showed AUROCs of 0.945 (95% CI, 0.934–0.956) for diabetic retinopathy and 0.974 (95% CI, 0.966–0.983) for AMD, surpassing baseline deep learning approaches. In a 12-disease classification test involving 38 ophthalmologists, VisionFM’s accuracy matched intermediate-level specialists. It successfully handled modality shifts (e.g., grading diabetic retinopathy on previously unseen OCTA), with an AUROC of 0.935 (95% CI, 0.902–0.964). VisionFM also predicted glaucoma progression (F1, 72.3%; 95% CI, 55.0–86.3) and flagged possible intracranial tumors (AUROC, 0.986; 95% CI, 0.960–1.00) from fundus images.
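AUROC figures with bootstrap-style confidence intervals, as reported above, can be reproduced in principle with a rank-based estimator and a percentile bootstrap. The labels and scores below are simulated stand-ins, not VisionFM outputs:

```python
import numpy as np

def auroc(y_true, scores):
    """Probability that a random positive scores above a random negative
    (ties count half) — the Mann-Whitney formulation of AUROC."""
    y_true, scores = np.asarray(y_true), np.asarray(scores)
    pos, neg = scores[y_true == 1], scores[y_true == 0]
    wins = (pos[:, None] > neg[None, :]).sum() + 0.5 * (pos[:, None] == neg[None, :]).sum()
    return wins / (len(pos) * len(neg))

rng = np.random.default_rng(0)
# Toy validation set: binary labels with partly separated score distributions
y = rng.integers(0, 2, 500)
s = np.where(y == 1, rng.normal(1.5, 1, 500), rng.normal(0, 1, 500))

point = auroc(y, s)

# Percentile bootstrap for a 95% CI, as commonly reported alongside AUROC
boots = []
for _ in range(1000):
    idx = rng.integers(0, len(y), len(y))
    if len(set(y[idx])) == 2:          # resample must contain both classes
        boots.append(auroc(y[idx], s[idx]))
lo, hi = np.percentile(boots, [2.5, 97.5])
print(f"AUROC {point:.3f} (95% CI, {lo:.3f}-{hi:.3f})")
```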
Conclusions: VisionFM offers a versatile, scalable platform for comprehensive ophthalmic tasks. Through self-supervised learning and efficient fine-tuning, it extends specialist-level performance to multiple clinical scenarios and imaging modalities. The study demonstrates that large-scale, multimodal pretraining can enable robust generalization to unseen data, potentially reducing data annotation burdens and accelerating AI adoption worldwide.
Implications for Practice: VisionFM may help address global shortages of qualified ophthalmologists and expand care in low-resource settings, though clinical decision-making still requires appropriate human oversight. Further multicenter studies are needed before widespread implementation, especially for higher-risk use cases such as tumor detection.
Study Strengths and Limitations: Strengths include its unique multimodal design, large-scale pretraining, and extensive external validation. Limitations involve demographic bias toward Chinese datasets, the need for larger cohorts in certain applications (e.g., intracranial tumor detection), and the challenges of matching real-world clinical complexity when only image-based data are used.
Future Research: Further validation in diverse populations, integration of new imaging modalities (e.g., widefield imaging, ultrasound variants), and expansion to additional diseases are planned. Hybridization with large language models could facilitate automatic generation of clinical reports.
Reference: Qiu J, Wu J, Wei H, et al. Development and Validation of a Multimodal Multitask Vision Foundation Model for Generalist Ophthalmic Artificial Intelligence. NEJM AI 2024;1(12). DOI: http://doi.org/10.1056/AIoa2300221
RCT: Avoiding Prophylactic Drain Increases Postoperative Invasive Procedures After Gastrectomy
25 Dec, 2024 | 12:47h | UTCBackground: Prophylactic abdominal drainage following gastrectomy for gastric cancer has been debated for decades. While some Enhanced Recovery After Surgery (ERAS) guidelines discourage routine drains, many surgeons still advocate their use to detect and manage intra-abdominal collections before they become severe. Previous trials were small and underpowered, thus failing to provide robust evidence regarding the real need for prophylactic drains.
Objective: To determine whether omitting a prophylactic drain in gastric cancer surgery leads to a higher likelihood of postoperative invasive procedures (reoperation or percutaneous drainage) within 30 days.
Methods: In this multicenter randomized clinical trial, 404 patients from 11 Italian centers were randomly assigned to either prophylactic drain placement or no drain at the end of subtotal or total gastrectomy. Both academic and community hospitals participated. The primary composite outcome was the rate of reoperation or percutaneous drainage within 30 postoperative days, analyzed via a modified intention-to-treat approach. Secondary endpoints included overall morbidity, anastomotic leaks, length of hospital stay, and 90-day mortality. A parallel invited commentary addressed methodological and clinical perspectives.
Results: Among the 390 patients who underwent resection, 196 had a prophylactic drain and 194 did not. By postoperative day 30, 7.7% of patients in the drain group required reoperation or percutaneous drainage, compared with 15% in the no-drain group. This statistically significant difference was driven by a higher reoperation rate in patients without drains. Both groups had similar anastomotic leak rates (approximately 4% overall). However, patients without prophylactic drains had a higher in-hospital mortality (4.6% vs 0.5%) and were more likely to require escalation of care. There were few drain-related complications, indicating a low risk associated with drain placement. Length of hospital stay and readmission rates were comparable between groups.
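As a rough check on the primary outcome, event counts can be back-calculated from the published rates (about 15/196 with a drain vs 29/194 without) and a Wald confidence interval computed for the absolute risk difference. This is an approximation for illustration, not the trial's actual analysis:

```python
import math

# Event counts back-calculated from the reported rates (approximate):
# 7.7% of 196 drain patients ~ 15 events; 15% of 194 no-drain patients ~ 29
e_drain, n_drain = 15, 196
e_nodrain, n_nodrain = 29, 194

p1 = e_nodrain / n_nodrain        # no-drain group risk
p0 = e_drain / n_drain            # drain group risk
rd = p1 - p0                      # absolute risk difference

# Wald 95% CI for a difference of two proportions
se = math.sqrt(p1 * (1 - p1) / n_nodrain + p0 * (1 - p0) / n_drain)
lo, hi = rd - 1.96 * se, rd + 1.96 * se
print(f"Risk difference {rd:.1%} (95% CI, {lo:.1%} to {hi:.1%})")
```

The interval excludes zero, consistent with the statistically significant difference reported by the investigators.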
Conclusions: Omitting prophylactic drains in gastrectomy was associated with an increased need for postoperative invasive interventions, particularly reoperations. While prior guidelines have recommended against routine drain placement, these findings challenge that stance for total and even subtotal gastrectomies. Surgeons may wish to revisit existing protocols, especially in facilities with fewer resources or lower patient volumes, given the potential reduction in reoperation risk associated with prophylactic drainage.
Implications for Practice: Clinicians should carefully balance possible benefits (earlier detection of fluid collections and reduced reoperations) against potential drawbacks of drain usage. Routine placement may be reconsidered, at least in higher-risk cases or in institutions less equipped for complex salvage procedures.
Study Strengths and Limitations: Key strengths include its robust sample size and standardized criteria for complications. Limitations involve the unblinded nature of postoperative management and the lack of drain fluid amylase measurements to guide removal protocols. Additionally, differentiating total from subtotal gastrectomies might refine selection criteria for prophylactic drainage.
Future Research: Further studies could focus on stratified risk profiles for total vs subtotal gastrectomy and on biomarkers in drain fluid to identify subgroups most likely to benefit from prophylactic drainage.
RCT: Levofloxacin for the Prevention of Multidrug-Resistant Tuberculosis in Vietnam
24 Dec, 2024 | 12:53h | UTCBackground: Multidrug-resistant (MDR) and rifampin-resistant tuberculosis pose significant global health challenges. Preventing active disease among contacts exposed to resistant strains is critical, yet limited evidence exists on targeted chemopreventive interventions. This study investigated whether a six-month course of daily levofloxacin could reduce the incidence of bacteriologically confirmed tuberculosis among household contacts of individuals with confirmed MDR or rifampin-resistant tuberculosis in Vietnam.
Objective: To assess whether levofloxacin prophylaxis decreases the 30-month incidence of active tuberculosis among high-risk contacts. Primary endpoints included bacteriologically confirmed disease, and secondary outcomes encompassed adverse events, mortality, and development of fluoroquinolone-resistant Mycobacterium tuberculosis.
Methods: Researchers conducted a double-blind, placebo-controlled, randomized trial. Eligible participants were household contacts of persons who had started MDR tuberculosis treatment within the previous three months, had a positive tuberculin skin test or immunosuppressive condition, and showed no clinical or radiographic signs of active disease. Enrolled individuals received weight-based oral levofloxacin (up to 750 mg/day) or an identical placebo for 180 days. Monthly visits supported adherence and monitored adverse events. Participants underwent follow-up visits every six months until 30 months for tuberculosis screening, chest radiography, and sputum testing where indicated.
Results: Of 2041 randomized contacts, 1995 (97.7%) completed 30 months of follow-up or reached a primary endpoint. Confirmed tuberculosis was diagnosed in 6 participants (0.6%) in the levofloxacin group and 11 (1.1%) in the placebo group (incidence rate ratio, 0.55; 95% CI, 0.19–1.62), a difference that did not achieve statistical significance. Severe (grade 3 or 4) adverse events were infrequent in both groups, while mild adverse events were more common with levofloxacin (31.9% vs. 13.0%). Acquired fluoroquinolone resistance was not detected.
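Assuming roughly equal person-time per arm (an approximation; the trial analysis used actual follow-up), the reported incidence rate ratio can be reconstructed from the raw event counts with a log-scale Wald interval. The result closely, though not exactly, matches the published 0.19–1.62:

```python
import math

# Crude reconstruction under an equal-person-time assumption
events_levo, events_placebo = 6, 11
irr = events_levo / events_placebo     # matches the reported point estimate of 0.55

# Log-scale Wald CI: SE(log IRR) = sqrt(1/a + 1/b) for Poisson event counts
se = math.sqrt(1 / events_levo + 1 / events_placebo)
lo = math.exp(math.log(irr) - 1.96 * se)
hi = math.exp(math.log(irr) + 1.96 * se)
print(f"IRR {irr:.2f} (95% CI, {lo:.2f}-{hi:.2f})")
```

The interval spans 1.0, which is why the numerically lower incidence with levofloxacin did not reach statistical significance.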
Conclusions: Daily levofloxacin for six months showed a numerically lower incidence of tuberculosis than placebo, but the difference was not statistically significant due to lower-than-expected case counts. Treatment was generally well tolerated; however, higher discontinuation rates occurred among levofloxacin recipients, often due to mild musculoskeletal complaints. Further studies may clarify the role of fluoroquinolone-based regimens in preventing MDR tuberculosis across diverse epidemiologic contexts.
Implications for Practice: These findings suggest that levofloxacin prophylaxis could benefit contacts at high risk of MDR tuberculosis, albeit with caution regarding adherence challenges and low-grade side effects. Broader implementation would require diligent screening, consideration of background fluoroquinolone resistance, and strategies to manage mild adverse events that could undermine treatment completion.
Study Strengths and Limitations: Strengths include a rigorous double-blind, placebo-controlled design, nearly complete follow-up, and thorough exclusion of prevalent tuberculosis at baseline. Limitations involve an unexpectedly low incidence of confirmed disease, limiting statistical power, and a study population with low HIV prevalence, which may reduce generalizability.
Future Research: Further research is necessary to confirm these findings in diverse settings, explore alternative or shorter regimens (including newer agents like delamanid), and investigate optimal approaches for patients with fluoroquinolone-resistant strains. The long-term impact on transmission dynamics and microbiome shifts also warrants additional investigation.
2025 ASA Practice Advisory for the Perioperative Care of Older Adults Undergoing Inpatient Surgery
23 Dec, 2024 | 20:27h | UTCIntroduction: This summary outlines the American Society of Anesthesiologists (ASA) 2025 advisory on optimizing perioperative care for older adults (age 65 years or older) undergoing inpatient surgery. It focuses on preoperative, intraoperative, and postoperative measures to mitigate cognitive complications, especially delirium and longer-term cognitive decline, in a population that is highly vulnerable to functional deterioration and loss of independence. The recommendations are based on systematic reviews and meta-analyses, supplemented by expert consensus where evidence is limited. Although not intended as strict standards of care, these advisory statements provide practical guidance that can be adapted to local contexts and patient-specific needs.
Key Recommendations:
- Expanded Preoperative Evaluation:
- Incorporate frailty assessment, cognitive screening, and psychosocial or nutritional evaluations into routine preoperative workups for older patients.
- Patients identified with frailty or cognitive deficits should receive targeted interventions, such as geriatric co-management, deprescribing when indicated, and early family education about delirium risks.
- Evidence suggests a modest decrease in postoperative delirium when such evaluations are included.
- Choice of Primary Anesthetic (Neuraxial vs. General):
- Current studies do not demonstrate a clear advantage of neuraxial over general anesthesia in reducing postoperative delirium risk.
- Both approaches are acceptable; individualize decisions based on patient factors, surgical requirements, and preference-sensitive discussions.
- Maintenance of General Anesthesia (Total Intravenous vs. Inhaled Agents):
- Data are inconclusive regarding delirium prevention, with no significant difference between total intravenous anesthesia (TIVA) and inhaled volatile agents.
- Some low-level evidence indicates TIVA might reduce short-term cognitive decline, but this effect is inconsistent over longer follow-up.
- Dexmedetomidine for Delirium Prophylaxis:
- Moderate-level evidence supports dexmedetomidine for reducing delirium incidence in older patients, yet its use may increase bradycardia and hypotension.
- Optimal dosing and timing remain uncertain, and baseline patient vulnerability should inform decisions.
- Medications with Potential Central Nervous System Effects:
- Drugs such as benzodiazepines, antipsychotics, anticholinergics, ketamine, and gabapentinoids warrant careful risk-benefit analysis.
- Current findings are inconclusive, suggesting neither a firm endorsement nor outright disapproval; preexisting conditions and polypharmacy should guide individualized treatment plans.
Conclusion: Preserving cognitive function and independence in older adults undergoing inpatient surgery is a growing priority. These recommendations highlight the importance of comprehensive preoperative screenings (frailty, cognition, and psychosocial domains), shared decision-making when choosing anesthetic techniques, and thoughtful use of pharmacologic agents. While dexmedetomidine shows promise in mitigating delirium, vigilance regarding hypotension and bradycardia is essential. Ultimately, these strategies aim to reduce anesthesia-related complications in older patients by addressing the multifaceted determinants of postoperative cognitive outcomes.
RCT: Early Restrictive vs Liberal Oxygen Strategy in Severe Trauma – No Significant Outcome Difference
22 Dec, 2024 | 17:21h | UTCBackground: The Advanced Trauma Life Support (ATLS) guidelines recommend providing supplemental oxygen to severely injured patients in the early phase after trauma, although the evidence base is limited. Observational research suggests that liberal oxygen administration may raise the risk of death and respiratory complications. Therefore, the TRAUMOX2 trial examined whether an 8-hour restrictive oxygen strategy (targeting an SpO₂ of 94%) could improve outcomes compared with a liberal strategy (12–15 L/min or FiO₂ 0.6–1.0) initiated prehospital or upon trauma center admission.
Objective: To determine whether an early restrictive oxygen approach, as compared with a liberal approach, reduces the composite outcome of death and/or major respiratory complications (pneumonia or ARDS) within 30 days in severely injured adults.
Methods: This investigator-initiated, international, multicenter, open-label, randomized controlled trial enrolled patients aged 18 years or older with blunt or penetrating trauma requiring full trauma team activation and anticipated hospital stay of at least 24 hours. Randomization occurred either prehospital or upon trauma center arrival in a 1:1 ratio to restrictive (lowest dose of oxygen to maintain SpO₂ at 94%) versus liberal therapy (12–15 L/min via nonrebreather mask or FiO₂ 0.6–1.0). The intervention lasted 8 hours, with all other management per standard care. The primary outcome—death or major respiratory complications (pneumonia per CDC criteria or ARDS per the Berlin definition)—was evaluated by blinded assessors within 30 days. Statistical analyses employed logistic regression, adjusted for stratification variables.
Results: Among 1979 randomized patients, 1508 completed the study (median age, 50 years; Injury Severity Score [ISS], 14). The composite primary outcome occurred in 16.1% (118/733) of restrictive-group patients and 16.7% (121/724) of liberal-group patients (odds ratio, 1.01; 95% CI, 0.75–1.37; p=0.94). Mortality alone (8.6% vs 7.3%) and major respiratory complications alone (8.9% vs 10.8%) showed no significant differences between groups. Adverse and serious adverse events were similar, except atelectasis was less frequent in the restrictive group (27.6% vs 34.7%).
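As a rough check of the reported proportions, the crude (unadjusted) odds ratio can be recomputed from the event counts above. This is an illustrative sketch only: the function name is mine, and the paper's odds ratio of 1.01 came from logistic regression adjusted for stratification variables, so the crude value differs slightly.

```python
# Crude odds ratio from the reported TRAUMOX2 primary-outcome counts.
# Illustrative only; the published OR (1.01) was covariate-adjusted.
def odds_ratio(events_a, total_a, events_b, total_b):
    """Unadjusted odds ratio of group A relative to group B."""
    odds_a = events_a / (total_a - events_a)
    odds_b = events_b / (total_b - events_b)
    return odds_a / odds_b

restrictive = (118, 733)  # events / patients analyzed, restrictive group
liberal = (121, 724)      # events / patients analyzed, liberal group

or_crude = odds_ratio(*restrictive, *liberal)
print(f"Crude OR (restrictive vs liberal): {or_crude:.2f}")  # ~0.96
```

The small gap between the crude and adjusted estimates is expected whenever randomization is stratified and the analysis conditions on the stratification variables.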
Conclusions: In severely injured trauma patients, an 8-hour restrictive oxygen strategy did not significantly reduce death or major respiratory complications compared with a liberal strategy. Both approaches produced similar 30-day outcomes. Nevertheless, restricting oxygen may limit atelectasis and could be a reasonable alternative to giving high-flow oxygen to all trauma patients.
Implications for Practice: Clinicians may choose to target approximately 94% SpO₂ in the early trauma phase without compromising major outcomes. This approach potentially avoids the risks of hyperoxia, though no definitive survival benefit was identified. Pragmatic implementation of a conservative oxygen strategy seems feasible in diverse prehospital and hospital settings.
Study Strengths and Limitations: Notable strengths include multicenter design, randomized enrollment in prehospital and in-hospital settings, and blinded outcome assessment. Limitations include postrandomization exclusions of patients with minor injuries, a relatively short intervention period (8 hours), and an overall open-label design. These factors, along with lower-than-expected event rates, may have limited the power to detect differences in mortality. Commentary from https://bit.ly/bottomline_traumox2 also highlights that the median ISS of 14 indicates moderate rather than extremely severe trauma, possibly contributing to the modest event rates.
Future Research: Large-scale studies with extended intervention durations and targeted subgroups (e.g., severe traumatic brain injury) could clarify optimal oxygen thresholds in trauma care. Ongoing trials with larger sample sizes may better capture smaller but clinically meaningful differences in mortality or complications.
Guideline: Doxycycline Postexposure Prophylaxis to Reduce Bacterial STI Incidence in High-Risk Populations
19 Dec, 2024 | 22:32h | UTC
Introduction: This summary presents key recommendations from the 2024 Centers for Disease Control and Prevention (CDC) guidelines on using doxycycline postexposure prophylaxis (doxyPEP) to prevent bacterial sexually transmitted infections (STIs), including syphilis, gonorrhea, and chlamydia. Targeting men who have sex with men (MSM) and transgender women with at least one bacterial STI in the past 12 months, these guidelines aim to reduce recurrence rates and improve sexual health outcomes through timely prophylactic intervention.
Key Recommendations:
- Offer doxyPEP counseling to MSM and transgender women with a recent bacterial STI history, addressing the benefits, harms, and uncertainties of prophylactic doxycycline use.
- Advise eligible patients to take a single 200 mg dose of doxycycline as soon as possible (ideally within 72 hours) following condomless oral, anal, or vaginal sexual exposure to reduce their subsequent STI risk.
- Reinforce periodic screening (every 3–6 months) for STI markers, including syphilis and HIV serologies, as well as nucleic acid amplification tests for gonorrhea and chlamydia at relevant anatomical sites.
- Integrate doxyPEP into comprehensive sexual health services that include risk-reduction counseling, condom use, recommended immunizations, and linkage to HIV preexposure prophylaxis (PrEP) or HIV care, thereby enhancing overall prevention strategies.
- Consider extending doxyPEP to other high-risk groups, including heterosexual individuals with recurrent STIs, guided by clinical judgment and shared decision-making.
- Monitor and address adverse events, particularly gastrointestinal symptoms, and acknowledge the potential for antimicrobial resistance. Continued vigilance is warranted given the risk of resistance in commensal flora and key STI pathogens, such as Neisseria gonorrhoeae.
- Assess social and ethical dimensions of doxyPEP implementation, ensuring equitable access and minimizing potential harms, including stigma or intimate partner violence related to prophylaxis disclosure.
Conclusion: Implementing doxyPEP for MSM and transgender women who have experienced a recent bacterial STI can substantially lower the incidence of recurrent infections. By combining prophylactic doxycycline with routine surveillance, comprehensive preventive services, and careful consideration of resistance patterns, clinicians may enhance patient care and strengthen STI control efforts. Further investigation is needed to establish efficacy in cisgender women, transgender men, nonbinary persons, and other populations at risk. Longer-term, population-based studies focused on antimicrobial resistance and community-level effects will help guide sustainable and equitable use of this prevention strategy.
RCT: A Single Dose of Ceftriaxone Reduces Early Ventilator-Associated Pneumonia in Acute Brain Injury Patients
17 Dec, 2024 | 12:26h | UTC
Background: Patients with acute brain injury are at increased risk for early ventilator-associated pneumonia (VAP), which can worsen their clinical course. Although short-term antibiotic prophylaxis has been considered, its utility remains uncertain. This study evaluated whether a single early dose of ceftriaxone could reduce the incidence of early VAP in these patients.
Objective: To determine if a single 2-g intravenous dose of ceftriaxone administered within 12 hours of intubation reduces the incidence of early VAP (day 2 to day 7 of mechanical ventilation) in comatose adults (Glasgow Coma Scale ≤12) requiring prolonged mechanical ventilation after acute brain injury.
Methods: This multicenter, randomized, double-blind, placebo-controlled, assessor-masked superiority trial was conducted in nine ICUs across eight French university hospitals. Patients with acute brain injury from trauma, stroke, or subarachnoid hemorrhage who required at least 48 hours of mechanical ventilation were enrolled. Participants received either ceftriaxone 2 g or placebo once, early after endotracheal intubation. All patients received standard VAP prevention measures, but no selective oropharyngeal or digestive decontamination. The primary endpoint was the incidence of early VAP confirmed by blinded assessors using standard clinical, radiological, and microbiological criteria.
Results: Among 319 patients included in the analysis (162 ceftriaxone, 157 placebo), early VAP incidence was significantly lower with ceftriaxone (14%) compared to placebo (32%) (HR 0.60 [95% CI 0.38–0.95]; p=0.030). Patients receiving ceftriaxone had fewer overall VAP episodes, fewer ventilator and antibiotic exposure days, shorter ICU and hospital stays, and reduced 28-day mortality (15% vs 25%). No significant increase in resistant organisms or adverse events attributable to ceftriaxone was observed.
Conclusions: A single early dose of ceftriaxone significantly reduced early VAP risk in acute brain injury patients undergoing mechanical ventilation. This prophylactic approach may improve clinical outcomes without evident safety concerns.
Implications for Practice: Incorporating a single early ceftriaxone dose into VAP prevention protocols for brain-injured patients could mitigate early respiratory infections and potentially enhance clinical outcomes. Nonetheless, clinicians should remain cautious, considering overall antibiotic stewardship and the need for further evidence on long-term microbial resistance patterns.
Study Strengths and Limitations: Strengths include a robust, multicenter, double-blind, placebo-controlled design and blinded adjudication of VAP cases. Limitations include the lack of long-term assessment of the intestinal microbiota and antimicrobial resistance. Further investigation is required to confirm the safety profile regarding microbial ecology and to explore neurological outcomes in greater depth.
Future Research: Future studies should examine the long-term effects of this single-dose approach on resistance patterns, microbial flora, and functional neurological recovery.
RCT: Liberal vs Restrictive Transfusion Yields No Neurologic Outcome Benefit in Aneurysmal Subarachnoid Hemorrhage
16 Dec, 2024 | 11:26h | UTC
Background: Aneurysmal subarachnoid hemorrhage (SAH) is a critical neurologic condition associated with high morbidity and mortality. Anemia is common in this setting and may worsen cerebral oxygenation and outcomes. However, the impact of a liberal transfusion threshold compared with a restrictive approach on long-term neurologic outcomes has been uncertain.
Objective: To determine whether a liberal red blood cell transfusion strategy (transfusion at hemoglobin ≤10 g/dL) improves 12-month neurologic outcomes compared with a restrictive strategy (transfusion at hemoglobin ≤8 g/dL) in patients with aneurysmal SAH and anemia.
Methods: This was a multicenter, pragmatic, open-label, randomized controlled trial conducted at 23 specialized neurocritical care centers. Critically ill adults with a first-ever aneurysmal SAH and hemoglobin ≤10 g/dL within 10 days of admission were randomized to a liberal or restrictive transfusion strategy. The primary outcome was unfavorable neurologic outcome at 12 months, defined as a modified Rankin scale score ≥4. Secondary outcomes included the Functional Independence Measure (FIM), quality of life assessments, and imaging-based outcomes such as vasospasm and cerebral infarction. Outcome assessors were blinded to group allocation.
Results: Among 742 randomized patients, 725 were analyzed for the primary outcome. At 12 months, unfavorable neurologic outcome occurred in 33.5% of patients in the liberal group and 37.7% in the restrictive group (risk ratio 0.88; 95% CI, 0.72–1.09; p=0.22). There were no clinically meaningful differences in secondary outcomes. Mortality at 12 months was similar (approximately 27% in both arms). Radiographic vasospasm was detected more often in the restrictive group, but the lower vasospasm rate with liberal transfusion did not translate into better functional outcomes. Adverse events and transfusion reactions were comparable between groups.
Conclusions: In patients with aneurysmal SAH and anemia, a liberal transfusion strategy did not lead to a significantly lower risk of unfavorable neurologic outcome at 12 months compared with a restrictive approach.
Implications for Practice: These findings suggest that routinely maintaining higher hemoglobin levels does not confer substantial long-term functional benefit. Clinicians may consider a more restrictive threshold (≤8 g/dL) to minimize unnecessary transfusions without compromising outcomes. Some skepticism toward adopting a more liberal transfusion policy is warranted given the lack of demonstrable benefit.
Study Strengths and Limitations: Strengths include the randomized, multicenter design, blinded outcome assessment, and a 12-month follow-up. Limitations include potential unmeasured subtle benefits, the inability to blind clinical teams, and the challenge of capturing all aspects of functional recovery with current measurement tools. Further research may clarify if more tailored transfusion strategies can yield modest but meaningful improvements.
Future Research: Future studies should evaluate intermediate hemoglobin thresholds, develop more sensitive measures of functional and cognitive recovery, and consider individualized transfusion strategies based on specific patient factors and biomarkers of cerebral ischemia.
Guidelines for the Management of Hyperglycemic Crises in Adult Patients with Diabetes
15 Dec, 2024 | 13:18h | UTC
Introduction: Diabetic ketoacidosis (DKA) and hyperglycemic hyperosmolar state (HHS) are critical, acute complications of type 1 and type 2 diabetes. Recent data show a global rise in DKA and HHS admissions, driven by factors such as psychosocial challenges, suboptimal insulin use, infection, and certain medications (e.g., SGLT2 inhibitors). This consensus report, developed by leading diabetes organizations (ADA, EASD, JBDS, AACE, DTS), provides updated recommendations on epidemiology, pathophysiology, diagnosis, treatment, and prevention of DKA and HHS in adults, aiming to guide clinical practice and improve outcomes.
Key Recommendations:
- Diagnosis and Classification:
- DKA is defined by hyperglycemia (>11.1 mmol/l [200 mg/dl] or known diabetes), elevated ketone levels (β-hydroxybutyrate ≥3.0 mmol/l), and metabolic acidosis (pH <7.3 or bicarbonate <18 mmol/l).
- HHS is characterized by marked hyperglycemia, severe hyperosmolality (>320 mOsm/kg), significant dehydration, and minimal ketonaemia or acidosis.
- Consider euglycemic DKA, especially with SGLT2 inhibitor use.
- Classify DKA severity (mild, moderate, severe) to guide the setting of care.
- Fluid and Electrolyte Management:
- Initiate isotonic or balanced crystalloid solutions to restore intravascular volume, enhance renal perfusion, and reduce hyperglycemia.
- Adjust fluids based on hydration, sodium levels, and glucose trends.
- Add dextrose when glucose falls below ~13.9 mmol/l (250 mg/dl) to allow ongoing insulin therapy until ketoacidosis resolves.
- Carefully monitor potassium and provide adequate replacement to prevent severe hypokalemia.
- Insulin Therapy:
- Start a continuous intravenous infusion of short-acting insulin as soon as feasible after confirming adequate potassium.
- For mild or moderate DKA, subcutaneous rapid-acting insulin analogs may be used under close supervision.
- Continue insulin until DKA resolves (pH ≥7.3, bicarbonate ≥18 mmol/l, β-hydroxybutyrate <0.6 mmol/l) or HHS improves (osmolality <300 mOsm/kg, improved mental status).
- Overlap subcutaneous basal insulin by 1–2 hours before discontinuing intravenous insulin to prevent rebound hyperglycemia.
- Additional Considerations:
- Avoid routine bicarbonate; use only if pH <7.0.
- Phosphate supplementation is not routinely recommended unless levels are severely low.
- Identify and treat underlying precipitating causes (infection, psychological factors, medication-related triggers).
- Address social determinants of health and mental health conditions to reduce recurrence.
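The numeric cut-offs in the diagnostic criteria above can be collected into a small helper. This is a minimal sketch for illustration, not a clinical tool: the function and variable names are mine, while the thresholds and the mg/dl conversion mirror the values stated in the recommendations.

```python
# Illustrative sketch of the DKA thresholds listed above (NOT a clinical tool).
# Names are hypothetical; cut-offs mirror the consensus criteria in the text.

MGDL_PER_MMOLL = 18.0  # glucose unit conversion: mg/dl = mmol/l * 18

def glucose_mmoll_to_mgdl(mmoll: float) -> float:
    """Convert a glucose concentration from mmol/l to mg/dl."""
    return mmoll * MGDL_PER_MMOLL

def meets_dka_criteria(glucose_mmoll, bhb_mmoll, ph, bicarbonate_mmoll,
                       known_diabetes=False):
    """DKA per the criteria above: hyperglycemia (>11.1 mmol/l or known
    diabetes), ketones (beta-hydroxybutyrate >=3.0 mmol/l), and metabolic
    acidosis (pH <7.3 or bicarbonate <18 mmol/l)."""
    hyperglycemia = glucose_mmoll > 11.1 or known_diabetes
    ketosis = bhb_mmoll >= 3.0
    acidosis = ph < 7.3 or bicarbonate_mmoll < 18
    return hyperglycemia and ketosis and acidosis

print(glucose_mmoll_to_mgdl(13.9))              # 250.2 -- the ~250 mg/dl dextrose threshold
print(meets_dka_criteria(22.0, 4.1, 7.21, 12))  # True
```

Note how euglycemic DKA (e.g., with SGLT2 inhibitors) is captured by the `known_diabetes` branch: ketosis plus acidosis can satisfy the criteria without marked hyperglycemia.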
Conclusion: By implementing these evidence-based recommendations—early diagnosis, structured fluid and insulin therapy, careful electrolyte management, and addressing precipitating factors—clinicians can improve patient care, reduce morbidity and mortality, and enhance the quality of life for adults experiencing DKA and HHS.
Noninferiority RCT: Omitting Sentinel-Lymph-Node Biopsy Maintains 5-Year Invasive Disease–Free Survival in Early-Stage Breast Cancer
13 Dec, 2024 | 15:15h | UTC
Background: While axillary surgical staging using sentinel-lymph-node biopsy has been a mainstay in early-stage breast cancer management, its necessity in clinically node-negative patients undergoing breast-conserving therapy has been called into question. With tumor biology increasingly guiding treatment decisions, reducing surgical interventions without compromising survival is a major goal.
Objective: To determine whether omitting sentinel-lymph-node biopsy in patients with clinically node-negative, T1 or T2 (≤5 cm) invasive breast cancer undergoing breast-conserving surgery is noninferior to performing sentinel-lymph-node biopsy in terms of 5-year invasive disease–free survival.
Methods: In this prospective, randomized, noninferiority trial, 5502 eligible patients were randomized in a 1:4 ratio to omission of axillary surgery versus sentinel-lymph-node biopsy. The per-protocol population included 4858 patients. All patients received whole-breast irradiation. Invasive disease–free survival, defined as time to any invasive disease event or death, was the primary endpoint. Noninferiority required a 5-year invasive disease–free survival rate ≥85% and a hazard ratio upper limit <1.271.
Results: After a median follow-up of 73.6 months, the 5-year invasive disease–free survival was 91.9% (95% CI, 89.9–93.5) in the omission group and 91.7% (95% CI, 90.8–92.6) in the sentinel-biopsy group (HR, 0.91; 95% CI, 0.73–1.14), meeting noninferiority criteria. Axillary recurrences were slightly higher in the omission group (1.0% vs. 0.3%), though without detrimental effects on overall survival. Patients who omitted axillary surgery experienced lower rates of lymphedema, better arm mobility, and less pain than those undergoing sentinel-lymph-node biopsy.
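The two prespecified noninferiority criteria from the methods can be checked directly against the reported results. A worked sketch under stated assumptions (variable names are mine; the numbers come from the summary above):

```python
# Checking the trial's two noninferiority criteria against the reported
# results (illustrative arithmetic only).
IDFS_FLOOR = 85.0        # required 5-year invasive disease-free survival, %
HR_UPPER_LIMIT = 1.271   # required upper bound of the hazard-ratio CI

idfs_omission = 91.9     # reported 5-year iDFS in the omission group, %
hr_upper_ci = 1.14       # reported upper limit of the 95% CI for the HR

noninferior = idfs_omission >= IDFS_FLOOR and hr_upper_ci < HR_UPPER_LIMIT
print(noninferior)  # True -- both criteria are met
```

Both conditions hold with room to spare, which is why the trial could declare noninferiority despite the slightly higher axillary recurrence rate in the omission group.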
Conclusions: For appropriately selected patients with clinically node-negative, early-stage breast cancer undergoing breast-conserving surgery, omitting sentinel-lymph-node biopsy did not compromise 5-year invasive disease–free survival and yielded fewer surgical complications and better quality of life outcomes.
Implications for Practice: Omitting axillary staging may be considered in low-risk patients, particularly older individuals with small, hormone receptor-positive, HER2-negative tumors. However, clinicians should balance the lack of nodal information against potential alterations in adjuvant therapy decisions, especially regarding radiotherapy and systemic treatment recommendations.
Study Strengths and Limitations: Strengths include a large patient population, rigorous prospective randomization, and substantial follow-up. Limitations include a predominance of low-risk, small, HR-positive/HER2-negative tumors, potentially limiting generalizability. While omitting sentinel-lymph-node biopsy reduces surgical morbidity, the absence of nodal status may influence adjuvant treatment planning.
Future Research: Further studies should assess the applicability of omission strategies to younger patients, larger tumors, or more aggressive subtypes, and explore whether novel biomarkers or imaging methods can reliably guide treatment decisions without axillary surgery.
Retrospective Cohort Study: As-Needed Blood Pressure Medications Associated With Increased AKI and Other Adverse Outcomes in Hospitalized Veterans
8 Dec, 2024 | 21:34h | UTC
Background: Inpatient asymptomatic blood pressure (BP) elevations are common, and clinicians frequently use as-needed BP medications to rapidly lower BP values. However, there is limited evidence supporting this practice, and abrupt BP reductions may increase the risk of ischemic events, including acute kidney injury (AKI).
Objective: To examine whether as-needed BP medication use during hospitalization is associated with increased risk of AKI and other adverse outcomes compared to no as-needed use.
Methods: This retrospective cohort study used a target trial emulation and propensity score matching. Adults hospitalized for ≥3 days in non-ICU VA hospital wards from 2015-2020, who received at least one scheduled BP medication within the first 24 hours and had at least one systolic BP reading >140 mm Hg, were included. Patients were categorized into two groups: those receiving at least one as-needed BP medication (oral or IV) and those receiving only scheduled BP medications. The primary outcome was time-to-first AKI event. Secondary outcomes included a >25% drop in systolic BP within 3 hours and a composite of myocardial infarction (MI), stroke, or death.
Results: Among 133,760 veterans (mean age 71.2 years; 96% male), 21% received as-needed BP medications. As-needed BP medication use was associated with a 23% higher risk of AKI (HR=1.23; 95% CI, 1.18-1.29). The IV route showed a particularly pronounced AKI risk (HR=1.64). Secondary analyses indicated a 1.5-fold increased risk of rapid BP reduction and a 1.69-fold higher rate of the composite outcome (MI, stroke, death) among as-needed users.
Conclusions: In a large, national cohort of hospitalized veterans, as-needed BP medication use was associated with increased AKI risk and other adverse outcomes. These findings suggest that routine as-needed BP medication use for asymptomatic BP elevations may be harmful.
Implications for Practice: Clinicians should carefully reconsider the use of as-needed BP medications in the inpatient setting, especially in older individuals or those with significant cardiovascular risk. Given the lack of clear benefit and potential for harm, greater caution and potentially more conservative approaches are warranted.
Study Strengths and Limitations: Strengths include a large, nationally representative sample and robust analytic methods. Limitations include the retrospective design, potential residual confounding, and limited generalizability to non-veteran or surgical populations. While causal inferences cannot be made, the findings strongly support the need to question current practice.
Future Research: Prospective, randomized trials are needed to determine the optimal management of asymptomatic inpatient hypertension and to assess whether avoiding or reducing as-needed BP medication use improves clinical outcomes.
RCT: FFR-Guided PCI Plus TAVI is Non-inferior and Superior to SAVR Plus CABG in Patients With Severe Aortic Stenosis and Complex Coronary Disease
8 Dec, 2024 | 21:22h | UTC
Background: Patients with severe aortic stenosis frequently present with concomitant complex coronary artery disease. Current guidelines recommend combined surgical aortic valve replacement (SAVR) and coronary artery bypass grafting (CABG) as first-line therapy. However, transcatheter aortic valve implantation (TAVI) and fractional flow reserve (FFR)-guided percutaneous coronary intervention (PCI) have emerged as alternative treatments, and a direct randomized comparison with SAVR plus CABG has been lacking.
Objective: To determine whether FFR-guided PCI plus TAVI is non-inferior and, if demonstrated, superior to SAVR plus CABG in patients with severe aortic stenosis and complex or multivessel coronary disease.
Methods: This international, multicenter, prospective, open-label, non-inferiority randomized controlled trial included patients aged ≥70 years with severe aortic stenosis and complex coronary disease who were deemed suitable for either percutaneous or surgical treatment by a Heart Team. Participants were randomized (1:1) to FFR-guided PCI plus TAVI or SAVR plus CABG. The primary endpoint was a composite of all-cause mortality, myocardial infarction, disabling stroke, clinically driven target-vessel revascularization, valve reintervention, and life-threatening or disabling bleeding at 1 year.
Results: Among 172 enrolled patients, 91 were assigned to FFR-guided PCI plus TAVI and 81 to SAVR plus CABG. At 1 year, the primary endpoint occurred in 4% of patients in the PCI/TAVI group versus 23% in the SAVR/CABG group (risk difference –18.5%; 90% CI –27.8 to –9.7; p<0.001 for non-inferiority; p<0.001 for superiority). The difference was driven mainly by lower all-cause mortality (0% vs 10%, p=0.0025) and reduced life-threatening bleeding (2% vs 12%, p=0.010).
Conclusions: In patients with severe aortic stenosis and complex coronary artery disease, FFR-guided PCI plus TAVI was non-inferior and in fact superior to SAVR plus CABG at 1 year, predominantly due to lower mortality and serious bleeding events.
Implications for Practice: These findings suggest that a percutaneous strategy may be a viable and potentially preferable alternative to surgery in selected patients. Nevertheless, given this is the first trial of its kind, cautious interpretation is advised, and routine adoption should await further corroboration.
Study Strengths and Limitations: Strengths include a randomized, multicenter design and standardized endpoint assessment. Limitations involve early trial termination resulting in a smaller sample size and the use of a single TAVI device type, limiting generalizability.
Future Research: Larger trials with longer follow-up, evaluation of other TAVI prostheses, and broader patient populations are needed to validate these findings and determine the optimal patient selection criteria.
RCT: Nivolumab Plus Ipilimumab Extends Progression-Free Survival in MSI-H or dMMR Metastatic Colorectal Cancer
4 Dec, 2024 | 11:51h | UTC
Background: Patients with microsatellite-instability–high (MSI-H) or mismatch-repair–deficient (dMMR) metastatic colorectal cancer typically experience poor outcomes with standard chemotherapy. Previous nonrandomized studies suggested that combining nivolumab with ipilimumab may offer clinical benefits in this population.
Objective: To evaluate the efficacy and safety of nivolumab plus ipilimumab compared with chemotherapy in patients with MSI-H or dMMR metastatic colorectal cancer who had not received prior systemic treatment for metastatic disease.
Methods: In this phase 3, open-label, randomized trial, 303 patients with unresectable or metastatic MSI-H or dMMR colorectal cancer were assigned in a 2:2:1 ratio to receive nivolumab plus ipilimumab, nivolumab alone, or chemotherapy with or without targeted therapies. The primary endpoint assessed in this interim analysis was progression-free survival (PFS) of nivolumab plus ipilimumab versus chemotherapy in patients with centrally confirmed MSI-H or dMMR status.
Results: At a median follow-up of 31.5 months, nivolumab plus ipilimumab significantly improved PFS compared to chemotherapy (P<0.001). The 24-month PFS was 72% (95% CI, 64–79) with nivolumab plus ipilimumab versus 14% (95% CI, 6–25) with chemotherapy. The restricted mean survival time at 24 months was 10.6 months longer with the combination therapy. Grade 3 or 4 treatment-related adverse events occurred in 23% of patients receiving nivolumab plus ipilimumab and 48% of those receiving chemotherapy.
Conclusions: First-line treatment with nivolumab plus ipilimumab significantly prolonged progression-free survival compared to chemotherapy in patients with MSI-H or dMMR metastatic colorectal cancer, with a lower incidence of high-grade treatment-related adverse events.
Implications for Practice: The combination of nivolumab and ipilimumab may represent a new standard of care for first-line treatment in MSI-H or dMMR metastatic colorectal cancer. However, clinicians should weigh the benefits against potential immune-related adverse events, and long-term survival benefits remain to be fully established.
Study Strengths and Limitations: Strengths include the randomized, phase 3 design and central confirmation of MSI-H or dMMR status. Limitations involve the open-label design, potential bias in patient-reported outcomes, underrepresentation of certain populations, and immature overall survival data.
Future Research: Further studies are needed to compare nivolumab plus ipilimumab directly with nivolumab monotherapy and to assess long-term overall survival benefits and quality of life in diverse patient populations.
RCT: Transcatheter Edge-to-edge Repair Improves Outcomes in Severe Tricuspid Regurgitation
29 Nov, 2024 | 12:37h | UTC
Background: Severe tricuspid regurgitation (TR) is linked to poor quality of life and increased mortality. Traditional medical therapy offers limited symptom relief, and surgical options carry high risks. Transcatheter tricuspid valve therapies like transcatheter edge-to-edge repair (T-TEER) have emerged as less invasive alternatives, but their impact on patient outcomes needs further exploration.
Objective: To determine if T-TEER combined with optimized medical therapy (OMT) enhances patient-reported outcomes and clinical events compared to OMT alone in patients with severe, symptomatic TR.
Methods: In this multicenter, prospective, randomized (1:1) trial, 300 adults with severe, symptomatic TR despite stable OMT were enrolled from 24 centers in France and Belgium between March 2021 and March 2023. Participants were randomized to receive either T-TEER plus OMT or OMT alone. The primary outcome was a composite clinical endpoint at 1 year, including changes in New York Heart Association (NYHA) class, patient global assessment (PGA), or occurrence of major cardiovascular events. Secondary outcomes encompassed TR severity, Kansas City Cardiomyopathy Questionnaire (KCCQ) score, and a composite of death, tricuspid valve surgery, KCCQ improvement, or hospitalization for heart failure.
Results: At 1 year, 74.1% of patients in the T-TEER plus OMT group showed improvement on the composite endpoint versus 40.6% in the OMT-alone group (P < .001). Massive or torrential TR persisted in 6.8% of the T-TEER group compared with 53.5% of the OMT group (P < .001). The mean KCCQ score was higher in the T-TEER group (69.9 vs 55.4; P < .001). The win ratio for the composite secondary outcome was 2.06 (95% CI, 1.38-3.08; P < .001). No significant differences were observed in major cardiovascular events or cardiovascular death between groups.
Conclusions: Adding T-TEER to OMT significantly reduces TR severity and improves patient-reported outcomes at 1 year in patients with severe, symptomatic TR, without increasing adverse events.
Implications for Practice: T-TEER may offer a valuable addition to OMT for selected patients with severe TR, enhancing symptoms and quality of life. However, the absence of significant differences in hard clinical endpoints and the open-label design suggest cautious interpretation. Clinicians should weigh the benefits against potential biases in patient-reported outcomes.
Study Strengths and Limitations: Strengths include the randomized design and multicenter participation, enhancing the study’s validity. Limitations involve the open-label design without a sham control, potentially introducing bias in subjective outcomes. The short follow-up period and selective patient population based on anatomical suitability for T-TEER may limit generalizability.
Future Research: Longer-term studies are necessary to assess T-TEER’s impact on survival and heart failure hospitalization. Comparative studies of different transcatheter devices and investigations into optimal patient selection criteria are also recommended.