Managing Autonomic Dysfunction, Pain, and Sleep Disturbances in Parkinson’s Disease: Key Points from the German Society of Neurology Guideline
5 Jan, 2025 | 11:00h | UTC
Introduction: This text summarizes a practice-oriented 2023 guideline from the German Society of Neurology addressing non-motor manifestations of Parkinson’s disease (PD). The guideline focuses on evidence-based approaches for diagnosing and treating autonomic failure (including urogenital, cardiovascular, and gastrointestinal dysfunction), pain, and sleep disturbances—problems that often reduce quality of life and accelerate disease progression. The guideline was developed using PICO (Patient, Intervention, Comparison, Outcome) questions, comprehensive literature searches, and a consensus process among German Parkinson’s experts. By presenting stepwise recommendations, the guideline aims to help clinicians manage these non-motor aspects more effectively and improve patient outcomes.
Key Recommendations:
Autonomic Failure
- Bladder Dysfunction: Encourage behavioral modifications (e.g., timed fluid intake, bladder training) and, if necessary, consider antimuscarinics (e.g., solifenacin, trospium) or β3 agonists (e.g., mirabegron 50 mg once daily). Specifically, solifenacin 5 mg once daily, trospium 15–30 mg twice daily or darifenacin 7.5–15 mg once daily are preferred, due to their lower risk of cognitive side effects.
- In patients with an inadequate response to oral therapy, intravesical botulinum toxin A injection (200 U or an individualized dose) may be considered for severe urinary urge incontinence, provided the patient’s motor and cognitive abilities would permit the intermittent catheterization that is often required afterward.
- For nocturia, limit evening fluid intake and consider a 10°–20° head-up tilt in bed. In nocturnal polyuria, desmopressin (5–40 µg once daily nasal spray or 100–800 µg once daily per os) may be used with close monitoring of blood pressure, serum electrolytes and body weight.
- Orthostatic Hypotension (OH): Apply a four-step approach: (1) address aggravating factors (e.g., infections, dehydration); (2) review medications; (3) use non-pharmacological measures (increased fluid/salt intake if no contraindications, abdominal binders, head-up tilt sleeping); (4) add medications to raise blood pressure (e.g., midodrine 2.5–10 mg two to three times a day, fludrocortisone 0.1–0.3 mg once daily). For the diagnosis of OH, a Schellong test (active standing test) or tilt-table examination should be performed; a brief illustration of the usual diagnostic cut-offs follows this list.
- Monitor for supine hypertension, which may require evening antihypertensives (e.g., low-dose losartan 25–100 mg or transdermal nitroglycerin 0.1–0.2 mg/h) and further treatment adjustments. Individuals with PD and neurogenic OH should be screened for supine and nocturnal hypertension.
- Constipation: Follow the general German guideline on “Chronic Constipation.” Emphasize adequate hydration (1.5–2 L per day), fiber intake, and exercise.
- First-line drug therapy is macrogol (polyethylene glycol, PEG, 13–26 g once daily). Consider bisacodyl (5–10 mg once daily), sodium picosulfate (5–10 mg once daily), or prucalopride (1–2 mg once daily) if needed.
- Male Erectile Dysfunction: First-line treatment involves phosphodiesterase type 5 (PDE-5) inhibitors (e.g., sildenafil 50–100 mg on demand), used cautiously in patients with orthostatic hypotension. A multidisciplinary approach with urologists is necessary.
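The guideline summary above does not restate numerical cut-offs for the Schellong test. For orientation, the widely used consensus definition of orthostatic hypotension is a sustained fall of at least 20 mm Hg systolic or 10 mm Hg diastolic blood pressure within about 3 minutes of standing; the minimal sketch below simply applies that assumed criterion and is illustrative only, not part of the guideline.

```python
def orthostatic_hypotension(supine_sbp: float, supine_dbp: float,
                            standing_sbp: float, standing_dbp: float) -> bool:
    """Assumed consensus criterion (not stated in the summary above):
    sustained drop >= 20 mm Hg systolic or >= 10 mm Hg diastolic
    within 3 minutes of standing or head-up tilt."""
    return (supine_sbp - standing_sbp >= 20) or (supine_dbp - standing_dbp >= 10)

# Example: 150/85 mm Hg supine falling to 122/80 mm Hg at 3 minutes of standing.
print(orthostatic_hypotension(150, 85, 122, 80))  # True
```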
Pain Management
- Classification: Differentiate PD-related pain (nociceptive, neuropathic, or nociplastic) from pain arising independently of PD. Use PD-specific scales, such as the King’s Parkinson’s Disease Pain Scale (KPPS) or the Parkinson’s Disease Pain Classification System (PD-PCS), to clarify pain etiology and guide therapy.
- Approach: Optimize dopaminergic therapy, especially if pain correlates with wearing-off.
- Treat nociceptive pain per the WHO 3-step analgesic ladder (which recommends starting with non-opioid analgesics like acetaminophen or NSAIDs, then moving to mild opioids like codeine if needed, and finally to strong opioids like morphine for severe pain).
- For neuropathic pain, preference is given to anticonvulsants (e.g., gabapentin 300–1800 mg, especially in case of concomitant restless legs syndrome) or antidepressants (e.g., duloxetine 60–120 mg, in case of concomitant depression).
- Opioids (e.g., prolonged-release oxycodone/naloxone 5/2.5–20/10 mg, rarely up to 40/20 mg) may be considered in severe or refractory cases.
Sleep Disturbances
- Screening & Diagnosis: Use the Parkinson’s Disease Sleep Scale-2 (PDSS-2) to identify problems such as insomnia, nocturnal akinesia, restless legs, and REM sleep behavior disorder (RBD).
- Objective tests—actigraphy, polygraphy, or video-polysomnography—are recommended for complex or treatment-refractory sleep issues.
- Treatment: Address comorbid conditions (e.g., restless legs syndrome, sleep apnea) following standard guidelines.
- If motor fluctuations disturb sleep, adjust dopaminergic therapy (e.g., use long-acting levodopa or dopamine agonists at night).
- RBD management typically includes creating a safe sleep environment and considering clonazepam (0.125–3 mg) or melatonin (2–9 mg).
- Insomnia linked to circadian disruption may benefit from good sleep hygiene, bright light therapy, structured exercise, and (if indicated) low-dose agents such as eszopiclone (1 mg), doxepin (25 mg), zolpidem (5 mg), trazodone (50 mg), melatonin (2 mg), venlafaxine (37.5 mg, in case of comorbid depression), nortriptyline (25 mg) or mirtazapine (7.5 mg).
- Excessive daytime sleepiness calls for an etiology-driven approach, with non-pharmacological strategies (e.g., scheduled naps, light therapy, exercise) and possible use of modafinil (200–400 mg) if needed. Driving should be reassessed if sleep attacks occur.
Clinical Impact: Poor sleep worsens cognitive decline, motor deficits, caregiver burden, and overall disease progression. RBD in early PD often predicts faster deterioration and earlier cognitive complications. The guideline also addresses the prognostic implications of sleep disturbances.
Conclusion: This guideline underscores the critical importance of identifying and managing non-motor symptoms in Parkinson’s disease. A structured, practice-oriented, etiology-driven stepwise approach to autonomic failure, pain, and sleep problems helps reduce the risk of dangerous complications, alleviates patient distress, and may delay the progression of both motor and cognitive domains. By integrating evidence-based recommendations into daily practice—focusing on precise assessment, tailored interventions, and regular follow-up—clinicians can improve outcomes and quality of life for individuals with PD and their caregivers.
Reference: Fanciulli A, Sixel-Döring F, Buhmann C, Krismer F, Hermann W, Winkler C, Woitalla D, Jost WH, German Parkinson’s Guideline Group, Trenkwalder C, Höglinger G. Diagnosis and treatment of autonomic failure, pain and sleep disturbances in Parkinson’s disease: guideline “Parkinson’s disease” of the German Society of Neurology. Journal of Neurology. 2025. DOI: https://doi.org/10.1007/s00415-024-12730-5
Meta-Analysis: Tailored Hydration Strategies Decrease CI-AKI and MACE in Coronary Angiography
6 Jan, 2025 | 13:00h | UTC
Background: Contrast-induced acute kidney injury (CI-AKI) poses a considerable burden on patients undergoing coronary angiography or percutaneous coronary intervention (PCI). Beyond the direct tubular toxicity of iodinated contrast media, several risk factors, including chronic kidney disease (CKD) and hemodynamic instability, further increase the likelihood of renal damage. Although guideline-based prevention strategies recommend peri-procedural intravenous hydration, the optimal volume and method remain unclear.
Objective: This meta-analysis aimed to determine whether patient-tailored intravenous fluid administration (using parameters other than body weight alone) can reduce the incidence of CI-AKI, as well as major adverse cardiovascular events (MACE), compared with conventional non-tailored hydration protocols in patients undergoing coronary angiography and/or PCI.
Methods: A systematic review of randomized controlled trials (RCTs) was performed, including 13 studies and 4,458 participants. Tailored hydration strategies encompassed left ventricular end-diastolic pressure (LVEDP)-guided infusion, diuresis-driven matched replacement (RenalGuard®), bioimpedance vector analysis, central venous pressure, or inferior vena cava ultrasound measurements. These were compared against standard non-tailored fluid protocols. The primary outcome was CI-AKI (variously defined but measured within 7 days), and secondary outcomes included MACE, all-cause mortality, and renal replacement therapy (RRT).
Results: Across 12 RCTs (n=3,669), tailored hydration significantly reduced CI-AKI rates (risk ratio 0.56, 95% CI [0.46–0.69], p<0.00001; I²=26%). Ten studies (n=3,377) revealed lower MACE incidence in the tailored hydration arm (RR=0.57, 95% CI [0.42–0.78], p=0.0005; I²=12%). A significant reduction in all-cause mortality (RR=0.57, 95% CI [0.35–0.94], p=0.03) and RRT requirement (RR=0.51, 95% CI [0.29–0.89], p=0.02) was also observed, with no significant increase in pulmonary edema. Subgroup analyses (e.g., CKD) supported the overall benefit of individualizing fluid regimens.
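For readers who want to see how pooled risk ratios, confidence intervals, and I² values of this kind are typically derived, the sketch below applies the standard DerSimonian–Laird random-effects method to invented per-study results; the numbers are placeholders and are not taken from this meta-analysis.

```python
import math

# Hypothetical per-study risk ratios with 95% CIs (NOT the trial data above).
studies = [(0.50, 0.30, 0.83), (0.65, 0.45, 0.94), (0.55, 0.35, 0.86)]

# Work on the log scale; back out each study's variance from its CI width.
y = [math.log(rr) for rr, lo, hi in studies]
v = [((math.log(hi) - math.log(lo)) / (2 * 1.96)) ** 2 for rr, lo, hi in studies]

# Fixed-effect weights and Cochran's Q heterogeneity statistic.
w = [1 / vi for vi in v]
y_fixed = sum(wi * yi for wi, yi in zip(w, y)) / sum(w)
q = sum(wi * (yi - y_fixed) ** 2 for wi, yi in zip(w, y))
df = len(studies) - 1

# DerSimonian-Laird between-study variance and random-effects pooling.
tau2 = max(0.0, (q - df) / (sum(w) - sum(wi ** 2 for wi in w) / sum(w)))
w_re = [1 / (vi + tau2) for vi in v]
pooled = sum(wi * yi for wi, yi in zip(w_re, y)) / sum(w_re)
se = math.sqrt(1 / sum(w_re))
i2 = (max(0.0, (q - df) / q) * 100) if q > 0 else 0.0

print(f"Pooled RR {math.exp(pooled):.2f} "
      f"(95% CI {math.exp(pooled - 1.96 * se):.2f} to {math.exp(pooled + 1.96 * se):.2f}), "
      f"I^2 = {i2:.0f}%")
```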
Conclusions: Tailored hydration strategies appear superior to standard approaches in lowering the risk of CI-AKI, MACE, mortality, and RRT after coronary angiography or PCI. Although LVEDP-guided protocols are simple to implement and effective, the RenalGuard® system may offer additional benefits in selected populations, albeit at higher cost and complexity.
Implications for Practice: Clinicians should consider personalized hydration based on physiological or hemodynamic parameters to optimize fluid volume, reduce renal injury, and potentially improve clinical outcomes. Nevertheless, practical challenges include access to specialized equipment and the need for close monitoring in some techniques.
Study Strengths and Limitations: This systematic review highlights consistent treatment effects across diverse RCTs and methods. However, potential biases due to lack of blinding, varying CI-AKI definitions, and limited head-to-head comparisons among tailored approaches constrain definitive conclusions. The small sample size of certain studies and underpowered subgroup analyses also limit the generalizability of findings.
Future Research: Further large-scale trials are warranted to compare various tailored protocols directly, focusing on cost-effectiveness, ease of implementation, and patient-centered endpoints. Ongoing investigations, such as the NEPTUNE trial, aim to clarify whether combining multiple parameters (like LVEDP and contrast volume/eGFR ratio) yields optimal renal protection.
Reference: Cossette F, Trifan A, Prévost-Marcotte G, et al. Tailored Hydration for the Prevention of Contrast-Induced Acute Kidney Injury After Coronary Angiogram or PCI: A Systematic Review and Meta-Analysis. American Heart Journal. Published online January 4, 2025. DOI: https://doi.org/10.1016/j.ahj.2025.01.002
Meta-analysis: Therapeutic-Dose Heparin Reduces 28-Day Mortality in Patients Hospitalized for COVID-19
6 Jan, 2025 | 12:00h | UTC
Background: High rates of thrombotic events and systemic inflammation among COVID-19 hospitalized patients led researchers to test whether intensified anticoagulation strategies could reduce morbidity and mortality. Previous trials yielded conflicting results, partly due to varying doses of anticoagulants—prophylactic, intermediate, or therapeutic—and heterogeneous patient severity. This comprehensive investigation, conducted by the WHO Rapid Evidence Appraisal for COVID-19 Therapies (REACT) Working Group, aimed to clarify the benefits and risks of escalated anticoagulation dosing in patients hospitalized for COVID-19.
Objective: To estimate whether higher-dose anticoagulation (therapeutic or intermediate) reduces 28-day all-cause mortality compared with lower-dose anticoagulation (prophylactic or intermediate), and to evaluate secondary outcomes, including progression to mechanical ventilation, thromboembolic events, and major bleeding.
Methods: This prospective meta-analysis included randomized trials comparing higher- versus lower-dose anticoagulation for hospitalized COVID-19 patients. Investigators collected trial-level summary data, focusing primarily on heparins. Dosing categories—therapeutic, intermediate, and prophylactic—were predefined. The main outcome was 28-day mortality; secondary outcomes included progression to invasive mechanical ventilation (IMV), venous or arterial thrombotic events, and major hemorrhage. Data were analyzed using a fixed-effects model, with odds ratios (ORs) pooled across trials.
Results: Overall, 22 trials (over 11,000 total participants) contributed data, primarily evaluating heparins. For therapeutic versus prophylactic-dose heparin, 28-day mortality was significantly reduced (OR, 0.77; 95% CI, 0.64–0.93), especially among patients requiring low-flow oxygen or no supplemental oxygen. Therapeutic dose reduced thromboembolic events (OR 0.48; 95% CI, 0.36–0.64) but increased major bleeding (OR 1.90; 95% CI, 1.19–3.05) compared to prophylactic dose. In contrast, when therapeutic was compared to intermediate-dose heparin, the summary OR for 28-day mortality was 1.21 (95% CI, 0.93–1.58), suggesting a potential trend toward higher mortality that did not reach statistical significance. Intermediate versus prophylactic-dose comparisons revealed no conclusive mortality difference (OR, 0.95; 95% CI, 0.76–1.19). Across all higher-dose arms, thromboembolic events decreased, while the risk of major bleeding increased, underscoring the delicate risk–benefit balance. Subgroup analyses by respiratory support level, D-dimer, and baseline severity did not indicate strong interaction effects, although sample sizes were limited in more severe illness subgroups.
Conclusions: Therapeutic-dose heparin reduces 28-day mortality relative to prophylactic-dose in hospitalized patients with COVID-19, mainly among those not requiring invasive ventilation. Mortality was similar or potentially worse when therapeutic was compared to intermediate-dose. Clinicians must weigh the lower rate of thrombotic complications against the higher bleeding risk, particularly in critically ill patients.
Implications for Practice: Although higher anticoagulant dosing appears beneficial for certain hospitalized COVID-19 patients, especially those with mild to moderate respiratory compromise, individualized assessment remains key. Current guidelines broadly recommend prophylactic dosing for the critically ill and suggest considering higher doses only in carefully selected patients. Evolving viral variants and changes in standard of care further complicate direct application of these findings to present-day hospital settings.
Study Strengths and Limitations: Strengths include prospective planning, collaboration with multiple trials, and a large pooled sample. Limitations encompass heterogeneity in dose definitions, partial reliance on published data where individual-level parameters could not be fully harmonized, and potential temporal changes in COVID-19 clinical profiles. Moreover, bleeding severity beyond major hemorrhage was not universally reported, limiting robust safety assessments.
Future Research: Further studies should focus on individualized anticoagulant strategies that consider biomarkers (for example, D-dimer) and evolving treatment protocols. Investigations examining optimal timing, duration, and post-discharge management will help refine anticoagulation practices.
References:
The WHO Rapid Evidence Appraisal for COVID-19 Therapies (REACT) Working Group. Anticoagulation Among Patients Hospitalized for COVID-19: A Systematic Review and Prospective Meta-analysis. Annals of Internal Medicine. DOI: https://doi.org/10.7326/ANNALS-24-00800
Shappell CN, Anesi GL. Anticoagulation for COVID-19: Seeking Clarity and Finding Yet More Gray. Annals of Internal Medicine. DOI: https://doi.org/10.7326/ANNALS-24-03244
Review: Nutritional Support in Critically Ill Patients
6 Jan, 2025 | 11:00h | UTC
Introduction: This summary is derived from a state-of-the-art review on nutritional support in the intensive care unit (ICU) published in The BMJ. Critically ill patients experience metabolic disturbances, inflammation, and profound muscle wasting. Nutritional therapy aims to mitigate these effects, though recent randomized controlled trials (RCTs) challenge the dogma of early, aggressive provision of high-calorie and high-protein diets for all ICU patients. Instead, emerging evidence indicates that moderate energy and protein restriction, particularly during the first week, may enhance recovery and reduce complications such as hospital-acquired infections, muscle weakness, and ICU-acquired morbidity. Nonetheless, identifying ideal feeding strategies remains complex, given the dynamic nature of critical illness and the interplay with other interventions such as sedation and physical rehabilitation.
Key Recommendations:
- Individualized Timing and Dose: Limit caloric and protein loads during the acute phase (roughly the first seven days), especially in patients with hemodynamic instability or shock. Later, as patients transition to recovery, gradually increase macronutrient delivery to meet evolving metabolic needs.
- Preferred Feeding Route: Enteral nutrition is generally recommended when the gastrointestinal tract is functional, particularly after shock resolution. Parenteral nutrition can be reserved for prolonged gut dysfunction or inability to meet needs enterally. Studies comparing enteral versus parenteral feeding have shown no clear outcome differences, but early enteral feeding is often favored for physiological and cost reasons.
- Avoid Overfeeding and Overzealous Protein Provision: Several large RCTs (including EFFORT-Protein, EDEN, and NUTRIREA-3) observed no mortality benefit—and in some instances, worse outcomes—when patients received full or high doses of energy and protein in the first week. Metabolic “resistance” and inhibition of protective processes such as autophagy might explain why restricted early feeding sometimes confers advantages.
- Monitoring and Assessment: Traditional tools (NUTRIC, NRS-2002) and biomarkers (albumin, prealbumin) do not reliably predict who benefits from higher or lower feeding levels. Ultrasound or computed tomography to assess muscle mass may hold promise, but no validated approach exists to guide individualized macronutrient targets.
- Micronutrients and Specialized Formulations: Broad-spectrum pharmaconutrients (glutamine, antioxidants, etc.) have not improved outcomes in well-powered trials. Instead, standard vitamin and trace element supplementation consistent with recommended daily allowances appears sufficient in most cases.
- Long-term Rehabilitation: Combined nutritional support and physical exercise are critical for mitigating long-term impacts of ICU-acquired weakness and functional decline. Evidence increasingly highlights the need for prolonged, structured rehabilitation to optimize muscle recovery and quality of life.
Conclusion: Although nutritional support remains central to critical care, it is most effective when carefully adapted to disease phase, patient comorbidities, and evolving organ dysfunction. Key evidence suggests a more conservative approach to energy and protein during the acute phase, followed by gradual escalation and integration with rehabilitation. Ongoing research seeks to identify physiological markers that distinguish when to intensify nutritional therapy and how best to align macronutrient delivery with other therapies to promote muscle function and reduce complications.
Reference: Reignier J, Rice TW, Arabi YM, Casaer M. Nutritional Support in the ICU. BMJ. 2025;388:e077979. DOI: https://doi.org/10.1136/bmj-2023-077979
Meta-Analysis: Long Half-Life Phosphodiesterase-5 Inhibitors Reduce HbA1c in Adults with Elevated Baseline Levels
6 Jan, 2025 | 08:00h | UTC
Background: Phosphodiesterase type 5 (PDE5) inhibitors are traditionally used to treat erectile dysfunction and pulmonary arterial hypertension. Recent evidence suggests that PDE5 inhibitors could also be repurposed to lower hemoglobin A1c (HbA1c) in patients with type 2 diabetes. Given the disparity in half-lives among these agents, this meta-analysis focused on whether longer half-life PDE5 inhibitors (tadalafil, PF-00489791) produce a more sustained HbA1c reduction compared to short half-life PDE5 inhibitors (sildenafil, avanafil).
Objective: To evaluate the effect of PDE5 inhibitors on HbA1c levels in individuals with baseline values above 6%, comparing agents with short and long half-lives to assess differential clinical benefits in glycemic control.
Methods: This systematic review and meta-analysis included only randomized controlled trials (RCTs) in which participants received any PDE5 inhibitor for at least four weeks, with control or placebo for comparison. Major databases (Cochrane CENTRAL, PubMed Central, ClinicalTrials.gov, and WHO ICTRP) were searched through September 2024 without language restrictions. Statistical analyses were performed using a random-effects model, reporting mean differences in HbA1c. Secondary outcomes (HOMA-IR, lipid profiles, fasting glucose, and others) were also explored.
Results: Thirteen RCTs were eligible (N=1083). Long half-life agents showed a significant mean reduction of approximately −0.40% in HbA1c (p=0.002) in the overall analysis, whereas short half-life PDE5 inhibitors exhibited no significant change. In more stringent subgroup analyses (≥8 weeks’ duration, exclusive type 2 diabetes, baseline HbA1c ≥6.5%), long half-life PDE5 inhibitors maintained a significant decrease (−0.50%), while short half-life agents paradoxically showed a slight but significant increase (+0.36%, p=0.03). In trials enrolling patients with poorly controlled diabetes (baseline HbA1c near 10%), tadalafil’s HbA1c reductions were considerably larger, aligning with the efficacy of other standard oral antidiabetic medications.
Conclusions: Long half-life PDE5 inhibitors appear to confer meaningful reductions in HbA1c, comparable to established oral antidiabetic agents, particularly in patients whose HbA1c is inadequately controlled. In contrast, short half-life PDE5 inhibitors did not show a consistent benefit and may paradoxically raise HbA1c in certain subgroups, although further large-scale studies are warranted to confirm these findings.
Implications for Practice: Long half-life PDE5 inhibitors could serve as an adjunctive therapy in type 2 diabetes management, especially in individuals with higher baseline HbA1c. Yet, caution is advised given limited data on adverse events and the short duration of most included trials. Physicians should remain prudent until more robust evidence, especially in populations with markedly elevated HbA1c, becomes available.
Study Strengths and Limitations: Strengths include a direct comparison between short and long half-life PDE5 inhibitors in a clinically relevant population, plus systematic subgroup analyses. Limitations involve heterogeneity in trial designs, relatively low baseline HbA1c in most participants, and a lack of long-term follow-up data or major clinical endpoints.
Future Research: Subsequent trials should target populations with poorly controlled diabetes (HbA1c ≥9.0%) and assess longer durations (≥3 months) to capture the full impact of PDE5 inhibitor therapy. A deeper examination of combination regimens, pharmacokinetic optimization, and clinical outcomes like cardiovascular events would further clarify the role of these agents in diabetes care.
Reference: Kim J, Zhao R, Kleinberg LR, Kim K. Effect of long and short half-life PDE5 inhibitors on HbA1c levels: a systematic review and meta-analysis. eClinicalMedicine. 2025;80:103035. DOI: https://doi.org/10.1016/j.eclinm.2024.103035
RCT: Chlorthalidone Shows No Renal Advantage Over Hydrochlorothiazide Under Equivalent Dosing in Older Adults With Hypertension
3 Jan, 2025 | 09:00h | UTC
Background: Hypertension is a critical factor in chronic kidney disease (CKD) progression and cardiovascular risk. Thiazide-type diuretics, such as chlorthalidone and hydrochlorothiazide, are first-line antihypertensive treatments. However, whether one agent confers stronger renal protection remains contested, especially at doses considered pharmacologically comparable. Prior observational studies suggested potential discrepancies in kidney outcomes and hypokalemia incidence. This secondary analysis of the Diuretic Comparison Project (DCP) further clarifies the comparative effectiveness of chlorthalidone versus hydrochlorothiazide on renal endpoints.
Objective: To evaluate whether chlorthalidone (12.5–25 mg/day) prevents CKD progression more effectively than hydrochlorothiazide (25–50 mg/day) in adults ≥65 years with hypertension and no pre-specified exclusion by renal function.
Methods: The DCP is a pragmatic, open-label randomized clinical trial embedded in Veterans Affairs (VA) facilities across the United States. Between June 1, 2016, and December 31, 2023, patients already receiving hydrochlorothiazide (25 or 50 mg/day) for hypertension were randomized either to continue that medication or switch to chlorthalidone (12.5–25 mg/day), reflecting equivalent potency.
The prespecified primary kidney outcome was a composite of doubling of serum creatinine, a terminal estimated glomerular filtration rate (eGFR) <15 mL/min, or dialysis initiation. Secondary measures included ≥40% eGFR decline, incident CKD (new eGFR <60 mL/min), eGFR slope, and relevant adverse events. Laboratory data were obtained through usual clinical care rather than protocol-driven testing.
Results: Among 13,523 randomized participants, 12,265 had analyzable renal data (mean [SD] age, 71 [4] years; 96.8% male). The mean (SD) follow-up was 3.9 (1.3) years. Chlorthalidone did not demonstrate superiority over hydrochlorothiazide for the composite kidney endpoint (6.0% vs 6.4%; hazard ratio, 0.94; 95% CI, 0.81–1.08; P=.37). Additional analyses showed no differences in CKD incidence, ≥40% eGFR decline, or eGFR slope. Hypokalemia occurred more frequently in chlorthalidone users (overall ~2% higher rate of low potassium measurements), and hospitalizations for hypokalemia also trended higher.
Conclusions: Under dosing regimens designed to achieve equivalent antihypertensive potency, chlorthalidone provided no measurable renal benefit over hydrochlorothiazide but posed a modestly elevated risk of hypokalemia. These findings reinforce the clinical interchangeability of both agents for long-term blood pressure management in older adults, provided serum potassium is monitored.
Implications for Practice: Clinicians can confidently employ either chlorthalidone or hydrochlorothiazide in older patients with hypertension, including those with mild or moderate CKD, since renal deterioration rates did not differ significantly. Importantly, the trial used half the milligram amount of chlorthalidone (12.5–25 mg/day) to match the usual doses of hydrochlorothiazide (25–50 mg/day). Recognizing this equivalence helps guide therapy transitions and dosing decisions. Vigilant monitoring of electrolytes remains essential, particularly when prescribing chlorthalidone, given the slightly higher incidence of hypokalemia.
Study Strengths and Limitations: Strengths include the randomized design, broad participant inclusion, and pragmatic structure that mirrors real-world prescribing. Limitations involve potential underestimation or overestimation of renal events due to reliance on routine (rather than scheduled) lab tests. Also, nearly all participants had prior hydrochlorothiazide exposure, which may have influenced tolerance and adherence patterns.
Future Research: Further clinical trials focusing on more advanced CKD stages, distinct comorbidities, or combination regimens (e.g., with potassium-sparing agents) would expand our understanding of how thiazide-type diuretics influence long-term kidney outcomes. Extended follow-up or additional subgroup analyses could also shed light on the interplay of dose-response effects in highly vulnerable populations.
Reference: Ishani A, Hau C, Raju S, et al. Chlorthalidone vs Hydrochlorothiazide and Kidney Outcomes in Patients With Hypertension: A Secondary Analysis of a Randomized Clinical Trial. JAMA Netw Open. 2024;7(12):e2449576. DOI: https://doi.org/10.1001/jamanetworkopen.2024.49576
Cohort Study: One in Four Patients Demonstrates Covert Cognition Despite Behavioral Unresponsiveness
3 Jan, 2025 | 08:30h | UTC
Background: Cognitive motor dissociation (CMD) refers to the presence of specific neuroimaging or electrophysiological responses to commands in patients otherwise incapable of voluntary behavioral output. Detecting CMD is clinically relevant because its underdiagnosis may lead to premature decisions regarding goals of care, life-sustaining treatment, and rehabilitation efforts. Although several single-center studies have suggested that CMD may exist in 10–20% of patients with disorders of consciousness, larger multinational data were lacking, particularly using both functional magnetic resonance imaging (fMRI) and electroencephalography (EEG).
Objective: To determine how often CMD occurs in a large, multinational cohort of adults with impaired consciousness and to evaluate the clinical variables potentially associated with this phenomenon.
Methods: This prospective cohort study included 353 adults with disorders of consciousness recruited from six international centers between 2006 and 2023. Enrolled participants had at least one behavioral assessment using the Coma Recovery Scale–Revised (CRS-R) and underwent task-based fMRI, EEG, or both. Sites utilized validated analytic pipelines and automated data processing to minimize false positives. Participants were divided into two groups: those without observable responses to verbal commands (coma, vegetative state, or minimally conscious state–minus) and those with observable responses (minimally conscious state–plus or emerged). CMD was defined as the absence of any observable behavioral response to commands, combined with a positive command-following signal on fMRI or EEG.
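As a concrete restatement of the operational definition used above, the toy function below labels a participant as having CMD when bedside command-following is absent but fMRI or EEG shows a command-following signal; it is illustrative only and is not the study’s analysis code.

```python
def classify_cmd(behavioral_command_following: bool,
                 fmri_positive: bool, eeg_positive: bool) -> str:
    """CMD per the operational definition above: no observable behavioral
    response to commands, plus a positive command-following signal on
    fMRI or EEG."""
    if not behavioral_command_following and (fmri_positive or eeg_positive):
        return "cognitive motor dissociation"
    return "no CMD by this definition"

# Example: behaviorally unresponsive patient with command-following on fMRI only.
print(classify_cmd(behavioral_command_following=False,
                   fmri_positive=True, eeg_positive=False))
```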
Results: Among 241 participants with no overt command-following, 25% showed CMD through either fMRI alone, EEG alone, or both. CMD was more common in younger patients, those assessed later after injury, and those with traumatic brain injury. Interestingly, in 112 participants who did exhibit command-following on bedside exams, only 38% demonstrated confirmatory responses on fMRI or EEG. These findings support the notion that the tasks used for neuroimaging and electrophysiological assessments may require more sustained cognitive engagement than typical bedside evaluations.
Conclusions: CMD was identified in about one in four patients who lacked behavioral command-following. Combining fMRI with EEG likely increases detection rates compared to either modality alone. The results highlight the need for increased awareness of covert cognitive activity in this population, given potential ramifications for prognosis, family counseling, and clinical care.
Implications for Practice: Clinicians should consider the possibility of CMD in patients who appear unresponsive at the bedside. When feasible, employing both fMRI and EEG might reveal hidden cognitive capacities that can guide patient-centered decisions, encourage targeted therapies, and allow healthcare teams to respect potential consciousness and autonomy. However, such technologies remain limited to specialized centers.
Study Strengths and Limitations: Strengths include a diverse sample from multiple international sites and the integration of two complementary neurodiagnostic techniques. Limitations involve heterogeneous recruitment practices, variations in local data acquisition methods, and potential selection biases toward patients who survived until advanced testing was available. Additionally, the absence of standardized paradigms across sites reduced consistency of results.
Future Research: Further large-scale investigations should standardize fMRI and EEG protocols and determine whether earlier and more consistent identification of CMD affects functional outcomes. Efforts to refine and validate automated analytic pipelines could facilitate widespread adoption of these techniques in routine clinical settings.
Reference: Bodien YG, Allanson J, Cardone P, et al. Cognitive Motor Dissociation in Disorders of Consciousness. New England Journal of Medicine. 2024;391:598-608. DOI: https://doi.org/10.1056/NEJMoa2400645
Meta-analysis: One-day Low-residue Diet Achieves Bowel Cleansing Comparable to Multi-day Regimens
26 Dec, 2024 | 18:21h | UTC
Background: Colorectal cancer remains a leading cause of cancer-related morbidity worldwide, making early detection through colonoscopy essential. Adequate bowel preparation is crucial to maximize mucosal visibility and detect lesions effectively. Although low-residue diets (LRDs) are commonly recommended before colonoscopy, guidelines vary regarding the optimal duration (one day versus multiple days). This systematic review and meta-analysis evaluated whether a one-day LRD regimen is non-inferior to multi-day protocols in achieving satisfactory bowel cleansing and lesion detection.
Objective: To compare the efficacy of 1-day versus >1-day LRD regimens for bowel preparation in adult patients undergoing elective colonoscopy, focusing on bowel cleanliness, polyp detection, and adenoma detection rates.
Methods: A comprehensive search of PubMed, Cochrane Central Register of Controlled Trials, ScienceDirect, Scopus, and ClinicalTrials.gov was conducted for randomized controlled trials (RCTs) comparing 1-day with >1-day LRD regimens. Six RCTs involving 2,469 participants met inclusion criteria. Patients were randomized to either a 1-day LRD (n=1,237) or a multi-day LRD (n=1,232). Adequate bowel preparation was primarily defined by a Boston Bowel Preparation Scale (BBPS) score ≥2 in each segment or total BBPS ≥6. Secondary outcomes included polyp detection rate (PDR), adenoma detection rate (ADR), withdrawal time, cecal intubation rate, and cecal intubation time.
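To make the adequacy rule above concrete: the BBPS scores each of the three colonic segments from 0 to 3, and the included trials counted preparation as adequate when every segment scored at least 2 or the total reached 6. The short sketch below encodes that rule directly (illustrative helper, not code from the trials).

```python
def adequate_preparation(right: int, transverse: int, left: int) -> bool:
    """Adequate bowel preparation as defined above:
    BBPS >= 2 in every segment, or total BBPS >= 6."""
    segments = (right, transverse, left)
    assert all(0 <= s <= 3 for s in segments), "BBPS segment scores range 0-3"
    return all(s >= 2 for s in segments) or sum(segments) >= 6

print(adequate_preparation(2, 3, 2))  # True: every segment >= 2
print(adequate_preparation(1, 3, 3))  # True: total of 7 meets the >= 6 threshold
print(adequate_preparation(1, 2, 2))  # False: one segment < 2 and total only 5
```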
Results: Both groups demonstrated similar rates of adequate bowel preparation (87.2% in the 1-day LRD vs. 87.1% in the multi-day group), with no significant difference (OR=1.03, 95% CI, 0.76–1.41; p=0.84; I²=0%). PDR was likewise comparable (OR=0.91, 95% CI, 0.76–1.09; p=0.29; I²=16%), as was ADR (OR=0.87, 95% CI, 0.71–1.08; p=0.21; I²=0%). Withdrawal time did not differ (MD=–0.01 minutes, 95% CI, –0.25 to 0.24; p=0.97; I²=63%), and cecal intubation parameters were also statistically similar. Across studies, the pooled mean global BBPS revealed minimal difference (MD=0.16, 95% CI, –0.02 to 0.34; p=0.08; I²=15%), confirming the non-inferiority of a shorter LRD protocol.
Conclusions: A one-day LRD achieves bowel cleansing outcomes comparable to those of multi-day LRDs, without compromising polyp or adenoma detection. This shorter regimen may help optimize patient adherence, reduce dietary restriction burden, and simplify procedural logistics, especially for busy endoscopy practices.
Implications for Practice: Adopting a 1-day LRD can streamline preparation, improve patient satisfaction, and maintain high-quality visualization. Clinicians should weigh individual patient factors such as chronic constipation or comorbidities but may generally favor a shorter dietary restriction period to enhance compliance and comfort.
Study Strengths and Limitations: This meta-analysis included only RCTs, strengthening its internal validity. Heterogeneity for primary outcomes was minimal. However, the included trials employed varied dietary protocols and bowel preparation solutions. Additionally, some studies lacked uniform reporting of cecal intubation endpoints, limiting direct comparisons. Future investigations with standardized outcome measures could offer more definitive guidance.
Future Research: Further large-scale RCTs should assess cost-effectiveness, patient-reported outcomes, and LRD composition in specific populations. Identifying optimal dietary instructions for individuals with slower colonic transit or specific nutritional needs would refine colonoscopy preparation guidelines and potentially increase detection of precancerous lesions.
Reference: Putri RD, et al. One-day low-residue diet is equally effective as the multiple-day low-residue diet in achieving adequate bowel cleansing: a meta-analysis of randomized controlled trials. Clinical Endoscopy. 2024. DOI: https://doi.org/10.5946/ce.2024.061
VisionFM: A Generalist AI Surpasses Single-Modality Models in Ophthalmic Diagnostics
25 Dec, 2024 | 13:41h | UTC
Background: Ophthalmic AI models typically address single diseases or modalities. Their limited generalizability restricts broad clinical application. This study introduces VisionFM, a novel foundation model trained on 3.4 million images from over 500,000 individuals. It covers eight distinct ophthalmic imaging modalities (e.g., fundus photography, OCT, slit-lamp, ultrasound, MRI) and encompasses multiple diseases. Compared with prior single-task or single-modality approaches, VisionFM’s architecture and large-scale pretraining enable diverse tasks such as disease screening, lesion segmentation, prognosis, and prediction of systemic markers.
Objective: To develop and validate a generalist ophthalmic AI framework that can handle multiple imaging modalities, recognize multiple diseases, and adapt to new clinical tasks through efficient fine-tuning, potentially easing the global burden of vision impairment.
Methods: VisionFM employs individual Vision Transformer–based encoders for each of the eight imaging modalities, pretrained with self-supervised learning (iBOT) focused on masked image modeling. After pretraining, various task-specific decoders were fine-tuned for classification, segmentation, and prediction tasks. The model was evaluated on 53 public and 12 private datasets, covering eight disease categories (e.g., diabetic retinopathy, glaucoma, cataract), five imaging modalities (fundus photographs, OCT, etc.), plus additional tasks (e.g., MRI-based orbital tumor segmentation). Performance metrics included AUROCs, Dice similarity coefficients, F1 scores, and comparisons with ophthalmologists of varying clinical experience.
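The frozen-encoder, task-specific-decoder pattern described above can be pictured with a short PyTorch sketch. The backbone, feature dimension, and class count below are placeholders standing in for one of VisionFM’s modality-specific encoders and its downstream classification heads; this is not the published model or its code.

```python
import torch
import torch.nn as nn
from torchvision.models import vit_b_16

# Placeholder backbone standing in for a modality-specific Vision Transformer
# encoder. weights=None keeps the example offline; in practice, weights from
# self-supervised pretraining would be loaded here.
encoder = vit_b_16(weights=None)
encoder.heads = nn.Identity()            # expose the 768-d class-token feature
encoder.eval()
for p in encoder.parameters():
    p.requires_grad = False              # freeze the pretrained encoder

decoder = nn.Linear(768, 8)              # task-specific head, e.g. 8 disease classes
optimizer = torch.optim.AdamW(decoder.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

def finetune_step(images: torch.Tensor, labels: torch.Tensor) -> float:
    """One decoder fine-tuning step with the backbone kept frozen."""
    with torch.no_grad():
        features = encoder(images)       # (batch, 768)
    loss = criterion(decoder(features), labels)
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
    return loss.item()

# Random tensors standing in for a small batch of fundus photographs.
print(finetune_step(torch.randn(4, 3, 224, 224), torch.randint(0, 8, (4,))))
```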
Results: VisionFM achieved an average AUROC of 0.950 (95% CI, 0.941–0.959) across eight disease categories in internal validation. External validation showed AUROCs of 0.945 (95% CI, 0.934–0.956) for diabetic retinopathy and 0.974 (95% CI, 0.966–0.983) for AMD, surpassing baseline deep learning approaches. In a 12-disease classification test involving 38 ophthalmologists, VisionFM’s accuracy matched intermediate-level specialists. It successfully handled modality shifts (e.g., grading diabetic retinopathy on previously unseen OCTA), with an AUROC of 0.935 (95% CI, 0.902–0.964). VisionFM also predicted glaucoma progression (F1, 72.3%; 95% CI, 55.0–86.3) and flagged possible intracranial tumors (AUROC, 0.986; 95% CI, 0.960–1.00) from fundus images.
Conclusions: VisionFM offers a versatile, scalable platform for comprehensive ophthalmic tasks. Through self-supervised learning and efficient fine-tuning, it extends specialist-level performance to multiple clinical scenarios and imaging modalities. The study demonstrates that large-scale, multimodal pretraining can enable robust generalization to unseen data, potentially reducing data annotation burdens and accelerating AI adoption worldwide.
Implications for Practice: VisionFM may help address global shortages of qualified ophthalmologists and expand care in low-resource settings, though clinical decision-making still requires appropriate human oversight. Further multicenter studies are needed before widespread implementation, especially for higher-risk use cases such as tumor detection.
Study Strengths and Limitations: Strengths include its unique multimodal design, large-scale pretraining, and extensive external validation. Limitations involve demographic bias toward Chinese datasets, the need for larger cohorts in certain applications (e.g., intracranial tumor detection), and the challenges of matching real-world clinical complexity when only image-based data are used.
Future Research: Further validation in diverse populations, integration of new imaging modalities (e.g., widefield imaging, ultrasound variants), and expansion to additional diseases are planned. Hybridization with large language models could facilitate automatic generation of clinical reports.
Reference: Qiu J, Wu J, Wei H, et al. Development and Validation of a Multimodal Multitask Vision Foundation Model for Generalist Ophthalmic Artificial Intelligence. NEJM AI. 2024;1(12). DOI: https://doi.org/10.1056/AIoa2300221
Meta-analysis: Incidence Rate Difference of Adverse Events from Cannabinoids in Middle-Aged and Older Adults
25 Dec, 2024 | 12:19h | UTC
Background: Growing evidence suggests that cannabinoid-based medicines (CBMs) are increasingly prescribed to individuals aged 50 years and above for various clinical conditions. While these agents may offer therapeutic benefits, questions remain about the incidence of adverse events (AEs), particularly in older adults with multiple comorbidities. This systematic review and meta-analysis aims to quantify the incidence rate difference (IRD) of AEs and determine whether weekly doses of delta-9-tetrahydrocannabinol (THC) and cannabidiol (CBD) are associated with any dose-dependent increase in risk.
Objective: To evaluate whether adults aged ≥50 years exposed to CBMs, including THC-alone formulations and THC combined with CBD, experience a higher incidence of AEs than controls, and to assess how variations in weekly THC and CBD doses might affect AE rates.
Methods: Researchers searched MEDLINE, PubMed, EMBASE, CINAHL, PsycINFO, Cochrane Library, and ClinicalTrials.gov from January 1, 1990, to June 12, 2023. Randomized clinical trials involving middle-aged and older adults (mean age ≥50 years) using medicinal CBMs for all indications were included. Data on common and serious AEs, withdrawals, and deaths were extracted and pooled using a random-effects model. Further meta-regression analyses examined THC and CBD weekly doses as predictors of AEs in THC-only and THC:CBD trials.
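Because the pooled metric here is an incidence rate difference rather than a ratio, a tiny worked example may help: the IRD is the event rate per unit of person-time in the cannabinoid arm minus the corresponding rate in the control arm. The figures below are invented purely for illustration.

```python
def incidence_rate_difference(events_tx: int, person_years_tx: float,
                              events_ctrl: int, person_years_ctrl: float) -> float:
    """IRD expressed per 100 person-years (illustrative helper, not study code)."""
    rate_tx = events_tx / person_years_tx
    rate_ctrl = events_ctrl / person_years_ctrl
    return (rate_tx - rate_ctrl) * 100

# Invented example: 30 dizziness events over 150 person-years on a THC-based
# medicine versus 10 events over 140 person-years on placebo.
ird = incidence_rate_difference(30, 150, 10, 140)
print(f"{ird:.1f} additional events per 100 person-years")
```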
Results: Fifty-eight randomized clinical trials (n=6611) met inclusion criteria, with 3450 participants receiving CBMs. Compared to controls, individuals on THC-alone experienced significantly higher incidence of dizziness, somnolence, impaired coordination, dissociative symptoms, and dry mouth, often in a dose-dependent manner. Similarly, THC:CBD combinations increased nausea, vomiting, fatigue, dizziness, and disorientation. The incidence of serious AEs, withdrawals, or mortality did not differ significantly between CBM and control groups, although neurological or psychiatric side effects were more pronounced with higher THC doses.
Conclusions: THC-containing CBMs can provoke dose-related gastrointestinal, neurological, and psychiatric adverse events, posing additional risks in older adults susceptible to falls and cognitive disturbances. However, the meta-analysis found no significant increases in serious AEs or deaths. Clinicians should weigh potential benefits against the likelihood of common side effects, especially when prescribing higher THC doses or combining cannabinoids with other medications frequently used by older patients.
Implications for Practice:
- Physicians should exercise caution when initiating or escalating THC-based therapies in middle-aged and older adults, monitoring for neurological or psychiatric AEs.
- Using lower THC doses, titrating gradually, and adding CBD may mitigate some side effects.
- Though severe AEs are uncommon, vigilance is warranted in individuals with complex medication regimens.
Study Strengths and Limitations:
- Strength: This review merges diverse clinical conditions and provides a comprehensive assessment of THC vs. THC:CBD. Its large pooled population allows for more precise IRD estimates.
- Limitation: Short treatment durations in many trials limit understanding of long-term toxicity, and some studies lacked rigorous reporting of randomization and outcome measures, potentially introducing bias.
Future Research:
- Longer-duration trials focused on older populations are needed to clarify chronic safety profiles.
- Studies exploring drug-drug interactions between CBMs and medications commonly prescribed to older adults will further elucidate real-world tolerability.
Reference: Velayudhan L, Pisani S, Dugonjic M, McGoohan K, Bhattacharyya S. Adverse events caused by cannabinoids in middle aged and older adults for all indications: a meta-analysis of incidence rate difference. Age and Ageing. 2024;53(11):afae261. DOI: https://doi.org/10.1093/ageing/afae261
2025 ASA Practice Advisory for the Perioperative Care of Older Adults Undergoing Inpatient Surgery
23 Dec, 2024 | 20:27h | UTC
Introduction: This summary outlines the American Society of Anesthesiologists (ASA) 2025 advisory on optimizing perioperative care for older adults (age 65 years or older) undergoing inpatient surgery. It focuses on preoperative, intraoperative, and postoperative measures to mitigate cognitive complications, especially delirium and longer-term cognitive decline, in a population that is highly vulnerable to functional deterioration and loss of independence. The recommendations are based on systematic reviews and meta-analyses, supplemented by expert consensus where evidence is limited. Although not intended as strict standards of care, these advisory statements provide practical guidance that can be adapted to local contexts and patient-specific needs.
Key Recommendations:
- Expanded Preoperative Evaluation:
- Incorporate frailty assessment, cognitive screening, and psychosocial or nutritional evaluations into routine preoperative workups for older patients.
- Patients identified with frailty or cognitive deficits should receive targeted interventions, such as geriatric co-management, deprescribing when indicated, and early family education about delirium risks.
- Evidence suggests a modest decrease in postoperative delirium when such evaluations are included.
- Choice of Primary Anesthetic (Neuraxial vs. General):
- Current studies do not demonstrate a clear advantage of neuraxial over general anesthesia in reducing postoperative delirium risk.
- Both approaches are acceptable; individualize decisions based on patient factors, surgical requirements, and preference-sensitive discussions.
- Maintenance of General Anesthesia (Total Intravenous vs. Inhaled Agents):
- Data are inconclusive regarding delirium prevention, with no significant difference between total intravenous anesthesia (TIVA) and inhaled volatile agents.
- Some low-level evidence indicates TIVA might reduce short-term cognitive decline, but this effect is inconsistent over longer follow-up.
- Dexmedetomidine for Delirium Prophylaxis:
- Moderate-level evidence supports dexmedetomidine for reducing delirium incidence in older patients, yet its use may increase bradycardia and hypotension.
- Optimal dosing and timing remain uncertain, and baseline patient vulnerability should inform decisions.
- Medications with Potential Central Nervous System Effects:
- Drugs such as benzodiazepines, antipsychotics, anticholinergics, ketamine, and gabapentinoids warrant careful risk-benefit analysis.
- Current findings are inconclusive, suggesting neither a firm endorsement nor outright disapproval; preexisting conditions and polypharmacy should guide individualized treatment plans.
Conclusion: Preserving cognitive function and independence in older adults undergoing inpatient surgery is a growing priority. These recommendations highlight the importance of comprehensive preoperative screenings (frailty, cognition, and psychosocial domains), shared decision-making when choosing anesthetic techniques, and thoughtful use of pharmacologic agents. While dexmedetomidine shows promise in mitigating delirium, vigilance regarding hypotension and bradycardia is essential. Ultimately, these strategies aim to reduce anesthesia-related complications in older patients by addressing the multifaceted determinants of postoperative cognitive outcomes.
Meta-Analysis: Endovascular Therapy for Vertebrobasilar Occlusion Improves Functional Outcomes
19 Dec, 2024 | 22:56h | UTC
Background: Acute vertebrobasilar artery occlusion (VBAO) is associated with high mortality and severe neurological deficits. Previous randomized trials of endovascular therapy (EVT) for VBAO have shown inconsistent results, leaving uncertainty about its efficacy across different patient subgroups.
Objective: To determine whether EVT confers improved 90-day functional outcomes compared with standard medical therapy alone in patients with acute VBAO and to explore treatment effect heterogeneity in prespecified subgroups.
Methods: This individual patient data meta-analysis included all four major randomized controlled trials (ATTENTION, BAOCHE, BASICS, BEST) that enrolled patients with VBAO treated within 24 hours of estimated onset. Participants received either EVT or best medical therapy. The primary outcome was a favorable functional status at 90 days (modified Rankin Scale [mRS] score 0–3). Secondary outcomes included functional independence (mRS 0–2), distribution of mRS scores (shift analysis), symptomatic intracranial hemorrhage (sICH), and all-cause mortality at 90 days.
Results: Among 988 patients (556 EVT; 432 control), median age 67 years, EVT significantly increased the proportion achieving mRS 0–3 (45% vs 30%; adjusted odds ratio [aOR] 2.41, 95% CI 1.78–3.26) and mRS 0–2 (35% vs 21%; aOR 2.52, 95% CI 1.82–3.48). EVT improved the overall distribution of functional outcomes (aOR for mRS shift 2.09, 95% CI 1.61–2.71) and reduced 90-day mortality (36% vs 45%; aOR 0.60, 95% CI 0.45–0.80). Although sICH was more common with EVT (5% vs <1%; aOR 11.98, 95% CI 2.82–50.81), the net clinical benefit remained strongly in favor of EVT. Subgroup analyses showed broadly consistent benefit, though the advantage was uncertain for patients with mild baseline severity (NIHSS <10).
Conclusions: EVT for acute VBAO significantly improves functional outcomes and reduces mortality despite a higher sICH risk. These results support EVT as a standard consideration in appropriately selected patients with moderate-to-severe VBAO. The benefit’s magnitude is comparable to that seen in anterior circulation large vessel occlusions, although caution is advised in mild cases and those with extensive baseline infarction.
Implications for Practice: Clinicians should consider EVT for most patients presenting with acute VBAO. While sICH risk is increased, the substantial improvements in function and survival justify its use in suitable candidates. Careful imaging and clinical assessment remain critical for optimal patient selection.
Study Strengths and Limitations: Strengths include a pooled individual patient dataset from all major VBAO EVT trials, allowing detailed subgroup analyses. Limitations involve early trial termination, underrepresentation of women, predominance of Asian populations, and exclusion of patients with very mild symptoms or large baseline infarcts, potentially limiting generalizability.
Future Research: Further trials are needed to define EVT’s role in patients with mild symptoms, isolated vertebral occlusion, large infarcts, or those presenting beyond 24 hours. Additional studies should assess real-world applicability and diverse patient populations.
Review: New and Emerging Treatments for Major Depressive Disorder
19 Dec, 2024 | 22:21h | UTC
Introduction: This is a summary of a review on new and emerging treatments for major depressive disorder (MDD), a globally prevalent condition with substantial morbidity and socioeconomic burden. While conventional monoaminergic antidepressants often provide benefit, many patients do not achieve remission, leading to treatment-resistant depression. Novel approaches, including psychedelics (psilocybin, ketamine/esketamine), anti-inflammatory agents, opioid modulators, neuropeptides, botulinum toxin injections, and various neuromodulatory techniques (newer forms of transcranial magnetic stimulation and light-based therapies), are under investigation. This summary highlights their potential efficacy, tolerability, and current limitations.
Key Recommendations:
- Ketamine and Esketamine: Consider these as adjunctive treatments for patients with refractory MDD, given their rapid antidepressant and anti-suicidal effects. Carefully monitor for blood pressure elevations and potential habituation. Long-term cost-effectiveness and sustained benefits remain uncertain.
- Psychedelics (Psilocybin, Ayahuasca): Psilocybin-assisted therapy may produce rapid symptom improvement, but scalability, required therapeutic support, and possible increases in suicidality raise concern. Ayahuasca shows early promise, yet lacks robust long-term data and standardized administration protocols.
- Neuromodulation (rTMS, TBS, Accelerated TMS, Light Therapy): Repetitive transcranial magnetic stimulation (rTMS) and its variants (theta burst stimulation, accelerated protocols) demonstrate modest efficacy with good tolerability. Bright light therapy may enhance neuromodulation outcomes. Optimal protocols and positioning in treatment pathways are not well established.
- Anti-inflammatory and Other Agents: Preliminary findings suggest potential adjunctive roles for minocycline, NSAIDs, statins, omega-3 fatty acids, and a buprenorphine-samidorphan combination. However, larger, high-quality trials are needed to confirm their efficacy and safety profiles.
- Onabotulinumtoxin A: A single glabellar injection may confer antidepressant effects, but the underlying mechanism and durability are unclear. Methodological issues, including difficulties with blinding, limit strong recommendations.
- More Invasive Interventions (DBS, MST): Deep brain stimulation (DBS) and magnetic seizure therapy (MST) are invasive approaches supported by limited evidence, restricting their use to highly refractory cases. The balance of benefit, risk, and resource intensity remains uncertain.
Conclusion: Although these emerging treatments offer potential avenues beyond traditional antidepressants, most remain investigational. Key challenges include limited comparative data, uncertain long-term outcomes, and scaling difficulties. Further rigorous research, including head-to-head trials, long-term follow-ups, and clarity regarding optimal psychotherapeutic support, is required. As evidence matures, these novel interventions may become more integrated into standard care, potentially improving outcomes for patients with difficult-to-treat MDD.
Review: Nonsurgical Management of Chronic Venous Insufficiency
19 Dec, 2024 | 16:45h | UTC
Introduction: This summary highlights key points from a recent review on the nonsurgical management of chronic venous insufficiency, a condition characterized by persistent venous hypertension leading to edema, skin changes, and venous ulcers. Chronic venous insufficiency is influenced by both structural factors (e.g., venous reflux, obstruction) and functional elements (e.g., obesity, impaired calf-muscle pump). While interventional procedures may improve symptoms in patients with significant structural abnormalities, most cases require comprehensive nonsurgical strategies targeting venous hypertension and improving quality of life.
Key Recommendations:
- Comprehensive Assessment: Distinguish between structural and functional components of venous disease. Structural issues may warrant endovenous procedures, whereas functional insufficiency (e.g., due to obesity, weak calf muscles) requires behavioral and medical interventions.
- Compression Therapy (Class 1A for Venous Ulcers): Use tailored compression stockings or wraps to reduce venous pressure, alleviate swelling, and aid ulcer healing. Compression levels above 30 mm Hg can facilitate healing, but lower levels (20–30 mm Hg) may improve adherence.
- Lifestyle Modifications: Implement weight reduction measures in obese patients to lower central venous pressure and improve venous return. Consider evaluating and managing obstructive sleep apnea or cardiac dysfunction that may elevate venous pressure.
- Exercise and Leg Elevation: Encourage exercises that strengthen calf and foot muscles, thereby enhancing the venous pump function and reducing stasis. Advise regular leg elevation to alleviate edema and discomfort.
- Medication Review: Assess current medications (e.g., calcium-channel blockers, gabapentinoids) that may cause edema and consider alternatives. Avoid unnecessary diuretics unless true volume overload is confirmed.
- Venous Interventions for Structural Lesions (Class 1B for Varicose Veins): In patients with symptomatic varicose veins and axial reflux, procedural interventions (e.g., endovenous ablation, sclerotherapy, or surgical stripping) can be more effective than long-term compression alone. Early intervention may expedite ulcer healing in selected cases.
- Cautious Use of Venoactive Agents: Although certain supplements (e.g., flavonoids, horse chestnut) are widely available, current guidelines provide only weak recommendations, with limited evidence for clinically meaningful outcomes.
Conclusion: Nonsurgical management of chronic venous insufficiency emphasizes reducing venous hypertension, improving calf muscle pump function, and addressing central factors such as obesity and cardiac conditions. By combining compression therapy, exercise, weight reduction, and appropriate medication adjustments, clinicians can alleviate symptoms, enhance patient comfort, and potentially improve wound healing. Procedural interventions remain essential adjuncts for selected structural abnormalities, but long-term functional management is key to sustained clinical benefit.
Review: Management of Atrial Fibrillation
18 Dec, 2024 | 14:22h | UTC
Introduction: This summary of a comprehensive review on atrial fibrillation (AF) focuses on an increasingly prevalent arrhythmia affecting more than 10 million adults in the United States. AF significantly elevates the risks of stroke, heart failure (HF), cognitive decline, and mortality. This guideline-based overview examines the pathophysiology, detection, prevention, and treatment strategies for AF, emphasizing risk factor modification, appropriate anticoagulation, and early rhythm control interventions to improve clinical outcomes and quality of life.
Key Recommendations:
- Risk Factor and Lifestyle Modification: Implement weight reduction, regular exercise, optimal blood pressure control, smoking cessation, and reduced alcohol intake at all AF stages to prevent new-onset AF, reduce recurrences, and mitigate complications.
- Screening and Diagnosis: Consider AF screening in high-risk patients using wearable devices or implantable loop recorders. Confirm suspected AF with electrocardiography and extended rhythm monitoring in those with cryptogenic stroke.
- Stroke Prevention: Assess stroke risk using the CHA2DS2-VASc score (a brief scoring sketch follows this list). For patients with annual stroke risk ≥2%, initiate oral anticoagulation (preferably direct oral anticoagulants over warfarin) to lower stroke risk by up to 80%. Avoid aspirin monotherapy for AF-related stroke prevention due to inferior efficacy.
- Early Rhythm Control: Begin rhythm control within one year of AF diagnosis, particularly in symptomatic patients or those with HF and reduced ejection fraction (HFrEF). Early use of antiarrhythmic drugs or catheter ablation can improve symptoms and cardiac function and reduce hospitalizations.
- Catheter Ablation: Utilize ablation as a first-line therapy in symptomatic paroxysmal AF to maintain sinus rhythm and prevent progression. In patients with AF and HFrEF, ablation enhances quality of life, improves left ventricular function, and lowers mortality and HF hospitalization rates.
- Rate Control Strategies: For patients who are not candidates for rhythm control, use beta-blockers or nondihydropyridine calcium channel blockers to achieve satisfactory ventricular rate control. Consider atrioventricular nodal ablation plus pacemaker implantation if pharmacologic therapy is inadequate.
- Staging and Long-Term Management: Recognize four AF stages (at risk, pre-AF, clinically apparent AF, and permanent AF) to tailor management. After ablation, continue anticoagulation for at least three months, then reassess stroke risk before considering discontinuation.
- Addressing Inequities: Improve access to guideline-directed AF therapies, including ablation and specialized care, and address social determinants of health that influence disparities in diagnosis, treatment, and outcomes.
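For readers less familiar with the scoring referenced above, the sketch below restates the standard CHA2DS2-VASc components in Python. It is an illustrative addition (the function name and structure are assumptions, not taken from the review), and translating a score into an estimated annual stroke risk should rely on published risk tables rather than this code.

```python
def cha2ds2_vasc(age, female, chf, hypertension, diabetes,
                 prior_stroke_tia, vascular_disease):
    """Sum the standard CHA2DS2-VASc components (illustrative sketch only)."""
    score = 0
    score += 1 if chf else 0                              # C: congestive heart failure / LV dysfunction
    score += 1 if hypertension else 0                     # H: hypertension
    score += 2 if age >= 75 else (1 if age >= 65 else 0)  # A2 / A: age >=75 (2 pts) or 65-74 (1 pt)
    score += 1 if diabetes else 0                         # D: diabetes mellitus
    score += 2 if prior_stroke_tia else 0                 # S2: prior stroke, TIA, or thromboembolism
    score += 1 if vascular_disease else 0                 # V: prior MI, peripheral artery disease, or aortic plaque
    score += 1 if female else 0                           # Sc: sex category (female)
    return score


# Example: a 72-year-old woman with hypertension and diabetes scores 4,
# a level at which oral anticoagulation is generally discussed.
print(cha2ds2_vasc(age=72, female=True, chf=False, hypertension=True,
                   diabetes=True, prior_stroke_tia=False, vascular_disease=False))
```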
Conclusion: Guideline-directed AF management, encompassing comprehensive risk factor modification, appropriate anticoagulation, and timely rhythm control strategies, can reduce stroke incidence, improve HF outcomes, and prolong life. Catheter ablation is a key intervention for appropriate patients, especially those with symptomatic paroxysmal AF or HFrEF, while striving for equitable and evidence-based care across diverse populations remains a critical priority.
RCT: Liberal vs Restrictive Transfusion Yields No Neurologic Outcome Benefit in Aneurysmal Subarachnoid Hemorrhage
16 Dec, 2024 | 11:26h | UTC
Background: Aneurysmal subarachnoid hemorrhage (SAH) is a critical neurologic condition associated with high morbidity and mortality. Anemia is common in this setting and may worsen cerebral oxygenation and outcomes. However, the impact of a liberal transfusion threshold compared with a restrictive approach on long-term neurologic outcomes has been uncertain.
Objective: To determine whether a liberal red blood cell transfusion strategy (transfusion at hemoglobin ≤10 g/dL) improves 12-month neurologic outcomes compared with a restrictive strategy (transfusion at hemoglobin ≤8 g/dL) in patients with aneurysmal SAH and anemia.
Methods: This was a multicenter, pragmatic, open-label, randomized controlled trial conducted at 23 specialized neurocritical care centers. Critically ill adults with a first-ever aneurysmal SAH and hemoglobin ≤10 g/dL within 10 days of admission were randomized to a liberal or restrictive transfusion strategy. The primary outcome was unfavorable neurologic outcome at 12 months, defined as a modified Rankin scale score ≥4. Secondary outcomes included the Functional Independence Measure (FIM), quality of life assessments, and imaging-based outcomes such as vasospasm and cerebral infarction. Outcome assessors were blinded to group allocation.
Results: Among 742 randomized patients, 725 were analyzed for the primary outcome. At 12 months, unfavorable neurologic outcome occurred in 33.5% of patients in the liberal group and 37.7% in the restrictive group (risk ratio 0.88; 95% CI, 0.72–1.09; p=0.22). There were no clinically meaningful differences in secondary outcomes. Mortality at 12 months was similar (approximately 27% in both arms). Radiographic vasospasm was detected more often in the restrictive group, but this difference did not translate into better functional outcomes in the liberal arm. Adverse events and transfusion reactions were comparable between groups.
Conclusions: In patients with aneurysmal SAH and anemia, a liberal transfusion strategy did not lead to a significantly lower risk of unfavorable neurologic outcome at 12 months compared with a restrictive approach.
Implications for Practice: These findings suggest that routinely maintaining higher hemoglobin levels does not confer substantial long-term functional benefit. Clinicians may consider a more restrictive threshold (≤8 g/dL) to minimize unnecessary transfusions without compromising outcomes. Some skepticism toward adopting a more liberal transfusion policy is warranted given the lack of demonstrable benefit.
Study Strengths and Limitations: Strengths include the randomized, multicenter design, blinded outcome assessment, and a 12-month follow-up. Limitations include potential unmeasured subtle benefits, the inability to blind clinical teams, and the challenge of capturing all aspects of functional recovery with current measurement tools. Further research may clarify if more tailored transfusion strategies can yield modest but meaningful improvements.
Future Research: Future studies should evaluate intermediate hemoglobin thresholds, develop more sensitive measures of functional and cognitive recovery, and consider individualized transfusion strategies based on specific patient factors and biomarkers of cerebral ischemia.
Guidelines for the Management of Hyperglycemic Crises in Adult Patients with Diabetes
15 Dec, 2024 | 13:18h | UTC
Introduction: Diabetic ketoacidosis (DKA) and hyperglycemic hyperosmolar state (HHS) are critical, acute complications of type 1 and type 2 diabetes. Recent data show a global rise in DKA and HHS admissions, driven by factors such as psychosocial challenges, suboptimal insulin use, infection, and certain medications (e.g., SGLT2 inhibitors). This consensus report, developed by leading diabetes organizations (ADA, EASD, JBDS, AACE, DTS), provides updated recommendations on epidemiology, pathophysiology, diagnosis, treatment, and prevention of DKA and HHS in adults, aiming to guide clinical practice and improve outcomes.
Key Recommendations:
- Diagnosis and Classification:
- DKA is defined by hyperglycemia (>11.1 mmol/l [200 mg/dl] or known diabetes), elevated ketone levels (β-hydroxybutyrate ≥3.0 mmol/l), and metabolic acidosis (pH <7.3 or bicarbonate <18 mmol/l); these cut-offs are restated as a brief code sketch after this recommendations list.
- HHS is characterized by marked hyperglycemia, severe hyperosmolality (>320 mOsm/kg), significant dehydration, and minimal ketonemia or acidosis.
- Consider euglycemic DKA, especially with SGLT2 inhibitor use.
- Classify DKA severity (mild, moderate, severe) to guide the setting of care.
- Fluid and Electrolyte Management:
- Initiate isotonic or balanced crystalloid solutions to restore intravascular volume, enhance renal perfusion, and reduce hyperglycemia.
- Adjust fluids based on hydration, sodium levels, and glucose trends.
- Add dextrose when glucose falls below ~13.9 mmol/l (250 mg/dl) to allow ongoing insulin therapy until ketoacidosis resolves.
- Carefully monitor potassium and provide adequate replacement to prevent severe hypokalemia.
- Insulin Therapy:
- Start a continuous intravenous infusion of short-acting insulin as soon as feasible after confirming adequate potassium.
- For mild or moderate DKA, subcutaneous rapid-acting insulin analogs may be used under close supervision.
- Continue insulin until DKA resolves (pH ≥7.3, bicarbonate ≥18 mmol/l, β-hydroxybutyrate <0.6 mmol/l) or HHS improves (osmolality <300 mOsm/kg, improved mental status).
- Give subcutaneous basal insulin 1–2 hours before discontinuing the intravenous infusion so the two overlap, preventing rebound hyperglycemia.
- Additional Considerations:
- Avoid routine bicarbonate; use only if pH <7.0.
- Phosphate supplementation is not routinely recommended unless levels are severely low.
- Identify and treat underlying precipitating causes (infection, psychological factors, medication-related triggers).
- Address social determinants of health and mental health conditions to reduce recurrence.
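As a quick reference, the sketch below simply restates the numeric DKA diagnosis and resolution thresholds quoted above as boolean checks in Python. The function and variable names are assumptions introduced for illustration; the sketch omits HHS criteria, severity grading, and all clinical context, and is not a decision tool.

```python
def meets_dka_criteria(glucose_mmol_l, known_diabetes,
                       beta_hydroxybutyrate_mmol_l, ph, bicarbonate_mmol_l):
    """Apply the DKA thresholds quoted above (illustrative restatement only)."""
    hyperglycemia = glucose_mmol_l > 11.1 or known_diabetes   # >200 mg/dl, or known diabetes
    ketosis = beta_hydroxybutyrate_mmol_l >= 3.0
    acidosis = ph < 7.3 or bicarbonate_mmol_l < 18
    return hyperglycemia and ketosis and acidosis


def dka_resolved(ph, bicarbonate_mmol_l, beta_hydroxybutyrate_mmol_l):
    """Resolution targets quoted in the insulin-therapy recommendations above."""
    return ph >= 7.3 and bicarbonate_mmol_l >= 18 and beta_hydroxybutyrate_mmol_l < 0.6


# Example: glucose 22 mmol/l, ketones 4.5 mmol/l, pH 7.21, bicarbonate 14 mmol/l
print(meets_dka_criteria(22.0, True, 4.5, 7.21, 14.0))  # True: diagnostic criteria met
print(dka_resolved(7.35, 20.0, 0.4))                    # True: resolution targets reached
```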
Conclusion: By implementing these evidence-based recommendations—early diagnosis, structured fluid and insulin therapy, careful electrolyte management, and addressing precipitating factors—clinicians can improve patient care, reduce morbidity and mortality, and enhance the quality of life for adults experiencing DKA and HHS.
Retrospective Cohort Study: As-Needed Blood Pressure Medications Associated With Increased AKI and Other Adverse Outcomes in Hospitalized Veterans
8 Dec, 2024 | 21:34h | UTC
Background: Inpatient asymptomatic blood pressure (BP) elevations are common, and clinicians frequently use as-needed BP medications to rapidly lower BP values. However, there is limited evidence supporting this practice, and abrupt BP reductions may increase the risk of ischemic events, including acute kidney injury (AKI).
Objective: To examine whether as-needed BP medication use during hospitalization is associated with increased risk of AKI and other adverse outcomes compared to no as-needed use.
Methods: This retrospective cohort study used a target trial emulation and propensity score matching. Adults hospitalized for ≥3 days in non-ICU VA hospital wards from 2015 to 2020 who received at least one scheduled BP medication within the first 24 hours and had at least one systolic BP reading >140 mm Hg were included. Patients were categorized into two groups: those receiving at least one as-needed BP medication (oral or IV) and those receiving only scheduled BP medications. The primary outcome was time-to-first AKI event. Secondary outcomes included a >25% drop in systolic BP within 3 hours and a composite of myocardial infarction (MI), stroke, or death.
Results: Among 133,760 veterans (mean age 71.2 years; 96% male), 21% received as-needed BP medications. As-needed BP medication use was associated with a 23% higher risk of AKI (HR=1.23; 95% CI, 1.18-1.29). The IV route showed a particularly pronounced AKI risk (HR=1.64). Secondary analyses indicated a 1.5-fold increased risk of rapid BP reduction and a 1.69-fold higher rate of the composite outcome (MI, stroke, death) among as-needed users.
Conclusions: In a large, national cohort of hospitalized veterans, as-needed BP medication use was associated with increased AKI risk and other adverse outcomes. These findings suggest that routine as-needed BP medication use for asymptomatic BP elevations may be harmful.
Implications for Practice: Clinicians should carefully reconsider the use of as-needed BP medications in the inpatient setting, especially in older individuals or those with significant cardiovascular risk. Given the lack of clear benefit and potential for harm, greater caution and potentially more conservative approaches are warranted.
Study Strengths and Limitations: Strengths include a large, nationally representative sample and robust analytic methods. Limitations include the retrospective design, potential residual confounding, and limited generalizability to non-veteran or surgical populations. While causal inferences cannot be made, the findings strongly support the need to question current practice.
Future Research: Prospective, randomized trials are needed to determine the optimal management of asymptomatic inpatient hypertension and to assess whether avoiding or reducing as-needed BP medication use improves clinical outcomes.
RCT: FFR-Guided PCI Plus TAVI Is Non-inferior and Superior to SAVR Plus CABG in Patients With Severe Aortic Stenosis and Complex Coronary Disease
8 Dec, 2024 | 21:22h | UTC
Background: Patients with severe aortic stenosis frequently present with concomitant complex coronary artery disease. Current guidelines recommend combined surgical aortic valve replacement (SAVR) and coronary artery bypass grafting (CABG) as first-line therapy. However, transcatheter aortic valve implantation (TAVI) and fractional flow reserve (FFR)-guided percutaneous coronary intervention (PCI) have emerged as alternative treatments. How this combined percutaneous strategy compares with SAVR plus CABG had not been established in a randomized trial.
Objective: To determine whether FFR-guided PCI plus TAVI is non-inferior and, if demonstrated, superior to SAVR plus CABG in patients with severe aortic stenosis and complex or multivessel coronary disease.
Methods: This international, multicenter, prospective, open-label, non-inferiority randomized controlled trial included patients aged ≥70 years with severe aortic stenosis and complex coronary disease who were deemed suitable for either percutaneous or surgical treatment by a Heart Team. Participants were randomized (1:1) to FFR-guided PCI plus TAVI or SAVR plus CABG. The primary endpoint was a composite of all-cause mortality, myocardial infarction, disabling stroke, clinically driven target-vessel revascularization, valve reintervention, and life-threatening or disabling bleeding at 1 year.
Results: Among 172 enrolled patients, 91 were assigned to FFR-guided PCI plus TAVI and 81 to SAVR plus CABG. At 1 year, the primary endpoint occurred in 4% of patients in the PCI/TAVI group versus 23% in the SAVR/CABG group (risk difference –18.5%; 90% CI –27.8 to –9.7; p<0.001 for non-inferiority; p<0.001 for superiority). The difference was driven mainly by lower all-cause mortality (0% vs 10%, p=0.0025) and reduced life-threatening bleeding (2% vs 12%, p=0.010).
Conclusions: In patients with severe aortic stenosis and complex coronary artery disease, FFR-guided PCI plus TAVI was non-inferior and in fact superior to SAVR plus CABG at 1 year, predominantly due to lower mortality and serious bleeding events.
Implications for Practice: These findings suggest that a percutaneous strategy may be a viable and potentially preferable alternative to surgery in selected patients. Nevertheless, given this is the first trial of its kind, cautious interpretation is advised, and routine adoption should await further corroboration.
Study Strengths and Limitations: Strengths include a randomized, multicenter design and standardized endpoint assessment. Limitations involve early trial termination resulting in a smaller sample size and the use of a single TAVI device type, limiting generalizability.
Future Research: Larger trials with longer follow-up, evaluation of other TAVI prostheses, and broader patient populations are needed to validate these findings and determine the optimal patient selection criteria.
Prospective Cohort: Combined CRP, LDL Cholesterol, and Lipoprotein(a) Levels Predict 30-Year Cardiovascular Risk in Women
8 Dec, 2024 | 20:58h | UTC
RCT: Transcatheter Edge-to-edge Repair Improves Outcomes in Severe Tricuspid Regurgitation
29 Nov, 2024 | 12:37h | UTC
Background: Severe tricuspid regurgitation (TR) is linked to poor quality of life and increased mortality. Traditional medical therapy offers limited symptom relief, and surgical options carry high risks. Transcatheter tricuspid valve therapies such as transcatheter edge-to-edge repair (T-TEER) have emerged as less invasive alternatives, but their impact on patient outcomes needs further exploration.
Objective: To determine if T-TEER combined with optimized medical therapy (OMT) enhances patient-reported outcomes and clinical events compared to OMT alone in patients with severe, symptomatic TR.
Methods: In this multicenter, prospective, randomized (1:1) trial, 300 adults with severe, symptomatic TR despite stable OMT were enrolled from 24 centers in France and Belgium between March 2021 and March 2023. Participants were randomized to receive either T-TEER plus OMT or OMT alone. The primary outcome was a composite clinical endpoint at 1 year, including changes in New York Heart Association (NYHA) class, patient global assessment (PGA), or occurrence of major cardiovascular events. Secondary outcomes encompassed TR severity, Kansas City Cardiomyopathy Questionnaire (KCCQ) score, and a composite of death, tricuspid valve surgery, KCCQ improvement, or hospitalization for heart failure.
Results: At 1 year, 74.1% of patients in the T-TEER plus OMT group improved in the composite endpoint versus 40.6% in the OMT-alone group (P < .001). Massive or torrential TR persisted in 6.8% of the T-TEER group compared to 53.5% of the OMT group (P < .001). The mean KCCQ score was higher in the T-TEER group (69.9 vs 55.4; P < .001). The win ratio for the composite secondary outcome was 2.06 (95% CI, 1.38-3.08; P < .001). No significant differences were observed in major cardiovascular events or cardiovascular death between groups.
Conclusions: Adding T-TEER to OMT significantly reduces TR severity and improves patient-reported outcomes at 1 year in patients with severe, symptomatic TR, without increasing adverse events.
Implications for Practice: T-TEER may offer a valuable addition to OMT for selected patients with severe TR, enhancing symptoms and quality of life. However, the absence of significant differences in hard clinical endpoints and the open-label design suggest cautious interpretation. Clinicians should weigh the benefits against potential biases in patient-reported outcomes.
Study Strengths and Limitations: Strengths include the randomized design and multicenter participation, enhancing the study’s validity. Limitations involve the open-label design without a sham control, potentially introducing bias in subjective outcomes. The short follow-up period and selective patient population based on anatomical suitability for T-TEER may limit generalizability.
Future Research: Longer-term studies are necessary to assess T-TEER’s impact on survival and heart failure hospitalization. Comparative studies of different transcatheter devices and investigations into optimal patient selection criteria are also recommended.
RCT: Adjunctive Middle Meningeal Artery Embolization Reduces Reoperation in Subdural Hematoma
24 Nov, 2024 | 13:53h | UTC
Background: Subacute and chronic subdural hematomas are common neurosurgical conditions with a high recurrence rate after surgical evacuation, affecting 8% to 20% of patients. Middle meningeal artery embolization (MMAE) is a minimally invasive procedure targeting the blood supply to the hematoma membranes. Preliminary studies suggest that adjunctive MMAE may reduce hematoma recurrence, but its impact on reoperation risk remains unclear.
Objective: To determine whether adjunctive MMAE reduces the risk of hematoma recurrence or progression leading to repeat surgery within 90 days compared to surgery alone in patients with symptomatic subacute or chronic subdural hematoma.
Methods: In this prospective, multicenter, randomized controlled trial, 400 patients aged 18 to 90 years with symptomatic subacute or chronic subdural hematoma requiring surgical evacuation were randomly assigned to receive either MMAE plus surgery (n=197) or surgery alone (n=203). The primary endpoint was hematoma recurrence or progression leading to repeat surgery within 90 days after the index treatment. The secondary endpoint was deterioration of neurologic function at 90 days, assessed using the modified Rankin Scale.
Results: Hematoma recurrence or progression requiring repeat surgery occurred in 8 patients (4.1%) in the MMAE plus surgery group versus 23 patients (11.3%) in the surgery-alone group (relative risk, 0.36; 95% CI, 0.11 to 0.80; P=0.008). Functional deterioration at 90 days was similar between groups (11.9% vs. 9.8%; risk difference, 2.1 percentage points; 95% CI, −4.8 to 8.9). Mortality at 90 days was 5.1% in the MMAE group and 3.0% in the control group. Serious adverse events related to the embolization occurred in 4 patients (2.0%), including disabling stroke in 2 patients.
Conclusions: Adjunctive MMAE combined with surgery significantly reduced the risk of hematoma recurrence or progression requiring reoperation within 90 days compared to surgery alone. However, there was no significant difference in neurologic functional deterioration, and the procedure was associated with procedural risks.
Implications for Practice: MMAE may be considered as an adjunct to surgical evacuation in patients with subacute or chronic subdural hematoma to reduce reoperation risk. Clinicians should carefully weigh the potential benefits against the risks of procedural complications, including stroke.
Study Strengths and Limitations: Strengths include the randomized controlled design and multicenter approach, enhancing generalizability. Limitations involve the open-label design, introducing potential bias since the primary endpoint was based on surgeon judgment. A substantial loss to follow-up (13.2%) could affect results, and the study was not powered to detect differences in mortality or serious adverse events.
Future Research: Further studies with larger sample sizes are needed to fully evaluate the safety and efficacy of MMAE, including long-term outcomes. Research should focus on optimizing patient selection and assessing the procedure’s impact on mortality and serious adverse events.
Cohort Study: Oral Hormone Therapy and Tibolone Increase Cardiovascular Risk in Menopausal Women
28 Nov, 2024 | 18:42h | UTC
Background: Cardiovascular disease is the leading cause of mortality worldwide, with incidence in women increasing notably during the menopausal transition. Menopausal hormone therapy (MHT) effectively alleviates menopausal symptoms but has been associated with cardiovascular risks in previous studies. The impact of contemporary MHT formulations and administration routes on cardiovascular disease risk in women aged 50–58 remains unclear.
Objective: To assess the effect of different types of contemporary MHT on the risk of cardiovascular disease, focusing on various hormone combinations and administration methods.
Methods: This nationwide register-based emulated target trial included 919,614 Swedish women aged 50–58 years between 2007 and 2020 who had not used MHT in the previous two years. Participants were assigned to one of eight treatment groups—including oral and transdermal therapies—or to a non-initiator group. The primary outcomes were hazard ratios (HRs) for venous thromboembolism (VTE), ischemic heart disease (IHD), cerebral infarction, and myocardial infarction, analyzed separately and as a composite cardiovascular disease outcome.
Results: Among the participants, 77,512 were MHT initiators and 842,102 were non-initiators. During follow-up, 24,089 cardiovascular events occurred. In intention-to-treat analyses, tibolone was associated with an increased risk of cardiovascular disease (HR 1.52, 95% CI 1.11 to 2.08) compared with non-initiators. Initiation of tibolone or oral estrogen-progestin therapy was linked to a higher risk of IHD (HRs 1.46 and 1.21, respectively). A higher risk of VTE was observed with oral continuous estrogen-progestin therapy (HR 1.61), sequential therapy (HR 2.00), and estrogen-only therapy (HR 1.57). Per protocol analyses showed that tibolone use was associated with increased risks of cerebral infarction (HR 1.97) and myocardial infarction (HR 1.94).
Conclusions: Use of oral estrogen-progestin therapy was associated with increased risks of IHD and VTE, while tibolone was linked to higher risks of IHD, cerebral infarction, and myocardial infarction but not VTE. These findings underscore the varying cardiovascular risks associated with different MHT types and administration methods.
Implications for Practice: Clinicians should exercise caution when prescribing oral estrogen-progestin therapy or tibolone for menopausal symptom relief, considering the elevated cardiovascular risks. Alternative MHT options, such as transdermal therapies, may offer a safer profile and should be considered.
Study Strengths and Limitations: Strengths include the large, nationwide cohort and the emulated target trial design, which reduces selection bias and confounding. Limitations involve the lack of data on menopausal status, smoking, and body mass index, which may affect cardiovascular risk. Potential misclassification of exposure and adherence could also impact results.
Future Research: Further studies should investigate the cardiovascular effects of specific progestins within MHT formulations and explore the impact of different doses and durations of therapy.
Review: Acute Respiratory Distress Syndrome
28 Nov, 2024 | 13:06h | UTC
Introduction: Acute respiratory distress syndrome (ARDS) is a severe inflammatory lung condition characterized by diffuse alveolar damage, leading to hypoxemia and respiratory failure. Since its initial description in 1967, the understanding and definition of ARDS have significantly evolved, integrating advances in basic science and clinical practice. A newly recommended global definition expands diagnostic criteria to enhance early recognition and management, especially in resource-limited settings. This review summarizes current insights into the epidemiology, pathophysiology, and evidence-based management of ARDS, highlighting key updates and future research priorities.
Key Recommendations:
- New Global Definition of ARDS: Adoption of an expanded definition that includes patients receiving high-flow nasal oxygen (HFNO) support and allows diagnosis using pulse oximetry and thoracic ultrasonography. This makes ARDS identification feasible in diverse clinical environments, including those with limited resources.
- Established Critical Care Interventions: Emphasis on early implementation of proven strategies such as low tidal volume ventilation (6 mL/kg predicted body weight; a short calculation sketch follows this list) with plateau pressures ≤30 cm H₂O, prone positioning for patients with PaO₂/FiO₂ <150 mm Hg, and conservative fluid management after initial resuscitation. These interventions have consistently reduced mortality and are recommended as standard care.
- Personalized Approaches and Phenotyping: Recognition of the heterogeneity in ARDS pathophysiology underscores the need for personalized treatment strategies. Identification of hyper-inflammatory and hypo-inflammatory phenotypes may guide targeted therapies and improve outcomes, although prospective validation is required.
- Impact of COVID-19 on ARDS: Acknowledgment of the significant increase in ARDS incidence due to the COVID-19 pandemic. While COVID-19 ARDS shares similarities with traditional ARDS, notable differences in endothelial dysfunction and immune response highlight the necessity for tailored management approaches in these patients.
- Pharmacologic Interventions: Updated guidelines provide conditional recommendations for the use of corticosteroids in ARDS, particularly in early moderate to severe cases. Ongoing research into pharmacologic agents such as statins, mesenchymal stromal cells, and other cell-based therapies shows potential but requires further clinical trials to establish efficacy.
- Future Research Priorities: Identification of key areas for investigation, including the long-term sequelae of ARDS, optimization of non-invasive and invasive ventilation strategies, exploration of genetic and environmental risk factors, and development of rapid biomarker assays for real-time phenotyping and targeted therapy.
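Because the 6 mL/kg target above refers to predicted rather than actual body weight, the sketch below illustrates the arithmetic using a commonly cited sex- and height-based predicted body weight formula (the ARDSNet formula). The formula and function names are assumptions added here for illustration; the review itself does not spell out the calculation.

```python
def predicted_body_weight_kg(height_cm, male):
    """Commonly used ARDSNet predicted body weight formula (assumed; not quoted in the review)."""
    base = 50.0 if male else 45.5
    return base + 0.91 * (height_cm - 152.4)


def target_tidal_volume_ml(height_cm, male, ml_per_kg=6.0):
    """Tidal volume target of ~6 mL/kg predicted body weight, as recommended above."""
    return ml_per_kg * predicted_body_weight_kg(height_cm, male)


# Example: a 170 cm man has a predicted body weight of ~66 kg,
# giving an initial tidal volume target of roughly 400 mL.
print(round(target_tidal_volume_ml(170, male=True)))  # ~396
```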
Conclusion: The evolving definition and understanding of ARDS aim to improve early detection and standardization of care across various clinical settings. Reinforcing established critical care interventions while advancing personalized and novel therapeutic approaches holds promise for reducing mortality and enhancing long-term patient outcomes. Continuous research into the pathophysiology and management of ARDS, enriched by insights from the COVID-19 pandemic, is essential to address ongoing challenges and improve patient care.