Tuesday, 13 August 2019

At-Risk Drinking Is Independently Associated With Acute Kidney Injury in Critically Ill Patients



by Gacouin, Arnaud; Lesouhaitier, Mathieu; Frerou, Aurelien; Painvin, Benoit; Reizine, Florian; Rafi, Sonia; Maamar, Adel; Le Tulzo, Yves; Tadié, Jean Marc


Objectives: Unhealthy use of alcohol and acute kidney injury are major public health problems, but little is known about the impact of excessive alcohol consumption on kidney function in critically ill patients. We aimed to determine whether at-risk drinking is independently associated with acute kidney injury in the ICU and at ICU discharge.
Design: Prospective observational cohort study. Setting: A 21-bed polyvalent ICU in a university hospital. Patients: A total of 1,107 adult patients admitted over a 30-month period who had an ICU stay of greater than or equal to 3 days and in whom alcohol consumption could be assessed. Interventions: None.
Measurements and Main Results: We assessed Kidney Disease Improving Global Outcomes stages 2–3 acute kidney injury in 320 at-risk drinkers (29%) and 787 non–at-risk drinkers (71%) at admission to the ICU, within 4 days after admission, and at ICU discharge. The proportion of patients with stages 2–3 acute kidney injury at admission to the ICU was significantly higher in at-risk drinkers than in non–at-risk drinkers (42.5% vs 18%; p < 0.0001). Within 4 days, after adjustment for susceptibility and predisposing factors for acute kidney injury, at-risk drinking was significantly associated with acute kidney injury in the entire population (odds ratio, 2.15; 1.60–2.89; p < 0.0001), in the subgroup of 832 patients without stages 2–3 acute kidney injury at admission to the ICU (odds ratio, 1.44; 1.02–2.02; p = 0.04), and in the subgroup of 971 patients without known chronic kidney disease (odds ratio, 1.92; 1.41–2.61; p < 0.0001). Among survivors, 22% of at-risk drinkers and 9% of non–at-risk drinkers were discharged with stages 2–3 acute kidney injury (p < 0.001).
Conclusions: Our results suggest that chronic and current alcohol misuse in critically ill patients is associated with kidney dysfunction. The systematic and accurate identification of patients with alcohol misuse may allow for the prevention of acute kidney injury.

The end-expiratory occlusion test: please, let me hold your breath!



by Francesco Gavelli, Jean-Louis Teboul and Xavier Monnet 

Critical Care:  volume 23, Article number: 274 (2019) 

Introduction
Fluids must be considered as drugs, with serious adverse effects and inconsistent efficacy. They should therefore be administered only if there is a reasonable chance that cardiac output (CO) will increase in response. Many tests and indices detecting “fluid responsiveness” have been developed for this purpose.
With some of these tests, the relationship between CO and cardiac preload is assessed through the haemodynamic effects of mechanical ventilation. This is the case for the end-expiratory occlusion (EEO) test, which has already been investigated in a reasonable number of studies [1,2,3,4,5,6,7,8,9,10,11,12,13]. In this commentary, we explore its haemodynamic effects, review the literature validating it and describe its practical modalities.

Abdominal functional electrical stimulation to assist ventilator weaning in critical illness: a double-blinded, randomised, sham-controlled pilot study



by Euan J. McCaughey, Annemijn H. Jonkman, Claire L. Boswell-Ruys, Rachel A. McBain, Elizabeth A. Bye, Anna L. Hudson, David W. Collins, Leo M. A. Heunks, Angus J. McLachlan, Simon C. Gandevia and Jane E. Butler

Critical Care:  volume 23, Article number: 261 (2019)

Background:
For every day a person is dependent on mechanical ventilation, respiratory and cardiac complications increase, quality of life decreases and costs increase by > $USD 1500. Interventions that improve respiratory muscle function during mechanical ventilation can reduce ventilation duration. The aim of this pilot study was to assess the feasibility of employing an abdominal functional electrical stimulation (abdominal FES) training program with critically ill mechanically ventilated patients. We also investigated the effect of abdominal FES on respiratory muscle atrophy, mechanical ventilation duration and intensive care unit (ICU) length of stay.

Methods:
Twenty critically ill mechanically ventilated participants were recruited over a 6-month period from one metropolitan teaching hospital. They were randomly assigned to receive active or sham (control) abdominal FES for 30 min, twice per day, 5 days per week, until ICU discharge. Feasibility was assessed through participant compliance with stimulation sessions. Abdominal and diaphragm muscle thickness was measured by a blinded assessor using ultrasound three times in the first week and weekly thereafter. Respiratory function was recorded when the participant could first breathe independently and at ICU discharge; ventilation duration and ICU length of stay were also recorded at ICU discharge by a blinded assessor.

Results:
Fourteen of 20 participants survived to ICU discharge (8, intervention; 6, control). One control was transferred before extubation, one withdrew consent, and one was withdrawn for staff safety after extubation. Median compliance with stimulation sessions was 92.1% (IQR 5.77%) in the intervention group and 97.2% (IQR 7.40%) in the control group (p = 0.384). While this pilot study was not adequately powered for firm statistical conclusions, there appeared to be no between-group differences in the thickness of the rectus abdominis (p = 0.099 at day 3), diaphragm (p = 0.652 at day 3) or combined lateral abdominal muscles (p = 0.074 at day 3). However, ICU length of stay (p = 0.011) and ventilation duration (p = 0.039) appeared to be shorter in the intervention group than in the control group.

Conclusions:
Our compliance rates demonstrate the feasibility of using abdominal FES with critically ill mechanically ventilated patients. While abdominal FES did not lead to differences in abdominal muscle or diaphragm thickness, it may be an effective method to reduce ventilation duration and ICU length of stay in this patient group. A fully powered study into this effect is warranted.

Trial registration:

The Australian New Zealand Clinical Trials Registry, ACTRN12617001180303. Registered 9 August 2017.

Association of sublingual microcirculation parameters and endothelial glycocalyx dimensions in resuscitated sepsis


by Alexandros Rovas, Laura Mareen Seidel, Hans Vink, Timo Pohlkötter, Hermann Pavenstädt, Christian Ertmer, Michael Hessler and Philipp Kümpers

Critical Care: volume 23, Article number: 260 (2019) 

Background:
The endothelial glycocalyx (eGC) covers the luminal surface of the vascular endothelium and plays an important protective role in systemic inflammatory states and particularly in sepsis. Its breakdown leads to capillary leak and organ dysfunction. Moreover, sepsis-induced alterations of sublingual microcirculation are associated with a worse clinical outcome. The present study was performed to investigate the associations between eGC dimensions and established parameters of microcirculation dysfunction in sepsis.

Methods:
This observational, prospective, cross-sectional study included 40 participants: 30 critically ill septic patients recruited from intensive care units of a university hospital and 10 healthy volunteers who served as controls. Established microcirculation parameters were obtained sublingually and analyzed according to current recommendations. In addition, the perfused boundary region (PBR), an inverse parameter of eGC dimensions, was measured sublingually using novel data acquisition and analysis software (GlycoCheck™). Moreover, we exposed living endothelial cells to 5% serum from a subgroup of study participants, and the change (delta) in eGC breakdown, measured with atomic force microscopy (AFM), was correlated with the paired PBR values.

Results:
In septic patients, sublingual microcirculation was impaired, as indicated by a reduced microvascular flow index (MFI) and a reduced proportion of perfused vessels (PPV) compared to those in healthy controls (MFI, 2.93 vs 2.74, p = 0.002; PPV, 98.53 vs 92.58, p = 0.0004). PBR values were significantly higher in septic patients compared to those in healthy controls, indicating damage of the eGC (2.04 vs 2.34, p < 0.0001). The in vitro AFM data correlated exceptionally well with paired PBR values obtained at the bedside (rs = − 0.94, p = 0.02). Both PBR values and microcirculation parameters correlated well with the markers of critical illness. Interestingly, no association was observed between the PBR values and established microcirculation parameters.

Conclusion:
Our findings suggest that eGC damage can occur independently of microcirculatory impairment as measured by classical consensus parameters. Further studies in critically ill patients are needed to unravel the relationship of glycocalyx damage and microvascular impairment, as well as their prognostic and therapeutic importance in sepsis.

Trial registration:

Retrospectively registered: ClinicalTrials.gov, NCT03960307




Intravenous fluid resuscitation is associated with septic endothelial glycocalyx degradation



by Joseph A. Hippensteel, Ryo Uchimido, Patrick D. Tyler, Ryan C. Burke, Xiaorui Han, Fuming Zhang, Sarah A. McMurtry, James F. Colbert, Christopher J. Lindsell, Derek C. Angus, John A. Kellum, Donald M. Yealy, Robert J. Linhardt, Nathan I. Shapiro and Eric P. Schmidt

Critical Care: volume 23, Article number: 259 (2019)

Background:
Intravenous fluids, an essential component of sepsis resuscitation, may paradoxically worsen outcomes by exacerbating endothelial injury. Preclinical models suggest that fluid resuscitation degrades the endothelial glycocalyx, a heparan sulfate-enriched structure necessary for vascular homeostasis. We hypothesized that endothelial glycocalyx degradation is associated with the volume of intravenous fluids administered during early sepsis resuscitation.
Methods:
We used mass spectrometry to measure plasma heparan sulfate (a highly sensitive and specific index of systemic endothelial glycocalyx degradation) after 6 h of intravenous fluids in 56 septic shock patients, at presentation and after 24 h of intravenous fluids in 100 sepsis patients, and in two groups of non-infected patients. We compared plasma heparan sulfate concentrations between sepsis and non-sepsis patients, as well as between sepsis survivors and non-survivors. We used multivariable linear regression to model the association between the volume of intravenous fluids and changes in plasma heparan sulfate.
Results:
Consistent with previous studies, median plasma heparan sulfate was elevated in septic shock patients (118 [IQR, 113–341] ng/ml 6 h after presentation) compared to non-infected controls (61 [45–79] ng/ml), as well as in a second cohort of sepsis patients at emergency department presentation (283 [155–584] ng/ml) compared to controls (177 [144–262] ng/ml). In the larger sepsis cohort, heparan sulfate predicted in-hospital mortality. In both cohorts, multivariable linear regression adjusting for age and severity of illness demonstrated a significant association between the volume of intravenous fluids administered during resuscitation and plasma heparan sulfate. In the second cohort, independent of disease severity and age, each 1 l of intravenous fluids administered was associated with a 200 ng/ml increase in circulating heparan sulfate (p = 0.006) at 24 h after enrollment.
Conclusions:
Glycocalyx degradation occurs in sepsis and septic shock and is associated with in-hospital mortality. The volume of intravenous fluids administered during sepsis resuscitation is independently associated with the degree of glycocalyx degradation. These findings suggest a potential mechanism by which intravenous fluid resuscitation strategies may induce iatrogenic endothelial injury.
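The adjusted linear association reported above (roughly 200 ng/ml of circulating heparan sulfate per litre of fluid, independent of age and illness severity) has the form of a multivariable linear regression. The sketch below is illustrative only: all patient values are synthetic, the covariates are stand-ins, and the slope is seeded from the abstract's figure rather than study data.

```python
import numpy as np

# Illustrative multivariable linear model (synthetic data, not study data):
# heparan sulfate change ~ fluid volume, adjusted for age and illness severity.
rng = np.random.default_rng(0)
n = 100
fluids_l = rng.uniform(0, 6, n)      # litres of IV fluid over 24 h (hypothetical)
age = rng.uniform(30, 85, n)         # years (hypothetical covariate)
severity = rng.uniform(0, 20, n)     # generic illness-severity score (hypothetical)

# Simulate the reported effect size: ~200 ng/ml heparan sulfate per litre.
hs = 200 * fluids_l + 1.5 * age + 5 * severity + rng.normal(0, 50, n)

# Ordinary least squares with an intercept column.
X = np.column_stack([np.ones(n), fluids_l, age, severity])
beta, *_ = np.linalg.lstsq(X, hs, rcond=None)
print(f"adjusted slope: {beta[1]:.0f} ng/ml per litre")
```

The fitted coefficient on fluid volume recovers a value near the simulated 200 ng/ml per litre, which is the sense in which the study reports the association "independent of disease severity and age."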

Noninvasive assessment of airflows by electrical impedance tomography in intubated hypoxemic patients: an exploratory study



by Tommaso Mauri, Elena Spinelli, Francesca Dalla Corte, Eleonora Scotti, Cecilia Turrini, Marta Lazzeri, Laura Alban, Marco Albanese, Donatella Tortolani, Yu-Mei Wang, Savino Spadaro, Jian-Xin Zhou, Antonio Pesenti and Giacomo Grasselli

Annals of Intensive Care: volume 9, Article number: 83 (2019) 

Background:
Noninvasive monitoring of maximal inspiratory and expiratory flows (MIF and MEF, respectively) by electrical impedance tomography (EIT) might enable early recognition of changes in the mechanical properties of the respiratory system due to new conditions or in response to treatments. We aimed to validate EIT-based measures of MIF and MEF against spirometry in intubated hypoxemic patients during controlled ventilation and spontaneous breathing. Moreover, regional distribution of maximal airflows might interact with lung pathology and increase the risk of additional ventilation injury. Thus, we also aimed to describe the effects of mechanical ventilation settings on regional MIF and MEF.
Methods:
We performed a new analysis of data from two prospective, randomized, crossover studies. We included intubated patients admitted to the intensive care unit with acute hypoxemic respiratory failure (AHRF) and acute respiratory distress syndrome (ARDS) undergoing pressure support ventilation (PSV, n = 10) or volume-controlled ventilation (VCV, n = 20). We measured MIF and MEF by spirometry and EIT during six different combinations of ventilation settings: higher vs. lower support during PSV, and higher vs. lower positive end-expiratory pressure (PEEP) during both PSV and VCV. Regional airflows were also assessed by EIT in dependent and non-dependent lung regions.
Results:
MIF and MEF measured by EIT were tightly correlated with those measured by spirometry during all conditions (range of R2 0.629–0.776 and R2 0.606–0.772, respectively, p < 0.05 for all), with clinically acceptable limits of agreement. Higher PEEP significantly improved homogeneity in the regional distribution of MIF and MEF during volume-controlled ventilation, by increasing airflows in the dependent lung regions and lowering them in the non-dependent ones.
Conclusions:
EIT provides accurate noninvasive monitoring of MIF and MEF. The present study also generates the hypothesis that EIT could guide PSV and PEEP settings aimed to increase homogeneity of distending and deflating regional airflows.

Impact of Structured Pathways for Postcardiac Arrest Care: A Systematic Review and Meta-Analysis



by Storm, Christian; Leithner, Christoph; Krannich, Alexander; Suarez, Jose I.; Stevens, Robert D. 


Objectives: Recent research has demonstrated value in selected therapeutic and prognostic interventions delivered to patients following cardiac arrest. The aim of this work was to determine if the implementation of a structured care pathway, which combines different interventions, could improve outcomes in survivors of cardiac arrest.
Data Sources: PubMed and review of citations in retrieved articles.
Study Selection: Randomized trials and prospective observational studies conducted in adult cardiac arrest patients, which evaluated the impact on outcome of a structured care pathway, defined as an organized set of interventions designed specifically for postcardiac arrest patients.
Data Extraction: Data collected included study characteristics and methodologic quality, populations enrolled, interventions that were part of the cardiac arrest structured care pathway, and outcomes. The principal outcome was favorable functional status defined as a Cerebral Performance Category score of 1–2 at or after hospital discharge.
Data Synthesis: The systematic search retrieved 481 articles of which nine (total, 1,994 patients) were selected for systematic review, and six (1,422 patients) met criteria for meta-analysis. Interventions in the care pathways included early coronary angiography with or without percutaneous coronary intervention (eight studies), targeted temperature management (nine studies), and protocolized management in the ICU (seven studies). Neurologic prognostication was not a part of any of the structured pathways. Meta-analysis found significantly higher odds of achieving a favorable functional outcome in patients who were treated in a structured care pathway, when compared with standard care (odds ratio, 2.35; 95% CI, 1.46–3.81).
Conclusions: Following cardiac arrest, patients treated in a structured care pathway may have a substantially higher likelihood of favorable functional outcome than those who receive standard care. These findings suggest benefit of a highly organized approach to postcardiac arrest care, in which a cluster of evidence-based interventions are delivered by a specialized interdisciplinary team. Given the overall low certainty of evidence, definitive recommendations will need confirmation in additional high-quality studies.

Early Enteral Nutrition in Patients Undergoing Sustained Neuromuscular Blockade: A Propensity-Matched Analysis Using a Nationwide Inpatient Database*



By Ohbe, Hiroyuki; Jo, Taisuke; Matsui, Hiroki; Fushimi, Kiyohide; Yasunaga, Hideo 


Objectives: Whether enteral nutrition should be postponed in patients undergoing sustained treatment with neuromuscular blocking agents remains unclear. We evaluated the association between enteral nutrition initiated within 2 days of sustained neuromuscular blocking agent treatment and in-hospital mortality.
Design: Retrospective administrative database study from July 2010 to March 2016. Setting: More than 1,200 acute care hospitals covering approximately 90% of all tertiary-care emergency hospitals in Japan.
Patients: Mechanically ventilated patients, who had undergone sustained treatment with neuromuscular blocking agents in an ICU, were retrospectively reviewed. We defined patients who received sustained treatment with neuromuscular blocking agents as those who received either rocuronium at greater than or equal to 250 mg/d or vecuronium at greater than or equal to 50 mg/d for at least 2 consecutive days. Interventions: Enteral nutrition started within 2 days from the initiation of neuromuscular blocking agents (defined as early enteral nutrition).
Measurements and Main Results: We identified 2,340 eligible patients during the 69-month study period. Of these, 378 patients (16%) had received early enteral nutrition. One-to-three propensity score matching matched 374 early enteral nutrition patients to 1,122 late enteral nutrition patients. The in-hospital mortality rate was significantly lower in the early than in the late enteral nutrition group (risk difference, –6.3%; 95% CI, –11.7% to –0.9%). There was no significant difference in the rate of hospital pneumonia between the two groups (risk difference, 2.8%; 95% CI, –2.7% to 8.3%). Length of hospital stay among survivors was significantly shorter in the early than in the late enteral nutrition group (risk difference, –11.4 d; 95% CI, –19.1 to –3.7 d). There was no significant difference between the two groups in length of ICU stay or duration of mechanical ventilation among survivors.
Conclusions: In this retrospective database study, early enteral nutrition may be associated with lower in-hospital mortality, with no increase in hospital pneumonia, in patients undergoing sustained treatment with neuromuscular blocking agents.

Extracorporeal Membrane Oxygenation for Septic Shock



by Falk, Lars; Hultman, Jan; Broman, Lars Mikael 


Objectives: Septic shock carries a high mortality risk. Studies have indicated that patients with septic shock may benefit from extracorporeal membrane oxygenation. In most studies, patients exhibited shock due to myocardial dysfunction rather than distributive/vasoplegic shock. One proposed theory is that venoarterial extracorporeal membrane oxygenation alleviates a failing myocardial function. Design: Retrospective observational study.
Setting: Single-center, high-volume extracorporeal membrane oxygenation unit.
Patients: All patients treated for septic shock between 2012 and 2017 were included if they were older than 18 years, fulfilled “Sepsis-3” septic shock criteria at acceptance for extracorporeal membrane oxygenation, and had cardiocirculatory failure requiring support equivalent to a Vasoactive Inotropic Score greater than 50 to reach a mean arterial pressure greater than 65 mm Hg despite adequate fluid resuscitation. Interventions: None.
Measurements and Main Results: Thirty-seven patients, mean age 54.7 years, were included. Median Simplified Acute Physiology Score-3 was 86 and median Sequential Organ Failure Assessment score was 16. Twenty-seven patients received venoarterial and 10 patients venovenous extracorporeal membrane oxygenation. Hospital survival was 90% for septic shock with left ventricular failure and 64.7% in patients with distributive shock. At long-term follow-up at 46.1 months, overall survival was 59.5%. Venovenous extracorporeal membrane oxygenation and a greater number of organ failures at admission were associated with less favorable hospital and long-term survival.
Conclusions: The current results add not only to the growing evidence of the benefit of venoarterial extracorporeal membrane oxygenation for septic cardiomyopathy but also indicate improved hospital survival in distributive septic shock.

Monocyte Distribution Width: A Novel Indicator of Sepsis-2 and Sepsis-3 in High-Risk Emergency Department Patients*



by Crouser, Elliott D.; Parrillo, Joseph E.; Seymour, Christopher W.; Angus, Derek C.; Bicking, Keri; Esguerra, Vincent G.; Peck-Palmer, Octavia M.; Magari, Robert T.; Julian, Mark W.; Kleven, Jennifer M.; Raj, Paarth J.; Procopio, Gabrielle; Careaga, Diana; Tejidor, Liliana


Objectives: Most septic patients are initially encountered in the emergency department where sepsis recognition is often delayed, in part due to the lack of effective biomarkers. This study evaluated the diagnostic accuracy of peripheral blood monocyte distribution width alone and in combination with WBC count for early sepsis detection in the emergency department. Design: An Institutional Review Board approved, blinded, observational, prospective cohort study conducted between April 2017 and January 2018.
Setting: Subjects were enrolled from emergency departments at three U.S. academic centers. Patients: Adult patients, 18–89 years, with a complete blood count performed upon presentation to the emergency department, who remained hospitalized for at least 12 hours. A total of 2,212 patients were screened, of whom 2,158 subjects were enrolled and categorized per Sepsis-2 criteria as controls (n = 1,088), systemic inflammatory response syndrome (n = 441), infection (n = 244), or sepsis (n = 385), and per Sepsis-3 criteria as control (n = 1,529), infection (n = 386), or sepsis (n = 243). Interventions: The primary outcome was whether a monocyte distribution width greater than 20.0 U, alone or in combination with the WBC count, improves early sepsis detection by Sepsis-2 criteria. Secondary endpoints assessed monocyte distribution width performance for Sepsis-3 detection.
Measurements and Main Results: Monocyte distribution width greater than 20.0 U distinguished sepsis from all other conditions based on either Sepsis-2 criteria (area under the curve, 0.79; 95% CI, 0.76–0.82) or Sepsis-3 criteria (area under the curve, 0.73; 95% CI, 0.69–0.76). The negative predictive values of a monocyte distribution width less than or equal to 20 U for Sepsis-2 and Sepsis-3 were 93% and 94%, respectively. Monocyte distribution width greater than 20.0 U combined with an abnormal WBC count further improved Sepsis-2 detection (area under the curve, 0.85; 95% CI, 0.83–0.88), as reflected by likelihood ratio and added-value analyses. A normal WBC count together with a normal monocyte distribution width implied a six-fold lower probability of sepsis.
Conclusions: A monocyte distribution width value greater than 20.0 U is effective for sepsis detection, based on either Sepsis-2 or Sepsis-3 criteria, during the initial emergency department encounter. In tandem with the WBC count, monocyte distribution width may further enhance medical decision-making during early sepsis management in the emergency department.
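The decision logic evaluated in this study (MDW > 20.0 U, alone or combined with the WBC count) can be sketched as a simple screening rule. This is illustrative only, not a validated clinical algorithm: the WBC reference range and the way the two markers are combined into suspicion tiers are assumptions, not the study's protocol.

```python
# Illustrative sepsis-screening rule based on the cutoffs discussed above.
MDW_CUTOFF = 20.0            # U, the threshold evaluated in the study
WBC_LOW, WBC_HIGH = 4.0, 12.0  # x10^9/L, an assumed normal reference range

def sepsis_flag(mdw_u: float, wbc: float) -> str:
    """Return a qualitative suspicion tier from MDW and WBC (hypothetical)."""
    mdw_abnormal = mdw_u > MDW_CUTOFF
    wbc_abnormal = not (WBC_LOW <= wbc <= WBC_HIGH)
    if mdw_abnormal and wbc_abnormal:
        return "high suspicion"    # both markers abnormal
    if mdw_abnormal or wbc_abnormal:
        return "intermediate"      # one marker abnormal
    # Both normal: the abstract reports a ~six-fold lower sepsis probability.
    return "low suspicion"

print(sepsis_flag(23.1, 15.2))  # -> high suspicion
print(sepsis_flag(18.0, 8.0))   # -> low suspicion
```

The value of such a rule lies in its negative arm: with both markers normal, the reported negative predictive values (93–94%) support de-prioritising sepsis in the differential, though clinical judgment always overrides a screen.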

Wednesday, 3 July 2019

The hospital frailty risk score is of limited value in intensive care unit patients



by Raphael Romano Bruno, Bernhard Wernly, Hans Flaatten, Fabian Schölzel, Malte Kelm and Christian Jung 

The identification of patients with frailty is of utmost importance, particularly during intensive care treatment of very old intensive care patients (VOPs). Tools for this triage process should clearly differ from those used in younger patients. Frailty, not necessarily age, is associated with a negative impact on outcome, especially in critically ill patients [1]. This problem is of great importance, as VOPs are one of the fastest growing subgroups in intensive care medicine. The proportion of the world population older than 60 years is expected to increase from 12% in 2013 to 21% in 2050 [2]…

Improve short-term survival in postcardiotomy cardiogenic shock by simultaneous use of intra-aortic balloon pumping with veno-arterial extracorporeal membrane oxygenation: Beware of confounders!



by Patrick M. Honore, David De Bels, Sebastien Redant and Kianoush Kashani 

We read with enthusiasm the recently published retrospective study by Chen et al. [1], who demonstrated that simultaneous use of intra-aortic balloon pumping (IABP) together with veno-arterial extracorporeal membrane oxygenation (VA-ECMO) in postcardiotomy cardiogenic shock (PCS) patients improved short-term survival and reduced peripheral perfusion complications. In this study, 42 (28%) patients were on concomitant IABP and VA-ECMO [1]. While the study adds substantial value to current knowledge, the literature on the concomitant use of VA-ECMO and IABP remains controversial [2]. A more mechanical and pragmatic view is that VA-ECMO increases left ventricular (LV) afterload and decreases coronary artery blood flow because of retrograde flow, which can potentially worsen cardiac function, whereas IABP could mitigate these effects. Reduced LV afterload and increased coronary blood flow from IABP theoretically promote myocardial recovery and could potentially improve survival (although improved survival has never been shown) [2, 3]. At baseline, before VA-ECMO implantation, non-survivors had significantly more hypertension (35 vs. 15%; P < .004), secondary thoracotomies (39 vs. 19%; P < .007), cardiac arrests (34 vs. 11%; P < .001), and bedside implantations (42 vs. 11%; P < .0001), and significantly fewer concomitant insertions of VA-ECMO and IABP (22 vs. 41%; P < .025) compared with survivors [1]. All of these variables are well-described risk factors for increased mortality [3]. It is also reported that brain and kidney blood flow improve with concurrent initiation of IABP and ECMO [1]. The question, therefore, is to identify the mechanism by which concurrent initiation could reduce the need for continuous renal replacement therapy (CRRT) and decrease neurological complications [1].
Strategies aiming to prevent acute kidney injury (AKI) by increasing global blood flow to the kidneys have failed [4], as increased blood flow mostly affects the cortex while the medulla remains hypoperfused. It therefore remains unclear why adding IABP to VA-ECMO to improve renal blood flow could significantly reduce the need for CRRT [1, 3, 4]. To decrease the chances of bias in the reported findings, traditional AKI risk factors such as diabetes mellitus, contrast exposure, the presence of shock, and the need for inotropes should be included in the comparison of the two groups (VA-ECMO alone vs. VA-ECMO plus IABP) [1]. Adding IABP to VA-ECMO was not reported to increase limb ischemia [1]. This contradicts a recent study by Yang et al. [5], which found major vascular complications (MVCs) to be common and associated with higher in-hospital mortality among adult PCS patients receiving peripheral VA-ECMO support. Previously, obesity, concomitant IABP/ECMO, SOFA score at 24 h post-ECMO, and bleeding disorders were reported as independent risk factors for MVCs [5]. In conclusion, and in our interpretation, this very interesting study does not definitively show that adding IABP improves short-term survival, as many confounders could explain the observed difference in mortality.

Focus on sepsis



Intensive Care Medicine
Authors: Morten Hylander Møller, Waleed Alhazzani, Manu Shankar-Hari



Sepsis continues to be an important clinical and research problem within critical care, as highlighted in the most recent literature.
The Surviving Sepsis Campaign bundle was updated in 2018 [1]. It emphasised that within 1 h of presentation with sepsis, clinicians should measure lactate, obtain blood cultures, administer broad-spectrum antimicrobials, begin fluid resuscitation with 30 ml/kg crystalloids, and apply vasopressors in case of fluid-refractory shock. It was recommended that this new sepsis 1-h bundle be used systematically in emergency departments, wards, and ICUs to reduce the global burden of sepsis [1]. While the 1-h bundle is welcome and reasonable from a patient perspective, the quality of evidence supporting some individual elements of the bundle is low. A group of international experts representing the European Society of Intensive Care Medicine and the Society of Critical Care Medicine recently highlighted research priorities arising from the Surviving Sepsis Campaign guideline [2]. The top six research priorities were the use of personalised medicine in sepsis, fluid resuscitation, rapid diagnostic tests, empirical antibiotic combination therapy, long-term outcomes, and predictors of organ dysfunction…

Reporting of Organ Support Outcomes in Septic Shock Randomized Controlled Trials: A Methodologic Review—The Sepsis Organ Support Study



by Bourcier, Simon; Hindlet, Patrick; Guidet, Bertrand; Dechartres, Agnès 

Objectives: Many recent randomized controlled trials in the field of septic shock failed to demonstrate a benefit on mortality. Randomized controlled trials increasingly report organ support duration and organ support-free days as primary or secondary outcomes. We conducted a methodologic systematic review to assess how organ support outcomes were defined and reported in septic shock randomized controlled trials.
Data Sources: MEDLINE via PubMed, Embase, Cochrane Central Register of Controlled Trials, and Web of Science.
Study Selection: We included randomized controlled trials published between January 2004 and March 2018 that involved septic shock adults and assessed organ support duration and/or organ support-free days for hemodynamic support, respiratory support, or renal replacement therapy. 
Data Extraction: For each randomized controlled trial, we extracted the definitions of organ support duration and organ support-free days. We particularly evaluated how nonsurvivors were accounted for. Study authors were contacted to provide any missing information regarding these definitions.
Data Synthesis: We included 28 randomized controlled trials. Organ support duration and organ support-free days outcomes were reported in 17 and 15 randomized controlled trials, respectively, for hemodynamic support; 15 and 15 for respiratory support; and five and nine for renal replacement therapy. Nonsurvivors were included in the organ support duration calculation in 13 of 14 randomized controlled trials (93%) for hemodynamic support and nine of 10 (90%) for respiratory support. Among trials reporting an organ support-free days outcome, the definition was reported in six of 15 (40%) for hemodynamic support, eight of 15 (53%) for respiratory support, and six of nine (67%) for renal replacement therapy. Of these, one half assigned “0” to nonsurvivors, and the other half attributed one point per day alive and free of organ support up to a predefined time point.
Conclusions: This study highlights the heterogeneity and infrequency of organ support duration/organ support-free days outcome reporting in septic shock trials. When these outcome measures are reported, their definitions and methods of calculation often are not, in particular how nonsurvivors were accounted for, which may have an important impact on interpretation.
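The two conventions for assigning organ support-free days (OSFD) to nonsurvivors that the review identifies can produce very different scores for the same patient. A minimal sketch, assuming a 28-day horizon and that support occupies the first days of the stay (both simplifying assumptions for illustration):

```python
from typing import Optional

HORIZON = 28  # predefined time point (days); a common but assumed choice

def osfd_zero_if_dead(support_days: int, survived: bool) -> int:
    """Convention A: nonsurvivors are assigned 0 OSFD."""
    return HORIZON - support_days if survived else 0

def osfd_days_alive_and_free(support_days: int, death_day: Optional[int]) -> int:
    """Convention B: one point per day alive AND free of organ support,
    up to the predefined time point."""
    last_day = HORIZON if death_day is None else min(death_day, HORIZON)
    return max(last_day - support_days, 0)

# A survivor supported for 5 days scores 23 under both conventions,
# but a patient who dies on day 10 after 5 days of support scores 0 vs 5.
print(osfd_zero_if_dead(5, True), osfd_days_alive_and_free(5, None))   # 23 23
print(osfd_zero_if_dead(5, False), osfd_days_alive_and_free(5, 10))    # 0 5
```

The divergence for nonsurvivors is exactly why the review argues that trials must state which convention they used: pooling OSFD outcomes across trials that mix the two conventions changes the apparent treatment effect.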

Heparin-binding protein in sepsis: player! predictor! positioning?




by Patrick M. Honore, David De Bels, Leonel Barreto Gutierrez, Sebastien Redant and Herbert D. Spapen 

In a post hoc study of the multicenter FINNAKI trial, Tverring et al. recently reported that measuring heparin-binding protein (HBP) on admission in the intensive care unit (ICU) improved prediction of sepsis-induced acute kidney injury (AKI). In addition, high plasma HBP levels were associated with a significantly higher fluid balance within 24 h, more organ failure within the first week, and increased 28-day mortality [1]. These observations support HBP as a novel prominent pawn on the already well-stuffed AKI biomarker chessboard…

When less is more in the active management of elevated body temperature of ICU patients



by Paul J. Young, Hallie C. Prescott

Abstract
Fever is a pathophysiological response in which the body’s normal thermoregulatory set-point is adjusted upwards, leading to an increase in body temperature. In contrast, hyperthermia results from excessive heat production or insufficient thermoregulation (e.g., heat stroke or drug reactions). Although temperature elevation is common in Intensive Care Unit (ICU) patients, a newly elevated body temperature should prompt consideration of a diagnostic evaluation. It is always prudent to consider the possibility of infection; however, for critically ill patients with acute brain pathologies in particular, elevated body temperature is common, even in the absence of infection. Body temperature may be elevated due to drugs, particularly antipsychotic, serotonergic, sympathomimetic, anesthetic, and anticholinergic drugs [1]. Thyrotoxicosis and pheochromocytoma should also be considered in the differential diagnosis. Often elevated temperature is multifactorial and, in many patients, particularly after major surgery, a specific cause is not found.
Although body temperature is recorded assiduously in the ICU [2], it is often unclear when or how to intervene when a patient’s body temperature is elevated. A recent individual patient data meta-analysis reported that more active fever management did not increase survival compared with less active fever management in an all-comers population of critically ill adults [3]. Survival by treatment group was similar in a range of subgroups defined by age, illness severity, receipt of specific organ supports, and the presence versus absence of high fever at baseline. These data suggest that, in general, when it comes to active management of fever in ICU patients, although less may not be more, doing less to treat fever results in similar outcomes to doing more.

The association between nutritional adequacy and 28-day mortality in the critically ill is not modified by their baseline nutritional status and disease severity



by Charles Chin Han Lew, Gabriel Jun Yung Wong, Ka Po Cheung, Robert J. L. Fraser, Ai Ping Chua, Mary Foong Fong Chong and Michelle Miller

Background
During the initial phase of critical illness, the association between the dose of nutrition support and mortality risk may vary among patients in the intensive care unit (ICU) because the prevalence of malnutrition varies widely (28 to 78%), and not all ICU patients are severely ill. Therefore, we hypothesized that a prognostic model that integrates nutritional status and disease severity could accurately predict mortality risk and classify critically ill patients into low- and high-risk groups. Additionally, in critically ill patients placed on exclusive nutritional support (ENS), we hypothesized that their risk categories could modify the association between dose of nutrition support and mortality risk.
Methods
A prognostic model that predicts 28-day mortality was built from a prospective cohort study of 440 patients. The association between dose of nutrition support and mortality risk was evaluated in a subgroup of 252 mechanically ventilated patients via logistic regressions, stratified by low- and high-risk groups and by days of ENS [short-term (≤ 6 days) vs. longer-term (≥ 7 days)]. Only the first 6 days of ENS were evaluated to ensure a fair comparison.
Results
The prognostic model demonstrated good discrimination [AUC 0.78 (95% CI 0.73–0.82)], and a bias-corrected calibration curve suggested fair accuracy. In high-risk patients with short-term ENS (≤ 6 days), each 10% increase in goal energy and protein intake was associated with increased adjusted odds (95% CI) of 28-day mortality [1.60 (1.19–2.15) and 1.47 (1.12–1.86), respectively]. In contrast, in high-risk patients with longer-term ENS (≥ 7 days), each 10% increase in goal protein intake during the first 6 days of ENS was associated with lower adjusted odds of 28-day mortality [0.75 (0.57–0.99)]. Despite the opposing associations, the mean predicted mortality risks and prevalence of malnutrition were similar between short- and longer-term ENS patients.
Conclusions
Combining baseline nutritional status and disease severity in a prognostic model could accurately predict 28-day mortality. However, the association between the dose of nutrition support during the first 6 days of ENS and 28-day mortality was independent of baseline disease severity and nutritional status.

The Restrictive IV Fluid Trial in Severe Sepsis and Septic Shock (RIFTS): A Randomized Pilot Study*



by Corl, Keith A.; Prodromou, Michael; Merchant, Roland C.; Gareen, Ilana; Marks, Sarah; Banerjee, Debasree; Amass, Timothy; Abbasi, Adeel; Delcompare, Cesar; Palmisciano, Amy; Aliotta, Jason; Jay, Gregory; Levy, Mitchell M.

Objectives: It is unclear if a low- or high-volume IV fluid resuscitation strategy is better for patients with severe sepsis and septic shock.
Design: Prospective randomized controlled trial.
Setting: Two adult acute care hospitals within a single academic system.
Patients: Patients with severe sepsis and septic shock admitted from the emergency department to the ICU from November 2016 to February 2018.
Interventions: Patients were randomly assigned to a restrictive IV fluid resuscitation strategy (≤ 60 mL/kg of IV fluid) or usual care for the first 72 hours of care.
Measurements and Main Results: We enrolled 109 patients, of whom 55 were assigned to the restrictive resuscitation group and 54 to the usual care group. The restrictive group received significantly less resuscitative IV fluid than the usual care group (47.1 vs 61.1 mL/kg; p = 0.01) over 72 hours. By 30 days, there were 12 deaths (21.8%) in the restrictive group and 12 deaths (22.2%) in the usual care group (odds ratio, 1.02; 95% CI, 0.41–2.53). There were no differences between groups in the rate of new organ failure, hospital or ICU length of stay, or serious adverse events.
Conclusions: This pilot study demonstrates that a restrictive resuscitation strategy can successfully reduce the amount of IV fluid administered to patients with severe sepsis and septic shock compared with usual care. Although limited by the sample size, we observed no increase in mortality, organ failure, or adverse events. These findings further support that a restrictive IV fluid strategy should be explored in a larger multicenter trial.

Sleep and Work in ICU Physicians During a Randomized Trial of Nighttime Intensivist Staffing*



by Bakhru, Rita N.; Basner, Mathias; Kerlin, Meeta Prasad; Halpern, Scott D.; Hansen-Flaschen, John; Rosen, Ilene M.; Dinges, David F.; Schweickert, William D.

Objectives: To compare sleep, work hours, and behavioral alertness in faculty and fellows during a randomized trial of nighttime in-hospital intensivist staffing compared with a standard daytime intensivist model.

Design: Prospective observational study. Setting: Medical ICU of a tertiary care academic medical center during a randomized controlled trial of in-hospital nighttime intensivist staffing. Patients: Twenty faculty and 13 fellows assigned to rotations in the medical ICU during 2012. Interventions: As part of the parent study, there was weekly randomization of staffing model, stratified by 2-week faculty rotation. During the standard staffing model, there were in-hospital residents, with a fellow and faculty member available at nighttime by phone. In the intervention, there were in-hospital residents with an in-hospital nighttime intensivist. Fellows and faculty completed diaries detailing their sleep, work, and well-being; wore actigraphs; and performed psychomotor vigilance testing daily.

Measurements and Main Results: Daily sleep time (mean hours [sd]) was increased for fellows and faculty in the intervention versus control (6.7 [0.3] vs 6.0 [0.2]; p < 0.001 and 6.7 [0.1] vs 6.4 [0.2]; p < 0.001, respectively). In-hospital work duration did not differ between the models for fellows or faculty. Total hours of work done at home differed for both fellows and faculty (0.1 [< 0.1] intervention vs 1.0 [0.1] control; p < 0.001 and 0.2 [< 0.1] intervention vs 0.6 [0.1] control; p < 0.001, respectively). Psychomotor vigilance testing did not demonstrate any differences. Measures of well-being, including physical exhaustion and alertness, were improved in faculty and fellows in the intervention staffing model.

Conclusions: Although no differences were measured in patient outcomes between the two staffing models, in-hospital nighttime intensivist staffing was associated with small increases in total sleep duration for faculty and fellows, reductions in total work hours for fellows only, and improvements in subjective well-being for both groups. Staffing models should consider how work duration, sleep, and well-being may impact burnout and sustainability.

Preserving the quality of life: nutrition in the ICU



by Pierre Singer 

Abstract
Critically ill patients require adequate nutritional support to meet energy requirements both during and after intensive care unit (ICU) stay to protect against severe catabolism and prevent significant deconditioning. ICU patients often suffer from chronic critical illness causing an increase in energy expenditure, leading to proteolysis and related muscle loss. Careful supplementation and modulation of caloric and protein intake can avoid under- or overfeeding, both of which are associated with poorer outcomes. Indirect calorimetry is the preferred method for assessing resting energy expenditure and determining the appropriate caloric and protein intake to counter energy and muscle loss. Physical exercise may have favorable effects on muscle preservation and should be considered even early in the hospital course of a critically ill patient. After liberation from the ventilator or during non-invasive ventilation, oral intake should be carefully evaluated and, in case of severe dysphagia, should be avoided and replaced by enteral or parenteral nutrition. Upon transfer from the ICU to the ward, adequate nutrition remains essential for long-term rehabilitation success, and continued emphasis on sufficient nutritional supplementation in the ward is necessary to avoid a suboptimal nutritional state.