Biblio du Mois: March 2015
In La Biblio du Mois, the great classics: PEEP, cardiac arrest, and PCT in sepsis. Plus one more point in favour of in-plane central venous catheter placement.
Happy reading!
BACKGROUND Mechanical-ventilation strategies that use lower end-inspiratory (plateau) airway pressures, lower tidal volumes (VT), and higher positive end-expiratory pressures (PEEPs) can improve survival in patients with the acute respiratory distress syndrome (ARDS), but the relative importance of each of these components is uncertain. Because respiratory-system compliance (CRS) is strongly related to the volume of aerated remaining functional lung during disease (termed functional lung size), we hypothesized that driving pressure (ΔP = VT/CRS), in which VT is intrinsically normalized to functional lung size (instead of predicted lung size in healthy persons), would be an index more strongly associated with survival than VT or PEEP in patients who are not actively breathing.
METHODS Using a statistical tool known as multilevel mediation analysis to analyze individual data from 3562 patients with ARDS enrolled in nine previously reported randomized trials, we examined ΔP as an independent variable associated with survival. In the mediation analysis, we estimated the isolated effects of changes in ΔP resulting from randomized ventilator settings while minimizing confounding due to the baseline severity of lung disease.
RESULTS Among ventilation variables, ΔP was most strongly associated with survival. A 1-SD increment in ΔP (approximately 7 cm of water) was associated with increased mortality (relative risk, 1.41; 95% confidence interval [CI], 1.31 to 1.51; P<0.001), even in patients receiving “protective” plateau pressures and VT (relative risk, 1.36; 95% CI, 1.17 to 1.58; P<0.001). Individual changes in VT or PEEP after randomization were not independently associated with survival; they were associated only if they were among the changes that led to reductions in ΔP (mediation effects of ΔP, P=0.004 and P=0.001, respectively).
CONCLUSIONS We found that ΔP was the ventilation variable that best stratified risk. Decreases in ΔP owing to changes in ventilator settings were strongly associated with increased survival.
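The definition of driving pressure in the abstract lends itself to a one-line calculation. A minimal sketch with illustrative values only; the helper names are hypothetical, not code from the study:

```python
def driving_pressure(vt_ml, crs_ml_per_cmh2o):
    """Driving pressure ΔP = VT / CRS, in cm H2O.

    VT is the tidal volume in ml; CRS is the respiratory-system
    compliance in ml per cm H2O, so the ratio has units of cm H2O.
    """
    return vt_ml / crs_ml_per_cmh2o

# In a passive patient, ΔP is equivalently plateau pressure minus PEEP.
def driving_pressure_from_pressures(pplat_cmh2o, peep_cmh2o):
    return pplat_cmh2o - peep_cmh2o

# Example (hypothetical values): a 420 ml tidal volume with a CRS of
# 30 ml/cm H2O gives ΔP = 14 cm H2O.
print(driving_pressure(420, 30))                # → 14.0
print(driving_pressure_from_pressures(24, 10))  # → 14
```

Normalizing VT to compliance rather than to predicted body weight is exactly what makes ΔP track functional lung size in the analysis above.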
IMPORTANCE Severely injured patients experiencing hemorrhagic shock often require massive transfusion. Earlier transfusion with higher blood product ratios (plasma, platelets, and red blood cells), defined as damage control resuscitation, has been associated with improved outcomes; however, there have been no large multicenter clinical trials.
OBJECTIVE To determine the effectiveness and safety of transfusing patients with severe trauma and major bleeding using plasma, platelets, and red blood cells in a 1:1:1 ratio compared with a 1:1:2 ratio. DESIGN, SETTING, AND PARTICIPANTS Pragmatic, phase 3, multisite, randomized clinical trial of 680 severely injured patients who arrived at 1 of 12 level I trauma centers in North America directly from the scene and were predicted to require massive transfusion between August 2012 and December 2013.
INTERVENTIONS Blood product ratios of 1:1:1 (338 patients) vs 1:1:2 (342 patients) during active resuscitation in addition to all local standard-of-care interventions (uncontrolled).
MAIN OUTCOMES AND MEASURES Primary outcomes were 24-hour and 30-day all-cause mortality. Prespecified ancillary outcomes included time to hemostasis, blood product volumes transfused, complications, incidence of surgical procedures, and functional status.
RESULTS No significant differences were detected in mortality at 24 hours (12.7% in 1:1:1 group vs 17.0% in 1:1:2 group; difference, −4.2% [95% CI, −9.6% to 1.1%]; P = .12) or at 30 days (22.4% vs 26.1%, respectively; difference, −3.7% [95% CI, −10.2% to 2.7%]; P = .26). Exsanguination, which was the predominant cause of death within the first 24 hours, was significantly decreased in the 1:1:1 group (9.2% vs 14.6% in 1:1:2 group; difference, −5.4% [95% CI, −10.4% to −0.5%]; P = .03). More patients in the 1:1:1 group achieved hemostasis than in the 1:1:2 group (86% vs 78%, respectively; P = .006). Despite the 1:1:1 group receiving more plasma (median of 7 U vs 5 U, P < .001) and platelets (12 U vs 6 U, P < .001) and similar amounts of red blood cells (9 U) over the first 24 hours, no differences between the 2 groups were found for the 23 prespecified complications, including acute respiratory distress syndrome, multiple organ failure, venous thromboembolism, sepsis, and transfusion-related complications.
CONCLUSIONS AND RELEVANCE Among patients with severe trauma and major bleeding, early administration of plasma, platelets, and red blood cells in a 1:1:1 ratio compared with a 1:1:2 ratio did not result in significant differences in mortality at 24 hours or at 30 days. However, more patients in the 1:1:1 group achieved hemostasis and fewer experienced death due to exsanguination by 24 hours. Even though there was an increased use of plasma and platelets transfused in the 1:1:1 group, no other safety differences were identified between the 2 groups.
Background Mechanical chest compression devices have the potential to help maintain high-quality cardiopulmonary resuscitation (CPR), but despite their increasing use, little evidence exists for their effectiveness. We aimed to study whether the introduction of LUCAS-2 mechanical CPR into front-line emergency response vehicles would improve survival from out-of-hospital cardiac arrest.
Methods The pre-hospital randomised assessment of a mechanical compression device in cardiac arrest (PARAMEDIC) trial was a pragmatic, cluster-randomised open-label trial including adults with non-traumatic, out-of-hospital cardiac arrest from four UK Ambulance Services (West Midlands, North East England, Wales, South Central). 91 urban and semi-urban ambulance stations were selected for participation. Clusters were ambulance service vehicles, which were randomly assigned (1:2) to LUCAS-2 or manual CPR. Patients received LUCAS-2 mechanical chest compression or manual chest compressions according to the first trial vehicle to arrive on scene. The primary outcome was survival at 30 days following cardiac arrest and was analysed by intention to treat. Ambulance dispatch staff and those collecting the primary outcome were masked to treatment allocation. Masking of the ambulance staff who delivered the interventions and reported initial response to treatment was not possible. The study is registered with Current Controlled Trials, number ISRCTN08233942.
Findings We enrolled 4471 eligible patients (1652 assigned to the LUCAS-2 group, 2819 assigned to the control group) between April 15, 2010 and June 10, 2013. 985 (60%) patients in the LUCAS-2 group received mechanical chest compression, and 11 (<1%) patients in the control group received LUCAS-2. In the intention-to-treat analysis, 30 day survival was similar in the LUCAS-2 group (104 [6%] of 1652 patients) and in the manual CPR group (193 [7%] of 2819 patients; adjusted odds ratio [OR] 0·86, 95% CI 0·64–1·15). No serious adverse events were noted. Seven clinical adverse events were reported in the LUCAS-2 group (three patients with chest bruising, two with chest lacerations, and two with blood in mouth). 15 device incidents occurred during operational use. No adverse or serious adverse events were reported in the manual group.
Interpretation We noted no evidence of improvement in 30 day survival with LUCAS-2 compared with manual compressions. On the basis of our results and those of other recent randomised trials, widespread adoption of mechanical CPR devices for routine use does not improve survival.
Background Lung-protective mechanical ventilation during general surgery including the application of PEEP can reduce postoperative pulmonary complications. In a prospective clinical observational study, we evaluated volume-dependent respiratory system compliance in adult patients undergoing ear–nose–throat surgery with ventilation settings chosen empirically by the attending anaesthetist.
Methods In 40 patients, we measured the respiratory variables during intraoperative mechanical ventilation. All measurements were subdivided into 5 min intervals. Dynamic compliance (CRS) and the intratidal volume-dependent CRS curve were calculated for each interval and classified into one of six specific compliance profiles indicating intratidal recruitment/derecruitment, overdistension, or both. We retrospectively compared the occurrences of the respective compliance profiles at PEEP levels of 5 cm H2O and at higher levels.
Results The attending anaesthetists set the PEEP level initially to 5 cm H2O in 29 patients (83%), to 7 cm H2O in 5 patients (14%), and to 8 cm H2O in 2 patients (6%). Across all measurements the mean CRS was 61 (11) ml cm H2O−1 (40–86 ml cm H2O−1) and decreased continuously during the procedure. At PEEP of 5 cm H2O the compliance profile indicating strong intratidal recruitment/derecruitment occurred more often (18.6%) compared with higher PEEP levels (5.5%, P<0.01). Overdistension was practically never observed.
Conclusions In most patients, a PEEP of 5 cm H2O during intraoperative mechanical ventilation is too low to prevent intratidal recruitment/derecruitment. The analysis of the intratidal compliance profile provides the rationale to individually titrate a PEEP level that stabilizes the alveolar recruitment status of the lung during intraoperative mechanical ventilation.
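The intratidal classification described above can be caricatured in a few lines. A minimal sketch, assuming compliance has already been computed over successive volume slices of one breath; collapsing the study's six profiles into three is a deliberate simplification for illustration:

```python
def classify_intratidal_profile(crs_slices, tol=0.05):
    """Classify an intratidal compliance curve (simplified, hypothetical).

    crs_slices: dynamic compliance (ml per cm H2O) computed over successive
    volume slices of a single breath, lowest volume first. Compliance that
    rises within the breath suggests intratidal recruitment/derecruitment
    (PEEP too low); compliance that falls suggests overdistension.
    """
    first, last = crs_slices[0], crs_slices[-1]
    change = (last - first) / first  # relative change across the breath
    if change > tol:
        return "recruitment/derecruitment"
    if change < -tol:
        return "overdistension"
    return "stable"

print(classify_intratidal_profile([48, 55, 62]))  # rising → "recruitment/derecruitment"
print(classify_intratidal_profile([62, 55, 48]))  # falling → "overdistension"
```

A PEEP titrated until the profile reads "stable" is the kind of individualized setting the authors argue for.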
BACKGROUND: Organ failure in severe sepsis and septic shock may be caused by microcirculatory failure.
OBJECTIVE: The objective of this study is to test a conceptual model of microcirculatory failure by using a resuscitation strategy targeting early opening of the constricted microcirculation with active vasodilatation.
DESIGN: A randomised controlled pilot study.
SETTING: Single-centre mixed medical and surgical tertiary ICU.
PATIENTS: Ninety severe sepsis and septic shock patients randomised to early opening microcirculation resuscitation group or standard resuscitation group.
INTERVENTIONS: Standard resuscitation group: fluids, noradrenaline, dobutamine and hydrocortisone were given to achieve a mean arterial pressure (MAP) of more than 60 mmHg, cardiac index more than 2.5 l min−1 m−2 and ScvO2 more than 70%. Microcirculation resuscitation group: nitroglycerin, enoximone, dopamine and dexamethasone targeting a microvascular flow index (MFI), measured by sublingual side-stream dark field imaging, more than 2.5.
MAIN OUTCOME MEASURE: A decrease in organ failure score (SOFA) on day four of ICU treatment.
RESULTS: Data from 37 microcirculation resuscitation and 28 standard resuscitation patients were analysed. In the microcirculation resuscitation group, MFI of more than 2.5 was achieved after a mean ± SD of 7.0 ± 4.6 h. The microcirculation resuscitation group received more fluids, and noradrenaline was equally prescribed in both groups. Per protocol, the decrease in SOFA score at day 4 was not different between groups (P = 0.64). There was a significant reduction in SOFA score in both groups compared with admission (1.2 and 1.6 in microcirculation resuscitation and standard resuscitation groups, respectively; P = 0.028 and P = 0.045).
CONCLUSION: Early opening of the microcirculation in patients with severe sepsis and septic shock using nitroglycerin, enoximone, dopamine and corticosteroids did not result in a faster reduction in organ failure than standard resuscitation.
BACKGROUND: Chronic pain and opioid consumption may trigger diffuse hyperalgesia, but their relative contributions to pain vulnerability remain unclear.
OBJECTIVES: To assess preoperative opioid-induced hyperalgesia and its postoperative clinical consequences in patients with chronic pain scheduled for orthopaedic surgery.
DESIGN: A prospective observational study.
SETTINGS: Raymond Poincare teaching hospital.
PATIENTS: Adults with or without long-term opioid treatment, scheduled for orthopaedic surgery.
PRIMARY OUTCOME MEASURE: Preoperative hyperalgesia was assessed with eight quantitative sensory tests, in a pain-free zone.
SECONDARY OUTCOME MEASURES: Postoperative morphine consumption and pain intensity were evaluated using a numerical rating scale (NRS) in the recovery room and during the first 72 h.
RESULTS: We analysed results from 68 patients (28 opioid-treated patients and 40 controls). Mean daily opioid consumption was 42 ± 25 mg of morphine equivalent. The opioid-treated group displayed significantly higher levels of preoperative hyperalgesia in three tests: heat tolerance threshold (47.1°C vs. 48.4°C; P = 0.045), duration of tolerance to a 47°C stimulus (40.2 vs. 51.1 s; P = 0.03) and mechanical temporal summation [1.79 vs. 1.02 (ΔNRS10–1); P = 0.036]. Patients in the opioid-treated group consumed more morphine (19.1 vs. 9.38 mg; P = 0.001), had a higher pain intensity (7.6 vs. 5.5; P = 0.001) in the recovery room and a higher cumulative morphine dose at 72 h (39.8 vs. 25.6 mg; P = 0.02).
CONCLUSION: Chronic pain patients treated with low doses of opioid had hyperalgesia before surgery. These results highlight the need to personalise the management of patients treated with opioids before surgery.
Rationale: The occurrence of ventilator-associated pneumonia (VAP) is linked to the aspiration of contaminated pharyngeal secretions around the endotracheal tube. Tubes with cuffs made of polyurethane rather than polyvinyl chloride or with a conical rather than a cylindrical shape increase tracheal sealing.
Objectives: To test whether using polyurethane and/or conical cuffs reduces tracheal colonization and VAP in patients with acute respiratory failure.
Methods: We conducted a multicenter, prospective, open-label, randomized study in four parallel groups in four intensive care units between 2010 and 2012. A cohort of 621 patients with expected ventilation longer than 2 days was included at intubation with a cuff composed of cylindrical polyvinyl chloride (n = 148), cylindrical polyurethane (n = 143), conical polyvinyl chloride (n = 150), or conical polyurethane (n = 162). We used Kaplan-Meier estimates and log-rank tests to compare times to events.
Measurements and Main Results: After excluding 17 patients who secondarily refused participation or had met an exclusion criterion, 604 were included in the intention-to-treat analysis. Cumulative tracheal colonization greater than 10³ cfu/ml at Day 2 was as follows (median [interquartile range]): cylindrical polyvinyl chloride, 0.66 (0.58–0.74); cylindrical polyurethane, 0.61 (0.53–0.70); conical polyvinyl chloride, 0.67 (0.60–0.76); and conical polyurethane, 0.62 (0.55–0.70) (P = 0.55). VAP developed in 77 patients (14.4%), and postextubational stridor developed in 28 patients (6.4%) (P = 0.20 and 0.28 between groups, respectively).
Conclusions: Among patients requiring mechanical ventilation, polyurethane and/or conically shaped cuffs were not superior to conventional cuffs in preventing tracheal colonization and VAP.
Objective: To evaluate whether using long-axis or short-axis view during ultrasound-guided internal jugular and subclavian central venous catheterization results in fewer skin breaks, decreased time to cannulation, and fewer posterior wall penetrations.
Design: Prospective, randomized crossover study.
Setting: Urban emergency department with approximate annual census of 60,000.
Subjects: Emergency medicine resident physicians at the Denver Health Residency in Emergency Medicine, a postgraduate year 1–4 training program.
Interventions: Resident physicians blinded to the study hypothesis used ultrasound guidance to cannulate the internal jugular and subclavian veins of a human torso mannequin using the long-axis and short-axis views at each site.
Measurements and Main Results: An ultrasound fellow recorded skin breaks, redirections, and time to cannulation. An experienced ultrasound fellow or attending used a convex 8–4 MHz transducer during cannulation to monitor the needle path and determine posterior wall penetration. Generalized linear mixed models with a random subject effect were used to compare time to cannulation, number of skin breaks and redirections, and posterior wall penetration of the long axis and short axis at each cannulation site. Twenty-eight resident physicians participated: eight postgraduate year 1, eight postgraduate year 2, five postgraduate year 3, and seven postgraduate year 4. The median number of internal jugular central venous catheters previously placed was 27 (interquartile range, 9–42), and of subclavian catheters, six (interquartile range, 2–20). The median number of previous ultrasound-guided internal jugular catheters was 25 (interquartile range, 9–40), and of ultrasound-guided subclavian catheters, three (interquartile range, 0–5). The long-axis view was associated with a significant decrease in the number of redirections at the internal jugular and subclavian sites, relative risk 0.4 (95% CI, 0.2–0.9) and relative risk 0.5 (95% CI, 0.3–0.7), respectively. There was no significant difference in the number of skin breaks between the long axis and short axis at the subclavian and internal jugular sites. The long-axis view for subclavian was associated with decreased time to cannulation; there was no significant difference in time between the short-axis and long-axis views at the internal jugular site. The prevalence of posterior wall penetration was internal jugular short axis 25%, internal jugular long axis 21%, subclavian short axis 64%, and subclavian long axis 39%. The odds of posterior wall penetration were significantly less in the subclavian long axis (odds ratio, 0.3; 95% CI, 0.1–0.9).
Conclusions: The long-axis view for the internal jugular was more efficient than the short-axis view with fewer redirections. The long-axis view for subclavian central venous catheterization was also more efficient with decreased time to cannulation and fewer redirections. The long-axis approach to subclavian central venous catheterization is also associated with fewer posterior wall penetrations. Using the long-axis view for subclavian central venous catheterization and avoiding posterior wall penetrations may result in fewer central venous catheter–related complications.
Objective: Guidelines for cardiopulmonary resuscitation recommend a chest compression rate of at least 100 compressions/min. A recent clinical study reported optimal return of spontaneous circulation with rates between 100 and 120/min during cardiopulmonary resuscitation for out-of-hospital cardiac arrest. However, the relationship between compression rate and survival is still undetermined.
Design: Prospective, observational study.
Setting: Data are from the Resuscitation Outcomes Consortium Prehospital Resuscitation IMpedance threshold device and Early versus Delayed analysis clinical trial.
Participants: Adults with out-of-hospital cardiac arrest treated by emergency medical service providers.
Measurements and Main Results: Data were abstracted from monitor-defibrillator recordings for the first five minutes of emergency medical service cardiopulmonary resuscitation. Multiple logistic regression assessed odds ratio for survival by compression rate categories (<80, 80–99, 100–119, 120–139, ≥140), both unadjusted and adjusted for sex, age, witnessed status, attempted bystander cardiopulmonary resuscitation, location of arrest, chest compression fraction and depth, first rhythm, and study site. Compression rate data were available for 10,371 patients; 6,399 also had chest compression fraction and depth data. Age (mean ± SD) was 67 ± 16 years. Chest compression rate was 111 ± 19 per minute, compression fraction was 0.70 ± 0.17, and compression depth was 42 ± 12 mm. Circulation was restored in 34%; 9% survived to hospital discharge. After adjustment for covariates without chest compression depth and fraction (n = 10,371), a global test found no significant relationship between compression rate and survival (p = 0.19). However, after adjustment for covariates including chest compression depth and fraction (n = 6,399), the global test found a significant relationship between compression rate and survival (p = 0.02), with the reference group (100–119 compressions/min) having the greatest likelihood for survival.
Conclusions: After adjustment for chest compression fraction and depth, compression rates between 100 and 120 per minute were associated with greatest survival to hospital discharge.
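The rate bins used in the adjusted analysis can be made explicit. A minimal sketch; `rate_category` is an illustrative helper, not code from the trial:

```python
def rate_category(rate_per_min):
    """Bin a measured chest compression rate into the study's categories.

    The 100-119/min bin served as the reference group in the
    logistic regression, and was the bin with the greatest survival.
    """
    if rate_per_min < 80:
        return "<80"
    if rate_per_min < 100:
        return "80-99"
    if rate_per_min < 120:
        return "100-119"
    if rate_per_min < 140:
        return "120-139"
    return ">=140"

print(rate_category(111))  # mean observed rate → "100-119"
```

Note that the bins partition the whole range, so every recorded rate falls into exactly one category.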
Procalcitonin (PCT) is an acute-phase reactant that has been used to diagnose and potentially track the treatment of sepsis. Procalcitonin values rise initially as the infection sets in and eventually fall with resolution. Its level has been reported to be significantly higher in potential nonsurvivors of a septic episode than among survivors. However, there is also a significant amount of evidence against this. We thus conducted a meta-analysis to pool data from all the available studies regarding PCT levels in survivors and nonsurvivors of sepsis. An extensive literature search was conducted using the key words “procalcitonin,” “sepsis,” and “prognosis.” The references of the relevant studies were also scanned. The data from the eligible studies were extracted and analyzed for any significant pooled mean difference between survivors and nonsurvivors both on days 1 and 3. The mean difference in the day 1 PCT values between survivors and nonsurvivors was found to be statistically significant (P = 0.02). The mean difference on day 3 was also statistically significant (P = 0.002). However, in a subgroup consisting of studies on patients with severe sepsis and septic shock, day 1 difference was not found to be significant (P = 0.62). We found heterogeneity of 90% in our study population, which decreased to 62% after exclusion of studies conducted in emergency department patients. Procalcitonin levels in early stages of sepsis are significantly lower among survivors as compared with nonsurvivors of sepsis.