Biblio du mois: November 2017

December 4, 2017 – La Biblio du mois

 

Come on, it's Christmas ahead of schedule: the AJAR Advent biblio-calendar!

That's right: 1 article a day until the big day, just for you.

This month, plenty of very interesting French studies (yes, you have to fly your colours when they deserve it). It's all here: infectious diseases, ventilation, cardiac arrest, transfusion, surgery, some nephrology, central lines, cancer, pain... for DESAR and MIR residents (and well beyond)! We're ending the year on a high note (especially after this autumn's events)!

Bonus studies for family dinner-table discussions:

When will we get generic fecal capsules? (Or you can always cite the BJA piece on the fine profession of anesthesiologist-intensivist ;-))

And for those working over the holidays, we're thinking of you, especially after the Pincus study: here we go with all-night gamma-nail cases?

You can always comfort yourself on call with the study on coffee and its benefits 😉

See you next year for new events even more innovative than this year's! Follow us on our social channels so you don't miss a thing:

bonjour@ajar-online.fr

Twitter: @AJAnesthRea

Facebook: fb.com/AJARParis – LinkedIn

 

 

 

 

Transfusion strategies in cardiac surgery

 

Mazer et al., NEJM, 2017

http://www.nejm.org/doi/full/10.1056/NEJMoa1711818?query=TOC

DOI: 10.1056/NEJMoa1711818

 

Background

The effect of a restrictive versus liberal red-cell transfusion strategy on clinical outcomes in patients undergoing cardiac surgery remains unclear.

Methods

In this multicenter, open-label, noninferiority trial, we randomly assigned 5243 adults undergoing cardiac surgery who had a European System for Cardiac Operative Risk Evaluation (EuroSCORE) I of 6 or more (on a scale from 0 to 47, with higher scores indicating a higher risk of death after cardiac surgery) to a restrictive red-cell transfusion threshold (transfuse if hemoglobin level was <7.5 g per deciliter, starting from induction of anesthesia) or a liberal red-cell transfusion threshold (transfuse if hemoglobin level was <9.5 g per deciliter in the operating room or intensive care unit [ICU] or was <8.5 g per deciliter in the non-ICU ward). The primary composite outcome was death from any cause, myocardial infarction, stroke, or new-onset renal failure with dialysis by hospital discharge or by day 28, whichever came first. Secondary outcomes included red-cell transfusion and other clinical outcomes.

Results

The primary outcome occurred in 11.4% of the patients in the restrictive-threshold group, as compared with 12.5% of those in the liberal-threshold group (absolute risk difference, −1.11 percentage points; 95% confidence interval [CI], −2.93 to 0.72; odds ratio, 0.90; 95% CI, 0.76 to 1.07; P<0.001 for noninferiority). Mortality was 3.0% in the restrictive-threshold group and 3.6% in the liberal-threshold group (odds ratio, 0.85; 95% CI, 0.62 to 1.16). Red-cell transfusion occurred in 52.3% of the patients in the restrictive-threshold group, as compared with 72.6% of those in the liberal-threshold group (odds ratio, 0.41; 95% CI, 0.37 to 0.47). There were no significant between-group differences with regard to the other secondary outcomes.

Conclusions

In patients undergoing cardiac surgery who were at moderate-to-high risk for death, a restrictive strategy regarding red-cell transfusion was noninferior to a liberal strategy with respect to the composite outcome of death from any cause, myocardial infarction, stroke, or new-onset renal failure with dialysis, with less blood transfused.
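If you want to see how the headline numbers hang together, here is a minimal back-of-the-envelope sketch that recomputes the absolute risk difference and a naive normal-approximation 95% CI from the event rates quoted above. The group sizes are an assumption (roughly an even split of the 5243 randomized patients), and this is not the authors' analysis, which was odds-ratio based with a prespecified noninferiority margin not restated in the abstract.

```python
# Back-of-the-envelope check of the TRICS III primary outcome (illustrative only;
# the trial's own analysis used odds ratios and a prespecified noninferiority margin).
from math import sqrt

# Assumed ~even split of the 5243 randomized patients (exact group sizes not in the abstract)
n_restrictive, n_liberal = 2620, 2623
p_restrictive, p_liberal = 0.114, 0.125  # primary-outcome rates reported in the abstract

rd = p_restrictive - p_liberal  # absolute risk difference
se = sqrt(p_restrictive * (1 - p_restrictive) / n_restrictive
          + p_liberal * (1 - p_liberal) / n_liberal)
lo, hi = rd - 1.96 * se, rd + 1.96 * se

print(f"Risk difference: {rd*100:.2f} pp, 95% CI ({lo*100:.2f} to {hi*100:.2f})")
# ~ -1.1 pp (about -2.9 to 0.7), close to the abstract's -1.11 (-2.93 to 0.72);
# noninferiority is declared if the upper bound stays below the prespecified margin.
```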

 

Cardiac arrest during competitive sports events?

 

Landry et al., NEJM, 2017

http://www.nejm.org/doi/full/10.1056/NEJMoa1615710

DOI: 10.1056/NEJMoa1615710

 

Background

The incidence of sudden cardiac arrest during participation in sports activities remains unknown. Preparticipation screening programs aimed at preventing sudden cardiac arrest during sports activities are thought to be able to identify at-risk athletes; however, the efficacy of these programs remains controversial. We sought to identify all sudden cardiac arrests that occurred during participation in sports activities within a specific region of Canada and to determine their causes.

Methods

In this retrospective study, we used the Rescu Epistry cardiac arrest database (which contains records of every cardiac arrest attended by paramedics in the network region) to identify all out-of-hospital cardiac arrests that occurred from 2009 through 2014 in persons 12 to 45 years of age during participation in a sport. Cases were adjudicated as sudden cardiac arrest (i.e., having a cardiac cause) or as an event resulting from a noncardiac cause, on the basis of records from multiple sources, including ambulance call reports, autopsy reports, in-hospital data, and records of direct interviews with patients or family members.

Results

Over the course of 18.5 million person-years of observation, 74 sudden cardiac arrests occurred during participation in a sport; of these, 16 occurred during competitive sports and 58 occurred during noncompetitive sports. The incidence of sudden cardiac arrest during competitive sports was 0.76 cases per 100,000 athlete-years, with 43.8% of the athletes surviving until they were discharged from the hospital. Among the competitive athletes, two deaths were attributed to hypertrophic cardiomyopathy and none to arrhythmogenic right ventricular cardiomyopathy. Three cases of sudden cardiac arrest that occurred during participation in competitive sports were determined to have been potentially identifiable if the athletes had undergone preparticipation screening.

Conclusions

In our study involving persons who had out-of-hospital cardiac arrest, the incidence of sudden cardiac arrest during participation in competitive sports was 0.76 cases per 100,000 athlete-years. The occurrence of sudden cardiac arrest due to structural heart disease was uncommon during participation in competitive sports.
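A quick sanity check of the reported rate, as a sketch: 16 competitive-sport arrests at 0.76 per 100,000 athlete-years imply roughly 2.1 million athlete-years of competitive-sport exposure, a denominator that is not stated directly in the abstract and is back-calculated here.

```python
# Incidence-rate arithmetic implied by the abstract (the competitive athlete-year
# denominator is back-calculated, not a reported figure).
cases_competitive = 16
incidence_per_100k = 0.76

athlete_years = cases_competitive / incidence_per_100k * 100_000
print(f"Implied exposure: ~{athlete_years/1e6:.1f} million competitive athlete-years")
# For scale, the whole cohort contributed 18.5 million person-years of observation,
# during which 74 arrests occurred during any sport (16 competitive, 58 noncompetitive).
```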

STEC in your flour?

 

Crowe et al., NEJM, 2017

http://www.nejm.org/doi/full/10.1056/NEJMoa1615910

DOI: 10.1056/NEJMoa1615910

Background

In 2016, a multijurisdictional team investigated an outbreak of Shiga toxin–producing Escherichia coli (STEC) serogroup O121 and O26 infections linked to contaminated flour from a large domestic producer.

Methods

A case was defined as infection with an outbreak strain in which illness onset was between December 21, 2015, and September 5, 2016. To identify exposures associated with the outbreak, outbreak cases were compared with non-STEC enteric illness cases, matched according to age group, sex, and state of residence. Products suspected to be related to the outbreak were collected for STEC testing, and a common point of contamination was sought. Whole-genome sequencing was performed on isolates from clinical and food samples.

Results

A total of 56 cases were identified in 24 states. Univariable exact conditional logistic-regression models of 22 matched sets showed that infection was significantly associated with the use of one brand of flour (odds ratio, 21.04; 95% confidence interval [CI], 4.69 to 94.37) and with tasting unbaked homemade dough or batter (odds ratio, 36.02; 95% CI, 4.63 to 280.17). Laboratory testing isolated the outbreak strains from flour samples, and whole-genome sequencing revealed that the isolates from clinical and food samples were closely related to one another genetically. Trace-back investigation identified a common flour-production facility.

Conclusions

This investigation implicated raw flour as the source of an outbreak of STEC infections. Although it is a low-moisture food, raw flour can be a vehicle for foodborne pathogens.

 

Choosing a valve replacement: mechanical or biologic?

 

Goldstone et al., NEJM, 2017

http://www.nejm.org/doi/full/10.1056/NEJMoa1613792

DOI: 10.1056/NEJMoa1613792

 

Background

In patients undergoing aortic-valve or mitral-valve replacement, either a mechanical or biologic prosthesis is used. Biologic prostheses have been increasingly favored despite limited evidence supporting this practice.

Methods

We compared long-term mortality and rates of reoperation, stroke, and bleeding between inverse-probability-weighted cohorts of patients who underwent primary aortic-valve replacement or mitral-valve replacement with a mechanical or biologic prosthesis in California in the period from 1996 through 2013. Patients were stratified into different age groups on the basis of valve position (aortic vs. mitral valve).

Results

From 1996 through 2013, the use of biologic prostheses increased substantially for aortic-valve and mitral-valve replacement, from 11.5% to 51.6% for aortic-valve replacement and from 16.8% to 53.7% for mitral-valve replacement. Among patients who underwent aortic-valve replacement, receipt of a biologic prosthesis was associated with significantly higher 15-year mortality than receipt of a mechanical prosthesis among patients 45 to 54 years of age (30.6% vs. 26.4% at 15 years; hazard ratio, 1.23; 95% confidence interval [CI], 1.02 to 1.48; P=0.03) but not among patients 55 to 64 years of age. Among patients who underwent mitral-valve replacement, receipt of a biologic prosthesis was associated with significantly higher mortality than receipt of a mechanical prosthesis among patients 40 to 49 years of age (44.1% vs. 27.1%; hazard ratio, 1.88; 95% CI, 1.35 to 2.63; P<0.001) and among those 50 to 69 years of age (50.0% vs. 45.3%; hazard ratio, 1.16; 95% CI, 1.04 to 1.30; P=0.01). The incidence of reoperation was significantly higher among recipients of a biologic prosthesis than among recipients of a mechanical prosthesis. Patients who received mechanical valves had a higher cumulative incidence of bleeding and, in some age groups, stroke than did recipients of a biologic prosthesis.

Conclusions

The long-term mortality benefit that was associated with a mechanical prosthesis, as compared with a biologic prosthesis, persisted until 70 years of age among patients undergoing mitral-valve replacement and until 55 years of age among those undergoing aortic-valve replacement.

 

 

Review on diuretics in heart failure!

 

Ellison et al., NEJM, 2017

http://www.nejm.org/doi/full/10.1056/NEJMra1703100

DOI: 10.1056/NEJMra1703100

 

 

Review on NASH

 

Diehl et al., NEJM, 2017

http://www.nejm.org/doi/full/10.1056/NEJMra1503519

DOI: 10.1056/NEJMra1503519

 

Review on acute GVHD

Zeiser et al., NEJM, 2017

http://www.nejm.org/doi/full/10.1056/NEJMra1609337?query=TOC

DOI: 10.1056/NEJMra1609337

 

 

 

A model built on routine tests to predict progression from AKI to CKD?

 

James et al., JAMA, 2017

https://jamanetwork.com/journals/jama/article-abstract/2662889

doi:10.1001/jama.2017.16326

 

 

Importance  Some patients will develop chronic kidney disease after a hospitalization with acute kidney injury; however, no risk-prediction tools have been developed to identify high-risk patients requiring follow-up.

Objective  To derive and validate predictive models for progression of acute kidney injury to advanced chronic kidney disease.

Design, Setting, and Participants  Data from 2 population-based cohorts of patients with a prehospitalization estimated glomerular filtration rate (eGFR) of more than 45 mL/min/1.73 m2 and who had survived hospitalization with acute kidney injury (defined by a serum creatinine increase during hospitalization > 0.3 mg/dL or > 50% of their prehospitalization baseline), were used to derive and validate multivariable prediction models. The risk models were derived from 9973 patients hospitalized in Alberta, Canada (April 2004-March 2014, with follow-up to March 2015). The risk models were externally validated with data from a cohort of 2761 patients hospitalized in Ontario, Canada (June 2004-March 2012, with follow-up to March 2013).

Exposures  Demographic, laboratory, and comorbidity variables measured prior to discharge.

Main Outcomes and Measures  Advanced chronic kidney disease was defined by a sustained reduction in eGFR less than 30 mL/min/1.73 m2 for at least 3 months during the year after discharge. All participants were followed up for up to 1 year.

Results  The participants (mean [SD] age, 66 [15] years in the derivation and internal validation cohorts and 69 [11] years in the external validation cohort; 40%-43% women per cohort) had a mean (SD) baseline serum creatinine level of 1.0 (0.2) mg/dL and more than 20% had stage 2 or 3 acute kidney injury. Advanced chronic kidney disease developed in 408 (2.7%) of 9973 patients in the derivation cohort and 62 (2.2%) of 2761 patients in the external validation cohort. In the derivation cohort, 6 variables were independently associated with the outcome: older age, female sex, higher baseline serum creatinine value, albuminuria, greater severity of acute kidney injury, and higher serum creatinine value at discharge. In the external validation cohort, a multivariable model including these 6 variables had a C statistic of 0.81 (95% CI, 0.75-0.86) and improved discrimination and reclassification compared with reduced models that included age, sex, and discharge serum creatinine value alone (integrated discrimination improvement, 2.6%; 95% CI, 1.1%-4.0%; categorical net reclassification index, 13.5%; 95% CI, 1.9%-25.1%) or included age, sex, and acute kidney injury stage alone (integrated discrimination improvement, 8.0%; 95% CI, 5.1%-11.0%; categorical net reclassification index, 79.9%; 95% CI, 60.9%-98.9%).

Conclusions and Relevance  A multivariable model using routine laboratory data was able to predict advanced chronic kidney disease following hospitalization with acute kidney injury. The utility of this model in clinical care requires further research.
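The cohort's entry criterion is a simple creatinine rule (an in-hospital rise of more than 0.3 mg/dL or more than 50% over the prehospitalization baseline); here is a minimal sketch of that rule, with the thresholds taken from the abstract and the function name our own.

```python
# Sketch of the AKI entry criterion described in the abstract: in-hospital serum
# creatinine rise > 0.3 mg/dL or > 50% above the prehospitalization baseline (mg/dL).
def had_aki(baseline_cr: float, peak_inhospital_cr: float) -> bool:
    absolute_rise = peak_inhospital_cr - baseline_cr
    relative_rise = peak_inhospital_cr > 1.5 * baseline_cr
    return absolute_rise > 0.3 or relative_rise

# Example with the cohort's mean baseline of 1.0 mg/dL:
print(had_aki(1.0, 1.4))  # True (0.4 mg/dL absolute rise)
print(had_aki(1.0, 1.2))  # False (neither criterion met)
```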

 

 

Poop capsules to treat yourself?

(You can't stop progress ;-))

 

Kao et al., JAMA, 2017

https://jamanetwork.com/journals/jama/article-abstract/2664458

doi:10.1001/jama.2017.17077

 

Importance  Fecal microbiota transplantation (FMT) is effective in preventing recurrent Clostridium difficile infection (RCDI). However, it is not known whether clinical efficacy differs by route of delivery.

Objective  To determine whether FMT by oral capsule is noninferior to colonoscopy delivery in efficacy.

Design, Setting, and Participants  Noninferiority, unblinded, randomized trial conducted in 3 academic centers in Alberta, Canada. A total of 116 adult patients with RCDI were enrolled between October 2014 and September 2016, with follow-up to December 2016. The noninferiority margin was 15%.

Interventions  Participants were randomly assigned to FMT by capsule or by colonoscopy at a 1:1 ratio.

Main Outcomes and Measures  The primary outcome was the proportion of patients without RCDI 12 weeks after FMT. Secondary outcomes included (1) serious and minor adverse events, (2) changes in quality of life by the 36-Item Short Form Survey on a scale of 0 (worst possible quality of life) to 100 (best quality of life), and (3) patient perception on a scale of 1 (not at all unpleasant) to 10 (extremely unpleasant) and satisfaction on a scale of 1 (best) to 10 (worst).

Results  Among 116 patients randomized (mean [SD] age, 58 [19] years; 79 women [68%]), 105 (91%) completed the trial, with 57 patients randomized to the capsule group and 59 to the colonoscopy group. In per-protocol analysis, prevention of RCDI after a single treatment was achieved in 96.2% in both the capsule group (51/53) and the colonoscopy group (50/52) (difference, 0%; 1-sided 95% CI, −6.1% to infinity; P < .001), meeting the criterion for noninferiority. One patient in each group died of underlying cardiopulmonary illness unrelated to FMT. Rates of minor adverse events were 5.4% for the capsule group vs 12.5% for the colonoscopy group. There was no significant between-group difference in improvement in quality of life. A significantly greater proportion of participants receiving capsules rated their experience as “not at all unpleasant” (66% vs 44%; difference, 22% [95% CI, 3%-40%]; P = .01).

Conclusions and Relevance  Among adults with RCDI, FMT via oral capsules was not inferior to delivery by colonoscopy for preventing recurrent infection over 12 weeks. Treatment with oral capsules may be an effective approach to treating RCDI.
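The per-protocol arithmetic is easy to reproduce; a minimal sketch below uses the counts and the 15% noninferiority margin given in the abstract (the authors' exact one-sided CI method is not restated here, so only the reported bound is quoted in the comment).

```python
# Per-protocol success proportions and the noninferiority margin from the abstract.
capsule_success, capsule_n = 51, 53
colo_success, colo_n = 50, 52

p_capsule = capsule_success / capsule_n   # ~0.962
p_colo = colo_success / colo_n            # ~0.962
diff = p_capsule - p_colo                 # ~0 (the abstract reports 0%)

margin = 0.15
print(f"Difference: {diff*100:.2f} pp; noninferiority margin: -{margin*100:.0f} pp")
# Noninferiority holds because the one-sided 95% CI lower bound reported by the
# authors (-6.1%) stays above -15%.
```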

 

Hip fracture: an emergency to be operated on within 24 h?

 

Pincus et al., JAMA, 2017

https://jamanetwork.com/journals/jama/article-abstract/2664460

doi:10.1001/jama.2017.17606

 

Importance  Although wait times for hip fracture surgery have been linked to mortality and are being used as quality-of-care indicators worldwide, controversy exists about the duration of the wait that leads to complications.

Objective  To use population-based wait-time data to identify the optimal time window in which to conduct hip fracture surgery before the risk of complications increases.

Design, Setting, and Participants  Population-based, retrospective cohort study of adults undergoing hip fracture surgery between April 1, 2009, and March 31, 2014, at 72 hospitals in Ontario, Canada. Risk-adjusted restricted cubic splines modeled the probability of each complication according to wait time. The inflection point (in hours) when complications began to increase was used to define early and delayed surgery. To evaluate the robustness of this definition, outcomes among propensity-score matched early and delayed surgical patients were compared using percent absolute risk differences (RDs, with 95% CIs).

Exposure  Time elapsed from hospital arrival to surgery (in hours).

Main Outcomes and Measures  Mortality within 30 days. Secondary outcomes included a composite of mortality or other medical complications (myocardial infarction, deep vein thrombosis, pulmonary embolism, and pneumonia).

Results  Among 42 230 patients with hip fracture (mean [SD] age, 80.1 [10.7] years, 70.5% women) who met study entry criteria, overall mortality at 30 days was 7.0%. The risk of complications increased when wait times were greater than 24 hours, irrespective of the complication considered. Compared with 13 731 propensity-score matched patients who received surgery earlier, 13 731 patients who received surgery after 24 hours had a significantly higher risk of 30-day mortality (898 [6.5%] vs 790 [5.8%]; % absolute RD, 0.79; 95% CI, 0.23-1.35) and the composite outcome (1680 [12.2%] vs 1383 [10.1%]; % absolute RD, 2.16; 95% CI, 1.43-2.89).

Conclusions and Relevance  Among adults undergoing hip fracture surgery, increased wait time was associated with a greater risk of 30-day mortality and other complications. A wait time of 24 hours may represent a threshold defining higher risk.
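The matched-cohort risk differences can be recomputed directly from the counts in the abstract; the sketch below is a naive recalculation, whereas the confidence intervals quoted above come from the authors' matched analysis.

```python
# Recomputing the absolute risk differences from the matched-cohort counts.
n_per_group = 13_731  # propensity-score matched patients in each group

def pct(events: int) -> float:
    return events / n_per_group * 100

rd_mortality = pct(898) - pct(790)    # 30-day mortality: ~6.5% - ~5.8%
rd_composite = pct(1680) - pct(1383)  # composite outcome: ~12.2% - ~10.1%
print(f"Mortality RD: {rd_mortality:.2f} pp; composite RD: {rd_composite:.2f} pp")
# ~0.79 and ~2.16 percentage points, matching the abstract.
```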

 

Pip-tazo plus vancomycin associated with more acute kidney injury in children?

 

Downes et al., JAMA Pediatrics, 2017

https://jamanetwork.com.hellebore.biusante.parisdescartes.fr/journals/jamapediatrics/fullarticle/2654886

doi:10.1001/jamapediatrics.2017.3219

 

Importance  β-Lactam antibiotics are often coadministered with intravenous (IV) vancomycin hydrochloride for children with suspected serious infections. For adults, the combination of IV vancomycin plus piperacillin sodium/tazobactam sodium is associated with a higher risk of acute kidney injury (AKI) compared with vancomycin plus 1 other β-lactam antibiotic. However, few studies have evaluated the safety of this combination for children.

Objective  To assess the risk of AKI in children during concomitant therapy with vancomycin and 1 antipseudomonal β-lactam antibiotic throughout the first week of hospitalization.

Design, Setting, and Participants  This retrospective cohort study focused on children hospitalized for 3 or more days who received IV vancomycin plus 1 other antipseudomonal β-lactam combination therapy at 1 of 6 large children’s hospitals from January 1, 2007, through December 31, 2012. The study used the Pediatric Health Information System Plus database, which contains administrative and laboratory data from 6 pediatric hospitals in the United States. Patients with underlying kidney disease or abnormal serum creatinine levels on hospital days 0 to 2 were among those excluded. Patients 6 months to 18 years of age who were admitted through the emergency department of the hospital were included. Data were collected from July 2015 to March 2016. Data analysis took place from April 2016 through July 2017. (Exact dates are not available because the data collection and analysis processes were iterative.)

Main Outcomes and Measures  The primary outcome was AKI on hospital days 3 to 7 and within 2 days of receiving combination therapy. Acute kidney injury was defined using KDIGO criteria and was based on changes in serum creatinine level from hospital days 0 to 2 through hospital days 3 to 7. Multiple logistic regression was performed using a discrete-time failure model to test the association between AKI and receipt of IV vancomycin plus piperacillin/tazobactam or vancomycin plus 1 other antipseudomonal β-lactam antibiotic.

Results  A total of 1915 hospitalized children who received combination therapy were identified. Of the 1915 patients, a total of 866 (45.2%) were female and 1049 (54.8%) were male, 1049 (54.8%) were identified as white in race/ethnicity, and the median (interquartile range) age was 5.6 (2.1-12.7) years. Among the cohort who received IV vancomycin plus 1 other antipseudomonal β-lactam antibiotic, 157 patients (8.2%) had antibiotic-associated AKI. This number included 117 of 1009 patients (11.7%) who received IV vancomycin plus piperacillin/tazobactam combination therapy. After adjustment for age, intensive care unit level of care, receipt of nephrotoxins, and hospital, IV vancomycin plus piperacillin/tazobactam combination therapy was associated with higher odds of AKI each hospital day compared with vancomycin plus 1 other antipseudomonal β-lactam antibiotic combination (adjusted odds ratio, 3.40; 95% CI, 2.26-5.14).

Conclusions and Relevance  Coadministration of IV vancomycin and piperacillin/tazobactam may increase the risk of AKI in hospitalized children. Pediatricians must be cognizant of the potential added risk of this combination therapy when making empirical antibiotic choices.

 

 

 

Better to prevent than to cure AKI in cardiac surgery?

 

Meersch et al., ICM, 2017

http://icmjournal.esicm.org/journals/abstract.html?v=43&j=134&i=11&a=4670_10.1007_s00134-016-4670-3&doi=

 

Purpose

Care bundles are recommended in patients at high risk for acute kidney injury (AKI), although they have not been proven to improve outcomes. We sought to establish the efficacy of an implementation of the Kidney Disease Improving Global Outcomes (KDIGO) guidelines to prevent cardiac surgery-associated AKI in high risk patients defined by renal biomarkers.

Methods

In this single-center trial, we examined the effect of a “KDIGO bundle” consisting of optimization of volume status and hemodynamics, avoidance of nephrotoxic drugs, and preventing hyperglycemia in high risk patients defined as urinary [TIMP-2]·[IGFBP7] > 0.3 undergoing cardiac surgery. The primary endpoint was the rate of AKI defined by KDIGO criteria within the first 72 h after surgery. Secondary endpoints included AKI severity, need for dialysis, length of stay, and major adverse kidney events (MAKE) at days 30, 60, and 90.

Results

AKI was significantly reduced with the intervention compared to controls [55.1 vs. 71.7%; ARR 16.6% (95% CI 5.5–27.9%); p = 0.004]. The implementation of the bundle resulted in significantly improved hemodynamic parameters at different time points (p < 0.05), less hyperglycemia (p < 0.001) and use of ACEi/ARBs (p < 0.001) compared to controls. Rates of moderate to severe AKI were also significantly reduced by the intervention compared to controls. There were no significant effects on other secondary outcomes.

Conclusion

An implementation of the KDIGO guidelines compared with standard care reduced the frequency and severity of AKI after cardiac surgery in high risk patients. Adequately powered multicenter trials are warranted to examine mortality and long-term renal outcomes.
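One intuitive way to read the 16.6% absolute risk reduction is as a number needed to treat; the NNT below is our back-of-the-envelope derivation and is not a figure reported in the abstract.

```python
# Absolute risk reduction and implied number needed to treat (the NNT is derived
# here for illustration, not reported by the authors).
aki_control = 0.717
aki_bundle = 0.551

arr = aki_control - aki_bundle   # 0.166
nnt = 1 / arr                    # ~6
print(f"ARR: {arr*100:.1f} pp -> NNT ≈ {nnt:.0f} high-risk patients per AKI episode avoided")
```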

 

The lateral Trendelenburg position against VAP?

 

Bassi et al., ICM, 2017

http://icmjournal.esicm.org/journals/abstract.html?v=43&j=134&i=11&a=4858_10.1007_s00134-017-4858-1&doi=

 

 

Purpose

The lateral Trendelenburg position (LTP) may hinder the primary pathophysiologic mechanism of ventilator-associated pneumonia (VAP). We investigated whether placing patients in the LTP would reduce the incidence of VAP in comparison with the semirecumbent position (SRP).

Methods

This was a randomized, multicenter, controlled study in invasively ventilated critically ill patients. Two preplanned interim analyses were performed. Patients were randomized to be placed in the LTP or the SRP. The primary outcome, assessed by intention-to-treat analysis, was incidence of microbiologically confirmed VAP. Major secondary outcomes included mortality, duration of mechanical ventilation, and intensive care unit length of stay.

Results

At the second interim analysis, the trial was stopped because of low incidence of VAP, lack of benefit in secondary outcomes, and occurrence of adverse events. A total of 194 patients in the LTP group and 201 in the SRP group were included in the final intention-to-treat analysis. The incidence of microbiologically confirmed VAP was 0.5% (1/194) and 4.0% (8/201) in LTP and SRP patients, respectively (relative risk 0.13, 95% CI 0.02–1.03, p = 0.04). The 28-day mortality was 30.9% (60/194) and 26.4% (53/201) in LTP and SRP patients, respectively (relative risk 1.17, 95% CI 0.86–1.60, p = 0.32). Likewise, no differences were found in other secondary outcomes. Six serious adverse events were described in LTP patients (p = 0.01 vs. SRP).

Conclusions

The LTP slightly decreased the incidence of microbiologically confirmed VAP. Nevertheless, given the early termination of the trial, the low incidence of VAP, and the adverse events associated with the LTP, the study failed to prove any significant benefit. Further clinical investigation is strongly warranted; however, at this time, the LTP cannot be recommended as a VAP preventive measure.

 

Ultrasound-guided subclavian catheterization: out-of-plane > in-plane?

 

Vezzani et al., ICM, 2017

http://icmjournal.esicm.org/journals/abstract.html?v=43&j=134&i=11&a=4756_10.1007_s00134-017-4756-6&doi=

 

Purpose

The aim of this study was to compare the success rate and safety of short-axis versus long-axis approaches to ultrasound-guided subclavian vein cannulation.

Methods

A total of 190 patients requiring central venous cannulation following cardiac surgery were randomized to either short-axis or long-axis ultrasound-guided cannulation of the subclavian vein. Each cannulation was performed by anesthesiologists with at least 3 years’ experience of ultrasound-guided central vein cannulation (>150 procedures/year, 50% short-axis and 50% long-axis). Success rate, insertion time, number of needle redirections, number of separate skin or vessel punctures, rate of mechanical complications, catheter misplacements, and incidence of central line-associated bloodstream infection were documented for each procedure.

Results

The subclavian vein was successfully cannulated in all 190 patients. The mean insertion time was significantly shorter (p = 0.040) in the short-axis group (69 ± 74 s) than in the long-axis group (98 ± 103 s). The short-axis group was also associated with a higher overall success rate (96 vs. 78%, p < 0.001), first-puncture success rate (86 vs. 67%, p = 0.003), and first-puncture single-pass success rate (72 vs. 48%, p = 0.002), and with fewer needle redirections (0.39 ± 0.88 vs. 0.88 ± 1.15, p = 0.001), skin punctures (1.12 ± 0.38 vs. 1.28 ± 0.54, p = 0.019), and complications (3 vs. 13%, p = 0.028).

Conclusions

The short-axis procedure for ultrasound-guided subclavian cannulation offers advantages over the long-axis approach in cardiac surgery patients.

 

VHYPER study: post-extubation NIV in patients with chronic respiratory failure?

 

Vargas et al., ICM, 2017

http://icmjournal.esicm.org/journals/abstract.html?v=43&j=134&i=11&a=4785_10.1007_s00134-017-4785-1&doi=

 

Purpose

Early noninvasive ventilation (NIV) after extubation decreases the risk of respiratory failure and lowers 90-day mortality in patients with hypercapnia. Patients with chronic respiratory disease are at risk of extubation failure. Therefore, it could be useful to determine the role of NIV with a discontinuous approach, not limited to patients with hypercapnia. We assessed the efficacy of early NIV in decreasing respiratory failure after extubation in patients with chronic respiratory disorders.

Methods

A prospective randomized controlled multicenter study was conducted. We enrolled 144 mechanically ventilated patients with chronic respiratory disorders who tolerated a spontaneous breathing trial. Patients were randomly allocated after extubation to receive either NIV (NIV group, n = 72), performed with a discontinuous approach, for the first 48 h, or conventional oxygen treatment (usual care group, n = 72). The primary endpoint was decreased respiratory failure within 48 h after extubation. Analysis was by intention to treat. This trial was registered with ClinicalTrials.gov (NCT01047852).

Results

Respiratory failure after extubation was less frequent in the NIV group: 6 (8.5%) versus 20 (27.8%); p = 0.0016. Six patients (8.5%) in the NIV group versus 13 (18.1%) in the usual care group were reintubated; p = 0.09. Intensive care unit (ICU) mortality and 90-day mortality did not differ significantly between the two groups (p = 0.28 and p = 0.33, respectively). Median postrandomization ICU length of stay was lower in the usual care group: 3 days (IQR 2–6) versus 4 days (IQR 2–7; p = 0.008). Patients with hypercapnia during a spontaneous breathing trial were at risk of developing postextubation respiratory failure [adjusted odds ratio (95% CI) = 4.56 (1.59–14.00); p = 0.006] and being intubated [adjusted odds ratio (95% CI) = 3.60 (1.07–13.31); p = 0.04].

Conclusions

Early NIV performed following a sequential protocol for the first 48 h after extubation decreased the risk of respiratory failure in patients with chronic respiratory disorders. Reintubation and mortality did not differ between NIV and conventional oxygen therapy.

 

 

EAT-ICU study: nutrition targets, but for which outcome?

 

Allingstrup et al., ICM, 2017

http://icmjournal.esicm.org/journals/abstract.html?v=43&j=134&i=11&a=4880_10.1007_s00134-017-4880-3&doi=

 

Purpose

We assessed the effects of early goal-directed nutrition (EGDN) vs. standard nutritional care in adult intensive care unit (ICU) patients.

Methods

We randomised acutely admitted, mechanically ventilated ICU patients expected to stay longer than 3 days in the ICU. In the EGDN group we estimated nutritional requirements by indirect calorimetry and 24-h urinary urea aiming at covering 100% of requirements from the first full trial day using enteral and parenteral nutrition. In the standard of care group we aimed at providing 25 kcal/kg/day by enteral nutrition. If this was not met by day 7, patients were supplemented with parenteral nutrition. The primary outcome was physical component summary (PCS) score of SF-36 at 6 months. We performed multiple imputation for data of the non-responders.

Results

We randomised 203 patients and included 199 in the intention-to-treat analyses; baseline variables were reasonably balanced between the two groups. The EGDN group had less negative energy (p < 0.001) and protein (p < 0.001) balances in the ICU as compared to the standard of care group. The PCS score at 6 months did not differ between the two groups (mean difference 0.0, 95% CI −5.9 to 5.8, p = 0.99); neither did mortality, rates of organ failures, serious adverse reactions or infections in the ICU, length of ICU or hospital stay, or days alive without life support at 90 days.

Conclusions

EGDN did not appear to affect physical quality of life at 6 months or other important outcomes as compared to standard nutrition care in acutely admitted, mechanically ventilated, adult ICU patients.

 

 

The advent of BiPAP-APRV in ARDS?

 

Zhou et al., ICM, 2017

http://icmjournal.esicm.org/journals/abstract.html?v=43&j=134&i=11&a=4912_10.1007_s00134-017-4912-z&doi=

 

Purpose

Experimental animal models of acute respiratory distress syndrome (ARDS) have shown that the updated airway pressure release ventilation (APRV) methodologies may significantly improve oxygenation, maximize lung recruitment, and attenuate lung injury, without circulatory depression. This led us to hypothesize that early application of APRV in patients with ARDS would allow pulmonary function to recover faster and would reduce the duration of mechanical ventilation as compared with low tidal volume lung protective ventilation (LTV).

Methods

A total of 138 patients with ARDS who received mechanical ventilation for <48 h between May 2015 and October 2016 while in the critical care medicine unit (ICU) of the West China Hospital of Sichuan University were enrolled in the study. Patients were randomly assigned to receive APRV (n = 71) or LTV (n = 67). The settings for APRV were: high airway pressure (Phigh) set at the last plateau airway pressure (Pplat, not to exceed 30 cmH2O) and low airway pressure (Plow) set at 5 cmH2O; the release phase (Tlow) adjusted so that it terminated when expiratory flow had decayed to ≥50% of the peak expiratory flow rate; release frequency of 10–14 cycles/min. The settings for LTV were: target tidal volume of 6 mL/kg of predicted body weight; Pplat not exceeding 30 cmH2O; positive end-expiratory pressure (PEEP) guided by the PEEP–FiO2 table according to the ARDSnet protocol. The primary outcome was the number of days without mechanical ventilation from enrollment to day 28. The secondary endpoints included oxygenation, Pplat, respiratory system compliance, and patient outcomes.

Results

Compared with the LTV group, patients in the APRV group had a higher median number of ventilator-free days {19 [interquartile range (IQR) 8–22] vs. 2 (IQR 0–15); P < 0.001}. This finding was independent of the coexisting differences in chronic disease. The APRV group had a shorter stay in the ICU (P = 0.003). The ICU mortality rate was 19.7% in the APRV group versus 34.3% in the LTV group (P = 0.053) and was associated with better oxygenation and respiratory system compliance, lower Pplat, and less sedation requirement during the first week following enrollment (P < 0.05, repeated-measures analysis of variance).

Conclusions

Compared with LTV, early application of APRV in patients with ARDS improved oxygenation and respiratory system compliance, decreased Pplat and reduced the duration of both mechanical ventilation and ICU stay.
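The Tlow rule described in the methods (stop the release phase before expiratory flow decays below 50% of its peak) can be illustrated with a small sketch; the exponential flow decay below is synthetic, invented purely for demonstration.

```python
# Illustration of the APRV Tlow rule from the methods: the release phase should end
# by the time expiratory flow has decayed to 50% of the peak expiratory flow.
# The flow waveform here is synthetic, for demonstration only.
import numpy as np

t = np.linspace(0.0, 1.0, 200)          # seconds into the release phase
flow = -60.0 * np.exp(-t / 0.3)          # expiratory flow in L/min (negative = expiration)

peak_expiratory_flow = flow.min()        # most negative value (the peak)
threshold = 0.5 * peak_expiratory_flow
idx = int(np.argmax(flow > threshold))   # first sample past 50% decay
print(f"Longest Tlow consistent with the rule ≈ {t[idx]:.2f} s")  # ~0.21 s here
```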

 

Molecular microbiology in the ICU: yes!

 

 

Cambau et al., ICM, 2017

http://icmjournal.esicm.org/journals/abstract.html?v=43&j=134&i=11&a=4766_10.1007_s00134-017-4766-4&doi=

 

 

Purpose

Microbiological diagnosis (MD) of infections remains insufficient. The resulting empirical antimicrobial therapy leads to multidrug resistance and inappropriate treatments. We therefore evaluated the cost-effectiveness of direct molecular detection of pathogens in blood for patients with severe sepsis (SES), febrile neutropenia (FN) and suspected infective endocarditis (SIE).

Methods

Patients were enrolled in a multicentre, open-label, cluster-randomised crossover trial conducted during two consecutive periods, randomly assigned as control period (CP; standard diagnostic workup) or intervention period (IP; additional testing with LightCycler®SeptiFast). Multilevel models used to account for clustering were stratified by clinical setting (SES, FN, SIE).

Results

A total of 1416 patients (907 SES, 440 FN, 69 SIE) were evaluated for the primary endpoint (rate of blood MD). For SES patients, the MD rate was higher during IP than during CP [42.6% (198/465) vs. 28.1% (125/442), odds ratio (OR) 1.89, 95% confidence interval (CI) 1.43–2.50; P < 0.001], with an absolute increase of 14.5% (95% CI 8.4–20.7). A trend towards an association was observed for SIE [35.4% (17/48) vs. 9.5% (2/21); OR 6.22 (0.98–39.6)], but not for FN [32.1% (70/218) vs. 30.2% (67/222), P = 0.66]. Overall, turn-around time was shorter during IP than during CP (22.9 vs. 49.5 h, P < 0.001) and hospital costs were similar (median, mean ± SD: IP €14,826, €18,118 ± 17,775; CP €17,828, €18,653 ± 15,966). Bootstrap analysis of the incremental cost-effectiveness ratio showed weak dominance of intervention in SES patients.

Conclusion

Addition of molecular detection to standard care improves MD and thus efficiency of healthcare resource usage in patients with SES.

 

Guidelines for the management of severe malaria and severe dengue

 

Dondorp et al., ICM, 2017

http://icmjournal.esicm.org/journals/abstract.html?v=43&j=134&i=11&a=4602_10.1007_s00134-016-4602-2&doi=

 

Review on diaphragmatic dysfunction in the ICU

 

Dres et al., ICM, 2017

http://icmjournal.esicm.org/journals/abstract.html?v=43&j=134&i=10&a=4928_10.1007_s00134-017-4928-4&doi=

 

 

 

The heart-lung interaction: a potential source of VILI?

 

 

Katira et al., AJRCCM, 2017

http://www.atsjournals.org/doi/full/10.1164/rccm.201611-2268OC

 

 

Rationale: In the original 1974 in vivo study of ventilator-induced lung injury, Webb and Tierney reported that high Vt with zero positive end-expiratory pressure caused overwhelming lung injury, subsequently shown by others to be due to lung shear stress.

Objectives: To reproduce the lung injury and edema examined in the Webb and Tierney study and to investigate the underlying mechanism thereof.

Methods: Sprague-Dawley rats weighing approximately 400 g received mechanical ventilation for 60 minutes according to the protocol of Webb and Tierney (airway pressures of 14/0, 30/0, 45/10, 45/0 cm H2O). Additional series of experiments (20 min in duration to ensure all animals survived) were studied to assess permeability (n = 4 per group), echocardiography (n = 4 per group), and right and left ventricular pressure (n = 5 and n = 4 per group, respectively).

Measurements and Main Results: The original Webb and Tierney results were replicated in terms of lung/body weight ratio (45/0 > 45/10 ≈ 30/0 ≈ 14/0; P < 0.05) and histology. In 45/0, pulmonary edema was overt and rapid, with survival less than 30 minutes. In 45/0 (but not 45/10), there was an increase in microvascular permeability, cyclical abolition of preload, and progressive dilation of the right ventricle. Although left ventricular end-diastolic pressure decreased in 45/10, it increased in 45/0.

Conclusions: In a classic model of ventilator-induced lung injury, high peak pressure (and zero positive end-expiratory pressure) causes respiratory swings (obliteration during inspiration) in right ventricular filling and pulmonary perfusion, ultimately resulting in right ventricular failure and dilation. Pulmonary edema was due to increased permeability, which was augmented by a modest (approximately 40%) increase in hydrostatic pressure. The lung injury and acute cor pulmonale is likely due to pulmonary microvascular injury, the mechanism of which is uncertain, but which may be due to cyclic interruption and exaggeration of pulmonary blood flow.

 

Validation of qSOFA for community-acquired pneumonia

 

Ranzani et al., AJRCCM, 2017

http://www.atsjournals.org/doi/full/10.1164/rccm.201611-2262OC

 

Rationale: The Sepsis-3 Task Force updated the clinical criteria for sepsis, excluding the need for systemic inflammatory response syndrome (SIRS) criteria. The clinical implications of the proposed flowchart including the quick Sequential (Sepsis-related) Organ Failure Assessment (qSOFA) and SOFA scores are unknown.

Objectives: To perform a clinical decision-making analysis of Sepsis-3 in patients with community-acquired pneumonia.

Methods: This was a cohort study including adult patients with community-acquired pneumonia from two Spanish university hospitals. SIRS, qSOFA, the Confusion, Respiratory Rate and Blood Pressure (CRB) score, modified SOFA (mSOFA), the Confusion, Urea, Respiratory Rate, Blood Pressure and Age (CURB-65) score, and Pneumonia Severity Index (PSI) were calculated with data from the emergency department. We used decision-curve analysis to evaluate the clinical usefulness of each score and the primary outcome was in-hospital mortality.

Measurements and Main Results: Of 6,874 patients, 442 (6.4%) died in-hospital. SIRS presented the worst discrimination, followed by qSOFA, CRB, mSOFA, CURB-65, and PSI. Overall, overestimation of in-hospital mortality and miscalibration was more evident for qSOFA and mSOFA. SIRS had lower net benefit than qSOFA and CRB, significantly increasing the risk of over-treatment and being comparable with the “treat-all” strategy. PSI had higher net benefit than mSOFA and CURB-65 for mortality, whereas mSOFA seemed more applicable when considering mortality/intensive care unit admission. Sepsis-3 flowchart resulted in better identification of patients at high risk of mortality.

Conclusions: qSOFA and CRB outperformed SIRS and presented better clinical usefulness as prompt tools for patients with community-acquired pneumonia in the emergency department. Among the tools for a comprehensive patient assessment, PSI had the best decision-aid tool profile.
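As a reminder of what the qSOFA discussed above actually asks for, here is a minimal calculator; the three criteria (respiratory rate ≥22/min, systolic blood pressure ≤100 mmHg, altered mentation) come from the Sepsis-3 definition and are not restated in the abstract itself.

```python
# Minimal qSOFA calculator per the Sepsis-3 definition (criteria are standard
# background knowledge, not taken from this abstract).
def qsofa(resp_rate: float, sbp: float, altered_mentation: bool) -> int:
    score = 0
    score += resp_rate >= 22        # respiratory rate >= 22 breaths/min
    score += sbp <= 100             # systolic blood pressure <= 100 mmHg
    score += altered_mentation      # altered mental status (GCS < 15)
    return score

# A score >= 2 is the usual trigger for suspecting a worse outcome.
print(qsofa(resp_rate=24, sbp=95, altered_mentation=False))  # 2
```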

 

 

Is intraoperative oliguria predictive of AKI?

 

Mizota et al., BJA, 2017

https://academic-oup-com.hellebore.biusante.parisdescartes.fr/bja/article-abstract/119/6/1127/4372406?redirectedFrom=fulltext

 

 

Background

The threshold of intraoperative urine output below which the risk of acute kidney injury (AKI) increases is unclear. The aim of this retrospective cohort study was to investigate the relationship between intraoperative urine output during major abdominal surgery and the development of postoperative AKI and to identify an optimal threshold for predicting the differential risk of AKI.

Methods

Perioperative data were collected retrospectively on 3560 patients undergoing major abdominal surgery (liver, colorectal, gastric, pancreatic, or oesophageal resection) at Kyoto University Hospital. We evaluated the relationship between intraoperative urine output and the development of postoperative AKI as defined by recent guidelines. Logistic regression analysis was performed to adjust for patient and operative variables, and the minimum P-value approach was used to determine the threshold of intraoperative urine output that independently altered the risk of AKI.

Results

The overall incidence of AKI in the study population was 6.3%. Using the minimum P-value approach, a threshold of 0.3 ml kg−1 h−1 was identified, below which there was an increased risk of AKI (adjusted odds ratio, 2.65; 95% confidence interval, 1.77–3.97; P<0.001). The addition of oliguria <0.3 ml kg−1 h−1 to a model with conventional risk factors significantly improved risk stratification for AKI (net reclassification improvement, 0.159; 95% confidence interval, 0.049–0.270; P=0.005).

Conclusions

Among patients undergoing major abdominal surgery, intraoperative oliguria <0.3 ml kg−1 h−1 was significantly associated with increased risk of postoperative AKI.
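The exposure in this study boils down to a simple rate; below is a minimal sketch of flagging the 0.3 ml kg−1 h−1 threshold from total intraoperative urine output, weight and case duration (the function and variable names are ours, not the authors').

```python
# Flagging intraoperative oliguria against the 0.3 mL/kg/h threshold identified in
# the study (function and variable names are illustrative, not from the paper).
def intraop_oliguria(total_urine_ml: float, weight_kg: float, duration_h: float,
                     threshold_ml_kg_h: float = 0.3) -> bool:
    rate = total_urine_ml / (weight_kg * duration_h)  # mL/kg/h
    return rate < threshold_ml_kg_h

# Example: 70 kg patient, 5 h laparotomy, 90 mL of urine -> ~0.26 mL/kg/h -> oliguric
print(intraop_oliguria(90, 70, 5))  # True
```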

 

 

Role model, or exemplary role, of the anesthesiologist-intensivist in the perioperative period

 

Howell et al., BJA, 2017

https://academic-oup-com.hellebore.biusante.parisdescartes.fr/bja/article/119/suppl_1/i15/4638476

 

Review on therapeutic innovations in cardiology

 

Foex et al., BJA, 2017

https://academic-oup-com.hellebore.biusante.parisdescartes.fr/bja/article/119/suppl_1/i23/4638466

 

 

Effect of Optiflow (high-flow nasal cannula) on respiratory effort

 

Delorme et al., CCM, 2017

http://journals.lww.com/ccmjournal/Abstract/2017/12000/Effects_of_High_Flow_Nasal_Cannula_on_the_Work_of.3.aspx

doi: 10.1097/CCM.0000000000002693

 

Objectives: High-flow nasal cannula is increasingly used in the management of respiratory failure. However, little is known about its impact on respiratory effort, which could explain part of the benefits in terms of comfort and efficiency. This study was designed to assess the effects of high-flow nasal cannula on indexes of respiratory effort (i.e., esophageal pressure variations, esophageal pressure-time product/min, and work of breathing/min) in adults.

Design: A randomized controlled crossover study was conducted in 12 patients with moderate respiratory distress (i.e., after partial recovery from an acute episode, allowing physiologic measurements).

Setting: Institut Universitaire de Cardiologie et de Pneumologie de Québec, QC, Canada.

Subjects: Twelve adult patients with respiratory distress symptoms were enrolled in this study.

Interventions: Four experimental conditions were evaluated: baseline with conventional oxygen therapy and high-flow nasal cannula at 20, 40, and 60 L/min. The primary outcomes were the indexes of respiratory effort (i.e., esophageal pressure variations, esophageal pressure-time product/min, and work of breathing/min). Secondary outcomes included tidal volume, respiratory rate, minute volume, dynamic lung compliance, inspiratory resistance, and blood gases.

Measurements and Main Results: Esophageal pressure variations decreased from 9.8 (5.8–14.6) cm H2O at baseline to 4.9 (2.1–9.1) cm H2O at 60 L/min (p = 0.035). Esophageal pressure-time product/min decreased from 165 (126–179) to 72 (54–137) cm H2O • s/min, respectively (p = 0.033). Work of breathing/min decreased from 4.3 (3.5–6.3) to 2.1 (1.5–5.0) J/min, respectively (p = 0.031). Respiratory pattern variables and capillary blood gases were not significantly modified between experimental conditions. Dynamic lung compliance increased from 38 (24–64) mL/cm H2O at baseline to 59 (43–175) mL/cm H2O at 60 L/min (p = 0.007), and inspiratory resistance decreased from 9.6 (5.5–13.4) to 5.0 (1.0–9.1) cm H2O/L/s, respectively (p = 0.07).

Conclusions: High-flow nasal cannula, when set at 60 L/min, significantly reduces the indexes of respiratory effort in adult patients recovering from acute respiratory failure. This effect is associated with an improvement in respiratory mechanics.

 

Guidelines on the management of relative adrenal insufficiency in the ICU

 

Annane et al., CCM, 2017

http://journals.lww.com/ccmjournal/Fulltext/2017/12000/Guidelines_for_the_Diagnosis_and_Management_of.16.aspx

doi: 10.1097/CCM.0000000000002737

 

 

 

Comparison of 4 scores predicting postoperative cardiac complications in noncardiac surgery

 

Cohn et al., Am J Cardiol, 2017

http://www.sciencedirect.com/science/article/pii/S000291491731603X

https://doi.org/10.1016/j.amjcard.2017.09.031

 

The 2014 American College of Cardiology/American Heart Association Perioperative Guidelines suggest using the Revised Cardiac Risk Index, myocardial infarction or cardiac arrest, or American College of Surgeons—National Surgical Quality Improvement Program calculators for combined patient-surgical risk assessment. There are no published data comparing their performance.

This study compared these risk calculators and a reconstructed Revised Cardiac Risk Index in predicting postoperative cardiac complications, both during hospitalization and 30 days after operation, in a patient cohort who underwent select surgical procedures in various risk categories. Cardiac complications occurred in 14 of 663 patients (2.1%), of which 11 occurred during hospitalization. Only 3 of 663 patients (0.45%) had a myocardial infarction or cardiac arrest. Because these calculators used different risk factors, different outcomes, and different durations of observation, a true direct comparison is not possible. We found that all 4 risk calculators performed well in the setting they were originally studied but were less accurate when applied in a different manner.

In conclusion, all calculators were useful in defining low-risk patients in whom further cardiac testing was unnecessary, and the myocardial infarction or cardiac arrest may be the most reliable in selecting higher risk patients.
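As a reminder of what the simplest of these four tools asks for, here is a sketch of the Revised Cardiac Risk Index; the six factors are those of the classic Lee index (standard background knowledge, not listed in this abstract).

```python
# Sketch of the Revised Cardiac Risk Index (Lee index). The six factors are standard
# background knowledge and are not restated in the abstract above.
RCRI_FACTORS = [
    "high_risk_surgery",          # intraperitoneal, intrathoracic or suprainguinal vascular
    "ischemic_heart_disease",
    "history_of_heart_failure",
    "cerebrovascular_disease",
    "insulin_treated_diabetes",
    "creatinine_over_2_mg_dl",
]

def rcri(patient: dict) -> int:
    """Count how many of the six RCRI factors are present (0-6)."""
    return sum(bool(patient.get(factor, False)) for factor in RCRI_FACTORS)

print(rcri({"ischemic_heart_disease": True, "insulin_treated_diabetes": True}))  # 2
```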

 

 

 

 

Does depth of anesthesia influence postoperative pain?

 

Faiz et al., J Pain Res, 2017

https://www.dovepress.com/an-investigation-into-the-effect-of-depth-of-anesthesia-on-postoperati-peer-reviewed-article-JPR

https://doi.org/10.2147/JPR.S142186

 

 

Background and objective: Some studies have shown that deeper anesthesia provides better postoperative analgesia and reduces the need for sedative drugs. This study sought to investigate the effect of depth of anesthesia on postoperative pain in laparoscopic cholecystectomy.
Materials and methods: In this double-blind clinical trial, 60 patients undergoing laparoscopic cholecystectomy were randomly divided into two groups: low bispectral index (L-BIS=35–44) and high bispectral index (H-BIS=45–55). Anesthesia protocol was the same for both groups (propofol and remifentanil). The pain intensity (at rest and during cough) was evaluated based on the visual analog scale scores in recovery and at 8, 16 and 24 hours after surgery.
Results: The mean pain score was significantly lower in patients in the L-BIS group at all examined times at rest and during cough than that in the H-BIS group. The number of patients in need of additional sedative drug in the H-BIS group in recovery was significantly more than that in the L-BIS group (27 vs 18 patients, P=0.007). The incidence of nausea in the recovery room 8 hours after the surgery was significantly less in the L-BIS group than that in the H-BIS group, while at 16 and 24 hours, no case of nausea was reported in the two groups.
Conclusion: Given the results of this study, it seems that general anesthesia with propofol and remifentanil with L-BIS causes less need for additional analgesic drug and less nausea and vomiting compared to anesthesia with H-BIS.

 

 

Meta-analysis on regional anesthesia & cancer: fewer recurrences?

 

Pérez-González et al., Regional Anesthesia & Pain Medicine, 2017

http://journals.lww.com/rapm/Abstract/2017/11000/Impact_of_Regional_Anesthesia_on_Recurrence,.9.aspx

the PDF: here

 

 

Finally! Can we get our coffee fix in peace now?

 

Thanks to Gunter et al., Annals of Internal Medicine, 2017

http://annals.org/aim/article-abstract/2643435/coffee-drinking-mortality-10-european-countries-multinational-cohort-study?doi=10.7326%2fM16-2945

 
