No Benefit From Early Goal-Directed Therapy

The first of three major trials assessing early goal-directed therapy (EGDT) in sepsis, the American ProCESS trial, has been published.
It showed what many of us suspected: that monitoring central venous oxygen saturation via a central line is not necessary for improved survival.
The trial randomised 1341 patients to one of three arms:
(1) protocolised EGDT
(2) protocol-based standard therapy that did not require the placement of a central venous catheter, administration of inotropes, or blood transfusions
(3) ‘usual care’, which was not standardised.
There were no differences in any of the primary or secondary outcomes between the groups.
Interestingly, in the six hours of early care that the trial dictated, the volume of intravenous fluids administered differed significantly among the groups (2.8 litres in the protocol-based EGDT group, 3.3 litres in the protocol-based standard-therapy group, and 2.3 litres in the usual-care group).
There was also a difference in the amount of vasopressor given, with more patients in the two protocol-based groups receiving vasopressors (54.9% in the protocol-based EGDT group, 52.2% in the protocol-based standard-therapy group, 44.1% in the usual-care group).
The use of intravenous fluids, vasopressors, dobutamine, and blood transfusions between 6 and 72 hours did not differ significantly among the groups.
Overall 60-day mortality was in the region of 20% in all groups.
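For those who like to check the arithmetic, the headline relative risk can be reproduced from the death counts in the abstract below. Here is a minimal Python sketch: a simple unadjusted calculation using the standard log-scale confidence interval, which happens to match the published figures (the trial's own analysis will have been more involved):

```python
import math

# Death counts and group sizes at 60 days, from the abstract below
deaths_protocol = 92 + 81        # EGDT + protocol-based standard therapy combined
n_protocol = 439 + 446
deaths_usual = 86
n_usual = 456

# Relative risk of death, protocol-based care vs usual care
rr = (deaths_protocol / n_protocol) / (deaths_usual / n_usual)

# Approximate 95% CI on the log scale (Katz method)
se = math.sqrt(1/deaths_protocol - 1/n_protocol + 1/deaths_usual - 1/n_usual)
lower = math.exp(math.log(rr) - 1.96 * se)
upper = math.exp(math.log(rr) + 1.96 * se)

print(f"RR {rr:.2f} (95% CI {lower:.2f} to {upper:.2f})")  # RR 1.04 (95% CI 0.82 to 1.31)
```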
What are the take-home points? Firstly, overall sepsis outcomes have improved in recent years, and early recognition and antibiotic administration may be the most important components of care. Secondly, in the early emergency department phase of care, protocolised fluid and vasopressor therapy may not be as important as we thought. Good clinical assessment and regular review seem to be just as effective as, and perhaps more important than, any specific monitoring modality or oxygen delivery-targeted drug and blood therapy.
We all await the ARISE and ProMISe studies, which may shed more light on the most important components of early sepsis care.
A Randomized Trial of Protocol-Based Care for Early Septic Shock
NEJM Mar 18 2014 (Full Text Link)
[EXPAND Abstract]


Background: In a single-center study published more than a decade ago involving patients presenting to the emergency department with severe sepsis and septic shock, mortality was markedly lower among those who were treated according to a 6-hour protocol of early goal-directed therapy (EGDT), in which intravenous fluids, vasopressors, inotropes, and blood transfusions were adjusted to reach central hemodynamic targets, than among those receiving usual care. We conducted a trial to determine whether these findings were generalizable and whether all aspects of the protocol were necessary.

Methods: In 31 emergency departments in the United States, we randomly assigned patients with septic shock to one of three groups for 6 hours of resuscitation: protocol-based EGDT; protocol-based standard therapy that did not require the placement of a central venous catheter, administration of inotropes, or blood transfusions; or usual care. The primary end point was 60-day in-hospital mortality. We tested sequentially whether protocol-based care (EGDT and standard-therapy groups combined) was superior to usual care and whether protocol-based EGDT was superior to protocol-based standard therapy. Secondary outcomes included longer-term mortality and the need for organ support.

Results: We enrolled 1341 patients, of whom 439 were randomly assigned to protocol-based EGDT, 446 to protocol-based standard therapy, and 456 to usual care. Resuscitation strategies differed significantly with respect to the monitoring of central venous pressure and oxygen and the use of intravenous fluids, vasopressors, inotropes, and blood transfusions. By 60 days, there were 92 deaths in the protocol-based EGDT group (21.0%), 81 in the protocol-based standard-therapy group (18.2%), and 86 in the usual-care group (18.9%) (relative risk with protocol-based therapy vs. usual care, 1.04; 95% confidence interval [CI], 0.82 to 1.31; P=0.83; relative risk with protocol-based EGDT vs. protocol-based standard therapy, 1.15; 95% CI, 0.88 to 1.51; P=0.31). There were no significant differences in 90-day mortality, 1-year mortality, or the need for organ support.

Conclusions: In a multicenter trial conducted in the tertiary care setting, protocol-based resuscitation of patients in whom septic shock was diagnosed in the emergency department did not improve outcomes.

[/EXPAND]

Use a table for selecting PEEP in ARDS

Selecting the right amount of PEEP to recruit collapsed alveoli in patients with ARDS is important, but the best method isn’t proven. Using a table to select PEEP based on FiO2 was significantly but weakly associated with improved lung recruitability (on CT scan) when compared with other methods of selecting PEEP, and was the best method for avoiding higher PEEP in patients with lower recruitability.
This is a small study and the results do not necessarily translate to improved clinical outcomes, but they may be of interest to emergency and retrieval medicine physicians who need a simple and safe strategy for managing ARDS patients without the luxury of time or access to highly sophisticated ICU ventilators.
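For illustration, part of the appeal of a table-based strategy is how simple it is to operationalise as a lookup. Below is a minimal Python sketch; note that the FiO2/PEEP pairs are placeholders of my own to show the structure, not the published higher-PEEP table from the lung open ventilation study, which should be consulted directly:

```python
# Illustrative FiO2 -> PEEP lookup in the style of an ARDSNet-type table.
# NOTE: these (FiO2 step, PEEP) pairs are placeholders, NOT the published
# lung open ventilation study values.
PEEP_TABLE = [
    (0.3, 5), (0.4, 8), (0.5, 10), (0.6, 12),
    (0.7, 14), (0.8, 16), (0.9, 18), (1.0, 20),
]

def select_peep(fio2: float) -> int:
    """Return the tabled PEEP (cm H2O) for the lowest FiO2 step >= fio2."""
    if not 0.21 <= fio2 <= 1.0:
        raise ValueError("FiO2 must be between 0.21 and 1.0")
    for step, peep in PEEP_TABLE:
        if fio2 <= step:
            return peep
    return PEEP_TABLE[-1][1]

print(select_peep(0.65))  # -> 14 with this illustrative table
```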
Bedside selection of positive end-expiratory pressure in mild, moderate, and severe acute respiratory distress syndrome
Crit Care Med. 2014 Feb;42(2):252-64
[EXPAND Abstract]


OBJECTIVE: Positive end-expiratory pressure exerts its effects by keeping previously collapsed areas of the lung open at end-expiration; consequently, higher positive end-expiratory pressure should be limited to patients with high recruitability. We aimed to determine which bedside method would provide positive end-expiratory pressure levels best related to lung recruitability.

DESIGN: Prospective study performed between 2008 and 2011.

SETTING: Two university hospitals (Italy and Germany).

PATIENTS: Fifty-one patients with acute respiratory distress syndrome.

INTERVENTIONS: Whole lung CT scans were taken in static conditions at 5 and 45 cm H2O during an end-expiratory/end-inspiratory pause to measure lung recruitability. To select individual positive end-expiratory pressure, we applied bedside methods based on lung mechanics (ExPress, stress index), esophageal pressure, and oxygenation (the higher positive end-expiratory pressure table of the lung open ventilation study).

MEASUREMENTS AND MAIN RESULTS: Patients were classified into mild, moderate, and severe acute respiratory distress syndrome. Positive end-expiratory pressure levels selected by the ExPress, stress index, and absolute esophageal pressure methods were unrelated to lung recruitability, whereas positive end-expiratory pressure levels selected by the lung open ventilation method showed a weak relationship with lung recruitability (r = 0.29; p < 0.0001). When patients were classified according to the acute respiratory distress syndrome Berlin definition, the lung open ventilation method was the only one that gave lower positive end-expiratory pressure levels in mild and moderate acute respiratory distress syndrome compared with severe acute respiratory distress syndrome (8 ± 2 and 11 ± 3 cm H2O vs 15 ± 3 cm H2O; p < 0.05), whereas the ExPress, stress index, and esophageal pressure methods gave similar positive end-expiratory pressure values in mild, moderate, and severe acute respiratory distress syndrome. The positive end-expiratory pressure levels selected by the different methods were unrelated to each other, with the exception of the two methods based on lung mechanics (ExPress and stress index).
CONCLUSIONS: Bedside positive end-expiratory pressure selection methods based on lung mechanics or absolute esophageal pressures provide positive end-expiratory pressure levels unrelated to lung recruitability and similar in mild, moderate, and severe acute respiratory distress syndrome, whereas the oxygenation-based method provided positive end-expiratory pressure levels related to lung recruitability, progressively increasing from mild to moderate and severe acute respiratory distress syndrome.

[/EXPAND]

i-STAT® analysis of intraosseous aspirate

In the absence of vascular access we may resort to sending intraosseous aspirates for analysis, but in some laboratories there is concern that the samples can block autoanalysers.
A study on haematology/oncology patients undergoing diagnostic bone marrow aspiration showed clinically acceptable agreement between venous and intraosseous measurements for pH, base excess, sodium, ionised calcium and glucose using an i-STAT® point-of-care analyser.
Key points are:

  • The first 1-2 ml should be discarded (as in this study)
  • Lactate hasn’t been assessed
  • These patients weren’t critically ill
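For readers unfamiliar with how agreement between two measurement methods is usually assessed, the study compared sample types with Bland-Altman analysis. Here is a minimal Python sketch of that calculation, using made-up paired values rather than the study's data:

```python
import statistics

# Hypothetical paired glucose values in mmol/L (NOT data from the study)
venous       = [5.1, 6.3, 4.8, 7.2, 5.9, 6.1]
intraosseous = [5.3, 6.0, 4.9, 7.5, 5.7, 6.4]

# Bland-Altman: summarise the per-pair differences between the two methods
diffs = [io - v for v, io in zip(venous, intraosseous)]
bias = statistics.mean(diffs)                # systematic offset between methods
sd = statistics.stdev(diffs)                 # spread of the disagreement
loa = (bias - 1.96 * sd, bias + 1.96 * sd)   # 95% limits of agreement

print(f"bias {bias:+.2f} mmol/L, limits of agreement {loa[0]:+.2f} to {loa[1]:+.2f}")
```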

Analysis of bloodgas, electrolytes and glucose from intraosseous samples using an i-STAT® point-of-care analyser
Resuscitation. 2014 Mar;85(3):359-63
[EXPAND Abstract]


BACKGROUND: Intraosseous access is used in emergency medicine as an alternative when intravenous access is difficult to obtain. Intraosseous samples can be used for laboratory testing to guide treatment. Many laboratories are reluctant to analyse intraosseous samples, as they frequently block conventional laboratory equipment. We aimed to evaluate the feasibility and accuracy of analysis of intraosseous samples using an i-STAT® point-of-care analyser.

METHODS: Intravenous and intraosseous samples of twenty children presenting for scheduled diagnostic bone marrow aspiration were analysed using an i-STAT® point-of-care analyser. Sample types were compared using Bland-Altman plots and by calculating intraclass correlation coefficients and coefficients of variance.

RESULTS: The handheld i-STAT® point-of-care analyser proved suitable for analysing intraosseous samples without technical difficulties. Differences between venous and intraosseous samples were clinically acceptable for pH, base excess, sodium, ionised calcium and glucose in these haemodynamically stable patients. The intraclass correlation coefficient was excellent (>0.8) for comparison of intraosseous and intravenous base excess, and moderate (around 0.6) for bicarbonate, sodium and glucose. The coefficient of variance of intraosseous samples was smaller than that of venous samples for most variables.

CONCLUSION: Analysis of intraosseous samples with a bedside, single-use cartridge-based analyser is feasible and avoids the problem of bone marrow contents damaging conventional laboratory equipment. In an emergency situation point-of-care analysis of intraosseous aspirates may be a useful guide to treatment.

[/EXPAND]

Is 4 Joules per kg enough in kids?

Researchers from the Iberian-American Paediatric Cardiac Arrest Study Network challenge the evidence base behind defibrillation shock dose recommendations in children.
In a study of in-hospital paediatric cardiac arrest due to VT or VF, clinical outcome was not related to the cause or location of arrest, type of defibrillator and waveform, energy dose per shock, number of shocks, or cumulative energy dose, although there was a trend towards better survival with higher doses per shock. Half of the children required more than the recommended 4 J/kg, and in over a quarter three or more shocks were needed to achieve defibrillation.
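The weight-based arithmetic itself is trivial, but the delivered energy is constrained by the discrete settings available on a given device. Here is a minimal Python sketch of the calculation; the selectable energies below are hypothetical, as these vary by defibrillator model:

```python
# Hypothetical selectable energies in joules; real devices vary by model
AVAILABLE_ENERGIES = [5, 10, 20, 30, 50, 70, 100, 120, 150, 200]

def shock_energy(weight_kg: float, dose_j_per_kg: float = 4.0) -> int:
    """Target dose = weight x dose per kg, rounded up to the nearest selectable energy."""
    target = weight_kg * dose_j_per_kg
    for energy in AVAILABLE_ENERGIES:
        if energy >= target:
            return energy
    return AVAILABLE_ENERGIES[-1]  # cap at the device maximum

print(shock_energy(12))  # 12 kg at 4 J/kg -> 48 J target -> 50 J selected
```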
Shockable rhythms and defibrillation during in-hospital pediatric cardiac arrest
Resuscitation. 2014 Mar;85(3):387-91
[EXPAND Abstract]


OBJECTIVE: To analyze the results of cardiopulmonary resuscitation (CPR) that included defibrillation during in-hospital cardiac arrest (IH-CA) in children.

METHODS: A prospective multicenter, international, observational study on pediatric IH-CA in 12 European and Latin American countries, during 24 months. Data from 502 children between 1 month and 18 years were collected using the Utstein template. Patients with a shockable rhythm that was treated by electric shock(s) were included. The primary endpoint was survival at hospital discharge. Univariate logistic regression analysis was performed to find outcome factors.

RESULTS: Forty events in 37 children (mean age 48 months, IQR: 7-15 months) were analyzed. An underlying disease was present in 81.1% of cases and 24.3% had a previous CA. The main cause of arrest was a cardiac disease (56.8%). In 17 episodes (42.5%) ventricular fibrillation (VF) or pulseless ventricular tachycardia (pVT) was the first documented rhythm, and in 23 (57.5%) it developed during CPR efforts. In 11 patients (27.5%) three or more shocks were needed to achieve defibrillation. Return of spontaneous circulation (ROSC) was obtained in 25 cases (62.5%), which was sustained in 20 (50.0%); however, only 12 children (32.4%) survived to hospital discharge. Children with VF/pVT as first documented rhythm had better sustained ROSC (64.7% vs. 39.1%, p=0.046) and survival to hospital discharge rates (58.8% vs. 21.7%, p=0.02) than those with subsequent VF/pVT. Survival rate was inversely related to duration of CPR. Clinical outcome was not related to the cause or location of arrest, type of defibrillator and waveform, energy dose per shock, number of shocks, or cumulative energy dose, although there was a trend to better survival with higher doses per shock (25.0% with <2 J kg⁻¹, 43.4% with 2-4 J kg⁻¹ and 50.0% with >4 J kg⁻¹) and worse with higher number of shocks and cumulative energy dose.

CONCLUSION: The termination of pediatric VF/pVT in the IH-CA setting is achieved in a low percentage of instances with one electrical shock at 4 J kg⁻¹. When VF/pVT is the first documented rhythm, the results of defibrillation are better than in the case of subsequent VF/pVT. No clear relationship between defibrillation protocol and ROSC or survival has been observed. The optimal pediatric defibrillation dose remains to be determined; therefore current resuscitation guidelines cannot be considered evidence-based, and additional research is needed.

[/EXPAND]