In the absence of vascular access we may resort to sending intraosseous aspirates for analysis, but in some laboratories there is concern that the samples can block autoanalysers.
A study of haematology/oncology patients undergoing diagnostic bone marrow aspiration showed clinically acceptable agreement between venous and intraosseous measurements for pH, base excess, sodium, ionised calcium and glucose using an i-STAT® point-of-care analyser.
Key points are:
The first 1-2 ml should be discarded (as was done in this study)
BACKGROUND: Intraosseous access is used in emergency medicine as an alternative when intravenous access is difficult to obtain. Intraosseous samples can be used for laboratory testing to guide treatment. Many laboratories are reluctant to analyse intraosseous samples, as they frequently block conventional laboratory equipment. We aimed to evaluate the feasibility and accuracy of analysis of intraosseous samples using an i-STAT(®) point-of-care analyser.
METHODS: Intravenous and intraosseous samples of twenty children presenting for scheduled diagnostic bone marrow aspiration were analysed using an i-STAT(®) point-of-care analyser. Sample types were compared using Bland Altman plots and by calculating intraclass correlation coefficients and coefficients of variance.
RESULTS: The handheld i-STAT(®) point-of-care analyser proved suitable for analysing intraosseous samples without technical difficulties. Differences between venous and intraosseous samples were clinically acceptable for pH, base excess, sodium, ionised calcium and glucose in these haemodynamically stable patients. The intraclass correlation coefficient was excellent (>0.8) for comparison of intraosseous and intravenous base excess, and moderate (around 0.6) for bicarbonate, sodium and glucose. The coefficient of variance of intraosseous samples was smaller than that of venous samples for most variables.
CONCLUSION: Analysis of intraosseous samples with a bedside, single-use cartridge-based analyser is feasible and avoids the problem of bone marrow contents damaging conventional laboratory equipment. In an emergency situation point-of-care analysis of intraosseous aspirates may be a useful guide to treatment.
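The Bland-Altman approach the authors used to compare sample types is straightforward to reproduce. A minimal sketch in Python showing the bias and 95% limits of agreement between two measurement methods (the paired sodium values below are hypothetical illustrations, not study data):

```python
import statistics

def bland_altman(method_a, method_b):
    """Bland-Altman bias and 95% limits of agreement for paired measurements."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)          # mean difference between methods
    sd = statistics.stdev(diffs)           # sample SD of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical paired venous vs intraosseous sodium values (mmol/L)
venous =       [138, 140, 139, 141, 137, 140]
intraosseous = [137, 139, 139, 140, 136, 141]

bias, (lo, hi) = bland_altman(venous, intraosseous)
print(f"bias = {bias:.2f} mmol/L, limits of agreement {lo:.2f} to {hi:.2f}")
```

A narrow band of agreement around a small bias is what "clinically acceptable agreement" looks like numerically; whether the band is acceptable is a clinical judgement, not a statistical one.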
Here’s something to add to the pile of data cautioning us to think before we acidify patients with saline. A study in Anesthesia and Analgesia using propensity matching provides retrospective evidence that patients who developed hyperchloremia after noncardiac surgery had worse outcomes.
BACKGROUND: The use of normal saline is associated with hyperchloremic metabolic acidosis. In this study, we sought to determine the incidence of acute postoperative hyperchloremia (serum chloride >110 mEq/L) and whether this electrolyte disturbance is associated with an increase in length of hospital stay, morbidity, or 30-day postoperative mortality.
METHODS: Data were retrospectively collected on consecutive adult patients (>18 years of age) who underwent inpatient, noncardiac, nontransplant surgery between January 1, 2003 and December 31, 2008. The impact of postoperative hyperchloremia on patient morbidity and length of hospital stay was examined using propensity-matched and logistic multivariable analysis.
RESULTS: The dataset consisted of 22,851 surgical patients with normal preoperative serum chloride concentration and renal function. Acute postoperative hyperchloremia (serum chloride >110 mmol/L) is quite common, with an incidence of 22%. Patients were propensity-matched based on their likelihood to develop acute postoperative hyperchloremia. Of the 4955 patients with hyperchloremia after surgery, 4266 (85%) patients were matched to patients who had normal serum chloride levels after surgery. These 2 groups were well balanced with respect to all variables collected. The hyperchloremic group was at increased risk of mortality at 30 days postoperatively (3.0% vs 1.9%; odds ratio = 1.58; 95% confidence interval, 1.25-1.98) (relative risk 1.6 or risk increase of 1.1%) and had a longer hospital stay (7.0 days [interquartile range 4.1-12.3] compared with 6.3 [interquartile range 4.0-11.3]) than patients with normal postoperative serum chloride levels. Patients with postoperative hyperchloremia were more likely to have postoperative renal dysfunction. Using all preoperative variables and measured outcome variables in a logistic regression analysis, hyperchloremia remained an independent predictor of 30-day mortality with an odds ratio of 2.05 (95% confidence interval, 1.62-2.59).
CONCLUSION: This retrospective cohort trial demonstrates an association between hyperchloremia and poor postoperative outcome. Additional studies are required to demonstrate a causal relationship between these variables.
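The effect sizes in the abstract can be reproduced from the matched 2×2 counts. A minimal sketch (the 4266 patients per matched arm is from the abstract; the death counts are back-calculated from the reported 3.0% and 1.9% mortality rates, so small rounding differences from the published odds ratio are expected):

```python
def odds_ratio(a, b, c, d):
    """OR for exposed (a events, b non-events) vs unexposed (c events, d non-events)."""
    return (a / b) / (c / d)

def relative_risk(a, n_exposed, c, n_unexposed):
    """RR: event rate in exposed divided by event rate in unexposed."""
    return (a / n_exposed) / (c / n_unexposed)

# Back-calculated counts from the abstract's matched analysis
n = 4266                            # patients per matched group
deaths_hyper = round(0.030 * n)     # 3.0% 30-day mortality with hyperchloremia
deaths_normo = round(0.019 * n)     # 1.9% with normal postoperative chloride

or_ = odds_ratio(deaths_hyper, n - deaths_hyper, deaths_normo, n - deaths_normo)
rr = relative_risk(deaths_hyper, n, deaths_normo, n)
risk_diff = deaths_hyper / n - deaths_normo / n
print(f"OR ≈ {or_:.2f}, RR ≈ {rr:.2f}, absolute risk increase ≈ {risk_diff:.1%}")
```

This illustrates why the blog quotes both figures: with a rare outcome the OR (≈1.6) and RR (≈1.6) nearly coincide, while the absolute risk increase is only about 1.1%.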
Age adjusted D-dimer cut-off values (age×10 µg/L) improve specificity without losing sensitivity for venous thromboembolism. This could spare many elderly patients unnecessary imaging. Full text is available free from the BMJ.
Diagnostic accuracy of conventional or age adjusted D-dimer cut-off values in older patients with suspected venous thromboembolism: systematic review and meta-analysis. BMJ. 2013 May 3;346:f2492
OBJECTIVE: To review the diagnostic accuracy of D-dimer testing in older patients (>50 years) with suspected venous thromboembolism, using conventional or age adjusted D-dimer cut-off values.
DESIGN: Systematic review and bivariate random effects meta-analysis.
DATA SOURCES: We searched Medline and Embase for studies published before 21 June 2012 and we contacted the authors of primary studies.
STUDY SELECTION: Primary studies that enrolled older patients with suspected venous thromboembolism in whom D-dimer testing, using both conventional (500 µg/L) and age adjusted (age × 10 µg/L) cut-off values, and reference testing were performed. For patients with a non-high clinical probability, 2 × 2 tables were reconstructed and stratified by age category and applied D-dimer cut-off level.
RESULTS: 13 cohorts including 12,497 patients with a non-high clinical probability were included in the meta-analysis. The specificity of the conventional cut-off value decreased with increasing age, from 57.6% (95% confidence interval 51.4% to 63.6%) in patients aged 51-60 years to 39.4% (33.5% to 45.6%) in those aged 61-70, 24.5% (20.0% to 29.7%) in those aged 71-80, and 14.7% (11.3% to 18.6%) in those aged >80. Age adjusted cut-off values revealed higher specificities over all age categories: 62.3% (56.2% to 68.0%), 49.5% (43.2% to 55.8%), 44.2% (38.0% to 50.5%), and 35.2% (29.4% to 41.5%), respectively. Sensitivities of the age adjusted cut-off remained above 97% in all age categories.
CONCLUSIONS: The application of age adjusted cut-off values for D-dimer tests substantially increases specificity without modifying sensitivity, thereby improving the clinical utility of D-dimer testing in patients aged 50 or more with a non-high clinical probability.
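The age adjusted rule is trivial to apply at the bedside or in a results system. A minimal sketch (function names are mine; I assume a result at or above the cut-off counts as positive, i.e. does not exclude VTE, and that the rule applies only above age 50 in non-high probability patients, as in the review):

```python
def d_dimer_cutoff(age_years: int, age_adjusted: bool = True) -> float:
    """D-dimer cut-off in µg/L: conventional 500, or age x 10 for patients over 50."""
    if age_adjusted and age_years > 50:
        return age_years * 10.0
    return 500.0

def d_dimer_positive(value_ug_l: float, age_years: int,
                     age_adjusted: bool = True) -> bool:
    """True if the result is at or above the cut-off (does not exclude VTE)."""
    return value_ug_l >= d_dimer_cutoff(age_years, age_adjusted)

# An 80-year-old with a D-dimer of 650 µg/L and non-high clinical probability:
print(d_dimer_positive(650, 80, age_adjusted=False))  # conventional cut-off 500 -> True
print(d_dimer_positive(650, 80, age_adjusted=True))   # age adjusted cut-off 800 -> False
```

The worked example shows the practical gain: the same result that mandates imaging under the conventional cut-off safely excludes VTE under the age adjusted one.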
Some good news for remote, rural, prehospital, and retrieval medicine clinicians who rely on point of care testing with the i-STAT® device. An animal study confirmed the reliability of testing aspirates from intraosseous samples taken from the tibia(1).
This is also good news for hospital practitioners when it comes to the acquisition of blood gas results, since there are concerns over the potential damage to blood gas analysers by bone marrow contents in the samples.
The researchers tested blood gases, acid–base status, lactate, haemoglobin, and electrolytes, and compared these with results from an arterial sample.
There was no malfunction of the equipment. Most of the acid–base parameters showed discrepancies between arterial and intraosseous samples: average pH and base excess were consistently lower, whilst pCO2 and lactate were higher, in the intraosseous samples compared with the arterial ones. However, the overall small degree and predictable direction of the discrepancy in these values should preserve the clinical usefulness of intraosseous gases if these findings can be replicated in human subjects. pO2, unsurprisingly, was very different between intraosseous and arterial samples.
They noted that aspiration of intraosseous samples was generally straightforward, especially immediately after placement of the cannulae, but on a few occasions more forceful aspiration was needed. They point out that this could possibly cause cellular lysis and affect the potassium analysis.
The authors consider the issue of how much aspirate should be discarded before taking a sample after intraosseous cannula insertion, and refer to a prior study which suggested that 2 ml is adequate.
Intraosseous aspirate can be tested on an i-STAT® point-of-care analyser
Haemoglobin and electrolytes show good correlation with arterial samples
Acid-base, pCO2, and lactate differ slightly from arterial results but in a predictable direction and results are still likely to be clinically useful in an emergency
It may be worth discarding the first 2 ml of aspirate
These results require validation in human subjects
BACKGROUND: Intraosseous access is an essential method in emergency medicine when other forms of vascular access are unavailable and there is an urgent need for fluid or drug therapy. A number of publications have discussed the suitability of using intraosseous access for laboratory testing. We aimed to further evaluate this issue and to study the accuracy and precision of intraosseous measurements.
METHODS: Five healthy, anaesthetised pigs were instrumented with bilateral tibial intraosseous cannulae and an arterial catheter. Samples were collected hourly for 6 h and analysed for blood gases, acid-base status, haemoglobin and electrolytes using an i-STAT point-of-care analyser.
RESULTS: There was no clinically relevant difference between results from left and right intraosseous sites. The variability of the intraosseous sample values, measured as the coefficient of variance (CV), was maximally 11%, and smaller than for the arterial sample values for all variables except SO2. For most variables, there seems to be some degree of systematic difference between intraosseous and arterial results. However, the direction of this difference seems to be predictable.
CONCLUSION: Based on our findings in this animal model, cartridge based point of care instruments appear suitable for the analysis of intraosseous samples. The agreement between intraosseous and arterial analysis seems to be good enough for the method to be clinically useful. The precision, quantified in terms of CV, is at least as good for intraosseous as for arterial analysis. There is no clinically important difference between samples from left and right tibia, indicating a good reproducibility.
A study examining patterns of procalcitonin in a group of critically ill patients(1) showed some interesting findings:
Shock was associated with higher procalcitonin values independent of the presence of infection
Procalcitonin (PCT) levels were lower in patients who developed infections later during their ICU stay than in those who had infections on admission to ICU.
The accompanying editorial(2) reminds us about commonly used inflammatory biomarkers.
White blood cells are influenced by almost every inflammatory stimulus, rendering them unhelpful in the management of severely ill patients.
Daily monitoring of CRP levels can identify ICU-acquired infections early, and some prognostic information can be provided by how rapidly CRP levels respond to treatment.
PCT rises early in severe sepsis, particularly in pneumonia and bloodstream infections, and can reflect the severity of the systemic inflammatory response to infection. PCT is more specific than CRP for distinguishing infection from non-infectious causes of the systemic inflammatory response syndrome. However, PCT can also be raised in noninfectious conditions such as acute pancreatitis and cardiogenic shock.
OBJECTIVE: The utility of procalcitonin for the diagnosis of infection in the critical care setting has been extensively investigated with conflicting results. Herein, we report procalcitonin values relative to baseline patient characteristics, presence of shock, intensive care unit time course, infectious status, and Gram stain of infecting organism.
DESIGN: Prospective, multicenter, observational study of critically ill patients admitted to intensive care unit for >24 hrs.
SETTING: Three tertiary care intensive care units.
PATIENTS: All consenting patients admitted to three mixed medical-surgical intensive care units. Patients who had elective surgery, overdoses, and who were expected to stay <24 hrs were excluded.
INTERVENTIONS: Patients were followed prospectively to ascertain the presence of prevalent (present at admission) or incident (developed during admission) infections and clinical outcomes. Procalcitonin levels were measured daily for 10 days and were analyzed as a function of the underlying patient characteristics, presence of shock, time of infection, and pathogen isolated.
MAIN RESULTS: Five hundred ninety-eight patients were enrolled. Medical and surgical infected cohorts had similar baseline procalcitonin values (3.0 [0.7-15.3] vs. 3.7 [0.6-9.8], p = .68) and peak procalcitonin (4.5 [1.0-22.9] vs. 5.0 [0.9-16.0], p = .91). Infected patients were sicker than their noninfected counterparts (Acute Physiology and Chronic Health Evaluation II 22.9 vs. 19.3, p < .001); those with infection at admission had a trend toward higher peak procalcitonin values than did those whose infection developed in the intensive care unit (4.9 vs. 1.4, p = .06). The presence of shock was significantly associated with elevations in procalcitonin in cohorts who were and were not infected (both groups p < .003 on days 1-5).
CONCLUSIONS: Procalcitonin dynamics were similar between surgical and medical cohorts. Shock had an association with higher procalcitonin values independent of the presence of infection. Trends in differences in procalcitonin values were seen in patients who had incident vs. prevalent infections.
Kids in hospital with injury, infection or other illness, and those undergoing the physiological stress of surgery, produce (appropriately) elevated antidiuretic hormone levels which contribute to the risk of hyponatraemia by impairing free water excretion in the kidney.
Deaths have occurred on general paediatric and surgery wards when fluid regimens containing low concentrations of sodium (classically 0.18% or 0.225% NaCl) have resulted in hyponatraemia in children without adequate electrolyte monitoring, leading some bodies to recommend at least 0.45% NaCl solutions for maintenance fluid therapy in children.
However two recent studies(1,2) on postoperative children show an increased risk of hyponatraemia even with 0.45% saline, when compared with 0.9% saline or Hartmann’s solution (Hartmann’s is almost identical to Ringer’s lactate).
I like the fact that paediatricians used Hartmann’s in one of these studies(1). I have worked with several paediatricians who never use Hartmann’s, either from lack of familiarity or because of concern about its lactate content (not appreciating that the lactate is metabolised by the liver to bicarbonate).
This is ironic, since Alexis Hartmann (1898–1964) was a paediatrician.
Want more fluid therapy irony? The ‘balanced salt solution’ used by Brits and Australasians is Hartmann’s solution – named after an American. The one used by Americans is Lactated Ringer’s solution – named after the British physician Sydney Ringer (1834-1910).
Medical history enthusiasts can read more about Hartmann and Ringer here.
OBJECTIVE: To compare the difference in plasma sodium at 16-18 h following major surgery in children who were prescribed either Hartmann’s and 5% dextrose or 0.45% saline and 5% dextrose.
DESIGN: A prospective, randomised, open label study.
SETTING: The paediatric intensive care unit (650 admissions per annum) in a tertiary children’s hospital in Brisbane, Australia.
PATIENTS: The study group comprised 82 children undergoing spinal instrumentation, craniotomy for brain tumour resection, or cranial vault remodelling.
INTERVENTIONS: Patients received either Hartmann’s and 5% dextrose at full maintenance rate or 0.45% saline and 5% dextrose at two-thirds maintenance rate.
MAIN OUTCOME MEASURES: Primary outcome measure: plasma sodium at 16-18 h postoperatively; secondary outcome measure: number of fluid boluses administered.
RESULTS: Mean postoperative plasma sodium levels of children receiving 0.45% saline and 5% dextrose were 1.4 mmol/l (95% CI 0.4 to 2.5) lower than those receiving Hartmann’s and 5% dextrose (p=0.008). In the 0.45% saline group, seven patients (18%) became hyponatraemic (Na <135 mmol/l) at 16-18 h postoperatively; in the Hartmann’s group no patient became hyponatraemic (p=0.01). No child in either fluid group became hypernatraemic.
CONCLUSIONS: The postoperative fall in plasma sodium was smaller in children who received Hartmann’s and 5% dextrose compared to those who received 0.45% saline and 5% dextrose. It is suggested that Hartmann’s and 5% dextrose should be administered at full maintenance rate postoperatively to children who have undergone major surgery in preference to hypotonic fluids.
OBJECTIVE: The objective of this randomized controlled trial was to evaluate the risk of hyponatremia following administration of an isotonic (0.9% saline) compared to a hypotonic (0.45% saline) parenteral maintenance solution (PMS) for 48 hours to postoperative pediatric patients.
METHODS: Surgical patients 6 months to 16 years of age with an expected postoperative stay of >24 hours were eligible. Patients with an uncorrected baseline plasma sodium level abnormality, hemodynamic instability, chronic diuretic use, previous enrollment, and those for whom either hypotonic PMS or isotonic PMS was considered contraindicated or necessary, were excluded. A fully blinded randomized controlled trial was performed. The primary outcome was acute hyponatremia. Secondary outcomes included severe hyponatremia, hypernatremia, adverse events attributable to acute plasma sodium level changes, and antidiuretic hormone levels.
RESULTS: A total of 258 patients were enrolled and assigned randomly to receive hypotonic PMS (N = 130) or isotonic PMS (N = 128). Baseline characteristics were similar for the 2 groups. Hypotonic PMS significantly increased the risk of hyponatremia, compared with isotonic PMS (40.8% vs 22.7%; relative risk: 1.82 [95% confidence interval: 1.21-2.74]; P = .004). Admission to the pediatric critical care unit was not an independent risk factor for the development of hyponatremia. Isotonic PMS did not increase the risk of hypernatremia (relative risk: 1.30 [95% confidence interval: 0.30-5.59]; P = .722). Antidiuretic hormone levels and adverse events were not significantly different between the groups.
CONCLUSION: Isotonic PMS was significantly safer than hypotonic PMS with respect to the risk of acute postoperative hyponatremia in children after surgery.
An association is demonstrated between abnormal (both high and low) serum potassium levels and in-hospital mortality in patients with acute myocardial infarction. These findings do not necessarily imply a causal relationship, since abnormal potassium levels might be a marker of increased risk of death due to other illness factors rather than a risk of death per se.
Acknowledging that a randomised trial of potassium replacement is unlikely to happen, the authors pragmatically advise:
Our data suggest that the optimal range of serum potassium levels in AMI patients may be between 3.5 and 4.5 mEq/L and that potassium levels of greater than 4.5 mEq/L are associated with increased mortality and should probably be avoided.
Context Clinical practice guidelines recommend maintaining serum potassium levels between 4.0 and 5.0 mEq/L in patients with acute myocardial infarction (AMI). These guidelines are based on small studies that associated low potassium levels with ventricular arrhythmias in the pre-β-blocker and pre-reperfusion era. Current studies examining the relationship between potassium levels and mortality in AMI patients are lacking.
Objective To determine the relationship between serum potassium levels and in-hospital mortality in AMI patients in the era of β-blocker and reperfusion therapy.
Design, Setting, and Patients Retrospective cohort study using the Cerner Health Facts database, which included 38 689 patients with biomarker-confirmed AMI, admitted to 67 US hospitals between January 1, 2000, and December 31, 2008. All patients had in-hospital serum potassium measurements and were categorized by mean postadmission serum potassium level (<3.0, 3.0-<3.5, 3.5-<4.0, 4.0-<4.5, 4.5-<5.0, 5.0-<5.5, and ≥5.5 mEq/L). Hierarchical logistic regression was used to determine the association between potassium levels and outcomes after adjusting for patient- and hospital-level factors.
Main Outcome Measures All-cause in-hospital mortality and the composite of ventricular fibrillation or cardiac arrest.
Results There was a U-shaped relationship between mean postadmission serum potassium level and in-hospital mortality that persisted after multivariable adjustment. Compared with the reference group of 3.5 to less than 4.0 mEq/L (mortality rate, 4.8%; 95% CI, 4.4%-5.2%), mortality was comparable for mean postadmission potassium of 4.0 to less than 4.5 mEq/L (5.0%; 95% CI, 4.7%-5.3%), multivariable-adjusted odds ratio (OR), 1.19 (95% CI, 1.04-1.36). Mortality was twice as great for potassium of 4.5 to less than 5.0 mEq/L (10.0%; 95% CI, 9.1%-10.9%; multivariable-adjusted OR, 1.99; 95% CI, 1.68-2.36), and even greater for higher potassium strata. Similarly, mortality rates were higher for potassium levels of less than 3.5 mEq/L. In contrast, rates of ventricular fibrillation or cardiac arrest were higher only among patients with potassium levels of less than 3.0 mEq/L and at levels of 5.0 mEq/L or greater.
Conclusion Among inpatients with AMI, the lowest mortality was observed in those with postadmission serum potassium levels between 3.5 and <4.5 mEq/L compared with those who had higher or lower potassium levels.
Do you have access to thromboelastometry in your Emergency Department? Further research by some of the researchers who first described acute traumatic coagulopathy used this tool to identify acute traumatic coagulopathy at 5 mins and to predict the need for massive transfusion. Measures of coagulopathy more familiar to ED staff, such as the INR, took longer to obtain or (when point-of-care testing was employed) were less accurate.
OBJECTIVE: To identify an appropriate diagnostic tool for the early diagnosis of acute traumatic coagulopathy and validate this modality through prediction of transfusion requirements in trauma hemorrhage.
DESIGN: Prospective observational cohort study.
SETTING: Level 1 trauma center.
PATIENTS: Adult trauma patients who met the local criteria for full trauma team activation. Exclusion criteria included emergency department arrival >2 hrs after injury, >2000 mL of intravenous fluid before emergency department arrival, or transfer from another hospital.
MEASUREMENTS: Blood was collected on arrival in the emergency department and analyzed with laboratory prothrombin time, point-of-care prothrombin time, and rotational thromboelastometry. Prothrombin time ratio was calculated and acute traumatic coagulopathy defined as laboratory prothrombin time ratio >1.2. Transfusion requirements were recorded for the first 12 hrs following admission.
MAIN RESULTS: Three hundred patients were included in the study. Laboratory prothrombin time results were available at a median of 78 (62-103) mins. Point-of-care prothrombin time ratio had reduced agreement with laboratory prothrombin time ratio in patients with acute traumatic coagulopathy, with 29% false-negative results. In acute traumatic coagulopathy, the rotational thromboelastometry clot amplitude at 5 mins was diminished by 42%, and this persisted throughout clot maturation. Rotational thromboelastometry clotting time was not significantly prolonged. Clot amplitude at a 5-min threshold of ≤35 mm had a detection rate of 77% for acute traumatic coagulopathy with a false-positive rate of 13%. Patients with clot amplitude at 5 mins ≤35 mm were more likely to receive red cell (46% vs. 17%, p < .001) and plasma (37% vs. 11%, p < .001) transfusions. The clot amplitude at 5 mins could identify patients who would require massive transfusion (detection rate of 71%, vs. 43% for prothrombin time ratio >1.2, p < .001).
CONCLUSIONS: In trauma hemorrhage, prothrombin time ratio is not rapidly available from the laboratory and point-of-care devices can be inaccurate. Acute traumatic coagulopathy is functionally characterized by a reduction in clot strength. With a threshold of clot amplitude at 5 mins of ≤35 mm, rotational thromboelastometry can identify acute traumatic coagulopathy at 5 mins and predict the need for massive transfusion.
I enjoyed a paper from Critical Care Medicine this month which relates to a major bugbear of mine: the prescription of 0.9% saline for critically ill patients and the consequent metabolic acidosis this causes. However it did produce some interesting findings that helped me review my own biases here.
In short, an ICU team decided to reduce and where possible eliminate the use of high chloride fluids including 0.9% saline and Gelofusine and replace with lower chloride fluids, mainly Ringer’s Lactate (Hartmann’s solution).
It is known that saline causes a metabolic acidosis by elevating chloride and reducing the strong ion difference, resulting in a normal anion gap (hyperchloraemic) acidosis. The clinical significance of this is uncertain, but the iatrogenic acidosis is often misinterpreted by clinicians as a sign of severe illness, especially by those who don’t look at the chloride or the anion gap.
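The mechanism can be illustrated numerically. A minimal sketch using a simplified apparent strong ion difference (Na + K − Cl, ignoring lactate and other strong ions) and the potassium-free anion gap; the before/after electrolyte values are hypothetical, chosen to mimic a large-volume 0.9% saline infusion:

```python
def apparent_sid(na, k, cl):
    """Simplified apparent strong ion difference (mEq/L), ignoring lactate."""
    return na + k - cl

def anion_gap(na, cl, hco3):
    """Anion gap without potassium (mEq/L): Na - (Cl + HCO3)."""
    return na - (cl + hco3)

# Hypothetical electrolytes before and after large-volume 0.9% saline
before = dict(na=140, k=4.0, cl=104, hco3=24)
after  = dict(na=141, k=4.0, cl=114, hco3=18)   # chloride up, bicarbonate down

for label, e in (("before", before), ("after", after)):
    print(label,
          "SID =", apparent_sid(e["na"], e["k"], e["cl"]),
          "anion gap =", anion_gap(e["na"], e["cl"], e["hco3"]))
```

The chloride rise drags the SID down (40 to 31 mEq/L in this sketch) and forces bicarbonate down with it, yet the anion gap stays normal: exactly the pattern of a hyperchloraemic, normal anion gap acidosis that is easy to misread if the chloride is never inspected.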
Not surprisingly, changing the fluid policy resulted in less acidosis (and also less hypernatraemia). There was, however, an increase in severe alkalaemia. The study was not designed to look at patient-oriented outcomes.
My observations are:
This is an important reminder that saline causes acidosis
Because of the possibility of worsening alkalosis, fluid therapy choice should be individualised for an ICU patient based on their known acid-base issues; in some cases, saline may be appropriate.
These patients were managed for several days on an ICU. Alkalaemia is common on the ICU for reasons that include hypoalbuminaemia, furosemide use, and iatrogenic hyperventilation. These factors are less relevant in the ED resuscitation population where such a degree of alkalaemia is rarely seen.
The authors point out that their results are “consistent with previous acute treatment studies, which were conducted in the perioperative or experimental setting” – isn’t it a shame that ED-based studies are not forthcoming?
The authors point to an additional finding:
Furthermore, our results suggest that routine use of lactate fluids such as Hartmann’s or Ringer’s lactate is associated with a detectable iatrogenic increase in lactate in the first 48 hrs after ICU admission, when, presumably, lactate clearance is less effective.
An additional benefit of the change in fluid policy was a significant cost saving, largely due to the omission of Gelofusine.
This study reassures me that my current practice of preferring Ringer’s lactate to saline in the resuscitation setting is likely to minimise iatrogenic acidosis without significantly elevating lactate, in a population rarely afflicted by significant alkalaemia.
Setting: University-affiliated intensive care unit.
Patients: A cohort of 828 consecutive patients admitted over 6 months from February 2008 and a cohort of 816 consecutive patients admitted over 6 months from February 2009.
Interventions: We collected biochemical and fluid use data during standard practice without clinician awareness. After a 6-month period of education and preparation, we restricted the use of chloride-rich fluids (0.9% saline [Baxter, Sydney, Australia], Gelofusine [BBraun, Melsungen, Germany], and Albumex 4 [CSL Bioplasma, Melbourne, Australia]) in the intensive care unit and made them available only on specific intensive care unit specialist prescription.
Measurements and Main Results: Saline prescription decreased from 2411 L in the control group to 52 L in the intervention group (p < .001), Gelofusine from 538 to 0 L (p < .001), and Albumex 4 from 269 to 80 L (p < .001). As expected, Hartmann’s lactated solution prescription increased from 469 to 3205 L (p < .001), Plasma-Lyte from 65 to 160 L (p < .05), and chloride-poor Albumex 20 from 87 to 268 L (p < .001). After intervention, the incidence of metabolic alkalosis (standard base excess >5 mEq/L) and severe alkalemia (pH >7.5) increased from 25.4% to 32.8% and from 10.5% to 14.7%, respectively (p < .001). The time-weighted mean chloride level decreased from 104.9 ± 4.9 to 102.5 ± 4.6 mmol/L (p < .001), whereas the time-weighted mean standard base excess increased from 0.5 ± 4.5 to 1.8 ± 4.7 mmol/L (p < .001), mean bicarbonate from 25.3 ± 4.0 to 26.4 ± 4.1 mmol/L (p < .001) and mean pH from 7.40 ± 0.06 to 7.42 ± 0.06 (p < .001). Overall fluid costs decreased from $15,077 (U.S.) to $3,915.
Conclusions: In a tertiary intensive care unit in Australia, restricting the use of chloride-rich fluids significantly affected electrolyte and acid-base status. The choice of fluids significantly modulates acid-base status in critically ill patients.
Patients with severe sepsis and an elevated lactate who appear to be normotensive had a mortality similar to those presenting with hypotension. This is demonstrated in a new study on patients who were recruited to a study I have reported before.
The so-called ‘cryptic shock’ group was defined by a systolic BP of at least 90 mmHg, suggesting to me not so much that normotension and hypotension are prognostically equivalent, but that we should perhaps redefine hypotension in sepsis, as we should probably be doing in trauma. Alternatively (and preferably), the BP should be interpreted in the context of what is known to be, or likely to be, normal for that patient. For example, a systolic BP of 105 mmHg in a 75 year old male would be ringing serious alarm bells for me in a febrile patient, and I would be working them up for severe sepsis from the start. Interestingly in this study, the cryptic shock group had a higher proportion of patients with diabetes and/or end stage renal disease – diagnoses one would expect to be associated with hypertension – and the median (and IQR) systolic BP in this group was 108 (92, 126). So, although this shock may have been ‘cryptic’ as opposed to ‘overt’ by the definition applied in the paper (a cut-off of 90 mmHg), it is likely that some of the patients in the cryptic group were hypotensive compared with their usual blood pressure.
These observations do not detract from a key message the authors include in their discussion, with which I wholeheartedly agree:
“These data highlight the need to screen patients for signs of occult hypoperfusion, and given the high mortality rate associated with an elevated serum lactate, also suggest that patients with biochemical evidence of inadequate oxygen delivery despite normal blood pressure should be included in early sepsis resuscitation pathways.”
This paper makes an important contribution to the sepsis literature by warning against dismissing a patient with an elevated serum lactate and apparent haemodynamic stability as less acutely ill than one presenting with overt hypotension. It is a reminder to check the lactate in patients with infection and signs of a systemic inflammatory response, since this may provide the only early evidence of hypoperfusion.
Introduction We sought to compare the outcomes of patients with cryptic versus overt shock treated with an emergency department (ED) based early sepsis resuscitation protocol.
Methods Pre-planned secondary analysis of a large, multicenter ED-based randomized controlled trial of early sepsis resuscitation. All subjects were treated with a quantitative resuscitation protocol in the ED targeting 3 physiological variables: central venous pressure, mean arterial pressure and either central venous oxygen saturation or lactate clearance. The study protocol was continued until all endpoints were achieved or a maximum of 6 h. Outcomes data of patients who were enrolled with a lactate ≥4 mmol/L and normotension (cryptic shock) were compared to those enrolled with sustained hypotension after fluid challenge (overt shock). The primary outcome was in-hospital mortality.
Results A total of 300 subjects were enrolled, 53 in the cryptic shock group and 247 in the overt shock group. The demographics and baseline characteristics were similar between the groups. The primary endpoint of in-hospital mortality was observed in 11/53 (20%, 95% CI 11–34) in the cryptic shock group and 48/247 (19%, 95% CI 15–25) in the overt shock group, difference of 1% (95% CI −10 to 14; log rank test p = 0.81).
Conclusion Severe sepsis with cryptic shock carries a mortality rate not significantly different from that of overt septic shock. These data suggest the need for early aggressive screening for and treatment of patients with an elevated serum lactate in the absence of hypotension.