Results 1 - 14 of 14
1.
Sci Total Environ ; 946: 174379, 2024 Oct 10.
Article in English | MEDLINE | ID: mdl-38955270

ABSTRACT

Understanding the decay characteristics of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) in wastewater and ambient waters is important for multiple applications, including assessing the exposure risk associated with handling wastewater samples, the public health risk associated with recreation in wastewater-polluted ambient waters, and better understanding and interpretation of wastewater-based epidemiology (WBE) results. We evaluated the decay rates of infectious SARS-CoV-2 and viral RNA in wastewater and ambient waters under temperature regimes representative of seasonal fluctuations. Infectious virus was seeded into autoclaved primary wastewater effluent, final dechlorinated wastewater effluent, lake water, and marine water at a final concentration of 6.26 ± 0.07 log10 plaque-forming units per milliliter. Each suspension was incubated at 4, 25, or 37 °C. Samples were initially collected on an hourly basis, then approximately every other day for 15 days. All samples were analyzed for infectious virus via a plaque assay using the Vero E6 cell line, and viral gene copy levels were quantified with the US CDC's N1 and N2 reverse transcriptase quantitative polymerase chain reaction (RT-qPCR) assays. The infectious virus decayed significantly faster (p ≤ 0.0214) than viral RNA, which persisted for the duration of the study irrespective of the incubation conditions. The initial loss (within 15 min of seeding) as well as the decay of infectious SARS-CoV-2 was significantly faster (p ≤ 0.0387) in primary treated wastewater than in the other water types, but viral RNA did not degrade appreciably in this matrix until day 15. Overall, temperature was the most important driver of decay, and after 24 h, no infectious SARS-CoV-2 was detected at 37 °C in any water type.
Moreover, the CDC N2 gene assay target decayed significantly faster (p ≤ 0.0174) at elevated temperatures than CDC N1, which has important implications for RT-qPCR assay selection in WBE applications.
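Decay rates like those compared above are conventionally derived from log-linear (first-order) fits of concentration over time. A minimal sketch, using hypothetical concentrations (not the study's data) and the seeding level quoted in the abstract:

```python
def log_linear_decay(times_d, log10_conc):
    """Least-squares fit of first-order decay in log10 space:
    log10 C(t) = log10 C0 - k * t.
    Returns (k, t90), where t90 = 1/k is the time for a
    1-log10 (90%) reduction."""
    n = len(times_d)
    mt = sum(times_d) / n
    mc = sum(log10_conc) / n
    slope = (sum((t - mt) * (c - mc) for t, c in zip(times_d, log10_conc))
             / sum((t - mt) ** 2 for t in times_d))
    k = -slope            # positive decay constant, log10 units per day
    return k, 1.0 / k

# hypothetical time series: seeded at 6.26 log10 PFU/mL, sampled every 2 days
times = [0, 2, 4, 6, 8]
conc = [6.26, 5.3, 4.2, 3.3, 2.2]
k, t90 = log_linear_decay(times, conc)   # k ~ 0.51 log10/day, t90 ~ 2 days
```

Fitting separate k values per matrix and temperature is what allows statements like "infectious virus decayed significantly faster than viral RNA" to be tested formally.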


Subject(s)
RNA, Viral , SARS-CoV-2 , Wastewater , Wastewater/virology , SARS-CoV-2/genetics , COVID-19/transmission , COVID-19/epidemiology , Water Microbiology , Environmental Monitoring/methods , Chlorocebus aethiops
2.
J Virol Methods ; 311: 114645, 2023 01.
Article in English | MEDLINE | ID: mdl-36332716

ABSTRACT

Wastewater monitoring for severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), the virus responsible for the global coronavirus disease 2019 (COVID-19) pandemic, has highlighted the need for methodologies capable of assessing viral prevalence during periods of low population infection. To address this need, two volumetrically different but methodologically similar concentration approaches were compared for their ability to detect viral nucleic acid and infectious SARS-CoV-2 signal from primary influent samples. For Method 1, 2 L of SARS-CoV-2-seeded wastewater was processed using a dead-end hollow fiber ultrafilter (D-HFUF) for primary concentration, followed by the CP Select™ for secondary concentration. For Method 2, 100 mL of SARS-CoV-2-seeded wastewater was processed using the CP Select™ procedure alone. Following D-HFUF concentration (Method 1), significantly lower levels of infectious SARS-CoV-2 were lost (P value range: 0.0398-0.0027) compared to viral gene copy (GC) levels detected by the US Centers for Disease Control and Prevention (CDC) N1 and N2 reverse-transcriptase quantitative polymerase chain reaction (RT-qPCR) assays. Subsamples were also taken at different steps in the concentration process to better characterize losses of SARS-CoV-2. During the centrifugation step (prior to CP Select™ concentration), significantly higher losses (P value range: 0.0003 to <0.0001) occurred for SARS-CoV-2 GC levels compared to infectious virus for Method 1, while between the methods, significantly higher infectious viral losses were observed for Method 2 (P = 0.0002). When analyzing overall recovery of endogenous SARS-CoV-2 in wastewater samples, Method 1 improved assay sensitivity (P < 0.0001) compared with Method 2; this was especially evident during periods of lower COVID-19 case rates within the sewershed.
This study describes a method that can successfully concentrate infectious SARS-CoV-2 and viral RNA from wastewater. Moreover, we demonstrated that large-volume wastewater concentration provides the additional sensitivity needed to improve SARS-CoV-2 detection, especially during periods of low community disease prevalence.


Subject(s)
COVID-19 , Viruses , Humans , SARS-CoV-2/genetics , COVID-19/diagnosis , Wastewater , Pandemics , RNA, Viral/genetics
3.
J Virol Methods ; 296: 114245, 2021 10.
Article in English | MEDLINE | ID: mdl-34310974

ABSTRACT

Dead-end hollow fiber ultrafiltration combined with a single agar layer assay (D-HFUF-SAL) has potential use in assessing the sanitary quality of recreational waters through enumeration of coliphages as measures of fecal contamination. However, information on its applicability across a broad range of sites and water types is limited. Here, we tested the performance of D-HFUF-SAL on 49 marine and freshwater samples. The effect of the method used to titer the spiking suspension (SAL versus double agar layer [DAL]) on percent recovery was also evaluated. Average somatic coliphage recovery (72% ± 27%) was significantly higher (p < 0.0001) than F+ coliphage recovery (53% ± 19%). This difference was more pronounced for marine waters (p ≤ 0.0001) than for freshwaters (p = 0.0134). Neither titer method affected somatic coliphage recovery, but DAL (28% ± 12%) significantly (p < 0.0001) underestimated F+ coliphage recoveries compared to SAL (53% ± 19%). Overall, the results indicate that, while D-HFUF-SAL performed well over a wide variety of water types, F+ coliphage recoveries were significantly reduced in marine waters, suggesting that some component unique to this habitat may interfere with assay performance. More importantly, our findings indicate that the choice of spike titer method merits careful consideration, since it may lead to underestimation of method percent recovery.
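The dependence on the spike titer method comes straight from the arithmetic of percent recovery: the titer sits in the denominator, so a higher titer estimate for the same spike yields a lower computed recovery. A sketch with hypothetical counts (chosen only to reproduce the reported means):

```python
def percent_recovery(recovered_pfu, spike_titer_pfu):
    """Method recovery expressed against the spike titer.
    The denominator depends on which assay (SAL vs. DAL)
    was used to titer the spiking suspension."""
    return 100.0 * recovered_pfu / spike_titer_pfu

# hypothetical: the same F+ eluate scored against two spike-titer estimates
eluate = 530.0
sal = percent_recovery(eluate, 1000.0)  # spike titered by one assay -> 53.0%
dal = percent_recovery(eluate, 1900.0)  # higher titer estimate -> ~27.9%
```

Identical laboratory performance can thus appear as very different "percent recovery" figures depending solely on how the denominator was measured, which is the caution the abstract raises.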


Subject(s)
Ultrafiltration , Water Microbiology , Coliphages , Feces , Fresh Water
4.
Sci Total Environ ; 774: 145727, 2021 Jun 20.
Article in English | MEDLINE | ID: mdl-33607441

ABSTRACT

Levels of severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2) RNA in wastewater could act as an effective means to monitor coronavirus disease 2019 (COVID-19) within communities. However, current methods used to detect SARS-CoV-2 RNA in wastewater are limited in their ability to process sufficient volumes of source material, inhibiting our ability to assess viral load. Typically, viruses are concentrated from large liquid volumes in two stages, primary and secondary concentration. Here, we evaluated a dead-end hollow fiber ultrafilter (D-HFUF) for primary concentration, followed by the CP Select™ for secondary concentration, from 2 L volumes of primary treated wastewater. Various amendments to each concentration procedure were investigated to optimally recover seeded OC43 (a betacoronavirus) from wastewater. During primary concentration, the D-HFUF recovered 69 ± 18% (n = 29) of spiked OC43 from 2 L of wastewater. For secondary concentration, the CP Select™ system using the Wastewater Application settings was capable of processing 100 mL volumes of primary filter eluates in <25 min. A hand-driven syringe elution proved significantly superior (p = 0.0299) to the CP Select™ elution for recovering OC43 from filter eluates: 48 ± 2% compared with 31 ± 3%, respectively. For the complete method (primary and secondary concentration combined), the D-HFUF and CP Select™/syringe elution achieved an overall recovery of 22 ± 4% of spiked OC43 across replicate filters (n = 8). Given the lack of standardized methodology, compounded by the inherent limitations of relying on viral RNA for wastewater surveillance of SARS-CoV-2, it is important to acknowledge these challenges when interpreting these data to estimate community infection rates.
However, the development of methods that can substantially increase sample volumes will likely allow quantifiable viral data to be reported for wastewater surveillance, equipping public health officials with the information necessary to better estimate community infection rates.
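For a multi-step method like this, per-step recovery fractions multiply, so the whole-method recovery can be sanity-checked against the product of the step means. A sketch using the figures quoted in the abstract:

```python
def overall_recovery(step_recoveries):
    """Multiply per-step recovery fractions to estimate
    whole-method recovery for a serial concentration workflow."""
    out = 1.0
    for r in step_recoveries:
        out *= r
    return out

# reported step means: 69% for D-HFUF primary concentration and 48% for the
# syringe elution; their product (~33%) exceeds the measured whole-method
# recovery (22 +/- 4%), consistent with additional losses at intermediate
# steps (e.g. the centrifugation noted in the companion study)
est = overall_recovery([0.69, 0.48])
```

The gap between the multiplied step recoveries and the measured end-to-end recovery is exactly why the authors subsampled at intermediate steps.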


Subject(s)
COVID-19 , Coronavirus , Humans , RNA, Viral , SARS-CoV-2 , Wastewater
5.
J Card Surg ; 36(1): 89-96, 2021 Jan.
Article in English | MEDLINE | ID: mdl-33170533

ABSTRACT

OBJECTIVE: Surgical reoperation for aortic homograft structural valve degeneration (SVD) is a high-risk procedure. Transcatheter aortic valve replacement (TAVR) for homograft SVD is an alternative to reoperation, but descriptions of implantation techniques are limited. This study compares outcomes in patients undergoing TAVR, divided into two groups by the type of previously implanted aortic valve prosthesis: failed aortic homografts (TAVR-H) or stented aortic bioprostheses (TAVR-BP). METHODS: From 2015 to 2017, TAVR was performed in 41 patients with SVD: 33 patients in the TAVR-BP group (six for SVD of valved conduits) and eight patients in the TAVR-H group. The Valve Academic Research Consortium criteria were used for outcome reporting. RESULTS: Patients with TAVR-BP predominantly had prosthetic stenosis (94%, p = .002), whereas TAVR-H patients presented mostly with regurgitation (88%, p < .001). Patients with TAVR-H received Sapien-3 (6), Sapien-XT (1), and CoreValve (1) devices. Low implantation, with 40% ventricular fixation relative to the homograft annulus, was attempted to anchor the prosthesis on the homograft suture line. One patient with TAVR-BP experienced intraoperative distal prosthesis migration and a Type B aortic dissection, and two patients in the TAVR-H group had late postoperative proximal device migration. In both groups, there were no 30-day deaths, strokes, or pacemaker implantations. CONCLUSIONS: TAVR for failing aortic homografts may be a feasible and safe alternative to high-risk surgical reintervention. Precise, 40%-ventricular device positioning appears crucial for preventing late paravalvular leak and late prosthesis migration and for minimizing the risk of repeat surgical intervention.


Subject(s)
Aortic Valve Stenosis , Heart Valve Prosthesis , Transcatheter Aortic Valve Replacement , Allografts , Aortic Valve/surgery , Aortic Valve Stenosis/surgery , Humans , Prosthesis Design , Risk Factors , Time Factors , Treatment Outcome
6.
J Vis Commun Med ; 42(3): 114-119, 2019 Jul.
Article in English | MEDLINE | ID: mdl-31184541

ABSTRACT

Intraoperative photography is used to obtain images for both education and research, but it poses challenges due to concerns regarding aseptic technique. Waterproof digital cameras have sterilisable cases that allow the surgeon to take intraoperative photographs. We compared the quality of still intraoperative images obtained by a non-scrubbed observer using a 35 mm single lens reflex (SLR) camera to images obtained by the surgeon using a GoPro camera in a sterilised case. Image quality was scored on a 4-point Likert scale by three groups of end users with differing experience: faculty surgeons, surgical residents, and third-year veterinary students. Mean ± SEM overall image quality scores were higher for the traditional 35 mm digital SLR camera than for the GoPro (3.25 ± 0.08 vs. 2.0 ± 0.08, p < .0001), as were scores for each image characteristic (brightness, colour, sharpness, and contrast). Image quality scores for each camera also differed significantly between user groups, with expert users (faculty and residents) giving lower quality scores than novices (students). These findings suggest that GoPro cameras provide lower intraoperative image quality than digital SLR cameras, although lower quality images may be more acceptable to novices than to experienced users.


Subject(s)
Operating Rooms , Photography/instrumentation , Surgical Procedures, Operative/methods , Humans , Intraoperative Period
7.
Domest Anim Endocrinol ; 51: 114-21, 2015 Apr.
Article in English | MEDLINE | ID: mdl-25625650

ABSTRACT

Glucagon-like peptide-1 (GLP-1) is an intestinal hormone that induces glucose-dependent stimulation of insulin secretion while suppressing glucagon secretion. Glucagon-like peptide-1 also increases beta cell mass and satiation while decelerating gastric emptying. Liraglutide is a fatty-acid derivative of GLP-1 with a protracted pharmacokinetic profile that is used in people for the treatment of type II diabetes mellitus and obesity. The aim of this study was to determine the pharmacokinetics and pharmacodynamics of liraglutide in healthy cats. Hyperglycemic clamps were performed on days 0 (HGC) and 14 (LgHGC) in 7 healthy cats. Liraglutide was administered subcutaneously (0.6 mg/cat) once daily on days 8 through 14. Compared with the HGC (mean ± standard deviation; 455.5 ± 115.8 ng/L), insulin concentrations during LgHGC were increased (760.8 ± 350.7 ng/L; P = 0.0022), glucagon concentrations decreased (0.66 ± 0.4 pmol/L during HGC vs 0.5 ± 0.4 pmol/L during LgHGC; P = 0.0089), and there was a trend toward an increased total glucose infused (median [range] = 1.61 [1.11-2.54] g/kg and 2.25 [1.64-3.10] g/kg, respectively; P = 0.087). Appetite reduction and decreased body weight (9% ± 3%; P = 0.006) were observed in all cats. Liraglutide has effects and a pharmacokinetic profile in cats similar to those reported in people. With a half-life of approximately 12 h, once-daily dosing might be feasible; however, significant effects on appetite and weight loss may necessitate reductions in dosage or dosing frequency. Further investigation of liraglutide in diabetic and overweight cats is warranted.
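The feasibility of once-daily dosing given a ~12 h half-life can be illustrated with the standard steady-state accumulation formula for repeated dosing (a one-compartment, first-order-elimination sketch, not a claim about liraglutide's actual disposition in cats):

```python
def accumulation_ratio(dosing_interval_h, half_life_h):
    """Steady-state accumulation ratio for repeated dosing under
    first-order elimination: R = 1 / (1 - 2**(-tau / t_half))."""
    return 1.0 / (1.0 - 2.0 ** (-dosing_interval_h / half_life_h))

# half-life ~12 h with once-daily (24 h) dosing -> each dose finds only 25%
# of the previous one remaining, so accumulation is modest (~1.33x)
r = accumulation_ratio(24, 12)
```

A modest accumulation ratio is consistent with the suggestion that once-daily dosing might be feasible, while leaving room for the dose reductions the authors flag.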


Subject(s)
Cats/metabolism , Glucagon-Like Peptide 1/analogs & derivatives , Hypoglycemic Agents , Liraglutide/pharmacology , Liraglutide/pharmacokinetics , Animals , Appetite/drug effects , Blood Glucose/analysis , Cats/blood , Female , Glucagon/blood , Glucose/administration & dosage , Glucose Clamp Technique , Hyperglycemia/blood , Insulin/blood , Liraglutide/administration & dosage , Male , Weight Loss/drug effects
8.
Domest Anim Endocrinol ; 51: 78-85, 2015 Apr.
Article in English | MEDLINE | ID: mdl-25594949

ABSTRACT

Exenatide extended-release (ER) is a microencapsulated formulation of the glucagon-like peptide-1 receptor agonist exenatide. It has a protracted pharmacokinetic profile that allows once-weekly injection, with efficacy comparable to insulin and an improved safety profile in people with type II diabetes. Here, we studied the pharmacology of exenatide ER in 6 healthy cats. A single subcutaneous injection of exenatide ER (0.13 mg/kg) was administered on day 0. Exenatide concentrations were measured for 12 wk. A hyperglycemic clamp (target = 225 mg/dL) was performed on days -7 (clamp I) and 21 (clamp II) with measurements of insulin and glucagon concentrations. Glucose tolerance was defined as the amount of glucose required to maintain hyperglycemia during the clamp. Continuous glucose monitoring was performed on weeks 0, 2, and 6 after injection. Plasma concentrations of exenatide peaked at 1 h and 4 wk after injection. Comparing clamp I with clamp II, fasting blood glucose decreased (mean ± standard deviation = -11 ± 8 mg/dL, P = 0.02), glucose tolerance improved (median [range] +33% [4%-138%], P = 0.04), insulin concentrations increased (+36.5% [-9.9% to 274.1%], P = 0.02), and glucagon concentrations decreased (-4.7% [0%-12.1%], P = 0.005). Compared with preinjection values on continuous glucose monitoring, glucose concentrations decreased and the frequency of readings <50 mg/dL increased at 2 and 6 wk after injection of exenatide ER. This did not correspond to clinical hypoglycemia. No other side effects were observed throughout the study. Exenatide ER was safe and effective in improving glucose tolerance 3 wk after a single injection. Further evaluation is needed to determine its safety, efficacy, and duration of action in diabetic cats.


Subject(s)
Cat Diseases/drug therapy , Diabetes Mellitus/veterinary , Glucagon-Like Peptide 1/analogs & derivatives , Hypoglycemic Agents/pharmacology , Peptides/pharmacology , Venoms/pharmacology , Animals , Blood Glucose/analysis , Cats , Diabetes Mellitus/drug therapy , Drug Synergism , Exenatide , Fasting , Glucagon/blood , Glucagon-Like Peptide-1 Receptor/agonists , Glucose Clamp Technique , Hypoglycemic Agents/pharmacokinetics , Injections, Subcutaneous , Insulin/blood , Microspheres , Peptides/administration & dosage , Peptides/pharmacokinetics , Venoms/administration & dosage , Venoms/pharmacokinetics
9.
Domest Anim Endocrinol ; 47: 119-26, 2014 Apr.
Article in English | MEDLINE | ID: mdl-23428563

ABSTRACT

Type 1 diabetes mellitus is one of the most frequently diagnosed endocrinopathies in dogs, and prevalence continues to increase. Pancreatic islet transplantation is a noninvasive and potentially curative treatment for type 1 diabetes mellitus. Institution of this treatment in dogs will require a readily available source of canine islets. We hypothesized that clinically acceptable islet yield and purity could be achieved by using deceased canine donors and standard centrifugation equipment. Pancreata were procured from dogs euthanized for reasons unrelated to this study. Initial anatomic studies were performed to evaluate the efficacy of pancreatic perfusion. Infusion into the accessory pancreatic duct resulted in perfusion of approximately 75% of the pancreas. Additional cannulation of the distal right limb of the pancreas allowed complete perfusion. Collagenase digestion was performed with a Ricordi chamber and a temperature-controlled perfusion circuit. Islets were separated from the exocrine tissue with the use of a discontinuous density gradient and a standard laboratory centrifuge. After isolation, islet yield was calculated and viability was assessed with dual fluorescent staining techniques. Islet isolation was completed in 6 dogs. Median (interquartile range) islet yield was 36,756 (28,527) islet equivalents per pancreas. High islet purity (percentage of endocrine tissue; 87.5% [10%]) and viability (87.4% [12.4%]) were achieved. The islet yield achieved with this technique would require approximately 1 pancreas per 5 kg body weight of the recipient dog. Purity and viability of the isolated islets were comparable with those achieved in human islet transplantation programs. According to these initial results, clinically relevant islet yield and quality can be obtained from deceased canine donors with the use of standard laboratory equipment.
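The "1 pancreas per 5 kg" rule of thumb above is simple arithmetic on the median yield, and it implies a per-kilogram islet dose. A sketch of that calculation (the function names are illustrative, not from the study):

```python
import math

IEQ_PER_PANCREAS = 36756   # median yield reported (islet equivalents)
KG_PER_PANCREAS = 5.0      # ~1 donor pancreas per 5 kg of recipient

def pancreata_needed(recipient_kg):
    """Whole donor pancreata required for a recipient of a given weight,
    rounding up since partial donors are not available."""
    return math.ceil(recipient_kg / KG_PER_PANCREAS)

def implied_ieq_per_kg():
    """Islet dose implied by the yield and the 5 kg rule (~7,351 IEQ/kg)."""
    return IEQ_PER_PANCREAS / KG_PER_PANCREAS

donors_for_20kg_dog = pancreata_needed(20)   # a 20 kg recipient needs 4 donors
```

Framing the yield as IEQ per recipient kilogram makes it directly comparable with dosing targets used in human islet transplantation.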


Subject(s)
Cell Separation/veterinary , Dogs , Islets of Langerhans/physiology , Animals , Cadaver , Cell Separation/methods , Tissue Donors
10.
Vet Comp Oncol ; 6(1): 1-18, 2008 Mar.
Article in English | MEDLINE | ID: mdl-19178659

ABSTRACT

Cisplatin is a platinum chemotherapeutic used in a variety of malignancies. Its antineoplastic activity arises from DNA cross-links and adducts, in addition to the generation of superoxide radicals. Nephrotoxicity is the best-known and potentially most clinically significant toxicity. Unfortunately, the mechanism of cisplatin nephrotoxicity has not been completely elucidated, although many theories have been proposed. Other toxicities include gastrointestinal toxicity, myelosuppression, ototoxicity and neurotoxicity. Saline diuresis is currently the most accepted way to prevent cisplatin nephrotoxicity. Research has focused on pharmaceuticals and enzyme/molecular alterations as alternatives to long-term diuresis. No agents have yet been identified that can protect against all toxicities. Cisplatin has shown activity against osteosarcoma, transitional cell carcinoma, squamous cell carcinoma (SCC), melanoma, mesothelioma, carcinomatosis and germinal cell tumours in the dog. In the cat, cisplatin cannot be used because fulminant pulmonary oedema occurs at standard doses. Intralesional cisplatin has been utilized in horses for the treatment of SCC and sarcoids.


Subject(s)
Antineoplastic Agents/adverse effects , Antineoplastic Agents/therapeutic use , Cisplatin/adverse effects , Cisplatin/therapeutic use , Kidney Diseases/veterinary , Neoplasms/veterinary , Animals , Dog Diseases/drug therapy , Dogs , Horse Diseases/drug therapy , Horses , Kidney Diseases/chemically induced , Neoplasms/drug therapy
11.
Vet Surg ; 30(6): 515-21, 2001.
Article in English | MEDLINE | ID: mdl-11704946

ABSTRACT

OBJECTIVE: To identify preoperative diagnostic results that predict postoperative complications and survival in feline renal-transplant recipients. STUDY DESIGN: Retrospective clinical study. ANIMALS: Sixty-one feline renal allograft recipients. METHODS: Medical records for 61 consecutive cats that underwent renal allograft transplantation between January 1, 1996, and December 1, 1999, were reviewed. Age, diagnosis, body weight, body condition score, preoperative medical treatment, systolic blood pressure, packed cell volume, biochemical parameters at admission and at the time of surgery, postoperative complications, and postoperative survival were recorded. Associations of preoperative data with the occurrence of postoperative complications were determined using logistic regression. Postoperative survival was graphed using a Kaplan-Meier cumulative-survival plot. Associations of covariates with postoperative survival were analyzed using Cox proportional hazards analysis. RESULTS: Two parameters were significantly associated with occurrence of postoperative central nervous system (CNS) disorders: blood urea nitrogen concentration (odds ratio = 1.083; 95% CI = 1.018 to 1.148) and serum creatinine concentration (odds ratio = 1.8; 95% CI = 1.413 to 2.187) at the time of surgery. Postoperative survival was 59% at 6 months after transplantation and 42% at 3 years. Of all covariates investigated, only recipient age (relative hazard = 1.183; 95% CI = 1.039 to 1.334) was significantly associated with survival. CONCLUSION AND CLINICAL RELEVANCE: Standard measures of preoperative renal dysfunction do not predict postoperative survival in cats after renal transplantation, although an increase in the degree of preoperative azotemia is associated with an increased risk of CNS disorders after surgery. Increased recipient age is associated with decreased survival after renal transplantation.
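The odds ratios reported above are per-unit multipliers on the odds, so under the usual logistic-regression model the effect of a larger change in a predictor is the per-unit ratio raised to that change. A sketch (the 10-unit rise is a hypothetical illustration, not a value from the study):

```python
def odds_multiplier(odds_ratio_per_unit, delta):
    """Multiplicative change in the odds of the outcome for a
    `delta`-unit increase in the predictor, assuming the logistic
    model's constant per-unit odds ratio."""
    return odds_ratio_per_unit ** delta

# reported: OR = 1.083 per unit of blood urea nitrogen for postoperative
# CNS disorders; a hypothetical 10-unit rise in BUN would multiply the
# odds by roughly 2.2 under this model
bun_effect = odds_multiplier(1.083, 10)
```

This compounding is why a per-unit odds ratio that looks close to 1 can still represent a clinically meaningful risk gradient across the range of azotemia seen preoperatively.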


Subject(s)
Cat Diseases/mortality , Cat Diseases/surgery , Kidney Transplantation/veterinary , Postoperative Complications/veterinary , Animals , Blood Urea Nitrogen , Cats , Creatinine/blood , Female , Kidney Transplantation/mortality , Logistic Models , Male , Postoperative Complications/diagnosis , Postoperative Complications/mortality , Predictive Value of Tests , Records/veterinary , Retrospective Studies , Survival Analysis
12.
Vet Surg ; 30(2): 161-9, 2001.
Article in English | MEDLINE | ID: mdl-11230770

ABSTRACT

OBJECTIVE: To evaluate the use of a portocaval venograft and ameroid constrictor in the surgical management of intrahepatic portosystemic shunts (PSS). STUDY DESIGN: Prospective, clinical study. ANIMAL POPULATION: Ten client-owned dogs with intrahepatic PSS. METHODS: Portal pressure was measured after temporary suture occlusion of the intrahepatic PSS. In dogs with an increase in portal pressure greater than 8 mm Hg, a single extrahepatic portocaval shunt was created using a jugular vein. An ameroid ring was placed around the venograft and the intrahepatic PSS was attenuated. Transcolonic pertechnetate scintigraphy was performed before surgery, 5 days after surgery, and 8 to 10 weeks after surgery. Dogs with continued portosystemic shunting were evaluated further by laparotomy or portography. Clinical outcome and complications were recorded. RESULTS: Mean (+/- SD) portal pressure increased from 6 +/- 3 to 19 +/- 6 mm Hg with PSS occlusion; in all 10 dogs, the increase in portal pressure was greater than 8 mm Hg. There were no intraoperative complications, and, after creation of the portocaval shunt, the intrahepatic PSS could be completely ligated in 8 of 10 dogs. The final portal pressure was 9 +/- 4 mm Hg. Postoperative complications included coagulopathy and death (1 dog), ascites (3 dogs), and incisional discharge (3 dogs). Five of 8 dogs had continued portosystemic shunting at 8 to 10 weeks after surgery. Multiple extrahepatic PSS were demonstrated in 4 of these dogs. Clinical outcome was excellent in all 9 surviving dogs. CONCLUSIONS AND CLINICAL SIGNIFICANCE: The surgical technique resulted in a high incidence of multiple extrahepatic PSS. Short-term clinical results were promising, but long-term outcome must be evaluated further.


Subject(s)
Dog Diseases/surgery , Portal System/abnormalities , Portal System/surgery , Animals , Dog Diseases/diagnostic imaging , Dogs , Female , Jugular Veins/transplantation , Ligation/instrumentation , Ligation/veterinary , Male , Portal Vein/abnormalities , Portal Vein/surgery , Prospective Studies , Radionuclide Imaging , Suture Techniques/veterinary , Treatment Outcome
13.
J Am Vet Med Assoc ; 216(3): 371-5, 2000 Feb 01.
Article in English | MEDLINE | ID: mdl-10668536

ABSTRACT

OBJECTIVE: To characterize serologic and clinical features and outcome of dogs with leptospirosis that were treated conservatively (i.e., medical management alone) or with hemodialysis. DESIGN: Retrospective study. ANIMALS: 36 dogs with leptospirosis. PROCEDURE: History; results of physical examinations, ultrasonography, and serologic, hematologic, and serum biochemical analyses; time to resolution of azotemia; and outcome were obtained from medical records. Dogs were treated conservatively (n = 22) or with hemodialysis (n = 14). RESULTS: Between 1990 and 1998, the amount of rainfall was positively correlated with the number of leptospirosis cases identified per year. Serum antibodies against 6 Leptospira serovars were measured, and titers were highest to Leptospira pomona in 16 (44%) dogs, L bratislava in 9 (25%) dogs, and L hardjo in 1 (3%) dog. Eight (22%) dogs had equally high titers to L pomona and L bratislava, 1 (3%) had equally high titers to L grippotyphosa and L canicola, and 1 (3%) had high titers to L grippotyphosa, L pomona, L canicola, and L bratislava. During initial evaluation, all dogs were azotemic. Thirty (83%) dogs survived, including 12 of 14 (86%) dogs treated with hemodialysis and 18 of 22 (82%) treated conservatively. Serum creatinine concentration was similar in both groups after resolution of clinical signs. CONCLUSIONS AND CLINICAL RELEVANCE: Infection with L pomona and L bratislava was recognized as a cause of leptospirosis in dogs, resulting in acute renal failure with various degrees of azotemia. Prognosis for dogs with mild to moderate azotemia was good with conservative treatment, whereas treatment with hemodialysis appeared to improve prognosis for dogs with severe azotemia.


Subject(s)
Dog Diseases/therapy , Leptospirosis/veterinary , Renal Dialysis/veterinary , Animals , Antibodies, Bacterial/blood , Blood Cell Count/veterinary , Blood Urea Nitrogen , California/epidemiology , Creatinine/blood , Dog Diseases/epidemiology , Dog Diseases/etiology , Dogs , Female , Kidney/diagnostic imaging , Leptospira/immunology , Leptospirosis/complications , Leptospirosis/epidemiology , Leptospirosis/therapy , Male , Prognosis , Rain , Retrospective Studies , Ultrasonography , Uremia/etiology , Uremia/therapy , Uremia/veterinary
14.
Acta Anaesthesiol Scand ; 41(7): 931-8, 1997 Aug.
Article in English | MEDLINE | ID: mdl-9265939

ABSTRACT

BACKGROUND: The effects of inhalation anesthetics on left ventricular (LV) systolic function are well documented, while their effects on LV diastolic function have mainly been evaluated in animal studies, with conflicting results. METHODS: We investigated the effects of halothane and isoflurane when used to control the stress response to sternotomy in 33 patients with coronary artery disease (CAD). LV early diastolic relaxation and end-diastolic stiffness were evaluated from mitral Doppler flow profiles, transesophageal two-dimensional echocardiography, and central hemodynamic measurements. Measurements were performed (a) after induction of anesthesia, (b) after volume loading, (c) prior to surgery, and (d) during surgery, 10 min after introduction of the inhalation anesthetic. The effects of the anesthetics on Doppler indices reflecting early diastolic relaxation, and on the left ventricular end-diastolic pressure-area (LVED P/A) relationship, were studied. RESULTS: When data obtained during surgical stress were compared with the control situation, we found an increase in LV filling pressures in both groups, while only the isoflurane group showed an increase in heart rate. An increase in end-systolic LV area and a decreased fractional area change were present in the halothane group, while an increase in LV end-diastolic area and similar changes in the mitral Doppler indices (decreased deceleration rate and time of early diastolic filling), indicating impaired early diastolic relaxation, were present in both groups. Isoflurane induced a leftward displacement of the LVED P/A relationship from the baseline curve. CONCLUSION: Both halothane and isoflurane impair early diastolic relaxation in patients with CAD when used to control intraoperative surgical stress. In contrast to halothane, isoflurane induced a change in the LVED P/A relationship suggestive of increased LVED stiffness.


Subject(s)
Anesthetics, Inhalation/pharmacology , Coronary Disease/physiopathology , Diastole/drug effects , Halothane/pharmacology , Isoflurane/pharmacology , Ventricular Function, Left/drug effects , Adult , Aged , Echocardiography , Female , Humans , Male , Middle Aged , Sternum/surgery