Results 1 - 20 of 166
1.
Sci Rep ; 12(1): 16520, 2022 10 03.
Article in English | MEDLINE | ID: mdl-36192476

ABSTRACT

Effective mitigation of the impacts of invasive ship rats (Rattus rattus) requires a good understanding of their ecology, but this knowledge is very sparse for urban and peri-urban areas. We radiomarked ship rats in Wellington, New Zealand, to estimate detection parameters (σ, ε0, θ, and g0) that describe the process of an animal encountering a device (bait stations, chew cards and WaxTags) from a distance, and then approaching it and deciding whether to interact with it. We used this information in simulation models to estimate optimal device spacing for eradicating ship rats from Wellington, and for confirming eradication. Mean σ was 25.37 m (SD = 11.63), which equates to a circular home range of 1.21 ha. The mean nightly probability of an individual encountering a device at its home range center (ε0) was 0.38 (SD = 0.11), whereas the probability of interacting with the encountered device (θ) was 0.34 (SD = 0.12). The derived mean nightly probability of an individual interacting with a device at its home range center (g0) was 0.13 (SD = 0.08). Importantly, σ and g0 are intrinsically linked through a negative relationship; thus g0 should be derived from σ using a predictive model including individual variability. Simulations using this approach showed that bait stations deployed for about 500 days using a 25 m × 25 m grid consistently achieved eradication, and that a surveillance network of 3.25 chew cards/ha or 3.75 WaxTags/ha active for 14 nights would be required to confidently declare eradication. This density could be halved if the surveillance network was deployed for 28 nights or if the prior confidence in eradication was high (0.85). These recommendations take no account of differences in detection parameters between habitats. Therefore, if surveillance suggests that individuals are not encountering devices in certain habitats, device density should be adaptively revised.
This approach applies to initiatives globally that aim to optimise eradication with limited funding.
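The derived quantities in the abstract follow directly from the detection parameters. A minimal sketch, assuming the conventional half-normal detection model in which ~95% of activity falls within a radius of 2.45σ (an assumption about the authors' convention, though it reproduces their reported values):

```python
import math

# Detection parameters reported in the abstract (means)
sigma = 25.37   # m; spatial decay of detection
eps0  = 0.38    # nightly P(encounter) at home-range centre
theta = 0.34    # P(interaction | encounter)

# Circular home range from the ~95% half-normal radius, 2.45*sigma
radius_m = 2.45 * sigma
home_range_ha = math.pi * radius_m ** 2 / 10_000  # m^2 -> ha

# g0: nightly P(interacting with a device at the home-range centre),
# the product of encountering it and then interacting with it
g0 = eps0 * theta

print(round(home_range_ha, 2))  # matches the reported 1.21 ha
print(round(g0, 2))             # matches the reported 0.13
```

Both printed values agree with the abstract, which supports the half-normal reading of σ.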


Subject(s)
Introduced Species , Animals , New Zealand/epidemiology , Population Density , Rats
3.
Sci Adv ; 7(11)2021 03.
Article in English | MEDLINE | ID: mdl-33692107

ABSTRACT

Efficient decision-making integrates previous experience with new information. Tactical use of misinformation can alter choice in humans. Whether misinformation affects decision-making in other free-living species, including problem species, is unknown. Here, we show that sensory misinformation tactics can reduce the impacts of predators on vulnerable bird populations as effectively as lethal control. We repeatedly exposed invasive mammalian predators to unprofitable bird odors for 5 weeks before native shorebirds arrived for nesting and for 8 weeks thereafter. Chick production increased 1.7-fold at odor-treated sites over 25 to 35 days, with doubled or tripled odds of successful hatching, resulting in a 127% increase in modeled population size in 25 years. We demonstrate that decision-making processes that respond to changes in information reliability are vulnerable to tactical manipulation by misinformation. Altering perceptions of prey availability offers an innovative, nonlethal approach to managing problem predators and improving conservation outcomes for threatened species.

4.
Front Med Technol ; 3: 715969, 2021.
Article in English | MEDLINE | ID: mdl-35047948

ABSTRACT

Background: The COVID-19 pandemic, caused by severe acute respiratory syndrome coronavirus 2 (SARS-CoV-2), has placed a significant demand on healthcare providers (HCPs) to provide respiratory support for patients with moderate to severe symptoms. Continuous Positive Airway Pressure (CPAP) non-invasive ventilation can help patients with moderate symptoms to avoid the need for invasive ventilation in intensive care. However, existing CPAP systems can be complex (and thus expensive) or require high levels of oxygen, limiting their use in resource-stretched environments. Technical Development + Testing: The LeVe ("Light") CPAP system was developed using principles of frugal innovation to produce a solution of low complexity and high resource efficiency. The LeVe system exploits the air flow dynamics of electric fan blowers, which are inherently suited to delivering positive pressure at appropriate flow rates for CPAP. Laboratory evaluation demonstrated that the performance of the LeVe system was equivalent to that of other commercially available CPAP systems, achieving a 10 cm H2O target pressure within 2.4% RMS error and delivering 50-70% FiO2 with 10 L/min oxygen from a commercial concentrator. Pilot Evaluation: The LeVe CPAP system was tested to evaluate safety and acceptability in a group of ten healthy volunteers at Mengo Hospital in Kampala, Uganda. The study demonstrated that the system can be used safely without inducing hypoxia or hypercapnia and that its use was well-tolerated by users, with no adverse events reported. Conclusions: To provide respiratory support for the high patient numbers associated with the COVID-19 pandemic, healthcare providers require resource-efficient solutions. We have shown that this can be achieved through frugal engineering of a CPAP ventilation system that is safe for use and well-tolerated in healthy volunteers.
This approach may also benefit other respiratory conditions which often go unaddressed in Low and Middle Income Countries (LMICs) for want of context-appropriate technology designed for the limited oxygen resources available.
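The pressure benchmark can be expressed as a root-mean-square deviation from the 10 cm H2O target. A small sketch, using hypothetical pressure samples (the trace values are illustrative, not measured LeVe data):

```python
import math

def rms_error_pct(trace, target):
    """Root-mean-square deviation from target, as a % of target."""
    mse = sum((p - target) ** 2 for p in trace) / len(trace)
    return 100 * math.sqrt(mse) / target

# Hypothetical pressure samples (cm H2O) around the 10 cm H2O target
trace = [10.1, 9.9, 10.2, 9.8, 10.0, 10.15, 9.85, 10.05]
print(rms_error_pct(trace, 10.0))  # well under the 2.4% benchmark
```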

5.
Ecol Appl ; 30(8): e02200, 2020 12.
Article in English | MEDLINE | ID: mdl-32573866

ABSTRACT

Invasive mammalian predators can cause the decline and extinction of vulnerable native species. Many invasive mammalian predators are dietary generalists that hunt a variety of prey. These predators often rely upon olfaction when foraging, particularly at night. Little is understood about how prey odor cues are used to inform foraging decisions. Prey cues can vary spatially and temporally in their association with prey and can either reveal the location of prey or lead to unsuccessful foraging. Here we examine how two wild-caught invasive mammalian bird predator species (European hedgehogs Erinaceus europaeus and ferrets Mustela putorius furo) respond to unrewarded bird odors over successive exposures, first demonstrating that the odors are perceptually different using house mice (Mus musculus) as a biological olfactometer. We aim to test if introduced predators categorize odor cues of similar prey together, a tactic that could increase foraging efficiency. We exposed house mice to the odors using a standard habituation/dishabituation test in a laboratory setting, and wild-caught European hedgehogs and ferrets in an outdoor enclosure using a similar procedure. Mice discriminated among all bird odors presented, showing more interest in chicken odor than quail or gull odor. Both predator species showed a decline in interest toward unrewarded prey odor (i.e., habituation), but only ferrets generalized their response from one unrewarded bird odor to another bird odor. Hedgehog responses to unrewarded bird odors were highly variable between individuals. Taken together, our results reveal interspecific and intraspecific differences in response to prey odors, which we argue are a consequence of different diet breadth, life and evolutionary histories, and the conditions in each experiment. Generalization of prey odors may have enabled some species of invasive predators to efficiently hunt a range of intraguild prey species, for example, ground-nesting shorebirds. 
Olfactory manipulation of predators may be a useful conservation tool for threatened prey if it reduces the conspicuousness of vulnerable prey.


Subject(s)
Cues , Predatory Behavior , Animals , Birds , Mammals , Mice , Odorants
6.
Ecol Appl ; 29(1): e01814, 2019 01.
Article in English | MEDLINE | ID: mdl-30312506

ABSTRACT

Foraging mammalian predators face a myriad of odors from potential prey. To be efficient, they must focus on rewarding odors while ignoring consistently unrewarding ones. This may be exploited as a nonlethal conservation tool if predators can be deceived into ignoring odors of vulnerable secondary prey. To explore critical design components and assess the potential gains to prey survival of this technique, we created an individual-based model that simulated the hunting behavior of three introduced mammalian predators on one of their secondary prey (a migratory shorebird) in the South Island of New Zealand. Within this model, we heuristically assessed the outcome of habituating the predators to human-deployed unrewarding bird odors before the bird's arrival at their breeding grounds, i.e., the predators were "primed." Using known home range sizes and probabilities of predators interacting with food lures, our model suggests that wide-ranging predators should encounter a relatively large number of odor points (between 10 and 115) during 27 d of priming when odor is deployed within high-resolution grids (100-150 m). Using this information, we then modeled the effect of different habituation curves (exponential and sigmoidal) on the probability of predators depredating shorebird nests. Our results show that important gains in nest survival can be achieved regardless of the shape of the habituation curve, but particularly if predators are fast olfactory learners (exponential curve), and even if some level of dishabituation occurs after prey become available. Predictions from our model can inform the amount and pattern in which olfactory stimuli need to be deployed in the field to optimize encounters by predators, and the relative gains that can be expected from reduced predation pressure on secondary prey under different scenarios of predator learning. 
Habituating predators to odors of threatened secondary prey may have particular efficacy as a conservation tool in areas where lethal predator control is not possible or ethical, or where even low predator densities can be detrimental to prey survival. Our approach is also relevant for determining interaction probabilities for devices other than odor points, such as bait stations and camera traps.
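The two habituation shapes contrasted in the model can be sketched as simple response-probability curves. The functional forms and rate constants below are illustrative assumptions, not the paper's fitted parameters:

```python
import math

def exponential_habituation(n, k=0.15):
    """P(responding to prey odor) after n unrewarded encounters;
    a fast olfactory learner declines steeply from the start."""
    return math.exp(-k * n)

def sigmoidal_habituation(n, n50=10, s=2.0):
    """P(responding) after n encounters; slow initial decline,
    dropping past 50% only after n50 encounters."""
    return 1.0 / (1.0 + (n / n50) ** s)

# The simulations suggest 10-115 odor-point encounters over 27 d of priming
for n in (0, 10, 50, 115):
    print(n, round(exponential_habituation(n), 3), round(sigmoidal_habituation(n), 3))
```

Under either curve, response probability is low by the upper end of the encounter range, which is consistent with the finding that nest-survival gains accrue regardless of curve shape.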


Subject(s)
Birds , Odorants , Animals , Humans , Mammals , New Zealand , Predatory Behavior
7.
PLoS One ; 11(6): e0158078, 2016.
Article in English | MEDLINE | ID: mdl-27341209

ABSTRACT

European rabbits (Oryctolagus cuniculus) pose a major threat to agricultural production and conservation values in several countries. In New Zealand, population control via poisoning is a frontline method for limiting rabbit damage, with large areas commonly treated using the metabolic toxin sodium fluoroacetate ('1080') delivered in bait via aerial dispersal. However, this method is expensive and the high application rates of the active ingredient cause public antipathy towards it. To guide reductions in cost and toxin usage, we evaluated the economics and efficacy of rabbit control using an experimental approach of sowing 1080-bait in strips instead of the commonly-used broadcast sowing method (i.e. complete coverage). Over a 4-year period we studied aerial delivery of 0.02% 1080 on diced carrot bait over ~3500 ha of rabbit-prone land in the North and South islands. In each case, experimental sowing via strip patterns using 10-15 kg of bait per hectare was compared with the current best practice of aerial broadcast sowing at 30-35 kg/ha. Operational kill rates exceeded 87% in all but one case and averaged 93-94% across a total of 19 treatment replicates under comparable conditions; there was no statistical difference in overall efficacy observed between the two sowing methods. We project that strip-sowing could reduce by two thirds the amount of active 1080 applied per hectare in aerial control operations against rabbits, both reducing the non-target poisoning risk and promoting cost savings to farming operations. These results indicate that, similarly to the recently-highlighted benefits of adopting strip-sowing for poison control of introduced brushtail possums (Trichosurus vulpecula) in New Zealand, aerial strip-sowing of toxic bait could also be considered a best practice method for rabbit control in pest control policy.
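The projected toxin reduction follows from straightforward arithmetic on the quoted sowing rates; taking the midpoint of each range is an assumption made for this sketch:

```python
# Active 1080 per hectare = sowing rate * toxin loading (0.02% w/w in bait)
LOADING = 0.0002  # 0.02% 1080 in diced carrot bait

broadcast_kg_per_ha = (30 + 35) / 2   # best-practice broadcast, midpoint
strip_kg_per_ha     = (10 + 15) / 2   # experimental strip-sowing, midpoint

broadcast_g = broadcast_kg_per_ha * LOADING * 1000  # grams of 1080 per ha
strip_g     = strip_kg_per_ha * LOADING * 1000

reduction = 1 - strip_g / broadcast_g
print(broadcast_g, strip_g, round(reduction, 2))  # roughly two-thirds less
```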


Subject(s)
Agriculture , Introduced Species , Pest Control , Population Control , Animals , Cost-Benefit Analysis , Geography , New Zealand , Rabbits
8.
PLoS One ; 10(3): e0121865, 2015.
Article in English | MEDLINE | ID: mdl-25811977

ABSTRACT

In New Zealand, the introduced marsupial brushtail possum (Trichosurus vulpecula) is a pest species subject to control measures, primarily to limit its ability to transmit bovine tuberculosis (TB) to livestock and for conservation protection. To better define parameters for targeted possum control and TB surveillance, we here applied a novel approach to analyzing GPS data obtained from 44 possums fitted with radio-tracking collars, producing estimates of the animals' short-term nocturnal foraging patterns based on 1-, 3- or 5-nights' contiguous data. Studies were conducted within two semi-arid montane regions of New Zealand's South Island High Country: these regions support low-density possum populations (<2 possums/ha) in which the animals' home ranges are on average larger than in high-density populations in forested habitat. Possum foraging range width (FRW) estimates increased with increasing monitoring periods, from 150-200 m based on a single night's movement data to 300-400 m based on 5 nights' data. The largest average FRW estimates were recorded in winter and spring, and the smallest in summer. The results suggest that traps or poison-bait stations (for lethal control) or monitoring devices (for TB surveillance), set for > 3 consecutive nights at 150 m interval spacings, would likely place >95% of the possums in this type of habitat at risk of encountering these devices, year-round. Modelling control efficacy against operational expenditure, based on these estimations, identified the relative cost-effectiveness of various strategies that could be applied to a typical aerial poisoning operation, to reduce the ongoing TB vectorial risk that possums pose in the High Country regions. These habitat-specific findings are likely to be more relevant than the conventional pest control and monitoring methodologies developed for possums in their more typical forested habitat.
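The >95% risk claim for devices set over multiple consecutive nights can be illustrated by compounding a nightly encounter probability. The nightly value below is an assumed figure for the sketch, not an estimate from the study, and independence between nights is a simplifying assumption:

```python
def cumulative_encounter_risk(p_nightly, nights):
    """P(at least one device encounter) over consecutive nights,
    assuming independent nightly encounter probabilities."""
    return 1 - (1 - p_nightly) ** nights

# Illustrative nightly encounter probability for a 150 m device grid
p = 0.65
for nights in (1, 3, 5):
    print(nights, round(cumulative_encounter_risk(p, nights), 3))
```

Even a moderate nightly probability compounds past 95% by the third night, which is the intuition behind the ">3 consecutive nights" recommendation.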


Subject(s)
Marsupialia/microbiology , Public Health Surveillance , Tuberculosis, Bovine/epidemiology , Tuberculosis, Bovine/prevention & control , Animals , Cattle , Cost-Benefit Analysis , Models, Statistical , New Zealand/epidemiology , Tuberculosis, Bovine/transmission
9.
Vet Res ; 45: 122, 2014 Dec 12.
Article in English | MEDLINE | ID: mdl-25496754

ABSTRACT

Controlling infectious diseases at the wildlife/livestock interface is often difficult because the ecological processes driving transmission between wildlife reservoirs and sympatric livestock populations are poorly understood. Thus, assessing how animals use their environment and how this affects interspecific interactions is an important factor in determining the local risk for disease transmission and maintenance. We used data from concurrently monitored GPS-collared domestic cattle and wild boar (Sus scrofa) to assess spatiotemporal interactions and associated implications for bovine tuberculosis (TB) transmission in a complex ecological and epidemiological system, Doñana National Park (DNP, South Spain). We found that fine-scale spatial overlap of cattle and wild boar was seasonally high in some habitats. In general, spatial interactions between the two species were highest in the marsh-shrub ecotone and at permanent water sources, whereas shrub-woodlands and seasonal grass-marshlands were areas with lower predicted relative interactions. Wild boar and cattle generally used different resources during winter and spring in DNP. Conversely, limited differences in resource selection during summer and autumn, when food and water availability were limiting, resulted in negligible spatial segregation and thus probably high encounter rates. The spatial gradient in potential overlap between the two species across DNP corresponded well with the spatial variation in the observed incidence of TB in cattle and prevalence of TB in wild boar. We suggest that the marsh-shrub ecotone and permanent water sources act as important points of TB transmission in our system, particularly during summer and autumn. Targeted management actions are suggested to reduce potential interactions between cattle and wild boar in order to prevent disease transmission and design effective control strategies.


Subject(s)
Animal Distribution , Cattle Diseases/transmission , Feeding Behavior , Swine Diseases/transmission , Tuberculosis, Bovine/transmission , Animals , Cattle/physiology , Cattle Diseases/epidemiology , Cattle Diseases/microbiology , Female , Geographic Information Systems , Incidence , Male , Prevalence , Spain/epidemiology , Swine/physiology , Swine Diseases/epidemiology , Swine Diseases/microbiology , Tuberculosis, Bovine/epidemiology , Tuberculosis, Bovine/microbiology
10.
PLoS One ; 9(7): e102982, 2014.
Article in English | MEDLINE | ID: mdl-25054199

ABSTRACT

Estimating the abundance of wild carnivores is of foremost importance for conservation and management. However, given their elusive habits, direct observations of these animals are difficult to obtain, so abundance is more commonly estimated from sign surveys or radio-marked individuals. These methods can be costly and difficult, particularly in large areas with heavy forest cover. As an alternative, recent research has suggested that wolf abundance can be estimated from occupancy-abundance curves derived from "virtual" surveys of simulated wolf track networks. Although potentially more cost-effective, the utility of this approach hinges on its robustness to violations of its assumptions. We assessed the sensitivity of the occupancy-abundance approach to four assumptions: variation in wolf movement rates, changes in pack cohesion, presence of lone wolves, and size of survey units. Our simulations showed that occupancy rates and wolf pack abundances were biased high if track surveys were conducted when wolves made long compared to short movements, wolf packs were moving as multiple hunting units as opposed to a cohesive pack, and lone wolves were moving throughout the surveyed landscape. We also found that larger survey units (400 and 576 km2) were more robust to changes in these factors than smaller survey units (36 and 144 km2). However, occupancy rates derived from large survey units rapidly reached an asymptote at 100% occupancy, suggesting that these large units are inappropriate for areas with moderate to high wolf densities (>15 wolves/1,000 km2). Virtually-derived occupancy-abundance relationships can be a useful method for monitoring wolves and other elusive wildlife if applied within certain constraints, in particular biological knowledge of the surveyed species needs to be incorporated into the design of the occupancy surveys. 
Further, we suggest that the applicability of this method could be extended by directly incorporating some of its assumptions into the modelling framework.
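The asymptote problem with large survey units can be reproduced with a toy simulation. Placing pack centroids uniformly at random, rather than simulating full track networks as the study did, is a deliberate simplification; the landscape size and pack count are invented for illustration:

```python
import random

def occupancy(n_packs, landscape_km=120, cell_km=24, seed=1):
    """Fraction of square survey cells containing >=1 simulated pack centroid."""
    random.seed(seed)
    n_cells = (landscape_km // cell_km) ** 2
    occupied = set()
    for _ in range(n_packs):
        x = random.uniform(0, landscape_km)
        y = random.uniform(0, landscape_km)
        occupied.add((int(x // cell_km), int(y // cell_km)))
    return len(occupied) / n_cells

# Occupancy climbs toward 100% as cell size grows, echoing the saturation
# reported for the large (e.g. 576 km^2) survey units.
for cell in (6, 12, 24):  # cell sides -> 36, 144, 576 km^2 units
    print(cell ** 2, round(occupancy(n_packs=25, cell_km=cell), 2))
```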


Subject(s)
Ecosystem , Models, Theoretical , Wolves , Alberta , Animals , Conservation of Natural Resources , Population Dynamics
11.
Epidemiol Infect ; 141(7): 1407-16, 2013 Jul.
Article in English | MEDLINE | ID: mdl-23433406

ABSTRACT

Sentinel species are increasingly used by disease managers to detect and monitor the prevalence of zoonotic diseases in wildlife populations. Characterizing home-range movements of sentinel hosts is thus important for developing improved disease surveillance methods, especially in systems where multiple host species co-exist. We studied ranging activity of major hosts of bovine tuberculosis (TB) in an upland habitat of New Zealand: we compared home-range coverage by ferrets (Mustela furo), wild deer (Cervus elaphus), feral pigs (Sus scrofa), brushtail possums (Trichosurus vulpecula) and free-ranging farmed cattle (Bos taurus). We also report in detail the proportional utilization of a seasonal (4-monthly) range area for the latter four species. Possums covered the smallest home range (<30 ha), ferrets covered ~100 ha, pigs ~4 km2, and deer and cattle both >30 km2. For any given weekly period, cattle, deer and pigs were shown to utilize 37-45% of their estimated 4-month range, while possums utilized 62% during any weekly period and 85% during any monthly period of their estimated 4-month range. We suggest that present means for estimating TB detection kernels, based on long-term range size estimates for possums and sentinel species, probably overstate the true local surveillance coverage per individual.


Subject(s)
Animals, Wild , Disease Reservoirs/veterinary , Homing Behavior , Sentinel Surveillance/veterinary , Tuberculosis, Bovine/prevention & control , Animals , Cattle , Deer , Ferrets , Geographic Information Systems , Mycobacterium bovis , New Zealand , Seasons , Swine , Trichosurus , Tuberculosis/veterinary
12.
Public Health Nutr ; 12(10): 1946-59, 2009 Oct.
Article in English | MEDLINE | ID: mdl-19402948

ABSTRACT

OBJECTIVE: As part of a larger study designed to understand how to protect the food and nutrition security of individuals living in a protected area of Gabon, we assessed their nutritional status and its relationship to dietary adequacy and health status. DESIGN: A 7 d food consumption survey was conducted during each of the two major seasons using a weighing method. Data were also collected on weight, height and health of individuals as well as on sociodemographic characteristics and potential determinants of the nutrition situation. SETTING: Four rural communities were intentionally selected to represent both inland and coastal settings and access to food markets. SUBJECTS: Approximately 500 individuals representing over 90% of the population of these communities participated in the survey during each season. RESULTS: Undernutrition was present in the area, particularly among children <5 years of age and the elderly. Health was generally good and under-fives were most frequently ill. Energy, Fe and vitamin A requirements of individuals were generally not satisfied; the opposite was true for protein. The estimated prevalence of inadequate intakes of energy and vitamin A was very high in most age groups. Global nutrient adequacy was associated with nutritional outcome. CONCLUSIONS: Individuals do not eat enough and breast-feeding practices are poor. Many suffer from undernutrition, particularly young children and the elderly. The results confirm the need to investigate the determinants of this poor nutrition situation to ensure that protection of natural resources will not be associated with harm to the well-being of the population.


Subject(s)
Conservation of Natural Resources , Diet/standards , Food Supply/standards , Malnutrition/epidemiology , Nutritional Status , Adolescent , Adult , Aged , Child , Child, Preschool , Diet Surveys , Gabon/epidemiology , Humans , Incidence , Infant , Middle Aged , Rural Health , Seasons , Young Adult
13.
Public Health Nutr ; 12(10): 1711-25, 2009 Oct.
Article in English | MEDLINE | ID: mdl-19063764

ABSTRACT

OBJECTIVE: To understand how access to natural resources may contribute to nutrition. DESIGN: In each of the two major seasons, data were collected during a 7 d period using observations, semi-structured interviews, anthropometric measures and a weighed food consumption survey. SETTING: Four rural communities selected to represent inland and coastal areas of the Gamba Complex in Gabon. SUBJECTS: In each community, all individuals from groups vulnerable to malnutrition, i.e. children aged 0-23 months (n 41) and 24-59 months (n 63) and the elderly (n 101), as well as women caregivers (n 96). RESULTS: In most groups, household access to natural resources was associated with household access to food but not with individual nutritional status. In children aged 0-23 months, access to care and to health services and a healthy environment were the best predictors of length-for-age (adjusted R2: 14%). Health status was the only predictor of weight-for-height in children aged 24-59 months (adjusted R2: 14%). In women caregivers, household food security was negatively associated with nutritional status, as was being younger than 20 years (adjusted R2: 16%). Among the elderly, only nutrient adequacy predicted nutritional status (adjusted R2: 5%). CONCLUSION: Improving access to care and health for young children would help reverse the process of undernutrition. Reaching a better understanding of how the access of individuals to both food and other resources relate to household access could further our appreciation of the constraints to good nutrition. This is particularly relevant in women to ensure that their possibly important contribution to the household is not at their own expense.


Subject(s)
Conservation of Natural Resources , Diet/statistics & numerical data , Food Supply , Growth , Health Services Accessibility/statistics & numerical data , Malnutrition/epidemiology , Nutritional Status , Adult , Aged , Body Size , Caregivers , Child, Preschool , Diet Surveys , Female , Gabon/epidemiology , Health Status , Humans , Infant , Infant, Newborn , Male , Prevalence , Rural Health
14.
Arch Environ Contam Toxicol ; 45(2): 216-20, 2003 Aug.
Article in English | MEDLINE | ID: mdl-14565579

ABSTRACT

A study was conducted to evaluate the impact of naled on honey bees as a result of their exposure to aerial ULV applications of this insecticide during three routine mosquito spray missions by Manatee County Mosquito Control District in Florida during the summer of 1999. Naled deposits were collected on filter paper and subsequently analyzed by gas chromatography. Mortality of adult honey bees Apis mellifera L. was estimated based on numbers from dead bee collectors placed in front of the entrance of the beehives. We found that honey bees clustering outside of the beehives were subject to naled exposure. Bee mortality increased when higher naled residues were found around the hives. The highest average naled deposit was 6,227 +/- 696 microg/m2 at the site 1 forest area following the mosquito spray mission on July 15, 1999. The range of naled deposition for this application was 2,818-7,101 microg/m2. The range of dead bees per hive was 0-39 prior to spraying and 9-200 within 24 h following this spray mission. The average yield of honey per hive was significantly lower (p < 0.05) for naled-exposed hives compared with unexposed hives. Because reduction of honey yield also may be affected by other factors, such as location of the hives relative to a food source and vigor of the queen bee, the final assessment of honey yield was complicated.


Subject(s)
Bees/growth & development , Environmental Exposure , Insecticides/poisoning , Naled/poisoning , Agriculture , Animals , Honey , Population Dynamics , Risk Assessment , Survival Analysis
15.
East Afr Med J ; 79(11): 598-603, 2002 Nov.
Article in English | MEDLINE | ID: mdl-12630494

ABSTRACT

OBJECTIVE: To evaluate the efficacy of a multiple micronutrient fortified beverage containing eleven nutrients at physiological levels in prevention of anaemia and improving iron and vitamin A status during pregnancy. DESIGN: A randomised double blind placebo controlled study. SETTING: Mpwapwa and Kongwa Districts in Dodoma Region of Tanzania. SUBJECTS: Five hundred and seventy nine pregnant women were screened for entry into the study and 439 women who met the study criteria were enrolled. INTERVENTIONS: Study participants received either a fortified (F) or non-fortified (NF) orange flavoured drink, identical in appearance, provided in two self administered servings per day for an eight week period. MAIN OUTCOME MEASURES: Comparison of haemoglobin (Hb), serum ferritin (SF) and serum retinol (SR) at baseline and follow up. RESULTS: After eight weeks of supplementation, the F group (n=129) had a significantly higher Hb increase of 0.86 g/dL compared to 0.45 g/dL in the NF group (n=130), p<0.0001. Gestational age at entry into the study moderated the effect of the fortified drink on Hb: women at earlier gestational age upon entry had a higher rise in Hb than women of late gestational age (0.8 g/dL versus 0.04 g/dL rise respectively, p=0.038, n=188). The risk of being anaemic at the end of the study for those in the F group was reduced by 51% (RR=0.49, CI=0.28 to 0.85). Iron stores (by serum ferritin levels) increased by 3 microg/L in the F group (p=0.012) and decreased by 2 microg/L in the NF group (p=0.115). The follow up ferritin concentration depended on initial ferritin level. Regardless of treatment group, serum retinol concentrations were significantly higher in mothers who had delivered. Mothers who had adequate levels at entry benefited more from the supplement than those with low levels (an increase of 0.26 micromol/L versus no significant change).
CONCLUSIONS: The multiple micronutrient-fortified beverage given for eight weeks to pregnant women improved their haemoglobin, serum ferritin and retinol status. The risk for anaemia was also significantly reduced. The important predictors of Hb increase at follow up were the fortified beverage, baseline Hb, serum retinol, baseline ferritin and gestational age at entry into study. Anthropological research showed that the beverage was highly acceptable and well liked.
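The headline risk figure is simple arithmetic on the reported relative risk, shown here alongside the between-group difference in Hb rise:

```python
# Anaemia risk at study end: relative risk -> percent risk reduction
rr = 0.49
risk_reduction_pct = (1 - rr) * 100  # the reported 51% reduction

# Between-group difference in Hb rise over the 8-week supplementation
hb_rise_fortified, hb_rise_control = 0.86, 0.45  # g/dL
hb_difference = hb_rise_fortified - hb_rise_control
print(risk_reduction_pct, round(hb_difference, 2))
```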


Subject(s)
Anemia, Iron-Deficiency/diet therapy , Beverages , Ferrous Compounds/therapeutic use , Food, Fortified , Micronutrients/therapeutic use , Pregnancy Complications, Hematologic/diet therapy , Vitamins/therapeutic use , Anemia, Iron-Deficiency/blood , Anemia, Iron-Deficiency/diagnosis , Beverages/analysis , Double-Blind Method , Female , Ferritins/blood , Ferrous Compounds/analysis , Follow-Up Studies , Food, Fortified/analysis , Gestational Age , Hemoglobins/analysis , Humans , Micronutrients/analysis , Pregnancy , Pregnancy Complications, Hematologic/blood , Pregnancy Complications, Hematologic/diagnosis , Risk Factors , Rural Health/statistics & numerical data , Tanzania , Treatment Outcome , Vitamin A/blood , Vitamins/analysis
17.
Arch Latinoam Nutr ; 51(1 Suppl 1): 37-41, 2001 Mar.
Article in English | MEDLINE | ID: mdl-11688080

ABSTRACT

Currently the three main widely used strategies to control micronutrient deficiencies are food diversification, fortification, and consumption of medicinal supplements. In Tanzania a fourth strategy has been evaluated in school children and is to be studied in pregnant and lactating women. The dietary supplement comes in the form of a powder used to prepare a fruit flavored drink. For six months, children consumed 25 grams of the powder per school day attended, added to 200 ml of water. The dietary supplement provides between 40 and 100 percent of the RDA of 10 micronutrients, including iron, vitamin A and iodine. Unlike medicinal supplements, it provides the multiple vitamins and minerals in physiologic doses, not megadoses. In a well conducted randomized double blind placebo controlled trial, this dietary supplement, given as a fortified powdered fruit drink, produced statistically significant differences not only in vitamin A and iron status, but also in the growth of young school age children.


Subject(s)
Dietary Supplements , Micronutrients , Beverages , Child , Deficiency Diseases/prevention & control , Double-Blind Method , Humans , Tanzania
18.
Thorax ; 56(9): 727-33, 2001 Sep.
Article in English | MEDLINE | ID: mdl-11514695

ABSTRACT

BACKGROUND: The sleep apnoea/hypopnoea syndrome (SAHS) is common and treatment with continuous positive airway pressure (CPAP) is effective. However, not all patients can cope with the demands of using mask positive pressure. Compliance can be improved with an intensive educational programme and patient support, but this is not practical in most centres given the large numbers of patients coming forward for treatment. Several studies have evaluated correlations between various parameters at diagnosis in order to anticipate patients' behaviour and to avoid the social and health implications of undertreated SAHS. We have evaluated the use of additional data derived during a 2 week home CPAP trial to identify factors associated with longer term CPAP use and compliance.

METHODS: Following a diagnostic study, 209 patients were offered a CPAP machine for a 2 week home trial. After completing the trial, patients were reassessed, scored their overall satisfaction with CPAP treatment on a five point scale ranging from "much worse" to "much better", and completed an Epworth score relating to the loan period. Machine run time was recorded from the integral clock. These data were added to those available at diagnosis to construct models indicative of continuing CPAP and average nightly use at 1 year.

RESULTS: 209 patients were offered the 2 week loan at least a year before June 1999 (90.9% men, mean (SD) age 51.0 (10.6) years, body mass index (BMI) 34.6 (7.7) kg/m², Epworth score 15 (IQR 11-18), apnoea/hypopnoea index (AHI) 38.1 (22.9) events/h). 153 patients (73.2%) opted to continue CPAP and 56 declined. One year later data were available for 187 patients; 128 (68.5% on an intention to treat analysis) continued to use the machine with a mean use of 5.0 (2.4) hours/night. A logistic regression model indicated that mean CPAP use during the loan period and the overall satisfaction score accurately defined continuing CPAP and "satisfactory" CPAP use at 1 year. For patients with low machine use and no symptomatic improvement during the loan period, the addition of baseline AHI, baseline Epworth score, and the Epworth score at the end of the loan to the equation identifying factors associated with "satisfactory" CPAP use (mean >2 hours/night) improved the value of the model.

CONCLUSION: Data derived from a 2 week CPAP trial are useful in identifying patients who will comply with CPAP treatment at 1 year. They can also be used to identify patients with significant symptomatic disease who will struggle with CPAP and may benefit from additional education and support. High mean hourly use and a high degree of overall satisfaction during the loan period identified patients likely to use CPAP and be compliant with it at 1 year.
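The model combines two predictors: mean nightly CPAP use during the loan and the five-point satisfaction score. The abstract does not report the fitted coefficients, so the sketch below uses made-up values purely to illustrate how such a two-predictor logistic model scores a patient (the function name and coefficients are hypothetical):

```python
import math

# Hypothetical coefficients -- illustrative only; the study's fitted
# values are not given in this abstract.
B0, B_HOURS, B_SATIS = -4.0, 0.8, 0.6

def p_continue_cpap(mean_hours_per_night: float, satisfaction: int) -> float:
    """Logistic model: probability of still using CPAP at 1 year.

    satisfaction is the five-point scale
    (1 = "much worse" ... 5 = "much better").
    """
    z = B0 + B_HOURS * mean_hours_per_night + B_SATIS * satisfaction
    return 1.0 / (1.0 + math.exp(-z))

# A patient averaging 5 h/night who feels "much better" scores far
# higher than one averaging 1 h/night who feels "much worse".
high = p_continue_cpap(5.0, 5)   # z = 3.0, probability ~0.95
low = p_continue_cpap(1.0, 1)    # z = -2.6, probability ~0.07
```

In practice the coefficients would be fitted to the cohort data by maximum likelihood; the point here is only the model form, in which both loan-period use and satisfaction push the log-odds of 1 year compliance in the same direction.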


Subject(s)
Positive-Pressure Respiration/methods , Sleep Apnea Syndromes/therapy , Adult , Female , Humans , Long-Term Care/methods , Male , Middle Aged , Patient Compliance , Patient Satisfaction , Positive-Pressure Respiration/standards
19.
Anaesthesia ; 56(3): 235-8, 2001 Mar.
Article in English | MEDLINE | ID: mdl-11251430

ABSTRACT

Non-invasive positive pressure ventilation has previously been used successfully to treat both acute and chronic ventilatory failure secondary to a number of conditions, including scoliosis. We report two patients in whom it was used, on three separate occasions, to treat acute ventilatory failure following corrective spinal surgery. Non-invasive positive pressure ventilation may be useful postoperatively in high-risk patients undergoing major spinal surgery in an attempt to prevent intubation and its attendant complications.


Subject(s)
Positive-Pressure Respiration , Postoperative Complications/therapy , Respiratory Insufficiency/therapy , Scoliosis/surgery , Adolescent , Carbon Dioxide/blood , Child , Female , Follow-Up Studies , Humans , Male , Oxygen/blood , Partial Pressure , Respiratory Insufficiency/blood , Respiratory Insufficiency/etiology
20.
J Am Mosq Control Assoc ; 17(4): 225-30, 2001 Dec.
Article in English | MEDLINE | ID: mdl-11804458

ABSTRACT

A rapid gas chromatographic method for detecting residues of the thio-organophosphate naled was developed and subsequently validated in laboratory and field studies. More than 90% of naled was recovered by a gas chromatograph equipped with a DB-5 capillary column and a thermionic specific detector. The limit of detection was 0.01 µg/ml with direct injection. The stability of naled under a variety of storage conditions was also examined. Analysis of field data showed that naled broke down rapidly in the environment but was stable when stored in hexane solvent at 4°C and 23°C for at least 7 days. Percentage matrix spike recovery ranged from 31% to 49% for filter paper samples exposed under field conditions for 14 h. A field study was also initiated that collected naled droplets trapped on 6.7 m acrylic mohair-look yarn strands, in addition to residue on filter paper, after aerial ultra-low-volume mosquito adulticide application. Spike recovery was 79% for filter paper samples and 93% for yarn samples. Average naled residue concentrations with these methods were 373 µg/m² and 11.28-73.77 µg/yarn, respectively.
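The matrix spike recovery figures quoted above are the standard ratio of the amount measured back to the known amount spiked into the matrix. A minimal sketch of that calculation (function name and example amounts are hypothetical, chosen only to reproduce the reported percentages):

```python
def percent_recovery(measured_ug: float, spiked_ug: float) -> float:
    """Matrix spike recovery: amount measured back, as a percentage
    of the known amount spiked into the sample matrix."""
    return 100.0 * measured_ug / spiked_ug

# 0.31 ug recovered from a hypothetical 1.0 ug spike gives 31%,
# the low end of the 31-49% range reported for field-exposed
# filter paper; 0.93 ug recovered gives the 93% yarn figure.
low_filter = percent_recovery(0.31, 1.0)
yarn = percent_recovery(0.93, 1.0)
```

Low recovery from field-exposed filter paper (31-49%) versus laboratory spikes (79-93%) is consistent with the abstract's observation that naled degrades rapidly in the environment.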


Subject(s)
Insecticides/analysis , Naled/analysis , Chromatography, Gas , Environment , Time Factors