Results 1 - 20 of 37
1.
Endocrinology ; 165(8)2024 Jul 01.
Article in English | MEDLINE | ID: mdl-39028678

ABSTRACT

Recognizing the limitations of current therapies for Addison's disease, novel treatments that replicate dynamic physiologic corticosteroid secretion, under control of ACTH, are required. The aim of these experiments was to evaluate the feasibility of adrenocortical cell transplantation (ACT) in a large animal model, adapting methods successfully used for intracutaneous pancreatic islet cell transplantation using a fully biodegradable temporizing matrix. Autologous porcine ACT was undertaken by bilateral adrenalectomy, cell isolation, culture, and intracutaneous injection into a skin site preprepared using a biodegradable temporizing matrix (BTM) foam. Hydrocortisone support was provided during adrenocortical cell engraftment and weaned as tolerated. Blood adrenocortical hormone concentrations were monitored, and the transplant site was examined at endpoint. Outcome measures included cellular histochemistry, systemic hormone production, and hydrocortisone independence. Transplanted adrenocortical cells survived and proliferated within the intracutaneous site and self-organized into discrete tissue organoids with features of normal adrenal histologic architecture. Interpretation of systemic hormone levels was confounded by the identification of accessory adrenals and regenerative cortical tissue within the adrenal bed postmortem. Corticosteroid support could not be completely withdrawn. ACT in a large animal model has not previously been attempted, yet it is an important step toward clinical translation. These results demonstrate the potential for ACT based on the development of adrenal organoids at the BTM site. However, the inability to achieve clinically relevant systemic hormone production suggests insufficient graft function, likely attributable to an insufficient cell mass, both in the dose delivered and in subsequent proliferation.


Subject(s)
Adrenal Cortex , Organoids , Animals , Swine , Adrenal Cortex/cytology , Adrenal Cortex/metabolism , Hydrocortisone/blood , Adrenal Glands/metabolism , Female , Cell Transplantation/methods , Adrenalectomy , Models, Animal
2.
Vaccines (Basel) ; 12(6)2024 Jun 03.
Article in English | MEDLINE | ID: mdl-38932337

ABSTRACT

Kidney transplant recipients are at an increased risk of hospitalisation and death from SARS-CoV-2 infection, and standard two-dose vaccination schedules are typically inadequate to generate protective immunity. Gut dysbiosis, which is common among kidney transplant recipients and known to affect systemic immunity, may be a contributing factor to the lack of vaccine immunogenicity in this at-risk cohort. The gut microbiota modulates vaccine responses, with the production of immunomodulatory short-chain fatty acids (SCFAs) by bacteria such as Bifidobacterium associated with heightened vaccine responses in both observational and experimental studies. As SCFA-producing populations in the gut microbiota are enhanced by diets rich in non-digestible fibre, dietary supplementation with prebiotic fibre emerges as a potential adjuvant strategy to correct dysbiosis and improve vaccine-induced immunity. In a randomised, double-blind, placebo-controlled trial of 72 kidney transplant recipients, we found dietary supplementation with prebiotic inulin for 4 weeks before and after a third SARS-CoV-2 mRNA vaccine to be feasible, tolerable, and safe. Inulin supplementation resulted in an increase in gut Bifidobacterium, as determined by 16S rRNA sequencing, but did not increase in vitro neutralisation of live SARS-CoV-2 virus at 4 weeks following a third vaccination. Dietary fibre supplementation is a feasible strategy with the potential to enhance vaccine-induced immunity and warrants further investigation.

3.
Diabetes ; 72(6): 758-768, 2023 06 01.
Article in English | MEDLINE | ID: mdl-36929171

ABSTRACT

Intrahepatic islet transplantation for type 1 diabetes is limited by the need for multiple infusions and poor islet viability posttransplantation. The development of alternative transplantation sites is necessary to improve islet survival and facilitate monitoring and retrieval. We tested a clinically proven biodegradable temporizing matrix (BTM), a polyurethane-based scaffold, to generate a well-vascularized intracutaneous "neodermis" within the skin for islet transplantation. In murine models, BTM did not impair syngeneic islet renal-subcapsular transplant viability or function, and it facilitated diabetes cure for over 150 days. Furthermore, BTM supported functional neonatal porcine islet transplants into RAG-1-/- mice for 400 days. Hence, BTM is nontoxic for islets. Two-photon intravital imaging used to map vessel growth through time identified dense vascular networks, with significant collagen deposition and increases in vessel mass up to 30 days after BTM implantation. In a preclinical porcine skin model, BTM implants created a highly vascularized intracutaneous site by day 7 postimplantation. When syngeneic neonatal porcine islets were transplanted intracutaneously, the islets remained differentiated as insulin-producing cells, maintained normal islet architecture, secreted C-peptide, and survived for over 100 days. Here, we show that BTM facilitates formation of an islet-supportive intracutaneous neodermis in a porcine preclinical model, as an alternative islet-transplant site. ARTICLE HIGHLIGHTS: Human and porcine pancreatic islets were transplanted into a fully vascularized biodegradable temporizing matrix (Novosorb) that creates a unique intracutaneous site outside of the liver in a large-animal preclinical model. The intracutaneous prevascularized site supported pancreatic islet survival for 3 months in a syngeneic porcine-transplant model. Pancreatic (human and porcine) islet survival and function were demonstrated in an intracutaneous site outside of the liver for the first time in a large-animal preclinical model.


Subject(s)
Diabetes Mellitus, Type 1 , Islets of Langerhans Transplantation , Islets of Langerhans , Swine , Humans , Animals , Mice , Islets of Langerhans Transplantation/methods , Graft Survival , Islets of Langerhans/blood supply , Diabetes Mellitus, Type 1/surgery , Collagen
4.
Cell Tissue Bank ; 24(2): 341-349, 2023 Jun.
Article in English | MEDLINE | ID: mdl-36322205

ABSTRACT

There is a need to identify additional routes of supply for ophthalmic tissue in the UK. This paper reports the findings from a national study exploring the potential for eye donation (ED) from three Hospice Care (HC) and three Hospital Palliative Care (HPC) services in England. The objectives addressed in this paper are i.) to establish the size and describe the clinical characteristics of the potential eye donor population across six clinical sites; ii.) to identify challenges for clinicians in applying the standard ED criteria for assessing patient eligibility. A retrospective assessment of 1199 deceased patient case notes (601 from Hospice Care and 598 from Hospital Palliative Care services) was undertaken against current eye donation criteria. Clinicians' assessments were then evaluated against the same criteria by specialists based at the National Health Service Blood and Transplant Tissue Services division (NHSBT-TS). Results of the assessment and evaluation are reported as descriptive statistics (numerical data). Free-text comment boxes facilitated clarification and/or justification of review and evaluation decisions. 46% (n = 553) of the 1199 deceased patients' notes were agreed as eligible for eye donation (Hospice Care settings = 56% (n = 337); Hospital Palliative Care settings = 36% (n = 216)). For all eligible cases (n = 553), the option of ED was recorded as being raised with family members in only 14 cases (3%). Significant potential exists for eye donation from the clinical sites in this study. This potential is not currently being realised.


Subject(s)
Eye , Hospice Care , Hospices , Tissue and Organ Procurement , Humans , England , Palliative Care/methods , Retrospective Studies , State Medicine
5.
Lancet Oncol ; 23(8): 1078-1086, 2022 08.
Article in English | MEDLINE | ID: mdl-35809595

ABSTRACT

BACKGROUND: Most kidney transplant recipients with cancer stop or reduce immunosuppressive therapy before starting treatment with an immune checkpoint inhibitor, and approximately 40% of such patients will develop allograft rejection. Isolated immunosuppression reduction might be associated with organ rejection. Whether immunosuppression manipulation, immune checkpoint inhibition, or both, induce organ rejection is difficult to ascertain. The aim of this study was to examine the risk of allograft rejection with immune checkpoint inhibitor exposure when baseline immunosuppression was left unchanged. METHODS: We conducted a multicentre, single-arm, phase 1 study in three hospitals in Australia. Kidney transplant recipients aged 18 years or older with incurable, locally advanced cancer or defined metastatic solid tumours were eligible if they had a creatinine concentration of less than 180 µmol/L, no or low concentrations of donor-specific HLA antibodies, and an Eastern Cooperative Oncology Group status of 0-2. Patients received standard doses of nivolumab (3 mg/kg intravenously every 14 days for five cycles, then 480 mg every 28 days for up to 2 years). The primary endpoint was the proportion of patients with irretrievable allograft rejection and no evidence of tumour response. Primary outcome analyses and safety analyses were done in the modified intention-to-treat population. This trial is registered with the Australian New Zealand Clinical Trials Registry, ACTRN12617000741381, and is completed. FINDINGS: Between May 31, 2017, and Aug 6, 2021, 22 kidney transplant recipients with various solid tumours were screened and enrolled, four of whom chose not to proceed in the study and one of whom had unexpected disease progression. 17 patients (six [35%] women and 11 [65%] men; median age 67 years [IQR 59-71]) were allocated treatment with nivolumab and were included in the analyses. The trial was then stopped due to ongoing difficulties with running clinical trials during COVID-19 health restrictions. Patients were treated with a median of three infusions (IQR 2-10) and median follow-up was 28 months (IQR 16-34). No patients had irretrievable allograft rejection without evidence of tumour response. There were no treatment-related deaths or treatment-related serious adverse events. The most common grade 3 or grade 4 adverse events were decreased lymphocyte count in four (24%) patients, fever or infection in four (24%) patients, decreased haemoglobin in three (18%) patients, and increased creatinine in three (18%) patients. INTERPRETATION: Maintaining baseline immunosuppression before treatment with an immune checkpoint inhibitor in kidney transplant recipients might not affect expected efficacy and might reduce the risk of allograft rejection mediated by immune checkpoint inhibitors. FUNDING: Bristol Myers Squibb.


Subject(s)
COVID-19 , Kidney Transplantation , Aged , Antineoplastic Combined Chemotherapy Protocols/therapeutic use , Australia , Creatinine , Female , Humans , Immune Checkpoint Inhibitors/adverse effects , Kidney Transplantation/adverse effects , Male , Nivolumab
6.
JMIR Aging ; 5(2): e33714, 2022 May 05.
Article in English | MEDLINE | ID: mdl-35511248

ABSTRACT

BACKGROUND: Many older adults prefer to remain in their own homes for as long as possible. However, there are still questions surrounding how best to ensure that an individual can cope with autonomous living. Technological monitoring systems are an attractive solution; however, there is disagreement regarding which activities of daily living (ADL) should be monitored and the optimal technologies for monitoring them. OBJECTIVE: This study aimed to understand older adults' perceptions of important ADL and the types of technologies they would be willing to use within their own homes. METHODS: Semistructured interviews were conducted online with 32 UK adults, divided equally into a younger group (aged 55-69 years) and an older group (aged ≥70 years). RESULTS: Both groups agreed that ADL related to personal hygiene and feeding were the most important and highlighted the value of socializing. The older group considered several activities to be more important than their younger counterparts did, including stair use and foot care. The older group had less existing knowledge of monitoring technology but was more willing to accept wearable sensors than the younger group. The younger group preferred sensors placed within the home but highlighted that they would not have them until they felt that daily life was becoming a struggle. CONCLUSIONS: Overall, technological monitoring systems were perceived as an acceptable method for monitoring ADL. However, developers and carers must be aware that individuals may express differences in their willingness to engage with certain types of technology depending on their age and circumstances.

8.
Article in English | MEDLINE | ID: mdl-33379319

ABSTRACT

The use of technology has been suggested as a means of allowing continued autonomous living for older adults, while reducing the burden on caregivers and aiding decision-making relating to healthcare. However, more clarity is needed regarding the Activities of Daily Living (ADL) recognised, and the types of technology included, within current monitoring approaches. This review aims to identify these differences and highlight the current gaps in these systems. A scoping review was conducted in accordance with PRISMA-ScR, drawing on PubMed, Scopus, and Google Scholar. Articles and commercially available systems were selected if they focused on ADL recognition of older adults within their home environment. Thirty-nine ADL recognition systems were identified, nine of which were commercially available. One system incorporated both environmental and wearable technology, two used only wearable technology, and 34 used only environmental technologies. Overall, 14 ADL were identified, but there was variation in the specific ADL recognised by each system. Although the use of technology to monitor ADL of older adults is becoming more prevalent, there is large variation in the ADL recognised, how ADL are defined, and the types of technology used within monitoring systems. Key stakeholders, such as older adults and healthcare workers, should be consulted in future work to ensure that future developments are functional and usable.


Subject(s)
Activities of Daily Living , Wearable Electronic Devices , Aged , Humans , Independent Living , Technology
9.
MMWR Morb Mortal Wkly Rep ; 69(3): 67-71, 2020 Jan 24.
Article in English | MEDLINE | ID: mdl-31971935

ABSTRACT

Zika virus infection during pregnancy can cause congenital brain and eye abnormalities and is associated with neurodevelopmental abnormalities (1-3). In areas of the United States that experienced local Zika virus transmission, the prevalence of birth defects potentially related to Zika virus infection during pregnancy increased in the second half of 2016 compared with the first half (4). To update the previous report, CDC analyzed population-based surveillance data from 22 states and territories to estimate the prevalence of birth defects potentially related to Zika virus infection, regardless of laboratory evidence of or exposure to Zika virus, among pregnancies completed during January 1, 2016-June 30, 2017. Jurisdictions were categorized as those 1) with widespread local transmission of Zika virus; 2) with limited local transmission of Zika virus; and 3) without local transmission of Zika virus. Among 2,004,630 live births, 3,359 infants and fetuses with birth defects potentially related to Zika virus infection during pregnancy were identified (1.7 per 1,000 live births, 95% confidence interval [CI] = 1.6-1.7). In areas with widespread local Zika virus transmission, the prevalence of birth defects potentially related to Zika virus infection during pregnancy was significantly higher during the quarters comprising July 2016-March 2017 (July-September 2016 = 3.0; October-December 2016 = 4.0; and January-March 2017 = 5.6 per 1,000 live births) compared with the reference period (January-March 2016) (1.3 per 1,000). These findings suggest a fourfold increase (prevalence ratio [PR] = 4.1, 95% CI = 2.1-8.4) in birth defects potentially related to Zika virus in widespread local transmission areas during January-March 2017 compared with that during January-March 2016, with the highest prevalence (7.0 per 1,000 live births) in February 2017. Population-based birth defects surveillance is critical for identifying infants and fetuses with birth defects potentially related to Zika virus regardless of whether Zika virus testing was conducted, especially given the high prevalence of asymptomatic disease. These data can be used to inform follow-up care and services as well as strengthen surveillance.
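The headline rates above are simple count arithmetic. Below is a minimal Python sketch of the prevalence and prevalence-ratio calculations; the overall counts are taken from the abstract, while the subgroup counts in the PR example are hypothetical placeholders chosen only to illustrate the method.

```python
import math

def prevalence_per_1000(cases: int, live_births: int) -> float:
    """Prevalence of birth defects per 1,000 live births."""
    return 1000 * cases / live_births

def prevalence_ratio(a: int, n1: int, b: int, n2: int, z: float = 1.96):
    """Prevalence ratio of group 1 vs group 2, with a log-normal 95% CI."""
    pr = (a / n1) / (b / n2)
    se = math.sqrt(1 / a - 1 / n1 + 1 / b - 1 / n2)
    return pr, (pr * math.exp(-z * se), pr * math.exp(z * se))

# Overall figure from the abstract: 3,359 cases among 2,004,630 live births.
print(f"{prevalence_per_1000(3359, 2004630):.1f} per 1,000")  # -> 1.7 per 1,000

# Hypothetical subgroup counts, used only to demonstrate the PR calculation.
pr, (lo, hi) = prevalence_ratio(a=28, n1=5000, b=13, n2=9500)
print(f"PR = {pr:.1f} (95% CI {lo:.1f}-{hi:.1f})")            # -> PR = 4.1
```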


Subject(s)
Congenital Abnormalities/epidemiology , Congenital Abnormalities/virology , Population Surveillance , Pregnancy Complications, Infectious/virology , Zika Virus Infection/complications , Female , Humans , Infant , Infant, Newborn , Male , Pregnancy , Prevalence , Puerto Rico/epidemiology , United States/epidemiology , United States Virgin Islands/epidemiology
10.
BMC Public Health ; 18(1): 753, 2018 06 18.
Article in English | MEDLINE | ID: mdl-29914455

ABSTRACT

BACKGROUND: Screen-time and unhealthy dietary behaviours are highly pervasive in young children and evidence suggests that these behaviours often co-occur and are associated. Identifying clusters of unhealthy behaviours, and their influences early in childhood, can assist in the development of targeted preventive interventions. The purpose of this study was to examine the sociodemographic, behavioural, and home physical environmental correlates of co-occurring screen-time and unhealthy eating behaviours and to assess the clustering of screen-time and unhealthy dietary behaviours in young children. METHODS: Parents of 126 children from the UK, aged 5-6 years (49% boys), completed a questionnaire which assessed their child's screen-time (ST), fruit and vegetable (FV), and energy-dense (ED) snack consumption. Categories of health behaviours were created based on frequencies of children meeting recommendations for FV and ST and median splits of frequencies for ED snacks. Parents reported on their own behaviours (ST, FV, and ED snack consumption), how often they ate meals and watched TV with their child, and on the availability and accessibility of foods within the home. An observed over expected ratio (O/E) was used to assess behavioural clustering. Multivariable multinomial logistic regression was used to examine correlates of behaviour patterns. RESULTS: Approximately 25% of children had two or three health risk behaviours. Correlates consistently associated with clusters included parental income, eating meals at the TV, parental ST and ED snack food consumption, and home availability of ED snack foods. Observed over expected ratios were close to 1 and ranged from 0.78 to 1.43. The three-risk behaviour combination of insufficient FV consumption, high ED snack consumption, and excessive ST occurred more frequently than expected (O/E = 1.23; 95% CI 0.89, 1.58). CONCLUSIONS: ST and unhealthy dietary behaviours cluster in children as young as 5 years of age, and parents' own behaviours appear to be important influencing factors. Further research into the development of behavioural clustering in young children is needed to identify and understand the mechanisms underlying the synergy among health behaviours. Feasibility interventions promoting reductions in both screen-time and unhealthy dietary behaviours, while simultaneously focusing on changing parental behaviours, are warranted.
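The observed-over-expected (O/E) ratio used in this study compares how often a combination of behaviours is actually seen against how often it would be seen if the behaviours occurred independently. A minimal sketch follows, with simulated 0/1 indicators standing in for the questionnaire data; values above 1 indicate that a combination co-occurs more often than chance.

```python
import numpy as np

def observed_over_expected(risks: np.ndarray, pattern) -> float:
    """O/E ratio for one combination of binary risk behaviours.

    risks:   (n_children, n_behaviours) 0/1 array, e.g. columns for
             insufficient FV, high ED snack consumption, and excessive ST.
    pattern: combination of interest, e.g. (1, 1, 1) = all three risks.
    """
    pattern = np.asarray(pattern)
    observed = np.mean(np.all(risks == pattern, axis=1))
    p = risks.mean(axis=0)  # marginal prevalence of each behaviour
    expected = np.prod(np.where(pattern == 1, p, 1 - p))
    return observed / expected

# Simulated data standing in for the 126 children in the study.
rng = np.random.default_rng(42)
risks = (rng.random((126, 3)) < [0.4, 0.5, 0.6]).astype(int)
print(round(observed_over_expected(risks, (1, 1, 1)), 2))
```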


Subject(s)
Child Behavior/psychology , Diet/psychology , Feeding Behavior/psychology , Screen Time , Child , Child, Preschool , Cluster Analysis , Diet/statistics & numerical data , Energy Intake , Female , Fruit , Habits , Health Behavior , Humans , Male , Meals , Parents/psychology , Risk Factors , Snacks/psychology , Surveys and Questionnaires , United Kingdom , Vegetables
11.
J Endocr Soc ; 1(3): 202-210, 2017 Mar 01.
Article in English | MEDLINE | ID: mdl-29264477

ABSTRACT

Corticosteroid-binding globulin (CBG) is secreted as high-affinity CBG (haCBG), which may be cleaved by tissue proteases to low-affinity CBG (laCBG), releasing free cortisol. Pregnancy and the estrogen-based combined oral contraceptive pill (COCP) increase CBG concentrations twofold to threefold. The relative effects of these two hyperestrogenic states on the CBG affinity forms are unknown. We performed an observational study in 30 pregnant women, 27 COCP takers, and 23 controls. We analyzed circulating total CBG, haCBG, laCBG, and free and total cortisol concentrations. In pregnancy, total CBG and haCBG were increased compared to controls (both P < 0.0001); however, laCBG concentrations were similar. In COCP takers, total CBG and haCBG were likewise increased compared to controls (both P < 0.0001), but laCBG was also increased (P = 0.03). Pregnancy and use of the COCP were associated with a comparable rise in haCBG, but laCBG was lower in pregnancy (P < 0.0001). These results are consistent with an estrogen-mediated increase in CBG synthesis in both hyperestrogenemic states but with reduced CBG cleavage in pregnancy relative to the COCP, perhaps due to pregnancy-induced CBG glycosylation. Speculatively, increased circulating haCBG concentrations in pregnancy may provide an increased reservoir of CBG-bound cortisol to prepare for the risk of puerperal infection, or allow for cortisol binding in the face of competition from increased circulating progesterone concentrations.

12.
Hum Gene Ther Clin Dev ; 28(4): 178-186, 2017 12.
Article in English | MEDLINE | ID: mdl-29130351

ABSTRACT

Over a 10-year period, the Gene Therapy Resource Program (GTRP) of the National Heart Lung and Blood Institute has provided a set of core services to investigators to facilitate the clinical translation of gene therapy. These services have included a preclinical (research-grade) vector production core; current Good Manufacturing Practice clinical-grade vector cores for recombinant adeno-associated virus and lentivirus vectors; a pharmacology and toxicology core; and a coordinating center to manage program logistics and to provide regulatory and financial support to early-phase clinical trials. In addition, the GTRP has utilized a Steering Committee and a Scientific Review Board to guide overall progress and effectiveness and to evaluate individual proposals. These resources have been deployed to assist 82 investigators with 172 approved service proposals. These efforts have assisted in clinical trial implementation across a wide range of genetic, cardiac, pulmonary, and blood diseases. Program outcomes and potential future directions of the program are discussed.


Subject(s)
Dependovirus/genetics , Genetic Therapy/trends , Lentivirus/genetics , Translational Research, Biomedical/trends , Anniversaries and Special Events , Genetic Vectors , Humans , National Heart, Lung, and Blood Institute (U.S.) , United States
14.
BMC Public Health ; 17(1): 533, 2017 05 31.
Article in English | MEDLINE | ID: mdl-28569188

ABSTRACT

BACKGROUND: Screen-time and eating behaviours are associated in adolescents, but few studies have examined the clustering of these health behaviours in this age group. The identification of clustered health behaviours, and the influences on them, at the time when they are most likely to become habitual, is important for intervention design. The purpose of this study was to assess the prevalence and clustering of health behaviours in adolescents, and to examine the sociodemographic, individual, behavioural, and home social and physical environmental correlates of clustered health behaviours. METHODS: Adolescents aged 11-12 years (n = 527, 48% boys) completed a questionnaire during class-time which assessed screen-time (ST), fruit and vegetable (FV), and energy-dense (ED) snack consumption using a Food Frequency Questionnaire. Health behaviours were categorised into high and low frequencies based on recommendations for FV and ST and median splits for ED snacks. Adolescents reported on their habits, self-efficacy, eating at the television (TV), eating and watching TV together with parents, restrictive parenting practices, and the availability and accessibility of foods within the home. Behavioural clustering was assessed using an observed over expected ratio (O/E). Correlates of clustered behaviours were examined using multivariate multinomial logistic regression. RESULTS: Approximately 70% reported having two or three health risk behaviours. Overall, O/E ratios were close to 1, indicating that most behaviour combinations occurred at roughly the frequency expected by chance. The three-risk behaviour combination of low FV, high ED, and high ST occurred more frequently than expected (O/E = 1.06; 95% CI 1.01, 1.15). Individual, behavioural, and social and physical home environmental correlates were differentially associated with behavioural clusters. Correlates consistently associated with clusters included eating ED snacks while watching TV, eating at the TV with parents, and the availability and accessibility of ED snack foods within the home. CONCLUSIONS: There is a high prevalence of screen time and unhealthy eating in this age group, and screen time is coupled with unhealthy dietary behaviours. Strategies and policies are required that simultaneously address reductions in screen time and changes to habitual dietary patterns, such as TV snacking and snack availability and accessibility. These may require a combination of individual, social, and environmental changes alongside conscious and more automatic (nudging) strategies.
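The multinomial model described in the methods regresses cluster membership on candidate correlates. A minimal statsmodels sketch with simulated data is shown below; all variable names and codings are illustrative assumptions, not the study's actual measures.

```python
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(7)
n = 527  # matches the study's sample size; the data themselves are simulated
df = pd.DataFrame({
    "n_risks": rng.integers(0, 4, n),      # 0-3 risk behaviours per adolescent
    "tv_snacking": rng.integers(0, 2, n),  # eats ED snacks while watching TV
    "snack_access": rng.random(n),         # home ED snack accessibility score
})
# Outcome categories: 0-1 risks (reference), 2 risks, 3 risks.
df["cluster"] = pd.cut(df["n_risks"], bins=[-1, 1, 2, 3], labels=False)

X = sm.add_constant(df[["tv_snacking", "snack_access"]])
fit = sm.MNLogit(df["cluster"], X).fit(disp=False)
print(np.exp(fit.params))  # relative-risk ratios per outcome category
```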


Subject(s)
Adolescent Behavior/psychology , Feeding Behavior/psychology , Health Behavior , Parents/psychology , Snacks/psychology , Television/statistics & numerical data , Adolescent , Attitude to Health , Child , Cluster Analysis , Cross-Sectional Studies , Female , Humans , Male , Prevalence , Self Efficacy , Surveys and Questionnaires , United Kingdom/epidemiology
15.
Appetite ; 112: 35-43, 2017 05 01.
Article in English | MEDLINE | ID: mdl-28062200

ABSTRACT

This study aimed to examine individual, behavioural, and home environmental factors associated with frequency of consumption of fruit, vegetables, and energy-dense snacks among adolescents. Adolescents aged 11-12 years (n = 521, 48% boys) completed a paper-based questionnaire during class-time which included a Food Frequency Questionnaire assessing their consumption of fruit, vegetables, and energy-dense (ED) snacks, and items assessing habits, self-efficacy, eating at the television (TV), eating with parents, parenting practices, and home availability and accessibility of foods. Multiple linear regression analyses showed that eating fruit and vegetables while watching TV and home availability and accessibility of fruit and vegetables were positively associated with frequency of fruit consumption and vegetable consumption, while home accessibility of ED snack foods was negatively associated with frequency of fruit consumption. Habit for eating ED snack foods in front of the TV, eating ED snack foods while watching TV, and home availability of ED snacks were positively associated with frequency of ED snack consumption. This study highlights the importance of a healthy home environment for promoting fruit and vegetable intake in early adolescence and also suggests that, where snacking while viewing TV occurs, this could be a good opportunity for promoting fruit and vegetable intake. These findings are likely to be useful for supporting the development of multi-faceted interventions and for advising parents on how to help their young adolescents develop and maintain healthy eating habits.


Subject(s)
Child Behavior , Diet , Feeding Behavior , Food Supply , Parents , Snacks , Television , Child , Cross-Sectional Studies , Diet Surveys , Eating , Energy Intake , Environment , Family , Female , Fruit , Habits , Humans , Male , Parenting , Social Environment , Vegetables
16.
Clin Chem Lab Med ; 55(8): 1135-1141, 2017 Jul 26.
Article in English | MEDLINE | ID: mdl-28076307

ABSTRACT

BACKGROUND: Current commercial tubes have difficulties in producing "true" serum from all blood samples, even within the recommended clotting times. Hence, Becton Dickinson (BD) and now Greiner have produced tubes containing thrombin as the procoagulant to reduce the clotting time and increase the possibility of producing serum from anticoagulated blood samples. METHODS: The Greiner Bio-One BCA Fast Clot (GBBCAFC) tube was evaluated in a hospital environment using 40 participants (30 healthy and 10 undergoing renal dialysis) for 32 analytes against the Greiner lithium heparin tube and the BD Rapid Serum Tube (BD RST), measured on Beckman DxC 800 and DxI 800 analyzers. Clotting strength was also examined using thromboelastography (TEG). RESULTS: The analyte results showed very close agreement between the BD RST tube and the GBBCAFC tube in comparison with lithium heparin plasma. The result comparison data showed equivalent performance with lower levels of hemolysis. The prolonged storage study also showed very similar agreement between the BD RST and GBBCAFC tubes. Likewise, the TEG data showed very little difference in clotting ability between the tubes, and neither was capable of producing true serum from blood spiked with 2 U of heparin per mL of blood. CONCLUSIONS: The study showed that the GBBCAFC tube, with its combination of two procoagulants (a clot activator and thrombin), performed comparably to both lithium heparin plasma and BD RST serum samples.
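Tube-comparison studies of this kind typically summarise agreement between paired results with a mean bias and limits of agreement. A generic Bland-Altman-style sketch with simulated paired analyte values is given below; it illustrates the idea only and is not the study's actual analysis.

```python
import numpy as np

def bland_altman(a: np.ndarray, b: np.ndarray):
    """Mean bias and 95% limits of agreement between two paired methods."""
    diff = a - b
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return bias, (bias - half_width, bias + half_width)

# Simulated paired potassium results (mmol/L) for 40 participants,
# standing in for GBBCAFC-tube vs lithium-heparin-plasma measurements.
rng = np.random.default_rng(5)
plasma = rng.normal(4.2, 0.4, 40)
tube = plasma + rng.normal(0.02, 0.05, 40)  # small additive bias plus noise
bias, (lo, hi) = bland_altman(tube, plasma)
print(f"bias = {bias:+.3f} mmol/L, LoA = ({lo:+.3f}, {hi:+.3f})")
```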


Subject(s)
Blood Coagulation , Blood Specimen Collection/instrumentation , Serum , Humans , Thrombelastography , Time Factors
17.
Prev Med ; 94: 40-47, 2017 01.
Article in English | MEDLINE | ID: mdl-27856338

ABSTRACT

The transition from primary/middle school to secondary/high school is likely to be a key period in children's development, characterised by significant changes in their social and physical environment. However, little is known about the changes in sedentary behaviour that accompany this transition. This review aimed to identify, critically appraise, and summarise the evidence on changes in sedentary behaviour across the primary-secondary school transition. Published English-language studies were located from computerised and manual searches in 2015. Inclusion criteria specified a longitudinal design, a baseline assessment when children were in primary/middle school with at least one follow-up during secondary/high school, and a measure of sedentary behaviour at both (or all) points of assessment. Based on data from 11 articles (19 independent samples), tracking coefficients were typically in the range of 0.3 to 0.5 and relatively consistent across the different sedentary behaviours examined and durations of follow-up. Both screen-based sedentary behaviour and overall sedentary time increased during the school transition. Overall, there was an increase of approximately 10-20 minutes per day per year in accelerometer-assessed sedentary time. Consistent with the broader age-related changes in behaviour observed during this period, sedentary behaviour increases during the transition from primary/middle to secondary/high school. Investigating features of the social and physical environment that might exacerbate or attenuate this trend would be a valuable next step.
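A tracking coefficient in this context is the correlation between an individual's sedentary time before and after the school transition; higher values mean children retain their rank order. A minimal illustration with simulated accelerometer minutes follows; the numbers are invented for the example and tuned only to land in the 0.3-0.5 range reported above.

```python
import numpy as np
from scipy.stats import spearmanr

rng = np.random.default_rng(3)
n = 200
baseline = rng.normal(420, 60, n)                     # min/day, primary school
follow_up = 0.45 * baseline + rng.normal(250, 55, n)  # min/day, secondary school

rho, p = spearmanr(baseline, follow_up)
print(f"tracking coefficient = {rho:.2f} (p = {p:.3g})")
```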


Subject(s)
Environment , Schools , Sedentary Behavior , Humans , Longitudinal Studies , Television/statistics & numerical data , Video Games/statistics & numerical data
18.
J Strength Cond Res ; 30(11): 3098-3106, 2016 Nov.
Article in English | MEDLINE | ID: mdl-27028155

ABSTRACT

Johnston, MJ, Cook, CJ, Drake, D, Costley, L, Johnston, JP, and Kilduff, LP. The neuromuscular, biochemical, and endocrine responses to a single-session vs. double-session training day in elite athletes. J Strength Cond Res 30(11): 3098-3106, 2016-The aim of this study was to compare the acute neuromuscular, biochemical, and endocrine responses of a training day consisting of a speed session only with performing a speed-and-weights training session on the same day. Fifteen men who were academy-level rugby players completed 2 protocols in a randomized order. The speed-only protocol involved performing 6 maximal-effort repetitions of 50-m running sprints with 5 minutes of recovery between each sprint, whereas the speed-and-weights protocol involved the same sprinting session followed 2 hours later by a lower-body weights session consisting of 4 sets of 5 back squats and Romanian deadlifts at 85% of one repetition maximum. Testosterone, cortisol, creatine kinase, lactate, and perceived muscle soreness were determined immediately before, immediately after, 2 hours after, and 24 hours after both protocols. Peak power, relative peak power, jump height, and average rate of force development were determined from a countermovement jump (CMJ) at the same time points. After 24 hours, muscle soreness was significantly higher after the speed-and-weights protocol compared with the speed-only protocol (effect size η = 0.253, F = 4.750, p ≤ 0.05). There was no significant difference between any of the CMJ variables at any of the posttraining time points. Likewise, creatine kinase, testosterone, and cortisol were unaffected by the addition of a weight-training session. These data indicate that the addition of a weight-training session 2 hours after a speed session, while increasing the perception of fatigue the next day, does not result in a difference in endocrine response or neuromuscular capability.
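Two of the countermovement-jump variables above can be derived directly from force-plate output. The sketch below shows jump height from flight time and an average rate of force development (RFD); the sampling rate, force trace, and index choices are assumptions for the example, not the study's protocol.

```python
import numpy as np

G = 9.81  # gravitational acceleration, m/s^2

def jump_height_from_flight_time(flight_time_s: float) -> float:
    """Jump height (m) from flight time: h = g * t^2 / 8."""
    return G * flight_time_s**2 / 8

def average_rfd(force_n: np.ndarray, hz: int, onset: int, peak: int) -> float:
    """Average rate of force development (N/s) from force onset to peak force."""
    dt = (peak - onset) / hz
    return (force_n[peak] - force_n[onset]) / dt

print(f"{jump_height_from_flight_time(0.52):.2f} m")  # ~0.33 m for 0.52 s flight

# Illustrative force trace sampled at 1,000 Hz, ramping from onset to peak.
force = np.linspace(800, 2000, 150)
print(f"{average_rfd(force, hz=1000, onset=0, peak=149):.0f} N/s")
```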


Subject(s)
Athletes , Physical Conditioning, Human/methods , Athletic Performance/physiology , Creatine Kinase/blood , Exercise Test , Football/physiology , Humans , Hydrocortisone/blood , Lactic Acid/blood , Male , Muscle Strength/physiology , Myalgia/physiopathology , Random Allocation , Testosterone/blood , Young Adult