Publication of abstracts submitted to the annual meeting of the Pediatric Orthopaedic Society of North America: is there a difference between accepted versus rejected abstracts?
The purpose of this study was to determine publication rates of all abstracts submitted for presentation at the annual conference of the Pediatric Orthopaedic Society of North America (POSNA), comparing papers accepted for presentation with those that were not accepted, and to determine the median times to publication and the mean impact factor of journals that published papers from the 2 groups. The titles and authors of all abstracts submitted for presentation to the POSNA for the years 2003 to 2005 were identified. To determine publication status, we conducted a computerized PubMed search using the first author's name. If multiple publications were identified, the Boolean search operator AND was used to combine author names with key words. The title of each located published article was compared with the title of the abstract. If differences were noted, the abstract content was compared with the final publication. Each journal's impact factor was determined using the Journal Citation Reports. The median time from conference presentation to publication was determined using a Kaplan-Meier survival analysis. Of 1191 abstracts submitted to the annual meetings of POSNA from 2003 through 2005, 440 (37%) were accepted for presentation. Acceptance of submitted abstracts increased from 30% in 2003 to 40% in 2005. Of the 1191 abstracts, 599 (50%) had been published by August 2009. The publication rate for abstracts accepted for presentation was 58.9% (259 of 440) compared with 45% (339 of 751) for rejected abstracts. The median time to publication of accepted abstracts was not significantly different from that of rejected abstracts. The mean journal impact factor for accepted abstracts was 2.2 compared with 1.5 for rejected abstracts.
The publication rate of abstracts submitted to POSNA is high compared with those of other international orthopaedic associations. The publication rates for accepted and rejected abstracts have increased substantially, from 45% and 38% in 1991 to 1994 to 58.9% and 45% in 2003 to 2005, respectively. The journal in which most of the abstracts were ultimately published was the Journal of Pediatric Orthopaedics.
closed_qa
Do stable patients with a premorbid depression history have a worse outcome after deep brain stimulation for Parkinson disease?
Deep brain stimulation (DBS) has been associated with mood sequelae in a subset of patients operated on in either the subthalamic nucleus or the globus pallidus internus for the treatment of Parkinson disease. To compare mood and motor outcomes in those with and without a presurgical history of depression. Unilateral subthalamic nucleus or unilateral globus pallidus internus DBS patients followed up for a minimum of 6 months were included. All patients underwent a comprehensive outpatient psychiatric evaluation by a board-certified psychiatrist. Psychiatric diagnoses were based on Diagnostic and Statistical Manual, fourth edition, text revision, nomenclature (American Psychiatric Association, 2000). Motor and mood outcomes were compared. A total of 110 patients were included. There were no significant differences in baseline variables between the 2 groups. Those with a preoperative history of depression had significantly higher Beck Depression Inventory scores than the nondepression group after DBS (8.97 ± 7.55 vs 5.92 ± 5.71; P = .04). Patients with a depression history had less improvement (11.6%) in pre/post-DBS change when Unified Parkinson Disease Rating Scale motor scores were compared (P = .03) after adjustment for stimulation site and baseline demographic and clinical variables. Patients with a higher levodopa equivalent dose had a worse clinical motor outcome.
Patients with a preoperative depression history had higher Beck Depression Inventory scores after DBS and significantly less (albeit small) improvement in pre/post-DBS change in Unified Parkinson Disease Rating Scale motor scores than patients without a history of depression.
closed_qa
Does total mesorectal excision require a learning curve?
The procedure of total mesorectal excision (TME) is the gold standard in the treatment of rectal cancer. However, quality control of TME is still under debate. The present study was conducted to determine whether TME requires a learning curve to allow the surgeon to acquire the necessary technical expertise. We performed a retrospective review of patients with rectal cancer who underwent TME with curative intent between August 1998 and December 2003; 195 consecutive patients were enrolled. From the first patient of the cohort, the first 50 patients were categorized into group 1, the next 50 into group 2, the next 50 into group 3, and the final 45 patients into group 4. Local recurrence rates were compared among the four groups. No significant difference in clinicopathological features was observed among the four groups, except for age, operative time, and grade of mesorectum. The local recurrence (LR) rate decreased from 22.3% in the inadequate TME group (G1) to 9.1% in the adequate TME group (G2-4) (p=0.035). In multivariate analysis, regional lymph node metastasis, mesorectal grade (incomplete or nearly complete), and early period of the learning curve were independent predictors of local recurrence.
Our results suggest that a learning curve is necessary for the development of technical expertise in the performance of TME for treatment of rectal cancer.
closed_qa
Precontoured plating of clavicle fractures: decreased hardware-related complications?
Operative treatment of displaced midshaft clavicle fractures reportedly decreases the risk of symptomatic malunion, nonunion, and residual shoulder disability. Plating these fractures, however, may trade these complications for hardware-related problems. Low-profile anatomically precontoured plates may reduce the rates of plate prominence and hardware removal. We compared the outcomes after precontoured and noncontoured superior plating of acute displaced midshaft clavicle fractures. Primary outcomes were the rates of plate prominence, hardware removal, and complications. Secondary outcomes were ROM and pain and function scores. We retrospectively reviewed 52 patients with 52 acute, displaced midshaft clavicle fractures treated with either noncontoured or precontoured superior clavicle plate fixation. Fourteen patients with noncontoured plates and 28 with precontoured plates were available for follow-up at a minimum of 1 year postoperatively. Postoperative assessment included ROM, radiographs, and subjective scores including the visual analog scale for pain, the American Shoulder and Elbow Surgeons questionnaire, and the Simple Shoulder Test. Nine of 14 patients in the noncontoured group and nine of 28 in the precontoured group complained of prominent hardware. Hardware removal rates were three of 14 in the noncontoured group and three of 28 in the precontoured group. Postoperative ROM and subjective scores were similar in the two groups.
Precontoured plating versus noncontoured plating of displaced midshaft clavicle fractures results in a lower rate of plate prominence in patients who do not undergo hardware removal.
closed_qa
Socio-economic inequities in children's injury rates: has the gradient changed over time?
Changing socio-economic gradients in adult health over time have been documented, but little research has investigated temporal changes in child health gradients. Childhood hospitalizations for injury have fallen over the last two decades; whether the socio-economic gradient in childhood injury has changed is unknown. Population-based hospital discharge data were used to calculate rates of hospitalization for injury from 1986/87 through 2005/06 for all children under 20 years of age in Manitoba (average yearly number of hospitalizations = 326,357). Information on socio-economic status (SES) came from area-level census data and was assigned by residential postal codes. Generalized linear models with generalized estimating equations were employed to describe the relation between SES and injury rates and whether this relation changed over time. All-cause injuries were examined, as well as injuries from motor vehicle collisions (MVCs), other vehicle injuries, self-inflicted injuries, assault, poisoning, injuries caused by machinery, sports injuries and falls. Injury hospitalizations for children decreased steadily over the study period, from 1.07% to 0.51%. SES significantly predicted injury hospitalizations (p<0.0001), with children of lower SES showing higher rates. A significant SES-by-year interaction (p<0.0001) indicated that the SES gradient for injury hospitalizations increased over time. Analysis by type of injury found a significant SES-by-year interaction for MVCs, self-inflicted injuries and falls; for MVCs and self-inflicted injuries the pattern (an increasing SES gradient) was similar to that for all-cause injury hospitalization. The pattern for falls was inconsistent.
Despite the overall drop in injury hospitalizations over time, the SES gradient in hospitalized injury rates has increased.
closed_qa
Does scapular elevation occur with glenohumeral flexion and abduction?
This study aimed to reveal whether there is elevation of the scapula during flexion and abduction of the glenohumeral joint. In the first stage of our study, 32 subjects were randomly divided into two groups. The mobility of the scapular notch was examined using open magnetic resonance imaging (MRI) with the glenohumeral joint in flexion in the first group (5 males, 10 females; mean age 21.1 years; range 18 to 24 years) and in abduction in the second group (8 males, 9 females; mean age 22.1 years; range 18 to 27 years); the motion range was found to be between 0 and 150 degrees. In the second stage of our study, the mobility of the scapular notch was examined at autopsy during passive humeral movement. According to the open MRI results, there was no elevation or depression during passive flexion and abduction of the glenohumeral joint. While the scapular notch migrated slightly medially during abduction of the glenohumeral joint, it did not move during flexion. In the autopsy study, we likewise observed that the scapula did not move in the vertical direction during glenohumeral abduction and flexion.
There is no vertical mobility of the scapula during glenohumeral flexion and abduction. There is also no medial mobility during flexion, although slight medial migration occurs during abduction.
closed_qa
Do type 2 diabetes patients without diabetic retinopathy or subjects with impaired fasting glucose have impaired colour vision?
To investigate associations between fasting plasma glucose level and the prevalence of acquired colour vision impairment in type 2 diabetes patients without diabetic retinopathy. Participants in this cross-sectional study, male officials aged 20-60 yr in the Japanese Self Defence Force, underwent colour vision testing, ophthalmic examination, a standardized interview and examination of venous blood samples. Ishihara plates, a Lanthony 15-hue desaturated panel and Standard Pseudoisochromatic Plates Part 2 were used to examine colour vision. The Farnsworth-Munsell 100-hue test was performed to define acquired colour vision impairment. Cardiovascular disease risk factors were determined from serum blood samples, physical records and an interview. We performed logistic regression analysis adjusted for age, diagnosed hypertension, dyslipidaemia, cataract, glaucoma, being overweight, smoking status and alcohol intake. Crude and adjusted odds ratios were calculated for three glucose levels: normal fasting glucose, impaired fasting glucose and diabetes. Of a total of 1042 men enrolled, 872 were eligible for the study, and 31 were diagnosed with acquired colour vision impairment. Compared with subjects with normal fasting glucose (<5.6 mmol/l), the crude odds ratio for acquired colour vision impairment was 0.93 (95% CI 0.32-2.74) for subjects with impaired fasting glucose (5.6-6.9 mmol/l) and 8.07 (95% CI 2.48-26.22) for patients with type 2 diabetes. The multiple-adjusted odds ratios were 0.77 (95% CI 0.25-2.34) for subjects with impaired fasting glucose and 5.89 (95% CI 1.55-22.40) for patients with type 2 diabetes.
Our findings suggest a markedly increased prevalence of acquired colour vision impairment in type 2 diabetes patients without diabetic retinopathy, which might be attributable to a pathogenesis distinct from that of diabetic retinopathy.
closed_qa
Diabetes inpatients: a case of lose, lose, lose. Is it time to use a 'diabetes-attributable hospitalization cost' to assess the impact of diabetes?
The UK National Health Service in England pays for inpatients using a formula ('tariff'). The appropriateness of the tariff for people with diabetes is unknown. We compared the tariff paid and actual costs for inpatients with and without diabetes and tested the concept of a 'diabetes-attributable hospitalization cost'. This was a cross-sectional, retrospective 12-month audit in a single teaching hospital assessing mortality, bed days per annum and the 'diabetes-attributable hospitalization cost' (i.e. the proportion of costs for all patients with diabetes in excess of that paid for comparable patients without diabetes). There were 64 829 inpatient admissions, 4864 of which were coded as having diabetes; an estimated 12.9% of patients had diabetes but were not coded as such. People with diabetes occupied 13.9% of all bed days and were 18.1% (1.3-37.8%) more likely to die (age adjusted). The mean bed days per annum were greatest among those with (vs. without) diabetes (men 10.9 ± 17.0 vs. 6.3 ± 12.8; women 11.4 ± 19.4 vs. 5.9 ± 11.6; P<0.001). The greatest excess admission rates were among those aged 25-64 years. The annual mean tariff was greater for those with diabetes (5380 ± 8740) than for those without (3706 ± 6221) (P<0.001). The overall cost was even higher among those with diabetes: 5835 ± 11 246 vs. 3567 ± 7238 (P<0.001). The diabetes-attributable hospitalization cost was 46.5% (9 125 085). An HbA1c>10.0% (>86 mmol/mol) was associated with excess hospitalization.
Those with diabetes cost more and are more likely to die when inpatients. The tariff paid for diabetes is high, but in this centre less than the actual costs. Approaches known to reduce hospitalization are urgently required.
closed_qa
Is the temporal artery thermometer a reliable instrument for detecting fever in children?
We aimed to study the diagnostic accuracy of the temporal artery thermometer vs. rectal temperature in a large group of children with and without fever, aged 0-18 years. Several studies have compared the diagnostic accuracy of the temporal artery thermometer in children with a reference method, with contradictory outcomes. No studies have been carried out in a large group of children of all ages. Diagnostic accuracy/validation study. Children (0-18 years) with fever (T>38·0°C) were recruited through the emergency department and children with normal temperatures through the day-care department of the Children's Hospital. All children routinely had rectal temperature recordings. Temporal artery temperature was recorded shortly after the rectal recording. The mean absolute difference in temperature, the level of agreement (intraclass correlation coefficient) and the sensitivity and specificity of detecting fever were calculated. A total of 198 children (121 boys) participated, with a mean age of 5·1 (SD 4·7) years. Of those children, 81 had fever according to the rectal recording. The mean difference between temporal artery temperature and rectal temperature was -0·11 (SD 0·63)°C, with an agreement of 0·812. The sensitivity and specificity of the temporal artery thermometer for detecting fever were 67·9% and 98·3%, respectively.
The diagnostic accuracy of the temporal artery thermometer in detecting fever in children of all ages is low.
closed_qa
P300 auditory event-related potentials in children with obesity: is childhood obesity related to impairment in cognitive functions?
To investigate alterations in P300 auditory event-related potentials in children with obesity in order to detect changes in cognitive functions. A total of 50 children with obesity and 23 age- and sex-matched healthy control subjects were included in the study. Laboratory tests were performed to detect dyslipidemia and insulin resistance (IR). The latencies and amplitudes of P300 waves were measured in healthy and obese subjects with or without IR. The oddball paradigm was used in recordings of P300 auditory event-related potentials. A significant difference was observed between groups regarding the latency and amplitude of the P300 component obtained from the central (Cz) electrode. The grand mean P300 latency was longer, and the amplitude significantly decreased, in the obese group compared with healthy controls. When the obese group was divided into two subgroups, those with and without IR, the grand mean P300 latency was longer and the amplitude significantly decreased in subjects with IR compared with those without IR.
Both decreased amplitude and prolonged latency of P300 are associated with IR in children with obesity, which shows the impairment of neural activity associated with sensory and cognitive information processing in these children. Further studies are necessary to strengthen the current findings and to determine the exact mechanism of cognitive impairment in obese children.
closed_qa
Is there any cardioprotective role of Taurine during cold ischemic period following global myocardial ischemia?
The aim of the present study was to investigate the cardioprotective effect of taurine on donor hearts during the cold ischemic period. Thirty-two rats were divided into four groups (sham, taurine, ischemia, and treatment; 8 rats in each). All rats were fed rat chow for three weeks. The taurine and treatment groups were additionally given taurine at a dose of 200 mg/kg/day by oral gavage. Cardiectomy was performed in all rats after three weeks. In the ischemia and treatment groups, harvested hearts were kept in 0.9% sodium chloride at +4°C for 5 hours. Tissue samples were taken from the left ventricle in all groups and evaluated by histopathologic and biochemical examination. The biochemical and histopathological results revealed the protective effects of taurine. As a marker of lipid peroxidation, malondialdehyde (MDA) levels in the ischemia group were significantly higher than in both the sham and taurine groups. MDA values were 3.62 ± 0.197 in the sham group, 2.07 ± 0.751 in the taurine group, 9.71 ± 1.439 in the ischemia group and 7.68 ± 1.365 in the treatment group; MDA levels decreased in the treatment group (p<0.05). In accordance with the MDA findings, superoxide dismutase and glutathione peroxidase levels decreased in the ischemia group and increased in the treatment group (p<0.05). There was no difference in catalase (CAT) enzyme levels between the treatment and ischemia groups (p = 1.000). CAT levels were 7.08 ± 0.609 in the sham group, 6.15 ± 0.119 in the taurine group, 5.02 ± 0.62 in the ischemia group, and 5.36 ± 0.384 in the treatment group. Less intracellular edema and inflammatory cell reaction were observed on histologic examination in the treatment group (p<0.01).
Taurine decreased myocardial damage during cold ischemic period following global myocardial ischemia.
closed_qa
Poland's syndrome and recurrent pneumothorax: is there a connection?
Two male patients, aged 19 and 21 years, were admitted to our department after their second episode of pneumothorax. Both had Poland's syndrome (a unilaterally hypoplastic chest wall with pectoralis major muscle atrophy), and both had multiple bullae in the ipsilateral lung on CT findings. The patients were treated operatively (bullectomy, lung apicectomy, partial parietal pleurectomy and chemical pleurodesis) because of the recurrent nature of their pneumothorax. Both had good results, with total expansion of the affected lung.
Poland's syndrome can be combined with the ipsilateral presence of lung bullae, a common cause of pneumothorax. Whether this finding is part of, or a variation of, the syndrome needs to be confirmed in a larger number of similar cases.
closed_qa
Does monitoring need for care in patients diagnosed with severe mental illness impact on Psychiatric Service Use?
Effectiveness of services for patients diagnosed with severe mental illness (SMI) may improve when treatment plans are needs based. A regional Cumulative Needs for Care Monitor (CNCM) introduced diagnostic and evaluative tools, allowing clinicians to explicitly assess patients' needs and negotiate treatment with the patient. We hypothesized that this would change care consumption patterns. Psychiatric Case Registers (PCR) register all in-patient and out-patient care in the region. We matched patients in the South-Limburg PCR, where CNCM was in place, with patients from the PCR in the North of the Netherlands (NN), where no CNCM was available. Matching was accomplished using propensity scoring including, amongst others, total care consumption and out-patient care consumption. Date of the CNCM assessment was copied to the matched controls as a hypothetical index date had the CNCM been in place in NN. The difference in care consumption after and before this date (after minus before) was analysed. Compared with the control region, out-patient care consumption in the CNCM region was significantly higher after the CNCM index date regardless of treatment status at baseline (new, new episode, persistent), whereas a decrease in in-patient care consumption could not be shown.
Monitoring patients may result in different patterns of care by flexibly adjusting level of out-patient care in response to early signs of clinical deterioration.
closed_qa
Idiopathic toe-walking in children, adolescents and young adults: a matter of local or generalised stiffness?
Idiopathic Toe Walking (ITW) is present in children older than 3 years of age who still walk on their toes without signs of neurological, orthopaedic or psychiatric disease. ITW has been estimated to occur in 7% to 24% of the childhood population. The aims were to study associations between ITW and a decrease in range of joint motion of the ankle joint, and between ITW (with stiff ankles) and stiffness in other joints, muscle strength and bone density. In a cross-sectional study, 362 healthy children, adolescents and young adults (mean age (sd): 14.2 (3.9) years) participated. Range of joint motion (ROM), muscle strength, anthropometrics, sports activities and bone density were measured. A prevalence of 12% of ITW was found. Nine percent had ITW and severely restricted ROM of the ankle joint. Children with ITW had a three times higher chance of severe ROM restriction of the ankle joint. Participants with ITW and stiff ankle joints had decreased ROM in other joints, whereas bone density and muscle strength were comparable.
ITW and a decrease in ankle joint ROM might be due to local stiffness. Differential etiological diagnosis should be considered.
closed_qa
Do biologics-naïve patients with rheumatoid arthritis respond better to tocilizumab than patients for whom anti-TNF agents have failed?
To compare responses to tocilizumab between patients with rheumatoid arthritis (RA) who switched from anti-TNF agents and those who were biologics-naïve. This retrospective study investigated 107 patients with RA who were treated with tocilizumab. At baseline, 61 of them had already been treated with anti-TNF agents (switched group; 46 for inefficacy and 15 for adverse events), and 46 were biologics-naïve (naïve group). Treatment responses to tocilizumab at weeks 12 and 24 were compared between the switched and naïve groups using the disease activity score 28 (DAS28). Forty-two (91.3%) and 50 (82.0%) patients in the naïve and switched groups, respectively, completed 24 weeks of tocilizumab treatment. The DAS28-ESR and DAS28-CRP values (means ± SD) at weeks 12 and 24 decreased significantly from baseline in both the naïve and switched groups. The DAS28-ESR and DAS28-CRP values at weeks 12 and 24 were significantly lower in the naïve group than in the switched group. Disease activity improved more in the naïve patients than in the switched patients.
Tocilizumab was safe, tolerable, and clinically effective for patients with inadequate responses to anti-TNF therapy and for those who were biologics-naïve, and it was more effective among the latter.
closed_qa
Perinatal outcomes in women with preeclampsia and superimposed preeclampsia: do they differ?
The purpose of this study was to determine whether superimposed preeclampsia results in worse perinatal outcomes than preeclampsia. We conducted a retrospective cohort study using our perinatal database (1990-2008). Perinatal outcomes among women with chronic hypertension (n = 1032), superimposed preeclampsia (n = 489), and preeclampsia (n = 4217) were compared with outcomes of control subjects (n = 57,103). Outcomes among women with superimposed preeclampsia were also compared with outcomes of women with preeclampsia. Multivariable analysis was used to control for confounders. Rates of small-for-gestational age, abruption, stillbirth, and eclampsia were not significantly different with superimposed preeclampsia compared with preeclampsia. Delivery at <34 weeks' gestation (17.3% vs 8.7%; P<.001), cesarean delivery (46.2% vs 36.3%; P<.001), and neonatal intensive care unit admission (16.3% vs 11.4%; P<.002) were significantly higher among women with superimposed preeclampsia. These risks persisted after we controlled for confounders.
Women with superimposed preeclampsia have higher risks of intervention-related events compared with those with preeclampsia.
closed_qa
Treatment of early pregnancy failure: does induced abortion training affect later practices?
The objective of the study was to examine the relationship between induced abortion training and views toward, and use of, office uterine evacuation and misoprostol in early pregnancy failure (EPF) care. We surveyed 308 obstetrician-gynecologists on their knowledge and attitudes toward treatment options for EPF and previous training in office-based uterine evacuation. Sixty-seven percent of respondents reported training in office uterine evacuation, and 20.3% reported induced abortion training. Induced abortion training was associated with strongly positive views toward both office-based uterine evacuation and misoprostol as treatment for EPF compared with those with office uterine evacuation training in other settings (odds ratio [OR], 2.64; P<.004 and OR, 3.22; P<.003, respectively). Furthermore, induced abortion training was associated with the use of office uterine evacuation for EPF treatment compared with those with office evacuation training in other settings (OR, 2.90; P = .004).
Training experiences, especially induced abortion training, are associated with the use of office uterine evacuation for EPF.
closed_qa
Recurrent endometrioma and ovarian reserve: biological connection or surgical paradox?
Cumulative evidence supports the view that ovarian endometriomas originate from ovulatory events and that the ovarian reserve is reduced following surgery. On these bases, we have hypothesized that the risk of recurrence may be related to the residual ovarian reserve of the operated ovary. We retrospectively selected 45 women scheduled for in vitro fertilization who previously underwent surgical excision of monolateral endometriomas and compared ovarian responsiveness in those who did (n = 24) and did not (n = 21) have a recurrent endometrioma. In the intact ovaries, the mean ± SD number of codominant follicles in women with and without recurrences was 3.5 ± 1.7 and 3.7 ± 2.2, respectively (P = NS). In the affected ovaries, the mean ± SD number of follicles in gonads with and without recurrences was 2.5 ± 2.3 and 1.1 ± 1.5, respectively (P<.05).
Ovarian responsiveness is higher in gonads that developed recurrent endometriomas.
closed_qa
Does hormone therapy exacerbate the adverse effects of radiotherapy in men with prostate cancer?
We examined whether short-course androgen deprivation therapy as an adjunct to radiotherapy would impact health related quality of life outcomes in patients with localized prostate cancer treated definitively with external beam radiation therapy or permanent brachytherapy. From 1999 to 2003 patients were enrolled in a prospective study at our institution and completed validated health related quality of life surveys at defined pretreatment and posttreatment intervals. A total of 81 men received radiotherapy alone and 67 received radiotherapy plus androgen deprivation therapy. Median androgen deprivation therapy duration was 4 months. Univariate and multivariate analyses were done to compare time to return to baseline in 6 distinct health related quality of life domains. On univariate analysis the radiotherapy plus androgen deprivation therapy group achieved baseline urinary symptoms more rapidly than the radiotherapy group (5 months, p = 0.002). On multivariate analysis time to return to baseline in any of the 6 health related quality of life domains was not significantly affected by adding androgen deprivation therapy. Factors associated with a longer time to return to baseline mental composite scores on multivariate analysis included nonwhite ethnicity, a history of cerebrovascular disease and a history of alcohol abuse. Men treated with permanent brachytherapy monotherapy experienced a longer time to return to baseline for urinary function and symptoms. Baseline sexual function and lack of a partner were associated with a longer time to sexual recovery.
Adding androgen deprivation therapy to definitive radiotherapy does not significantly impact the time to return to baseline health related quality of life. These data may be valuable for patients and physicians when weighing the toxicity and benefits of androgen deprivation therapy when added to definitive radiotherapy.
closed_qa
Does a difference exist in inferior alveolar canal displacement caused by commonly encountered pathologic entities?
The aim of the present study was to investigate whether a difference exists in the location of the displaced inferior alveolar canal (IAC) and neurovascular bundle (toward the buccal or lingual cortex) among odontogenic tumors and vascular lesions. If some consistency exists in the manner in which the canal and bundle are displaced on radiographic examination, the nature of the mandibular lesion under examination could be anticipated. This information would assist the surgical team in treatment planning, diagnostic biopsy, and resection, especially in cases of intraosseous vascular pathologic findings. A retrospective review of the computed tomography images obtained for odontogenic tumors and vascular anomalies treated at the Department of Oral and Maxillofacial Surgery, University of Illinois at Chicago, from January 2000 to June 2010 was undertaken. The IAC and neurovascular bundle were traced from the lingula to the mental foramina, and its location within the mandible was recorded at 3 specific points. In the odontogenic tumor group, we found that the canal with the neurovascular bundle was displaced either toward the buccal cortex of the mandible or the inferior border, but it was never identified lingually. In contrast, all the vascular anomalies had displaced the structures toward the lingual aspect of the mandible at all selected points.
To our knowledge, this is the first study to have examined the potential differences in the displacement of the inferior alveolar neurovascular bundle caused by the 2 commonly encountered pathologic entities in the maxillofacial skeleton: odontogenic tumors and vascular anomalies. We identified a striking difference in the manner of displacement of the IAC and its contents that was consistent among the tumors within each of the 2 groups. The location of the IAC in relationship to the pathologic entity under investigation could prove valuable in the differential diagnosis and assist with planning the biopsy. Additional investigation with a larger number of cases of these 2 groups of lesions involving the mandible is warranted to confirm our preliminary findings.
closed_qa
3-Dimensional imaging for lower third molars: is there an implication for surgical removal?
Surgical removal of impacted third molars may be the most frequent procedure in oral surgery. Damage to the inferior alveolar nerve (IAN) is a typical complication of the procedure, with incidence rates reported at 1% to 22%. The aim of this study was to identify factors that lead to a higher risk of IAN impairment after surgery. In total, 515 surgical third molar removals with 3-dimensional (3D) imaging before surgical removal were retrospectively evaluated for IAN impairment, together with 3D imaging signs presumed to predict postoperative IAN disturbance. The influence of each predictor was evaluated in univariate and multivariate analyses and reported as odds ratio (OR) and 95% confidence interval (CI). The overall IAN impairment rate in this study was 9.4%. Univariate analysis showed narrowing of the IAN canal (OR, 4.95; P<.0001), direct contact between the IAN and the root (OR, 5.05; P = .0008), fully formed roots (OR, 4.36; P = .045), an IAN lingual course with (OR, 6.64; P = .0013) and without (OR, 2.72; P = .007) perforation of the cortical plate, and an intraroot (OR, 9.96; P = .003) position of the IAN as predictors of postoperative IAN impairment. Multivariate analysis showed narrowing of the IAN canal (adjusted OR, 3.69; 95% CI, 1.88 to 7.22; P = .0001) and direct contact (adjusted OR, 3.10; 95% CI, 1.15 to 8.33; P = .025) to be the strongest independent predictors.
Three-dimensional imaging is useful for predicting the risk of postoperative IAN impairment before surgical removal of impacted lower third molars. The low IAN impairment rate seen in this study, compared with similar selected study groups in the literature from the era before 3D imaging, indicates that the availability of 3D information actually decreases the risk of IAN impairment after lower third molar removal.
closed_qa
Nasal inspiratory pressure: an alternative for the assessment of inspiratory muscle strength?
Inspiratory muscle strength is usually assessed through the determination of static mouth pressure (PImax). However, since this manoeuvre presents certain problems, alternative techniques have been developed over the last few years. One of the most promising is determination of sniff nasal inspiratory pressure (SNIP). AIM: To evaluate SNIP assessment as an alternative for the evaluation of inspiratory muscle strength. Subjects were consecutively included and assigned to one of three different groups: control (8), COPD patients (23) and patients with neuromuscular disorders (21). Different maximal inspiratory pressures were determined: (a) dynamic in the esophagus (maximal sniff Pes, reference variable), (b) PImax, and (c) SNIP. Both SNIP and PImax showed an excellent correlation with Pes (r=0.835 and 0.752, respectively, P<0.05 for both). SNIP/Pes intra-class correlation coefficients were 0.585 (CI 95%: −0.097 to 0.901) in controls, 0.569 (CI 95%: −0.048 to 0.836) in COPD patients, and 0.840 (CI 95%: 0.459 to 0.943) in neuromuscular disorders, respectively. For PImax/Pes, these values were 0.602 (CI 95%: −0.108 to 0.933), 0.418 (CI 95%: −0.108 to 0.761), and 0.712 (CI 95%: 0.378 to 0.882). Moreover, both SNIP and PImax showed 100% sensitivity in the three groups of subjects, although specificities were 100%, 69% and 75% for SNIP, and 83%, 54% and 75% for PImax, respectively.
SNIP is a good physiological marker of inspiratory muscle strength. Its role is likely to complement that of PImax.
closed_qa
Inflammation and oxidative stress in testicular torsion: do they deserve intensive treatment to save both guilty and innocent testes?
To investigate at the molecular level whether the combined use of an antioxidant (L-carnitine) and a selective cyclooxygenase-2 (COX-2) inhibitor (meloxicam) is effective in the treatment of cellular damage caused by testicular torsion. A total of 30 male Wistar rats were randomly divided into 5 groups. The control group underwent a sham operation, and the second group underwent torsion/detorsion for 90 minutes. Groups 3 and 4 received L-carnitine (500 mg/kg/d) and meloxicam (3 mg/kg/d), respectively. Group 5 also received these 2 agents, in addition to the same torsion/detorsion procedure. Bilateral orchiectomy was performed 96 hours after the operation in all groups. cDNA was synthesized after isolation of total RNA from the tissues. The relative expression of interleukin (IL)-1a, COX-2, and β-actin genes was measured by real-time polymerase chain reaction. The COX-2 and IL-1a mRNA levels had significantly decreased in groups 3, 4, and 5 compared with group 2 (P<.05). COX-2 and IL-1a mRNA levels were significantly greater in the torsion/detorsion group (P=.007). The COX-2 and IL-1a mRNA levels significantly decreased in the torsion/detorsion testis after maximal treatment (P<.001).
Meloxicam, as well as the combination therapy, seems to exert an inhibitory effect on the expression of specific inflammatory genes. Because the effects of these inflammatory genes are still evident 4 days after detorsion, combination therapy using these agents could be administered until the late postoperative period to prevent the initiation of autoimmune activity against sperm cells and to protect the innocent contralateral testis from the insult of antisperm antibodies.
closed_qa
Does patent foramen ovale closure have an anti-arrhythmic effect?
Atrial tachyarrhythmias are associated with patent foramen ovale. The objective was to determine the anti-arrhythmic effect of patent foramen ovale closure on pre-existing atrial tachyarrhythmias. Medline, EMBASE, Cochrane Library, and Google Scholar databases were searched between 1967 and 2010. The search was expanded using the 'related articles' function and reference lists of key studies. All studies reporting pre- and post-closure incidence (or prevalence) of atrial tachyarrhythmia in the same patient population were included. Random and fixed effect meta-analyses were used to aggregate the data. Six studies were identified including 2570 patients who underwent percutaneous closure. Atrial fibrillation was in fact the only atrial tachyarrhythmia reported in all studies. Meta-analysis using a fixed effects model demonstrated a significant reduction in the prevalence of atrial fibrillation with an OR of 0.43 (95% CI 0.26-0.71). When using the random-effects model, OR was 0.44 (95% CI 0.18-1.04), with a trend toward significance demonstrated (test for overall effect: Z=1.87, p=0.06).
Closure of a patent foramen ovale may be associated with reduction in the prevalence of atrial fibrillation.
closed_qa
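The fixed-effect pooling used in the meta-analysis above can be sketched in a few lines. The per-study 2x2 counts below are hypothetical (the abstract reports only the pooled ORs), so only the method, not the numbers, reflects the study:

```python
import math

def study_or(a, b, c, d):
    """Odds ratio and log-OR standard error for one study's 2x2 table
    (a/b = events/non-events post-closure, c/d = pre-closure)."""
    return (a * d) / (b * c), math.sqrt(1/a + 1/b + 1/c + 1/d)

def pooled_or_fixed(tables):
    """Inverse-variance fixed-effect pooled OR with a 95% CI."""
    num = den = 0.0
    for t in tables:
        or_, se = study_or(*t)
        w = 1.0 / se**2                 # inverse-variance weight
        num += w * math.log(or_)
        den += w
    log_or, se = num / den, math.sqrt(1.0 / den)
    return (math.exp(log_or),
            math.exp(log_or - 1.96 * se),
            math.exp(log_or + 1.96 * se))

# Hypothetical counts for two studies, each favouring closure
print(pooled_or_fixed([(10, 90, 20, 80), (5, 95, 12, 88)]))
```

A random-effects model would additionally inflate each study's variance by a between-study heterogeneity term, widening the CI, which is why the abstract's random-effects interval crosses 1.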
Urinary symptoms and urodynamic findings in women with pelvic organ prolapse: is there a correlation?
International official guidelines recommend urodynamic (UDS) evaluation in patients with pelvic organ prolapse (POP). However, the real benefit of this examination is still the subject of heated and controversial debate. Therefore, we aimed to assess the correlation between urinary symptoms and UDS findings in women with POP through the implementation of a sophisticated computer-based technology in the outpatient workup. A prospective cohort study was performed in a single, tertiary, urogynaecologic referral department, enrolling consecutive women seeking care for pelvic floor dysfunctions. Patients underwent clinical and urodynamic evaluation. Data regarding baseline characteristics, symptoms, anatomic, and urodynamic findings were gathered for each patient. Multiple linear regression (MLR) and artificial neural networks (ANNs) were performed to design predicting models. A total of 802 women with POP were included. POP quantification stages and baseline data poorly correlated to final UDS findings. Stress urinary incontinence and overactive bladder were both independently associated with each UDS diagnosis, including detrusor overactivity (DO), urodynamic stress incontinence (USI), and mixed urinary incontinence (USI plus DO). Receiver operating characteristic comparison confirmed that ANNs were more accurate than MLR in identifying predictors of UDS diagnosis, but neither method could accurately substitute for UDS. Case-control studies are needed to confirm our findings.
Despite the current debate on the actual value of UDS in women with POP, even the implementation of ANNs, a sophisticated computer-based technology, does not permit an accurate diagnosis on the basis of symptoms alone, without UDS. Therefore, in women with POP, especially if scheduled for surgery, UDS should be considered mandatory, since misleading counselling could result in unexpected and unpleasant events.
closed_qa
Chemoresistance in non-small-cell lung cancer: can multidrug resistance markers predict the response of xenograft lung cancer models to chemotherapy?
In chemotherapy for non-small-cell lung cancer (NSCLC), some patients seem to exhibit an intrinsic resistance or develop an acquired resistance under treatment. Results on resistance markers for possible treatment failure as shown in studies on selected lung cancer cell lines could not be completely confirmed in clinical trials. As these conflicting data require further research, we created a model between cell culture and the clinical need to study this problem. Our study was based on patient-derived NSCLC xenografts in a mouse model, which revealed a high coincidence with the original tumour. Protein and messenger RNA (mRNA) expression of known resistance markers (breast cancer resistance protein (BCRP), multidrug resistance P-glycoprotein (MDR), lung cancer-related protein (LRP) and multidrug resistance protein 1 (MRP1)) were analysed by real-time polymerase chain reaction (PCR) and immunoblotting in 24 xenografts. Chemosensitivity to etoposide, carboplatin, gemcitabine, paclitaxel, cetuximab and erlotinib was determined in in vivo xenograft experiments and compared with the protein and mRNA expression of the multidrug resistance markers. With the exception of a single correlation between etoposide chemosensitivity and BCRP mRNA expression, we found no significant correlation between the response rates and protein and mRNA expression levels in our 24 xenografts. The present results indicate that in vivo expression levels of multidrug resistance proteins and their mRNAs may not play a comparable role in chemoresistance of NSCLC, as pointed out in selected tumour cell lines.
Patient-derived xenografts allow detailed investigation of therapy-related markers and their dynamic regulation in a well-standardised and clinically related way. As a consequence of our investigations, we regard multidrug resistance to be a multifactorial phenomenon, in which more factors than the markers analysed by the present study may be involved.
closed_qa
Can signal enhancement ratio (SER) reduce the number of recommended biopsies without affecting cancer yield in occult MRI-detected lesions?
We retrospectively determined if signal enhancement ratio (SER), a quantitative measure of contrast kinetics using volumetric parameters, could reduce the number of biopsy recommendations without decreasing the number of cancers detected when applied to suspicious lesions seen on breast magnetic resonance imaging (MRI). A retrospective review of Breast Imaging Reporting and Data System (BIRADS) 4 or 5 lesions seen on breast MRI in 2008 that were clinically and mammographically occult yielded a final sample size of 73 lesions in 65 patients. Images were processed with in-house software. Parameters used to predict benignity/malignancy included SER total tumor volume (lesion volume above a 70% initial enhancement level), SER partial tumor volume (volume with "washout" and "plateau" kinetics), SER washout tumor volume, peak SER, and peak percent enhancement. Thresholds were determined to retrospectively discriminate benign from malignant histopathology. Clinical impact was assessed through the reduction in the number of biopsies recommended (by eliminating benign lesions discriminated by SER). Based on the original radiologist interpretations, 73 occult lesions were called suspicious and biopsied, with a positive predictive value of biopsy (PPV3) of 18/73 (25%). SER parameters were found to be significantly associated with histopathology (P<.05). Biopsy recommendations could be reduced using SER partial tumor volume (73 to 40), SER total tumor volume (73 to 45), and peak percent enhancement (73 to 55) without removing true positives.
The adjunctive use of SER parameters may reduce the number of recommended biopsies without reducing the number of cancers detected.
closed_qa
Are bicuspid aortic valves a limitation for aortic valve repair?
To compare the mid-term results after aortic valve (AV) repair in bicuspid AVs with those in tricuspid AVs. Between 2000 and 2010, 100 patients (mean age 47.2 years) underwent AV repair procedures for insufficient bicuspid AV (n=43) and tricuspid AV (n=57). Aortic regurgitation (AR) more than moderate was present in 31/43 and 21/57 patients in the bicuspid AV and the tricuspid AV group, respectively. Concomitant root replacement by either the reimplantation or the remodeling technique was performed in 42 patients (bicuspid AV 17/43, tricuspid AV 25/57). All patients were prospectively studied with postoperative and further annual clinical assessment and echocardiography. Follow-up was 99% complete with a mean follow-up time of 22 months. Three patients died during the initial hospitalization, all due to postoperative cardiac failure. Overall actuarial 3 years' survival was 93±4.2% without significant differences between the two groups. Overall actuarial 3 years' freedom from AV-related reoperation was 86±5.1% without significant differences between the groups (85±9.7% for bicuspid AV, 86±6.0% for tricuspid AV; log-rank test: p=0.98). Overall actuarial 3 years' freedom from recurrent AR≥moderate was 100% and AR>trace was 71.3±8.2% without significant differences between the groups (76.5±11.7% for bicuspid AV, 71.4±9.4 for tricuspid AV; log-rank test: p=0.97).
The mid-term outcome in terms of survival, freedom from reoperation or recurrent AR is similar for both groups of patients after AV repair procedures. Therefore, we advocate valve repair also in patients presenting with an insufficient bicuspid AV.
closed_qa
Neuropathological findings of PSP in the elderly without clinical PSP: possible incidental PSP?
We aimed to describe cases with incidental neuropathological findings of progressive supranuclear palsy (PSP) from the Banner Sun Health Research Institute Brain and Body Donation Program. We performed a retrospective review of 277 subjects with longitudinal motor and neuropsychological assessments who came to autopsy. The mean Gallyas-positive PSP features grading for subjects with possible incidental neuropathological PSP was compared to those of subjects with clinically manifest disease. There were 5 cases with histopathological findings suggestive of PSP, but no parkinsonism, dementia or movement disorder during life. Cognitive evaluation revealed 4 of the 5 cases to be cognitively normal; one case had amnestic mild cognitive impairment (MCI) in her last year of life. The mean age at death of the 5 cases was 88.9 years (range 80-94). All 5 individuals had histopathologic microscopic findings suggestive of PSP. Mean Gallyas-positive PSP features grading was significantly lower in subjects with possible incidental neuropathological PSP than subjects with clinical PSP, particularly in the subthalamic nucleus.
We present 5 patients with histopathological findings suggestive of PSP, without clinical PSP, dementia or parkinsonism during life. These incidental neuropathological PSP findings may represent the early or pre-symptomatic stage of PSP. The mean Gallyas-positive PSP features grading was significantly lower in possible incidental PSP than in clinical PSP, thus suggesting that a threshold of pathological burden needs to be reached within the typically affected areas in PSP before clinical signs and symptoms appear.
closed_qa
Is pouch of Douglas obliteration a marker of bowel endometriosis?
To estimate the clinical significance of pouch of Douglas (POD) obliteration in women undergoing laparoscopic excision of endometriosis. Prospective study (Canadian Task Force Classification II-2). University-affiliated tertiary referral center for endometriosis. A total of 454 consecutive women who underwent laparoscopic surgery for treatment of pelvic pain or infertility-associated endometriosis between October 2004 and September 2008. Demographic, historical, and final surgical data were compared between women with and without POD obliteration at laparoscopy. Logistic regression analyses were performed to investigate the predictive value of POD obliteration at laparoscopy with regard to bowel endometriosis. One hundred consecutive women with POD obliteration at laparoscopy were included. 58% (95% confidence interval [CI] 0.48-0.67, n = 58/100) of the women with POD obliteration required bowel surgery compared with 20% (95% CI 0.16-0.25, 72/354) of women without POD obliteration (p<.001). Of the POD obliteration group, 66% (95% CI 0.53-0.76) required bowel shaving, 12% (0.06-0.23) full segmental rectal resection, 9% (0.04-0.19) wedge rectal resection, 5% (0.02-0.14) full segmental rectosigmoid resection and 9% (0.04-0.19) a combination of the above. Bowel endometriosis was histologically confirmed in all women.
POD obliteration at laparoscopy carries a high risk of bowel endometriosis and bowel surgery. This risk is three times higher than in women without POD obliteration. Women with POD obliteration should be managed in tertiary referral centers for the treatment of endometriosis where colorectal input is available.
closed_qa
Early stimulation in newborns with birth weight between 1,000 and 1,500 g: is it always necessary?
To determine whether the currently widespread practice of sending all premature infants with birth weight between 1,000 and 1,500 g to early care centres is necessary from a neurological point of view, or if it is possible to establish selection criteria. A retrospective study was performed of newborns born at our hospital between January 1998 and December 2004 with birth weight between 1,000 and 1,500 g and followed up for at least two years in a paediatric neurology clinic. We analysed the prognostic significance of the different neurological variables in the neonatal period, and those of greatest significance were combined into a score for deciding the start of early stimulation treatment on discharge from neonatology. A total of 194 infants met the above criteria. The most significant neurological prognostic variables were: gestational age<28 weeks, male sex, intraventricular haemorrhage grade>I, history of high risk pregnancy, sepsis, anaemia with haemodynamic repercussions and, above all, an abnormal neurological examination at discharge (odds ratio of 16). A prognostic score was developed with a cut-off of 4 points, with an area under the curve of 88.3%. The positive predictive value and negative predictive value were 43.75% and 96.2%, respectively, with 84.8% sensitivity and 78.9% specificity.
Newborns with birth weight between 1,000 and 1,500 g and a normal neurological examination at discharge, with a score of less than 4 points, do not require early stimulation treatment from a neurological standpoint, given their predictably good outcome.
closed_qa
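The diagnostic performance figures reported for the prognostic score above all derive from a single 2x2 confusion table. A minimal sketch of those calculations, using hypothetical counts (not the study's actual table):

```python
def diagnostic_metrics(tp, fp, fn, tn):
    """Sensitivity, specificity, PPV and NPV from a 2x2 confusion table."""
    return {
        "sensitivity": tp / (tp + fn),   # true positive rate
        "specificity": tn / (tn + fp),   # true negative rate
        "ppv": tp / (tp + fp),           # positive predictive value
        "npv": tn / (tn + fn),           # negative predictive value
    }

# Hypothetical counts for a cohort of 194 infants
print(diagnostic_metrics(tp=28, fp=36, fn=5, tn=125))
```

Note the asymmetry the study exploits: a score below the cut-off is far more reassuring (high NPV) than a score above it is alarming (modest PPV), which is what justifies withholding early stimulation from low-scoring infants.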
Do pregnant women have improved outcomes after traumatic brain injury?
Pregnant women, who have significantly elevated levels of estrogen and progesterone, might benefit from the neuroprotective effect of steroid hormones. Pregnant patients were identified and compared with their nonpregnant counterparts with respect to demographics and outcome. Of the 18,800 female patients with moderate to severe traumatic brain injury (TBI), 71 were pregnant. Similar mortalities were noted in pregnant and nonpregnant TBI patients (9.9% vs 9.3%, P = .84). Adjusting for confounding variables, pregnant TBI patients had a trend toward increased mortality (adjusted odds ratio [AOR] = 2.2; 95% confidence interval [CI], .9-5.1; P = .07). In patients aged 15 to 47 years (n = 8,854), similar mortalities were noted in pregnant and nonpregnant TBI patients (9.9% vs 6.8%, P = .34). After adjusting for risk factors, again there was a trend toward increased mortality in the pregnant TBI group (AOR = 2.0; 95% CI, .8-4.6; P = .12).
Pregnant patients with moderate to severe TBI show no statistically significant difference in mortality compared with their nonpregnant counterparts.
closed_qa
Treatment outcomes of injured children at adult level 1 trauma centers: are there benefits from added specialized care?
Accidental traumatic injury is the leading cause of morbidity and mortality in children. The authors hypothesized that no mortality difference should exist between children seen at adult trauma centers (ATC) versus ATC with added qualifications in pediatrics (ATC-AQ). The National Trauma Data Bank, version 7.1, was analyzed for patients aged<18 years seen at level 1 trauma centers. Bivariate analysis compared patients by ATC versus ATC-AQ using demographic and injury characteristics. Multivariate analysis adjusting for injury and demographic factors was then performed. A total sample of 53,702 children was analyzed, with an overall mortality of 3.9%. The adjusted odds of mortality were 20% lower for children seen at ATC-AQ (odds ratio, .80; 95% confidence interval, .68-.94). Children aged 3 to 12 years, those with injury severity scores>25, and those with Glasgow Coma Scale scores<8 all had significant reductions in the odds of death at ATC-AQ.
Treatment at an ATC-AQ is associated with improved overall survival in pediatric trauma patients.
closed_qa
Can intravascular ultrasound guidance modify the efficacy of drug-eluting stent over bare-metal stent in an aorto-ostial lesion?
We compared the efficacy of drug-eluting stents (DESs) versus bare-metal stents (BMSs) in de novo and native aorto-ostial lesions (AOLs) guided by intravascular ultrasound (IVUS). Thirty-eight patients underwent DES implantation for 38 AOLs; 35 with sirolimus-eluting stents, and three with paclitaxel-eluting stents (DES group). The control group was composed of 40 AOLs treated by BMS. The incidence of the primary composite end point (all-cause mortality, Q-wave myocardial infarction and target vessel revascularization [TVR]), termed TVR-major adverse cardiac events (TVR-MACE), was evaluated during a 1-year follow-up. Clinical and IVUS parameters were compared between the DES and BMS groups, and a Cox hazards model was used to calculate hazard ratios of several factors for the 1-year TVR-MACE. Although the vessel, plaque, and stent volumes were significantly larger after the procedures in the DES group owing to longer lesions (18.3 ± 5.1 vs. 13.2 ± 5.9 mm, P<.001), the stent volume index (10.8 ± 2.6 vs. 12.4 ± 3.3, P=.024) was much smaller than that in the BMS group. During the 1-year follow-up, there were 13 TVR-MACEs among all patients (13% in DES vs. 20% in BMS, P=.4 by Kaplan-Meier analysis). The Cox hazards model did not indicate any specific unfavorable factor for the 1-year TVR-MACE.
The present study showed comparable 1-year TVR-MACE rates between DES and BMS in de novo, native AOLs, even though DESs were used in longer and bulkier lesions as compared with BMSs.
closed_qa
Evidence of abnormal tyrosine phosphorylated proteins in the urine of patients with bladder cancer: the road toward a new diagnostic tool?
Since changes in protein phosphorylation are a common feature of cancer cells, we analyzed phosphoproteins in the tissue and urine of patients with bladder cancer and assessed the diagnostic relevance of abnormally phosphorylated proteins as tumor markers. Enrolled in this study were 66 patients and 82 healthy volunteers. From the first 14 patients with bladder cancer we obtained samples of malignant and normal bladder tissue. All patients and volunteers provided a urine sample. Protein extracts of tissue specimens were separated by 2-dimensional gel electrophoresis for comparative analysis of neoplastic and normal tissue. Phosphoproteins were studied by Western blot and characterized by mass spectrometry. Urine samples were analyzed by 1-dimensional gel electrophoresis. Phosphoproteins were measured by affinity dot blotting. Profound changes in the pattern of protein tyrosine phosphorylation were consistently, reproducibly observed in bladder cancer tissues. A total of 24 phosphorylated proteins were differentially expressed in cancer tissue and identified by mass spectrometry. Phosphoproteins were fairly stable in urine samples, leading to accumulation. Urinary tyrosine phosphoproteins showed the most remarkable changes in patients with cancer with an approximately 5-fold increase compared to levels in healthy controls.
To our knowledge we investigated for the first time the diagnostic potential of tissue and urinary tyrosine phosphoproteins for bladder carcinoma. Results indicate that phosphorylated proteins may represent a new, valuable class of urinary biomarkers for bladder cancer.
closed_qa
Is there any relationship between coronary artery disease and postprandial triglyceride levels?
We aimed to evaluate the relationship between postprandial triglyceride (PPTG) levels and coronary artery disease (CAD). A total of 80 patients were included in this prospective cohort study. Oral lipid loading was used in order to measure PPTG levels. In the fasting state and after a high-fat breakfast, triglyceride levels were measured by enzymatic methods at the 2nd, 4th, 6th and 8th hours. We performed subgroup analyses to assess the effects of lipid loading on triglyceride levels in patients with and without fasting hypertriglyceridemia. We evaluated triglyceride levels and percentage changes in triglyceride levels after lipid loading using a general linear model for repeated measures. Sample size analysis was performed. Baseline clinical, demographic and laboratory characteristics of both groups were similar. The peak triglyceride levels were seen at the 4th hour in both groups. Triglyceride levels were significantly increased after lipid-rich-breakfast loading compared to baseline levels in both groups (p<0.001), but these changes did not differ significantly between groups (p=0.279). In patients with elevated fasting triglyceride levels, the area under the plasma triglyceride concentration curve was significantly larger in the CAD group than in the control group (334±103 vs. 233±58 mg/dl, p=0.02).
Our data show that in patients who have a high fasting triglyceride level, high levels of PPTG may be related to CAD; however, high PPTG levels are not related to CAD in patients with normal fasting triglyceride levels.
closed_qa
Thinking about maintaining exercise therapy: does being positive or negative make a difference?
To investigate social-cognitive and exercise differences in individuals who think positively or negatively about upcoming exercise while engaged in programs of maintenance exercise therapy for cardiovascular disease and other chronic health conditions. Participants (n = 40) completed measures relative to exercise adherence. MANOVA revealed positive thinkers were significantly higher in exercise frequency, self-regulatory efficacy, positive affect, willingness to adapt and lower in decisional struggle than negative thinkers.
Thoughts about exercise therapy are related to social cognitions crucial to motivating self-regulatory actions influencing exercise. Negative thoughts may suggest less ability to adapt to maintenance exercise challenges.
closed_qa
Do patients whose psychogenic non-epileptic seizures resolve, 'replace' them with other medically unexplained symptoms?
In clinical practice, it is sometimes observed that patients in whom psychogenic non-epileptic seizures (PNES) cease develop another medically unexplained symptom (MUS). In order to determine how many patients develop new MUS post diagnosis and whether patients whose attacks cease are more likely to do so, new MUS were recorded 6-12 months after the diagnosis of PNES in 187 consecutive patients. Compared with baseline, the overall proportion of patients with MUS increased slightly, from 70.1% to 76.5%, with 44/187 patients (23.5%) developing new MUS. There were no significant differences between attack-free and non-attack-free patients. Binary logistic regression analysis showed that predictors of new MUS diverged between attack-free and non-attack-free patients. Among patients continuing to have attacks, those with previous health-related psychological trauma were 18.0 times more likely to develop new MUS (p<0.0005). In patients who became attack free, those drawing disability benefits were 5.04 times more likely to have new MUS (p = 0.011).
The results suggest that almost 25% of patients develop new MUS following a diagnosis of PNES, although most of these patients already had MUS before the diagnosis. Patients with a history of health-related psychological trauma whose attacks continue after diagnosis are at particularly high risk of developing new MUS. The data do not support the hypothesis that PNES that resolve are likely to be 'replaced' by other MUS.
closed_qa
Maternal unemployment and childhood overweight: is there a relationship?
Previous studies have shown a positive association between maternal work hours and childhood overweight. However, it is unclear what role job instability plays in this relationship; therefore, this study examined whether children whose mothers experienced unemployment were more likely to have greater increases in body mass index (BMI) as compared with children whose mothers were stably employed. The effects of unemployment benefits, welfare and number of hours worked were also explored. A multiple regression analysis was used to analyse changes in BMI over a 4-year period using the National Longitudinal Survey of Youth. In all, 4890 US children, aged 2-16 at baseline, were included in the analysis. As compared with children of mothers who were employed full-time and did not receive welfare, children of mothers who experienced unemployment and received unemployment benefits were not more likely to have significantly different changes in BMI. Yet children of mothers who experienced unemployment and did not receive unemployment benefits were significantly more likely to have greater increases in BMI. These results were also shown in models which controlled for height. This supports the conclusion that adiposity changes, and not simply growth-rate differences, account for the different BMI changes between groups.
Aspects of maternal employment other than number of work hours are associated with child BMI, including unemployment events and what type of support a mother receives during the time of unemployment. This has implications for policies that relate to benefits for mothers who lose their jobs.
closed_qa
Is aspirin effective in women undergoing in vitro fertilization (IVF)?
Aspirin is believed to improve the outcome of IVF, but previous conventional meta-analyses on the subject are conflicting. Therefore, we performed a meta-analysis with individual patient data (IPD MA) of randomized clinical trials (RCTs) on the subject. A systematic literature search was conducted to identify RCTs assessing the effectiveness of aspirin in IVF. Authors were asked to share their original data. In a one-step meta-analytic approach, the treatment effect of aspirin was estimated with odds ratios (ORs) and 95% confidence intervals (CIs) using logistic regression, based on the intention to treat principle. Ten studies fulfilled the inclusion criteria. Authors of six studies provided IPD, including 1119 patients (562 placebo and 557 aspirin). There were 160 clinical pregnancies in the aspirin group (28.8%) and 179 (31.9%) in the placebo group [OR 0.86, 95% CI (0.69-1.1)]. There were 129 ongoing pregnancies in the aspirin group (23.6%) and 147 in the placebo group (26.7%) [OR 0.85, 95% CI (0.65-1.1)]. Whereas the conventional meta-analysis limited to studies that could provide IPD showed an OR of 0.89 (95% CI 0.69-1.2), the conventional meta-analysis limited to the eight studies whose method of randomization could be confirmed showed an OR of 0.94 (95% CI 0.76-1.17), and the conventional meta-analysis including all 10 eligible RCTs identified with our search changed the OR to 1.07 (95% CI 0.81-1.41). This difference in the direction of effect derived from the studies that could not share IPD and whose method of randomization could not be confirmed.
Aspirin does not improve pregnancy rates after IVF.
closed_qa
Can poison control data be used for pharmaceutical poisoning surveillance?
To determine the association between the frequencies of pharmaceutical exposures reported to a poison control center (PCC) and those seen in the emergency department (ED). A statewide population-based retrospective comparison of frequencies of ED pharmaceutical poisonings with frequencies of pharmaceutical exposures reported to a regional PCC. ED poisonings, identified by International Classification of Diseases, Version 9 (ICD-9) codes, were grouped into substance categories. Using a reproducible algorithm facilitated by probabilistic linkage, codes from the PCC classification system were mapped into the same categories. A readily identifiable subset of PCC calls was selected for comparison. Correlations between frequencies of quarterly exposures by substance categories were calculated using Pearson correlation coefficients and partial correlation coefficients with adjustment for seasonality. PCC-reported exposures correlated with ED poisonings in nine of 10 categories. Partial correlation coefficients (r(p)) indicated strong associations (r(p) >0.8) for three substance categories that underwent large changes in their incidences (opiates, benzodiazepines, and muscle relaxants). Six substance categories were moderately correlated (r(p) >0.6). One category, salicylates, showed no association. Limitations: imperfect overlap between ICD-9 and PCC codes may have led to miscategorization, and substances without changes in exposure frequency have inadequate variability to detect association using this method.
PCC data are able to effectively identify trends in poisonings seen in EDs and may be useful as part of a pharmaceutical poisoning surveillance system. The authors developed an algorithm-driven technique for mapping American Association of Poison Control Centers codes to ICD-9 codes and identified a useful subset of poison control exposures for analysis.
closed_qa
Do postmarketing surveillance studies represent real-world populations?
To evaluate outcomes after carotid artery stenting in larger real-world populations, the Food and Drug Administration mandated that companies conduct postmarketing surveillance (PMS) studies of approved stent systems. Whether PMS studies are representative of carotid artery stenting in routine clinical practice has not been established. Within the National Cardiovascular Database Registry-Carotid Artery Revascularization and Endarterectomy (NCDR CARE) Registry, we compared patient and procedural characteristics, in-hospital outcomes, and subsequent all-cause mortality after carotid artery stenting in PMS study participants and nonparticipants. We conducted both crude and propensity score-adjusted comparisons for all outcomes between groups. Compared with nonparticipants, participants in PMS studies had lower rates of symptomatic carotid artery disease within the preceding 6 months, prior stroke, and acute evolving stroke at baseline. The PMS study participants had lower unadjusted rates of combined in-hospital death, stroke, or myocardial infarction (2.3% versus 4.1%; P<0.001), driven by lower rates of stroke (1.7% versus 2.7%; P=0.005) and death (0.3% versus 1.4%; P<0.001). Differences in survival persisted after propensity score adjustment (odds ratio, 0.44; 95% confidence interval, 0.21 to 0.95; P=0.04 for in-hospital mortality; and hazard ratio, 0.80; 95% confidence interval, 0.66 to 0.97; P=0.02 for 2-year mortality). Baseline differences in neurological history explained the largest proportion of the difference in outcomes between groups.
Participants in PMS studies for carotid artery stenting have different clinical and procedural characteristics and lower mortality compared with nonparticipants. Extrapolating results from PMS studies of carotid artery stenting to larger real-world settings should be done only with great caution.
closed_qa
Can the viability of a nonunion be evaluated using SPECT/CT?
The vitality of a nonunion is crucial for planning the reconstructive procedure. The purpose of the present study was to analyze the role of single photon emission computed tomography (SPECT) in diagnosing and planning the treatment of atrophic nonunions in the upper and lower extremity. This study retrospectively examined the SPECT/CT scans of 10 patients (mean age = 44.5 ± 16.5 years, 9 males/1 female, 4 tibia/4 femur/1 radius/1 fibula) who underwent surgical exploration for suspected avital pseudarthrosis. Surgical and histopathological findings were compared with the radiologists' findings to assess the sensitivity and specificity of SPECT in diagnosing avital nonunions. The average interval from the osteosynthesis until the SPECT scan was 18 months. All surgical findings were documented electronically in the hospital computer system. Results of the radiologist's reading were then compared with surgical exploration and histopathological findings, and specificity and sensitivity were calculated. There were 4 vital and 6 nonvital pseudarthroses. SPECT scans identified all the vital pseudarthroses and 3 of the 6 nonvital pseudarthroses. The sensitivity of SPECT in diagnosing nonvital atrophic nonunions was 50% and the specificity 100%.
SPECT/CT is a test with low sensitivity but good specificity that excludes infection and confirms nonviability of the nonunion site. However, a larger pool of research results is needed before this test can be incorporated into routine clinical use.
closed_qa
Are serum protein biomarkers derived from proteomic analysis useful in screening for trisomy 21 at 11-13 weeks?
The aim of this study was to identify potential biomarkers for fetal trisomy 21 from previous publications using proteomic techniques and to examine the potential value of such biomarkers in early screening for this aneuploidy. This was a case-control study of 25 pregnancies with fetal trisomy 21 and 50 euploid controls undergoing first-trimester screening for aneuploidies by a combination of maternal age, fetal nuchal translucency (NT) thickness and maternal serum free β-human chorionic gonadotrophin (β-hCG) and pregnancy-associated plasma protein-A (PAPP-A). The maternal serum concentrations of afamin, apolipoprotein E, clusterin, ceruloplasmin, epidermal growth factor, fetuin-A, pigment epithelium-derived factor glycoprotein and transthyretin were determined using an ELISA and compared in the euploid and trisomy 21 groups. In pregnancies with fetal trisomy 21, the median maternal age, fetal NT thickness and serum free β-hCG were increased, whereas serum PAPP-A was decreased. However, there were no significant differences between cases and controls in any of the biomarkers.
Proteins identified as potential biomarkers for trisomy 21 using proteomic techniques have not been found to be useful in early screening for this aneuploidy.
closed_qa
Do voluntary step reactions in dual task conditions have an added value over single task for fall prediction?
Stepping reactions play a critical role in responding to balance perturbations, whether they are a consequence of external perturbation or self-induced in nature. The aim of the present study was to determine prospectively the capacity of voluntary stepping performance in single- and dual-task conditions to predict future falls among older community-dwelling persons. We also aimed to assess whether dual-task conditions have an added value over single tasks for fall prediction. A total of 100 healthy older volunteers (mean age 78.4±5.7 yrs), from two self-care protected retirement homes for older adults, performed the Voluntary Step Execution Test in single- and dual-task conditions as a reaction time task while standing on a single force platform. Step initiation, preparatory and swing phases, and foot-contact time were extracted from data on center of pressure and ground reaction force. One-year fall incidences were monitored. Ninety-eight subjects completed the one-year follow-up: 49 non-fallers, 32 one-time fallers, and 17 recurrent fallers (two or more falls). Recurrent fallers had significantly slower voluntary step execution times in both single- and dual-task conditions, especially due to a slower preparation phase. Two stepwise (backward) logistic regression models showed that longer step execution times have strong predictive value for falls in both single- and dual-task conditions (odds ratio (OR) 8.7 and 5.4, respectively, p<0.05).
Voluntary Step Execution Test in both single- and dual-task conditions is a simple and safe examination which can potentially and effectively predict future falls, with no added value to dual- over single-task condition.
closed_qa
Is chair rise performance a useful measure of leg power?
Chair rise performance, which is simple to assess in a home or clinic setting, has been used as a method of predicting leg power deficit in older adults. More recently, chair rise performance has been assessed in younger populations as a baseline for assessment of subsequent age-related declines in function and power. However, as rising from a chair repeatedly requires not only lower limb strength and power but also good balance and coordination, it may not be purely a measure of leg power, especially among these younger, well-functioning groups who have yet to experience age-related declines and deficits in function. The aim of this study was to assess whether chair rise performance can be considered a predictor of leg power, and hence of deficits in it, in men and women in mid-life. We assessed the relationship of chair rise performance with leg extensor power (LEP), measured using the Nottingham Power Rig (NPR), and with standing balance performance. LEP was measured in a clinic setting in a sub-sample of 81 men and 93 women from the MRC National Survey of Health and Development, a nationally representative cohort born in Britain in 1946. The time taken to rise from a chair 10 times and standing balance time were assessed during home visits at the same age. Increasing LEP was associated with better chair rise performance among those who completed 10 chair rises in ≥15 seconds, after adjustment for body size (p=0.008). Better standing balance performance was associated with better chair rise performance in men, but not women.
That LEP and standing balance are both related to chair rise time in men suggests that chair rise time should not be thought of purely as a proxy measure of leg power in middle-aged populations. This has implications for longitudinal studies which want to study age-related decline in chair rise performance.
closed_qa
Acute ankle sprain: is there a best support?
Acute lateral ankle sprain accounts for 85% of all sprains, being generally accepted as the most common sports-related ligamentous injury. There is a lack of consensus about the optimal management of these injuries despite their frequency. The time-honoured mantra of rest, ice, elevation and compression is still commonly used, even though the current evidence for compression is conflicting. A prospective randomized controlled clinical trial was carried out in the emergency department of a regional hospital in Ireland to compare outcomes, in terms of ankle function, pain improvement and return-to-work times, in adults presenting within 24 h of first-time acute lateral ankle sprain, among three external supports. We found no statistically significant differences among all three treatments in terms of ankle joint function, using the Karlsson ankle function scale, at 10- or 30-day follow-up. There was a tendency for Elastoplast bandaging to provide better average ankle function at both time points, when compared with double tubigrip and no support. Participants returned to work an average of 2 days earlier if treated with Elastoplast.
This study found no statistically significant difference in ankle function between double tubigrip bandage, Elastoplast bandage and no support at 10- or 30-day follow-up.
closed_qa
Tuberculosis risk before and after highly active antiretroviral therapy initiation: does HAART increase the short-term TB risk in a low incidence TB setting?
To evaluate the short-term and long-term effects of highly active antiretroviral therapy (HAART) on tuberculosis (TB) risk compared with risk without HAART in a low TB incidence setting. An observational cohort study among HIV-infected persons in care at the Comprehensive Care Center (Nashville, TN) between January 1998 and December 2008. A marginal structural model was used to estimate the effect of HAART on short-term (≤180 days) and long-term (>180 days) TB risk, with CD4⁺ lymphocyte count incorporated as a time-updated covariate. Of 4534 HIV-infected patients, 34 developed TB (165 per 100,000 person-years; 20,581 person-years of follow-up). Seventeen cases occurred among persons not on HAART or >30 days after HAART discontinuation (212 per 100,000 person-years; 8019 person-years of follow-up). Seventeen occurred among persons on HAART (135 per 100,000 person-years; 12,562 person-years of follow-up); 10 in the first 180 days (402 per 100,000 person-years; 2489 person-years of follow-up); and 7 after more than 180 days (69 per 100,000 person-years; 10,073 person-years of follow-up). After adjusting for the most recent CD4⁺ lymphocyte count, the risk of TB in the first 180 days of HAART exposure relative to no HAART was 0.68 (0.14-3.22, P = 0.63).
In this low TB incidence setting, the TB rate in the first 180 days of HAART was almost twice as high as persons not on HAART. However, after adjusting for most recent CD4⁺ count, there was no significant difference in TB risk between these 2 groups. This suggests that low recent CD4⁺ lymphocyte count influences TB risk during the first 180 days of HAART.
closed_qa
Is there any association between TACSTD2, KIAA1253, Ku70 and mutant KRAS gene expression and clinical-pathological features of colorectal cancer?
Samples of tumor and normal tissue of patients surgically treated for colorectal cancer between July 2005 and July 2009 were stored in a tissue bank. These samples were studied with the technique of real-time polymerase chain reaction in respect to expression of the following genes: KRAS codon 12 mutation, TACSTD2, Ku70, and SERIN1. Tumor samples of 37 patients were studied. The mean age was 65.5 years. Twenty-one patients (56.8%) were male. Nine patients (24.3%) were classified as TNM stage I, 11 patients (29.8%) as TNM stage II, eight patients (21.6%) as TNM stage III and nine patients (24.3%) as TNM stage IV. The Ku70 expression in poorly-differentiated tumors is significantly higher than in well and moderately-differentiated tumors (2.76 vs. 1.13; p<0.05). SERIN1, TACSTD2 and KRAS codon 12 mutation are not associated with clinical-pathological characteristics of colorectal cancer.
Ku70 expression in poorly-differentiated tumors is significantly higher than in well and moderately-differentiated colorectal tumors.
closed_qa
Do brain activation changes persist in athletes with a history of multiple concussions who are asymptomatic?
To evaluate brain activation patterns of asymptomatic athletes with a history of two or more concussions. A paired case-control design was used to evaluate brain activation patterns during cognitive performance in 14 athletes with a history of two or more concussions and 14 age- and sex-matched controls with no previous concussion. Percentage Blood-Oxygen-Level-Dependent (BOLD) change during an N-back working memory task was assessed in all participants. Performance on the Trail-Making Test Form A and B, Symbol-Digit Modalities Test and the Immediate Post-concussion Assessment and Cognitive Test (ImPACT) was also compared between groups. As expected, brain regions activated during the performance of the N-back were equivalent between groups. The groups performed similarly on the neurocognitive measures. The history of concussion group was less accurate than controls on the 1-, 2- and 3-back conditions of the N-back.
Following the complete resolution of symptoms, a history of two or more concussions is not associated with changes in regional brain activation during the performance of working memory task. Compensatory brain activation may only persist during the typically brief time athletes experience symptoms following concussion.
closed_qa
Effects on outcomes of heart rate reduction by ivabradine in patients with congestive heart failure: is there an influence of beta-blocker dose?
This study used the SHIFT (Systolic Heart failure treatment with the I(f) inhibitor ivabradine Trial) database to assess the impact of background beta-blocker dose on response to ivabradine. In systolic heart failure, reduction in relatively high heart rates improves clinical outcomes when achieved with beta-blockers and even more so when the sinus node inhibitor ivabradine also is added. Among patients with systolic heart failure, sinus rhythm, and heart rate ≥70 beats/min on recommended background therapy, maximally tolerated beta-blocker doses were subgrouped as no beta-blocker, <25%, 25% to <50%, 50% to <100%, and 100% of European Society of Cardiology–suggested target doses. The impact of ivabradine on cardiovascular death or heart failure hospitalization (primary endpoint) was analyzed in each subgroup as time-to-first event using Cox models adjusted for heart rate. The statistical models assessed heterogeneity and trend of the treatment effect across subgroups, and an additional analysis was made adjusting for the interaction of randomized treatment with baseline heart rate. The primary endpoint and heart failure hospitalizations were significantly reduced by ivabradine in all subgroups with <50% of target beta-blocker dose, including no beta-blocker (p = 0.012). Despite an apparent trend to reduction in treatment-effect magnitude with increasing beta-blocker dose, no variation in treatment effect was seen in general heterogeneity interaction tests (p = 0.35). Across beta-blocker subgroups, treatment effect was borderline nonsignificant only for the primary endpoint (p = 0.056), and significance was further lost after adjusting for interaction between baseline heart rate and ivabradine effect (p = 0.14).
The magnitude of heart rate reduction by beta-blocker plus ivabradine, rather than background beta-blocker dose, primarily determines subsequent effect on outcomes. (Effects of ivabradine on cardiovascular events in patients with moderate to severe chronic heart failure and left ventricular systolic dysfunction. A three-year randomised double-blind placebo-controlled international multicentre study; ISRCTN70429960)
closed_qa
Is abdominal compression useful in lung stereotactic body radiation therapy?
To determine the usefulness of abdominal compression in lung stereotactic body radiation therapy (SBRT) depending on lobe tumor location. Twenty-seven non-small cell lung cancer patients were immobilized in the Stereotactic Body Frame™ (Elekta). Eighteen tumors were located in an upper lobe, one in the middle lobe and nine in a lower lobe (one patient had two lesions). All patients underwent two four-dimensional computed tomography (4DCT) scans, with and without abdominal compression. Three-dimensional tumor motion amplitude was determined using manual landmark annotation. We also determined the internal target volume (ITV) and the influence of abdominal compression on lung dose-volume histograms. The mean reduction of tumor motion amplitude was 3.5 mm (p = 0.009) for lower lobe tumors and 0.8 mm (p = 0.026) for upper/middle lobe locations. Compression increased tumor motion in 5 cases. Mean ITV reduction was 3.6 cm(3) (p = 0.039) for lower lobe and 0.2 cm(3) (p = 0.048) for upper/middle lobe lesions. Dosimetric gain of the compression for lung sparing was not clinically relevant.
The most significant impact of abdominal compression was obtained in patients with lower lobe tumors. However, minor or negative effects of compression were reported for other patients and lung sparing was not substantially improved. At our institute, patients with upper or middle lobe lesions are now systematically treated without compression and the usefulness of compression for lower lobe tumors is evaluated on an individual basis.
closed_qa
Are quality improvement methods a fashion for hospitals in Taiwan?
This study reviews the rise and fall of the quality improvement (QI) methods implemented by hospitals in Taiwan, and examines the factors related to these methods. Cross-sectional, questionnaire-based survey. One hundred and thirty-nine district teaching hospitals, regional hospitals and medical centers. Directors or the persons in charge of implementing QI methods. Intervention(s): none. Main outcome measure(s): breadth and depth of the 18 QI methods. Seventy-two hospitals responded to the survey, giving a response rate of 52%. In terms of breadth based on the hospitals' self-reporting, the average number of QI methods adopted per hospital was 11.78 (range: 7-17). More than 80% of the surveyed hospitals had implemented eight QI methods, and >50% had implemented five QI methods. The QI methods adopted by over 80% of the surveyed hospitals had been implemented for a period of ∼7 years. On the basis of the authors' classification, seven of the eight QI methods (except for the QI team in total quality management) had an implementation depth of almost 70% or higher in the surveyed hospitals.
This study provides a snapshot of the QI methods implemented by hospitals in Taiwan. The results show that the average breadth of the QI methods adopted was 11.78; however, only 8.83 were implemented deeply. The hospitals' accreditation level was associated with the breadth and depth of QI method implementation.
closed_qa
Does stimulant pretreatment modify atomoxetine effects on core symptoms of ADHD in children assessed by quantitative measurement technology?
To compare the reduction of ADHD symptoms under atomoxetine (ATX) in patients with and without pretreatment with a stimulant medication using a computer-based Continuous Performance Test (cb-CPT) combined with an infrared motion tracking (MT) device. Double-blind, placebo-controlled study in ADHD patients (6-12 years) treated with ATX (target dose = 1.2 mg/kg per day). The cb-CPT/MT scores were analyzed using ANCOVA (last observation carried forward). Patient data (n = 125) suggested a differential ATX treatment effect between pretreated and stimulant-naïve patients in terms of three cb-CPT/MT parameters.
This secondary analysis provided evidence that ATX reduced ADHD symptom severity measured by cb-CPT/MT parameters regardless of stimulant pretreatment. A few differential effects were seen based on the cb-CPT/MT. However, no clear pattern could be identified and, overall, the observed differences have no larger clinical relevance. The ATX effect in this study seemed to be largely independent of any previous exposure to stimulants.
closed_qa
Risk screening for ADHD in a college population: is there a relationship with academic performance?
The present study examines the relationship between self-reported levels of ADHD and academic outcomes, as well as aptitude. A total of 523 college students took the Adult Self-Report Scale-Version 1.1 (ASRS-V1.1), and their scores were compared with course performance and ACT (American College Test) composite scores. The measure identified 70 students (13.4%) as being in the "highly likely" category for an ADHD diagnosis. Course exam and ACT scores for the 70 "highly likely" students were statistically identical to the remaining 453 students in the sample and the 77 students identified as "highly unlikely" as well. Only 4 of the "highly likely" 70 students were registered with the university's Office of Student Disability Services as having been diagnosed with ADHD.
The ASRS-V1.1 failed to discriminate academic performance and aptitude differences between ADHD "highly likely" and "highly unlikely" individuals. The use of self-report screeners of ADHD is questioned in contexts relating ADHD to academic performance.
closed_qa
Do pharmacokinetics explain persistent restenosis inhibition by a single dose of paclitaxel?
The purpose of this study was to investigate the elimination of paclitaxel from the arterial wall after a single short administration with a coated balloon. Slightly oversized paclitaxel-coated balloons (dose 3 or 9 μg/mm(2)) without or with premounted stents were inflated in nonatherosclerotic coronary arteries of either young domestic pigs or adult Goettingen minipigs. The paclitaxel content of plasma, arterial segments, and residual hearts (without treated arteries) was measured for up to 180 days using high-performance liquid chromatography/ultraviolet detection or mass spectrometry. Angiograms were evaluated for lumen narrowing. The paclitaxel concentration in plasma remained <10 ng/mL. In arteries of domestic pigs and minipigs treated with paclitaxel-coated balloons with premounted stents, 10%±5% or 21%±8% of dose, respectively, was initially detected and decreased to 3.5%±3.1% of dose (domestic pig) by Day 7. Within 6 months it fell with a half-life of 1.9 months to 0.40%±0.35%. After 3 months the concentration in the arterial wall was 17±11 μmol/L. Without a stent, drug transfer to the vessel wall was somewhat reduced and elimination faster. Immediately after treatment up to 26%±4% of dose was detected in the residual whole hearts. It dropped with a half-life of 45 days to 1.5%±0.6% of dose (0.3 μmol/L) within 6 months.
After a single local administration with coated balloons, paclitaxel stays in the vessel wall of pigs long enough to explain persistent inhibition of neointimal proliferation. The pharmacokinetics of paclitaxel does, however, not exclude other reasons for sustained efficacy such as early blocking of processes initiating excessive and prolonged neointimal proliferation.
closed_qa
Surgical treatment of left main disease and severe carotid stenosis: does the off-pump technique provide a better outcome?
Left main disease (LMD), combined with carotid artery stenosis (CAS), constitutes a high-risk patient population. Priority is often given to coronary revascularization, due to the severity of the angina. However, the choice of revascularization strategy [off-pump coronary artery bypass (OPCAB) vs coronary artery bypass grafting (CABG)] remains elusive. A total of 1340 patients with LMD were non-randomly assigned to either on-pump (CABG group, n = 680) or off-pump (OPCAB group, n = 634) revascularization between 1 January 2006 and 21 September 2010. Multivariable regression was used to determine the risk-adjusted impact of a revascularization strategy on a composite in-hospital outcome (MACCE), and proportional hazards regression was used to define the variables affecting long-term survival. Significant CAS was found in 130 patients: 84 (13.1%) patients underwent OPCAB, while 46 patients (6.8%) underwent CABG (P<0.05). Patients with a history of stroke/transient ischaemic attack were also more likely to receive OPCAB (7.1 vs 4.7%; P = 0.08). OPCAB patients were older, in a higher New York Heart Association (NYHA) class, with a lower LVEF and higher EuroSCORE. A calcified aorta was found in 79 patients [OPCAB-CABG: 49 (7.73%) vs 30 (4.41%); P = 0.016] and resulted in a less complex revascularization (OPCAB-CABG: 2.3 ± 0.71 vs 3.19 ± 0.82; P<0.05). Thirty-day mortality was insignificantly higher in the CABG group (2.7 vs 2.8%), as was MACCE (11.2 vs 12.2%; P = NS). This trend reversed when late mortality was evaluated; however, it did not reach significance at 60 months. Preoperative renal impairment requiring dialysis was found to be a technique-independent predictor of MACCE. The number of arterial conduits also influenced MACCE.
Off-pump coronary revascularization may offer risk reduction of neurological complications in patients with a significant carotid artery disease and a history of previous stroke, but a larger study population is needed to support this thesis. The growing discrepancy in long-term survival should draw attention to a more complete revascularization in OPCAB patients.
closed_qa
Sleep apnea as a comorbidity in obese psoriasis patients: a cross-sectional study. Do psoriasis characteristics and metabolic parameters play a role?
Psoriasis is associated with a variety of comorbidities such as obesity and cardiovascular disease. In a cross-sectional study, we explored whether obstructive sleep apnea and hypopnea syndrome (OSAHS) is associated with psoriasis characteristics and metabolic parameters. Thirty-five patients with chronic plaque psoriasis underwent a nocturnal polysomnography study and were analysed for the Apnoea-Hypopnoea Index to assess OSAHS severity and the Framingham score to predict the absolute risk of coronary artery disease at 10 years. The association of OSAHS with psoriasis was examined according to psoriasis characteristics (PASI and DLQI scores, disease duration and previous use of systemic treatments), metabolic parameters (Body Mass Index - BMI, waist to hip ratio - WHR, lipid profile) and other comorbidities (obesity, hypertension, arthritis and cardiovascular disease). There was no correlation between psoriasis characteristics and OSAHS. Psoriasis patients with OSAHS presented more frequent snoring and lower sleep quality compared with those without OSAHS. In univariate analyses, OSAHS was associated with increased BMI and hypertension in psoriasis patients. In multivariable logistic regression models, there was statistically significant evidence that only BMI and hypertension were associated with increased risk of OSAHS, adjusting for psoriasis characteristics, age and gender. Presence of metabolic syndrome, WHR, and smoking were not significant risk factors for OSAHS. In subgroup analyses, OSAHS correlated with duration of psoriasis (>8 years) in women (P = 0.021) and with the Framingham score in men (P = 0.035).
OSAHS may be a comorbidity in obese psoriasis patients with hypertension. Treatment with continuous positive airway pressure and weight loss interventions should be initiated.
closed_qa
Plasma YKL-40: a potential biomarker for psoriatic arthritis?
Plasma YKL-40 is an inflammatory biomarker. No useful biomarker exists in patients with psoriasis or psoriatic arthritis. To measure YKL-40 and high-sensitivity C-reactive protein (hs-CRP) in patients with psoriasis or psoriatic arthritis before and during treatment. In 48 patients with psoriasis, we measured YKL-40, hs-CRP and Psoriasis Area and Severity Index (PASI) at inclusion and in a subgroup of 14 patients, we repeated the measurements after four to six weeks of methotrexate treatment. In 42 patients with psoriatic arthritis, we measured YKL-40 and hs-CRP at inclusion and during 48 weeks of adalimumab treatment. The patients with psoriatic arthritis were divided into responders and non-responders. In patients with psoriasis, the baseline median PASI score was 10.8 and baseline YKL-40 was 45 μg/L. Seventeen per cent had elevated plasma YKL-40 compared with healthy subjects. Baseline PASI and YKL-40 were not correlated (rho = 0.14, P = 0.347) and YKL-40 and hs-CRP remained unchanged after treatment. In patients with psoriatic arthritis, the median pretreatment YKL-40 was 112 μg/L and 43% had elevated YKL-40. YKL-40 decreased in 33 patients who responded to adalimumab (from 112 μg/L to 68 at 48 weeks, P = 0.007). Hs-CRP decreased (from 4.65 mg/L to 0.91, P = 0.013) in the responders. In the non-responders (n = 9), YKL-40 and hs-CRP remained unchanged.
YKL-40 is elevated in many patients with psoriatic arthritis, but not in patients with psoriasis. YKL-40 decreased in patients with psoriatic arthritis who responded to treatment. YKL-40 may be a useful biomarker to monitor the effect of treatment with tumour necrosis factor-α inhibitors in patients with psoriatic arthritis.
closed_qa
Neoadjuvant therapy and liver transplantation for hilar cholangiocarcinoma: is pretreatment pathological confirmation of diagnosis necessary?
Neoadjuvant chemoradiotherapy followed by operative staging and liver transplantation is an effective treatment for patients with unresectable hilar cholangiocarcinoma (CCA) and CCA arising in the setting of primary sclerosing cholangitis (PSC). Pathologic confirmation of CCA is notoriously difficult, and many patients have been treated based on clinical criteria without pathological confirmation. We reviewed our experience with the specific aim of determining the need for pathological confirmation of CCA before treatment. Two hundred and fifteen patients received neoadjuvant therapy between 1992 and 2011. One hundred and eighty-two patients underwent operative staging and 38 (21%) had findings that precluded transplantation. Pathological confirmation of CCA before therapy was achieved in 45 of 87 (52%) PSC patients and 22 of 49 (45%) de novo patients who underwent transplantation. Pretreatment pathological confirmation was associated with significantly worse 5-year survival after start of therapy for PSC patients (50% vs 80%; p = 0.001), but not for de novo patients (39% vs 48%; p = 0.27). Pretreatment pathological confirmation was associated with worse 5-year survival after transplantation for PSC patients (66% vs 92%; p = 0.01), but not for de novo patients (63% vs 65%; p = 0.71). The difference in the PSC patients was not due to recurrent cancer. Absence of pretreatment pathological confirmation did not result in less detection of residual CCA in the explanted livers or in less recurrence after transplantation.
Rates of residual CCA in liver explants and recurrences after transplantation are comparable for patients with and without pretreatment pathological confirmation of CCA and attest to the accuracy of clinical diagnostic criteria. Pretreatment pathological confirmation of CCA is desirable but should not be a requirement for treatment.
closed_qa
Is renal thrombotic angiopathy an emerging problem in the treatment of ovarian cancer recurrences?
Ovarian cancer is usually diagnosed at an advanced stage, with most patients undergoing surgery followed by platinum- and taxane-based chemotherapy. After initial clinical remission, the majority recur, leading to additional treatments, including not only platinums and taxanes but also pegylated liposomal doxorubicin (PLD), gemcitabine, topotecan, and, more recently, bevacizumab, which may extend survival times. PLD, in particular, has been extensively studied by our group, with encouraging therapeutic results. We, however, observed instances of chronic kidney disease (CKD) developing among patients who received long-term treatment for recurrent ovarian cancer. To document the frequency and contributing factors to the emergence of CKD, we initiated a retrospective review at two institutions. Fifty-six consecutive patients with recurrent ovarian cancer receiving treatment at New York University Cancer Institute were reviewed for the presence of renal disease in 1997-2010. At Shaare Zedek Medical Center, 73 consecutive patients with ovarian cancer were reviewed in 2002-2010. Patients were diagnosed with CKD if they had an estimated GFR <60 mL/minute per 1.73 m² for >3 months and were staged according to the National Kidney Foundation guidelines. Thirteen patients (23%) developed stage ≥3 CKD. Three patients had renal biopsies performed that showed thrombotic microangiopathy.
CKD is emerging as a potential long-term consequence of current chemotherapy for recurrent ovarian cancer.
closed_qa
Squamous cell carcinoma of the oral cavity in nonsmoking women: a new and unusual complication of chemotherapy for recurrent ovarian cancer?
To describe occurrences of oral squamous cell carcinoma (SCC) in patients who had received long-term pegylated liposomal doxorubicin (PLD) for ovarian cancer. In our cohort of patients on maintenance PLD for ovarian and related mullerian epithelial malignancies, we encountered two patients with invasive SCC of the oral cavity (one of them multifocal) and one with high-grade squamous dysplasia. Review of patients at our institution receiving PLD for recurrent ovarian cancer identified three additional patients. The duration of treatment, cumulative PLD dose, human papillomavirus (HPV) positivity, BRCA status, stage at diagnosis, outcome, and other characteristics are reviewed. All five cases were nonsmokers with no known risk factors for HPV and four were negative for p16 expression. Four of the patients had known BRCA mutations whereas one tested negative. Cumulative doses of PLD were >1,600 mg/m² given over 30-132 months. Three had SCCs staged as T1N0 oral tongue, alveolar ridge (gingival), and multifocal oral mucosa; one had a T2N0 oral tongue; and one had dysplasia. After excision, two were given radiation but recurred shortly thereafter; the others remain well and have had no further exposure to cytotoxic drugs, including PLD.
Awareness of this possible long-term complication during PLD treatment should enhance the likelihood of early detection of oral lesions in these patients. Decisions to continue maintenance PLD after complete response of the original cancer should perhaps consider the benefits of delaying ovarian cancer recurrence versus the possible risk for a secondary cancer.
closed_qa
Transcutaneous electrical nerve stimulation: an effective treatment for refractory non-neurogenic overactive bladder syndrome?
To assess the effect of transcutaneous electrical nerve stimulation (TENS) for treating refractory overactive bladder syndrome (OAB). A consecutive series of 42 patients treated with TENS for refractory OAB was prospectively investigated at an academic tertiary referral centre. Effects were evaluated using bladder diary for at least 48 h and satisfaction assessment at baseline, after 12 weeks of TENS treatment, and at the last known follow-up. Adverse events related to TENS were also assessed. Mean age of the 42 patients (25 women, 17 men) was 48 years (range, 18-76). TENS was successful following 12 weeks of treatment in 21 (50 %) patients, and the positive effect was sustained during a mean follow-up of 21 months (range, 6-83 months) in 18 patients. Following 12 weeks of TENS treatment, mean number of voids per 24 h decreased significantly from 15 to 11 (p<0.001) and mean voided volume increased significantly from 160 to 230 mL (p<0.001). In addition, TENS completely restored continence in 7 (39 %) of the 18 incontinent patients. Before TENS, all 42 patients were dissatisfied or very dissatisfied; following 12 weeks of TENS treatment, 21 (50 %) patients felt satisfied or very satisfied (p<0.001). No adverse events related to TENS were noted.
TENS seems to be an effective and safe treatment for refractory OAB warranting randomized, placebo-controlled trials.
closed_qa
Does good clinical practice at the primary care improve the outcome care for diabetic patients?
The Middle East region is predicted to have one of the highest prevalences of diabetes mellitus (DM) in the world. This is the first study in the region to assess treatment outcome of DM according to gender. To assess the quality and effectiveness of diabetes care provided to patients attending primary care settings according to gender in the State of Qatar. It is an observational cohort study. The survey was carried out in primary health care (PHC) centers in the State of Qatar. The study was conducted from January 2010 to August 2010 among diabetic patients attending PHC centers. Of the 2334 registered with diagnosed diabetes, 1705 agreed and gave their consent to take part in this study, thus giving a response rate of 73.1%. Face-to-face interviews were conducted using a structured questionnaire including socio-demographic, clinical and satisfaction scores of the patients. The majority of subjects were diagnosed with type 2 DM (84.9%). A significantly larger proportion of females with DM were divorced or widowed (9.1%) in comparison to males with DM (3.4%; p<0.001). A significantly larger proportion of females were overweight (46.5%; p=0.009) and obese (29.5%; p=0.003) in comparison to males. Males reported significantly greater improvements in mean values of blood glucose (mmol/l) (-2.11 vs. -0.66; p=0.007), HbA1c (%) (-1.44 vs. -0.25; p=0.006), cholesterol (mmol/l) (-0.16 vs. 0.12; p=0.053) and systolic blood pressure (mmHg) (-9.04 vs. -6.62; p<0.001) in comparison to females. While there was a remarkable increase in male patients with normal-range fasting blood glucose (FBG; 51.6%) as compared to the FBG measurement 1 year before (28.5%; p<0.001), there was only a slight increase in females' normal-range FBG during this period, from 28.0% to 30.4% (p=0.357).
The present study revealed that the current form of PHC centers afforded to diabetic patients provided significantly improved outcomes for males, but only minor improved outcomes for females. This study reinforces calls for a gender-specific approach to diabetes care.
closed_qa
Does body mass index impact the number of LNs harvested and influence long-term survival rate in patients with stage III colon cancer?
The aim of this study is to evaluate whether different body mass index (BMI) values affect lymph node (LN) retrieval and whether such variations influence long-term survival in Asian patients. From January 1995 to July 2003, 645 stage III colon cancer patients were enrolled in our study. Patients were stratified into four groups: Obese (BMI ≧ 27 kg/m(2)), overweight (24 ≤ BMI<27 kg/m(2)), normal (18.5 ≤ BMI<24 kg/m(2)), and underweight (BMI<18.5 kg/m(2)). Mean BMI in the cohort was 23.3 kg/m(2). Mean number of LNs harvested was 23.1, 19.5, 19.8 and 28.1 in the normal, overweight, obese and underweight groups, respectively. There was a significant difference in the mean number of LNs harvested when comparing the overweight and underweight groups to the normal group (p = 0.013 and p = 0.04, respectively). Females were overrepresented in the underweight group (p = 0.011), and patients who had proximal colon cancers were more frequently underweight (p = 0.018). The mean number of LNs harvested varied by cases of right hemicolectomy (p = 0.009) and proximal cancer location (p = 0.009) for different BMI groups. Multivariate analysis showed that underweight, proximal colon cancer, well- or moderately differentiated adenocarcinoma and stage IIIC cancer were significant variables for adequate LN recovery. BMI was not significantly associated with relapse-free survival (p = 0.523) or overall survival (p = 0.127).
BMI is associated with LN harvest but is not an independent variable in stage III colon cancer survival.
closed_qa
A prospective single-center study of sentinel lymph node detection in cervical carcinoma: is there a place in clinical practice?
To establish the accuracy of sentinel lymph node (SLN) detection in early cervical cancer. Sentinel lymph node detection was performed prospectively over a 6-year period in 86 women undergoing surgery for cervical carcinoma by the combined method (Tc-99m and methylene blue dye). Further ultrastaging was performed on a subgroup of 26 patients who had benign SLNs on initial routine histological examination. The SLN was detected in 84 (97.7%) of 86 women by the combined method. Blue dye uptake was not seen in 8 women (9.3%). Sentinel lymph nodes were detected bilaterally in 63 women (73.3%), and the external iliac region was the most common anatomic location (48.8%). The median SLN count was 3 nodes (range, 1-7). Of the 84 women with sentinel node detection, 65 also underwent bilateral pelvic lymph node dissection, and in none of these cases was a benign SLN associated with a malignant non-SLN (100% negative predictive value). The median non-SLN count for all patients was 19 nodes (range, 8-35). Eighteen patients underwent removal of the SLN without bilateral pelvic lymph node dissection. Nine women (10.5%) had positive lymph nodes on final histology. One patient had bulky pelvic nodes on preoperative imaging and underwent removal of the bulky malignant lymph nodes and a benign SLN on the contralateral side. This latter case confirms the unreliability of the SLN method with bulky nodes. The remaining 8 patients had positive SLNs with negative nonsentinel lymph nodes. Fifty-nine SLNs from 26 patients, which were benign on initial routine histology, underwent ultrastaging, but no further disease was identified. Four patients (5%) relapsed after a median follow-up of 28 months (range, 8-80 months).
Sentinel lymph node detection is an accurate and safe method in the assessment of nodal status in early cervical carcinoma.
closed_qa
Impact of lap-band size on weight loss: does gender matter?
Laparoscopic adjustable gastric band (LAGB) has gone through major design modifications to improve clinical endpoints and reduce complications. Little is known, however, about the effects of LAGB size on clinical outcomes, or whether outcomes differ based on gender. We set out to examine the impact of band size on surgical weight loss, reoperations, comorbidity resolution, and compare outcomes within gender. We reviewed our prospectively collected longitudinal bariatric database between 2008 and 2010, and compared patients with BMI 35-50 kg/m(2) who had undergone LAGB with the LAP-BAND® APS to those who had the larger APL. Those patients with initial BMI > 50 kg/m(2) were excluded to reduce any possible selection bias which favors larger band use in such subjects. Three hundred ninety-four patients met our inclusion criteria; 230 (58 %) in the APS group and 164 (42 %) in the APL group. Female patients in APS group experienced significantly higher percentage excess body weight loss at 6 months, 1 year, and 2 years in comparison to female patients in APL group (p < 0.001 for all time points). In contrast, a reverse pattern was observed for male patients. No significant differences were observed between the groups regarding frequency of band adjustments, complications, or comorbidity resolution.
Male patients might benefit from APL bands, in contrast to female patients who appear to experience superior weight loss with the smaller APS bands. This study provides the first set of evidence to facilitate surgical decision making for band size selection and highlights differences between genders.
closed_qa
Is the expression of Transforming Growth Factor-Beta1 after fracture of long bones solely influenced by the healing process?
Circulating TGF-β1 levels were found to be a predictor of delayed bone healing and non-union. We therefore aimed to investigate some factors that can influence the expression of TGF-β1. The correlation between the expression of TGF-β1 and the different socio-demographic parameters was analysed. Fifty-one patients with long bone fractures were included in the study and divided into different groups according to their age, gender, cigarette smoking status, diabetes mellitus and regular alcohol intake. TGF-β1 levels were analysed in patients' serum and the different groups were retrospectively compared. Significantly lower TGF-β1 serum concentrations were observed in non-smokers compared to smokers at week 8 after surgery. Significantly higher concentrations were found in male patients compared to females at week 24. Younger patients had significantly higher concentrations at week 24 after surgery compared to older patients. Concentrations were significantly higher in patients without diabetes compared to those with diabetes at six weeks after surgery. Patients with chronic alcohol abuse had significantly higher concentrations compared to those patients without chronic alcohol abuse.
TGF-β1 serum concentrations vary depending upon smoking status, age, gender, diabetes mellitus and chronic alcohol abuse at different times and therefore do not seem to be a reliable predictive marker as a single-point-in-time measurement for fracture healing.
closed_qa
Is there subclinical enthesitis in early psoriatic arthritis?
Enthesitis is a recognized feature of spondylarthritides (SpA), including psoriatic arthritis (PsA). Previously, ultrasound imaging has highlighted the presence of subclinical enthesitis in established SpA, but there are few data on ultrasound findings in early PsA. The aim of our study was to compare ultrasound and clinical examination (CE) for the detection of entheseal abnormalities in an early PsA cohort. Forty-two patients with new-onset PsA and 10 control subjects underwent CE of entheses for tenderness and swelling, as well as gray-scale (GS) and power Doppler (PD) ultrasound of a standard set of entheses. Bilateral elbow lateral epicondyles, Achilles tendons, and plantar fascia were assessed by both CE and ultrasound, the latter scored using a semiquantitative (SQ) scale. Inferior patellar tendons were assessed by ultrasound alone. A GS SQ score of >1 and/or a PD score of >0 was used to describe significant ultrasound entheseal abnormality. A total of 24 (57.1%) of 42 patients in the PsA group and 0 (0%) of 10 controls had clinical evidence of at least 1 tender enthesis. In the PsA group, for sites assessed by both CE and ultrasound, 4% (7 of 177) of nontender entheses had a GS score >1 and/or a PD score >0 compared to 24% (9 of 37) of tender entheses. CE overestimated activity in 28 (13%) of 214 entheses. All the nontender ultrasound-abnormal entheses were in the lower extremity.
The prevalence of subclinical enthesitis in this early PsA cohort was low. CE may overestimate active enthesitis. The few subclinically inflamed entheses were in the lower extremity, where mechanical stress is likely to be more significant.
closed_qa
End-stage renal disease and critical limb ischemia: a deadly combination?
This study was planned to evaluate the prognostic impact of end-stage renal disease (ESRD) in patients with critical leg ischemia (CLI) undergoing infrainguinal revascularization. A total of 1425 patients who underwent infrainguinal revascularization for CLI were the subjects of the present analysis. Ninety-five patients had ESRD (eGFR<15 ml/min/m²), and of them 66 (70%) underwent percutaneous transluminal angioplasty and 29 (30%) underwent bypass surgery. ESRD patients had significantly lower overall survival (at 3-year, 27.1% vs. 59.7%, p<0.0001), leg salvage (at 3-year, 57.7% vs. 83.0%, p<0.0001), and amputation-free survival (at 3-year, 16.2% vs. 52.9%, p<0.0001) than patients with no or less severe renal failure. The difference in survival was even greater between 86 one-to-one propensity-matched pairs (at 3-year, 23.1% vs. 67.3%, p<0.0001). ESRD was an independent predictor of all-cause mortality (RR 2.46, 95%CI 1.85-3.26). Logistic regression showed that age ≥ 75 years was the only independent predictor of 1-year all-cause mortality (OR 4.92, 95%CI 1.32-18.36). Classification and regression tree analysis showed that age ≥ 75 years and, among younger patients, bypass surgery for leg ulcer and gangrene were associated with significantly higher 1-year mortality.
Lower limb revascularization in patients with CLI and end-stage renal failure is associated with favourable leg salvage. However, these patients have a very poor survival and this may jeopardize any attempt of revascularization. Further studies are needed to identify ESRD patients with acceptable life expectancy and who may benefit from lower limb revascularization.
closed_qa
Pulmonary embolism diagnosis and mortality with pulmonary CT angiography versus ventilation-perfusion scintigraphy: evidence of overdiagnosis with CT?
The purposes of this study were to determine whether pulmonary emboli diagnosed with pulmonary CT angiography (CTA) represent a milder disease spectrum than those diagnosed with ventilation-perfusion (V/Q) scintigraphy, to determine the trends in incidence and mortality among patients with the diagnosis of pulmonary embolism from 2000 to 2007, and to correlate incidence and mortality trends with imaging modality trends. Diagnoses of pulmonary embolism from 2000 to 2007 at an urban academic medical center were retrospectively identified. Patient data were collected from the hospital database and the Social Security Death Index. Incident diagnoses, type of imaging used, and date of death were documented. Bivariate and multivariate analyses were used to explore the relations between imaging use and the incidence and mortality of pulmonary embolism. Logistic regression analysis was used to estimate the odds of death of pulmonary embolism diagnosed with pulmonary CTA versus V/Q scintigraphy. The cases of 2087 patients (1361 women, 726 men; mean age, 61.8 years) with pulmonary embolism were identified. From 2000 to 2007 the incidence of pulmonary embolism increased from 0.69 to 0.91 per 100 admissions in strong correlation with increased use of pulmonary CTA. There was no change in mortality, but the case-fatality rate decreased from 5.7% to 3.3%. On average, pulmonary emboli diagnosed with pulmonary CTA were one half as lethal as those diagnosed with V/Q scintigraphy (odds ratio, 0.538; 95% CI, 0.314-0.921).
The results of this study are evidence that the shift in imaging from V/Q scintigraphy to pulmonary CTA resulted in increased diagnosis of a less fatal spectrum of pulmonary embolic disease, raising the possibility of overdiagnosis. Outcome-based clinical trials with long-term follow-up would be helpful to further guide management.
closed_qa
Can sarcoidosis and metastatic bone lesions be reliably differentiated on routine MRI?
Sarcoidosis lesions revealed on MRI in the axial skeleton and long bones resemble osseous metastases, which can lead to a potentially significant misdiagnosis. We hypothesized that osseous sarcoidosis lesions could be differentiated from osseous metastases on MRI and sought to propose and evaluate features distinguishing these entities. MR images obtained at 1.5 T of 34 subjects (22 with osseous metastatic disease, 12 with osseous sarcoidosis) with 79 single or multiple bone lesions (40 metastatic, 39 sarcoidal) were reviewed independently by two blinded, experienced musculoskeletal radiologists. Fluid-sensitive and T1-weighted images were viewed separately. Proposed discriminating features were peri- or intralesional fat, specified border characteristics, and the presence of an extraosseous soft-tissue mass. An additional feature for spinal lesions was posterior element involvement. On the basis of these criteria, the readers provided a binary diagnosis and confidence score. The overall sensitivity for both readers was 46.3% and specificity, 97.4%. T1-weighted images were associated with higher sensitivity than T2-weighted images (59.0% vs 34.1%, respectively; p = 0.025) and with comparable specificity (97.6% vs 97.2%, p = 0.91). Diagnostic accuracy was higher using the discriminators of a mass or posterior element involvement for metastasis (83.3%) than border characteristics (68.0%) or lesion fat (65.0%) for sarcoidosis; the latter two features provided near 100% specificity but poor sensitivity (14.3% and 0%, respectively). Readers reported higher confidence diagnosing osseous sarcoidosis lesions than metastatic lesions, with a trend for higher confidence with T1-weighted images (p = 0.088).
Osseous sarcoidosis lesions cannot be reliably distinguished from metastatic lesions on routine MRI studies by readers experienced in evaluating these lesions.
closed_qa
Can opposite clear corneal incisions have a role with post-laser in situ keratomileusis astigmatism?
To evaluate the astigmatic correcting effect of paired opposite clear corneal incisions (OCCIs) on the steep axis in patients with residual astigmatism after laser in situ keratomileusis (LASIK). Thirty-one eyes of 24 patients with a mean age of 28.4 years ±2.46 (range, 19-36 years) were recruited for the study. Inclusion criteria included residual astigmatism of ≥1.5 diopter (D) after LASIK with inadequate residual stromal bed thickness that precluded ablation. The cohort was divided into two groups: group I (with astigmatism ranging from -1.5 D to -2.5 D) and group II (with astigmatism >-2.5 D). The steep axis was marked prior to surgery. Paired three-step self-sealing opposite clear corneal incisions were performed 1 mm anterior to the limbus on the steep axis with a 3.2-mm keratome for group I and 4.1 mm for group II. Patients were examined 1 day, 1 week, 1 month, 3 months and 6 months postoperatively. Visual acuity, refraction, keratometry, and corneal topography were evaluated preoperatively and postoperatively. Analysis of the difference between groups was performed with the Student t-test. P<0.05 was considered statistically significant. The mean uncorrected visual acuity (UCVA) improved from 0.35±0.13 (range, 0.1-0.6) to 0.78±0.19 (range, 0.5-1) in group I and from 0.26±0.19 (range, 0.1-0.5) to 0.7±0.18 (range, 0.4-1) in group II. The increase in UCVA was statistically significant in both groups (P=0.001, both cases). The mean preoperative and postoperative keratometric astigmatism in group I was 2.0±0.48 D (range, 1.5-2.5 D) and 0.8±0.37 D (range, 0.1-1.4 D), respectively. The decrease in keratometric astigmatism was highly statistically significant in group II (P=0.001). Mean surgically induced astigmatic reduction by vector analysis was 1.47±0.85 D and 2.21±0.97 D in groups I and II, respectively. There were no incision-related complications.
Paired OCCIs were predictable and effective in correcting post-LASIK astigmatism and required no extra surgical skill or expensive instruments. OCCIs are especially useful in eyes with insufficient corneal thickness for LASIK retreatment.
closed_qa
Can passengers' active head tilt decrease the severity of carsickness?
We investigated the effect of the passenger head-tilt strategy on the severity of carsickness in lateral acceleration situations in automobiles. It is well known that the driver is generally less susceptible to carsickness than are the passengers. However, it is also known that the driver tilts his or her head toward the curve center when negotiating a curve, whereas the passenger's head moves in the opposite direction. Therefore, we hypothesized that the head-tilt strategy has the effect of reducing the severity of carsickness. A passenger car was driven on a quasi-oval track with a pylon slalom while the participant sat in the navigator seat. The experiment was terminated when either the participant felt the initial symptoms of motion sickness or the car finished 20 laps. In the natural head-tilt condition, the participants were instructed to sit naturally, to relax, and not to oppose the lateral acceleration intentionally. In the active head-tilt condition, the participants were asked to tilt their heads against the centrifugal acceleration, thus imitating the driver's head tilt. The number of laps achieved in the active condition was significantly greater than that in the natural condition. In addition, the subjective ratings of motion sickness and symptoms in the active condition were significantly lower than those in the natural condition.
We suggest that an active head tilt against centrifugal acceleration reduces the severity of motion sickness.
closed_qa
Duodenal bulb biopsies for diagnosing adult celiac disease: is there an optimal biopsy site?
Recent studies highlight the role of duodenal bulb biopsy in the diagnosis of celiac disease. To determine whether a targeted duodenal bulb biopsy in addition to distal duodenal biopsies is the optimal strategy to identify villous atrophy. This was a prospective cohort study at a tertiary-care referral center. Seventy-seven patients undergoing clinically indicated EGD with duodenal biopsies were recruited. Of these, 28 had newly diagnosed celiac disease and 49 were controls. At endoscopy, 8 duodenal biopsy specimens were taken: 4 from the second part of the duodenum and 4 quadrantically from the bulb (at the 3-, 6-, 9-, and 12-o'clock positions). The main outcome measurement was the increase in diagnostic yield and detection of the most severe villous atrophy in celiac disease with the addition of a targeted duodenal bulb biopsy. The most severe degree of villous atrophy was detected when distal duodenal biopsy specimens were taken in addition to a duodenal bulb biopsy specimen from either the 9- or 12-o'clock position (96.4% sensitivity; 95% CI, 79.7%-100%). The difference between the 12-o'clock position biopsy and the 3-o'clock position biopsy in detecting the most severe villous atrophy was 92% (24/26) versus 65% (17/26) (P = .02). Limitations include the small sample and performance of the study in a tertiary referral center.
This study demonstrates the patchy appearance of villous atrophy that occurs within the duodenum. A targeted duodenal bulb biopsy from either the 9- or 12-o'clock position in addition to distal duodenal biopsies may improve diagnostic yields by detecting the most severe villous atrophy within the duodenum.
closed_qa
Interval colon cancer in a Lynch syndrome patient under annual colonoscopic surveillance: a case for advanced imaging techniques?
Lynch syndrome confers increased risk for various malignancies, including colorectal cancer. Colonoscopic surveillance programs have led to reduced incidence of colorectal cancer and reduced mortality from colorectal cancer. Colonoscopy every 1-2 years beginning at age 20-25, or 10 years earlier than the first diagnosis of colorectal cancer in a family, with annual colonoscopy after age 40, is the recommended management for mutation carriers. Screening programs have reduced colon cancer mortality, but interval cancers may occur. We describe a 48-year-old woman with Lynch syndrome who was found to have an adenoma with invasive colorectal cancer within one year after a normal colonoscopy.
Our patient illustrates two current concepts about Lynch syndrome: 1) adenomas are the cancer precursor and 2) such adenomas may be "aggressive," in the sense that the adenoma progresses more readily and more rapidly to carcinoma in this setting compared to usual colorectal adenomas. Our patient's resected tumor invaded only into submucosa and all lymph nodes were negative; in that sense, she represents a success for annual colonoscopic surveillance. Still, this case does raise the question of whether advanced imaging techniques are advisable for surveillance colonoscopy in these high-risk patients.
closed_qa
Is prevention of atopic eczema with hydrolyzed formulas cost-effective?
The German Infant Nutritional Intervention (GINI) trial, a prospective, randomized, double-blind intervention, enrolled children with a hereditary risk for atopy. When fed with certain hydrolyzed formulas for the first 4 months of life, the risk was reduced by 26-45% in PP and 8-29% in intention-to-treat (ITT) analyses compared with children fed with regular cow's milk at age 6. The objective was to assess the cost-effectiveness of feeding hydrolyzed formulas. Cost-effectiveness was assessed with a decision tree model programmed in TreeAge. Costs and effects over a 6-yr period were analyzed from the perspective of the German statutory health insurance (SHI) and a societal perspective at a 3% effective discount rate followed by sensitivity analyses. The extensively hydrolyzed casein formula would be the most cost-saving strategy with savings of 478 € per child treated in the ITT analysis (CI95%: 12 €; 852 €) and 979 € in the PP analysis (95%CI: 355 €; 1455 €) from a societal perspective. If prevented cases are considered, the partially whey hydrolyzed formula is cost-saving (ITT -5404 €, PP -6358 €). From an SHI perspective, the partially whey hydrolyzed formula is cost-effective, but may also be cost-saving depending on the scenario. An extensively hydrolyzed whey formula also included into the analysis was dominated in all analyses.
For the prevention of AE, two formulas can be cost-effective or even cost-saving. We recommend that SHI should reimburse formula feeding or at least the difference between costs for cow's milk formula and the most cost-effective formula.
closed_qa
Does the African-American-white mortality gap persist after playing professional basketball?
The African-American-white mortality gap for males in the United States is 6 years in favor of whites. Participation in professional sport may moderate this ethnic disparity. The historical cohort of professional basketball players, with nearly equal numbers of African-American and white players, provides a natural experiment that may control for the classic confounders of income, education, socioeconomic status (SES), and physical factors related to mortality. The objectives of this study are to assess mortality and calculate survival for the overall study population and within ethnicity. Data were combined from several publicly available sources. The cohort was analyzed to compare longevity among all players, and for players stratified by ethnicity, with the general U.S. population. The final dataset included 3366 individuals, of whom 56.0% were African American. Results suggest white players live 18 months longer than their African-American colleagues. African-American players live 9 years longer than their general-population referent and outlive white men in the general public. After controlling for covariates, we found that African-American players have a 75% increased risk of death compared with white players, a statistically significant gap (p<.0001, 95% confidence interval 1.41-2.44).
The African-American-white mortality gap for males is largely ameliorated (1.5 years vs. 6.1 years) in professional basketball but still persists.
closed_qa
Can erosions on MRI of the sacroiliac joints be reliably detected in patients with ankylosing spondylitis?
Erosions of the sacroiliac joints (SIJ) on pelvic radiographs of patients with ankylosing spondylitis (AS) are an important feature of the modified New York classification criteria. However, radiographic SIJ erosions are often difficult to identify. Recent studies have shown that erosions can also be detected on magnetic resonance imaging (MRI) of the SIJ early in the disease course, before they can be seen on radiography. The goals of this study were to assess the reproducibility of erosion and related features, namely, extended erosion (EE) and backfill (BF) of excavated erosion, in the SIJ using a standardized MRI methodology. Four readers independently assessed T1-weighted and short tau inversion recovery sequence (STIR) images of the SIJ from 30 AS patients and 30 controls (15 patients with non-specific back pain and 15 healthy volunteers) ≤ 45 years old. Erosions, EE, and BF were recorded according to standardized definitions. Reproducibility was assessed by percentage concordance among six possible reader pairs, kappa statistics (erosion as binary variable) and intraclass correlation coefficient (ICC) (erosion as sum score) for all readers jointly. SIJ erosions were detected in all AS patients and six controls by ≥ 2 readers. The median number of SIJ quadrants affected by erosion recorded by four readers in 30 AS patients was 8.6 in the iliac and 2.1 in the sacral joint portion (P<0.0001). For all 60 subjects and all four readers, the kappa value was 0.72 for erosion, 0.73 for EE, and 0.63 for BF; the corresponding ICC values were 0.79 for erosion, 0.72 for EE, and 0.55 for BF. For comparison, the kappa and ICC values for bone marrow edema were 0.61 and 0.93, respectively.
Erosions can be detected on MRI with a degree of reliability comparable to that of bone marrow edema, despite the significant heterogeneity of their appearance on MRI.
closed_qa
Is obesity at individual and national level associated with lower age at menarche?
A unique standardized international data set from adolescent girls in 34 countries in Europe and North America participating in the Health Behaviour in School-aged Children Study (HBSC) is used to investigate the contribution of body mass index (BMI) at individual and country level to cross-national differences in age at menarche. Two independent nationally representative survey data sets from 15-year-olds (n = 27,878, in 34 countries, year = 2005/2006) and 11-year-olds (n = 18,101, in 29 countries, year = 2001/2002) were analyzed. The survey instrument is a self-report questionnaire. Median age at menarche and 95% confidence intervals (CIs) were estimated using Kaplan-Meier analysis. Hierarchical models were used to assess the relationship between BMI and age at menarche (months). "Country-level obesity" was measured by prevalence of overweight/obesity (%) in each country. Country-level median age at menarche ranged between 12 years and 5 months and 13 years and 5 months. Country-level prevalence of overweight among 15-year-old girls ranged from 4% to 28%. Age at menarche was inversely associated with individual BMI (unstandardized regression coefficient beta = -1.01; 95% CI, -1.09 to -.94) and country-level aggregate overweight at age 11 (unstandardized regression coefficient beta = -.25; 95% CI, -.43 to -.08). Individual- and country-level measures of BMI account for 40% of the country-level variance in age at menarche.
The findings add to the evidence that obesity in childhood is a risk factor for early puberty in girls and accounts for much of the cross-national variation in age at menarche. Future HBSC surveys can track this relationship in the wake of the obesity "epidemic."
closed_qa
Should measures of patient experience in primary care be adjusted for case mix?
Uncertainties exist about when and how best to adjust performance measures for case mix. Our aims are to quantify the impact of case-mix adjustment on practice-level scores in a national survey of patient experience, to identify why and when it may be useful to adjust for case mix, and to discuss unresolved policy issues regarding the use of case-mix adjustment in performance measurement in health care. The design was a secondary analysis of the 2009 English General Practice Patient Survey, comprising responses from 2,163,456 patients registered with 8267 primary care practices. Linear mixed effects models were used with practice included as a random effect and five case-mix variables (gender, age, race/ethnicity, deprivation, and self-reported health) as fixed effects. The primary outcome was the impact of case-mix adjustment on practice-level means (adjusted minus unadjusted) and changes in practice percentile ranks for questions measuring patient experience in three domains of primary care (access, interpersonal care, and anticipatory care planning) and overall satisfaction with primary care services. Depending on the survey measure selected, case-mix adjustment changed the rank of between 0.4% and 29.8% of practices by more than 10 percentile points. Adjusting for case mix resulted in large increases in score for a small number of practices and small decreases in score for a larger number of practices. Practices with younger patients, more ethnic minority patients, and patients living in more socio-economically deprived areas were more likely to gain from case-mix adjustment. Age and race/ethnicity were the most influential adjustors.
While its effect is modest for most practices, case-mix adjustment corrects significant underestimation of scores for a small proportion of practices serving vulnerable patients and may reduce the risk that providers would 'cream-skim' by not enrolling patients from vulnerable socio-demographic groups.
closed_qa
Should orthotopic heart transplantation using marginal donors be limited to higher volume centers?
This study examined whether institutional volume impacts outcomes after orthotopic heart transplantation (OHT) utilizing marginal donors. Adult patients undergoing OHT with the use of marginal donors between 2000 and 2010 were identified in the United Network for Organ Sharing database. A previously derived and validated donor risk score (range, 1 to 15) was used to define marginal donors as those in the 90th percentile of risk (score≥7). Patients were stratified into equal-size tertiles based on overall institutional OHT volume. Posttransplant outcomes were compared between these center cohorts. A total of 3,176 OHTs utilizing marginal donors were identified. In Cox regression analysis, recipients undergoing OHT at low-volume centers were at significantly increased risk of 30-day (hazard ratio 1.82 [1.31 to 2.54], p<0.001), 1-year (hazard ratio 1.40 [1.14 to 1.73], p=0.002), and 5-year posttransplant mortality (hazard ratio 1.29 [1.10 to 1.52], p=0.02). These findings persisted after adjusting for recipient risk, differences in donor risk score, and year of transplantation (each p<0.05). In Kaplan-Meier analysis, there was a similar trend of decreasing 1-year survival with decreasing center volume: high (86.0%), intermediate (85.7%), and low (81.2%; log rank p=0.003). Drug-treated rejection within the first post-OHT year was more common in low-volume versus high-volume centers (34.3% versus 24.2%, p<0.001). At an overall mean follow-up of 3.4±2.9 years, low-volume centers also had higher incidences of death due to malignancy (2.8% versus 1.3%, p=0.01) or infection (6.2% versus 4.1%, p=0.02).
Consolidating the use of marginal donors to higher volume centers may be prudent in improving post-OHT outcomes in this higher risk patient subset.
closed_qa
Are thromboembolic and bleeding complications a drawback for composite aortic root replacement?
Valve-preserving aortic root reconstruction is being performed with increasing frequency. Independent of durability concerns, enthusiasm for retaining the native valve is often championed on the presumption that composite graft replacement of the aorta will be complicated by thromboembolism and bleeding. Our goal in this late follow-up study is to determine if thromboembolism or bleeding, or both, are indeed problematic after composite aortic root replacement. Between 1995 and 2011, 306 patients (mean age, 56±14 years) underwent composite graft replacement of the aorta. St. Jude mechanical valve conduits (St. Jude Medical, St Paul, MN) were used in 242 patients, and 64 received a biologic conduit. Long-term postoperative follow-up (mean, 56 months; range, 1 to 97 months) was performed through our Aortic Database, supplemented by patient interviews and use of the Social Security Death Index. Hospital mortality was 2.9% overall and 1.4% in the last 8 years. Kaplan-Meier curves showed freedom (±standard deviation) from bleeding, stroke, and distal embolism as 94.3%±1.7% at 5 years and 91.3%±2.4% at 10 years. Survival was 93.5%±1.8% at 5 years and 80.9%±4.6% at 10 years, which was not statistically different from that for an age- and sex-matched population in Connecticut. Freedom from reoperation of the aortic root was 99% at 10 years.
Patients had excellent survival and few thromboembolic and bleeding complications after composite aortic root replacement. These data supporting minimal morbidity in the setting of well-established durability should be used to put alternative procedures, such as valve-preserving aortic root reconstruction, into context.
closed_qa
Are treatments for vasovagal syncope effective?
Therapies used to treat vasovagal syncope (VVS) recurrence have not been proven effective in single studies. A comprehensive search of the PubMed, EMBASE and Cochrane Central databases for published trials was performed. Randomized or non-randomized studies, comparing the intervention of interest to control group(s), with the endpoint of spontaneous recurrence or syncope on head-up tilt test, were included. Data were extracted on an intention-to-treat basis. Study heterogeneity was analyzed by Cochran's Q statistics. A random-effect analysis was used. α-adrenergic agonists were found effective (n=400, OR 0.19, CI 0.06-0.62, p<0.05) in preventing VVS recurrence. β-blockers were not found to be effective when only randomized studies comparing β-blockers to non-pharmacologic agents were assessed (9 studies, n=583, OR 0.48, CI 0.22-1.04, p=0.06). Tilt-training had no effect when only randomized studies were considered (4 studies, n=298, OR 0.47, CI 0.21-1.05, p=0.07). Selective serotonin reuptake inhibitors were found effective (n=131, OR 0.28, CI 0.10-0.74, p<0.05), though the analysis contained only 2 studies. Pacemakers were found effective in preventing syncope recurrence when all studies were analyzed (n=463, OR 0.13, CI 0.05-0.36, p<0.05). However, studies comparing active pacemaker to sensing mode only did not show benefit (3 studies, n=162, OR 0.45, CI 0.09-2.14, p=0.32).
This meta-analysis highlights the totality of evidence for the medications commonly used to treat VVS, and the need for larger, double-blind, placebo-controlled trials with longer follow-up.
closed_qa
Are poor health behaviours in anxious and depressed cardiac patients explained by sociodemographic factors?
While there is evidence of poor health behaviours in anxious and depressed cardiac patients, it is possible that sociodemographic factors explain these associations. Few previous studies have adequately controlled for confounders. The present study investigated health behaviours in anxious and depressed cardiac patients, while accounting for sociodemographic confounders. A consecutive sample of 275 patients admitted to hospital after acute myocardial infarction (32%) or for coronary bypass surgery (40%) or percutaneous coronary intervention (28%) was interviewed six weeks after hospital discharge. Anxiety and depression were assessed using the Hospital Anxiety and Depression Scale (HADS). Smoking, physical activity, alcohol intake and dietary fat intake were assessed by self-report. Backward stepwise logistic regression was used to identify the factors independently associated with anxiety and depression. In total, 41 patients (15.2%) were 'depressed' (HADS-D ≥8) while 68 (25.2%) were 'anxious' (HADS-A ≥8). Depressed patients reported higher rates of smoking (χ2 = 4.47, p = 0.034), lower physical activity (F = 8.63, p < 0.004) and higher dietary fat intake (F = 7.22, p = 0.008) than non-depressed patients. Anxious patients reported higher smoking rates (χ2 = 5.70, p = 0.024) and dietary fat intake (F = 7.71, p = 0.006) than non-anxious patients. In multivariate analyses, an association with depression was retained for both diet and physical activity, and an association with anxiety was retained for diet. Low social support and younger age were significant confounders with depression and anxiety respectively.
While the high smoking rates evidenced in anxious and depressed patients were explained by sociodemographic factors, their poor diet and low physical activity (depressed patients only) were independent of these factors. Given the impact of lifestyle modification on survival after a cardiac event, anxious and depressed patients should be a priority for cardiac rehabilitation and other secondary prevention programmes.
closed_qa
Chemotherapy-related thrombocytosis: does it increase the risk of thromboembolism?
Chemotherapy increases the risk of thromboembolism in patients with cancer. Although thrombocytopenia is a known side effect of chemotherapy, reactive thrombocytosis related to chemotherapy is uncommonly reported. The present study aimed to determine the incidence of gemcitabine-related thrombocytosis and the associated risk of thromboembolism. Medical records of 250 consecutive patients with a malignant disease who received gemcitabine-based therapy were reviewed. A multivariate analysis was done to determine factors associated with thromboembolism. A total of 220 eligible patients with a median age of 63 years (range 26-83) were identified. Of these 220 patients, 95% had advanced malignancy and 59% had received prior chemotherapy. A total of 69% of patients received a platinum combination. In all, 46% of patients experienced thrombocytosis following chemotherapy, with a median platelet count of 632 × 10⁹/l (range 457-1,385). Twenty-three of the 220 patients experienced a vascular event within 6 weeks of treatment. Eleven patients with thrombocytosis experienced a vascular event compared with 10 patients without thrombocytosis (not significant). On multivariate analysis, leukocytosis (odds ratio 5.8, 95% confidence interval 2.1-15.8) and comorbid illnesses (odds ratio 4.1, 95% confidence interval 1.4-12.6) were correlated with thromboembolism.
Although gemcitabine-based therapy has been associated with an increased incidence of thrombocytosis, it does not increase the risk of thromboembolism in cancer patients. Leukocytosis and comorbid illnesses do increase the risk of thromboembolism.
closed_qa
Does risk-based coagulation screening predict intraventricular haemorrhage in extreme premature infants?
Intraventricular haemorrhage (IVH) continues to be a significant contributor to neonatal morbidity and mortality, especially in the extremely premature population (<26 weeks). The aims of the study were to test the hypothesis that risk-based coagulopathy screening could identify infants at risk of severe IVH/mortality, and whether preterm infants born at less than 26 weeks of gestation who received early (within first 48 h) fresh frozen plasma (FFP) had a lower incidence of IVH than those who did not. A chart review of preterm infants born at less than 26 weeks' gestation was conducted. The study compared two cohorts of infants who either had 'early' risk-based coagulopathy screening (within first 48 h, n = 47) or 'late' screening (n = 55). Baseline and clinical characteristics of the two cohorts were similar. 'Early' coagulopathy screening predicted infants at risk of severe IVH [relative risk (RR) 2.59, 95% confidence interval (CI) 1.18-5.67, P<0.01] but not mortality (RR 1.2, 95% CI 0.79-1.94). FFP was administered significantly more often in the 'early' screened cohort (P<0.001); however, the incidence of IVH was similar in those who received early FFP administration and those who did not.
'Early' risk-based coagulopathy screening may identify preterm infants at risk of severe IVH; however, the study failed to show any benefit of early treatment of a coagulopathy with FFP in a small but high-risk population.
closed_qa
Are names of children with attention deficit hyperactivity disorder more 'hyperactive'?
The role of the meaning of given names has been noted in psychotherapy as well as in everyday life. This study aimed to investigate the possible association between the nature of children's given names and attention deficit hyperactivity disorder (ADHD) diagnosis. A total of 134 given names of children and adolescent patients diagnosed as having ADHD were compared with those of an age- and gender-matched, randomly chosen control group from the general population. The first names of the two cohorts were compared with regard to the following: the literal meaning of their names, whether the name constitutes a verb, the prevalence of each name, and their length (number of syllables). The first names of children and adolescents with ADHD combined type were rated by referees as expressing significantly more activity and as containing fewer syllables than the names of controls. In addition, the prevalence of their names was significantly lower than that of names used in the general population. All findings remained significant following Bonferroni adjustment.
Our findings demonstrate an intriguing relationship between children's given names and ADHD diagnosis. Given names may serve as a possible predictor of later diagnosis of ADHD. Clinicians should be more attentive to given names in the context of child psychiatric evaluation and therapy.
closed_qa
Tricuspid valve repair: is ring annuloplasty superior?
Tricuspid regurgitation (TR) secondary to left heart disease is the most common aetiology of tricuspid valve (TV) insufficiency. Valve annuloplasty is the primary treatment for TV insufficiency. Several studies have shown the superiority of annuloplasty with a prosthetic ring over other repair techniques. We reviewed our experience with different surgical techniques for the treatment of acquired TV disease focusing on long-term survival and incidence of reoperation. A retrospective analysis of 717 consecutive patients who underwent TV surgery between 1975 and 2009 with either a ring annuloplasty [Group R: N = 433 (60%)] or a De Vega suture annuloplasty [Group NR: no ring; N = 255 (36%)]. Twenty-nine (4%) patients underwent other types of TV repair. A ring annuloplasty was performed predominantly in the late study period of 2000-09. TV aetiology was functional in 67% (479/717) of the patients. Ninety-one percent of the patients (n = 649) underwent concomitant coronary artery bypass grafting and/or mitral/aortic valve surgery. Patients who received a ring annuloplasty were older (67 ± 13 vs 60 ± 13 years; P<0.001). Overall 30-day mortality was 13.8% (n = 95) [Group R: n = 55 (12.7%) and Group NR: n = 40 (15.7%)]. Ten-year actuarial survival after TV repair with either the De Vega suture or ring annuloplasty was 39 ± 3 and 46 ± 7%, respectively (P = 0.01). Twenty-eight (4%) patients required a TV reoperation after 5.9 ± 5.1 years. Freedom from TV reoperation 10 years after repair with a De Vega annuloplasty was 87.9 ± 3% compared with 98.4 ± 1% after the ring annuloplasty (P = 0.034).
Patients who require TV surgery either as an isolated or a combined procedure constitute a high-risk group. The long-term survival is poor. Tricuspid valve repair with a ring annuloplasty is associated with improved survival and a lower reoperation rate than that with a suture annuloplasty.
closed_qa
Is lidocaine Bier's block safe?
To assess the safety profile of lidocaine Bier's block when compared with that of prilocaine. A retrospective audit of patients undergoing Bier's block using 0.5% lidocaine during a 27-month period (April 2008-June 2010) at the Royal United Hospital Bath emergency department. 416 patients with sufficient data were included in the study; 360 women and 56 men. The mean patient age was 65 years. Complications were reported in 39 cases; transient hypotension/vasovagal episodes and transient mild bradycardia were most frequent. No patients required any medical intervention. There was no occurrence of anaphylaxis, convulsion, hypotensive episodes requiring medical intervention, collapse or death.
No clinically significant morbidity or mortality as a consequence of lidocaine Bier's block was demonstrated in this audit.
closed_qa
Are there differences in injury mortality among refugees and immigrants compared with native-born?
The authors studied injury mortality in Denmark among refugees and immigrants compared with that among native Danes. A register-based, historical prospective cohort design. All refugees (n=29,139) and family reunited immigrants (n=27,134) who between 1 January 1993 and 31 December 1999 received residence permission were included and matched 1:4 on age and sex with native Danes. Civil registration numbers were cross-linked to the Register of Causes of Death, and fatalities due to unintentional and intentional injuries were identified based on ICD-10 diagnosis. Sex-specific mortality ratios were estimated by migrant status and region of birth, adjusting for age and income and using a Cox regression model after a median follow-up of 11-12 years. Compared with native Danes, both female (RR=0.44; 95% CI 0.23 to 0.83) and male (RR=0.40; 95% CI 0.29 to 0.56) refugees as well as female (RR=0.40; 95% CI 0.21 to 0.76) and male (RR=0.22; 95% CI 0.12 to 0.42) immigrants had significantly lower mortality from unintentional injuries. Suicide rates were significantly lower for male refugees (RR=0.38; 95% CI 0.24 to 0.61) and male immigrants (RR=0.24; 95% CI 0.10 to 0.59), whereas their female counterparts showed no significant differences. Only immigrant women had a significantly higher homicide rate (RR=3.09; 95% CI 1.11 to 8.60) compared with native Danes.
Overall results were advantageous to migrant groups. Research efforts should concentrate on investigating protective factors among migrants, which may benefit injury prevention in the majority population.
closed_qa
Is the diagnosis of ADHD influenced by time of entry to school?
The authors examined the proposed immaturity hypothesis, which suggests that younger children may have developmental immaturity and not ADHD, using data from a large, clinically referred population of individuals with and without ADHD. The sample consisted of individuals with and without an ADHD diagnosis, ascertained from ongoing studies in our laboratory, born in August (Younger Cohort N = 562) and born in September (Older Cohort N = 529). The authors compared studywide diagnosis rates of ADHD, ADHD familiality patterns, ADHD symptoms, psychiatric comorbidity, and functional impairments between the two cohorts. Studywide rates of ADHD diagnosis, ADHD-associated symptoms, ADHD-associated impairments, ADHD-associated comorbid disorders, and familiality were similar in the two age cohorts.
Results showed that ADHD-associated familial, clinical, and functional correlates are similar irrespective of age at entry to school, indicating that when ADHD symptoms are present, a diagnosis of ADHD should be considered rather than attributing these symptoms to developmental immaturity.
closed_qa
Dysfunctional cognitions and their emotional, behavioral, and functional correlates in adults with attention deficit hyperactivity disorder (ADHD): is the cognitive-behavioral model valid?
To investigate the presence of dysfunctional cognitions in adults with ADHD and to determine whether these cognitions are associated with emotional symptoms, maladaptive coping, and functional impairment, as predicted by the cognitive-behavioral model. A total of 35 adult participants with ADHD, 20 nonclinical controls, and 20 non-ADHD clinical controls were assessed with measures of ADHD symptoms, dysfunctional cognitions, depression and anxiety symptoms, coping strategies, and quality of life. The ADHD group showed elevated scores for dysfunctional cognitions relative to the nonclinical control group, comparable with those of the clinical control group. Dysfunctional cognitions were strongly associated with emotional symptoms. The ADHD group also showed elevated scores on maladaptive coping strategies of the escape-avoidance type. Life impairment was satisfactorily predicted when ADHD symptoms, dysfunctional cognitions, and emotional symptoms were fitted into a regression model.
The cognitive-behavioral model appears to be a valid complementary model for understanding emotional and life impairment in adults with ADHD.
closed_qa
Social adversity, stress, and alcohol problems: Are racial/ethnic minorities and the poor more vulnerable?
Experiences of racial/ethnic bias and unfair treatment are risk factors for alcohol problems, and population differences in exposure to these social adversities (i.e., differential exposure) may contribute to alcohol-related disparities. Differential vulnerability is another plausible mechanism underlying health disparities, yet few studies have examined whether populations differ in their vulnerability to the effects of social adversity on psychological stress and the effects of psychological stress on alcohol problems. Data from the 2005 U.S. National Alcohol Survey (N = 4,080 adult drinkers) were analyzed using structural equation modeling to assess an overall model of pathways linking social adversity, depressive symptoms, heavy drinking, and alcohol dependence. Multiple group analyses were conducted to assess differences in the model's relationships among Blacks versus Whites, Hispanics versus Whites, and the poor (income below the federal poverty line) versus non-poor (income above the poverty line). The overall model explained 48% of the variance in alcohol dependence and revealed significant pathways between social adversity and alcohol dependence involving depressive symptoms and heavy drinking. The effects of social adversity and depressive symptoms were no different among Blacks and Hispanics compared with Whites. However, the poor (vs. non-poor) showed stronger associations between unfair treatment and depressive symptoms and between depressive symptoms and heavy drinking.
Contrary to some prior studies, these findings suggest that racial disparities in alcohol problems may be more a function of racial/ethnic minorities' greater exposure, rather than vulnerability, to chronic stressors such as social adversity. However, observed differences between the poor and non-poor imply that differential vulnerability contributes to socioeconomic disparities in alcohol problems. Efforts to reduce both differential exposure and vulnerability might help to mitigate these disparities.
closed_qa
Do substance use norms and perceived drug availability mediate sexual orientation differences in patterns of substance use?
Illicit drug and heavy alcohol use is more common among sexual minorities compared with heterosexuals. This difference has sometimes been attributed to more tolerant substance use norms within the gay community, although evidence is sparse. The current study investigated the role of perceived drug availability and tolerant injunctive norms in mediating the linkage between minority sexual orientation status and higher rates of prior-year substance use. We used data from the second California Quality of Life Survey (Cal-QOL II), a followback telephone survey in 2008-2009 of individuals first interviewed in the population-based 2007 California Health Interview Survey. The sample comprised 2,671 individuals, oversampled for minority sexual orientation. Respondents were administered a structured interview assessing past-year alcohol and illicit drug use, perceptions of perceived illicit drug availability, and injunctive norms concerning illicit drug and heavier alcohol use. We used structural equation modeling methods to test a mediational model linking sexual orientation and substance use behaviors via perceptions of drug availability and social norms pertaining to substance use. Compared with heterosexual individuals, sexual minorities reported higher levels of substance use, perceived drug availability, and tolerant social norms. A successfully fitting model suggests that much of the association between minority sexual orientation and substance use is mediated by these sexual orientation-related differences in drug availability perceptions and tolerant norms for substance use.
Social environmental context, including subcultural norms and perceived drug availability, is an important factor influencing substance use among sexual minorities and should be addressed in community interventions.
closed_qa
Rapid repeat pregnancy in adolescents: do immediate postpartum contraceptive implants make a difference?
The purpose of this study was to determine contraceptive continuation and repeat pregnancy rates in adolescents who are offered immediate postpartum etonogestrel implant (IPI) insertion. Participants in an adolescent prenatal-postnatal program were enrolled in a prospective observational study of IPI insertion (IPI group, n = 171) vs other methods (control group, n = 225). Contraceptive continuation and repeat pregnancies were determined. Implant continuation at 6 months was 96.9% (156/161 participants); at 12 months, the continuation rate was 86.3% (132/153 participants). At 6 months, 9.9% of the control participants were pregnant (21/213); there were no IPI pregnancies. By 12 months, 18.6% of control participants (38/204) experienced pregnancy vs 2.6% of IPI recipients (4/153; relative risk, 5.0; 95% confidence interval [CI], 1.9-12.7). Repeat pregnancy at 12 months was predicted by not receiving IPI insertion (odds ratio, 8.0; 95% CI, 2.8-23.0) and having >1 child (odds ratio, 2.1; 95% CI, 1.1-4.3; P = .03).
IPI placement in adolescents has excellent continuation 1 year after delivery; rapid repeat pregnancy is significantly decreased compared with control participants.
closed_qa
Are postoperative complications more common following colon and rectal surgery in patients with chronic kidney disease?
Patients with chronic kidney disease (CKD) were identified within our database. Patients with an estimated glomerular filtration rate (eGFR) of 15-59 ml/min (CKD Stages 3 and 4) formed the CKD group and were compared with American Society of Anesthesiology (ASA) score-matched controls with an eGFR of ≥ 60 ml/min. Assessments included demographics, comorbidity, ASA score, operative details and 30-day postoperative outcome. Seventy patients in the CKD group were matched with 70 controls. ASA scores and length of stay did not differ significantly between the groups. CKD patients were older (mean age 76.5 years vs 71.1 years; P<0.001) and had a lower mean body mass index (24.3 vs 28.2; P<0.001) compared with controls. Compared with the CKD group, the mean operation time was longer in the control group (181.5 min vs 151.6 min; P = 0.02) and the estimated blood loss was greater (232 ml vs 165 ml; P = 0.004). Postoperative infection was more common in the CKD group (60% vs 40%; P = 0.01). There were no significant differences in reoperation rates, 30-day readmissions or the incidence of acute renal failure (ARF).
Patients with CKD Stages 3 and 4 had a higher incidence of postoperative infections than matched controls after colorectal surgery. ARF developed in 18.6% of patients. Preoperative optimization should include adequate hydration and assessment of potentially nephrotoxic substances for bowel preparation, preoperative antibiotics and pain control.
closed_qa
Impact of trauma center designation on outcomes: is there a difference between Level I and Level II trauma centers?
Within organized trauma systems, both Level I and Level II trauma centers are expected to have the resources to treat patients with major multisystem trauma. The evidence supporting separate designations for Level I and Level II trauma centers is inconclusive. The objective of this study was to compare mortality and complications for injured patients admitted to Level I and Level II trauma centers. Using data from the Pennsylvania Trauma Outcomes Study registry, we performed a retrospective observational study of 208,866 patients admitted to 28 Level I and Level II trauma centers between 2000 and 2009. Regression modeling was used to estimate the association between patient outcomes and trauma center designation, after controlling for injury severity, mechanism of injury, transfer status, and physiology. Patients admitted to Level I trauma centers had a 15% lower odds of mortality (adjusted odds ratio [adj OR] 0.85; 95% CI 0.72 to 0.99) and a 35% increased odds of complications (adj OR 1.37; 95% CI 1.04 to 1.79). The survival benefit associated with admission to Level I centers was strongest in patients with very severe injuries (Injury Severity Score [ISS] ≥ 25; adj OR 0.78; 95% CI 0.64 to 0.95). Less severely injured patients with an ISS <9 (adj OR 0.91; 95% CI 0.64 to 1.30) and with an ISS between 9 and 15 (adj OR 0.98; 95% CI 0.81 to 1.18) had similar risks of mortality in Level I and Level II trauma centers.
Severely injured patients admitted to Level I trauma centers have a lower risk of mortality compared with patients admitted to Level II centers. These findings support the continuation of a 2-tiered designation system for trauma.
closed_qa
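The adjusted odds ratios in the trauma-center record come from regression models on registry data and cannot be reproduced from the abstract alone. For illustration only, here is how an unadjusted odds ratio and its Wald 95% CI are computed from a 2×2 table; the counts below are hypothetical, not the study's data.

```python
import math

def odds_ratio_ci(a, b, c, d):
    """Unadjusted odds ratio and Wald 95% CI for a 2x2 table:
    a = deaths at Level I, b = survivors at Level I,
    c = deaths at Level II, d = survivors at Level II."""
    or_ = (a * d) / (b * c)
    se = math.sqrt(1/a + 1/b + 1/c + 1/d)   # standard error of log(OR)
    lo = math.exp(math.log(or_) - 1.96 * se)
    hi = math.exp(math.log(or_) + 1.96 * se)
    return or_, lo, hi

# Hypothetical counts chosen only to illustrate the calculation
or_, lo, hi = odds_ratio_ci(120, 2880, 140, 2860)
print(f"OR = {or_:.2f}, 95% CI {lo:.2f}-{hi:.2f}")
```

The study's reported values (e.g. adj OR 0.85, 95% CI 0.72-0.99) additionally adjust for injury severity, mechanism, transfer status, and physiology, which a 2×2 table cannot capture.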
Can hospitals "game the system" by avoiding high-risk patients?
It has been suggested that implementation of quality-improvement benchmarking programs can lead to risk-avoidance behaviors in some physicians and hospitals in an attempt to improve their rankings, potentially denying patients needed treatment. We hypothesize that avoidance of high-risk patients will not change risk-adjusted rankings. We conducted a simulation analysis of 6 complex operations in the Nationwide Inpatient Sample, including abdominal aortic aneurysm repair, aortic valve replacement, coronary artery bypass grafting, percutaneous coronary intervention, esophagectomy, and pancreatic resection. Primary outcomes included in-hospital mortality. Hospitals were ranked into quintiles based on observed-to-expected (O/E) mortality ratios, with their expected mortalities calculated based on models generated from the previous 3 years. Half of the hospitals were then randomly selected to undergo risk avoidance by avoiding 25% of patients with higher than median risks (ie, Charlson, Elixhauser, age, minority, or uninsured status). Their new O/E ratios and hospital-rank categories were compared with their original values. A total of 2,235,298 patients were analyzed, with an overall observed mortality rate of 1.9%. Median change in O/E ratios across all simulations was zero, and O/E ratios did not change in 97.5% to 99.3% of the hospitals, depending on the risk definitions. Additionally, 70.5% to 98.0% of hospital rankings remained unchanged, 1.3% to 13.1% of hospital rankings improved, and 0.7% to 14.3% of hospital rankings worsened after risk avoidance.
Risk-adjusted rankings of hospitals likely cannot be changed by simply avoiding high-risk patients. In the minority of scenarios in which risk-adjusted rankings changed, they were as likely to improve as worsen after risk avoidance.
closed_qa
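The ranking mechanics described in the gaming-the-system record can be sketched directly: each hospital's observed deaths are divided by its model-expected deaths, and hospitals are binned into quintiles of that O/E ratio. A minimal illustration with hypothetical hospitals (the study derived expected mortality from models fit on prior years of the Nationwide Inpatient Sample):

```python
def oe_quintiles(hospitals):
    """hospitals: list of (name, observed_deaths, expected_deaths).
    Returns {name: quintile}, quintile 1 = lowest (best) O/E ratio."""
    ranked = sorted(hospitals, key=lambda h: h[1] / h[2])
    n = len(ranked)
    return {name: (i * 5) // n + 1 for i, (name, o, e) in enumerate(ranked)}

# Hypothetical hospitals: observed vs model-expected mortality counts
hosps = [("A", 18, 20.0), ("B", 25, 20.0), ("C", 12, 20.0),
         ("D", 30, 20.0), ("E", 20, 20.0)]
print(oe_quintiles(hosps))  # C (O/E 0.6) ranks best, D (O/E 1.5) worst
```

The study's point follows from this construction: dropping high-risk patients lowers a hospital's observed deaths and its expected deaths together, so the O/E ratio, and hence the quintile, tends not to move.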
Are high-frequency (600 Hz) oscillations in human somatosensory evoked potentials due to phase-resetting phenomena?
Median nerve somatosensory evoked potentials (SEP) contain a brief oscillatory wavelet burst at about 600 Hz (σ-burst) superimposed on the initial cortical component (N20). While invasive single-cell recordings suggested that this burst is generated by increased neuronal spiking activity in area 3b, recent non-invasive scalp recordings could not reveal concomitant single-trial added-activity, suggesting that the SEP burst might instead be generated by phase-reset of ongoing high-frequency EEG. Here, a statistical model and exemplary data are presented reconciling these seemingly contradictory results. A statistical model defined the conditions required to detect added-activity in a set of single-trial SEP. Its predictions were tested by analyzing human single-trial scalp SEP recorded with custom-made low-noise amplifiers. The noise level in previous studies did not allow detection of single-trial added-activity in the period concomitant with the trial-averaged σ-burst. In contrast, optimized low-noise recordings do reveal added-activity in a set of single-trials.
The experimental noise level is the decisive factor determining the detectability of added-activity in single-trials. A low-noise experiment provided direct evidence that the SEP σ-burst is at least partly generated by added-activity matching earlier invasive single-cell recordings.
closed_qa