Urban Living Environment and Myopia in Children
1. In a large Chinese cohort of elementary school children, living in urban areas was associated with increased prevalence and incidence of myopia.
2. Conversely, living in urban spaces was associated with decreased severity and slower progression of myopia.
Evidence Rating Level: 2 (Good)
The prevalence of myopia has been increasing in recent years and represents a significant global health concern, as it can be associated with potentially blinding conditions. It is hypothesized that changes in living environments and lifestyles are affecting these rates, with urbanization representing a possible contributing factor. In this cohort study in Tianjin, China, researchers utilized vision examinations conducted over a 2-year period amongst 177,894 elementary school students. An urban score was additionally calculated for each individual using satellite data, which researchers then used to examine whether there was a relationship between urbanization and myopia. It was found that higher urbanization levels were associated with increased myopia incidence over 1-year (OR, 1.09) and 2-year (OR, 1.53) periods, and were positively associated with myopia prevalence (OR, 1.62; 95% CI, 1.08-2.42; P = .02). Interestingly, however, urban living was also linked to lower myopia progression at 1 year (OR, 0.84) and 2 years (OR, 0.73), as well as to lower myopia severity (OR, 1.46). These findings corroborate similar prior studies showing that, although the incidence and prevalence of myopia are greater in urban settings, those affected by the condition have better outcomes compared to those in rural settings. It is postulated that this may be attributed to the limited medical resources, education, and access to care that individuals living in rural areas may face. Less clear, however, is the mechanism by which urbanization may lead to increased rates of myopia, which will need to be further characterized in future studies.
1. In this double-blind clinical trial of older adults with hypertension and at least one additional risk factor for coronary artery disease, the risk of mortality due to cardiovascular disease was similar among those treated with ACE inhibitors, calcium channel blockers, and thiazide diuretics.
Evidence Rating Level: 1 (Excellent)
There are several medications commonly prescribed for the treatment of hypertension, but little is known about the different effects these agents have on mortality due to cardiovascular disease (CVD). Researchers aimed to assess whether treatment of hypertension with ACE inhibitors, calcium channel blockers (CCBs), or thiazide diuretics has different impacts on mortality due to CVD. This randomized, multicentre, double-blind clinical trial included 32,804 individuals over age 55 with a diagnosis of hypertension and at least one other risk factor for coronary artery disease, a subset of individuals from the Antihypertensive and Lipid-Lowering Treatment to Prevent Heart Attack Trial (ALLHAT). Participants were randomized to receive either a thiazide diuretic (n=15,002), a calcium channel blocker (CCB) (n=8,898), or an ACE inhibitor (n=8,904). Doses were titrated to achieve a target blood pressure of under 140/90 mm Hg. Participants were followed within the trial for 4-8 years, followed by passive follow-up for up to 23 years. The primary endpoint of the study was mortality due to cardiovascular disease (CVD), which included myocardial infarction, stroke, and heart failure. At 23 years after trial initiation, mortality rates due to CVD per 100 persons were 23.7, 21.6, and 23.8 for the thiazide diuretic, CCB, and ACE inhibitor groups respectively, with no statistically significant difference found between these three treatments. With regard to stroke risk, the ACE inhibitor group compared to the diuretic group had a 19% increased risk of stroke mortality (AHR, 1.19; 95% CI, 1.03-1.37) and an 11% increased risk of combined fatal and non-fatal strokes (AHR, 1.11; 95% CI, 1.03-1.20). A limitation of this study is that the period of passive follow-up did not include measurements of blood pressure or strict adherence to the originally prescribed antihypertensive agent. Future studies may aim to compare these antihypertensive agents with a longer formal trial period.
Overall, this study demonstrates that risk of mortality due to CVD is similar among those treated with ACE inhibitors, CCBs, and thiazide diuretics, but ACE inhibitors were associated with increased stroke risk compared to diuretics.
1. In this prospective cohort study, participants with olfactory dysfunction after confirmed COVID-19 infection had improved olfactory functioning over the following year.
Evidence Rating Level: 2 (Good)
Little is known about the long-term prognosis for olfactory dysfunction following infection with COVID-19. This study included 77 participants with olfactory dysfunction persisting after they were diagnosed with COVID-19. The aim of the study was to determine the risk of persistent olfactory dysfunction at one year following infection. Participants completed measurements of olfactory function via the Sniffin' Sticks test. The primary measurement was the threshold, discrimination, and identification (TDI) score, with secondary outcomes including gustatory function scores, as well as self-reported functioning for taste, smell, and impact on quality of life. The median TDI score at baseline was 21.25 (IQR 18.25-24.75), which increased to 27.5 (IQR 23.63-30.0) at 6 months, and further increased to 30.75 (IQR 27.38-33.5) by one year, which is in the range of normosmia. Self-reported scores for taste, smell, and quality of life, as well as objective measures of taste function, also improved over the study period. A limitation of this study is the small sample size. As well, 9 individuals left the study prior to its conclusion, which may result in selection bias. Overall, this study demonstrates that both subjective and objective measures of olfactory function improve over the year following infection with COVID-19 among those with initial olfactory dysfunction.
1. Consumption of a diet with a lower inflammatory diet index (IDI) was associated with a decreased incidence of type 2 diabetes.
Evidence Rating Level: 2 (Good)
Diabetes is highly prevalent globally and is associated with a great burden of morbidity. Although dietary modifications remain a mainstay of both diabetes prevention and treatment, the effect of low-inflammatory diets on type 2 diabetes risk remains unclear. In this large population-based prospective study, researchers analyzed data from 502,507 adults living in the United Kingdom. Participants were followed for up to 15 years, and their diabetes incidence was correlated with an inflammatory diet index (IDI), a weighted sum of 34 food groups (16 anti-inflammatory and 18 pro-inflammatory) based on high-sensitivity C-reactive protein levels. Amongst the 142,271 individuals included in the final analysis, it was found that at a median follow-up of 8.4 years, type 2 diabetes risk was lower in participants with low IDI scores compared to those with high IDI scores, for patients with both normoglycemia (hazard ratio [HR] = 0.71) and prediabetes (HR = 0.81). Diets with low or moderate IDI scores delayed type 2 diabetes onset by 2.20 and 1.07 years, respectively, compared to high IDI scores. Genetic predisposition significantly increased type 2 diabetes risk, and a low-inflammatory diet mitigated this risk. In summary, this is one of the first studies to suggest that, in addition to glycemic control, inflammatory diets may also play a significant role in diabetes risk. Practitioners may use this information in counseling their patients on long-term metabolic risk mitigation.
1. Patients with post-traumatic stress disorder (PTSD) randomized to undergo Trauma Center Trauma-Sensitive Yoga (TCTSY) were found to have significantly improved PTSD symptoms at 3-month follow-up.
2. Improvements were found to be statistically equivalent to those of the gold-standard therapy, cognitive processing therapy (CPT).
Evidence Rating Level: 2 (Good)
Post-traumatic stress disorder (PTSD) continues to represent a significant mental health burden worldwide, especially amongst combat veterans. Although cognitive processing therapy (CPT) remains the gold standard of treatment, the rising prevalence of PTSD has created the need to explore alternative treatment options, especially as CPT is not always effective and often has high dropout rates. Yoga, specifically Trauma Center Trauma-Sensitive Yoga (TCTSY), has emerged as one of the therapies of interest. In this study, 131 participants were randomly assigned to receive either CPT or TCTSY over a 10-week period and had their PTSD symptom severity monitored via the Clinician-Administered PTSD Scale for DSM-5 (CAPS-5) as well as the PTSD Checklist (PCL-5) at 2 weeks and 3 months post-intervention. Both treatment groups improved over time on the CAPS-5 (mean [SD] scores at baseline: 36.73 [8.79] for TCTSY and 35.52 [7.49] for CPT; mean [SD] scores at 3 months: 24.03 [11.55] for TCTSY and 22.15 [13.56] for CPT) and the PCL-5 (mean [SD] scores at baseline: 49.62 [12.19] for TCTSY and 48.69 [13.62] for CPT; mean [SD] scores at 3 months: 36.97 [17.74] for TCTSY and 31.76 [12.47] for CPT) (P < .001 for time effects), with statistically equivalent improvement in symptoms. TCTSY emerged as an effective and acceptable treatment option, particularly for women veterans resistant to conventional evidence-based therapies. The study highlights the potential of TCTSY to improve engagement and completion rates, offering a valuable addition to PTSD treatment options.
Image: PD
©2023 2 Minute Medicine, Inc. All rights reserved. No works may be reproduced without expressed written consent from 2 Minute Medicine, Inc. Inquire about licensing here. No article should be construed as medical advice and is not intended as such by the authors or by 2 Minute Medicine, Inc.