Evaluation of the diet wide contribution to serum urate levels: meta-analysis of population based cohorts

Tanya J Major, postdoctoral fellow¹; Ruth K Topless, assistant research fellow¹; Nicola Dalbeth, professor²; Tony R Merriman, professor¹

¹Department of Biochemistry, University of Otago, 710 Cumberland Street, Dunedin 9054, New Zealand
²Department of Medicine, University of Auckland, Auckland, New Zealand

Correspondence to: T R Merriman tony.merriman{at}otago.ac.nz

Abstract

Review methods 16 760 individuals of European ancestry (8414 men and 8346 women) from the US were included in analyses. Eligible individuals were aged over 18, without kidney disease or gout, and not taking urate lowering or diuretic drugs. All participants had serum urate measurements, dietary survey data, information on potential confounders (sex, age, body mass index, average daily calorie intake, years of education, exercise levels, smoking status, and menopausal status), and genome wide genotypes. The main outcome measures were average serum urate levels and variance in serum urate levels. β values (95% confidence intervals) and Bonferroni corrected P values from multivariable linear regression analyses, along with regression partial R2 values, were used to quantify associations.

Introduction

Hyperuricaemia (elevated serum urate concentration) is a central risk factor for gout, and is also associated with chronic kidney disease, hypertension, and metabolic syndrome.1234 The balance between hepatic production of urate and intestinal or renal urate excretion pathways determines an individual’s serum urate levels.5 This balance can be modified by both genetic and environmental factors. Familial and twin studies estimating the heritability of serum urate suggest genetic factors explain 25% to 60% of the variability in serum urate levels,678910111213 consistent with estimates from a genome wide association study of unrelated individuals, which predicted that 25% to 40% of the variability in serum urate levels is controlled by common single nucleotide variants.14 The remaining 60% to 75% of serum urate variability is therefore explained by genetic factors (common and uncommon) not tagged by common variants, and non-genetic factors such as diet or other environmental exposures.

For centuries, diet has been identified as a risk factor for the development of gout.1516 Consumption of red meat, shellfish, alcoholic beverages, sugary drinks, and tomatoes has been associated with increased serum urate levels, and low fat milk and coffee consumption with reduced serum urate levels.1718192021 Certain diets (eg, the Dietary Approaches to Stop Hypertension (DASH) and the Mediterranean diet) have been shown to reduce serum urate levels and the risk of gout.222324 In addition, food consumption is heritable: for example, the heritability of coffee consumption is estimated to be between 36% and 58%,25 alcohol consumption between 43% and 53%,26 and sugar sweetened beverage consumption 48%.27 Genome wide association studies have identified genetic associations with coffee and alcohol consumption habits.2829 It is therefore possible that the heritable component of specific food consumption contributes to the heritability of serum urate levels (eg, signals in the ABCG2, GCKR, and MLXIPL genes are common to genome wide association studies of coffee consumption and serum urate).1428 To date, a systematic analysis of the contribution of diet to serum urate levels has not been performed in a sufficiently large dataset. Furthermore, the relative contributions of inherited genetic variants and overall diet to variance in serum urate concentrations are unknown. This study aimed to systematically test individual dietary components for association with serum urate in a diet wide association study and quantify the relative contributions of overall diet and common, genome wide single nucleotide variants in determining serum urate levels.

Methods

Participants

Demographic, anthropometric, and clinical data are presented in the supplementary materials (table S1). Information from the baseline visit of the Atherosclerosis Risk in Communities (ARIC, 1987-89), Coronary Artery Risk Development in Young Adults (CARDIA, 1985), Cardiovascular Health (CHS, 1989-90), and Framingham Heart (FHS, 2002-05) studies was sourced through the Database of Genotypes and Phenotypes. Anonymised information from the Third National Health and Nutrition Examination Survey (NHANES III, 1988-91) was also used. These five studies all recruited participants from the United States.

Analysis sample sets of people of European ancestry were developed by using consistent exclusion criteria between study cohorts (supplementary materials, fig S1). People without serum urate measurements or genome wide genotypes were excluded, along with individuals aged under 18, people with kidney disease or gout, and those taking urate lowering or diuretic drugs. Individuals who did not provide information for any of the covariates used in analyses were also excluded. Quality controls for the dietary data were also applied: participants who answered less than 10% of the food frequency survey were excluded, along with individuals whose estimated average daily calorie intake was 600 kcal/day or less, or 4200 kcal/day or more (1 kcal=4.18 kJ). Participants whose questionnaire answers were deemed unreliable by the study interviewer at recruitment were also excluded.
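Sketched in Python, the exclusion criteria above amount to a single per-participant filter. This is an illustration only: all record field names are hypothetical, and only the thresholds come from the text.

```python
def passes_qc(p):
    """Return True if a participant record (hypothetical field names)
    survives the exclusion criteria described above."""
    return (
        p["age"] >= 18
        and not p["kidney_disease"]
        and not p["gout"]
        and not p["urate_lowering_drugs"]
        and not p["diuretics"]
        and p["has_serum_urate"]
        and p["has_genotypes"]
        and p["ffq_fraction_answered"] >= 0.10  # answered at least 10% of the FFQ
        and 600 < p["kcal_per_day"] < 4200      # plausible daily calorie intake
        and p["interview_reliable"]
    )
```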

Dietary assessment

During recruitment, participants from the five cohorts completed a validated food frequency questionnaire. The participants in the ARIC, CHS, and FHS studies completed similar questionnaires in which participants were asked to answer the question “How often, on average, in the past year did you eat [this food]?” by choosing from several frequency categories (66 questions and nine answer categories for ARIC, 99 questions and six answer categories for CHS, and 126 questions and nine answer categories for FHS).3031323334 These categorical answers were converted to average serves per week for analysis (supplementary materials, table S2). Participants in the CARDIA study completed a specifically designed and validated diet history that assessed their consumption frequency of 100 food items by using a series of questions: “Do you eat [this food]?”; if yes, “How much do you usually have?” and “How often do you usually have it?” Answers were then converted to servings per week by the study researchers using the Nutrition Coordinating Centre dietary analysis system.3536 Participants in the NHANES III study were given a questionnaire (60 questions) similar to that of the ARIC, CHS, and FHS studies in which they were asked “How often, in your usual diet over the past month, have you eaten [this food item]?” Answers were given in serves per month and converted to serves per week for analysis (supplementary materials, table S2).37
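The conversion step can be sketched as follows. The category-to-serves mapping is an illustrative placeholder (the studies' actual mappings are given in supplementary table S2); the month-to-week rescaling for NHANES III answers is simple unit arithmetic.

```python
# Hypothetical midpoint mapping from ARIC/CHS/FHS-style frequency
# categories to average serves per week (illustrative only; the real
# category definitions are in supplementary table S2).
CATEGORY_TO_SERVES_PER_WEEK = {
    "never or <1/month": 0.0,
    "1-3/month": 0.5,
    "1/week": 1.0,
    "2-4/week": 3.0,
    "5-6/week": 5.5,
    "1/day": 7.0,
    "2+/day": 14.0,
}

AVERAGE_DAYS_PER_MONTH = 365.25 / 12  # average calendar month length

def serves_per_month_to_week(serves_per_month):
    """Rescale NHANES III-style answers (serves/month) to serves/week."""
    return serves_per_month * 7 / AVERAGE_DAYS_PER_MONTH
```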

As each study administered a slightly different food frequency questionnaire, with a differing number of questions (60 to 126) and a slightly different list of food items within each question, questionnaires were assessed for between-study comparability. Briefly, questions were grouped together based on food type. Where questions were identical between studies no changes to the data were made. Where questions were not identical between studies (eg, questionnaires asked about any wine consumption v separate red and white wine consumption) the answers were combined (after conversion to serves per week) to create identical questions with the aim of retaining as many food items as possible, while reducing the variability in questions between the five studies. The decisions on which questions were able to be combined were made by one analyst (TJM) in consultation with the other three authors. If an identical question could not be created, the non-matching information was excluded, either from only the cohort with non-matching data (eg, the NHANES III study asked about consumption of peanuts, peanut butter, nuts, and seeds in a single question, making this non-comparable to either the nuts or peanuts questions of the other four studies), or if at least three of the five cohorts did not have identical questions, the extra information was excluded from the entire analysis (eg, only the CHS and FHS studies asked about berry consumption, so berries were not included). These exclusions resulted in a group of 63 food items with comparable questions within at least three of the five studies (supplementary materials, table S3). Average consumption of each of these 63 food items, per sample set, is presented in the supplementary materials, table S4. 
We were not able to adjust average consumption for portion size in the aggregated data, because the NHANES III study did not specify portion size, the CHS study only specified a relative portion size (small, medium, or large), and the portion sizes specified by the ARIC, CARDIA, and FHS studies were inconsistent with each other.
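A minimal sketch of this harmonisation, with made-up study and item names: the wine example and the three-of-five retention rule come from the text, while the function names are ours.

```python
from collections import Counter

def combine_wine(answers):
    """Combine separate red/white wine answers (serves/week) into a single
    'wine' item, matching cohorts that asked one wine question."""
    if "wine" in answers:
        return answers["wine"]
    return answers.get("red wine", 0.0) + answers.get("white wine", 0.0)

def retained_items(items_per_study, min_studies=3):
    """Keep food items whose harmonised question is comparable in at
    least min_studies of the cohorts."""
    counts = Counter(item for items in items_per_study.values() for item in items)
    return {item for item, n in counts.items() if n >= min_studies}
```

For example, berries (asked about only in the CHS and FHS studies) fail the three-of-five rule and drop out, exactly as described above.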

Results

Diet wide association analysis

Figure 1 and table 1 show that 15 food items were significantly associated with serum urate levels in the full, male, or female cohorts (Pβ<7.94×10−4; supplementary materials, tables S4-S7). Seven food items were associated with raised serum urate levels (beer, liquor, wine, potato, poultry, soft drink, and meat (beef, pork, or lamb)) and eight were associated with lower serum urate levels (eggs, peanuts, cold cereal, skim milk, cheese, brown bread, margarine, and non-citrus fruit). The food items with the strongest urate raising effect (beer and liquor) were associated with a 1.38 μmol/L increase in serum urate per serving per week, equating to a 9.66 μmol/L (0.16 mg/dL) increase per daily serving. In the full cohort, wine was only nominally significant (Pβ<0.05, Pβ≥7.94×10−4). Wine was significantly associated with serum urate levels in the male cohort, along with eight other food items (beer, soft drink, skim milk, peanuts, eggs, cold cereal, brown bread, and non-citrus fruit). In the female cohort, seven food items (beer, liquor, cold cereal, skim milk, cheese, brown bread, and margarine) were significantly associated with serum urate (table 1). The effect size for skim milk was similar to that reported in a previous study.52 The effect sizes for beer, liquor, soft drink, and meat (beef, pork, or lamb) were within the range of previously reported values.1819215152
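As a quick arithmetic check of the figures above (per-serving effect from table 1; the mg/dL conversion uses the standard urate factor of 59.48 μmol/L per mg/dL):

```python
beta_weekly = 1.38             # μmol/L increase per serving/week (beer, liquor)
UMOL_PER_MGDL = 59.48          # standard urate unit conversion
beta_daily = beta_weekly * 7   # effect of one serving every day
print(round(beta_daily, 2))                  # 9.66 μmol/L
print(round(beta_daily / UMOL_PER_MGDL, 2))  # 0.16 mg/dL
```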

Association of diet scores with serum urate levels

Table 2 shows that increases in the Healthy Eating, DASH, and Mediterranean diet scores (indicating a healthier diet) were significantly associated with lowered serum urate levels in the full cohort (β=−0.72 μmol/L, Pβ<0.001; β=−0.73 μmol/L, Pβ<0.001; β=−0.38 μmol/L, Pβ<0.001, respectively) and the male cohort (β=−0.97 μmol/L, Pβ<0.001; β=−0.86 μmol/L, Pβ<0.001; β=−0.58 μmol/L, Pβ=0.02, respectively), but only the DASH diet score was significantly associated with lowered serum urate levels in the female cohort (β=−0.64 μmol/L, Pβ=0.03). The data driven diet pattern, which represented a diet high in less healthy foods, was associated with increased serum urate levels in the full, male, and female cohorts (β=0.57 μmol/L, Pβ<0.001; β=0.62 μmol/L, Pβ<0.001; β=0.53 μmol/L, Pβ<0.001, respectively). These diet scores were significantly correlated with each other (all ≥0.29 (absolute values), PCor<0.001), and the results of the regression analyses for the Healthy Eating, DASH, and Mediterranean diet scores were not significantly different between the male and female cohorts (PDiff≥0.10 and PDiff≥0.17, respectively). In the full cohort, the Healthy Eating and DASH diet scores and the Healthy Eating and Mediterranean diet scores did not have significantly different results (PDiff=0.95 and PDiff=0.06, respectively). The DASH and Mediterranean diet score results were mildly different (PDiff=0.03). The results of the data driven diet pattern were significantly different from those of the other three diet scores in the full, male, and female cohorts (PDiff<0.001, PDiff<0.001, and PDiff≤0.03, respectively).

Given that foods are rarely consumed in isolation, and significant correlations (PCor<0.001) were observed between every food item and at least one other food item (supplementary materials, fig S8), the diet wide analysis was repeated with adjustment for the diet scores to account for confounding owing to usual dietary habits. Eight (beer, liquor, cold cereal, skim milk, cheese, brown bread, margarine, and eggs) of the 14 significant food items in the full cohort remained significantly associated after adjustment for each of the four dietary scores (supplementary materials, table S5). Non-citrus fruit was non-significant after adjustment for each of the four diet scores. Peanuts, meat (beef, pork, or lamb), potatoes, soft drink, and poultry all had an attenuated association after adjustment for one (or more) of the separate diet scores. Adjustment for the Healthy Eating, DASH, and Mediterranean diet scores resulted in consistently significant associations between serum urate levels and fish consumption, while legumes, tomatoes, and white bread also had significant associations after adjustment for one or more of the separate diet scores. In the male cohort, five (beer, wine, eggs, skim milk, and brown bread) of the nine previously associated foods maintained their significance after adjustment for each of the four diet scores separately (supplementary materials, table S6). Peanuts and cold cereal did not maintain their significance after adjustment for the DASH diet score (supplementary materials, table S6). In the female cohort, only beer, liquor, cheese, and skim milk maintained their significance after the diet score adjustments. Margarine maintained significance when adjusted for three of the four diet scores. Brown bread and cold cereal were not significant after adjustment for each of the four dietary scores separately (supplementary materials, table S7).

Variance in serum urate explained by dietary scores and inherited genetic variants

Individually, the 14 food items associated with serum urate in the full cohort explained 0.06% to 0.99% of the variation in serum urate levels, and summed they explained 3.28% of the variation (table 1). All 63 food items, when summed, explained 4.29% of variation in serum urate levels (supplementary materials, table S5). Food groups (fruit, vegetables, meat, and dairy products) explained between 0.16% and 0.52% of variation in serum urate levels (supplementary materials, table S5). Unadjusted by the genetic risk score, the DASH diet score explained more of the variation in serum urate levels in the full cohort (0.28%; table 2) than the Healthy Eating (0.15%), Mediterranean (0.06%), or data driven (0.16%) diet scores, but each diet score explained less variation in serum urate than the most strongly associated individual food items (table 1).
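The partial R2 values reported here can be computed by comparing the residual sums of squares of nested linear models: the share of remaining variance removed when a term is added to a model that already contains the covariates. The sketch below runs on synthetic data, not the study cohorts.

```python
import numpy as np

def partial_r2(y, covars, extra):
    """Partial R2 for the 'extra' predictors given 'covars':
    (SSE_reduced - SSE_full) / SSE_reduced, both models with intercepts.
    covars and extra are lists of 1-D predictor arrays."""
    def sse(predictors):
        X = np.column_stack([np.ones(len(y))] + list(predictors))
        beta, *_ = np.linalg.lstsq(X, y, rcond=None)
        resid = y - X @ beta
        return float(resid @ resid)
    return (sse(covars) - sse(covars + extra)) / sse(covars)

# Synthetic illustration: urate driven weakly by one food item plus a confounder
rng = np.random.default_rng(0)
n = 2000
food = rng.normal(size=n)   # eg servings/week, standardised
age = rng.normal(size=n)    # confounder
urate = 0.1 * food + 0.5 * age + rng.normal(size=n)
r2 = partial_r2(urate, [age], [food])  # small, as for individual foods above
```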

In contrast, 30 genetic variants previously associated with serum urate levels at a genome wide level of significance in Europeans additively explained 8.7% of the variance in serum urate levels in the full cohort (excluding the NHANES III study; supplementary materials, table S8). A weighted serum urate genetic risk score constructed from these 30 variants,14 unadjusted by any dietary score, explained 7.9% of the variance (table 2). When included in models with the dietary scores, the percentage of variance explained did not substantially change in the full, male, and female cohorts (maximum difference of 0.04%; table 2). The percentage of variance explained by the dietary scores after adjustment for the genetic risk score ranged from a −0.09% difference for the data driven diet pattern in the male cohort to a +0.13% difference for the Mediterranean diet score in the male cohort (table 2). Genome wide estimates of serum urate heritability were 23.9% in the full cohort (excluding the NHANES III study), 23.8% in the male cohort, and 40.3% in the female cohort. Only the DASH diet score showed any evidence for an interaction with the weighted genetic risk score, and only in the female cohort (P=0.04); for all other interactions the P value was ≥0.21 (supplementary materials, table S9).
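A weighted genetic risk score of this kind is a dosage-weighted sum over variants. In the sketch below the variant names and effect sizes are hypothetical placeholders, not the 30 published variants.

```python
def weighted_grs(dosages, weights):
    """Weighted genetic risk score: sum over variants of
    (urate raising allele dosage, 0-2) x (per-allele effect weight)."""
    return sum(dosages[snp] * weights[snp] for snp in weights)

# Hypothetical variants and per-allele effect weights (not real rsIDs)
weights = {"rsA": 0.30, "rsB": 0.10, "rsC": 0.05}
# One individual's allele dosages at those variants
dosages = {"rsA": 2, "rsB": 1, "rsC": 0}
score = weighted_grs(dosages, weights)  # ≈0.70
```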

Discussion

Principal findings

Fifteen different food items were significantly associated with serum urate levels. These foods included six established urate modifying foods: beer, liquor, wine, soft drink, skim milk, and meat (beef, pork, or lamb). The nine other foods included two less established urate modifying foods (cheese and non-citrus fruit) and seven food items without established associations (poultry, potatoes, brown bread, peanuts, margarine, cold cereal, and eggs). The associations observed in this diet wide study with known, confirmed serum urate influencing food items were consistent in direction of effect and magnitude with previously reported associations (urate raising: beer, liquor, wine, soft drinks, and meat (beef, pork, or lamb); urate lowering: skim milk; table 1). However, each of these established foods explained less than 1% of variation in serum urate levels within the full cohort. Similarly, the diet scores explained very little variance in serum urate levels (0.28% for the DASH diet, 0.15% for the Healthy Eating diet, 0.06% for the Mediterranean diet, and 0.16% for the data driven diet pattern; table 2). In comparison, the heritability explained by common genetic variants was estimated to be 23.9%, with a weighted genetic risk score derived from genome wide association studies explaining 7.9% of the variability in serum urate levels (table 2). Thus, in the datasets analysed here, overall diet explains much less variance in serum urate levels than inherited genetic variants.

Strengths and limitations of study

The primary limitation of our study was the use of differing food frequency questionnaires between studies, which led to methodological challenges when combining the study specific effects and could have led to the study participants giving information of variable accuracy between studies. To circumvent these issues, the food frequency data were carefully inspected for between study comparability and several quality controls were applied to the data before use. Adjustment for estimated average daily calorie intake was also consistently performed during analysis to further minimise any bias or inaccuracies caused by these differing questionnaires. Owing to the differing questionnaires between studies, some food items could not be included in the diet wide analysis. These exclusions could have resulted in this study not including some foods that have real effects on urate; however, the number of these exclusions was minimal (several items per study, none in the ARIC study). Where the exclusion of a question only occurred in one cohort (owing to non-comparability of the question), it is possible that the analysis of the remaining cohorts had reduced power to detect an effect. Given that data were collected at different times (from 1985 to 2005), food compositions might also have changed, resulting in unintentional combining of non-comparable food items in this analysis. This situation might be particularly important when processed foods are being assessed (such as cereals, bread, and mayonnaise or dressing).53 This consideration is also important for the generalisation of results to the present day or to other countries. Our study population included individuals of European ancestry living in the US, and the dietary and genetic analyses might not be generalisable to other populations. Additionally, as with any large scale set of analyses, the likelihood of finding a falsely significant result increases with every extra test added.
The application of a Bonferroni correction to account for this multiple testing effect reduces this likelihood. However, some of the food items that were nominally significant (P<0.05) could have had a real effect undetected in this study (type II error). Furthermore, measurement error of dietary intake will suppress the R2 estimates of the contribution of diet to variance in serum urate levels relative to the genetic R2 estimates,54 which have minimal measurement error. Finally, a heritable component to food preferences, including food and alcohol consumption, has been reported in other studies,252627 implying non-independence between the diet scores and the genetic risk score. To mitigate this non-independence, the additionally adjusted analyses presented in table 2 included both diet and genetic risk scores in the same model.

Comparison with other studies

Owing to the diet wide approach to our analysis, associations with novel and less established foods were identified. Of the nine novel or less established associations, we found some evidence in the literature to support the associations of cheese, non-citrus fruit, egg, brown bread, and cold cereal. Egg consumption has previously been associated with reduced urate levels in a Croatian study and protection from hyperuricaemia in a Taiwanese Nutrition and Health Survey.5556 A third study showed no noticeable association with the risk of hyperuricaemia in elderly Taiwanese men, although a trend towards protection was evident.57 Finally, association between egg consumption and increased serum urate levels has been reported in two cohorts of European ancestry.58 Certainly the current cumulative evidence is ambiguous regarding a possible role for egg consumption in urate control. We also observed an association between non-citrus fruit and reduced serum urate levels, which is supported by association of fruit consumption with reduced urate levels in an Australian cohort.58 The loss of significance (in the full cohort) when the association of non-citrus fruit with serum urate was adjusted for the diet scores could indicate that greater consumption of fruit is reflective of differing general dietary habits (also inferred from the correlation matrix; supplementary materials, fig S8) and could reflect confounding due to healthier dietary habits. Coarse bread and cheese were associated with reduced urate levels in two cohorts of European ancestry, and cereal in one of two cohorts of European ancestry.58 This finding supports our data associating brown bread, cold cereal, and cheese with reduced serum urate levels. We are not aware of other studies specifically testing for association of potatoes, peanuts, and margarine consumption with serum urate levels. Thus these findings need to be replicated before they can be identified as genuine urate raising or urate lowering foods.

Several studies have used food frequency data to estimate the effect of dietary habits on serum urate levels (similar to the various diet score analyses presented here) with varying results. Heidemann and colleagues used a factor analysis to create two indicators of dietary habits in a group of German individuals.59 This study showed that individuals whose diet was characterised by high intake of refined grains, processed meats, eggs, and sugar sweetened beverages (processed food dietary pattern) had higher urate levels than people who did not commonly eat these foods. We also used factor analysis, identifying a dietary pattern, comprising non-citrus juice, soft drinks, butter, white bread, pasta, meat (beef, pork, or lamb), and chips or popcorn, within the five combined cohorts. Given that this pattern includes similar foods to those in Heidemann and colleagues’ processed food pattern and several established urate raising foods, the association with raised serum urate levels (β=0.57 μmol/L per unit change; table 2) was not unexpected. When Heidemann and colleagues reversed their analysis using a diet score that represented a health conscious dietary pattern (characterised by a high intake of fruit, vegetables, and whole grains), no association with serum urate was seen.59 This result contradicts our study results, for both the individual effects of non-citrus fruit and brown bread, and the urate lowering influence of the three dietary scores constructed based on conventional healthy diet advice. In another study that assessed the association between estimates of three dietary patterns and serum urate levels in Taiwanese individuals, researchers found no noticeable association between urate levels and estimates of a urate raising dietary pattern (consuming high levels of seafood, meat, sugar sweetened beverages, and organ meats), a fish and fried food dietary pattern, or a vegetable and fruit dietary pattern. 
They posited that other clinical factors such as obesity and concomitant drugs are more important than diet in determining serum urate levels,60 a suggestion supported by the greater effect of genetics versus diet observed here.

Our results using the DASH diet score compare well with those from the randomised controlled trial by Juraschek and colleagues, who showed an average reduction in serum urate of 21 μmol/L (0.35 mg/dL) when comparing the DASH diet with an average American diet in individuals with prehypertension or stage 1 hypertension.23 There was a greater reduction of 77 μmol/L (1.29 mg/dL) in participants with hyperuricaemia (although the trial had very few participants with hyperuricaemia, n=8). In our analysis, the DASH diet scores varied from 8 to 40, with each unit increase in score associated with a 0.73 μmol/L decrease in serum urate. This finding corresponds with a decrease of 23.4 μmol/L between the least DASH-like diet and the most DASH-like diet, comparable to the decrease of 21 μmol/L reported by Juraschek and colleagues.23 Certainly, if a DASH diet can be maintained outside the research setting, our data, together with those from the Juraschek trial,23 indicate that a clinically relevant decrease in serum urate levels can be achieved relative to a non-DASH diet. However, implementation of the DASH diet might not be straightforward. This diet was reported two decades ago,45 but the barriers to implementing it, both at the population level and in a primary care setting, are yet to be overcome.61
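A quick check of this arithmetic (score range and per-unit effect from the text; 59.48 μmol/L per mg/dL is the standard urate unit conversion):

```python
per_unit_decrease = 0.73   # μmol/L decrease per unit of DASH score (full cohort)
score_range = 40 - 8       # observed DASH score range in this analysis
UMOL_PER_MGDL = 59.48      # standard urate unit conversion
total = per_unit_decrease * score_range
print(round(total, 1))                  # 23.4 μmol/L, least to most DASH-like diet
print(round(total / UMOL_PER_MGDL, 2))  # ≈0.39 mg/dL
```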

Acknowledgments

The ARIC study is carried out as a collaborative study supported by National Heart, Lung, and Blood Institute contracts N01-HC-55015, N01-HC-55016, N01-HC-55018, N01-HC-55019, N01-HC-55020, N01-HC-55021, N01-HC-55022, R01HL087641, R01HL59367, and R01HL086694; National Human Genome Research Institute contract U01HG004402; and National Institutes of Health contract HHSN268200625226C. Infrastructure was partly supported by Grant Number UL1RR025005, a component of the National Institutes of Health and NIH Roadmap for Medical Research. The FHS and the Framingham SHARe project are conducted and supported by the National Heart, Lung, and Blood Institute in collaboration with Boston University. The Framingham SHARe data used for the analyses described in this manuscript were obtained through the Database of Genotypes and Phenotypes. The CHS research reported in this article was supported by contract numbers N01-HC-85079, N01-HC-85080, N01-HC-85081, N01-HC-85082, N01-HC-85083, N01-HC-85084, N01-HC-85085, N01-HC-85086, N01-HC-35129, N01 HC-15103, N01 HC-55222, N01-HC-75150, N01-HC-45133, N01-HC-85239, and HHSN268201200036C; grant numbers U01 HL080295 from the National Heart, Lung, and Blood Institute and R01 AG-023629 from the National Institute on Ageing, with additional contribution from the National Institute of Neurological Disorders and Stroke. A full list of principal CHS investigators and institutions can be found at www.chs-nhlbi.org/pi.htm. The Coronary Artery Risk Development in Young Adults Study (CARDIA) is conducted and supported by the National Heart, Lung, and Blood Institute (NHLBI) in collaboration with the University of Alabama at Birmingham (N01-HC95095 and N01-HC48047), University of Minnesota (N01-HC48048), Northwestern University (N01-HC48049), and Kaiser Foundation Research Institute (N01-HC48050).

We thank the staff, participants, and funding bodies of the ARIC, CARDIA, CHS, FHS, and NHANES III studies for their important contributions. This manuscript was not prepared in collaboration with, nor approved by, investigators of the ARIC, CARDIA, CHS, FHS, or NHANES III studies and does not necessarily reflect the opinions or views of these studies or their institutions or funding bodies (Boston University, or the National Heart, Lung and Blood Institute). We thank the Centres for Disease Control and Prevention (CDC) and National Centre for Health Statistics (NCHS) (Hyattsville, MD) for data from NHANES.
