Big-data baby steps

By Mark Witten

Dr. Yasser Iturria-Medina, a postdoctoral researcher at the Ludmer Centre for Neuroinformatics & Mental Health, is applying a big-data approach to Alzheimer’s disease (AD) and dementia, a critical step toward personalized medicine. AD and dementia arise from complex interactions among age and gender, genetics and epigenetics, environment and lifestyle; there is no single cause. Large datasets and big-data analysis are now crucial to advancing research and treatment, and to developing early, effective interventions. “We’re in the era of big-data analysis and we should not depend entirely on subjective opinions and hypothetical models; instead we should allow the data to speak for itself,” says Iturria-Medina, who trained at the Cuban Neuroscience Centre.

His big-data, multi-factorial analysis of more than 7,700 brain images from patients with late-onset AD, for example, identified a decrease in cerebral blood flow as the first physiological sign of AD, contrary to previous understanding, and established a key biomarker for early detection. This computational power will be vital to gaining the fuller, integrated understanding necessary for a personalized approach to early detection and treatment. “Personalized medicine proposes to identify individual risks, with different doses for each patient, and it also tries to predict what each person’s response will be. We’re not there yet. We need more data-driven integrative studies, capable of considering all possible biological factors involved and clarifying the direct interactions among these factors in Alzheimer’s disease,” explains Iturria-Medina. “These are essential steps toward developing effective, personalized treatments.”
