UCLA is set to establish a Center of Excellence for Big Data Computing under an $11 million research grant from the National Institutes of Health (NIH). The UCLA centre, part of an initial $32 million outlay for the $656 million Big Data to Knowledge (BD2K) initiative, will formulate new strategies for harnessing complex biomedical data sets known as Big Data.

As most medical journals and hospitals have entered the digital age, there has been a surge in research studies and mountains of electronic health records (EHR). However, these data are stored in different formats, making it particularly difficult for researchers to compare and analyse them. “Researchers and consumers need a way to easily access and make sense of these gold mines of information to benefit patients,” said the UCLA centre’s principal investigator Peipei Ping, a Professor of Physiology, Medicine and Bioinformatics at the David Geffen School of Medicine.

As one of 11 such centres across the United States, the UCLA centre will create analytic tools to address the daunting challenges researchers face in accessing, standardising and sharing scientific data to foster new discoveries. The centre will also be responsible for developing data science approaches for use by scientists and the public.

Tools to Analyse Data on Protein Markers Linked to Heart Disease

A key task is to create and test cloud-based tools for integrating and analysing data about protein markers linked to cardiovascular disease. For this work, UCLA’s David Geffen School of Medicine and Henry Samueli School of Engineering and Applied Science will be collaborating with five other institutes: The Scripps Research Institute, Scripps Health, University of Mississippi Medical Center, Sage Bionetworks, and the UK's European Bioinformatics Institute.

The findings of this joint research will help shape guidelines for future data integration and use, analysis of genomic data and management of EHR data. To protect patient privacy, the investigators will run their data analytics platforms under multiple layers of security.

Experts expect Big Data to evolve into an integral component of future cardiac treatments. As such, the UCLA centre is planning a curriculum to train cardiology fellows and clinicians in how to utilise Big Data, according to Dr. Karol Watson, Professor of Medicine in the Division of Cardiology at the David Geffen School of Medicine and Director of the UCLA Barbra Streisand Women’s Heart Health Program.

A Single Computer Program to Standardise Data

A long-term goal is to standardise data and integrate multiple approaches into a single computer program, enabling the user to push a single button for information instead of struggling with numerous fragmented methods, Prof. Ping explained.
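The article does not describe how such a program would work internally. Purely as an illustration of the standardisation step, the Python sketch below maps patient records that arrive with different source-specific field names onto one common schema; all field and source names here are hypothetical and are not taken from the UCLA centre’s actual tools.

```python
# Illustrative only: a toy harmoniser that maps records arriving in different
# source formats onto one common schema. Field names such as "pt_id", "dob"
# and "hr" are hypothetical examples, not the centre's real data model.
from dataclasses import dataclass
from typing import Optional


@dataclass
class StandardRecord:
    patient_id: str
    birth_date: str                    # ISO 8601, e.g. "1960-05-14"
    heart_rate_bpm: Optional[int] = None


# Each source system labels the same clinical fields differently.
FIELD_MAPS = {
    "hospital_a": {"patient_id": "pt_id", "birth_date": "dob", "heart_rate_bpm": "hr"},
    "hospital_b": {"patient_id": "MRN", "birth_date": "DateOfBirth", "heart_rate_bpm": "HeartRate"},
}


def standardise(raw: dict, source: str) -> StandardRecord:
    """Translate one raw record from a known source into the common schema."""
    mapping = FIELD_MAPS[source]
    return StandardRecord(
        patient_id=str(raw[mapping["patient_id"]]),
        birth_date=str(raw[mapping["birth_date"]]),
        heart_rate_bpm=int(raw[mapping["heart_rate_bpm"]]) if mapping["heart_rate_bpm"] in raw else None,
    )


if __name__ == "__main__":
    a = standardise({"pt_id": "A-001", "dob": "1958-03-02", "hr": 72}, "hospital_a")
    b = standardise({"MRN": "B-778", "DateOfBirth": "1964-11-30"}, "hospital_b")
    print(a)
    print(b)
```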

For example, a healthy person’s medical records may contain 10 pages, while a patient with a chronic disease may have 150 pages of data. “Our centre will develop computational tools to extract keywords and summarise the most critical medical information for physicians,” Prof. Ping said. “We want to streamline digital access to patient files and allow clinicians to quickly grasp the most relevant details of each case.”
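Again, the article does not specify the centre’s methods. The minimal Python sketch below only illustrates the general idea of surfacing frequent clinical terms from a long free-text record using plain term frequency with a stop-word filter; the stop-word list and sample note are invented for the example.

```python
# Illustrative only: a crude keyword extractor that surfaces frequent clinical
# terms from a free-text record. This is a sketch of the general idea, not the
# UCLA centre's actual summarisation method.
import re
from collections import Counter

STOP_WORDS = {
    "the", "a", "an", "and", "or", "of", "to", "in", "on", "with", "for",
    "is", "was", "were", "has", "had", "no", "not", "patient", "shows",
}


def extract_keywords(text: str, top_n: int = 5) -> list[str]:
    """Return the most frequent non-stop-word terms in a clinical note."""
    tokens = re.findall(r"[a-z][a-z\-]+", text.lower())
    counts = Counter(t for t in tokens if t not in STOP_WORDS)
    return [term for term, _ in counts.most_common(top_n)]


if __name__ == "__main__":
    note = (
        "Patient with long-standing hypertension and type 2 diabetes. "
        "Echocardiogram shows reduced ejection fraction. Hypertension poorly "
        "controlled; diabetes managed with metformin. Ejection fraction 35%."
    )
    print(extract_keywords(note))  # e.g. ['hypertension', 'diabetes', 'ejection', ...]
```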

The goals of the BD2K initiative are in line with UCLA’s mission to broaden understanding of Big Data tools to enhance quality of patient care, emphasised Dr. A. Eugene Washington, Vice Chancellor for UCLA Health Sciences and Dean of UCLA's David Geffen School of Medicine. “Our leadership in forging global partnerships speaks to the strength of our commitment to provide the best possible environment for investigations in cutting-edge fields like Big Data science.”

Source: Newswise.com
Image Credit: UCLA's David Geffen School of Medicine
