Big Data “Storm Clouds” Gather to Transform Healthcare

I’m posting this from the Amazon #reinvent conference in Las Vegas. It’s only day one of this cloud mega-event, and one thing is already clear: Big data has the potential to solve many of our most serious and longstanding healthcare challenges.

Researchers at Stanford recently developed a deep-learning algorithm that’s better at diagnosing pneumonia than radiologists.

In fact, big data analytics and artificial intelligence (AI) are transforming healthcare with breakthroughs in everything from the early detection of heart failure to predicting who is most likely to contract type 2 diabetes to preventing worldwide pandemics by detecting who has Ebola before boarding a plane. Even the age-old quest for the cure to the common cold is getting serious attention by big data healthcare professionals.

The Big in Big Data Healthcare is Enormous

The challenge, however, is determining which data will lead to new and more effective treatments and cures. How much of this data will need to be stored, and for how long? It’s a serious issue, and one that stumps even the most knowledgeable healthcare big data professionals. Newcomers like Wasabi hope to solve the problem by driving down the cost of cloud storage so researchers can afford to store more data for longer. Because the “big” in big data is enormous.

One human genome represents approximately 130 terabytes (a terabyte is a trillion bytes) of data. Analysts at IDC predict that healthcare data will reach 2.3 zettabytes (a zettabyte is 1 billion terabytes) by 2020. That’s because breakthrough solutions in healthcare require vast and diverse data drawn from a wide variety of sources, locations, and research approaches. Capturing data directly from the human population is also difficult, since there are billions of people, each with their own health issues and complex interactions. The answer lies in capturing human and other data from increasingly intelligent healthcare systems.
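To get a feel for those figures, here is a quick back-of-the-envelope calculation using only the numbers cited above (130 TB per genome, 2.3 ZB projected); it is an illustration, not a forecast of its own:

```python
# Back-of-the-envelope: how many 130 TB genomes would fit in the
# projected 2.3 ZB of healthcare data? Figures are from the article.
TB = 10**12   # bytes in a terabyte
ZB = 10**21   # bytes in a zettabyte

genome_bytes = 130 * TB
projected_bytes = 2.3 * ZB

genomes_storable = projected_bytes / genome_bytes
print(f"~{genomes_storable / 1e6:.1f} million genomes")  # ~17.7 million genomes
```

In other words, even a multi-zettabyte pool holds on the order of tens of millions of full genomes, a small fraction of the world’s population, which is why storage cost per terabyte matters so much.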

The Big Data Healthcare Journey

To produce helpful insights, data first needs to be captured in real time; in other words, the capture process is “always on.” This data is then shared in proprietary and public clouds with internal and external users. Once raw data is captured, a tremendous amount of analysis is needed to make it usable. For example, biologists determine what data needs to be captured and then add their own domain knowledge and expertise to the process. Next, data scientists analyze this “clean” data and transform it into usable formats that provide actionable insights. From there, doctors, researchers, and scientists convert those insights into viable solutions in the form of new medications, supplements, protocols, and therapies.
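The capture, clean, and analyze stages above can be sketched in a few lines. This is a minimal toy illustration of the pipeline shape, not any vendor’s actual system; every function name, field, and threshold here is hypothetical:

```python
# Toy sketch of the capture -> clean -> analyze pipeline described above.
# All names and thresholds are hypothetical, for illustration only.

def capture(stream):
    """Stage 1: continuously capture raw readings ("always on"),
    dropping sensor gaps (None values)."""
    return [r for r in stream if r is not None]

def clean(raw, domain_rule):
    """Stage 2: apply a domain expert's rule (e.g., a biologist's
    plausibility check) to keep only valid readings."""
    return [r for r in raw if domain_rule(r)]

def analyze(clean_data):
    """Stage 3: data scientists turn clean data into an actionable
    insight, here a mean and a simple alert flag."""
    mean = sum(clean_data) / len(clean_data)
    return {"mean": mean, "alert": mean > 100}

# Usage with toy heart-rate readings (bpm); 190 is rejected as implausible
# at rest by the hypothetical domain rule.
readings = [72, None, 68, 190, 75]
insight = analyze(clean(capture(readings), lambda bpm: bpm < 180))
print(insight)
```

Real healthcare pipelines add many layers (de-identification, provenance tracking, regulatory controls), but the staged hand-off from capture to domain cleaning to analysis follows the same pattern.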

The final, and most important, step in the big data journey is the patient, who hopes that all this accumulated knowledge and effort delivers improved health now, plus disease prevention and a longer health span for everyone in the future.

In Summary

For big data to yield actionable insights, interdisciplinary collaboration will be required from professionals in established and emerging fields, including expert systems, data science, cloud computing, storage, AI, and deep learning. At the core of that process will be even bigger data systems and solutions that organize “storm clouds” full of data to help solve known healthcare issues and the new ones that will undoubtedly arise.

To be continued…

