
The Theme - Biomedical Research

(Harvard University - Joyce Yang)


"Biomedical Research in A New Health ICT Framework"





"Biotech is the New Digital" -- Prof. Nicholas Negroponte, Founder, MIT Media Lab

In the 21st century, groundbreaking research and discovery in biomedical research are more interdisciplinary than ever. Biomedical research encompasses basic and applied research activities in the areas of Medicine, Public Health, Pharmacology, Biology, Biochemistry, Chemistry, Physics, Mathematics, Statistics, Engineering, New Materials, Information and Communication Technology (ICT), and related health topics. Biomedical researchers work to understand the biological principles that govern the function of the human body, to discover the mechanisms of disease, and to find innovative ways to treat or cure disease by developing advanced diagnostic tools or new therapeutic strategies for physicians, especially new smart devices that could help transform the detection, prevention, and management of disease. The increased longevity of humans over the past century can be significantly attributed to advances resulting from biomedical sciences research. 

We’re at the cusp of a major revolution in understanding the workings of the human body. According to Google Ventures, the following top eight life sciences technologies are the most promising and will transform medicine: Artificial Intelligence, Understanding the Brain, Reinventing Antibiotics, Battling Cancer, Genetic Repair, Understanding the Microbiome, Organ Generation, and Stem Cells. For example, stem cell research has the potential to revolutionize the way we treat many conditions, including degenerative diseases for which few effective treatments currently exist. Stem cell research is rapidly advancing towards potential therapeutic applications such as tissue and organ replacement, disease modelling and drug testing. Dr. Aaron Ciechanover, winner of the 2004 Nobel Prize in Chemistry, characterizes 21st-century medicine with four P’s: it is personalized, predictive, preventive, and it should be participatory.

New Media, Cloud Computing, and Fog Computing

Modern healthcare is being transformed by new and growing electronic resources, with hospitals generating terabytes of imaging, diagnostic, monitoring, and treatment data. Machine learning (ML) is central to utilizing these rapidly expanding datasets, combing through data across patients, clinics, and hospitals to uncover more effective treatments and practices that increase the quality and longevity of human life. 

The rise of new media has increased communication between people all over the world via the Internet. It gives people on-demand access to content (through cloud computing) anytime, anywhere, on any digital device, along with interactive user feedback and creative participation. New media allows the real-time generation of new, largely unregulated content, including (at least for now) Internet blogs, websites, computer multimedia (e.g., medical audio or speech, real-time or recorded video, high-resolution still images, and so forth), pictures, and other user-generated media. The physical world is becoming a type of information system. 

Pushing computing, control, data storage and processing into the cloud has been a key trend in the past decade. However, cloud alone is encountering growing limitations in meeting the computing and intelligent networking demands of many new systems and applications. Local computing both at the network edge and among the connected things is often necessary to, for example, meet stringent latency requirements, integrate local multimedia contextual information in real time, reduce processing load and conserve battery power on the endpoints, improve network reliability and resiliency, and overcome the bandwidth and cost constraints for long-haul communications. 

The cloud is now "descending" to the network edge and is sometimes diffused onto end-user devices, forming the "Fog". Fog computing will change the information technology industry in the next decade. It enables key applications in wireless 5G, the Internet of Things (IoT), and big data. Fog computing and networking present a new architectural vision in which distributed edge and user devices collaborate with each other and with the clouds to carry out computing, control, networking, and data management tasks. The IoT is likely to be supported by fog computing, in which computing, storage, control and networking power may exist anywhere along the architecture: in data centers, in the cloud, in edge devices such as gateways or routers, in edge equipment itself such as a machine, or in sensors. Fog computing distributes the services of computation, communication, control and storage closer to the network edge, the access network, and the users. 
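To make the latency argument concrete, here is a minimal sketch of a placement policy that decides whether a task runs on a fog node or in the cloud. All names, thresholds, and link parameters are illustrative assumptions, not drawn from any real fog framework:

```python
# Hedged sketch: decide where a task runs based on a crude latency estimate.
# The round-trip time and uplink speed defaults are made-up illustrative values.

def place_task(latency_budget_ms, payload_mb, cloud_rtt_ms=80, uplink_mbps=10):
    """Return 'edge' or 'cloud' for a task with the given latency budget."""
    # Time to ship the payload over the uplink, in milliseconds.
    transfer_ms = payload_mb * 8 / uplink_mbps * 1000
    cloud_total_ms = cloud_rtt_ms + transfer_ms
    # If the cloud round trip plus upload would blow the latency budget,
    # keep the computation at the edge.
    return "edge" if cloud_total_ms > latency_budget_ms else "cloud"

print(place_task(latency_budget_ms=50, payload_mb=1))    # tight budget -> edge
print(place_task(latency_budget_ms=5000, payload_mb=1))  # relaxed budget -> cloud
```

Real fog orchestrators weigh many more factors (battery, reliability, bandwidth cost), but the shape of the decision is the same: compute locally when the round trip to the cloud is too expensive.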

Internet of Things (IoT)

The vision of the "Internet of Things (IoT)", with estimates of 50 to 200 billion connected devices (containing embedded sensors and actuators, etc.) linked through wired and wireless networks by the year 2020, foresees profound changes in the way people, businesses and society interact. 

Today, a variety of devices monitor every sort of patient behavior, from glucose monitors to fetal monitors to electrocardiograms to blood pressure. Many of these measurements require a follow-up visit with a physician. But smarter monitoring devices communicating with other patient devices could greatly refine this process, possibly lessening the need for direct physician intervention and perhaps replacing it with a phone call from a nurse. Other smart devices already in place, such as smart dispensers, can detect whether medicines are being taken regularly at home. If not, they can initiate a call or other contact from providers to get patients properly medicated. The possibilities offered by the healthcare IoT to lower costs and improve patient care are almost limitless. 

However, one of the most central challenges facing the IoT (still very immature, with a long way to go) is enabling seamless interoperability between connections; interoperability at the application level is largely lacking. Merely connecting "things" yields little or no benefit: the vast majority of "things" and data we might collect may have no relevance to the decision we want to make, and a business case is needed to justify the investment. The goal is getting the right data to the right person at the right time to make the right decision. Once organizations started collecting the right data for decision making, applications such as ERPs (Enterprise Resource Planning systems) and analytics applications were re-written or enhanced to use this data. This is now the Internet of Everything (IoE): data, people, things and processes. 

In the IoT, hundreds of incompatible protocols co-exist today. This makes the integration of data and services from various devices extremely complex and costly. It is clearly time to consider how to expand the IoT beyond product silos into Web-scale open ecosystems based on open standards, including those for identification, discovery, and service interoperability across platforms from different vendors. 

In addition, IoT technology implementations will likely raise concerns around data privacy and security. While most of today's devices use secure methods to communicate information to the cloud, they could still be vulnerable to hackers.

(Stata Center, MIT - Yu-Chih Ko)

Web of Things (WoT)

The IoT/IoE is typically enabled and implemented through the Web of Things (WoT). The Web of Things is a term used to describe approaches, software architectural styles and programming patterns that allow real-world objects to be part of the World Wide Web. Just as the Web (application layer) builds on the Internet (network layer), the Web of Things provides an application layer that simplifies the creation of Internet of Things applications. Rather than re-inventing completely new standards, the Web of Things reuses existing and well-known Web standards used in the programmable Web, the semantic Web, the real-time Web and the social Web.

The WoT focuses on software standards and frameworks such as REST, HTTP and URIs to create applications and services that combine and interact with a variety of network devices. It is intended to enable interoperability across IoT Platforms and application domains. Primarily, the WoT provides mechanisms to formally describe IoT interfaces to allow IoT devices and services to communicate with each other, independent of their underlying implementation, and across multiple networking protocols. Secondarily, it provides a standardized way to define and program IoT behavior.
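As a small illustration of this "formally describe IoT interfaces" idea, the sketch below parses a minimal, hypothetical W3C-style Thing Description and resolves the URL a client would fetch to read one device property. The device name, base URL, and field layout are illustrative assumptions, not a real product or a complete rendering of the standard:

```python
import json

# Hedged sketch: a toy Thing-Description-like document for a hypothetical
# blood pressure monitor. Real W3C WoT Thing Descriptions carry much more
# metadata (security schemes, data schemas, @context, etc.).
thing_description = json.loads("""
{
  "title": "BloodPressureMonitor",
  "base": "https://device.example.com",
  "properties": {
    "systolic": {"forms": [{"href": "/properties/systolic"}]}
  }
}
""")

def property_url(td, name):
    """Resolve the URL a client would GET to read one named property."""
    form = td["properties"][name]["forms"][0]
    return td["base"] + form["href"]

print(property_url(thing_description, "systolic"))
# https://device.example.com/properties/systolic
```

The point is the decoupling: a client only needs the description document and plain HTTP, not vendor-specific driver code, to interact with the device.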

The Web will enable a transition from costly monolithic software to open markets of apps. The Web of Things (WoT) vision, which goes beyond the Internet of Things by letting real-world objects and cloud services interact through the Web, will produce large volumes of data related to the physical world, and intelligent solutions are required to enable connectivity, inter-networking, and relevance between the physical world and the corresponding digital-world resources. 

In the Web of Things, any device can be accessed using standard Web protocols. Connecting heterogeneous devices to the Web makes integration across systems and applications much simpler. The use of Web technologies is expected to dramatically reduce the cost of implementing and deploying IoT services. Correspondingly, the WoT brings into focus a wide variety of challenges and opportunities while paving the way to a variety of exciting applications, from individuals to industries. The reality of a hyper-connected world is here today.

Life Sciences, High Definition Medicine, and ICT

Life sciences and ICT are coming together to revolutionize scientific and medical discovery, encompassing the acquisition, transmission, processing, storage and retrieval of biomedical and health information. The general computing trend is to leverage shared web resources and massive amounts of data over the Internet. 

The foundation for a new era of data-driven medicine has been set by recent technological advances that enable the assessment and management of human health at an unprecedented level of resolution (high-definition medicine). Telemedicine, predictive diagnostics, wearable sensors and a host of new apps will transform how people manage their health. With today’s high-throughput sequencing technology, it’s much easier to generate genomic data than to transform it into information or knowledge that can improve human health. We are at the beginning of the genomics revolution. 

The promise of genomics is to revolutionize the treatment of disease and to personalize it. The unprecedented abundance of medically relevant data (e.g., molecular, cellular, organismal, ecological, behavioral, clinical) is driving the use of quantitative methods in medicine. That abundance ranges from detailed information about genes, genetic diseases and the relative efficacy of drugs in diverse patient populations, to three-dimensional imaging of living cells, which gives researchers a more detailed and accurate spatial visualization of the interplay of cells and their components. For example, recent advances have made 3D imaging (e.g., enabling 3D images of living organisms to be obtained with greater speed and precision) a valuable tool for many applications, such as cell biology, developmental biology, neuroscience and cancer research. These new approaches will improve our ability to find better diagnostics, treatments and therapies for diseases. 

Artificial Intelligence (AI), Machine Learning (ML), and Deep Learning (DL) in Medicine

We are currently struggling to find the right information for either lifestyle or therapeutic decisions. Medicine is a field in which technology is much needed. Our increasing expectations of the highest-quality healthcare and the rapid growth of ever more detailed medical knowledge leave the physician without adequate time to devote to each case, and struggling to keep up with the newest developments in their field. Due to lack of time, most medical decisions must be based on rapid judgments of the case, relying on the physician's unaided memory. This could change with Artificial Intelligence (AI). AI is transforming the world of medicine.

AI in medicine is a new research area that combines sophisticated representational and computing techniques with the insights of expert physicians to produce tools for improving health care. ML, referring to computer algorithms that can learn to perform particular tasks on their own by analyzing data, is the science of getting computers to act without being explicitly programmed. ML is one approach to achieving AI. Like a human, an ML application learns by experience and/or instruction. By applying advanced ML capabilities, patients and healthcare providers benefit from more rapid and thorough analysis to translate DNA insights, understand a person's genetic profile and gather relevant information from the medical literature to personalize treatment options. Deep Learning (DL), a technique for implementing ML, has enabled many practical applications of ML and, by extension, of the overall field of AI. Three trends drive the DL revolution: more powerful GPUs, sophisticated neural network algorithms modeled on the human brain, and access to the explosion of data from the Internet. Thanks to DL, AI has a bright future.
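To make "learning by experience rather than explicit programming" tangible, here is a toy nearest-neighbour classifier. The feature values and risk labels are invented for illustration and have no clinical meaning; the point is only that the prediction rule comes from the examples, not from hand-written if/else logic:

```python
# Hedged sketch: a minimal 1-nearest-neighbour classifier. The "model" is just
# the stored examples; prediction copies the label of the closest one.

def nearest_neighbor(train, query):
    """train: list of ((feature1, feature2), label); returns the predicted label."""
    def dist(a, b):
        # Squared Euclidean distance between two feature tuples.
        return sum((x - y) ** 2 for x, y in zip(a, b))
    return min(train, key=lambda item: dist(item[0], query))[1]

# Hypothetical examples: (resting heart rate, systolic BP) -> made-up risk label.
training_data = [
    ((60, 115), "low"),
    ((65, 120), "low"),
    ((95, 150), "high"),
    ((100, 160), "high"),
]
print(nearest_neighbor(training_data, (68, 118)))  # low
print(nearest_neighbor(training_data, (98, 155)))  # high
```

Real clinical ML uses far richer models and rigorously validated data, but the workflow is the same: collect labeled experience, then let the algorithm generalize from it.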

Computer vision is a subdomain of AI that deals with how computers gain high-level understanding by acquiring, processing and analyzing digital images and video. With Deep Learning (DL), many new applications of computer vision technologies have been introduced. For example, computer vision technologies can be used to process medical images, helping doctors detect malignant changes such as tumors and hardening of the arteries and providing highly accurate measurements of organs and blood flow. Some medical startups claim they’ll soon be able to use computers to read X-rays, MRIs, and CT scans more rapidly and accurately than radiologists, to diagnose cancer earlier and less invasively, and to accelerate the search for life-saving pharmaceuticals. Hospitals and imaging centers will be able to interpret images faster and more accurately with fewer radiologists.

"Data is the new oil." If data is the new oil, AI is the new internal combustion engine, converting data into insights, predictions, and recommendations that boost productivity and augment decision-making. AI can spot trends and patterns that we would not otherwise see. That said, AI is not a solution in itself; it is a capability that can be packaged into solutions to increase their effectiveness, often dramatically. Success in any specific AI application depends on integrating context (data). You can't simply install AI software tools and expect problems to be solved.

Healthcare providers will also share the knowledge they glean from treating patients. This is key. In the era of Electronic Health Records (EHRs), it is possible to examine the outcomes of decisions made by doctors. Enabling researchers at these institutions to mine a much larger store of data makes it easier to spot patterns and identify best practices. When it comes to the effectiveness of ML, more data almost always yields better results, and the healthcare sector is sitting on a data goldmine. McKinsey estimates that big data and ML in pharma and medicine could generate a value of up to $100B annually, based on better decision-making, optimized innovation, improved efficiency of research and clinical trials, and new tools for physicians, consumers, insurers, and regulators. 

The list below is by no means complete, but provides a useful lay-of-the-land of some of ML’s impact in the healthcare industry: Scaled Up/Crowdsourced Medical Data Collection, Disease Identification/Diagnosis, Diagnosis in Medical Imaging, Personalized Treatment/Behavioral Modification, Treatment Queries and Suggestions, Drug Discovery/Manufacturing, Clinical Trial Research, Radiology and Radiotherapy, Smart Electronic Health Records, Epidemic Outbreak Prediction, Robotic Surgery, and Automatic Treatment or Recommendation.

Machine Learning in Pharmaceuticals

Machine Learning (ML), one of the most prominent approaches in artificial intelligence, is the future of pharma. The human genome project and thousands of subsequent discoveries at the DNA, RNA, and protein levels were made possible by ML's ability to detect patterns across large and often messy data sets. ML has the potential to expedite the clinical drug discovery and development process by applying sophisticated algorithms to the analysis and mining of different data sources to predict molecule behavior and suitability as drug targets or therapeutic entities.

The current drug discovery process is too lengthy and too expensive. It can take up to 15 years to translate a drug discovery idea from initial inception to a market-ready product, and the industry is currently said to spend well over $1 billion per drug, partly because all the drugs that didn't make it have to be paid for. As our understanding of biology deepens, thanks to the availability of new data and algorithms capable of learning from it, the drug discovery process is being transformed. ML presents the pharmaceutical industry with a real opportunity to do R&D differently, so that it can operate more efficiently and substantially improve success at the early stages of drug development.

The drug discovery process and the researchers who drive the pipelines can be greatly aided by the latest innovations in ML technology. The average biomedical researcher is dealing with a huge amount of new information every day. It is estimated that the bioscience industry sees some 10,000 new publications uploaded daily, from across the globe and across a huge variety of biomedical databases and journals. So it is impossible for researchers to know, let alone process, all of the scientific knowledge relating to their area of investigation. What's more, without the ability to correlate, assimilate and connect all this data, it is impossible to create the new usable knowledge from which new drug hypotheses can be developed. 

ML has a vital role to play in augmenting the work of drug development researchers so that an informed, first analysis of the mass of scientific data can be conducted in order to form essential new knowledge. What was once an entirely hypothesis driven approach where humans posed the questions is shifting toward scientists starting with an outcome and using machine learning to help discover important relationships to that outcome within the data. 

ML will also help in terms of the industry’s selection of patients for clinical trials and enable companies to identify any issues with compounds much earlier when it comes to efficacy and safety. So the industry has much to gain by adopting ML approaches. It can be used to good effect to build a strong, sustainable pipeline of new medicines.

(Stanford University - Hank Ping Han Hsieh)

5G Wireless Network and 5G Technology

The mobile revolution has changed everything, and our future is a world of connected devices. That means enormous needs for infrastructure, speed and support. The next generation of wireless networks, the 'fifth generation' or 5G, is seen as consumer oriented and will change the way we communicate, the way we do business, the way we do everything. The impact of 5G will extend well beyond telecommunications by connecting people, machines and things on a massive scale. 5G networks will combine numerous wireless technologies, such as 4G LTE, Wi-Fi, and millimeter-wave technology, to push mobile connection speeds over 100 megabits per second. 5G will also leverage cloud infrastructure, intelligent edge services and a virtualized network core. 

Instead of point-to-point communications provided by legacy mobile networks, 5G will move packets of data following the most efficient path to their destination. This shift enables real time aggregation and analysis of data, moving wireless technology from communication to computing. Four factors distinguish 5G from its predecessors: connected devices, fast and intelligent networks, back-end services and extremely low latency. These qualities enable a fully connected and interactive world with a variety of applications. 

5G wireless technology will provide the backbone for the IoT (e.g., the Health IoT), greatly improving data transfer speeds and processing power over its predecessors. This combination of speed and computing power will enable new applications for mobile technologies, especially in health care. By 2020, the 5G network is expected to support more than 20 billion connected devices and 212 billion connected sensors, and to enable access to 44 zettabytes of data gathered from a wide range of devices, from smartphones to remote monitoring devices. Healthcare organizations are eager to embrace IoT devices because they save money by keeping patients out of the hospital; if IoT devices can diagnose people in advance, that saves huge costs. 
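A back-of-the-envelope calculation shows why the link speeds mentioned above matter for moving medical imaging data. The 2 GB study size and the slower comparison link are illustrative assumptions:

```python
# Hedged sketch: transfer time for an imaging study at different link speeds.
# Uses decimal units (1 GB = 8000 megabits) to keep the arithmetic plain.

def transfer_seconds(size_gb, link_mbps):
    """Seconds to move size_gb gigabytes over a link of link_mbps megabits/s."""
    return size_gb * 8000 / link_mbps

study_gb = 2  # hypothetical CT study size
print(round(transfer_seconds(study_gb, 10), 1))   # slower mobile link: 1600.0 s
print(round(transfer_seconds(study_gb, 100), 1))  # 100 Mbps 5G link: 160.0 s
```

At 100 Mbps the study moves in under three minutes instead of nearly half an hour, which is the difference between remote image review being practical or not.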

5G networks open up new avenues for the delivery of health care. Instead of bringing patients to a doctor for treatment, 5G networks can connect patients and doctors from across the globe. Digital imaging can be sent anywhere in the world for analysis, expanding access for patients who live far away from health care providers. 5G might even be used for wireless remote surgery. The point of care will move rapidly into the home. With ubiquitous mobile broadband-enabled internet access, connectivity and networking are becoming completely independent of location.

Wireless Healthcare

The rapid evolution of wireless technologies, coupled with advances in related fields such as biosensor design, low-power battery-operated systems, intelligent information management for diagnosis and reporting, genome sequencing, and analytic software, has opened up many new applications for wireless systems in medicine (uHealth, or ubiquitous health). With the inclusion of electronic health care, point-of-care technologies, e-health and m-health protocols, and personalized healthcare/medicine, the medical informatics field is entering another era of massive amounts of information. These medical and health care information databases will lead to new knowledge bases, discoveries in medical research, engineering-oriented developments, and clinical translational research and practice. 

Data sharing is reaping huge rewards in the fight against cancer too. At the individual level, health tracker apps on our mobile devices are sending data to health care providers to improve patient care and provide early-warning signs in at-risk patients. Early detection and monitoring are critical to mounting effective cancer treatments (and speed matters, because cancer treatment is a race against fast-replicating cells). By combining implantable cancer detectors (using new methods in molecular imaging and micro-electromechanical systems (MEMS) technologies) with wireless data transmission technologies, new tools and emerging technologies are on the horizon for continuous monitoring during and after cancer treatment: to signal remission and relapse, or even to trigger micro-scale drug delivery systems for automatic therapeutic interventions. 

A Digital Revolution in Health Care

The convergence of several trends -- wider adoption of electronic health records (EHRs) or electronic medical records (EMRs), advances in mobile technology, and payment reform -- is accelerating the pace of change in how healthcare is delivered. A digital revolution in health care is speeding up. Telemedicine, predictive diagnostics, wearable sensors and a host of new apps (i.e., FDA-approved mobile devices) will transform how people (or e-patients: individuals who are equipped, enabled, empowered and engaged in their health and health care decisions) manage their health. The age of digital health/medicine is here. 

At the intersection of health, technology and health care are the devices and instruments that capture physiological data. Digital health is an approach focused on using such technology to monitor and provide relevant health-related data about individuals. These technologies include a rapidly expanding array of consumer products and wearables, as well as complex clinical care platforms in academic medical centers. These new devices need to be tested and validated, which also falls into the digital health rubric.

Social media and mobile devices have swiftly become ubiquitous in the healthcare industry and integrated into daily life (consider, for example, our incessant need for instantaneous medical diagnoses via the web), and digital health tools like smartphones certainly make this easier. As patients continue to access and share healthcare information (such as sleep patterns, heart rate, activity levels, blood oxygen, glucose levels, and even stress) through various forms of online media (via a smartphone, smartband, or glucose monitor, for instance), healthcare organizations have started to use social media (i.e., Internet-based applications) to better connect with patients and their communities on a wide range of healthcare issues. This initiative could appeal to anyone with an interest in a healthier lifestyle or, more specifically, to patients who suffer from chronic illnesses like heart disease, diabetes, stroke, and hypertension (high blood pressure). We have truly entered the era of peer-to-peer healthcare.

Mobile Devices (or apps) and Biometric Data

Chronic diseases are long-term medical conditions that are generally progressive and are a significant cause of illness and death. These patients need closer health-status monitoring, and the study of their biometric data could allow physicians to foresee crises. A number of technologies can reduce the overall costs of preventing or managing chronic illnesses, including devices that constantly monitor health indicators, devices that auto-administer therapies, and devices that track real-time health data when a patient self-administers a therapy. Because they have increased access to high-speed Internet and smartphones, many patients have started to use mobile applications (i.e., FDA-approved mobile devices or apps) to manage various health needs. For example, having an EKG-accurate (electrocardiogram) monitor strapped to large numbers of wearers (or patients) throughout the day could be hugely beneficial to the study of heart disease. Accurate EKG data generated throughout a normal day, sent automatically to scientists and doctors and combined with other metrics, could help researchers understand more about heart performance, and software could be used to warn users of a heart attack or stroke days in advance. Such data give a practitioner immediate information about a patient and, when collected from large numbers of people, can reveal patterns and trends that are clinically useful. Data collection by patients gives them "ownership" of the process; they become more motivated to track and adjust their behavior to prevent disease, to recognize changes and to follow care plans developed in consultation with their providers. The "Internet of Things and Beyond" will make health monitoring, diagnostics and treatment more personalized (i.e., personalized and precision medicine), timely and convenient, while also lowering costs.
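The "warn users in advance" idea rests on detecting unusual readings in a biometric stream. The sketch below flags heart-rate samples that deviate sharply from a trailing average; the window size, threshold, and readings are illustrative assumptions, not clinical guidance:

```python
# Hedged sketch: flag anomalous readings in a stream of heart-rate samples
# by comparing each value to the mean of the preceding window.

def flag_anomalies(samples, window=3, threshold=25):
    """Return indices whose value deviates from the trailing mean by > threshold."""
    flagged = []
    for i in range(window, len(samples)):
        baseline = sum(samples[i - window:i]) / window
        if abs(samples[i] - baseline) > threshold:
            flagged.append(i)
    return flagged

readings = [72, 75, 74, 73, 76, 130, 74, 75]  # bpm; the 130 spike is the outlier
print(flag_anomalies(readings))  # [5]
```

Production systems use far more sophisticated signal processing and validated models, but every such pipeline contains this core step: a baseline, a deviation measure, and a decision rule.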

Electronic Health Records and Big Data

An electronic health record (EHR) is a digital version of a patient’s paper chart. EHRs are real-time, patient-centered records that make information available instantly and securely to authorized users. EHRs can: (a) contain a patient’s medical history, diagnoses, medications, treatment plans, immunization dates, allergies, radiology images, and laboratory and test results. (b) allow access to evidence-based tools that providers can use to make decisions about a patient’s care. (c) automate and streamline provider workflow.  
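The fields listed above can be pictured as a simple structured record. The layout below is a hypothetical illustration only; real EHR systems use rich interchange standards rather than ad-hoc structures like this:

```python
# Hedged sketch: a minimal EHR-style record mirroring the kinds of fields an
# EHR contains. All values and field names are invented for illustration.
record = {
    "patient_id": "P-0001",
    "allergies": ["penicillin"],
    "medications": [{"name": "metformin", "dose_mg": 500}],
    "diagnoses": ["type 2 diabetes"],
    "lab_results": [{"test": "HbA1c", "value": 6.8, "unit": "%"}],
}

def active_medications(rec):
    """A convenience accessor a provider-facing view might expose."""
    return [m["name"] for m in rec["medications"]]

print(active_medications(record))  # ['metformin']
```

Keeping the record structured (rather than as free-text notes) is what makes the decision-support and workflow-automation benefits described above possible.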

One of the key features of an EHR is that health information can be created and managed by authorized providers in a digital format capable of being shared with other providers across more than one health care organization. EHRs are built to share information with other health care providers and organizations – such as laboratories, specialists, medical imaging facilities, pharmacies, emergency facilities, and school and workplace clinics – so they contain information from all clinicians involved in a patient’s care.  

There are undeniable clinical, operational, and administrative benefits to embracing EHRs in medical care. They provide a clear overview of a patient's history and relevant data, safely store clinical notes, maintain a thorough list of a patient's allergies, make viewing lab and imaging results much easier, and much more. EHRs can truly improve patient care and help increase the level of safety in medical practice. 

However, there are many different EHR systems used in the U.S., each with its own language for representing and sharing data. Critical information is often scattered across multiple facilities, and sometimes it isn't accessible when it is needed most, a situation that plays out every day around the U.S., costing money and sometimes even lives. Technologists and health-care professionals across the globe see blockchain technology as a way to streamline the sharing of health records in a secure way, protect sensitive data from hackers, and give patients more control over their information. But before an industry-wide revolution in medical records is possible, a new technical infrastructure, a custom-built "health-care blockchain", must be constructed.
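The tamper-evidence property that makes blockchains attractive here comes from hash chaining: each entry commits to its predecessor's hash, so altering history is detectable. The sketch below is a teaching toy, not a real ledger design; it omits consensus, signatures, and access control entirely:

```python
import hashlib
import json

# Hedged sketch: a minimal hash chain. Each block stores its predecessor's
# hash and a hash over its own contents.

def add_block(chain, payload):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = json.dumps({"prev": prev_hash, "payload": payload}, sort_keys=True)
    chain.append({"prev": prev_hash, "payload": payload,
                  "hash": hashlib.sha256(body.encode()).hexdigest()})
    return chain

def verify(chain):
    """True iff every block still matches its recorded hash and predecessor."""
    for i, block in enumerate(chain):
        prev = chain[i - 1]["hash"] if i else "0" * 64
        body = json.dumps({"prev": prev, "payload": block["payload"]},
                          sort_keys=True)
        if block["prev"] != prev:
            return False
        if block["hash"] != hashlib.sha256(body.encode()).hexdigest():
            return False
    return True

chain = []
add_block(chain, {"event": "lab result recorded"})
add_block(chain, {"event": "record shared with specialist"})
print(verify(chain))                       # True
chain[0]["payload"]["event"] = "altered"   # tamper with history
print(verify(chain))                       # False
```

In a health-care setting the payloads would more likely be pointers to encrypted records plus access-control events, with the chain providing an auditable, shared history across institutions.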

Implementing electronic health records has been a goal of the U.S. government for years. There’s lots of hope and lots of excitement surrounding the promise of discovery held in the electronic health records that document the process of care. There are many challenges waiting to be resolved before the EHR becomes the new big data, and in fact EHRs themselves are changing. 

(The University of Chicago - Maya Lim)

Big Data Applications and Analytics in Biomedical Research and Healthcare

New medical breakthroughs, as well as the effective management of healthcare in the future, require the integration of data (e.g., crowdsourced data collection) and methods across the different realms of fundamental research, the development of therapeutics (e.g., nanotechnology-based cancer therapeutics), healthcare practice, and massive high-performance computing infrastructure. 

The large volume of data coming from all the different health-monitoring devices and constituting the ‘individualome’ requires large-capacity hardware infrastructure for storage and processing. Such resources can be implemented locally at the data centers associated with hospitals or deployed on secured cloud computing or virtual private server computing environments. In particular, given the sensitivity of the information, only secured, HIPAA-compliant architectures should be considered. 

High-throughput platforms such as microarrays, mass spectrometry, and next-generation sequencing are producing an increasing volume of 'omics data that needs large data storage and computing power. Cloud computing offers massively scalable computing and storage, data sharing, and on-demand, anytime-anywhere access to resources and applications; thus, it may represent the key technology for addressing these issues. 

High-performance analytics, high-speed connections and affordable data storage have made large data-sharing projects possible in healthcare too. Healthcare analytics can help reduce the costs of healthcare, including treatment, medication, and diagnosis. Analytics in this area can also contribute to predicting outbreaks of endemic and epidemic diseases such as SARS and the flu.

In healthcare, big data tools and technologies have the potential to create significant value by improving outcomes while lowering costs for each individual patient. Diagnostic images, genetic test results, and biometric information are increasingly generated and stored in electronic health records, presenting data that is by nature high in volume, variety, and velocity, and thereby necessitating novel ways to store, manage, and process it. There is an urgent need to develop new, scalable, and expandable big data infrastructure and analytical methods that enable healthcare providers to access knowledge about the individual patient, yielding better decisions and outcomes. By combining data sources from hundreds of studies and dozens of companies, researchers, whether at large academic institutions, commercial organizations, or small research labs in remote corners of the world, are finding deeper insights than ever before, getting answers faster, reducing duplication of effort, and improving efficiency.

Data-driven healthcare has its own set of obstacles. Medical data spans different hospitals, districts, and states, and is held in multiple administrative systems. This calls for new tools that help data providers and data users collaborate with each other, which is why the creation of new analytics tools, strategies, and data applications matters right now. Healthcare needs the sophisticated data analysis, in the form of graph analytics, machine learning, and predictive analysis, that other industries are already enjoying.
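To make the predictive-analysis point concrete, here is a minimal sketch of a nearest-neighbor classifier predicting hospital readmission from two features. The patient records and feature names are entirely synthetic assumptions for illustration; a real system would need feature scaling, validation, and far more data.

```python
import math

# Toy, entirely synthetic records: (age, avg_glucose, readmitted 0/1).
history = [
    (45, 5.1, 0),
    (62, 7.8, 1),
    (70, 8.4, 1),
    (33, 4.9, 0),
    (58, 6.2, 0),
]

def predict_readmission(age, glucose, k=3):
    """Predict readmission (0 or 1) by majority vote among the k
    nearest historical patients in (age, glucose) feature space."""
    dists = sorted(
        (math.hypot(age - a, glucose - g), label)
        for a, g, label in history
    )
    votes = [label for _, label in dists[:k]]
    return int(sum(votes) >= (k + 1) // 2)

predict_readmission(68, 8.0)  # resembles the readmitted patients
predict_readmission(40, 5.0)  # resembles the non-readmitted patients
```

The design point is that such a model is only as good as the data it can reach, which is exactly why collaboration between data providers and data users is the bottleneck the text describes.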

Supercomputing and Biomedical Research

Supercomputing (high-throughput computing, HTC, and high-performance computing, HPC) efficiently solves extremely complex or data-intensive problems by concentrating the processing power of multiple parallel computers. A supercomputer operates at the maximum practical performance of current systems, typically measured in petaflops. The majority of supercomputers today run Linux-based operating systems. 
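A petaflop/s is 10^15 floating-point operations per second, which makes back-of-envelope sizing straightforward. The sketch below, with an assumed workload size, shows the ideal runtime arithmetic (ignoring I/O, communication, and scheduling overhead):

```python
PFLOPS = 1e15  # one petaflop/s = 10**15 floating-point operations per second

def runtime_seconds(total_flops, machine_pflops):
    """Ideal runtime of a perfectly parallel workload on a machine of
    the given petaflop/s rating, ignoring all overheads."""
    return total_flops / (machine_pflops * PFLOPS)

# e.g. a hypothetical simulation needing 10**18 operations on a
# 10-petaflop/s system:
runtime_seconds(1e18, 10)  # -> 100.0 seconds (ideal)
```

Real workloads fall short of this ideal, but the arithmetic explains why problems that would take years on a workstation become tractable at supercomputer scale.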

Supercomputing enables problem solving and data analysis that would be simply impossible, too time-consuming, or too costly with standard computers. It enables a revolutionary approach to improving biological understanding, human health, and biosecurity through the application of advanced computational technology, bringing together large-scale simulation, deep analysis of complex and diverse data, and new targeted sensor and measurement technologies. Supercomputing opens up new horizons, offering the possibility of discovering new ways to understand life’s complexity.

For example, supercomputers can greatly accelerate the development of cancer therapies by finding patterns in datasets far too large for human analysis, helping us better understand the complexity of cancer development and identify novel, effective treatments. 

Healthcare 4.0 and The Connected Healthcare Ecosystem of the Future

Big data technologies are increasingly used for biomedical and healthcare informatics research. Large amounts of biological and clinical data have been generated and collected at an unprecedented speed and scale. For example, the new generation of sequencing technologies produces billions of DNA sequence reads per day, and the adoption of electronic health records (EHRs) is capturing large amounts of patient data. Big data applications present new opportunities to discover new knowledge and create novel methods to improve the quality of healthcare.
To address the challenges of big data, innovative technologies are needed. Parallel, distributed computing paradigms, scalable machine learning algorithms, and real-time querying are key to analysis of big data. Distributed file systems, computing clusters, cloud computing, and data stores supporting data variety and agility are also necessary to provide the infrastructure for processing big data. Workflows provide an intuitive, reusable, scalable, and reproducible way to process big data, to gain verifiable value from it, and to apply the same methods to different datasets.
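The workflow idea can be sketched in a few lines: compose named processing steps once, then apply the identical pipeline to any dataset. The step names and data below are illustrative assumptions, not the API of any particular workflow engine.

```python
def make_workflow(*steps):
    """Compose processing steps into one reusable, reproducible pipeline."""
    def run(records):
        for step in steps:
            records = step(records)
        return records
    return run

def drop_missing(xs):
    """Discard missing readings."""
    return [x for x in xs if x is not None]

def normalize(xs):
    """Rescale readings to the [0, 1] range (assumes non-constant data)."""
    lo, hi = min(xs), max(xs)
    return [(x - lo) / (hi - lo) for x in xs]

clean = make_workflow(drop_missing, normalize)

# The same workflow applies unchanged to different datasets:
clean([2.0, None, 4.0, 6.0])    # -> [0.0, 0.5, 1.0]
clean([10.0, 20.0, None, 30.0]) # -> [0.0, 0.5, 1.0]
```

Because the pipeline is a single object, the exact same processing can be rerun, shared, and audited, which is what makes workflow results verifiable and reproducible.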
In the digital age, health interoperability (the ability of different devices, IT systems, and software to communicate, exchange, and use shared data) will play an increasingly important role in providing timely, accurate care based on access to real-time patient health data and records. An emerging open, standards-based technology platform enables health systems, providers, and app vendors to share and integrate health data from multiple sources, making pertinent patient information securely accessible when and where it is needed. Telemedicine success hinges on connections: hospitals need to integrate hardware and software to improve remote care delivery experiences for patients.
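Interoperability ultimately rests on an agreed, machine-readable exchange format. The toy example below passes a patient record between two "systems" as JSON; the field names are illustrative, loosely modeled on the HL7 FHIR Patient resource that standards-based health platforms commonly use, rather than any system's actual schema.

```python
import json

# Sender: serialize a minimal, synthetic patient record to the shared format.
message = json.dumps({
    "resourceType": "Patient",
    "id": "example-001",
    "name": [{"family": "Doe", "given": ["Jane"]}],
    "birthDate": "1980-04-02",
})

def receive(payload):
    """Receiver: consume the record knowing only the agreed format,
    not anything about the sending system."""
    record = json.loads(payload)
    name = record["name"][0]
    return f'{name["given"][0]} {name["family"]}'

receive(message)  # -> "Jane Doe"
```

The point is that neither side depends on the other's internal software, only on the shared schema, which is what lets health systems, providers, and app vendors integrate data from multiple sources.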
Big data and machine learning, the Internet of Things, and mobile patient platforms are starting to change everyday healthcare practice. As a result, the industry is facing a second wave of digitalization, sometimes referred to as “Healthcare 4.0”. The goal is for more patients to be seen, diagnosed, and cared for in more affordable and effective ways, and to recognize that health and care management needs to occur wherever the patient is (a patient-centric business model), not just in hospitals or physician offices. 

Fueled by three converging trends - increasing government support and reimbursement for telehealth services; purpose-built, integrated hardware-software solutions; and the “consumerization of medical devices” - telemedicine is expected to grow rapidly.



<updated by hhw: 1/29/18>


