
Smart, Precision and Preventive Medicine

[Image: Precision Medicine, Hussien Heshmat]


Transforming Health Through Accurate Understanding of Genes, Environment and Lifestyle

 

 

- Overview

Medicine is difficult because each patient is different. Doctors keep saying that "medicine is not an exact science": individual differences between patients make choosing a therapy and applying it across different clinical scenarios very challenging. How can medicine be tailored to each individual? How can genetic testing guide the choice of antiplatelet therapy? How do genetic polymorphisms determine a patient's response to chemotherapy or to warfarin? How can these tests be incorporated into daily practice? Combining clinical variables, genetic variants, and molecular profiles with artificial intelligence can lead to "precision medicine", as the sketch below illustrates.
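The following is a minimal sketch, not a clinical tool: it combines hypothetical clinical variables (age, weight) and a genetic variant genotype into one feature matrix and fits a simple classifier to synthetic response labels. The feature names, data, and model choice are assumptions for illustration only.

    # Minimal sketch (illustration only): combining clinical variables and a
    # genetic variant into a single feature matrix for response prediction.
    # All feature names, data, and labels below are synthetic/hypothetical.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n_patients = 200

    # Clinical variables: age (years), weight (kg); genetic variant encoded as
    # 0/1/2 copies of the minor allele (e.g., a CYP2C19- or VKORC1-like polymorphism).
    age = rng.normal(60, 10, n_patients)
    weight = rng.normal(75, 12, n_patients)
    variant = rng.integers(0, 3, n_patients)

    X = np.column_stack([age, weight, variant])
    # Synthetic label: 1 = responded to therapy, 0 = did not respond.
    y = (variant + rng.normal(0, 1, n_patients) > 1.5).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)
    print("held-out accuracy:", model.score(X_test, y_test))

In practice, far richer clinical, genomic, and molecular features would be used, but the structure is the same: every data source becomes a column in the feature matrix that the model learns from.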

Precision medicine aims to collect, connect, and apply vast amounts of scientific research data and information about our health in order to understand why individuals respond differently to treatments and therapies, and to help guide more precise and predictive medicine worldwide.

Big data technologies are increasingly used for biomedical and healthcare informatics research. Large amounts of biological and clinical data are being generated and collected at unprecedented speed and scale. For example, the new generation of sequencing technologies enables the processing of billions of DNA sequence reads per day, and the adoption of electronic health records (EHRs) is capturing large amounts of patient data. Big data applications present new opportunities to discover new knowledge and to create novel methods that improve the quality of healthcare. A sketch of how such sequence data can be processed at scale follows.
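As a minimal sketch of handling sequencing output at this scale, the code below streams DNA reads from a FASTQ file one record at a time instead of loading everything into memory. The file name is hypothetical; FASTQ stores each read as four lines (identifier, sequence, separator, per-base quality scores).

    # Minimal sketch: streaming DNA sequence records from a FASTQ file so that
    # very large numbers of reads can be processed without loading them all
    # into memory. The input file name is hypothetical.
    from itertools import islice

    def stream_fastq(path):
        """Yield (read_id, sequence, quality) tuples one record at a time."""
        with open(path) as handle:
            while True:
                record = list(islice(handle, 4))
                if len(record) < 4:
                    return
                read_id, seq, _, qual = (line.rstrip("\n") for line in record)
                yield read_id.lstrip("@"), seq, qual

    # Example use: compute GC content read-by-read instead of in one giant batch.
    if __name__ == "__main__":
        for read_id, seq, _ in stream_fastq("sample_reads.fastq"):  # hypothetical file
            gc = (seq.count("G") + seq.count("C")) / max(len(seq), 1)
            print(read_id, round(gc, 3))

Because the generator yields one record at a time, the same loop works whether the file holds a thousand reads or billions.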
 
To address the challenges of big data, innovative technologies are needed. Parallel and distributed computing paradigms, scalable machine learning algorithms, and real-time querying are key to the analysis of big data. Distributed file systems, computing clusters, cloud computing, and data stores that support data variety and agility are also necessary infrastructure for processing big data. Workflows provide an intuitive, reusable, scalable, and reproducible way to process big data, to gain verifiable value from it, and to apply the same methods to different datasets; a minimal parallel-workflow sketch follows.
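The sketch below illustrates the split/map-in-parallel/combine structure of such workflows using only the standard library on local cores; the per-record analysis step is a hypothetical placeholder, and real deployments would swap in a cluster framework or workflow engine while keeping the same shape.

    # Minimal sketch, assuming a CPU-bound per-record analysis step: distribute
    # the map step across local worker processes, then combine partial results.
    from multiprocessing import Pool

    def analyze(record):
        """Placeholder per-record computation (hypothetical)."""
        return sum(ord(ch) for ch in record) % 97

    def run_workflow(records, workers=4):
        # Map step runs in parallel; the final sum is the combine (reduce) step.
        with Pool(processes=workers) as pool:
            results = pool.map(analyze, records)
        return sum(results)

    if __name__ == "__main__":
        data = [f"patient-record-{i}" for i in range(1_000)]  # synthetic inputs
        print("combined result:", run_workflow(data))

Because the workflow is expressed as a pure map over independent records followed by a combine step, rerunning it on a different dataset only requires changing the input list, which is what makes such workflows reusable and reproducible.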

 

  • Click here for the theme of biomedical research.
  • Click here for the biomedical research possible workshop topics.
  • Click here for the theme of the precision medicine revolution.
  • Click here for the precision medicine possible workshop topics.
  • Click here for the new media possible workshop topics.

     

    [More to come ...]

