Cross-Project Collaboration Plan
The Neurotwin project (Grant No. 101017716) is complementary to two other EU Horizon 2020 projects: OPTOMICS (Grant No. 101017802) and DIGIPREDICT (Grant No. 101017915). In collaboration with these complementary projects, the Neurotwin project aims to increase public awareness of digital twin technologies and to streamline the process of developing them and ensuring their success.
Edge AI-deployed DIGItal Twins for PREDICTing disease progression and need for early intervention in infectious and cardiovascular diseases beyond COVID-19
The interplay between viral infection, host response, development of (hyper)inflammation and cardiovascular injury in COVID-19 is currently poorly understood, which makes it difficult to predict which patients will remain with only mild symptoms and which will rapidly develop multi-organ failure. The solution offered by DIGIPREDICT is an Edge Artificial Intelligence (AI)-based, high-tech, personalized computational and physical Digital Twin representing patient-specific (patho)physiology, with embedded disease-progression prediction capability, focusing on COVID-19 and beyond. DIGIPREDICT proposes a first-of-its-kind Digital Twin, designed, developed and calibrated on i) patient measurements of various Digital Biomarkers and their interaction, ii) Organs-on-Chips (OoCs) as a physical counterpart, using patient blood for personalized screening, and iii) integration of these physiological readouts using AI-at-the-Edge technologies. The final goal is to identify and validate patient-specific dynamic digital fingerprints of complex disease states and to predict their progression as a basis for assistive tools for medical doctors and patients. Using and improving state-of-the-art OoCs and Digital Biomarkers (for physiology and biomarkers in interstitial fluid), we will measure the detailed response to viral infection. By closely monitoring this response with wearable multi-modal Edge AI patches, we aim to predict the progression of the disease in near real time, support early clinical decisions, and propose patient-specific therapies using existing drugs. We will combine scientific and technical excellence in a highly multi- and interdisciplinary project, bringing together medical, biological, electronic, computer, signal-processing and social-science communities across Europe to set up Digital Twins at the Edge. We will enable an Edge-to-Cloud vision, significantly advancing the current state of the art and establishing a new European community for researching and applying Digital Twins.
Combining optoacoustic imaging phenotypes and multi-omics to advance diabetes healthcare
Diabetes has emerged as a global pandemic affecting more than 420 million people worldwide, a number expected to rise further in the coming decades. The disease has very heterogeneous outcomes, and accurate patient staging, as well as prediction of which individuals are likely to develop the disease and/or progress to its complications, are currently unmet clinical challenges in urgent need of attention. OPTOMICS aims to research methodology that can deliver a paradigm shift in type 2 diabetes healthcare by integrating 1) molecular phenotyping, 2) a new generation of phenotypic measurements in humans, representative of diabetes onset and progression, enabled by novel portable and non-invasive optoacoustic technology, and 3) cutting-edge computational approaches leveraging progress in Artificial Intelligence. This research will develop and validate a digital twin model that catalyses a step change in shortening the path to translation, enabling applications across the entire spectrum from target identification and prevention/prognosis to patient stratification for type 2 diabetes and its complications. In addition to its research and technology goals, OPTOMICS pays special attention to the ethical needs and implications of the work performed, and further aims at exemplary project management, human measurements, dissemination and communication activities, and maintaining an up-to-date exploitation plan for the digital twin developed.