Keywords:
Image verification, Arteriosclerosis, Outcomes analysis, Computer Applications-General, Computer Applications-3D, Neural networks, Image manipulation / Reconstruction, CT-Quantitative, Computer applications, Cardiac
Authors:
H. Lyshkow1, N. Reichek2, T. Sippel Schmidt3, E. Lyshkow4, W. Balcavage5, J. Tang2, A. Katz2, J. J. Cao2; 1Bristol/UK, 2Roslyn/US, 3Heartland/US, 4Bath/UK, 5Terre Haute/US
DOI:
10.1594/ecr2017/C-0849
Conclusion
The retrospective capacity to derive new findings from existing datasets is hampered by difficulties in reading legacy media and file formats, as well as by errors in vendor interpretation of data standards. As healthcare facilities collect ever-increasing amounts of data, the ability to break down the monolithic data silos found within Radiology, Cardiology, and other data-intensive specialties offers the potential to open new doors into the understanding of disease and health.
Proper preparation of existing data is mandatory for gleaning new insights and creating a sustainable data lifecycle (Figure 9), whether the data are used to feed machine learning algorithms or to support novel investigations of previously collected datasets.
In the case of the St. Francis Heart Study data, risk factor assessment has changed markedly since the study was designed, and new questions arise continually that can be answered by in-depth analysis of the previously collected data. Further, frozen plasma and serum from the original SFHS are still available. Using modern, inexpensive tools, these samples can provide genomic, proteomic, and metabolomic data which, when coupled with the original SFHS data, will likely enable the discovery of many new cardiac biomarkers, leading to greatly improved risk analysis and new cardiovascular therapies.