
A Machine-Learning Approach Using PET-Based Radiomics to Predict the Histological Subtypes of Lung Cancer

DC Field / Value
dc.contributor.author: Hyun, SH
dc.contributor.author: Ahn, MS
dc.contributor.author: Koh, YW
dc.contributor.author: Lee, SJ
dc.date.accessioned: 2022-01-14T05:16:42Z
dc.date.available: 2022-01-14T05:16:42Z
dc.date.issued: 2019
dc.identifier.issn: 0363-9762
dc.identifier.uri: http://repository.ajou.ac.kr/handle/201003/19967
dc.description.abstract:
PURPOSE: We sought to distinguish lung adenocarcinoma (ADC) from squamous cell carcinoma using a machine-learning algorithm with PET-based radiomic features.
METHODS: A total of 396 patients with 210 ADCs and 186 squamous cell carcinomas who underwent FDG PET/CT prior to treatment were retrospectively analyzed. Four clinical features (age, sex, tumor size, and smoking status) and 40 radiomic features were investigated with respect to histological subtype prediction. Radiomic features were extracted from the PET images of segmented tumors using the LIFEx package. The clinical and radiomic features were ranked, and a subset of useful features was selected on the basis of Gini coefficient scores reflecting their association with histological class. The areas under the receiver operating characteristic curves (AUCs) of classifications afforded by several machine-learning algorithms (random forest, neural network, naive Bayes, logistic regression, and a support vector machine) were compared and validated via random sampling.
RESULTS: We developed and validated a PET-based radiomic model predicting the histological subtypes of lung cancer. Sex, SUVmax, gray-level zone length nonuniformity, gray-level nonuniformity for zone, and total lesion glycolysis were the 5 best predictors of lung ADC. The logistic regression model outperformed all other classifiers by AUC (AUC = 0.859, accuracy = 0.769, F1 score = 0.774, precision = 0.804, recall = 0.746), followed by the neural network model (AUC = 0.854, accuracy = 0.772, F1 score = 0.777, precision = 0.807, recall = 0.750).
CONCLUSIONS: A machine-learning approach successfully identified the histological subtypes of lung cancer. PET-based radiomic features may help clinicians improve histopathologic diagnosis in a noninvasive manner.
dc.subject.MESH: Aged
dc.subject.MESH: Area Under Curve
dc.subject.MESH: Bayes Theorem
dc.subject.MESH: Carcinoma, Squamous Cell
dc.subject.MESH: Female
dc.subject.MESH: Humans
dc.subject.MESH: Image Processing, Computer-Assisted
dc.subject.MESH: Lung Neoplasms
dc.subject.MESH: Machine Learning
dc.subject.MESH: Male
dc.subject.MESH: Middle Aged
dc.subject.MESH: Positron Emission Tomography Computed Tomography
dc.subject.MESH: ROC Curve
dc.subject.MESH: Retrospective Studies
dc.title: A Machine-Learning Approach Using PET-Based Radiomics to Predict the Histological Subtypes of Lung Cancer
dc.type: Article
dc.identifier.pmid: 31689276
dc.subject.keyword: adenocarcinoma
dc.subject.keyword: machine learning
dc.subject.keyword: non–small cell lung cancer
dc.subject.keyword: PET
dc.subject.keyword: texture analysis
dc.contributor.affiliatedAuthor: Ahn, MS
dc.contributor.affiliatedAuthor: Koh, YW
dc.contributor.affiliatedAuthor: Lee, SJ
dc.type.local: Journal Papers
dc.identifier.doi: 10.1097/RLU.0000000000002810
dc.citation.title: Clinical nuclear medicine
dc.citation.volume: 44
dc.citation.number: 12
dc.citation.date: 2019
dc.citation.startPage: 956
dc.citation.endPage: 960
dc.identifier.bibliographicCitation: Clinical nuclear medicine, 44(12): 956-960, 2019
dc.embargo.liftdate: 9999-12-31
dc.embargo.terms: 9999-12-31
dc.identifier.eissn: 1536-0229
dc.relation.journalid: J003639762
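
The METHODS paragraph of the abstract above describes a standard radiomics workflow: Gini-based feature ranking, selection of a small feature subset, and comparison of five classifiers by ROC AUC under repeated random sampling. The sketch below illustrates that workflow with scikit-learn; it is not the authors' code, and the synthetic feature matrix, estimators, and hyperparameters are illustrative assumptions (the study extracted its radiomic features with the LIFEx package and used its own validation protocol).

# A minimal sketch of the workflow described in the abstract's METHODS section.
# The feature matrix is a synthetic stand-in; all model settings are assumptions.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.feature_selection import SelectFromModel
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import StratifiedShuffleSplit, cross_val_score
from sklearn.naive_bayes import GaussianNB
from sklearn.neural_network import MLPClassifier
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Synthetic stand-in for the 396-patient table of 4 clinical + 40 radiomic features.
X, y = make_classification(n_samples=396, n_features=44, n_informative=10,
                           random_state=0)

# Rank features by random-forest Gini importance and keep the top 5,
# mirroring the Gini-score-based feature selection described in the abstract.
select_top5 = SelectFromModel(
    RandomForestClassifier(n_estimators=500, random_state=0),
    threshold=-np.inf, max_features=5)

classifiers = {
    "random forest": RandomForestClassifier(n_estimators=500, random_state=0),
    "neural network": MLPClassifier(max_iter=2000, random_state=0),
    "naive Bayes": GaussianNB(),
    "logistic regression": LogisticRegression(max_iter=1000),
    "support vector machine": SVC(),
}

# Repeated random train/test sampling; AUC is the headline comparison metric.
cv = StratifiedShuffleSplit(n_splits=20, test_size=0.3, random_state=0)
for name, clf in classifiers.items():
    pipeline = make_pipeline(StandardScaler(), select_top5, clf)
    aucs = cross_val_score(pipeline, X, y, cv=cv, scoring="roc_auc")
    print(f"{name}: mean AUC = {aucs.mean():.3f} (SD {aucs.std():.3f})")

Wrapping scaling and feature selection inside the pipeline keeps each random split free of information leakage, since the feature ranking is refit on the training portion of every split.
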
Appears in Collections:
Journal Papers > School of Medicine / Graduate School of Medicine > Hematology-Oncology
Journal Papers > School of Medicine / Graduate School of Medicine > Pathology
Journal Papers > School of Medicine / Graduate School of Medicine > Nuclear Medicine & Molecular Imaging
Files in This Item:
There are no files associated with this item.
