The tumor, node, metastasis (TNM) staging system enables clinicians to describe the spread of head and neck squamous cell carcinoma (HNSCC) in a standardized manner to assist with the assessment of disease status, prognosis, and management. This study aims to predict TNM staging for HNSCC via hybrid machine learning systems (HMLSs) and radiomics features. In our study, 408 patients from The Cancer Imaging Archive (TCIA) database were included in a multi-center setting. PET images were registered to CT, enhanced, and cropped. We created 9 datasets, including CT-only, PET-only, and 7 PET-CT fusion sets. Next, 215 radiomics features were extracted from the physician-segmented HNSCC tumors via our standardized SERA radiomics package. We employed multiple HMLSs, including 16 feature-extraction algorithms (FEAs) and 9 feature-selection algorithms (FSAs) linked with 8 classifiers optimized via a grid-search approach, with model training, fine-tuning, and selection (5-fold cross-validation; 319 patients), followed by external testing of the selected models (89 patients). Datasets were normalized using the z-score technique, and accuracy was reported to compare models. We first applied the datasets with all features to the classifiers alone; an accuracy of 0.69 ± 0.06 was achieved for PET applied to the Random Forest classifier (RFC), and the external-testing performance (~0.62) confirmed this finding. Subsequently, we employed FSAs/FEAs prior to the classifiers. We achieved an accuracy of 0.70 ± 0.03 for Curvelet transform (fusion) + Correlation-based Feature Selection (FSA) + K-Nearest Neighbor (classifier), and 0.70 ± 0.05 for PET + LASSO (FSA) + RFC (classifier). External-testing accuracies (0.65 and 0.64) also confirmed these findings. Other HMLSs applied to some fused datasets yielded comparable performance. We demonstrate that classifiers or HMLSs linked with PET only and PET-CT fusion techniques enabled only modest improvements in accuracy for predicting TNM stage; meanwhile, the combination of PET and RFC enabled good prediction of TNM stage in HNSCC.
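As a concrete illustration of one HMLS configuration from this workflow (z-score normalization, LASSO feature selection, and a Random Forest classifier tuned by grid search with 5-fold cross-validation, then checked on a held-out external cohort), a minimal scikit-learn sketch follows. The placeholder arrays standing in for the 215 SERA radiomics features and TNM labels, and the hyper-parameter grid, are illustrative assumptions rather than the study's actual data or settings.

```python
# Minimal sketch of one HMLS configuration: z-score normalization -> LASSO-based
# feature selection -> Random Forest classifier, tuned by grid search with
# 5-fold cross-validation and evaluated on a held-out external cohort.
import numpy as np
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.feature_selection import SelectFromModel
from sklearn.linear_model import Lasso
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import GridSearchCV

# Placeholder data: synthetic stand-ins for 215 radiomics features per patient
# and TNM stage labels (0-3), split 319 training / 89 external-test patients.
rng = np.random.default_rng(0)
X_train, y_train = rng.normal(size=(319, 215)), rng.integers(0, 4, size=319)
X_test, y_test = rng.normal(size=(89, 215)), rng.integers(0, 4, size=89)

pipe = Pipeline([
    ("zscore", StandardScaler()),                                      # z-score normalization
    ("lasso_fs", SelectFromModel(Lasso(alpha=0.01, max_iter=10000))),  # LASSO feature selection (FSA)
    ("rfc", RandomForestClassifier(random_state=0)),                   # Random Forest classifier (RFC)
])

# Illustrative hyper-parameter grid; the study's actual grid is not specified in the abstract.
param_grid = {
    "lasso_fs__estimator__alpha": [0.001, 0.005, 0.01],
    "rfc__n_estimators": [100, 300, 500],
    "rfc__max_depth": [None, 5, 10],
}

search = GridSearchCV(pipe, param_grid, cv=5, scoring="accuracy", n_jobs=-1)
search.fit(X_train, y_train)                            # 5-fold CV on the training cohort
print("Cross-validated accuracy:", search.best_score_)
print("External-test accuracy:", search.score(X_test, y_test))
```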
Multi-level multi-modality fusion radiomics is a promising technique with the potential to improve the prognostication of cancer. We aim to use advanced fusion techniques on PET and CT images coupled with deep learning (DL) to improve outcome prediction in head and neck squamous cell carcinoma (HNSCC). In our study, 408 HNSCC patients from The Cancer Imaging Archive (TCIA) were included in a multi-center setting. Prognostic outcomes (binary classification) included overall survival (OS), distant metastasis (DM), locoregional recurrence (LR), and progression-free survival (PFS). We utilized a DL algorithm with a 17-layer 3D convolutional neural network (CNN) architecture. Prior to training, each image underwent min-max normalization and image augmentation using random rotations (0-20°) to improve the performance and generalizability of our model, followed by 5-fold cross-validation. We employed 12 datasets, including CT, PET, and 10 image-level fused datasets. The best OS performance was achieved via the discrete wavelet transform (DWT), resulting in a mean accuracy of 0.93 ± 0.06. The best DM score was achieved via the ratio of low-pass pyramid (RLPP), resulting in an accuracy of 0.95 ± 0.02. Optimal LR and PFS scores were achieved using DWT and RLPP for LR, and the Laplacian pyramid for PFS, resulting in accuracies of 0.90-0.92. Comparatively, when using a machine learning framework instead of deep learning, we obtained accuracies of 0.83, 0.90, and 0.87 for the prediction of OS, DM, and LR, respectively. Our study demonstrates that our multi-modality fusion techniques outperformed standalone PET or CT in the prognostication of HNSCC patients, and that a high level of accuracy can be achieved when combining multi-modality fusion techniques with DL.
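To illustrate the image-level fusion step, below is a minimal sketch of PET-CT fusion via a discrete wavelet transform using PyWavelets, preceded by min-max normalization as described above. The coefficient-merging rule (averaged approximation band, maximum-magnitude detail bands) and the placeholder volumes are assumptions for illustration; the study's exact fusion settings are not given in the abstract.

```python
# Minimal sketch of image-level PET/CT fusion via a discrete wavelet transform (DWT),
# one of the fusion strategies described above.
import numpy as np
import pywt

def minmax(img):
    """Min-max normalize an image volume to [0, 1]."""
    img = img.astype(np.float32)
    return (img - img.min()) / (img.max() - img.min() + 1e-8)

def dwt_fuse(ct, pet, wavelet="db1"):
    """Fuse two co-registered 3D volumes in the wavelet domain."""
    ct_coeffs = pywt.dwtn(minmax(ct), wavelet)      # single-level n-dimensional DWT
    pet_coeffs = pywt.dwtn(minmax(pet), wavelet)
    fused = {}
    for key in ct_coeffs:
        if key == "a" * ct.ndim:                    # approximation band: average
            fused[key] = 0.5 * (ct_coeffs[key] + pet_coeffs[key])
        else:                                       # detail bands: keep larger-magnitude coefficients
            fused[key] = np.where(
                np.abs(ct_coeffs[key]) >= np.abs(pet_coeffs[key]),
                ct_coeffs[key], pet_coeffs[key],
            )
    return pywt.idwtn(fused, wavelet)               # reconstruct the fused volume

# Placeholder volumes standing in for co-registered CT and PET patches.
ct = np.random.rand(64, 64, 64).astype(np.float32)
pet = np.random.rand(64, 64, 64).astype(np.float32)
fused = dwt_fuse(ct, pet)
print(fused.shape)  # (64, 64, 64), a fused input volume for the 3D CNN
```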