Please use this identifier to cite or link to this item: https://hdl.handle.net/20.500.11851/12158
Full metadata record
DC Field | Value | Language
dc.contributor.author | Arslan, D. | -
dc.contributor.author | Sehlaver, S. | -
dc.contributor.author | Guder, E. | -
dc.contributor.author | Temena, M.A. | -
dc.contributor.author | Bahcekapili, A. | -
dc.contributor.author | Ozdemir, U. | -
dc.contributor.author | Acar, A. | -
dc.date.accessioned | 2025-03-22T20:56:06Z | -
dc.date.available | 2025-03-22T20:56:06Z | -
dc.date.issued | 2025 | -
dc.identifier.issn | 2405-8440 | -
dc.identifier.uri | https://doi.org/10.1016/j.heliyon.2025.e42467 | -
dc.identifier.uri | https://hdl.handle.net/20.500.11851/12158 | -
dc.description.abstract | Routine pathology assessment for tumor grading is currently performed under the microscope by experienced pathologists, a process that is prone to interpersonal variability and requires years of experience. Over the past decade, whole-slide scanning technology has made it possible to generate whole-slide images. This provides an opportunity to extract the vision-based information latent in these images and to automate and assist pathologists' daily workflow. To this end, key machine learning algorithms have been developed that enable automatic segmentation of pathology slides. Here, in this study, we present a novel dataset for Colorectal Cancer Tumor Grade Segmentation, which contains a total of 103 whole-slide images. The ground-truth annotations for these images were obtained from two independent pathologists. The annotations include pixelwise segmentation masks for the "Grade-1", "Grade-2", and "Grade-3" tumor classes, and "Normal-mucosa" for the normal class. To establish baseline results for this dataset, we trained and evaluated prominent convolutional neural network (CNN) and transformer models. Our results show that SwinT, a transformer-based model, achieves a 63 % mean-Dice score, outperforming the other transformer-based models and all CNN-based models, in line with the recent success of transformer-based models in the field of computer vision. Most importantly, our new dataset addresses the absence of publicly available datasets for tumor segmentation. Taken together, the findings of our study indicate that integrating various deep neural network architectures is promising for facilitating more unbiased and consistent tumor grading of colorectal cancer, using a novel dataset that is publicly available to all researchers. © 2025 The Authors | en_US
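The abstract reports baseline quality as a mean-Dice score over the tumor-grade and normal classes. As a minimal illustrative sketch (not the authors' code), the per-class Dice coefficient and its mean can be computed from integer-labelled segmentation masks as follows; the toy masks, class labels, and NumPy usage are assumptions for illustration:

```python
import numpy as np

def dice_score(pred, target, cls):
    # Dice coefficient for one class: 2*|A ∩ B| / (|A| + |B|)
    p = (pred == cls)
    t = (target == cls)
    denom = p.sum() + t.sum()
    if denom == 0:
        return 1.0  # class absent from both masks: count as perfect agreement
    return 2.0 * np.logical_and(p, t).sum() / denom

def mean_dice(pred, target, classes):
    # Unweighted mean of per-class Dice scores
    return sum(dice_score(pred, target, c) for c in classes) / len(classes)

# Toy 2x2 masks: label 0 = normal mucosa, label 1 = a tumor grade (hypothetical)
pred   = np.array([[0, 1], [1, 1]])
target = np.array([[0, 1], [0, 1]])
print(round(mean_dice(pred, target, [0, 1]), 3))  # → 0.733
```

Class 0 scores 2/3 (one pixel agrees out of 1 + 2 labelled), class 1 scores 4/5, so the mean is roughly 0.733; in the paper this metric is averaged over the four annotated classes on whole-slide images.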
dc.description.sponsorship | Amazon Web Services, AWS; Türkiye Academy of Sciences, (118C197); Council of Higher Education Research Universities Support Program, (ADEP-108-2022-11202, ADEP-312-2024-11455) | en_US
dc.language.iso | en | en_US
dc.publisher | Elsevier Ltd | en_US
dc.relation.ispartof | Heliyon | en_US
dc.rights | info:eu-repo/semantics/closedAccess | en_US
dc.subject | Convolutional Neural Networks | en_US
dc.subject | Digital Pathology | en_US
dc.subject | Transformer Models | en_US
dc.subject | Tumor Grade Segmentation | en_US
dc.title | Colorectal Cancer Tumor Grade Segmentation: a New Dataset and Baseline Results | en_US
dc.type | Article | en_US
dc.department | TOBB University of Economics and Technology | en_US
dc.identifier.volume | 11 | en_US
dc.identifier.issue | 4 | en_US
dc.identifier.scopus | 2-s2.0-85217973854 | -
dc.identifier.doi | 10.1016/j.heliyon.2025.e42467 | -
dc.authorscopusid | 59560324100 | -
dc.authorscopusid | 59559116600 | -
dc.authorscopusid | 59559915600 | -
dc.authorscopusid | 57846427100 | -
dc.authorscopusid | 59560324200 | -
dc.authorscopusid | 36165126800 | -
dc.authorscopusid | 15123935000 | -
dc.relation.publicationcategory | Article - International Peer-Reviewed Journal - Institutional Faculty Member | en_US
dc.identifier.scopusquality | Q1 | -
dc.identifier.wosquality | Q2 | -
item.fulltext | No Fulltext | -
item.languageiso639-1 | en | -
item.openairecristype | http://purl.org/coar/resource_type/c_18cf | -
item.cerifentitytype | Publications | -
item.openairetype | Article | -
item.grantfulltext | none | -
crisitem.author.dept | 06.01. Department of Architecture | -
Appears in Collections:Scopus İndeksli Yayınlar Koleksiyonu / Scopus Indexed Publications Collection