Please use this identifier to cite or link to this item: https://research.matf.bg.ac.rs/handle/123456789/1296
DC Field | Value | Language
dc.contributor.author | Savić, Đorđe V. | en_US
dc.contributor.author | Jankov, Isidora | en_US
dc.contributor.author | Yu, Weixiang | en_US
dc.contributor.author | Petrecca, Vincenzo | en_US
dc.contributor.author | Temple, Matthew J. | en_US
dc.contributor.author | Ni, Qingling | en_US
dc.contributor.author | Shirley, Raphael | en_US
dc.contributor.author | Kovačević, Anđelka | en_US
dc.contributor.author | Nikolić, Mladen | en_US
dc.contributor.author | Ilić, Dragana | en_US
dc.contributor.author | Popović, Luka | en_US
dc.contributor.author | Paolillo, Maurizio | en_US
dc.contributor.author | Panda, Swayamtrupta | en_US
dc.contributor.author | Ćiprijanović, Aleksandra | en_US
dc.contributor.author | Richards, Gordon T. | en_US
dc.date.accessioned | 2024-06-05T10:39:05Z | -
dc.date.available | 2024-06-05T10:39:05Z | -
dc.date.issued | 2023-08-20 | -
dc.identifier.issn | 0004-6256 | -
dc.identifier.uri | https://research.matf.bg.ac.rs/handle/123456789/1296 | -
dc.description.abstract | Development of the Rubin Observatory Legacy Survey of Space and Time (LSST) includes a series of Data Challenges (DCs) arranged by various LSST Scientific Collaborations that are taking place during the project's preoperational phase. The AGN Science Collaboration Data Challenge (AGNSC-DC) is a partial prototype of the expected LSST data on active galactic nuclei (AGNs), aimed at validating machine learning approaches for AGN selection and characterization in large surveys like LSST. The AGNSC-DC took place in 2021, focusing on accuracy, robustness, and scalability. The training and the blinded data sets were constructed to mimic the future LSST release catalogs using data from the Sloan Digital Sky Survey Stripe 82 region and the XMM-Newton Large Scale Structure Survey region. Data features were divided into astrometry, photometry, color, morphology, redshift, and class label, with the addition of variability features and images. We present the results of four solutions submitted to the DC, using both classical and machine learning methods. We systematically test the performance of supervised models (support vector machine, random forest, extreme gradient boosting, artificial neural network, convolutional neural network) and unsupervised ones (deep embedding clustering) when applied to the problem of classifying/clustering sources as stars, galaxies, or AGNs. We obtained a classification accuracy of 97.5% for supervised models, a clustering accuracy of 96.0% for unsupervised ones, and 95.0% with a classical approach on a blinded data set. We find that variability features significantly improve the accuracy of the trained models, and that correlation analysis among different bands enables a fast and inexpensive first-order selection of quasar candidates. | en_US
dc.language.iso | en | en_US
dc.publisher | IOP Publishing | en_US
dc.relation.ispartof | Astronomical Journal | en_US
dc.title | The LSST AGN Data Challenge: Selection Methods | en_US
dc.type | Article | en_US
dc.identifier.doi | 10.3847/1538-4357/ace31a | -
dc.identifier.scopus | 2-s2.0-85163759317 | -
dc.identifier.isi | 001046561500001 | -
dc.identifier.url | https://api.elsevier.com/content/abstract/scopus_id/85163759317 | -
dc.contributor.affiliation | Astronomy | en_US
dc.contributor.affiliation | Informatics and Computer Science | en_US
dc.contributor.affiliation | Astronomy | en_US
dc.relation.issn | 0004-637X | en_US
dc.description.rank | M21 | en_US
dc.relation.firstpage | Article no. 138 | en_US
dc.relation.volume | 953 | en_US
dc.relation.issue | 2 | en_US
item.fulltext | No Fulltext | -
item.languageiso639-1 | en | -
item.openairetype | Article | -
item.grantfulltext | none | -
item.openairecristype | http://purl.org/coar/resource_type/c_18cf | -
item.cerifentitytype | Publications | -
crisitem.author.dept | Astronomy | -
crisitem.author.dept | Informatics and Computer Science | -
crisitem.author.dept | Astronomy | -
crisitem.author.orcid | 0000-0001-5139-1978 | -
crisitem.author.orcid | 0000-0002-1134-4015 | -
Appears in Collections:Research outputs
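
The abstract above benchmarks supervised classifiers (among them a random forest) on the star/galaxy/AGN problem. The following is a minimal illustrative sketch of that kind of setup, not the authors' pipeline: the feature names, class means, and all numeric values are invented stand-ins for the catalog's photometric/color/variability features, and scikit-learn's `RandomForestClassifier` is assumed to be available.

```python
# Hypothetical sketch of a supervised star/galaxy/AGN classifier.
# NOT the paper's pipeline: features and class means below are
# invented stand-ins for real catalog features (colors, variability).
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n_per_class = 300

# Three toy features per source: two colors and a variability amplitude.
# Class centers are chosen so the classes are separable for the demo.
class_means = [
    (0.2, 0.1, 0.02),  # "star": bluer, non-variable
    (0.8, 0.5, 0.03),  # "galaxy": redder, non-variable
    (0.4, 0.2, 0.30),  # "AGN": strongly variable
]
X = np.vstack([rng.normal(m, 0.08, size=(n_per_class, 3))
               for m in class_means])
y = np.repeat([0, 1, 2], n_per_class)  # 0=star, 1=galaxy, 2=AGN

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

clf = RandomForestClassifier(n_estimators=100, random_state=0)
clf.fit(X_train, y_train)
acc = accuracy_score(y_test, clf.predict(X_test))
print(f"test accuracy: {acc:.3f}")
```

On such cleanly separated synthetic data the forest classifies nearly perfectly; the paper's reported ~97.5% supervised accuracy refers to its real, much harder data set. Note how the variability feature alone separates the "AGN" class from the others, echoing the abstract's finding that variability features significantly improve the trained models.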

Scopus™ citations: 3 (checked on Nov 10, 2024)
Page view(s): 22 (checked on Nov 14, 2024)

Items in DSpace are protected by copyright, with all rights reserved, unless otherwise indicated.