{"id":121464,"date":"2026-02-17T16:56:10","date_gmt":"2026-02-17T16:56:10","guid":{"rendered":"https:\/\/ekamu.net\/?p=121464"},"modified":"2026-02-17T16:56:10","modified_gmt":"2026-02-17T16:56:10","slug":"yerli-yapay-zeka-cosmos-t1-kullanima-sunuldu","status":"publish","type":"post","link":"https:\/\/ekamu.net\/index.php\/2026\/02\/17\/yerli-yapay-zeka-cosmos-t1-kullanima-sunuldu\/","title":{"rendered":"Homegrown AI model Cosmos T1 now available"},"content":{"rendered":"<p><figure> <span> <img decoding=\"async\" src=\"https:\/\/ekamu.net\/wp-content\/uploads\/2026\/02\/yerli-yapay-zeka-cosmos-t1-kullanima-sunuldu-0-h2Lk8Vzd.jpg\"\/> <\/span> The Cosmos research team at Y\u0131ld\u0131z Teknik \u00dcniversitesi has officially released <strong>Cosmos T1<\/strong>, an open-source AI language model targeting high performance on Turkish natural language processing (NLP) tasks. Built with <strong>9 billion parameters<\/strong>, Cosmos T1 stands out for its ability to learn the complex structure and cultural context of the Turkish language in depth. <\/figure>\n<p>Cosmos T1, a new-generation AI model that can think in Turkish, was built on top of Google&#8217;s Gemma 2 model. On the Turkish GSM8K benchmark, Cosmos T1 reached <strong>77.41 percent accuracy<\/strong>, surpassing both the 70-billion-parameter Llama-3.1-70B model (66.13 percent) and the Gemma-2-9B model (63.10 percent).<\/p>\n<figure> <span> <img decoding=\"async\" src=\"https:\/\/ekamu.net\/wp-content\/uploads\/2026\/02\/yerli-yapay-zeka-cosmos-t1-kullanima-sunuldu-1-I2xJcudu.jpg\"\/> <\/span> More than 200 billion tokens of Turkish data were processed during Cosmos T1\u2019s training. 
Unlike earlier language models, it focused on challenges specific to Turkish through Masked Language Modeling (MLM). Built on a 12-layer decoder-only Transformer architecture, the model is optimized to handle Turkish\u2019s suffixed verb constructions and agglutinative syntax. <\/figure>\n<p>Cosmos T1 has been released as open source on the Hugging Face platform. You can also try Cosmos T1 for free on the Cosmos LLM website.<\/p>\n<p>Emphasizing that, unlike ChatGPT, the team shows step by step how each answer is produced, <strong>Prof. Dr. Mehmet Fatih Amasyal\u0131<\/strong>, a faculty member in the Department of Artificial Intelligence and Data Engineering at Y\u0131ld\u0131z Teknik \u00dcniversitesi, said:<\/p>\n<p>\u201cWe develop models like ChatGPT. T1 has a thought process that differs from the others, and it expresses that thought process in Turkish. Through the interface we can see, step by step, how the answer is produced in the background. In fact, it is in a different lane from ChatGPT. Many sectors today, such as healthcare and the defense industry, cannot share their data with companies of that kind; they do not want to, and it is entirely natural that they don&#8217;t. Instead, they need on-premise solutions, that is, models running on their own machines. 
T1 and all of the other models we produce are released with open weights, and people can install them on their own computers and use them freely and comfortably within their own companies.\u201d<\/p>\n<p>Noting that Cosmos T1 was developed as a reasoning model on top of Google&#8217;s Gemma 2, Amasyal\u0131 added: &#8220;But here we set out to substantially improve its Turkish, and we built a &#8216;thinking&#8217; model. The Gemma 2 model fundamentally has no such feature: Gemma 2 simply answers when asked a question, whereas Cosmos T1 runs a thought process followed by an answer-generation process, which greatly improves model performance. By thinking first and then answering, it performs far better than jumping straight to the answer.&#8221;<\/p>\n\n<p>Source:\u00a0<span style=\"background-color: rgb(255, 249, 236); color: rgb(55, 58, 60); font-size: 14px;\">https:\/\/www.donanimhaber.com\/yerli-yapay-zeka-cosmos-t1-kullanima-sunuldu&#8211;202220<\/span><\/p>\n","protected":false},"excerpt":{"rendered":"<p>The Cosmos research team at Y\u0131ld\u0131z Teknik \u00dcniversitesi has officially released Cosmos T1, an open-source AI language model targeting high performance on Turkish natural language processing (NLP) tasks. 
Built with 9 billion parameters &#8230;<\/p>\n","protected":false},"author":1,"featured_media":121465,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[8],"tags":[7295,566,1891,505,843],"class_list":["post-121464","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-teknoloji","tag-gemma","tag-modeli","tag-modelleri","tag-sureci","tag-turkce"],"_links":{"self":[{"href":"https:\/\/ekamu.net\/index.php\/wp-json\/wp\/v2\/posts\/121464","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/ekamu.net\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/ekamu.net\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/ekamu.net\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/ekamu.net\/index.php\/wp-json\/wp\/v2\/comments?post=121464"}],"version-history":[{"count":1,"href":"https:\/\/ekamu.net\/index.php\/wp-json\/wp\/v2\/posts\/121464\/revisions"}],"predecessor-version":[{"id":121468,"href":"https:\/\/ekamu.net\/index.php\/wp-json\/wp\/v2\/posts\/121464\/revisions\/121468"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/ekamu.net\/index.php\/wp-json\/wp\/v2\/media\/121465"}],"wp:attachment":[{"href":"https:\/\/ekamu.net\/index.php\/wp-json\/wp\/v2\/media?parent=121464"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/ekamu.net\/index.php\/wp-json\/wp\/v2\/categories?post=121464"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/ekamu.net\/index.php\/wp-json\/wp\/v2\/tags?post=121464"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}