Kakao releases top-performing lightweight MLLM, MoE AI models

July 24, 2025, 09:37


Kakao has become the first company in Korea to publicly release its top-performing lightweight multimodal large language model (MLLM) and mixture of experts (MoE) model.

The company announced Thursday that it released two new models — the lightweight MLLM Kanana-1.5-v-3b and the MoE-based language model Kanana-1.5-15.7b-a3b — on the global open-source artificial intelligence (AI) platform Hugging Face. The latest release comes just two months after Kakao made available its initial set of four Kanana-1.5 models in May.

“Open-sourcing these models marks a milestone in achieving technical breakthroughs, delivering both cost efficiency and high performance,” said Kim Byung-hak, Kakao’s head of Kanana Alpha. “It’s not just an architectural upgrade; it’s a crucial step toward product-level deployment and technological independence.”

Kanana-1.5-v-3b, developed from scratch entirely with Kakao’s technology, builds on the Kanana 1.5 architecture and is capable of following user instructions and understanding images in both Korean and English.

Continue to the full article at Korea Times

Source: Korea Times
