Moreh and SGLang team up to showcase distributed inference system on AMD at AI Infra Summit 2025


Introducing a high-efficiency distributed inference system on AMD and unveiling collaborations with Tenstorrent and SGLang

SEOUL, South Korea and SANTA CLARA, Calif., Sept. 11, 2025 /PRNewswire/ -- Moreh, an AI infrastructure software company, unveiled its distributed inference system on AMD and showcased the progress of its collaborations with Tenstorrent and SGLang at the AI Infra Summit 2025 in Santa Clara, held September 9–11.


The AI Infra Summit is the world's largest and most established AI conference dedicated to the infrastructure layer of AI and machine learning. Originally launched as the AI Hardware Summit in 2018, it has evolved from a semiconductor-focused conference into a full-stack AI infrastructure event.

The 2025 summit attracted 3,500 attendees and over 100 partners, with content designed for hardware providers, hyperscalers, and enterprise IT and AI infrastructure specialists building fast, efficient, and affordable AI.

At the Enterprise AI session on September 10, Moreh CEO Gangwon Jo introduced the company's distributed inference system and presented benchmark results showing that it runs the latest deep learning models, such as DeepSeek, more efficiently than comparable NVIDIA-based systems. He also unveiled a next-generation AI semiconductor system combining Moreh's software with Tenstorrent's hardware, offering a range of cost-competitive alternatives to NVIDIA.

During the summit, Moreh co-hosted a presentation with SGLang, a leader in the deep learning inference software ecosystem, and the two companies ran a joint booth and networking sessions. These activities served as an opportunity to further strengthen collaboration with the global AI ecosystem, particularly in the North American market. Moreh also plans to jointly develop an AMD-based distributed inference system with SGLang to accelerate its expansion into the rapidly growing deep learning inference market.

Moreh CEO Gangwon Jo stated, "Moreh possesses the strongest technical capabilities among AMD's global software partners and is currently conducting proof-of-concept (PoC) projects with several leading LLM companies," and added, "Through close collaboration with AMD, Tenstorrent, and SGLang, we aim to establish ourselves as a global company providing customers with diverse AI computing alternatives."

Moreh is developing its own core AI infrastructure engine and, through its foundation-LLM subsidiary Motif Technologies, is building comprehensive technological capabilities that extend into the model domain. At the same time, the company is making its mark in the global market through collaborations with key partners such as AMD, Tenstorrent, and SGLang.


Moreh CEO Gangwon Jo gives a presentation at the AI Infra Summit 2025 on the afternoon of September 10.

View original content to download multimedia: https://www.prnewswire.com/apac/news-releases/moreh-and-sglang-team-up-to-showcase-distributed-inference-system-on-amd-at-ai-infra-summit-2025-302553303.html

SOURCE Moreh