
Sep 25, 2025

Lablup Hosts lab | up > /conf/5, Presenting the “Make AI Composable” Vision


  • Lablup hosts 5th annual technology conference on September 24 at the aT Center
  • Releases Backend.AI 25.14 with enhanced stability and greater scalability
  • Unveils next-generation products including AI:DOL alongside the third update of the FastTrack MLOps platform

Lablup, a leading AI infrastructure platform company, hosted its fifth annual technology conference ‘lab | up > /conf/5’ on September 24 at the aT Center in Yangjae, Seoul. Now in its fifth year, the event served as a forum to share Lablup’s latest technological achievements and product innovations, while bringing together AI industry leaders and practitioners to discuss challenges and solutions.

Held under the theme “Make AI Composable with Lablup,” the conference featured speakers from Rebellions, Moreh, kt cloud, HyperAccel, Lotte Innovate, Miracom Inc., Microsoft, Teddynote, AIM Intelligence, SKT, Nota AI, NHN Cloud, NAVER Cloud, ETRI, and Modulab, sharing insights on advances across AI hardware, infrastructure, and software. In addition, the conference offered attendees live demonstrations of Lablup’s latest product updates, including Backend.AI 25.14.

Charting the path toward AGI-ready infrastructure

In the opening keynote, Jeongkyu Shin, CEO of Lablup, delivered a talk titled ‘AAA: Agentic, Autonomous, Adaptive Intelligence’. He explained that as the AI field races toward the threshold of AGI, innovations across all layers are required to underpin this progress. Introducing the concept of “quantifying intelligence,” Shin emphasized Lablup’s new vision of transforming into a company that quantitatively supplies intelligence on demand. He also announced technological shifts spanning the Backend.AI Continuum project for fault-tolerant LLM and multimodal operations, a new App Proxy system, Backend.AI Artifact, and the AI-native platform AI:DOL.

In a parallel keynote, CTO Joongi Kim delivered a talk titled ‘Composable AI, Composable Software’, emphasizing that productivity gains hinge on leveraging AI models optimized for diverse purposes and applying them to the right tasks. He underscored the need to marshal the full capabilities of the computing industry, from physical hardware, to abstraction and management software, to AI models, into a cohesive, composable ecosystem.

AI:DOL – A next-generation platform for natural language AI design and prototyping

AI:DOL is Lablup’s new AI-native platform for enterprises and individual developers that aim to deliver AI services to internal and external clients using models deployed on Backend.AI. Built on Backend.AI Continuum, AI:DOL integrates on-premises and cloud resources with seamless failover, enabling users from beginners to experts to easily prototype and create.

Hyounkyoung Moon, who led the development, showcased AI:DOL in the session ‘Building Conversational AI Products with Backend.AI Continuum’, demonstrating its chat-based interface for designing and deploying AI with natural language. Through conversational interactions, users can generate documents, code, and other artifacts, with planned expansion into image, video, and AI coding capabilities.

Backend.AI FastTrack 3 – Workflow automation on hyperscale GPU clusters

The updated Backend.AI FastTrack 3 platform was introduced as a workflow engine running atop Backend.AI, which unifies massive GPU resources into a single pool to power the training and serving of frontier-scale models.

Eunjin Hwang and Jeongseok Kang presented ‘Take the FastTrack 3: A Backend.AI approach to LLMOps’, introducing an independent AI foundation model project aimed at developing a Korean-language model reflecting the nation’s cultural identity. They demonstrated how Backend.AI and FastTrack 3 contribute to efficient model development through workflow orchestration and large-scale GPU utilization.

Backend.AI – The center of cloud-native infrastructure

At the core of Lablup’s portfolio, Backend.AI is an AI infrastructure operating platform built for every scale. Backend.AI version 25.14 has been engineered through Lablup’s ongoing ‘hardening’ process to deliver stronger stability and support AI workloads at unprecedented scale across distributed environments.

Composable AI – unlocking new possibilities

“Lablup has been driving innovation in distributed AI systems and pursuing transformative changes since 2024,” said Jeongkyu Shin, CEO of Lablup. “With the unveiling of AI:DOL, Backend.AI FastTrack 3, and Backend.AI 25.14, we reaffirm our vision as a platform technology company driving a better AI-powered future.”

lab | up > /conf/5 highlighted the importance of composability in AI. By emphasizing modular AI architectures that can be selected and combined as needed, the conference showcased how independently evolving AI components can operate as a single ecosystem, unlocking new possibilities for innovation.


© Lablup Inc. All rights reserved.

KR Office: 8F, 577, Seolleung-ro, Gangnam-gu, Seoul, Republic of Korea
US Office: 3003 N First St, Suite 221, San Jose, CA 95134