lablup.com

Jan 7, 2026

Lablup unveils 'Backend.AI:GO' at CES 2026: Run LLMs on personal computers without cloud connectivity

- Enterprise AI infrastructure now extends to personal PCs while maintaining centralized management
- Cloud version Backend.AI:DOL also launches, completing full-stack coverage from AI PCs to hyperscale clusters

[Photo: Lablup at CES 2026]

LAS VEGAS, January 6, 2026 – Lablup, an AI infrastructure platform company, today announced the official launch of Backend.AI:GO at CES 2026. The desktop application enables users to run Small Language Models (SLMs) directly on personal computers without cloud servers. This marks Lablup's fourth consecutive year exhibiting at CES.

Backend.AI:GO (where "GO" stands for "Generative On-device") accelerates AI inference using local computing resources in on-premises environments. Once users download a model, the application operates entirely offline without requiring an internet connection. User data is never transmitted to external servers. Users can securely download models to their personal computers and freely utilize them for internal document analysis, image understanding and generation, code review, and more. The application also allows users to adjust hyperparameters and resource allocation to customize model response characteristics and resource usage.

The key differentiator of Backend.AI:GO lies in its integration with Lablup's AI infrastructure management platform, Backend.AI. By connecting multiple Backend.AI:GO instances or integrating with existing Backend.AI environments and cloud AI services such as OpenAI and Google, users can leverage both large-scale AI models and local PC resources in a unified manner. Organizations can extend the same management approach used for thousands of GPU clusters to AI PCs, enabling centralized model deployment, version control, and usage monitoring from a single console.

Lablup will also officially launch Backend.AI:DOL at CES 2026. Previously introduced as a beta version at AI Infra Summit, Backend.AI:DOL (short for "Deployable Omnimedia Lab") is a cloud-based solution that allows users to access various open models directly through a web browser. With Backend.AI:GO handling on-device workloads and Backend.AI:DOL managing cloud deployments, Lablup now offers a complete full-stack AI infrastructure solution spanning from AI PCs to hyperscale GPU clusters.
"As the AI PC era accelerates, enterprise demand for running LLMs directly in on-device environments is growing rapidly," said Jungkyu Shin, CEO of Lablup. "With Backend.AI:GO and DOL, we will provide an environment where organizations can operate AI infrastructure with a consistent experience, regardless of scale, from a single PC to thousands of GPU clusters."

Shin added, "Lablup has expanded its presence in the global AI infrastructure market through four consecutive years of CES participation. At this year's CES, we are introducing a full-stack AI infrastructure solution that spans on-device and cloud, setting a new standard for AI infrastructure operations."

Lablup is exhibiting at CES 2026 in LVCC North Hall, Booth #9529, from January 6–9, 2026.

About Lablup
Lablup Inc. is redefining how modern AI systems are powered by delivering the operating model for scalable, high-performance GPU computing. Its flagship platform, Backend.AI, combines software-defined GPU virtualization, ultra-fast containerized environments, and intelligent workload orchestration to maximize performance, utilization, and efficiency across heterogeneous clusters.

Founded in 2015, Lablup works with global technology leaders, research institutions, and top universities to advance AI infrastructure and accelerate data-driven innovation worldwide. Learn more at lablup.com and follow Lablup on LinkedIn.


© Lablup Inc. All rights reserved.

KR Office: 8F, 577, Seolleung-ro, Gangnam-gu, Seoul, Republic of Korea
US Office: 3003 N First St, Suite 221, San Jose, CA 95134