Nov 12, 2023

Lablup introduces hyperscale AI services built on Backend.AI at SC23

[Image: Backend.AI package]

  Lablup announced today that it will participate in SC23 (November 12-17) at the Colorado Convention Center in Denver, United States. The conference covers high-performance computing, networking, storage, and analysis, and attracts leading companies from around the world, including AMD, ARM, AWS, IBM, Intel, NVIDIA, and SK hynix.

  At this year's SC23, Lablup will introduce Backend.AI, its platform for building hyperscale AI ecosystems. Backend.AI supports a wide range of GPUs, including NVIDIA CUDA and AMD ROCm devices, and provides optimized support for various NPUs and AI semiconductors such as Google TPU, Graphcore, Rebellions ATOM, and Furiosa Warboy. With a GPU-NPU integrated pipeline that trains AI models on GPUs and serves them on AI semiconductors, it aims to deliver high performance at low cost.
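  The following is an illustrative sketch only, not Backend.AI's actual pipeline interface: it shows the general "train on a GPU, then export the model for serving on an AI accelerator" pattern using plain PyTorch and an ONNX export, with a placeholder model and random data standing in for a real workload.

    # Illustrative sketch, not Backend.AI's API: train on a GPU, then export the
    # trained model to a portable format that an NPU/accelerator runtime could serve.
    import torch
    import torch.nn as nn

    device = "cuda" if torch.cuda.is_available() else "cpu"

    # Placeholder model and random data stand in for a real training workload.
    model = nn.Sequential(nn.Linear(128, 256), nn.ReLU(), nn.Linear(256, 10)).to(device)
    optimizer = torch.optim.AdamW(model.parameters(), lr=1e-3)
    loss_fn = nn.CrossEntropyLoss()

    for _ in range(10):                      # a few dummy training steps on the GPU
        x = torch.randn(32, 128, device=device)
        y = torch.randint(0, 10, (32,), device=device)
        optimizer.zero_grad()
        loss_fn(model(x), y).backward()
        optimizer.step()

    # Export the trained weights to ONNX so a separate accelerator-side runtime
    # (the "serving on AI semiconductors" half of the pipeline) can load them.
    model.eval()
    torch.onnx.export(model.cpu(), torch.randn(1, 128), "model.onnx",
                      input_names=["input"], output_names=["logits"])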

  Furthermore, Backend.AI was the first platform worldwide to bring GPUDirect Storage technology to container clusters, providing very high data I/O throughput by connecting GPUs directly to network storage. This has led to partnerships with the top three vendors in the high-speed storage market: Pure Storage, NetApp, and Dell EMC. In addition, Backend.AI was the first and only solution in the Asia-Pacific region selected for the NVIDIA DGX-Ready program, and it was chosen as a partner of the NVIDIA AI Accelerated program alongside VMware, Red Hat, and Canonical.

  At SC23, Lablup will demonstrate LLM services developed and operated directly on this technology. It will show that, through Backend.AI's automation system FastTrack, publicly available large language models, including Meta's Llama 2, can be fine-tuned for various application areas and served in on-premises environments. Lablup will also showcase VisuTale, a multimodal demo that writes stories and generates images from user-provided photos, then assembles them into a picture book.
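  As an illustration of this kind of fine-tuning workflow (FastTrack's actual interface is not shown here), the sketch below uses the open-source Hugging Face transformers, datasets, and PEFT libraries to attach LoRA adapters to a public Llama 2 checkpoint; the model identifier, training file, and hyperparameters are placeholders, not Lablup's configuration.

    # Illustrative sketch, not FastTrack: LoRA fine-tuning of a public LLM with the
    # Hugging Face stack. Model name, data file, and hyperparameters are placeholders.
    from datasets import load_dataset
    from peft import LoraConfig, get_peft_model
    from transformers import (AutoModelForCausalLM, AutoTokenizer,
                              DataCollatorForLanguageModeling, Trainer, TrainingArguments)

    base_model = "meta-llama/Llama-2-7b-hf"          # assumed public (gated) base model
    tokenizer = AutoTokenizer.from_pretrained(base_model)
    tokenizer.pad_token = tokenizer.eos_token
    model = AutoModelForCausalLM.from_pretrained(base_model)

    # Attach lightweight LoRA adapters so only a small fraction of weights is trained.
    model = get_peft_model(model, LoraConfig(r=8, lora_alpha=16,
                                             target_modules=["q_proj", "v_proj"],
                                             task_type="CAUSAL_LM"))

    # Placeholder domain corpus; a real run would use task-specific data.
    dataset = load_dataset("text", data_files={"train": "domain_corpus.txt"})["train"]
    dataset = dataset.map(lambda ex: tokenizer(ex["text"], truncation=True, max_length=512),
                          remove_columns=["text"])

    trainer = Trainer(
        model=model,
        args=TrainingArguments(output_dir="llama2-domain-lora", num_train_epochs=1,
                               per_device_train_batch_size=1, learning_rate=2e-4),
        train_dataset=dataset,
        data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
    )
    trainer.train()
    model.save_pretrained("llama2-domain-lora")      # adapters ready for on-prem serving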

 "We are excited to introduce Backend.AI, which is optimized for high-performance computing, along with various supercomputer solutions, to supercomputing researchers around the world. We will continue to strive to achieve performance and cost innovations and overwhelming convenience through Backend.AI in technology-leading fields that require huge amounts of computational resources," said Jungkyu Shin, CEO of Rabbleup.

© Lablup Inc. All rights reserved.

8F, 577, Seolleung-ro, Gangnam-gu, Seoul, Republic of Korea