Liquid AI

SKU: liquid-ai

Liquid AI, an MIT spinoff, builds efficient, adaptive AI systems based on liquid neural networks, an architecture inspired by the nervous system of the nematode C. elegans. Its Liquid Foundation Models (LFMs) are designed to deliver strong performance with reduced energy consumption and to scale across a range of industries. In December 2024, Liquid AI secured $250 million in funding led by AMD Ventures to further develop its technology.

Focus areas:
Developing AI models with reduced energy consumption.
Implementing adaptive AI systems capable of learning after deployment.
Improving transparency in AI decision-making.
Applying AI in industries such as finance, biotechnology, and telecommunications.
Liquid AI emphasizes autonomy at the edge: its models run on-device without cloud dependency, enabling secure offline processing in sectors such as banking and IoT. The underlying liquid neural networks adapt to new tasks with minimal retraining (including one-shot learning) and keep memory use near-constant regardless of input length, with a 32k-token context window. The STAR architecture pairs adaptive linear operators and a Mixture of Experts design with hardware-aware optimizations for AMD GPUs and Apple silicon, allowing autonomous operation in resource-constrained environments. Compressed KV caching reduces the memory footprint by 47% compared with standard transformers while handling multimodal inputs.
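The "near-constant memory" behavior comes from the recurrent nature of liquid networks: the model carries a fixed-size hidden state whose dynamics (time constants) change with the input, rather than a cache that grows with sequence length. The sketch below illustrates a liquid time-constant (LTC) cell in that spirit; the class name, parameter shapes, and fused Euler update are illustrative assumptions drawn from the published LTC formulation, not Liquid AI's actual LFM implementation.

```python
import numpy as np

# Minimal sketch of a liquid time-constant (LTC) recurrent cell, the idea
# behind "liquid" neural networks: the hidden state evolves under an ODE
# whose effective time constants depend on the current input, and the state
# size is fixed, so memory stays constant however long the input sequence is.
# All names and shapes here are illustrative, not Liquid AI's actual model.

class LTCCell:
    def __init__(self, input_dim: int, hidden_dim: int, seed: int = 0):
        rng = np.random.default_rng(seed)
        self.W_in = rng.normal(0, 0.1, (hidden_dim, input_dim))    # input weights
        self.W_rec = rng.normal(0, 0.1, (hidden_dim, hidden_dim))  # recurrent weights
        self.b = np.zeros(hidden_dim)
        self.tau = np.ones(hidden_dim)  # base time constants
        self.A = np.ones(hidden_dim)    # equilibrium targets

    def step(self, h: np.ndarray, x: np.ndarray, dt: float = 0.1) -> np.ndarray:
        # Input- and state-dependent gating term f(h, x).
        f = np.tanh(self.W_rec @ h + self.W_in @ x + self.b)
        # Fused Euler step of dh/dt = -(1/tau + f) * h + f * A:
        # the effective time constant 1 / (1/tau + f) "liquefies" with the input.
        return (h + dt * f * self.A) / (1.0 + dt * (1.0 / self.tau + f))

# Processing a sequence of arbitrary length keeps only a fixed-size state h.
cell = LTCCell(input_dim=8, hidden_dim=16)
h = np.zeros(16)
for x in np.random.default_rng(1).normal(size=(1000, 8)):  # 1,000-step sequence
    h = cell.step(h, x)
print(h.shape)  # (16,) -- constant memory regardless of sequence length
```

By contrast, a transformer's key-value cache grows with every token, which is the growth the compressed-KV-caching claim above targets.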
License: Closed Source