# LFM2-350M

LFM2-350M is Liquid AI's smallest text model, designed for edge devices with strict memory and compute constraints. It delivers surprisingly strong performance for its size, making it ideal for low-latency applications.

## Documentation Index
Fetch the complete documentation index at: https://liquidai-fix-android-sdk-qa-issues.mintlify.app/llms.txt
Use this file to discover all available pages before exploring further.
## Specifications
| Property | Value |
|---|---|
| Parameters | 350M |
| Context Length | 32K tokens |
| Architecture | LFM2 (Dense) |
- **Ultra-Light**: Minimal memory and compute footprint
- **Low Latency**: Fastest inference in the LFM family
- **Edge-Ready**: Runs on IoT and embedded devices
## Quick Start
- Transformers
- llama.cpp
- vLLM
- SGLang
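As a minimal sketch of the Transformers route above (assuming the checkpoint is published under the Hugging Face id `LiquidAI/LFM2-350M` and ships a chat template; check the model card for the exact id and recommended generation settings):

```python
# Minimal chat-completion sketch with Hugging Face Transformers.
# The checkpoint id below is an assumption; verify it on the model card.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "LiquidAI/LFM2-350M"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Build the prompt with the model's built-in chat template.
messages = [{"role": "user", "content": "What is the capital of France?"}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)

# Keep generation short; a 350M model targets low-latency edge use.
outputs = model.generate(inputs, max_new_tokens=64)
reply = tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True)
print(reply)
```

The same prompt flow carries over to llama.cpp, vLLM, and SGLang; only the serving layer changes.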