LFM2.5-1.2B-Base is the pre-trained foundation model for the LFM2.5 series. It is ideal for fine-tuning on custom datasets or building specialized checkpoints. It is not instruction-tuned; use LFM2.5-1.2B-Instruct for chat applications.
Documentation Index
Fetch the complete documentation index at: https://liquidai-fix-android-sdk-qa-issues.mintlify.app/llms.txt
Use this file to discover all available pages before exploring further.
Specifications
| Property | Value |
|---|---|
| Parameters | 1.2B |
| Context Length | 32K tokens |
| Architecture | LFM2.5 (Dense) |
- Fine-tuning: TRL compatible (SFT, DPO, GRPO)
- Custom Training: Build domain-specific models
- 32K Context: Extended context for long documents
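The TRL fine-tuning path above can be sketched with TRL's `SFTTrainer`. This is a minimal sketch, not an official recipe: the Hugging Face checkpoint id `LiquidAI/LFM2.5-1.2B-Base`, the dataset name, and the output directory are all illustrative assumptions.

```python
# Sketch of supervised fine-tuning (SFT) with TRL.
# Assumptions: the checkpoint id "LiquidAI/LFM2.5-1.2B-Base" and the
# dataset "trl-lib/Capybara" are illustrative placeholders.

def run_sft(model_id: str = "LiquidAI/LFM2.5-1.2B-Base") -> None:
    # Imports are deferred so the sketch reads cleanly even without TRL installed.
    from datasets import load_dataset
    from trl import SFTConfig, SFTTrainer

    dataset = load_dataset("trl-lib/Capybara", split="train")
    config = SFTConfig(output_dir="lfm2.5-1.2b-sft")  # where checkpoints are written
    trainer = SFTTrainer(
        model=model_id,          # TRL loads the model from the hub by id
        train_dataset=dataset,
        args=config,
    )
    trainer.train()
```

DPO and GRPO follow the same pattern via TRL's `DPOTrainer` and `GRPOTrainer`, swapping in a preference or reward-style dataset.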
Quick Start
- Transformers
- vLLM
- SGLang
Install:

Download & Run:
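A minimal Transformers sketch of the download-and-run step (`pip install transformers` first). The Hugging Face checkpoint id `LiquidAI/LFM2.5-1.2B-Base` is an assumption based on the model name, not a confirmed repository:

```python
# Sketch: load the base model with Hugging Face Transformers and continue
# a text prompt. The checkpoint id below is an assumed, unverified name.
MODEL_ID = "LiquidAI/LFM2.5-1.2B-Base"

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    # Imports are deferred so the sketch reads cleanly without the libraries installed.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID,
        torch_dtype="auto",   # use the checkpoint's native precision
        device_map="auto",    # place weights on GPU when available
    )
    # Base model: plain text completion, no chat template applied.
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    with torch.no_grad():
        output = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0], skip_special_tokens=True)
```

Because this is a base (completion) model, prompts are continued rather than answered; for conversational use, load LFM2.5-1.2B-Instruct and apply its chat template instead.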