LFM2-700M-8bit

Run locally on Apple devices with Mirai

Type: Local
From: LiquidAI
Quantisation: uint8
Size: 700M
Source: Hugging Face

This model is the MLX format version of LiquidAI's LFM2-700M, a 700 million parameter language model converted for efficient inference on Apple Silicon using the MLX framework. LFM2 is a Liquid Foundation Model designed for edge deployment and supports multiple languages including English, Arabic, Chinese, French, German, Japanese, Korean, and Spanish. The 8-bit quantized version provides a balance between model performance and computational efficiency for text generation tasks.

1. Choose a framework.
2. Install the Mirai SDK via Swift Package Manager (SPM): https://github.com/trymirai/uzu-swift
3. Set your Mirai API key.
4. Apply the code.
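For step 2, the SPM dependency can be declared in your project's `Package.swift` manifest. This is a minimal sketch: the tools version, platform requirements, target name, and the product name `Uzu` are assumptions, so check the uzu-swift repository for the actual values and pin to a released version rather than a branch.

```swift
// swift-tools-version:5.9
// Minimal Package.swift sketch adding the Mirai SDK via SPM.
// NOTE: the product name ("Uzu"), platforms, and branch requirement
// below are assumptions; consult the repository for the real values.
import PackageDescription

let package = Package(
    name: "MyApp",
    platforms: [.iOS(.v16), .macOS(.v14)],
    dependencies: [
        // Repository URL from step 2 above.
        .package(url: "https://github.com/trymirai/uzu-swift", branch: "main"),
    ],
    targets: [
        .executableTarget(
            name: "MyApp",
            dependencies: [.product(name: "Uzu", package: "uzu-swift")]
        )
    ]
)
```

After resolving the package, the API key from step 3 is typically supplied at SDK initialization; see the repository's README for the exact call.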
