

LiteRT Community (FKA TFLite)
A community org for developers to discover models that are ready for deployment to edge platforms. LiteRT, formerly known as TensorFlow Lite, is a high-performance runtime for on-device AI.
Models in the organization are pre-converted and ready to run on Android and iOS. For more information on how to run these models, see our LiteRT Documentation.
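On desktop, a pre-converted model can also be exercised with the LiteRT Python interpreter. The sketch below assumes the `ai-edge-litert` pip package; the model path and input shape are illustrative, not tied to a specific model in this org.

```python
# Sketch: running a pre-converted LiteRT (.tflite) model in Python.
# Assumption: the `ai-edge-litert` package is installed; "model.tflite"
# is a placeholder path, not a specific model from this organization.

def run_litert_model(model_path, input_data):
    """Load a .tflite model and run a single inference pass."""
    from ai_edge_litert.interpreter import Interpreter  # deferred optional dep

    interpreter = Interpreter(model_path=model_path)
    interpreter.allocate_tensors()
    inp = interpreter.get_input_details()[0]
    out = interpreter.get_output_details()[0]
    interpreter.set_tensor(inp["index"], input_data.astype(inp["dtype"]))
    interpreter.invoke()
    return interpreter.get_tensor(out["index"])

# Usage (requires a real model file and numpy):
# import numpy as np
# result = run_litert_model("model.tflite",
#                           np.zeros((1, 224, 224, 3), np.float32))
```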
LLMs
To make running LLMs as simple as possible, LiteRT models can be bundled into .task files compatible with the MediaPipe LLM Inference API. The MediaPipe LLM Inference API wraps LiteRT to provide a straightforward prompt-in, response-out interface on Android, iOS, and Web.
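The bundling step can be sketched with MediaPipe's genai bundler in Python. This assumes the `mediapipe` pip package; the file paths and start/stop tokens are illustrative and depend on the specific model and tokenizer.

```python
# Sketch: packaging a converted LiteRT LLM into a MediaPipe .task bundle.
# Assumption: the `mediapipe` package is installed; paths and token
# strings below are model-specific placeholders.

def build_task_bundle(tflite_path, tokenizer_path, output_path):
    """Bundle a LiteRT LLM and its tokenizer for the LLM Inference API."""
    from mediapipe.tasks.python.genai import bundler  # deferred optional dep

    config = bundler.BundleConfig(
        tflite_model=tflite_path,
        tokenizer_model=tokenizer_path,
        start_token="<bos>",       # assumption: depends on the tokenizer
        stop_tokens=["<eos>"],     # assumption: depends on the tokenizer
        output_filename=output_path,
        enable_bytes_to_unicode_mapping=False,
    )
    bundler.create_bundle(config)

# Usage (requires real model and tokenizer files):
# build_task_bundle("llm.tflite", "tokenizer.model", "llm.task")
```

The resulting .task file is what the LLM Inference API loads on Android, iOS, and Web.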
How to Convert and Contribute Models
Follow the instructions for converting from TensorFlow, PyTorch, or JAX.
For LLMs specifically, use the LiteRT Torch Generative API.
Once converted, join the LiteRT community org and add the model yourself.
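For the PyTorch path above, the conversion can be sketched with the `ai-edge-torch` package; the model and input shape in the usage note are illustrative.

```python
# Sketch: converting a PyTorch model to a LiteRT flatbuffer.
# Assumption: the `ai-edge-torch` package is installed; the model and
# sample input shape are placeholders, not a specific model in this org.

def convert_to_litert(model, sample_args, output_path):
    """Convert an eval-mode torch.nn.Module to a .tflite file."""
    import ai_edge_torch  # deferred optional dep

    edge_model = ai_edge_torch.convert(model.eval(), sample_args)
    edge_model.export(output_path)

# Usage (requires torch and torchvision):
# import torch, torchvision
# convert_to_litert(torchvision.models.mobilenet_v2(weights=None),
#                   (torch.randn(1, 3, 224, 224),),
#                   "mobilenet_v2.tflite")
```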
