Home - Developer Tools - Page 8
Analyze images and text up to 85 times faster on iPhone/iPad with this ultra-compact vision-language model. A valuable tool for offline handwriting recognition, object counting, and visual question answering
A new series of GPT models with significant improvements in coding, long context and instruction following. Available exclusively via API
This open-source multimodal model outperforms its competitors with a 128k token context window. Enjoy an inference speed of 150 tokens per second and excellent image understanding performance, even on a…
Explore and integrate advanced AI models into your applications easily. Benefit from optimized performance and a wide range of capabilities for your projects
New-generation open-source AI models from Google DeepMind. More powerful and more accurate, these LLMs are available in two sizes: 9B and 27B parameters
A family of advanced multimodal language models developed by NVIDIA. Outperforms proprietary models on many vision and language tasks, while improving performance on text tasks
A high-performance open-source LLM for text and code generation. Available in several optimized formats, including BF16 and FP8. API already available
A multimodal AI model that analyzes text, images and short videos in over 140 languages. Benefit from exceptional performance on a single GPU and great flexibility for developing applications on…
An open-source model optimized for autonomous coding that rivals Claude Sonnet 4.5 while costing 10 times less. Specialized in multilingual coding, tool usage, and long-horizon planning
A powerful 236-billion-parameter LLM. It excels at mathematics, coding, and reasoning, with very competitive API pricing ($0.14 per million tokens)
Access over 200 AI models through a unified API. Integrate AI features into your applications with a single API key for all of them.
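Unified gateways of this kind commonly expose an OpenAI-compatible chat-completions endpoint, so switching between hosted models is just a string change. A minimal sketch under that assumption; the endpoint URL, API key, and model names below are placeholders, not details from this listing:

```python
import json
import urllib.request

# Hypothetical gateway endpoint and key -- replace with the real values
# from the service you sign up for.
API_URL = "https://api.example-gateway.com/v1/chat/completions"
API_KEY = "sk-your-key"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build an OpenAI-style chat-completion request for any hosted model."""
    payload = {
        "model": model,  # e.g. "some-provider/some-model"
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        API_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
    )

# Same key and endpoint for every model; only the model string changes.
req = build_request("some-provider/some-model", "Hello!")
```

Sending the request (`urllib.request.urlopen(req)`) returns a JSON body whose generated text typically sits under `choices[0].message.content` in OpenAI-compatible APIs.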
A set of powerful generative AI models called LFMs. Get performance superior to traditional large language models while using less memory and compute