
Reka AI Releases Reka Flash: An Efficient and Capable State-of-the-Art 21B Multimodal Language Model

Feb 29, 2024

Reka addresses the need for advanced language and vision models with Reka Flash, a state-of-the-art 21B multimodal and multilingual language model, accompanied by a smaller model, Reka Edge, with just 7B parameters. Both perform strongly on standard LLM benchmarks despite their modest size, tackling the challenge of achieving high performance across diverse tasks and languages with limited computational resources.

Existing models such as Gemini Pro, GPT-3.5, and Llama 2 70B serve as benchmarks for comparison. Reka Flash and Reka Edge are positioned as turbo-class and compact variants, respectively; both are pretrained on text spanning more than 32 languages and evaluated on benchmarks covering language understanding, reasoning, multilingual tasks, and multimodal interaction. Reka Flash competes with leading models on language and vision benchmarks, while Reka Edge targets local deployments with resource constraints.

Reka Flash uses instruction tuning and reinforcement learning with Proximal Policy Optimization (PPO) to enhance its chat capabilities. Its performance is evaluated in both text-only and multimodal chat settings, where it posts competitive results against models like GPT-4, Claude, and Gemini Pro. Reka Edge, optimized for local deployment, outperforms comparable models such as Llama 2 7B and Mistral 7B on standard language benchmarks, indicating its efficiency in resource-constrained environments.
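For readers unfamiliar with PPO-based alignment, training of this kind typically optimizes a clipped surrogate objective over model responses scored by a reward signal. The sketch below is a minimal, generic illustration of that clipped loss in Python; it is not Reka's actual training code, and all function and variable names are hypothetical.

```python
import torch

def ppo_clipped_loss(new_logprobs: torch.Tensor,
                     old_logprobs: torch.Tensor,
                     advantages: torch.Tensor,
                     clip_eps: float = 0.2) -> torch.Tensor:
    """Clipped PPO surrogate loss (Schulman et al., 2017).

    Generic illustration only -- not Reka's implementation.
    new_logprobs / old_logprobs: per-token log-probabilities of the
    sampled responses under the current and behavior policies.
    advantages: per-token advantage estimates (e.g., derived from a
    reward model score minus a value baseline).
    """
    # Probability ratio pi_new / pi_old, computed in log space for stability.
    ratio = torch.exp(new_logprobs - old_logprobs)
    # Unclipped and clipped surrogate terms.
    unclipped = ratio * advantages
    clipped = torch.clamp(ratio, 1.0 - clip_eps, 1.0 + clip_eps) * advantages
    # PPO maximizes the elementwise minimum of the two, so the loss is its negation.
    return -torch.min(unclipped, clipped).mean()
```

The clipping keeps each policy update close to the behavior policy that generated the responses, which stabilizes chat-style fine-tuning.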

In conclusion, Reka's two models, Reka Flash and Reka Edge, perform strongly on LLM benchmarks while using far fewer resources, competing closely with existing state-of-the-art LLMs such as Google's Gemini Pro and OpenAI's GPT-4. Reka Flash draws on instruction tuning and reinforcement learning to excel in chat interactions, while Reka Edge stands out for its efficiency in local deployments.

[Source: AI Techpark]
