Exclusive: AI startup Tenyx’s fine-tuned open-source Llama 3 model outperforms GPT-4




In an exclusive interview with VentureBeat, Itamar Arel, founder and CEO of AI startup Tenyx, revealed a groundbreaking achievement in the field of natural language processing. Tenyx has fine-tuned Meta’s open-source Llama-3 language model (the resulting model is released as Tenyx-70B) to outperform OpenAI’s GPT-4 in certain domains, which the company says marks the first time an open-source model has surpassed the proprietary gold standard.

“We developed this fine-tuning technology that allows us to take a foundational model and sort of polish it or train it beyond what it was trained on,” Arel explained. “What we’ve been getting more and more excited about is that we could take that technology, which allows us essentially to exploit some redundancy in these large models, to allow for what’s probably better called continual learning or incremental learning.”

A radial chart shows the Tenyx-optimized Llama 3 model outperforming GPT-4 in math and coding while surpassing the base Llama 3 model across all capabilities, a first for an open-source AI model according to Tenyx founder Itamar Arel. (Image Credit: Tenyx)

Overcoming ‘catastrophic forgetting’

Tenyx’s novel approach to fine-tuning tackles the issue of “catastrophic forgetting,” where a model loses previously learned knowledge when exposed to new data. By selectively updating only a small portion of the model’s parameters, Tenyx can efficiently train the model on new information without compromising its existing capabilities.

“If you end up changing, say, just 5% of the model parameters, and everything else stays the same, you’re able to do so more aggressively without running the risk that you’re going to distort other things,” Arel said. This selective parameter updating method has also enabled Tenyx to achieve remarkably fast training times, fine-tuning the 70-billion-parameter Llama-3 model in just 15 hours using 100 GPUs.
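Tenyx has not published which parameters it selects or how, but the general mechanics of selective parameter updating can be illustrated with a short sketch: load a base model, freeze every tensor, mark a small subset as trainable, and fine-tune only that subset so gradient updates cannot disturb the rest of the network. The checkpoint name, the `mlp.down_proj` selection criterion, and the optimizer settings below are illustrative assumptions, not Tenyx’s actual method.

```python
# Minimal sketch of selective parameter updating with Hugging Face Transformers.
# NOTE: Tenyx's selection criterion is not public; unfreezing "mlp.down_proj"
# tensors is purely illustrative, and a 70B model would need multi-GPU sharding
# in practice.
import torch
from transformers import AutoModelForCausalLM

model_name = "meta-llama/Meta-Llama-3-70B"  # assumed base checkpoint
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.bfloat16)

# Freeze everything by default.
for param in model.parameters():
    param.requires_grad = False

# Unfreeze only a small, hand-picked subset of tensors; everything else stays
# fixed, which is the core idea behind avoiding catastrophic forgetting via
# selective updates.
total = sum(p.numel() for p in model.parameters())
trainable = 0
for name, param in model.named_parameters():
    if "mlp.down_proj" in name:  # illustrative selection rule
        param.requires_grad = True
        trainable += param.numel()

print(f"Training {trainable / total:.1%} of {total:,} parameters")

# Only parameters with requires_grad=True receive gradient updates in a
# standard fine-tuning loop or via the Hugging Face Trainer.
optimizer = torch.optim.AdamW(
    (p for p in model.parameters() if p.requires_grad), lr=1e-5
)
```

Because the frozen parameters never change, updates to the trainable subset can be made more aggressively, which is consistent with the fast training times Arel describes.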


At the time of release, Llama3-TenyxChat-70B is the highest-ranked open-source model available for download on the MT-Bench evaluation. (Credit: Tenyx)

Commitment to open-source AI

Tenyx’s commitment to open-source AI shines through in its decision to release the fine-tuned model, named Tenyx-70B, under the same license as the original Llama-3. “We’re big believers in open-source models,” Arel told VentureBeat. “The more advances are shared with the community, the more cool applications and just better for everybody.”

The potential applications of Tenyx’s post-training optimization technology are vast, ranging from creating highly specialized chatbots for specific industries to enabling more frequent incremental updates to deployed models, keeping them current with the latest information between major releases.

Reshaping the AI landscape

The implications of Tenyx’s breakthrough are profound, potentially leveling the playing field by providing businesses and researchers with access to state-of-the-art language models without the high costs and restrictions associated with proprietary offerings. This development may also spur further innovation in the open-source community, as others seek to build upon Tenyx’s success.

“It kind of raises questions about what does that mean for the industry? What does it mean for the OpenAIs of the world?” Arel pondered. As the AI arms race continues to heat up, Tenyx’s achievement in fine-tuning open-source models could reshape the AI industry and transform the way businesses approach natural language processing tasks.

While the Tenyx-optimized Llama-3 model inherits the same limitations as the base model, such as occasional illogical or ungrounded responses, the improvements in performance are significant. Arel highlighted the model’s impressive gains in math and reasoning tasks, where it achieved nearly 96% accuracy compared to the base model’s 85%.

As Tenyx opens the door to a new era of open-source AI innovation, the impact of their breakthrough on the AI ecosystem remains to be seen. However, one thing is certain: Tenyx has demonstrated that open-source models can compete with and even surpass their proprietary counterparts, paving the way for a more accessible and collaborative future in artificial intelligence.


