Tutorial

Giraffe - Long Context LLMs

Following up on the work from our previous blog post, today we are releasing an arXiv paper titled “Giraffe: Adventures in Expanding Context Lengths in LLMs”. Giraffe is a new family of models that we are releasing, finetuned from base LLaMA and LLaMA2. We include a…
Tutorial

Treating Attention Deficit Disorder in LLMs

We have seen an explosion of open-source LLMs lately. While these OSS LLMs have shown performance comparable to the closed-source LLM APIs offered by OpenAI, Google, and others, they suffer from one serious limitation: they only support context lengths of 2K, compared to closed…
Research

Abacus.AI at NeurIPS 2022

The Abacus.AI team has had two papers accepted to appear at the Conference on Neural Information Processing Systems (NeurIPS) 2022. NeurIPS is a top machine learning conference held every December. Abacus.AI is also planning a social event during NeurIPS – stay tuned for…
Tutorial

Beginner's Guide To Transformer Models

In our previous post, we presented an introduction to Seq2Seq models – models that take a sequence as input and produce a sequence as output. Back in 2014, they revolutionized the field of Natural Language Processing (NLP), especially in translation…
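
For readers new to the idea, a minimal sketch of what "sequence in, sequence out" looks like in practice. This is an illustration, not code from the post itself; it assumes the Hugging Face transformers library and the public t5-small checkpoint, both of which are assumptions on our part.

    # Minimal sequence-to-sequence sketch: an English sentence in, a French sentence out.
    # Assumes the Hugging Face `transformers` library and the public `t5-small` checkpoint.
    from transformers import pipeline

    # T5 is an encoder-decoder (Seq2Seq) transformer; the pipeline wraps tokenization,
    # generation, and decoding into a single call.
    translator = pipeline("translation_en_to_fr", model="t5-small")

    result = translator("Sequence to sequence models map one sequence to another.")
    print(result[0]["translation_text"])  # the generated output sequence

The same interface pattern applies to summarization or any other task in which both the input and the output are sequences of tokens.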