HUGE 🔥 Llama 2 with 32K Context Length

Sushil Kumar 💝❤️
8 min read · Jul 30, 2023
The LLaMA 2 model with a 32K context window is here. Yes, you read that right: 32,000 tokens, shared between what you provide as input and what the model generates as output. This is a huge achievement for the open-source world, because the short-term memory of the LLaMA 2 model has increased tremendously.

Thanks go to a company called Together.ai, which has done the massive work of taking the existing LLaMA 2 model and extending its context with a technique called position interpolation.

Before we discuss position interpolation and how the model performs in more detail, here is a quick demo to show that this works. I took Paul Graham's essays, added them to the playground that Together.ai provides, and asked a question: "Do you know what happened in 2019 from the above document?" In a particular place near the end of the document, the essay says that in the fall of 2019 Bel was finally finished, and it talks about Lisp. The model finds that passage and gives you the exact answer.