Sep 11, 2024
Yes, the model is exactly the one you mentioned, but it is the quantized version of it. The context length is indeed 128k for this family of models.
If I misunderstood your question, please elaborate.
Writes about Machine Learning/Data Science. Say Hi on LinkedIn - https://www.linkedin.com/in/krnk97/