This model is a fine-tune of Yi-34B, trained for 3 epochs on the Capybara dataset. It is the first 34B Nous model and the first Nous model with a 200K context length. **Note:** This endpoint currently supports 32k context.
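To make the 32k note concrete, here is a minimal sketch of a pre-flight check before sending a long prompt. The ~4 characters-per-token heuristic and the 1,024-token completion budget are assumptions for illustration, not this model's actual tokenizer or defaults:

```python
# Rough guard for the 32k-token context noted above. Uses a crude
# ~4 characters-per-token estimate, not the model's real tokenizer,
# so treat it as a sanity check rather than an exact count.
CONTEXT_LIMIT = 32_000


def estimate_tokens(text: str) -> int:
    """Very rough token estimate based on character count."""
    return max(1, len(text) // 4)


def fits_in_context(prompt: str, max_completion_tokens: int = 1024) -> bool:
    """True if the prompt plus the completion budget should fit in 32k tokens."""
    return estimate_tokens(prompt) + max_completion_tokens <= CONTEXT_LIMIT


prompt = "Summarize the following transcript...\n" + "example line\n" * 2000
if not fits_in_context(prompt):
    print("Prompt likely exceeds the 32k window; trim or chunk it first.")
```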
Depending on their size, subject, and complexity, your prompts will be sent to [Mistral: Mistral Medium](/models/mistralai/mistral-medium) or [OpenAI: GPT-4 Turbo (preview)](/models/openai/gpt-4-turbo-preview). To see which model was used, visit [Activity](/activity).
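The [Activity](/activity) page is the definitive record, but as a rough sketch, the routed model can also be inspected programmatically. The snippet below assumes an OpenAI-compatible `/chat/completions` route at `https://openrouter.ai/api/v1`, an `openrouter/auto` model slug, an `OPENROUTER_API_KEY` environment variable, and a top-level `model` field in the JSON response; all of these are assumptions for illustration rather than details taken from this page:

```python
import os

import requests

# Minimal sketch: send a prompt through the auto-routing endpoint and
# check which underlying model handled it. Slug, URL, and response
# fields are assumed; the Activity page remains the definitive record.
API_URL = "https://openrouter.ai/api/v1/chat/completions"

payload = {
    "model": "openrouter/auto",  # assumed slug for the routing endpoint
    "messages": [
        {"role": "user", "content": "Explain tail-call optimization briefly."}
    ],
}
headers = {"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"}

response = requests.post(API_URL, json=payload, headers=headers, timeout=60)
response.raise_for_status()
data = response.json()

# e.g. mistralai/mistral-medium or openai/gpt-4-turbo-preview
print("Routed to:", data.get("model"))
print(data["choices"][0]["message"]["content"])
```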