Meta Code Llama 70B
Model Card
You can find details about this model in the model card. Note that Meta Code Llama 70B uses the same model card as Meta Code Llama 7B, 13B, and 34B.
Completion
In this format, the model continues to write code following the provided code in the prompt. An implementation of this prompt can be found here.
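Building a completion prompt amounts to prepending the beginning-of-sequence token to the code you want continued. The sketch below is a minimal illustration, assuming you assemble the raw string yourself (many tokenizers add `<s>` automatically, in which case it should be omitted); the function name is illustrative.

```python
# Minimal sketch of a Meta Code Llama 70B completion prompt.
# `code_prompt` is the code the model should continue writing.
def build_completion_prompt(code_prompt: str, add_bos: bool = True) -> str:
    # Prepend the <s> BOS marker only if the tokenizer does not add it.
    return ("<s>" if add_bos else "") + code_prompt

prompt = build_completion_prompt("def fibonacci(n):")
```

Set `add_bos=False` when passing the string to a tokenizer configured to insert the BOS token itself, to avoid a doubled `<s>`.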
<s>{{ code_prompt }}

Instructions
Meta Code Llama 70B has a different prompt template compared to 34B, 13B and 7B. It starts with a Source: system tag (whose body can be empty) and continues with alternating user and assistant turns. Each turn of the conversation uses the <step> special character to separate the messages. The last turn of the conversation uses a Source: assistant tag with an empty message and a Destination: user tag to prompt the model to answer the user's question. A detailed implementation of this format is provided.
- The structure requires a Source: system tag, but the system prompt can be empty.
- Each user query is preceded by a blank line.
- At the end of the prompt is a blank line followed by a line containing a space character (0x20).
<s>Source: system
System prompt <step> Source: user
First user query <step> Source: assistant
Model response to first query <step> Source: user
Second user query <step> Source: assistant
Destination: user
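The template above can be assembled with a small helper. This is a sketch under the assumptions that messages arrive as (role, content) pairs and that the raw string (including the `<s>` marker) is built by hand; the function name is illustrative, and exact whitespace should be verified against a reference tokenizer before use.

```python
# Sketch of assembling the Meta Code Llama 70B instruct prompt.
# `messages` is a list of (role, content) tuples, where role is
# "system", "user", or "assistant".
def build_instruct_prompt(messages):
    prompt = "<s>"
    for role, content in messages:
        # Each turn: a Source: tag, a blank line, the message preceded
        # by a space, then the <step> separator.
        prompt += f"Source: {role}\n\n {content} <step> "
    # Final header prompting the model to answer: an empty assistant
    # turn addressed to the user, ending with a blank line and a
    # single space character (0x20).
    prompt += "Source: assistant\nDestination: user\n\n "
    return prompt

prompt = build_instruct_prompt([
    ("system", ""),  # the system body may be empty
    ("user", "Write a function that reverses a string."),
])
```

Each additional user/assistant exchange appends another pair of turns before the final Source: assistant / Destination: user header.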