<s>{{ code_prompt }}
Meta Code Llama 70B uses a different prompt template than the 34B, 13B, and 7B models. The prompt starts with a Source: system tag, whose body may be empty, and continues with alternating user and assistant turns. Each turn of the conversation uses the <step> special token to separate the messages. The last turn uses a Source: assistant tag with an empty message and a Destination: user tag to prompt the model to answer the user's question. The full layout of this format is shown below.
The prompt ends with a blank line followed by a line containing a single space character (0x20).
<s>Source: system

 System prompt <step> Source: user

 First user query <step> Source: assistant

 Model response to first query <step> Source: user

 Second user query <step> Source: assistant
Destination: user

 
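The layout above can be sketched as a small prompt-building helper. This is a minimal illustration based solely on the format described here; the build_prompt name and the role/content message-dict shape are assumptions for the example, not part of an official library.

```python
def build_prompt(messages):
    """Format a list of {"role": ..., "content": ...} dicts into the
    Code Llama 70B chat template described above.

    Roles are "system", "user", or "assistant"; after the optional
    system message, turns alternate between user and assistant.
    """
    prompt = "<s>"
    for msg in messages:
        # Each turn: a Source: header, the message body, then the
        # <step> separator before the next turn.
        prompt += f"Source: {msg['role']}\n\n {msg['content'].strip()}"
        prompt += " <step> "
    # Final header that prompts the model to answer: an empty
    # Source: assistant message with a Destination: user tag,
    # ending with a blank line and a single space (0x20).
    prompt += "Source: assistant\nDestination: user\n\n "
    return prompt
```

For example, a system message plus one user query produces a string beginning with `<s>Source: system` and ending with `Destination: user` followed by the trailing blank line and space.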