codellama-13b-instruct

text-generation
Uploaded: 09.01.2024

Inference Input Arguments

{
  "messages": <List[Dict]>,
  "temperature": <Optional[float]>,
  "max_tokens": <Optional[int]>,
  "stream": <Optional[bool]>
}
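A minimal sketch of assembling a request payload with these arguments. The endpoint URL, the `build_request` helper, and the system prompt are illustrative assumptions, not part of this model's documented API.

```python
import json

# Hypothetical endpoint; substitute your provider's actual URL.
API_URL = "https://api.example.com/v1/chat/completions"

def build_request(prompt, temperature=0.2, max_tokens=256, stream=False):
    """Assemble a payload matching the input arguments listed above."""
    return {
        "messages": [
            {"role": "system", "content": "You are a helpful coding assistant."},
            {"role": "user", "content": prompt},
        ],
        "temperature": temperature,  # sampling temperature (float)
        "max_tokens": max_tokens,    # cap on generated tokens
        "stream": stream,            # True requests incremental output
    }

payload = build_request("Write a Python function that reverses a string.")
print(json.dumps(payload, indent=2))
```

The payload can then be POSTed as JSON to the inference endpoint with any HTTP client.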

Inference Output Format

{
  "model": "codellama-13b-instruct",
  "created": <int>,
  "id": <int>,
  "choices": <List[str]>,
  "object": "chat.completion",
  "usage": <Dict>
}
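A short sketch of reading a response shaped like the output format above. The field values shown are illustrative, and the `first_choice` helper is an assumption for the example, not part of the API.

```python
# Example response matching the documented output format (values are illustrative).
response = {
    "model": "codellama-13b-instruct",
    "created": 1704787200,
    "id": 12345,
    "choices": ["def reverse(s):\n    return s[::-1]"],
    "object": "chat.completion",
    "usage": {"prompt_tokens": 18, "completion_tokens": 15, "total_tokens": 33},
}

def first_choice(resp):
    """Return the first generated completion, or None if the list is empty."""
    choices = resp.get("choices", [])
    return choices[0] if choices else None

print(first_choice(response))
print("total tokens:", response["usage"]["total_tokens"])
```

The `usage` dictionary is useful for tracking token consumption across calls.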