Bitdeer AI

info

We support ALL Bitdeer AI models; just set bitdeerai/ as the prefix on the model name when sending completion requests.

API Keys

import os

# Set your Bitdeer AI API key so litellm can authenticate your requests
os.environ["BITDEERAI_API_KEY"] = "your-api-key"

Sample Usage

chat

import os
from litellm import completion

os.environ["BITDEERAI_API_KEY"] = "your-api-key"

messages = [
    {
        "role": "system",
        "content": "You are a knowledgeable assistant. Provide concise and clear explanations to scientific questions."
    },
    {
        "role": "user",
        "content": "Can you explain the theory of evolution in simple terms?"
    }
]

response = completion(model="bitdeerai/OpenGVLab/InternVL2_5-78B-MPO", messages=messages)
print(response)

embedding

import os
from litellm import embedding

os.environ["BITDEERAI_API_KEY"] = "your-api-key"

response = embedding(
    model="bitdeerai/BAAI/bge-m3",
    input=["The cat danced gracefully under the moonlight, its shadow twirling like a silent partner."]
)
print(response)

Bitdeer AI Models

liteLLM supports non-streaming and streaming requests to all models on https://www.bitdeer.ai
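For example, a minimal streaming sketch (the model name is taken from the table below, and BITDEERAI_API_KEY is assumed to be set as shown above):

from litellm import completion

response = completion(
    model="bitdeerai/deepseek-ai/DeepSeek-V3",
    messages=[{"role": "user", "content": "Can you explain the theory of evolution in simple terms?"}],
    stream=True,  # ask for a streamed response instead of a single completion object
)

# chunks arrive incrementally as the model generates tokens
for chunk in response:
    print(chunk)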

Example Bitdeer AI Usage - Note: liteLLM supports all models deployed on Bitdeer AI

LLM models

| Model Name | Function Call |
|------------|---------------|
| bitdeerai/deepseek-ai/DeepSeek-R1 | completion('bitdeerai/deepseek-ai/DeepSeek-R1', messages) |
| bitdeerai/deepseek-ai/DeepSeek-V3 | completion('bitdeerai/deepseek-ai/DeepSeek-V3', messages) |
| bitdeerai/Qwen/QwQ-32B | completion('bitdeerai/Qwen/QwQ-32B', messages) |
| bitdeerai/Qwen/Qwen2.5-VL-72B-Instruct | completion('bitdeerai/Qwen/Qwen2.5-VL-72B-Instruct', messages) |
| bitdeerai/Qwen/Qwen2.5-Coder-32B-Instruct | completion('bitdeerai/Qwen/Qwen2.5-Coder-32B-Instruct', messages) |
| bitdeerai/meta-llama/Llama-3.3-70B-Instruct | completion('bitdeerai/meta-llama/Llama-3.3-70B-Instruct', messages) |
| bitdeerai/OpenGVLab/InternVL2_5-78B-MPO | completion('bitdeerai/OpenGVLab/InternVL2_5-78B-MPO', messages) |

Embedding models

| Model Name | Function Call |
|------------|---------------|
| bitdeerai/Alibaba-NLP/gte-Qwen2-7B-instruct | embedding('bitdeerai/Alibaba-NLP/gte-Qwen2-7B-instruct', input=inputs) |
| bitdeerai/BAAI/bge-m3 | embedding('bitdeerai/BAAI/bge-m3', input=inputs) |
| bitdeerai/BAAI/bge-large-en-v1.5 | embedding('bitdeerai/BAAI/bge-large-en-v1.5', input=inputs) |
| bitdeerai/intfloat/multilingual-e5-large-instruct | embedding('bitdeerai/intfloat/multilingual-e5-large-instruct', input=inputs) |
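In the table above, inputs stands for a list of strings to embed; for example, a minimal sketch with one of the listed models:

from litellm import embedding

inputs = ["The quick brown fox jumps over the lazy dog."]

response = embedding(
    model="bitdeerai/intfloat/multilingual-e5-large-instruct",
    input=inputs,
)
print(response)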