Introduction

Ring is a reasoning MoE LLM developed and open-sourced by InclusionAI, derived from Ling. We introduce Ring-lite-distill-preview, which has 16.8 billion total parameters, of which 2.75 billion are activated per token. This model demonstrates strong reasoning performance compared with existing industry models.

Model Downloads

The following table lists the available models and their key parameters so you can choose the right one for your use case. If you are located in mainland China, we also provide the models on ModelScope.cn to speed up the download process.

| Model | #Total Params | #Activated Params | Context Length | Download |
|:---|:---|:---|:---|:---|
| Ring-lite-distill-preview | 16.8B | 2.75B | 64K | πŸ€— HuggingFace / πŸ€– ModelScope |
| Ring-lite | 16.8B | 2.75B | 128K | πŸ€— HuggingFace / πŸ€– ModelScope |

Quickstart

πŸ€— Hugging Face Transformers

Here is a code snippet showing how to use the chat model with transformers:

from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "inclusionAI/Ring-lite"

# Load the model and tokenizer; device_map="auto" spreads the weights
# across available devices, and torch_dtype="auto" uses the dtype stored
# in the checkpoint.
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype="auto",
    device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained(model_name)

prompt = "Give me a short introduction to large language models."
messages = [
    {"role": "system", "content": "You are Ring, an assistant created by inclusionAI"},
    {"role": "user", "content": prompt}
]
# Render the chat history into the model's prompt format and append the
# generation prompt so the model replies as the assistant.
text = tokenizer.apply_chat_template(
    messages,
    tokenize=False,
    add_generation_prompt=True
)
model_inputs = tokenizer([text], return_tensors="pt").to(model.device)

generated_ids = model.generate(
    **model_inputs,
    max_new_tokens=8192
)
# Strip the prompt tokens so only the newly generated tokens remain.
generated_ids = [
    output_ids[len(input_ids):] for input_ids, output_ids in zip(model_inputs.input_ids, generated_ids)
]

response = tokenizer.batch_decode(generated_ids, skip_special_tokens=True)[0]
print(response)
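
For interactive use, you can print tokens as they are generated with the TextStreamer utility that ships with transformers. This is a standard transformers feature rather than anything Ring-specific; it reuses the model, tokenizer, and model_inputs from the snippet above:

from transformers import TextStreamer

# Stream decoded tokens to stdout as they are produced, skipping the
# echoed prompt and any special tokens.
streamer = TextStreamer(tokenizer, skip_prompt=True, skip_special_tokens=True)
model.generate(**model_inputs, max_new_tokens=8192, streamer=streamer)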

πŸ€– ModelScope

If you’re in mainland China, we strongly recommend using our model from πŸ€– ModelScope.
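
If you prefer to script the download, a minimal sketch with the modelscope Python package is shown below. It assumes the repository id on ModelScope matches the Hugging Face id; adjust it if the ModelScope listing differs:

from modelscope import snapshot_download

# Download the weights to the local ModelScope cache and return the path.
# The returned directory can be passed to from_pretrained() in place of
# the hub id.
model_dir = snapshot_download("inclusionAI/Ring-lite")
print(model_dir)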

Deployment

Please refer to the Ling repository for deployment instructions; a rough sketch is given below.
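
As an illustrative sketch only, not the validated deployment path from the Ling repository: recent vLLM releases can often serve Hugging Face checkpoints directly, though whether this MoE architecture is supported depends on your vLLM version:

from vllm import LLM, SamplingParams

# Load the checkpoint into vLLM's engine; trust_remote_code is needed if
# the architecture relies on custom modeling code from the hub.
llm = LLM(model="inclusionAI/Ring-lite", trust_remote_code=True)

sampling_params = SamplingParams(temperature=0.6, max_tokens=1024)
outputs = llm.generate(
    ["Give me a short introduction to large language models."],
    sampling_params,
)
print(outputs[0].outputs[0].text)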

Finetuning

Please refer to the Ling repository for finetuning instructions; a rough sketch is given below.
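
As a minimal sketch of parameter-efficient finetuning with the peft library, under stated assumptions: the target module names below are a guess at this architecture's attention projection names, and the officially supported recipe lives in the Ling repository:

from peft import LoraConfig, get_peft_model
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "inclusionAI/Ring-lite",
    torch_dtype="auto",
)

# Attach low-rank adapters to the attention projections; only the adapter
# weights are trained, which keeps memory requirements modest.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj"],  # assumed module names
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()

The wrapped model can then be trained with a standard transformers Trainer.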

License

This code repository is licensed under the MIT License.

Citation

[TBD]