ByteDance Unveils Seed-OSS: Powerful Open-Source LLMs with 36B Parameters, 512K Context, and Advanced Reasoning
August 21, 2025
ByteDance has launched Seed-OSS, a series of open-source large language models (LLMs) under the Apache-2.0 license, featuring a 36-billion-parameter model with advanced capabilities for general-purpose and reasoning tasks.
The flagship model, Seed-OSS-36B, supports long-context processing up to 512,000 tokens and includes features like a controllable thinking budget, making it highly adaptable for complex tasks.
Seed-OSS-36B is designed for easy deployment via Hugging Face Transformers, with quantization support and integration with scalable serving tools like vLLM, facilitating use in enterprise and research environments.
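As a rough illustration of the Transformers path, the sketch below loads the instruction-tuned model and runs a short chat completion. The repository id and the requirement for a recent Transformers release with Seed-OSS support are assumptions based on the announcement, not confirmed details.

```python
# Minimal sketch: running Seed-OSS-36B-Instruct with Hugging Face Transformers.
# The repo id is assumed from the announcement; a recent Transformers version
# (or trust_remote_code) may be required for the Seed-OSS architecture.
from transformers import AutoModelForCausalLM, AutoTokenizer
import torch

model_id = "ByteDance-Seed/Seed-OSS-36B-Instruct"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # bf16 weights for a 36B model need roughly 72 GB of GPU memory
    device_map="auto",           # shard across available GPUs
)

messages = [{"role": "user", "content": "Summarize the main obligations in this contract: ..."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=512)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```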
The release includes two base-model variants: one pretrained with synthetic instruction data for stronger downstream performance, and a 'purer' version without it, catering to diverse research needs.
Training efficiency is notable: the model was trained on 12 trillion tokens, reflecting effective data quality and training strategies, and is positioned for research, commercial applications, and AI agent development.
Benchmark results show Seed-OSS-36B performing strongly across reasoning, math, coding, and long-context tasks, with scores such as 87.7 on BBH and 91.7 on AIME24 for the instruction-tuned variant, ahead of many open-source competitors.
Seed-OSS features a native 512K-token context window, enabling it to handle complex, long-form tasks like legal review and extensive report analysis, a context length that exceeds most mainstream open models.
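For long-context serving, a minimal vLLM sketch might look like the following. The engine options shown are standard vLLM parameters rather than Seed-OSS-specific settings, and the repository id is again an assumption; Seed-OSS support may require a recent vLLM release.

```python
# Sketch: serving Seed-OSS-36B-Instruct with vLLM and an extended context window.
# max_model_len sets the longest sequence the engine will accept; 512K contexts
# need substantial KV-cache memory, so a smaller value is shown here.
from vllm import LLM, SamplingParams

llm = LLM(
    model="ByteDance-Seed/Seed-OSS-36B-Instruct",  # assumed repository id
    tensor_parallel_size=4,   # shard the 36B model across 4 GPUs
    max_model_len=131072,     # raise toward 524288 if GPU memory allows
)

params = SamplingParams(temperature=0.7, max_tokens=1024)
outputs = llm.generate(
    ["Review the following 300-page report and list the key risks: ..."], params
)
print(outputs[0].outputs[0].text)
```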
A unique 'Thinking Budget' mechanism allows users to set token limits for dynamic reasoning depth, optimizing inference efficiency and task adaptability.
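The public materials describe the budget as a per-request token limit on the model's reasoning phase. The sketch below illustrates how such a control could be passed through the chat template; the thinking_budget keyword is an assumption about the interface, not a documented parameter, so consult the official model card for the exact mechanism.

```python
# Hypothetical sketch of the "thinking budget" control described in the announcement.
# The thinking_budget keyword below is an assumed template variable; extra kwargs to
# apply_chat_template are forwarded to the chat template, so the call itself is valid.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("ByteDance-Seed/Seed-OSS-36B-Instruct")  # assumed repo id

messages = [{"role": "user", "content": "How many prime numbers are there below 100?"}]

# Cap the reasoning trace at 512 tokens; a tighter budget trades reasoning depth
# for lower inference cost, while omitting it leaves the thinking length unrestricted.
prompt = tokenizer.apply_chat_template(
    messages,
    add_generation_prompt=True,
    tokenize=False,
    thinking_budget=512,  # assumed parameter name
)
print(prompt)
```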
The open license and high performance of Seed-OSS provide versatile, cost-effective options for enterprises and developers across various AI workloads.
On AIME24 math questions, the instruction-tuned Seed-OSS-36B-Instruct's 91.7 score ranks second only to OpenAI's OSS-20B, despite Seed-OSS being trained on far fewer tokens than many rival models.
Future plans include releasing larger models, gathering community feedback, and exploring new reasoning control mechanisms, with easy access via Hugging Face and GitHub.
Summary based on 4 sources
Sources

VentureBeat • Aug 20, 2025
TikTok parent company ByteDance releases new open source Seed-OSS-36B model with 512K token context
DEV Community • Aug 21, 2025
2025 Complete Guide: ByteDance Seed-OSS-36B Open Source LLM In-Depth Analysis
South China Morning Post • Aug 21, 2025
ByteDance unleashes open-source AI model to challenge DeepSeek, Alibaba Cloud