All public logs
Combined display of all available logs of AI Wiki. You can narrow down the view by selecting a log type, the username (case-sensitive), or the affected page (also case-sensitive).
- 23:14, 8 January 2025 Alpha5 talk contribs created page DeepSeek 3.0 (Created page with "'''DeepSeek 3.0''' DeepSeek 3.0 (often referred to as DeepSeek-V3) is an open-source Mixture-of-Experts (MoE) Large Language Model (LLM) consisting of 671 billion total parameters, with 37 billion parameters activated for each token. It is designed for efficient training, cost-effective inference, and strong performance across various language understanding, coding, and mathematical tasks. DeepSeek 3.0 is developed by DeepSeek-AI and is the successor to DeepSeek-V2. ==...")
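The log entry above describes sparse activation in a Mixture-of-Experts model: only a subset of the 671 billion total parameters (about 37 billion) is used for each token. A minimal sketch of how top-k expert routing achieves this, using toy sizes and NumPy (not DeepSeek's actual architecture or code; all names and dimensions here are illustrative assumptions):

```python
import numpy as np

# Toy sketch of Mixture-of-Experts (MoE) top-k routing: a router scores
# every expert per token, but only the top-k experts actually run, so only
# a fraction of the total parameters is activated for each token. This is
# the general idea behind DeepSeek-V3 activating ~37B of 671B parameters;
# the real model uses far more experts and FFN-block experts, not linears.

rng = np.random.default_rng(0)

d_model = 16        # hidden size (toy value)
num_experts = 8     # total experts (illustrative; the real model has more)
top_k = 2           # experts activated per token

# Each expert here is a single linear map; real experts are feed-forward blocks.
experts = [rng.standard_normal((d_model, d_model)) for _ in range(num_experts)]
router = rng.standard_normal((d_model, num_experts))

def moe_forward(x):
    """Route one token vector x through its top-k experts only."""
    scores = x @ router                    # router logits, one per expert
    chosen = np.argsort(scores)[-top_k:]   # indices of the k highest-scoring experts
    weights = np.exp(scores[chosen])
    weights /= weights.sum()               # softmax over the chosen experts only
    # Only the chosen experts' parameters are touched for this token.
    return sum(w * (x @ experts[i]) for w, i in zip(weights, chosen))

token = rng.standard_normal(d_model)
out = moe_forward(token)
print(out.shape)                                     # (16,)
print(f"fraction of experts active: {top_k / num_experts:.2f}")  # 0.25
```

With 2 of 8 experts active, only a quarter of the expert parameters participate in each forward pass; scaling the same routing idea up is what lets a model with very large total capacity keep per-token compute modest.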