MoE
Articles tagged "MoE"
2 articles
AI
2026-03-17
Complete Guide to Rakuten AI 3.0 Architecture: Next-Gen Japanese LLM with MoE
A comprehensive analysis of Rakuten AI 3.0's Mixture of Experts (MoE) architecture with 700B total parameters. We explore the 8-expert configuration, the efficiency of activating only 40B parameters per token, and the technical background behind its 8.88 score on Japanese MT-Bench.
Rakuten AI 3.0
MoE
Mixture of Experts
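As a taste of what the article covers, here is a minimal, illustrative sketch of top-k expert routing, the core mechanism behind an MoE layer's active-parameter savings. This is not Rakuten AI 3.0's implementation; the dimensions, expert count, and top-k value are placeholder assumptions chosen for readability.

```python
# Illustrative top-k MoE routing sketch (placeholder sizes, not Rakuten AI 3.0).
import numpy as np

rng = np.random.default_rng(0)

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

class MoELayer:
    def __init__(self, d_model: int, d_hidden: int, n_experts: int, top_k: int):
        self.top_k = top_k
        # Router: produces one score per expert for each token.
        self.w_gate = rng.normal(0, 0.02, (d_model, n_experts))
        # Each expert is a small two-layer MLP.
        self.w1 = rng.normal(0, 0.02, (n_experts, d_model, d_hidden))
        self.w2 = rng.normal(0, 0.02, (n_experts, d_hidden, d_model))

    def __call__(self, x: np.ndarray) -> np.ndarray:
        # x: (n_tokens, d_model)
        logits = x @ self.w_gate                       # (n_tokens, n_experts)
        probs = softmax(logits)
        # Keep only the top-k experts per token; renormalize their weights.
        top_idx = np.argsort(probs, axis=-1)[:, -self.top_k:]
        out = np.zeros_like(x)
        for t in range(x.shape[0]):
            chosen = top_idx[t]
            w = probs[t, chosen]
            w = w / w.sum()
            for weight, e in zip(w, chosen):
                h = np.maximum(x[t] @ self.w1[e], 0.0)  # ReLU MLP expert
                out[t] += weight * (h @ self.w2[e])
        return out

layer = MoELayer(d_model=64, d_hidden=256, n_experts=8, top_k=2)
tokens = rng.normal(size=(4, 64))
print(layer(tokens).shape)  # (4, 64)
```

Because only `top_k` of the `n_experts` MLPs run for any given token, per-token compute scales with the active parameters rather than the total parameter count, which is the efficiency the article's 700B-total / 40B-active figures refer to.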
AI
2026-03-17
NemoClaw's NIM Inference Microservices and Nemotron Models — Deployment Strategies from Edge to Cloud
A technical deep dive into NemoClaw's NIM inference microservices and the Nemotron model family. We examine containerized API endpoints, elastic scaling, the performance of Nemotron 3 Super (120B total parameters, an MoE design with 12B active), deployment comparisons across AWS, Azure, GCP, and on-premises environments, lightweight operation on edge devices, and partner integration use cases with Salesforce, CrowdStrike, and others.
NemoClaw
NIM
Nemotron
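To make the "containerized API endpoint" idea concrete, the sketch below shows how a client might query a locally deployed inference microservice over an OpenAI-compatible chat route. The base URL, port, and model name are hypothetical placeholders, not a documented NemoClaw API; consult the article for actual deployment details.

```python
# Hypothetical client call to a containerized inference microservice.
# Host, port, and model name are placeholder assumptions.
import requests

BASE_URL = "http://localhost:8000/v1"   # assumed local container port
payload = {
    "model": "nemotron-3-super",        # placeholder model identifier
    "messages": [
        {"role": "user", "content": "Summarize Mixture of Experts in one sentence."}
    ],
    "max_tokens": 128,
}

resp = requests.post(f"{BASE_URL}/chat/completions", json=payload, timeout=30)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

Keeping the endpoint OpenAI-compatible means the same client code works whether the container runs on a cloud VM, an on-premises server, or an edge device; only `BASE_URL` changes.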