Can't stop laughing: is "Japan's highest-performance AI model" just a renamed DeepSeek V3?
What happened?
Rakuten (楽天) on March 17 unveiled Rakuten AI 3.0 and billed it as "Japan's largest high-performance AI model," with an advertised parameter count of roughly 700 billion and an Apache 2.0 open-source license. But it has been reported that, less than 12 hours after the launch, someone opened the model's config.json on Hugging Face and found the architectures field reading "DeepseekV3ForCausalLM" and model_type set to "deepseek_v3". In other words, the core architecture and the layer sizes (hidden_size 7168, num_hidden_layers 61, n_routed_experts 256, vocab_size 129,280) match the DeepSeek V3 configuration almost exactly — a finding that immediately spread on social media.
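The check readers reportedly ran is simple enough to sketch: compare a model's uploaded config.json against the field values that identify DeepSeek V3. The reference values below are the ones cited in this article; the "uploaded" config here is a hypothetical stand-in for what one would download from a model repo, not Rakuten's actual file.

```python
import json

# Signature values reported for DeepSeek V3's config.json (as cited above).
DEEPSEEK_V3_SIGNATURE = {
    "architectures": ["DeepseekV3ForCausalLM"],
    "model_type": "deepseek_v3",
    "hidden_size": 7168,
    "num_hidden_layers": 61,
    "n_routed_experts": 256,
    "vocab_size": 129280,
}

def matching_fields(config: dict) -> list:
    """Return the signature keys whose values match the given config."""
    return [k for k, v in DEEPSEEK_V3_SIGNATURE.items() if config.get(k) == v]

# Hypothetical config.json contents pulled from a model repo for illustration.
uploaded = json.loads("""{
    "architectures": ["DeepseekV3ForCausalLM"],
    "model_type": "deepseek_v3",
    "hidden_size": 7168,
    "num_hidden_layers": 61,
    "n_routed_experts": 256,
    "vocab_size": 129280,
    "torch_dtype": "bfloat16"
}""")

hits = matching_fields(uploaded)
print(f"{len(hits)}/{len(DEEPSEEK_V3_SIGNATURE)} signature fields match")
```

A full match on architecture class, model type, and every layer-size hyperparameter is what makes "fine-tuned fork" the obvious reading rather than coincidence: these values jointly pin down the network's shape.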
Technical context
Fine-tuning an open-source model is industry standard, and DeepSeek V3 is available under a permissive license that permits commercial reuse. Rakuten has said it used open-source community models and then fine-tuned them on bilingual Japanese data. That is technically true — but is it full disclosure? Reportedly, Rakuten's public materials and press releases made no mention of DeepSeek, and the Hugging Face page was auto-tagged with "deepseek_v3" based on the uploaded config, not a user edit. So the gap between lawful reuse and what looks like rebranding has provoked ridicule and sharp questions: did Rakuten overstate novelty for PR value?
Political and reputational fallout
This is not just a corporate embarrassment. DeepSeek, the Chinese-origin model that exploded onto the scene in 2025, has already been framed in Japan as an “AI black ship” — a loaded historical metaphor for an unsettling external force. Japan’s digital minister previously advised caution in government use of DeepSeek over personal data and security concerns, and major firms such as Toyota, Mitsubishi Heavy and SoftBank put internal restrictions on the model. Against that backdrop, rebranding a Chinese-developed model as a domestic flagship raises geopolitical sensitivities: open-source licensing does not erase public perception or national-security anxieties, especially amid broader tech decoupling and export-control debates.
So what now?
Legally, Rakuten appears on solid ground. Practically, it faces reputational fallout and fresh questions about transparency. Will Japanese users accept a repackaged model without clear attribution? And how will regulators and corporate customers react when provenance matters as much as performance? Reportedly, the episode has already ignited a lively debate online — and a few very public laughs.
