Honor announced yesterday at #MWC26 that it will open-source MagicAgent, its first MoE agent model. Co-developed by Honor and Fudan University, the 30-billion-parameter agent foundation model reportedly surpasses GPT-5.2 on six criteria.