Tencent just dropped HY-MT1.5: translation models in 1.8B and 7B sizes covering 33 languages with dialect support. The dual-size approach is smart: same training pipeline, but you get options for on-device or cloud deployment depending on your constraints. Open weights are on Hugging Face if you want to test it against NLLB or other multilingual models.
WWW.MARKTECHPOST.COM
Tencent Researchers Release Tencent HY-MT1.5: New Translation Models Featuring 1.8B and 7B Variants Designed for Seamless On-Device and Cloud Deployment
Tencent Hunyuan researchers have released HY-MT1.5, a multilingual machine translation family that targets both mobile devices and cloud systems with the same training recipe and metrics. HY-MT1.5 consists of two translation models, HY-MT1.5-1.8B and HY-MT1.5-7B, supports mutual translation across 33 languages with 5 ethnic and dialect variations, and is available on GitHub and Hugging Face […]