“Qwen2.5 has enhanced its performance in base Japanese processing, providing it with an edge over other models. Axcxept's proprietary training process has led to the development of a Japanese LLM with the highest level of accuracy.”
Kazuya Hodatsu
CEO of Axcxept Inc.
About Axcxept
Axcxept focuses on maximizing the power of AI and cloud computing. The company's goal is to embrace technological innovation in AI and the cloud and to adapt flexibly to changing times. Above all, it pursues safety and efficiency, providing industry-leading, diversified solutions and services.
Challenge
In Japan's LLM landscape, there is no dominant “go-to” model for the Japanese-speaking market. Axcxept wanted to pinpoint a base LLM that handled the Japanese language well and captured its cultural nuances.
Why Alibaba Cloud
Qwen2.0's impressive Japanese language processing capabilities attracted wide interest on social media, and Qwen2.5 has improved those capabilities even further. The flexibility to easily customize and extend the necessary functions has also drawn international attention.
Architecture
The EZO series is implemented by fine-tuning Qwen2.5. The 32B and 72B models incorporate Auto-CoT (automatic chain-of-thought prompting) and Retrieval-Augmented Generation (RAG), enabling complex Japanese language support as well as continuous updating.
Because EZO x Qwen2.5 can be deployed in a local environment without communicating with external networks, it is well suited to industries where security is paramount, such as medical and public institutions.
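The retrieve-then-generate loop that RAG adds on top of a fine-tuned model can be sketched roughly as follows. This is a minimal, self-contained illustration: the toy corpus, the keyword-overlap retriever, and the prompt template are assumptions for demonstration, not Axcxept's actual implementation.

```python
def retrieve(query, corpus, top_k=2):
    """Rank documents by naive keyword overlap with the query.
    Real RAG systems typically use embedding similarity instead."""
    def score(doc):
        return len(set(query.split()) & set(doc.split()))
    return sorted(corpus, key=score, reverse=True)[:top_k]

def build_prompt(query, documents):
    """Prepend the retrieved context so the model can answer from
    up-to-date material rather than its training data alone."""
    context = "\n".join(f"- {d}" for d in documents)
    return f"Context:\n{context}\n\nQuestion: {query}\nAnswer:"

# Illustrative corpus; in practice this would be a document store
# refreshed continuously, which is what enables "continuous updating".
corpus = [
    "Qwen2.5 improves Japanese language processing over Qwen2.0",
    "RAG keeps answers current by retrieving documents at query time",
    "Auto-CoT prompts the model to reason step by step",
]

query = "How does RAG keep answers current"
docs = retrieve(query, corpus)
prompt = build_prompt(query, docs)
# The assembled prompt would then be sent to the Qwen2.5-based model;
# only the prompt-assembly stage is shown here.
```

Because retrieval happens at query time, updating the document store updates the model's effective knowledge without retraining, which is the property the architecture above relies on.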
Key Results
A fine-tuned implementation based on Qwen2.5, EZO x Qwen2.5 achieved accuracy exceeding that of existing closed models on Japanese MT-Bench, a leading benchmark for Japanese-language LLM evaluation.
It is also lightweight and low latency, running at high speed on the large-scale servers typically found across many industries.
The EZO x Qwen2.5 model outperforms GPT-4-Turbo across coding, extraction, math, reasoning, roleplay, and writing, with a total average score of 8.44 versus GPT-4-Turbo's 8.35.
Looking Forward
As the developer of world-leading Japanese language processing solutions across both closed and open models, Axcxept aims to increase the presence of its Qwen-based products both domestically and globally.
Snapshot
By partnering with Alibaba Cloud, Axcxept has developed the EZO x Qwen2.5 LLM for the Japanese-speaking market, a model that outperforms leading closed models on Japanese-language benchmarks.