This topic describes how to use the distributed task scheduler XXL-JOB with the large language model DeepSeek to automatically push trending financial news and analyze financial data on a schedule.
Background
As the capabilities of large AI models grow, their business applications expand. Many business scenarios can be automated and run as scheduled tasks in the background instead of being manually initiated. These tasks can be enhanced with the capabilities of large language models. The following are common scenarios:
Risk monitoring: You can monitor key system metrics on a schedule and use the smart analysis of large language models to detect potential threats.
Data analytics: You can collect online financial data on a schedule and use a large language model to perform intelligent analysis and generate decision-making advice for investors.
Prerequisites
You have activated Alibaba Cloud Model Studio and created an API-KEY in the Alibaba Cloud Model Studio console.
Prepare the environment
Set up DeepSeek
Set up XXL-JOB
Push trending financial news
This example shows how to implement the solution using the MSE XXL-JOB version or a self-hosted XXL-JOB and DeepSeek-R1 hosted on Alibaba Cloud Model Studio. For more information about the demo, see xxljob-demo (SpringBoot).
Step 1: Connect the application to the XXL-JOB scheduling platform
Log on to the Alibaba Cloud Container Service for Kubernetes (ACK) console and create an ACK Serverless cluster. To pull the demo image, you must enable and configure Source Network Address Translation (SNAT) for the current VPC. If SNAT is already configured for the VPC, you can skip this step.
On the Clusters page of the ACK console, click the name of the target cluster. In the navigation pane on the left, choose Workloads > Stateless. Click Create from YAML. Use the following YAML configuration to connect the application to the MSE XXL-JOB scheduling platform. For more information about the -Dxxl.job.admin.addresses, -Dxxl.job.executor.appname, -Dxxl.job.accessToken, -Ddashscope.api.key, and -Dwebhook.url parameters, see Configure startup parameters.
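The following Deployment is a minimal sketch of what that YAML configuration might look like. The image address, resource settings, and all placeholder values are assumptions for illustration and must be replaced with your own; the actual YAML is provided in the xxljob-demo (SpringBoot) project.

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: xxljob-demo
spec:
  replicas: 1
  selector:
    matchLabels:
      app: xxljob-demo
  template:
    metadata:
      labels:
        app: xxljob-demo
    spec:
      containers:
        - name: xxljob-demo
          # Hypothetical image address; use the image from the xxljob-demo project.
          image: registry.<region>.aliyuncs.com/<namespace>/xxljob-demo:latest
          env:
            - name: JAVA_OPTS
              # Startup parameters described in "Configure startup parameters";
              # replace every placeholder with your own values.
              value: >-
                -Dxxl.job.admin.addresses=http://xxljob-xxxxx.schedulerx.mse.aliyuncs.com
                -Dxxl.job.executor.appname=xxxxx
                -Dxxl.job.accessToken=xxxxxx
                -Ddashscope.api.key=sk-xxx
                -Dwebhook.url=https://oapi.dingtalk.com/robot/send?access_token=xx
          ports:
            - containerPort: 9999  # default XXL-JOB executor port
```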
Step 2: Configure startup parameters
Obtain the startup parameters.
Log on to the MSE XXL-JOB console and select a region in the top menu bar.
Click the target instance. On the Application Management page, click Access in the Executor Count column for the target application.

Replace the placeholders with the access configuration of the target instance. Then, click Copy to copy the parameters to your YAML configuration:
-Dxxl.job.admin.addresses=http://xxljob-xxxxx.schedulerx.mse.aliyuncs.com -Dxxl.job.executor.appname=xxxxx -Dxxl.job.accessToken=xxxxxxx
Log on to the Alibaba Cloud Model Studio platform. Click the profile icon in the upper-right corner and select API-KEY to go to the management page. On this page, you can create or copy an API-KEY.
Replace the placeholder with your API-KEY and copy the parameter to your YAML configuration:
-Ddashscope.api.key=sk-xxx
To obtain the DingTalk group webhook URL, add a custom robot in your DingTalk group settings.
Replace the access_token value and copy the parameter to your YAML configuration:
-Dwebhook.url=https://oapi.dingtalk.com/robot/send?access_token=xx
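The demo reads this webhook URL from the -Dwebhook.url startup parameter and posts DingTalk text messages to it. As a rough sketch of how that call can be made with only the JDK's HttpClient, assuming DingTalk's custom-robot text-message payload format; the class and method names here are illustrative, not the demo's actual code:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class DingTalkNotifier {
    // Builds the JSON body for a DingTalk custom-robot text message.
    static String buildTextPayload(String content) {
        // Naive escaping of backslashes and quotes; a real JSON library is preferable.
        String escaped = content.replace("\\", "\\\\").replace("\"", "\\\"");
        return "{\"msgtype\":\"text\",\"text\":{\"content\":\"" + escaped + "\"}}";
    }

    // POSTs the message to the webhook URL supplied via -Dwebhook.url.
    static int send(String webhookUrl, String content) throws Exception {
        HttpRequest request = HttpRequest.newBuilder(URI.create(webhookUrl))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(buildTextPayload(content)))
                .build();
        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        return response.statusCode();
    }

    public static void main(String[] args) {
        System.out.println(buildTextPayload("Market summary: ..."));
    }
}
```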
Step 3: Create and run the AI task
MSE XXL-JOB console
Log on to the MSE XXL-JOB console and select a region in the top menu bar. Click the target instance. In the navigation pane on the left, click Task Management, and then click Create Task.
In the Create Task panel, set JobHandler Name to sinaNews. Configure the Task Parameter with the following prompt information. Keep the default settings for the other parameters and save the configuration.
On the Task Management page, find the sinaNews task that you created. In the Actions column, click Run Once. Wait for the task to run successfully. The DingTalk group will then receive AI-analyzed news summaries periodically.
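Under the hood, the sinaNews handler sends the configured prompt (together with the fetched news) to DeepSeek-R1 on Model Studio. A minimal sketch of such a call, assuming Model Studio's OpenAI-compatible chat-completions mode and using only the JDK's HttpClient; the class name and helper methods are illustrative, not the demo's code:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class DeepSeekClient {
    // OpenAI-compatible chat-completions endpoint of Alibaba Cloud Model Studio
    // (assumption; confirm against the Model Studio documentation).
    static final String ENDPOINT =
            "https://dashscope.aliyuncs.com/compatible-mode/v1/chat/completions";

    // Builds a minimal chat-completions request body for the deepseek-r1 model.
    static String buildRequestBody(String prompt) {
        String escaped = prompt.replace("\\", "\\\\").replace("\"", "\\\"");
        return "{\"model\":\"deepseek-r1\",\"messages\":[{\"role\":\"user\",\"content\":\""
                + escaped + "\"}]}";
    }

    // Sends the prompt; the API key comes from the -Ddashscope.api.key startup parameter.
    static String analyze(String prompt) throws Exception {
        String apiKey = System.getProperty("dashscope.api.key");
        HttpRequest request = HttpRequest.newBuilder(URI.create(ENDPOINT))
                .header("Authorization", "Bearer " + apiKey)
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(buildRequestBody(prompt)))
                .build();
        return HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString())
                .body();
    }

    public static void main(String[] args) {
        System.out.println(buildRequestBody("Summarize today's top financial news."));
    }
}
```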
Self-hosted XXL-JOB Admin
On your self-hosted XXL-JOB Admin console, create a task. Set JobHandler to sinaNews. For the task parameters, see the sample Task Parameter configuration.
On the task management page, run the task once manually. You will receive a DingTalk notification as shown in the following figure.
Perform financial data analytics
In the Push trending financial news example, you pull news only from Sina Finance. However, to pull domestic and international financial news and data in near real-time for quick decision-making, a single-machine job is not fast enough. To address this, you can use MSE XXL-JOB sharding broadcast jobs to split large jobs into smaller jobs that pull different data. You can then use the job orchestration feature of MSE XXL-JOB to create a flow to complete your task step by step.
Create three tasks in MSE XXL-JOB and establish their dependencies: Pull financial data → Data analysis → Generate report. Set the routing policy for the Pull financial data task to broadcast sharding.
When the Pull financial data task starts, broadcast sharding dispatches multiple sub-tasks to different executors to pull domestic and international financial news and data. The results are then stored in a location such as a database, Redis, or Object Storage Service.
When the Data analysis task starts, it retrieves the collected financial data. The task then calls DeepSeek to analyze the data and stores the results.
After the data analysis is complete, the Generate report task generates a report from the analysis results. The task then sends the report, which contains investment advice, to users through DingTalk or email.
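The sharding logic of the Pull financial data task can be sketched as follows: each executor receives its shard index and the total shard count from XXL-JOB (via XxlJobHelper.getShardIndex() and XxlJobHelper.getShardTotal() in a real handler) and pulls only the data sources assigned to it by a simple modulo split. The source names below are illustrative placeholders:

```java
import java.util.ArrayList;
import java.util.List;

public class ShardedFetchJob {
    // Data sources the broadcast job pulls from; names are illustrative.
    static final List<String> SOURCES = List.of(
            "sina-finance", "eastmoney", "reuters-markets", "bloomberg-asia");

    // Modulo split: source i belongs to the executor whose shard index is i % shardTotal.
    static List<String> assignedSources(int shardIndex, int shardTotal) {
        List<String> mine = new ArrayList<>();
        for (int i = 0; i < SOURCES.size(); i++) {
            if (i % shardTotal == shardIndex) {
                mine.add(SOURCES.get(i));
            }
        }
        return mine;
    }

    public static void main(String[] args) {
        // With two executors, shard 0 pulls sources 0 and 2, shard 1 pulls 1 and 3.
        System.out.println(assignedSources(0, 2)); // [sina-finance, reuters-markets]
        System.out.println(assignedSources(1, 2)); // [eastmoney, bloomberg-asia]
    }
}
```

Because every shard makes the same deterministic assignment, no coordination between executors is needed; adding executors only changes shardTotal and redistributes the sources.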