Workflow applications streamline complex tasks by breaking them down into a series of steps. You can create a workflow application in the Alibaba Cloud Model Studio console to orchestrate large language models (LLMs), APIs, and other nodes, which effectively reduces coding effort. This topic describes how to create a workflow application and use its nodes.
Overview
Scenarios
Travel planning: Specify parameters such as destination to automatically generate travel plans, including flights, accommodations, and attractions.
Report analysis: Use data processing, analysis, and visualization plug-ins to produce structured and formatted analysis reports for complex datasets.
Customer service: Automatically classify and deal with customer inquiries to enhance the speed and precision of customer service responses.
Content creation: Produce content such as articles and marketing materials based on themes and requirements.
Education and training: Design personalized learning plans that include progress tracking and assessments, facilitating student self-learning.
Medical consultation: Use various analysis tools to generate preliminary diagnoses or examination recommendations based on patient symptoms and help doctors make further decisions.
Supported Models
Qwen-Max
Qwen-Plus
Qwen-Turbo
Qwen-VL-Plus
Qwen-VL-Max
For more information about the models, see List of models.
Use cases
For beginners
This example shows how to create a workflow application to identify whether a text message is related to telecom fraud.
Go to My Applications in the Model Studio console.
Click Create Application, choose Workflow Application, and click Create Task-based Workflow.
On the Canvas Configuration page, you can see the Start node already has two preset parameters. You can modify them based on your requirements.
Drag an LLM node from the left-side pane into the canvas. Connect it to the Start node and configure the parameters.
Configure the Start node: Delete the city and date parameters. The Start node retains the built-in default parameter query.
Configure the LLM node:
Model Configuration: Qwen-Max
Temperature: Default
Maximum Reply Length: 1024
enable_search: Disable
Prompt:
System Prompt: Analyze and determine whether the given information is suspected of fraud. Provide a definite answer on whether there is a suspicion of fraud. Processing requirements: Carefully review the content of the information, focusing on keywords and typical fraud patterns, such as requests for urgent transfers, provision of personal information, and promises of unrealistic benefits. Procedure: 1. Identify key elements in the information, including but not limited to the sender's identity, requests made, promised returns, and any urgency expressions. 2. Compare with known fraud case characteristics to check if there are similar tactics or language patterns in the information. 3. Evaluate the overall reasonableness of the information, considering whether the requests made are in line with conventional logic and processes. 4. If the information contains links or attachments, do not click or download them directly to avoid potential security risks, and remind users of the dangers of such content. Output format: Clearly indicate whether the information exhibits characteristics of fraud and briefly explain the basis for judgment. If there is a suspicion of fraud, provide some suggestions or preventive measures to protect user safety.
User Prompt: Determine whether “${sys.query}” is suspected of fraud.
Note: You can enter / to insert variables. Select query from System Variables.
Output: Default
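The canvas handles model invocation for you, but if you want to sanity-check the fraud-detection prompt outside the console, a minimal sketch using the DashScope Python SDK might look like the following. The SDK usage and the shortened system prompt here are assumptions for illustration, not part of the workflow configuration.

```python
# Hypothetical sketch: test the LLM node's prompt directly with the DashScope SDK.
# Assumes `pip install dashscope` and a valid DASHSCOPE_API_KEY environment variable.
import os
import dashscope
from dashscope import Generation

dashscope.api_key = os.getenv("DASHSCOPE_API_KEY")

SYSTEM_PROMPT = (
    "Analyze and determine whether the given information is suspected of fraud. "
    "Clearly indicate whether the information exhibits characteristics of fraud "
    "and briefly explain the basis for judgment."  # abbreviated form of the System Prompt above
)

query = "You have won a prize, please check"
response = Generation.call(
    model="qwen-max",  # matches the Model Configuration above
    messages=[
        {"role": "system", "content": SYSTEM_PROMPT},
        {"role": "user", "content": f'Determine whether "{query}" is suspected of fraud.'},
    ],
    result_format="message",
    max_tokens=1024,   # matches Maximum Reply Length
)
print(response.output.choices[0].message.content)
```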
Drag an Intent Classification node from the left-side pane into the canvas, connect the LLM node to the Intent Classification node, and configure the following parameters.
Input: Select the output of the LLM node.
Model Configuration: Qwen-Plus
Intent Configuration: Click Add Category and add two categories: "The information involves fraud." and "The information does not involve fraud."
Other Intents: Default
Output: Default
Drag a Text Conversion node from the left-side pane into the canvas, connect all outputs from the Intent Classification node to the Text Conversion node. Configure the following parameter.
Text Template: Enter / to insert variables, then select the subject output by the Intent Classification node and the result output by the LLM node.
Connect the Text Conversion node to the End node, and configure the following parameters.
Output Mode: Text Output
Input box: Enter / to insert variables and select the output of the Text Conversion node.
Click Test in the upper right corner, enter "Your mom misses you, call her when you have time" as the query, and click Execute. After the workflow is executed, the End node displays the Run Result.
Click Test in the upper right corner, enter "You have won a prize, please check" as the query, and click Execute. After the workflow is executed, the End node displays the Run Result.
Click Publish in the upper-right corner to publish the workflow application.
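After publishing, you can also call the workflow application programmatically instead of testing it in the console. The following is a minimal sketch assuming the DashScope Python SDK; the application ID is a placeholder, and exact parameters may differ by SDK version and region.

```python
# Hypothetical sketch: invoke the published workflow application via the DashScope SDK.
# YOUR_APP_ID is a placeholder for the application ID shown after publishing.
import os
from http import HTTPStatus
import dashscope
from dashscope import Application

dashscope.api_key = os.getenv("DASHSCOPE_API_KEY")

response = Application.call(
    app_id="YOUR_APP_ID",
    prompt="You have won a prize, please check",  # mapped to the query parameter of the Start node
)
if response.status_code == HTTPStatus.OK:
    print(response.output.text)   # text produced by the End node
else:
    print(response.message)       # error details
```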
Advanced case
This example shows how to build an intelligent shopping assistant to recommend phones, TVs, and refrigerators using a dialog workflow. For a dialog workflow, the variable ${sys.query} is the user input in the dialog box.
Go to My Applications in the Model Studio console.
Click Create Application, choose Workflow Application, and click Create Dialog Workflow.
On the Canvas Configuration page, you can see the Start node already has two preset parameters. You can modify them based on your requirements.
Drag an Intent Classification node from the left-side pane into the canvas. Connect the Start node to the Intent Classification node, and configure the following parameters.
Configure the Start node: Delete the city and date parameters. The Start node retains the built-in default parameter query.
Configure the Intent Classification node:
Input: Select the query parameter of the Start node.
Model Configuration: Qwen-Plus
Intent Configuration: Click Add Category and add three categories: TV, Phone, and Refrigerator.
Other Intents: Default
Output: Default
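For reference, the branching that the Intent Classification node performs can be pictured as "classify, then route". The sketch below only illustrates that idea with the DashScope SDK; it is not how the node is implemented, and the classification prompt is an assumption.

```python
# Hypothetical sketch: approximate the Intent Classification node's behavior in code.
import os
import dashscope
from dashscope import Generation

dashscope.api_key = os.getenv("DASHSCOPE_API_KEY")

CATEGORIES = ["TV", "Phone", "Refrigerator", "Other Intents"]  # mirrors the Intent Configuration above

def classify_intent(user_query: str) -> str:
    """Ask Qwen-Plus to assign the query to exactly one configured category."""
    response = Generation.call(
        model="qwen-plus",  # matches the Model Configuration above
        messages=[
            {"role": "system",
             "content": "Classify the user's request into exactly one of these categories: "
                        + ", ".join(CATEGORIES) + ". Reply with the category name only."},
            {"role": "user", "content": user_query},
        ],
        result_format="message",
    )
    answer = response.output.choices[0].message.content.strip()
    return answer if answer in CATEGORIES else "Other Intents"

print(classify_intent("I need a fridge for my shop"))  # expected to route to the Refrigerator branch
```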
Drag an LLM node into the canvas, connect the TV output of the Intent Classification node to the LLM node, and configure the following parameters.
Model Configuration: Qwen-Max
Temperature: Default
Maximum Reply Length: 1024
enable_search: Disable
Prompt:
System Prompt: You are an intelligent shopping assistant responsible for recommending TVs to customers. You need to actively ask users what parameters they need for a TV according to the order in the [TV Parameter List] below, asking only one parameter at a time and not repeating questions for one parameter. If the user tells you the parameter value, you need to continue asking for the remaining parameters. If the user asks about the concept of this parameter, you need to use your professional knowledge to answer and continue to ask which parameter is needed. If the user mentions that they do not need to continue purchasing the product, please output: Thank you for visiting, looking forward to serving you next time. [TV Parameter List] 1. Screen Size: [50 inches, 70 inches, 80 inches] 2. Refresh Rate: [60Hz, 120Hz, 240Hz] 3. Resolution: [1080P, 2K, 4K] If all parameters in the [TV Parameter List] have been collected, you need to ask: "Are you sure you want to purchase?" and output the customer's selected parameter information at the same time, such as: 50 inches|120Hz|1080P. Ask if they are sure they need a TV with these parameters. If the customer decides not to purchase, ask which parameters need to be adjusted. If the customer confirms that these parameters meet their requirements, you need to output in the following format: [Screen Size: 50 inches, Refresh Rate: 120Hz, Resolution: 1080P]. Please only output this format and do not output other information.
User Prompt: The user's question is: ${sys.query}
Note: You can also enter / to insert variables and select query from System Variables.
Context: Disable
Output: Default
Similarly, drag a second LLM node into the canvas, connect the Phone output of the Intent Classification node to the LLM node, and configure the following parameters.
Model Configuration: Qwen-Max
Temperature: Default
Maximum Reply Length: 1024
enable_search: Disable
Prompt:
System Prompt: You are an intelligent shopping assistant responsible for recommending phones to customers. You need to actively ask users what parameters they need for a phone according to the order in the [Phone Parameter List] below, asking only one parameter at a time and not repeating questions for one parameter. If the user tells you the parameter value, you need to continue asking for the remaining parameters. If the user asks about the concept of this parameter, you need to use your professional knowledge to answer and continue to ask which parameter is needed. If the user mentions that they do not need to continue purchasing the product, please output: Thank you for visiting, looking forward to serving you next time. [Phone Parameter List] 1. Usage Scenario: [Gaming, Photography, Watching Movies] 2. Screen Size: [6.4 inches, 6.6 inches, 6.8 inches, 7.9 inches foldable screen] 3. RAM Space + Storage Space: [8GB+128GB, 8GB+256GB, 12GB+128GB, 12GB+256GB] If all parameters in the [Parameter List] have been collected, you need to ask: "Are you sure you want to purchase?" and output the customer's selected parameter information at the same time, such as: For photography|8GB+128GB|6.6 inches. Ask if they are sure they need a phone with these parameters. If the customer decides not to purchase, ask which parameters need to be adjusted. If the customer confirms that these parameters meet their requirements, you need to output in the following format: [Usage Scenario: Photography, Screen Size: 6.8 inches, Storage Space: 128GB, RAM Space: 8GB]. Please only output this format and do not output other information.
User Prompt: The user's question is: ${sys.query}
Context: Disable
Output: Default
Similarly, drag a third LLM node into the canvas, connect the Refrigerator output of the Intent Classification node to the LLM node, and configure the following parameters.
Model Configuration: Qwen-Max
Temperature: Default
Maximum Reply Length: 1024
enable_search: Disable
Prompt:
System Prompt: You are an intelligent shopping assistant responsible for recommending refrigerators to customers. You need to actively ask users what parameters they need for a refrigerator according to the order in the [Refrigerator Parameter List] below, asking only one parameter at a time and not repeating questions for one parameter. If the user tells you the parameter value, you need to continue asking for the remaining parameters. If the user asks about the concept of this parameter, you need to use your professional knowledge to answer and continue to ask which parameter is needed. If the user mentions that they do not need to continue purchasing the product, please output: Thank you for visiting, looking forward to serving you next time. [Refrigerator Parameter List] 1. Usage Scenario: [Household, Small Commercial, Large Commercial] 2. Capacity: [200L, 300L, 400L, 500L] 3. Energy Efficiency Level: [Level 1, Level 2, Level 3] If all parameters in the [Parameter List] have been collected, you need to ask: "Are you sure you want to purchase?" and output the customer's selected parameter information at the same time, such as: For small commercial use|300L|Level 1. Ask if they are sure they need a refrigerator with these parameters. If the customer decides not to purchase, ask which parameters need to be adjusted. If the customer confirms that these parameters meet their requirements, you need to output in the following format: [Usage Scenario: Household, Capacity: 300L, Energy Efficiency Level: Level 1]. Please only output this format and do not output other information.
User Prompt: The user's question is: ${sys.query}
Context: Disable
Output: Default
Drag a Text Conversion node from the left-side pane into the canvas, connect the three LLM nodes to the Text Conversion node, and configure the following parameters.
Text Template: Enter / to insert variables, then select the result output of each of the three LLM nodes.
Drag another Text Conversion node into the canvas, connect the Other Intents output of the Intent Classification node to this Text Conversion node, and configure the following parameter.
Text Template: This product is not within the scope of the shopping assistant. Thank you for visiting, looking forward to serving you next time.
Connect both Text Conversion nodes to the End node, and configure the following parameters.
Output Mode: Text Output
Input box: Enter / to insert variables and select the output of each of the two Text Conversion nodes.
Click Test in the upper right corner, enter "Tell me about your refrigerators, I need one for household use." as the query, and click Execute. After the workflow is executed, the End node displays the Run Result.
Enter "Tell me about a 200L household refrigerator?" as the query. The End node displays the Run Result.
Enter "Tell me about your headphones?" as the query. The End node displays the Run Result.
Click Publish in the upper-right corner to publish the workflow application.
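Because this is a dialog workflow, multi-turn calls need to share a session so that ${sys.query} is interpreted in context. A minimal sketch, assuming the DashScope Python SDK and a placeholder application ID, might reuse the session_id returned by the first call:

```python
# Hypothetical sketch: multi-turn invocation of the published dialog workflow.
import os
import dashscope
from dashscope import Application

dashscope.api_key = os.getenv("DASHSCOPE_API_KEY")
APP_ID = "YOUR_APP_ID"  # placeholder for the published application ID

# First turn: open the conversation.
first = Application.call(
    app_id=APP_ID,
    prompt="Tell me about your refrigerators, I need one for household use.",
)
print(first.output.text)

# Follow-up turn: pass the returned session_id so the assistant keeps the context.
second = Application.call(
    app_id=APP_ID,
    prompt="200L, please.",
    session_id=first.output.session_id,
)
print(second.output.text)
```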