This section explains how to run a Python program such as python test.py on the cloud using Batch Compute.
python test.py:
print('Hello, cloud!')
You first submit a job to Batch Compute, which allocates a machine based on your configuration, starts the VM, and runs python test.py on it. The result is automatically uploaded to your OSS bucket, where you can view it.
1. You can submit a job in multiple ways. The following describes four methods.
1.1. Use a command line tool (a command) to submit a job
bcs sub "python test.py" -p ./test.py
This single command submits the job. When it is run, the file test.py is packed into worker.tar.gz, uploaded to the specified location, and then submitted as a job for running.
To run the bcs command, you must first install the Batch Compute CLI tool. For more information, see Here.
Syntax of the bcs sub command:
bcs sub <commandLine> [job_name] [options]
To view more parameter details, run bcs sub -h.
1.2. Use console to submit a job
The detailed steps are as follows:
1.2.1. Pack and upload test.py to OSS
Run the following command in the directory of test.py:
tar -czf worker.tar.gz test.py # Packs test.py into worker.tar.gz.
Use the OSS console to upload worker.tar.gz to your OSS bucket.
Note: You must have signed up for the OSS service and created a bucket. Assume that the bucket name is mybucket; create a directory named test in this bucket.
Assume that you upload the file to the test directory in mybucket. The file path in OSS is then oss://mybucket/test/worker.tar.gz.
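The packing step above can also be scripted. A minimal sketch using Python's standard tarfile module (the helper name pack_worker is mine, not part of any SDK):

```python
import os
import tarfile

def pack_worker(src_files, archive_path="worker.tar.gz"):
    """Pack source files into the gzipped tarball Batch Compute expects."""
    with tarfile.open(archive_path, "w:gz") as tar:
        for path in src_files:
            # arcname drops directories so test.py sits at the archive root,
            # matching what "tar -czf worker.tar.gz test.py" produces.
            tar.add(path, arcname=os.path.basename(path))
    return archive_path
```

For example, pack_worker(["test.py"]) creates worker.tar.gz in the current directory, ready to upload to oss://mybucket/test/.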
1.2.2. Use the console to submit a job
Go to the Submit job page.
Enter the job name test_job as prompted.
Drag a task into the job and enter the fields as follows. The ECS image ID can be obtained from Image.
Click Submit Job to submit the job.
After the job is successfully submitted, the page automatically jumps to the Job List page, where you can view the status of the job you submitted.
Wait a moment; you can view the result after the job finishes.
1.3. Use Python SDK to submit a job
1.3.1. Pack and upload test.py to OSS.
Same as in the previous section.
1.3.2. Submit a job.
from batchcompute import Client, ClientError
from batchcompute import CN_SHENZHEN as REGION

ACCESS_KEY_ID = 'your_access_key_id'          # This parameter needs to be configured.
ACCESS_KEY_SECRET = 'your_access_key_secret'  # This parameter needs to be configured.

job_desc = {
    "Name": "my_job_name",
    "Description": "hello test",
    "JobFailOnInstanceFail": True,
    "Priority": 0,
    "Type": "DAG",
    "DAG": {
        "Tasks": {
            "test": {
                "InstanceCount": 1,
                "MaxRetryCount": 0,
                "Parameters": {
                    "Command": {
                        "CommandLine": "python test.py",
                        "PackagePath": "oss://mybucket/test/worker.tar.gz"
                    },
                    "StderrRedirectPath": "oss://mybucket/test/logs/",
                    "StdoutRedirectPath": "oss://mybucket/test/logs/"
                },
                "Timeout": 21600,
                "AutoCluster": {
                    "InstanceType": "ecs.sn1.medium",
                    "ImageId": "img-ubuntu"
                }
            }
        },
        "Dependencies": {}
    }
}

client = Client(REGION, ACCESS_KEY_ID, ACCESS_KEY_SECRET)
result = client.create_job(job_desc)
job_id = result.Id
# ...
For more information about the Python SDK, see Python SDK.
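After create_job returns, you typically poll until the job reaches a terminal state. A minimal sketch, assuming the client exposes get_job(job_id) returning an object with a State attribute (as in the batchcompute Python SDK); the helper wait_for_job is my own, not an SDK function:

```python
import time

def wait_for_job(client, job_id, interval=5, timeout=21600):
    """Poll a Batch Compute job until it finishes, fails, or is stopped."""
    terminal = {"Finished", "Failed", "Stopped"}
    deadline = time.time() + timeout
    while time.time() < deadline:
        state = client.get_job(job_id).State
        if state in terminal:
            return state
        time.sleep(interval)
    raise TimeoutError("job %s still not terminal after %ss" % (job_id, timeout))
```

For example, wait_for_job(client, job_id) blocks until the job above finishes, after which the output can be read from oss://mybucket/test/logs/.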
1.4. Use Java SDK to submit a job
1.4.1. Pack and upload test.py to OSS.
Same as in the previous section.
1.4.2. Submit a job.
import com.aliyuncs.batchcompute.main.v20151111.*;
import com.aliyuncs.batchcompute.model.v20151111.*;
import com.aliyuncs.batchcompute.pojo.v20151111.*;
import com.aliyuncs.exceptions.ClientException;

public class SubmitJob {
    static final String REGION = "cn-shenzhen";
    static final String ACCESS_KEY_ID = "";     // This parameter needs to be configured.
    static final String ACCESS_KEY_SECRET = ""; // This parameter needs to be configured.

    public static void main(String[] args) throws ClientException {
        JobDescription desc = new SubmitJob().getJobDesc();
        BatchCompute client = new BatchComputeClient(REGION, ACCESS_KEY_ID, ACCESS_KEY_SECRET);
        CreateJobResponse res = client.createJob(desc);
        String jobId = res.getJobId();
        // ...
    }

    private JobDescription getJobDesc() {
        JobDescription desc = new JobDescription();
        desc.setName("testJob");
        desc.setPriority(1);
        desc.setDescription("JAVA SDK TEST");
        desc.setType("DAG");
        desc.setJobFailOnInstanceFail(true);
        DAG dag = new DAG();
        dag.addTask(getTaskDesc());
        desc.setDag(dag);
        return desc;
    }

    private TaskDescription getTaskDesc() {
        TaskDescription task = new TaskDescription();
        task.setInstanceCount(1);
        task.setMaxRetryCount(0);
        task.setTaskName("test");
        task.setTimeout(10000);
        AutoCluster autoCluster = new AutoCluster();
        autoCluster.setImageId("img-ubuntu");
        autoCluster.setInstanceType("ecs.sn1.medium");
        // autoCluster.setResourceType("OnDemand");
        task.setAutoCluster(autoCluster);
        // To run on an existing cluster instead of AutoCluster:
        // task.setClusterId(yourClusterId);
        Parameters parameters = new Parameters();
        Command cmd = new Command();
        cmd.setCommandLine("python test.py");
        // cmd.addEnvVars("a", "b");
        cmd.setPackagePath("oss://mybucket/test/worker.tar.gz");
        parameters.setCommand(cmd);
        parameters.setStderrRedirectPath("oss://mybucket/test/logs/");
        parameters.setStdoutRedirectPath("oss://mybucket/test/logs/");
        // InputMappingConfig input = new InputMappingConfig();
        // input.setLocale("GBK");
        // input.setLock(true);
        // parameters.setInputMappingConfig(input);
        task.setParameters(parameters);
        // task.addInputMapping("oss://my-bucket/disk1/", "/home/admin/disk1/");
        // task.addOutputMapping("/home/admin/disk2/", "oss://my-bucket/disk2/");
        // task.addLogMapping("/home/admin/a.log", "oss://my-bucket/a.log");
        return task;
    }
}
For more information about the Java SDK, see Java SDK.
2. CommandLine in Batch Compute:
CommandLine is different from a shell. It supports only the form program + parameters, for example, python test.py or sh test.sh.
To run shell syntax, use /bin/bash -c 'cd /home/xx/ && python a.py', or write the shell commands to a script such as test.sh and run sh test.sh.
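Because CommandLine is parsed as program + parameters rather than by a shell, a shell snippet must be handed to /bin/bash explicitly. A small sketch of that wrapping (the helper name shell_wrap is mine, not an SDK function):

```python
import shlex

def shell_wrap(script):
    """Quote a shell snippet so it can be passed as the single argument
    of /bin/bash -c in a Batch Compute CommandLine."""
    return "/bin/bash -c " + shlex.quote(script)

# shell_wrap("cd /home/xx/ && python a.py")
# -> "/bin/bash -c 'cd /home/xx/ && python a.py'"
```

The returned string can be used directly as the CommandLine value in a job description.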
CommandLine appears in the following locations:
The cmd of bcs sub <cmd> [job_name] [options] in the command-line tool.
The cmd in cmd.setCommandLine(cmd) when the Java SDK is used.
taskName.Parameters.Command.CommandLine in the job description when the Python SDK is used.