
ApsaraVideo Live: Live stream recording

Last Updated: Oct 12, 2024

ApsaraVideo Live provides the live stream recording feature that allows you to record live streams in real time and play historical streams. This topic describes the live stream recording feature in detail.

Overview

ApsaraVideo Live uses streaming URLs to play streams that are ingested by the stream ingest client in real time. If you want to play or modify the content of a live stream after the live stream ends, you can use the live stream recording feature.

The live stream recording feature is used to record the ingested streams that are received by a live center and store the recording files in a specific storage location. ApsaraVideo Live allows you to store the recording files of a live stream in ApsaraVideo VOD or an Object Storage Service (OSS) bucket.

How live stream recording works

The live stream recording feature pulls the streams that you ingest, encapsulates the audio and video data into separate media segment files in the Transport Stream (TS) format, and then stores the TS files. The live stream recording feature complies with the following principles:

  • The live stream recording feature only converts the container format of audio and video data from Real-Time Messaging Protocol (RTMP) or Flash Video (FLV) to TS. The feature does not modify the audio and video content at the coding layer. For example, if the live stream that you ingest is blurry, the recording is also blurry.

  • The live stream recording feature can write recording files to the storage location that you specify. However, the feature cannot modify or delete files in the storage location, including the stored recording files. You manage the recording files in your storage location.

Storage method comparison

You can store a live stream in ApsaraVideo VOD or an OSS bucket. The storage methods are used in different scenarios. You can select a storage method based on your business requirements.

Storage method: ApsaraVideo VOD

  • Associated service: ApsaraVideo VOD

  • Container format: TS/M3U8

  • Scenario: You can repurpose your recordings or focus on the post-production process of recordings, such as editing, playback, and video thumbnail creation.

Storage method: OSS bucket

  • Associated service: OSS

  • Container format: TS/M3U8, MP4, FLV, or CMAF

    Note

    Regardless of the container format, the live stream recording feature first slices the live streams into a number of TS segments and then composes them to generate a recording file in the specified container format.

  • Scenario: You can focus on the management and storage of recordings.

Note

You cannot store the recording files of a live stream in both OSS buckets and ApsaraVideo VOD.

Limits

Audio and video encoding formats

ApsaraVideo Live needs to slice live streams into TS segments first. The audio and video encoding formats of the live streams that you ingest must meet the requirements of the TS format. Based on the FLV and ISO/IEC 13818-1 standards, the recording feature supports the following encoding formats:

  • Video: H.264, High Efficiency Video Coding (HEVC), and MPEG-4

  • Audio: Advanced Audio Coding (AAC) and MP3

Note

If the live streams contain audio and video data that is encoded in unsupported formats, unexpected exceptions, such as recording failures, black screen issues, and recordings without sound, may occur in the recording process.

Abnormal live streams

The live stream recording feature can generate recording files for normal live streams. Normal live streams are streams that are encoded in the supported formats, have stable frame rates, and have timestamps that increase monotonically and linearly. If the frame rate of a live stream is occasionally unstable or the timestamp of the live stream jumps, the feature can still generate a recording file for the live stream because of its wide compatibility. However, if a serious exception is detected on an ingested live stream, the feature may fail to generate recording files for the live stream. Serious exceptions include no video frame detected for a long period of time, abnormal timestamp jumps, and missing audio or video headers.

Store recording files in ApsaraVideo VOD

Prerequisites

ApsaraVideo VOD is activated. For more information, see Activate ApsaraVideo VOD.

The VOD bucket is enabled to store recording files in ApsaraVideo VOD. For more information, see Enable VOD buckets.

Important
  • When you enable the VOD bucket, make sure that the VOD bucket resides in the same region as the live center of the streaming domain.

  • You cannot configure the live-to-VOD feature by using a Finance Cloud account.

Configure storage settings

After the prerequisites are met, you can configure the required settings to store recording files in ApsaraVideo VOD.

Note

If you store recording files in ApsaraVideo VOD, storage fees are generated. For more information, see Media storage.

ApsaraVideo Live provides the following methods to store recording files in ApsaraVideo VOD:

1. Configure settings in the ApsaraVideo Live console. For more information, see Store recordings in ApsaraVideo VOD.

2. Call an API operation. For more information, see AddLiveRecordVodConfig. The following section describes how to use SDK for Java to configure settings for recording files to be stored in ApsaraVideo VOD.

//Replace the content in the angle brackets <> with actual values.
DefaultProfile profile = DefaultProfile.getProfile("<regionId>", "<ALIBABA_CLOUD_ACCESS_KEY_ID>", "<ALIBABA_CLOUD_ACCESS_KEY_SECRET>");
IAcsClient client = new DefaultAcsClient(profile);
AddLiveRecordVodConfigRequest addLiveRecordVodConfigRequest = new AddLiveRecordVodConfigRequest();
addLiveRecordVodConfigRequest.setDomainName("<DomainName>");
addLiveRecordVodConfigRequest.setAppName("<AppName>");
addLiveRecordVodConfigRequest.setStreamName("<StreamName>");
addLiveRecordVodConfigRequest.setVodTranscodeGroupId("VOD_NO_TRANSCODE");
//The address of the VOD bucket.
addLiveRecordVodConfigRequest.setStorageLocation("<StorageLocation>");
AddLiveRecordVodConfigResponse addLiveRecordVodConfigResponse = null;
try {
    addLiveRecordVodConfigResponse = client.getAcsResponse(addLiveRecordVodConfigRequest);
} catch (ClientException e) {
    e.printStackTrace();
}
System.out.println(new Gson().toJson(addLiveRecordVodConfigResponse));
Note
  • After you run the sample code, the recording file of the stream that is played by using the streaming domain is stored in the VOD bucket. <DomainName> specifies the streaming domain. <AppName> specifies the application name. <StreamName> specifies the stream name. VOD_NO_TRANSCODE specifies that no transcoding is required for the recording. <StorageLocation> specifies the address of the VOD bucket.

  • If you do not want to specify an application or a live stream, you can set the AppName and StreamName parameters to asterisks (*), which indicates all applications and streams. For an example, see the sketch after this list.

  • You can configure multiple rules. The rules take effect based on priorities:

    • If the AppName and StreamName parameters are not set to asterisks (*) in a rule, the rule has the highest priority.

    • If the AppName parameter is not set to an asterisk (*) in a rule, the rule has the second priority.

    • If the AppName parameter is set to an asterisk (*) in a rule, the rule has the lowest priority. In this case, only the streaming domain is specified.
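The following snippet is a minimal sketch of a lowest-priority (wildcard) rule that is based on the preceding sample code:

//Match all applications and all streams under the streaming domain.
addLiveRecordVodConfigRequest.setDomainName("<DomainName>");
addLiveRecordVodConfigRequest.setAppName("*");
addLiveRecordVodConfigRequest.setStreamName("*");
addLiveRecordVodConfigRequest.setVodTranscodeGroupId("VOD_NO_TRANSCODE");
addLiveRecordVodConfigRequest.setStorageLocation("<StorageLocation>");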

Important
  • After you configure the recording rules, you must ingest the stream again for the configuration to take effect.

  • After a live stream is interrupted for 180 seconds, a recording file is generated for the live stream.

Transcode a recording

The recording file is stored in the M3U8 format. If you want to store the recording file in another format, you can configure transcoding. Sample code for configuring transcoding:

 addLiveRecordVodConfigRequest.setVodTranscodeGroupId("<TranscodeGroupId>");
Note
  • The transcoding template must be configured in ApsaraVideo VOD. For more information, see Transcoding template.

  • <TranscodeGroupId> specifies the ID of the transcoding template group in ApsaraVideo VOD.

  • After you configure transcoding, the recording file is transcoded and the transcoded stream URL is generated. The original recording file is retained.

Merge recordings

Recordings can be merged in the following scenarios:

1. If a live stream is interrupted but is re-ingested within 180 seconds, the recording files that are generated before and after the interruption are automatically merged into one recording file.

2. If the duration of a live stream exceeds the recording cycle, the recording files that are generated before and after the recording cycle can be merged. After the duration of a live stream exceeds the recording cycle, a new recording file is generated. For example, if the recording cycle is 5 minutes and the stream ingest duration is 8 minutes, two recording files are generated after a stream is ingested.

Sample code for configuring a recording cycle:

//Unit: seconds. Valid values: 300 to 21600.
addLiveRecordVodConfigRequest.setCycleDuration(<300>);
Note

If you do not specify a recording cycle, the default recording cycle is 15 minutes.

If you want to merge the recording files of different recording cycles, you can configure the merging settings.

Note

After you enable automatic merging of recording files across multiple recording cycles, the editing and production feature and the transcoding feature of ApsaraVideo VOD are required. For more information, see Billing of value-added services and Billing of basic services.

Sample code for merging recording files:

addLiveRecordVodConfigRequest.setAutoCompose("ON");
addLiveRecordVodConfigRequest.setComposeVodTranscodeGroupId("<TranscodeGroupId>");
Note
  • After multiple recording files are merged, a new recording file is generated and the original recording files are retained.

  • The transcoding template must be configured in ApsaraVideo VOD. For more information, see Transcoding template.

  • <TranscodeGroupId> specifies the ID of the transcoding template group in ApsaraVideo VOD.

Manage recording files

You can view the recording files that are stored in the VOD bucket in the ApsaraVideo Live console. For more information, see View recordings. To manage recording files, you can use ApsaraVideo VOD. For more information, see Media asset management.
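For example, you can query the media information of a recording by its media ID. The following snippet is a minimal sketch that assumes the ApsaraVideo VOD SDK for Java is used and that <VideoId> is the media ID that ApsaraVideo VOD assigns to the recording:

//A minimal sketch that assumes the ApsaraVideo VOD SDK for Java. Replace <VideoId> with the media ID of the recording.
GetVideoInfoRequest getVideoInfoRequest = new GetVideoInfoRequest();
getVideoInfoRequest.setVideoId("<VideoId>");
GetVideoInfoResponse getVideoInfoResponse = null;
try {
    getVideoInfoResponse = client.getAcsResponse(getVideoInfoRequest);
} catch (ClientException e) {
    e.printStackTrace();
}
System.out.println(new Gson().toJson(getVideoInfoResponse));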

Store recording files in an OSS bucket

Prerequisites

OSS is activated and an OSS bucket is created. For more information, see Configure OSS.

Important

The OSS bucket must reside in the same region as the live center of the streaming domain.

Configure storage settings

After the prerequisites are met, you can configure the required settings to store recording files in an OSS bucket.

Note

If you store recording files in an OSS bucket, storage fees are generated. For more information, see Storage fees.

ApsaraVideo Live provides the following methods to store recording files in an OSS bucket:

1. Configure settings in the ApsaraVideo Live console: For more information, see Store recordings in OSS.

2. Call an API operation: For more information, see AddLiveAppRecordConfig. The following section describes how to use SDK for Java to configure settings for recording files to be stored in an OSS bucket.

        //Replace the content in the angle brackets <> with actual values.
        DefaultProfile profile = DefaultProfile.getProfile("<regionId>", "<ALIBABA_CLOUD_ACCESS_KEY_ID>", "<ALIBABA_CLOUD_ACCESS_KEY_SECRET>");
        IAcsClient client = new DefaultAcsClient(profile);   
        AddLiveAppRecordConfigRequest addLiveAppRecordConfigRequest=new AddLiveAppRecordConfigRequest();
        addLiveAppRecordConfigRequest.setAppName("<AppName>");
        addLiveAppRecordConfigRequest.setStreamName("<StreamName>");
        addLiveAppRecordConfigRequest.setDomainName("<DomainName>");
        addLiveAppRecordConfigRequest.setOssEndpoint("<OssEndpoint>");
        addLiveAppRecordConfigRequest.setOssBucket("<OssBucket>");
        List<AddLiveAppRecordConfigRequest.RecordFormat> formatList=new ArrayList<AddLiveAppRecordConfigRequest.RecordFormat>();
        AddLiveAppRecordConfigRequest.RecordFormat m3u8Format=new AddLiveAppRecordConfigRequest.RecordFormat();
        m3u8Format.setFormat("m3u8");
        //The name of the recording file that is stored in the OSS bucket.
        m3u8Format.setOssObjectPrefix("<record/{AppName}/{StreamName}/{EscapedStartTime}_{EscapedEndTime}>");
        //The name of the segment.
        m3u8Format.setSliceOssObjectPrefix("<record/{AppName}/{StreamName}/{UnixTimestamp}_{Sequence}>");
        formatList.add(m3u8Format);
        addLiveAppRecordConfigRequest.setRecordFormats(formatList);
        AddLiveAppRecordConfigResponse addLiveAppRecordConfigResponse = null;
        try {
            addLiveAppRecordConfigResponse = client.getAcsResponse(addLiveAppRecordConfigRequest);
        } catch (ClientException e) {
            e.printStackTrace();
        }
        System.out.println(new Gson().toJson(addLiveAppRecordConfigResponse));
Note
  • After you run the sample code, the recording file of the stream that is played by using the streaming domain is stored in the OSS bucket. <DomainName> specifies the streaming domain. <AppName> specifies the application name. <StreamName> specifies the stream name. <OssBucket> specifies the name of the OSS bucket. The recording file is stored in the M3U8 format.

  • The value of the OssObjectPrefix or SliceOssObjectPrefix parameter cannot exceed 255 characters in length after the variables are replaced with actual values. For more information, see AddLiveAppRecordConfig.

  • You can obtain the value of the <OssEndpoint> parameter based on the region of the OSS bucket that is specified by the <OssBucket> parameter. For more information, see Regions, endpoints and open ports.

  • If you do not want to specify an application or a live stream, you can set the AppName and StreamName parameters to asterisks (*), which indicates all applications and streams.

  • You can configure multiple rules. The rules take effect based on priorities:

    • If the AppName and StreamName parameters are not set to asterisks (*) in a rule, the rule has the highest priority.

    • If the AppName parameter is not set to an asterisk (*) in a rule, the rule has the second priority.

    • If the AppName parameter is set to an asterisk (*) in a rule, the rule has the lowest priority. In this case, only the streaming domain is specified.

Important
  • After you configure the recording rules, you must ingest the stream again for the configuration to take effect.

  • After a live stream is interrupted for 180 seconds, a recording file is generated for the live stream. You can specify the interruption duration. For more information, see Configure the interruption duration for merge.

Transcode a recording

If you want to reduce the size of a recording file, you can record transcoded streams to adjust the video bitrate and improve the video compression ratio without affecting the video quality.

Note

OSS allows you to store both the original and transcoded streams, or only one of them.

Sample code for recording transcoded streams:

        //Replace the content in the angle brackets <> with actual values.        
        List<AddLiveAppRecordConfigRequest.TranscodeRecordFormat> transcodeFormatList=new ArrayList<AddLiveAppRecordConfigRequest.TranscodeRecordFormat>();
        AddLiveAppRecordConfigRequest.TranscodeRecordFormat m3u8TranscodeFormat=new AddLiveAppRecordConfigRequest.TranscodeRecordFormat();
        m3u8TranscodeFormat.setFormat("m3u8");
        //The name of the recording file for the transcoded stream in an OSS bucket.
        m3u8TranscodeFormat.setOssObjectPrefix("<transcodeRecord/{AppName}/{StreamName}/{EscapedStartTime}_{EscapedEndTime}>");
        //The name of the segment of the transcoded stream.
        m3u8TranscodeFormat.setSliceOssObjectPrefix("<transcodeRecord/{AppName}/{StreamName}/{UnixTimestamp}_{Sequence}>");
        transcodeFormatList.add(m3u8TranscodeFormat);
        addLiveAppRecordConfigRequest.setTranscodeRecordFormats(transcodeFormatList);
        List<String> transcodeTemplateList=new ArrayList<String>();
        //The ID of the transcoding template.
        transcodeTemplateList.add("lld");
        addLiveAppRecordConfigRequest.setTranscodeTemplatess(transcodeTemplateList);
Note
  • If you use the live stream transcoding feature, you are charged based on the transcoding standard, resolution, and total length of transcoded streams. For more information, see Billing of live stream transcoding.

  • If you want to record only transcoded streams, you do not need to configure the setRecordFormats parameter, which specifies the recording formats for the original stream.

  • If you configure the setTranscodeRecordFormats parameter, you must also configure the setTranscodeTemplatess parameter to specify the transcoding template ID. You must configure the transcoding template in advance. For more information, see Live Stream Transcoding.

  • When you configure a transcoding template for transcoded stream recording, you must configure the transcoding to be triggered by stream ingest.

  • The StreamName parameter specifies the name of the recording file of the transcoded stream. The name is in the StreamName_Transcoding template ID format.

Configure the interruption duration for merge

If a live stream is interrupted but is ingested again within the interruption duration for merge, the generated recording files are automatically merged into one recording file. By default, the interruption duration is 180 seconds.

Note

A new recording file is generated only after the interruption duration for merge elapses.

You can configure the interruption duration for merge based on your business requirements. Sample code:

//Unit: seconds. Valid values: 15 to 21600.
addLiveAppRecordConfigRequest.setDelayTime(<180>);

If you do not want to wait for the interruption duration for merge to elapse before a recording file is generated, you can configure the system to generate a recording file immediately after the live stream ends. To do so, configure a stream ingest callback. When you receive the notification that stream ingest has stopped, call the RealTimeRecordCommand operation to cancel the interruption duration for merge. For more information, see RealTimeRecordCommand.

For more information about how to configure a callback for a stream, see Callbacks for stream ingest status. Sample code for canceling the interruption duration for merge:

        //Replace the content in the angle brackets <> with actual values.
        RealTimeRecordCommandRequest realTimeRecordCommandRequest=new RealTimeRecordCommandRequest();
        realTimeRecordCommandRequest.setCommand("cancel_delay");
        realTimeRecordCommandRequest.setAppName("<AppName>");
        realTimeRecordCommandRequest.setStreamName("<StreamName>");
        realTimeRecordCommandRequest.setDomainName("<DomainName>");
        RealTimeRecordCommandResponse realTimeRecordCommandResponse = null;
        try {
            realTimeRecordCommandResponse = client.getAcsResponse(realTimeRecordCommandRequest);
        } catch (ClientException e) {
            e.printStackTrace();
        }
        System.out.println(new Gson().toJson(realTimeRecordCommandResponse));
Note
  • If you record a transcoded stream, you must specify the <StreamName> parameter in the StreamName_transcoding template ID format, as shown in the example after this note.

  • setCommand("cancel_delay"): resets the interruption duration for merge, completely stops recording, and immediately generates a recording file.

Configure recording rules

After you configure the live stream recording feature as described in the Configure storage settings section, the recording process starts when a live stream is ingested. This process is called automatic recording.

In some cases, you may want to determine whether to record a live stream based on your business requirements. You can use one of the following methods to record a live stream as required:

1. ApsaraVideo Live sends a request to ask whether to record a live stream. After ApsaraVideo Live receives a response to record the live stream, the recording process starts. This process is called on-demand recording.

2. Your business system sends a notification to ApsaraVideo Live to record a live stream. This process is called manual recording.

On-demand recording

To implement on-demand recording, your business system must provide a callback URL for ApsaraVideo Live. For more information, see Callbacks for on-demand recording.

Sample code for configuring on-demand recording:

        addLiveAppRecordConfigRequest.setOnDemand(1);
Note

Set the OnDemand parameter to 1 to enable on-demand recording that uses an HTTP callback method.

You must also call the AddLiveRecordNotifyConfig operation to configure recording callbacks. Sample code:

        //Replace the content in the angle brackets <> with actual values.
        AddLiveRecordNotifyConfigRequest addLiveRecordNotifyConfigRequest=new AddLiveRecordNotifyConfigRequest();
        //Specify a callback URL for on-demand recording.
        addLiveRecordNotifyConfigRequest.setOnDemandUrl("<OnDemandUrl>");
        addLiveRecordNotifyConfigRequest.setNotifyUrl("<NotifyUrl>");
        addLiveRecordNotifyConfigRequest.setDomainName("<DomainName>");
        AddLiveRecordNotifyConfigResponse addLiveRecordNotifyConfigResponse = null;
        try {
            addLiveRecordNotifyConfigResponse = client.getAcsResponse(addLiveRecordNotifyConfigRequest);
        } catch (ClientException e) {
            e.printStackTrace();
        }
        System.out.println(new Gson().toJson(addLiveRecordNotifyConfigResponse));
Important
  • <OnDemandUrl> specifies the callback URL for on-demand recording. <NotifyUrl> specifies the callback URL that is used to receive notifications about recording events and status.

  • The <NotifyUrl> parameter is required. If you want to receive only on-demand recording requests but do not want to call back recording events and status, you can set the <NotifyUrl> parameter to a URL that does not support the requested method.

Manual recording

To enable manual recording, you must disable automatic recording. Sample code:

        addLiveAppRecordConfigRequest.setOnDemand(7);
Note

Set the OnDemand parameter to 7 to disable automatic recording.

During live streaming, you can call the RealTimeRecordCommand operation to enable recording based on your business requirements. Sample code:

        //Replace the content in the angle brackets <> with actual values. 
        RealTimeRecordCommandRequest realTimeRecordCommandRequest=new RealTimeRecordCommandRequest();
        realTimeRecordCommandRequest.setCommand("start");
        realTimeRecordCommandRequest.setAppName("<AppName>");
        realTimeRecordCommandRequest.setStreamName("<StreamName>");
        realTimeRecordCommandRequest.setDomainName("<DomainName>");
        RealTimeRecordCommandResponse realTimeRecordCommandResponse = null;
        try {
            realTimeRecordCommandResponse = client.getAcsResponse(realTimeRecordCommandRequest);
        } catch (ClientException e) {
            e.printStackTrace();
        }
        System.out.println(new Gson().toJson(realTimeRecordCommandResponse));
Note
  • If you record a transcoded stream, you must specify the <StreamName> parameter in the StreamName_transcoding template ID format.

  • setCommand("start"): starts the recording.

Stop a recording

You can manually stop an automatic recording, on-demand recording, or a manual recording. Sample code:

        //Replace the content in the angle brackets <> with actual values. 
        RealTimeRecordCommandRequest realTimeRecordCommandRequest=new RealTimeRecordCommandRequest();
        realTimeRecordCommandRequest.setCommand("stop");
        realTimeRecordCommandRequest.setAppName("<AppName>");
        realTimeRecordCommandRequest.setStreamName("<StreamName>");
        realTimeRecordCommandRequest.setDomainName("<DomainName>");
        RealTimeRecordCommandResponse realTimeRecordCommandResponse = null;
        try {
            realTimeRecordCommandResponse = client.getAcsResponse(realTimeRecordCommandRequest);
        } catch (ClientException e) {
            e.printStackTrace();
        }
        System.out.println(new Gson().toJson(realTimeRecordCommandResponse));
Note
  • setCommand("stop"): stops the recording.

  • If you manually stop a recording, a recording file is generated after the interruption duration for merge elapses.

  • If you record a transcoded stream, you must specify the <StreamName> parameter in the StreamName_transcoding template ID format.

Related operations

  • AddLiveRecordNotifyConfig: Creates a recording callback configuration for a live stream recording.

  • DescribeLiveRecordNotifyConfig: Queries the recording callback configurations of a live stream recording.

  • UpdateLiveRecordNotifyConfig: Modifies the recording callback configuration of a live stream recording.

  • DeleteLiveRecordNotifyConfig: Deletes the recording callback configuration of a live stream recording.
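The following snippet is a minimal sketch that queries the recording callback configuration of a streaming domain. It assumes that the DescribeLiveRecordNotifyConfigRequest class follows the same pattern as the other request classes that are used in this topic:

        //Replace the content in the angle brackets <> with actual values.
        DescribeLiveRecordNotifyConfigRequest describeLiveRecordNotifyConfigRequest = new DescribeLiveRecordNotifyConfigRequest();
        describeLiveRecordNotifyConfigRequest.setDomainName("<DomainName>");
        DescribeLiveRecordNotifyConfigResponse describeLiveRecordNotifyConfigResponse = null;
        try {
            describeLiveRecordNotifyConfigResponse = client.getAcsResponse(describeLiveRecordNotifyConfigRequest);
        } catch (ClientException e) {
            e.printStackTrace();
        }
        System.out.println(new Gson().toJson(describeLiveRecordNotifyConfigResponse));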

Configure a recording cycle

After the duration of live streaming exceeds the recording cycle that you specify, a new recording file is generated. For example, if the live streaming lasts for 20 minutes and the recording cycle is set to 15 minutes, two recording files are generated. You can set the recording cycle to a value that ranges from 15 to 360 minutes.

Sample code for configuring a recording cycle:

        List<AddLiveAppRecordConfigRequest.RecordFormat> formatList=new ArrayList<AddLiveAppRecordConfigRequest.RecordFormat>();
        AddLiveAppRecordConfigRequest.RecordFormat m3u8Format=new AddLiveAppRecordConfigRequest.RecordFormat();
        m3u8Format.setFormat("m3u8");
        m3u8Format.setOssObjectPrefix("<record/{AppName}/{StreamName}/{EscapedStartTime}_{EscapedEndTime}>");
        m3u8Format.setSliceOssObjectPrefix("<record/{AppName}/{StreamName}/{UnixTimestamp}_{Sequence}>");
       //Configure the recording cycle. Unit: seconds.
        m3u8Format.setCycleDuration(<900>);
        AddLiveAppRecordConfigRequest.RecordFormat mp4Format=new AddLiveAppRecordConfigRequest.RecordFormat();
        mp4Format.setFormat("mp4");
        mp4Format.setOssObjectPrefix("<record/{AppName}/{StreamName}/{EscapedStartTime}_{EscapedEndTime}>");
        //Configure the recording cycle. Unit: seconds.
        mp4Format.setCycleDuration(<1500>);
        formatList.add(mp4Format);
        formatList.add(m3u8Format);
        addLiveAppRecordConfigRequest.setRecordFormats(formatList);
Note

After you run the sample code, the recording file is stored in two formats, and each is configured with a specific recording cycle. If a stream is ingested for 20 minutes, three recording files are generated. One recording file is in the MP4 format, and two recording files are in the M3U8 format.

If the live streaming duration does not exceed the recording cycle, the recording file is generated after the live streaming ends. If you want to view the recording file before the live streaming ends, you can call the RealTimeRecordCommand operation to refresh the recording content and force the recording to be restarted. If you restart the recording when the live stream is being recorded, a recording file is immediately generated.

Sample code for restarting a recording:

        //Replace the content in the angle brackets <> with actual values. 
        RealTimeRecordCommandRequest realTimeRecordCommandRequest=new RealTimeRecordCommandRequest();
        realTimeRecordCommandRequest.setCommand("restart");
        realTimeRecordCommandRequest.setAppName("<AppName>");
        realTimeRecordCommandRequest.setStreamName("<StreamName>");
        realTimeRecordCommandRequest.setDomainName("<DomainName>");
        RealTimeRecordCommandResponse realTimeRecordCommandResponse = null;
        try {
            realTimeRecordCommandResponse = client.getAcsResponse(realTimeRecordCommandRequest);
        } catch (ClientException e) {
            e.printStackTrace();
        }
        System.out.println(new Gson().toJson(realTimeRecordCommandResponse));
Note
  • setCommand("restart"): restarts the recording.

  • If you record a transcoded stream, you must specify the <StreamName> parameter in the StreamName_transcoding template ID format.

  • If you configure multiple storage formats for the recording files of live streams, recording files generated after the restart are stored in the configured formats.

Merge and extract recording files

ApsaraVideo Live allows you to create index files and extract recording files by time range. You can also merge multiple recording files that are generated within a time range into one recording file.

Note
  • Regardless of the storage format of recording files, ApsaraVideo Live first slices the live streams into a number of TS segments and then composes them to generate recording files in the specified container format. You can create an M3U8 index file that can reference TS segments. This way, you can merge and extract recording files based on the index file.

  • Only recording files that are stored in the M3U8 format can be merged or extracted.

You can call the CreateLiveStreamRecordIndexFiles operation to create an index file.

        CreateLiveStreamRecordIndexFilesRequest createLiveStreamRecordIndexFilesRequest=new CreateLiveStreamRecordIndexFilesRequest();
        createLiveStreamRecordIndexFilesRequest.setAppName("<AppName>");
        createLiveStreamRecordIndexFilesRequest.setStreamName("<StreamName>");
        createLiveStreamRecordIndexFilesRequest.setDomainName("<DomainName>");
        createLiveStreamRecordIndexFilesRequest.setOssEndpoint("<OssEndpoint>");
        createLiveStreamRecordIndexFilesRequest.setOssBucket("<OssBucket>");
        createLiveStreamRecordIndexFilesRequest.setStartTime("<StartTime>");
        createLiveStreamRecordIndexFilesRequest.setEndTime("<EndTime>");
        createLiveStreamRecordIndexFilesRequest.setOssObject("<IndexFiles/OssObject.m3u8>");
        CreateLiveStreamRecordIndexFilesResponse createLiveStreamRecordIndexFilesResponse = null;
        try {
            createLiveStreamRecordIndexFilesResponse = client.getAcsResponse(createLiveStreamRecordIndexFilesRequest);
        } catch (ClientException e) {
            e.printStackTrace();
        }
        System.out.println(new Gson().toJson(createLiveStreamRecordIndexFilesResponse));
Note
  • In the preceding sample code, an index file is created for the recording file of a live stream within a specific time range. <AppName> specifies the application name. <StreamName> specifies the stream name. <StartTime> specifies the start time of the time range. <EndTime> specifies the end time of the time range. The index file is stored in an OSS bucket. <OssEndpoint> specifies the OSS endpoint. <OssBucket> specifies the name of the OSS bucket. <IndexFiles/OssObject.m3u8> specifies the name of the index file.

  • The values of the <StartTime> and <EndTime> parameters are in the yyyy-MM-ddTHH:mm:ssZ (UTC) format, as shown in the sketch after this note.
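The following snippet is a minimal sketch of one way to build time strings in this format by using the java.time package. The four-hour window is only an example:

        //Build <StartTime> and <EndTime> in the yyyy-MM-ddTHH:mm:ssZ (UTC) format.
        DateTimeFormatter utcFormat = DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss'Z'").withZone(ZoneOffset.UTC);
        String endTime = utcFormat.format(Instant.now());
        String startTime = utcFormat.format(Instant.now().minus(Duration.ofHours(4)));
        createLiveStreamRecordIndexFilesRequest.setStartTime(startTime);
        createLiveStreamRecordIndexFilesRequest.setEndTime(endTime);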

Important
  • You can create an index file only after a live stream is ingested. If no live stream is available within the specified time range or the name of the specified live stream is invalid, the index file fails to be created.

  • The recording content contains files and information about the files, such as application names, stream names, and OSS storage paths. The information about TS segments and M3U8 index files is stored in ApsaraVideo Live. The files, such as TS segments and M3U8 index files, are stored in OSS buckets.

  • The retention period of the files that are stored in OSS buckets is determined by the configuration of the OSS buckets.

  • ApsaraVideo Live stores information about TS segments only for three months. You can create M3U8 index files only for the recordings of the previous three months.

  • ApsaraVideo Live stores information about M3U8 index files only for six months. You can query only the information about the M3U8 index files that are created in the previous six months.

  • If you want to merge or extract transcoded streams, you must specify the StreamName parameter in the StreamName_Transcoding template ID format.

  • The OSS bucket that is specified by the <OssBucket> parameter must reside in the same region as the required live center. Otherwise, you cannot use domain names to query information about the M3U8 index files.

Related operations

  • DescribeLiveStreamRecordIndexFiles: Queries all index files within a specific time period.

  • DescribeLiveStreamRecordIndexFile: Queries the information about an index file.

Manage recording files

ApsaraVideo Live provides the following methods to manage recording files:

1. Configure settings in the ApsaraVideo Live console. For more information, see Manage recordings.

2. Call an API operation. The following steps describe how to use SDK for Java to delete recording files:

  1. Query information about a recording file.

    Note

    To delete a recording file, you must specify the index file ID. You can obtain the index file ID from the information about the recording file.

            //Replace the content in the angle brackets <> with actual values. 
            DefaultProfile profile = DefaultProfile.getProfile("<regionId>", "<ALIBABA_CLOUD_ACCESS_KEY_ID>", "<ALIBABA_CLOUD_ACCESS_KEY_SECRET>");
            IAcsClient client = new DefaultAcsClient(profile);
            DescribeLiveStreamRecordIndexFilesRequest describeLiveStreamRecordIndexFilesRequest=new DescribeLiveStreamRecordIndexFilesRequest();
            describeLiveStreamRecordIndexFilesRequest.setAppName("<AppName>");
            describeLiveStreamRecordIndexFilesRequest.setStreamName("<StreamName>");
            describeLiveStreamRecordIndexFilesRequest.setDomainName("<DomainName>");
            describeLiveStreamRecordIndexFilesRequest.setStartTime("<StartTime>");
            describeLiveStreamRecordIndexFilesRequest.setEndTime("<EndTime>");
            DescribeLiveStreamRecordIndexFilesResponse describeLiveStreamRecordIndexFilesResponse = null;
            try {
                describeLiveStreamRecordIndexFilesResponse = client.getAcsResponse(describeLiveStreamRecordIndexFilesRequest);
            } catch (ClientException e) {
                e.printStackTrace();
            }
    
            for(int i=0;i<describeLiveStreamRecordIndexFilesResponse.getRecordIndexInfoList().size();i++){
                System.out.println(new Gson().toJson(describeLiveStreamRecordIndexFilesResponse.getRecordIndexInfoList().get(i)));
    
            }
    Important
    • You can query only the data within the previous six months.

    • The values of the EndTime and StartTime parameters are in the yyyy-MM-ddTHH:mm:ssZ (UTC) format. The interval between them cannot exceed four days.

    • For more information about the operation that you can call to query recording information, see DescribeLiveStreamRecordIndexFiles.

  2. Delete a recording file.

    Note

    If you want to delete files that are stored in OSS buckets when you delete recording files in ApsaraVideo Live, you must create the AliyunMTSVideoLifecycleRole role and attach the AliyunMTSVideoLifecycleRolePolicy system policy to the role. ApsaraVideo Live stores information about a recording file, such as the application name, the stream name, and the OSS storage path. OSS stores the file. For more information, see Delete recordings.

            DeleteLiveStreamRecordIndexFilesRequest deleteLiveStreamRecordIndexFilesRequest=new DeleteLiveStreamRecordIndexFilesRequest();
            deleteLiveStreamRecordIndexFilesRequest.setAppName("<AppName>");
            deleteLiveStreamRecordIndexFilesRequest.setStreamName("<StreamName>");
            deleteLiveStreamRecordIndexFilesRequest.setDomainName("<DomainName>");
    
            List<String> recordList=new ArrayList<>();
            //Add the index file ID.
            recordList.add("<recordId>");
            deleteLiveStreamRecordIndexFilesRequest.setRecordIds(recordList);
            deleteLiveStreamRecordIndexFilesRequest.setRemoveFile("<true>");
            DeleteLiveStreamRecordIndexFilesResponse deleteLiveStreamRecordIndexFilesResponse = null;
            try {
                deleteLiveStreamRecordIndexFilesResponse = client.getAcsResponse(deleteLiveStreamRecordIndexFilesRequest);
            } catch (ClientException e) {
                e.printStackTrace();
            }
    
            System.out.println(new Gson().toJson(deleteLiveStreamRecordIndexFilesResponse));
    Important
    • <recordId> specifies the index file ID.

    • For more information about the operation that you can call to delete the recording file of a live stream, see DeleteLiveStreamRecordIndexFiles.

The preceding example shows how to delete a recording file. The file stored in an OSS bucket is also deleted.

Other methods to delete files

Delete files that are generated more than six months ago: You can query information only about the recording files that are generated within the previous six months. To delete files that are generated more than six months ago, use OSS. For more information, see Delete objects.
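The following snippet is a minimal sketch that deletes a single recording file by using the OSS SDK for Java. The object key is a hypothetical example and must be replaced with the actual path of the recording file in your bucket:

        //Replace the content in the angle brackets <> with actual values. The object key is a hypothetical example.
        OSS ossClient = new OSSClientBuilder().build("<OssEndpoint>", "<ALIBABA_CLOUD_ACCESS_KEY_ID>", "<ALIBABA_CLOUD_ACCESS_KEY_SECRET>");
        try {
            ossClient.deleteObject("<OssBucket>", "record/<AppName>/<StreamName>/<RecordingFileName>.m3u8");
        } finally {
            ossClient.shutdown();
        }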

If you want to automatically delete the expired recording files in an OSS bucket, you must authorize ApsaraVideo Live to access Alibaba Cloud resources. Go to the Cloud Resource Access Authorization page and click Agree to Authorization. Then, submit a ticket. The after-sales team configures a validity period based on your requirements. For more information about how to submit a ticket, see Contact us. If a validity period is configured for an expired recording file, ApsaraVideo Live automatically deletes the recording file.

Note

In this step, you are assigned the role that grants ApsaraVideo Live the permissions to access Alibaba Cloud resources. If you accidentally delete the role, you can create and grant permissions to the role again.

References

For more information about the API operations that are related to live stream recording, see Store live recordings to ApsaraVideo VOD and Store recordings to OSS.

For more information about how to use SDK for Java, see Java SDK usage instructions.

For more information about recording callbacks, see Callbacks for live stream recording and Callbacks for on-demand recording.