Integrate into iOS apps

Updated at: 2025-03-05 09:15

This topic describes how to integrate source code for AI real-time interaction into your iOS app.

Source code

Download link

You can download the source code from GitHub.

Directory structure

├── iOS                                           // The root directory of the iOS platform.
│   ├── AUIAICall.podspec                         // The pod description file.
│   ├── Source                                    // The source code files.
│   ├── Resources                                 // The resource files.
│   ├── Example                                   // The source code of the demo.
│   ├── AUIBaseKits                               // The basic UI components.
│   ├── AICallKit                                 // The AICallKit module, which can be customized.
│   └── README.md                                 // The README file.

Environment requirements

  • Xcode 16.0 or later. We recommend that you use the latest official version.

  • CocoaPods 1.9.3 or later.

  • A physical device that runs iOS 10.0 or later.

Prerequisites

The relevant API operations are implemented on your AppServer, or the AppServer is deployed based on the provided source code. For more information, see Deploy a project.

Run the demo

  • After the source code is downloaded, go to the Example directory.

  • Run the pod install --repo-update command in the Example directory. The dependent SDKs are automatically installed.

  • Open the AUIAICallExample.xcworkspace file and modify the bundle ID.

  • Open the AUIAICallAppServer.swift file and modify the domain name of the AppServer.

    // AUIAICallAppServer.swift
    public let AICallServerDomain = "Domain name of your AppServer"
  • Build and run the Example target.

Develop your own AI-powered call features

You can quickly integrate AUIAICall into your app by performing the following steps.

Integrate the source code

  • After you download the source code from the repository, copy the iOS folder to the code directory of your app and rename the folder AUIAICall. Make sure that the folder is placed at the same directory level as your Podfile. You can delete the Example and AICallKit directories.

  • Modify your Podfile to import the following modules:

    • AliVCSDK_ARTC: ApsaraVideo MediaBox SDK for Alibaba Real-Time Communication (ARTC). You can also import AliVCSDK_Standard or AliVCSDK_InteractiveLive. For more information, see Integrate ApsaraVideo MediaBox SDK for iOS.

    • ARTCAICallKit: an SDK for AI-powered real-time interactive calls.

    • AUIFoundation: the basic UI components.

    • AUIAICall: the source code of UI components for AI-powered calls.

    # iOS 10.0 or later is required.
    platform :ios, '10.0'
    
    target 'Your app target' do
        # Integrate ApsaraVideo MediaBox SDK based on your business requirements. AliVCSDK_ARTC, AliVCSDK_Standard, and AliVCSDK_InteractiveLive are supported.
        pod 'AliVCSDK_ARTC', '~> x.x.x'
    
        # An SDK for AI-powered real-time interactive calls.
        pod 'ARTCAICallKit', '~> 1.6.0'
    
        # The source code of the basic UI components.
        pod 'AUIFoundation', :path => "./AUIAICall/AUIBaseKits/AUIFoundation/", :modular_headers => true
    
        # The source code of UI components for AI-powered calls.
        pod 'AUIAICall', :path => "./AUIAICall/"
    end
    Note

    You can download the latest version of the ARTC SDK from the official website.

  • Run the pod install --repo-update command.

  • The integration is complete.

Configure the project

  • Open the Info.plist file of your project and add the NSMicrophoneUsageDescription and NSCameraUsageDescription keys to request microphone and camera permissions.

  • In the project settings, enable Background Modes on the Signing & Capabilities tab. If you do not enable Background Modes, you must develop code to end calls after the app is switched to the background.
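
    The two settings above correspond to entries like the following in Info.plist. The key names are Apple's standard property list keys; the description strings are placeholders that you should replace with text appropriate for your app:

    ```xml
    <!-- Info.plist: usage descriptions shown to the user in the permission prompts. -->
    <key>NSMicrophoneUsageDescription</key>
    <string>The app needs microphone access for AI-powered voice calls.</string>
    <key>NSCameraUsageDescription</key>
    <string>The app needs camera access for digital human and visual understanding calls.</string>
    <!-- Enabling Background Modes (Audio) on the Signing & Capabilities tab adds: -->
    <key>UIBackgroundModes</key>
    <array>
        <string>audio</string>
    </array>
    ```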

Configure the source code

  • Make sure that all prerequisites are met.

  • Open the AUIAICallAppServer.swift file and modify the domain name of the AppServer.

    // AUIAICallAppServer.swift
    public let AICallServerDomain = "Domain name of your AppServer"

Call API operations

After the preceding steps are complete, you can call API operations in other modules of your app or on its homepage to start AI-powered calls. You can also modify the source code based on your business requirements.

// Import the components.
import AUIAICall
import AUIFoundation

// Check whether microphone permission is granted.
AVDeviceAuth.checkMicAuth { auth in
    guard auth else {
        return
    }
    
    // Create a controller based on the user ID.
    let controller = AUIAICallStandardController(userId: userId)
    // Set the type of the intelligent agent, such as voice, digital human, or visual understanding. The AppServer selects an agent ID based on the agent type to start a call.
    controller.config.agentType = agentType
    // Create a ViewController for the call.
    let vc = AUIAICallViewController(controller)
    // Open the call page in full screen mode.
    vc.modalPresentationStyle = .fullScreen
    vc.modalTransitionStyle = .coverVertical
    vc.modalPresentationCapturesStatusBarAppearance = true
    self.present(vc, animated: true)
}
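
If you use a digital human or visual understanding agent, camera access is also required before the call starts. A minimal sketch using Apple's standard AVFoundation permission API; note that AVCaptureDevice.requestAccess(for:) is a system API, not part of AUIAICall, and the fallback behavior shown in the comments is an assumption for illustration:

```swift
import AVFoundation

// Request camera permission before starting a digital human or
// visual understanding call. The completion handler may run on an
// arbitrary queue, so hop back to the main queue before doing UI work.
AVCaptureDevice.requestAccess(for: .video) { granted in
    DispatchQueue.main.async {
        guard granted else {
            // Camera access denied: fall back to a voice-only agent,
            // or prompt the user to enable the camera in Settings.
            return
        }
        // Camera access granted: create the controller and present
        // the call page as shown in the example above.
    }
}
```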

FAQ

For more information about AUI Kits or for technical support, scan the following QR code with DingTalk to join the AUI Kits user group.

[QR code image]
