
Elasticsearch: Migrate data with elasticsearch-dump

Last Updated: Feb 14, 2026

This guide explains how to use the open-source elasticsearch-dump tool to migrate settings, mappings, and documents between Alibaba Cloud Elasticsearch clusters, or between an on-premises environment and the cloud.

Overview

elasticsearch-dump is a command-line tool used to export data from one Elasticsearch index and import it into another, or into a local file.

Best for:

  • Migrating small volumes of data.

  • Moving specific indexes.

  • Backing up mappings or settings to local JSON files.

Official documentation: elasticdump.

Prerequisites

  • Source/Destination clusters: Alibaba Cloud Elasticsearch clusters must be created. See Create an Alibaba Cloud Elasticsearch cluster.

  • Auto-indexing: The destination cluster must have Auto Indexing enabled, or the target index must be created manually in advance. See Configure the YML file.

  • Migration node: An Elastic Compute Service (ECS) instance is required to run the tool.

    • If the ECS instance and Elasticsearch cluster are in the same Virtual Private Cloud (VPC), use internal endpoints for faster, free data transfer.

    • If they are in different regions/VPCs, use public endpoints and ensure the ECS IP is added to the cluster whitelist. See Manage IP address whitelists.
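Before installing anything, it can save time to confirm that the ECS instance can reach the cluster endpoint and that the whitelist is configured. The following is a sketch: the hostname is a placeholder, and the command is only echoed so the block is safe to copy as-is.

```shell
# Connectivity check (sketch): substitute your real endpoint and password.
ES_HOST='es-cn-xxx.elasticsearch.aliyuncs.com:9200'
# Echo for review; remove the echo to actually send the request.
echo curl -u "elastic:<YourPassword>" "http://${ES_HOST}"
```

A successful response from the real command returns the cluster's name and version as JSON; a timeout usually points at a whitelist or VPC problem.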

Install elasticsearch-dump

  1. Connect to the ECS instance.

  2. Install Node.js.

    1. Download the package:

      wget https://nodejs.org/dist/v16.18.0/node-v16.18.0-linux-x64.tar.xz
    2. Decompress the package:

      tar -xf node-v16.18.0-linux-x64.tar.xz
    3. Configure environment variables:

      • To make the environment variable take effect in the current session only, run:

        export PATH=$PATH:/root/node-v16.18.0-linux-x64/bin/
      • To make the change permanent, add the export line to your profile and reload it:

        vim ~/.bash_profile
        export PATH=$PATH:/root/node-v16.18.0-linux-x64/bin/
        source ~/.bash_profile
  3. Install elasticsearch-dump:

    npm install elasticdump -g
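After installation, a quick sanity check like the following (a sketch; `command -v` is POSIX and reports whether a binary is on the PATH) confirms the toolchain is ready before you start a migration:

```shell
# Sanity check (sketch): confirm each required binary is on PATH.
# command -v exits non-zero when a tool cannot be found.
for TOOL in node npm elasticdump; do
  if command -v "$TOOL" >/dev/null 2>&1; then
    echo "$TOOL found"
  else
    echo "$TOOL missing"
  fi
done
```

If elasticdump is reported missing right after a successful npm install, the usual cause is that the Node.js bin directory from the previous step is not on the PATH of the current shell.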

Examples

Important

If your password contains special characters (such as #, $, or @), authentication through the connection URL may fail. See the FAQ and troubleshooting section for the --httpAuthFile solution.
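As an aside (not from the original guide), a common general workaround for special characters in URLs is percent-encoding. Whether a given elasticdump version decodes an encoded password is version-dependent, so the --httpAuthFile approach in the FAQ remains the documented option; the encoding itself can be produced with python3, which ships with most Linux distributions. The password below is a made-up example:

```shell
# Percent-encode a password for safe embedding in a URL (sketch).
RAW_PASSWORD='My#Pass$word'
ENCODED=$(python3 -c 'import urllib.parse,sys; print(urllib.parse.quote(sys.argv[1], safe=""))' "$RAW_PASSWORD")
echo "$ENCODED"   # → My%23Pass%24word
```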

Migrate data between clusters (cloud-to-cloud)

To fully migrate an index, run the command for settings, mappings, and data in that order.

  1. Migrate index settings:

    elasticdump --input=http://"<UserName>:<YourPassword>"@<YourEsHost>/<YourEsIndex> --output=http://"<OtherName>:<OtherPassword>"@<OtherEsHost>/<OtherEsIndex> --type=settings
  2. Migrate mappings:

    elasticdump --input=http://"<UserName>:<YourPassword>"@<YourEsHost>/<YourEsIndex> --output=http://"<OtherName>:<OtherPassword>"@<OtherEsHost>/<OtherEsIndex> --type=mapping
  3. Migrate documents (data):

    elasticdump --input=http://"<UserName>:<YourPassword>"@<YourEsHost>/<YourEsIndex> --output=http://"<OtherName>:<OtherPassword>"@<OtherEsHost>/<OtherEsIndex> --type=data
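The three steps above can be wrapped in a small loop that enforces the settings → mapping → data order. This is a sketch: the endpoints, index names, and credentials are placeholders, and each command is only echoed so you can review it before removing the echo:

```shell
# Run the three migration phases in the required order (sketch).
# SRC/DST are placeholder URLs; substitute your own endpoints,
# credentials, and index names.
SRC='http://elastic:SrcPass@es-cn-source.elasticsearch.aliyuncs.com:9200/my_index'
DST='http://elastic:DstPass@es-cn-target.elasticsearch.aliyuncs.com:9200/my_index'
for TYPE in settings mapping data; do
  # Echo for review; drop the echo to execute for real.
  echo elasticdump --input="$SRC" --output="$DST" --type="$TYPE"
done
```

Running settings and mapping before data matters: if documents arrive first with auto-indexing enabled, Elasticsearch infers a mapping that may not match the source index.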

Export to local file (backup)

  1. Migrate settings:

    elasticdump --input=http://"<UserName>:<YourPassword>"@<YourEsHost>/<YourEsIndex> --output=<YourLocalFile> --type=settings
  2. Migrate mappings:

    elasticdump --input=http://"<UserName>:<YourPassword>"@<YourEsHost>/<YourEsIndex> --output=<YourLocalFile> --type=mapping
  3. Migrate documents (data):

    elasticdump --input=http://"<UserName>:<YourPassword>"@<YourEsHost>/<YourEsIndex> --output=<YourLocalFile> --type=data
  4. Migrate data based on a query:

    elasticdump --input=http://"<UserName>:<YourPassword>"@<YourEsHost>/<YourEsIndex> --output=<YourLocalFile> --searchBody="<YourQuery>"
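Because elasticsearch-dump creates the destination file itself and the file name must be unique in the target directory (see the parameter reference below), a timestamped naming scheme avoids collisions across repeated backups. A sketch, with placeholder index, host, and credentials, and commands echoed for review:

```shell
# Derive unique backup file names per export type (sketch).
INDEX=my_index
STAMP=$(date +%Y%m%d%H%M%S)
for TYPE in settings mapping data; do
  FILE="/tmp/${INDEX}_${TYPE}_${STAMP}.json"
  # Echo for review; substitute real endpoint and credentials, then drop the echo.
  echo elasticdump --input="http://elastic:<YourPassword>@<YourEsHost>/${INDEX}" --output="$FILE" --type="$TYPE"
done
```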

Import from local file (restore)

Restore documents

elasticdump --input=<YourLocalFile> --output=http://"<UserName>:<YourPassword>"@<YourEsHost>/<YourEsIndex> --type=data
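Before restoring, it can be worth checking that the dump file is intact. To my knowledge, elasticsearch-dump data files are newline-delimited JSON with one document object per line; a quick validation pass (a sketch, using python3 and a tiny made-up dump file) looks like this:

```shell
# Create a tiny example dump (made-up data) and validate it line by line.
cat > /tmp/example_dump.json <<'EOF'
{"_index":"customers","_id":"1","_source":{"user":"admin"}}
{"_index":"customers","_id":"2","_source":{"user":"guest"}}
EOF
python3 - <<'EOF'
import json
# Each non-empty line must parse as a standalone JSON object.
with open('/tmp/example_dump.json') as f:
    docs = [json.loads(line) for line in f if line.strip()]
print(f"{len(docs)} documents parsed")
EOF
```

A json.JSONDecodeError here points at a truncated or corrupted export, which is cheaper to discover before the import starts writing into the destination index.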

Parameter reference

  • --input / --output: The source and destination. Each can be a cluster URL or a local file path.

    Important

    When you export to a local file, elasticsearch-dump automatically generates the destination file in the specified path. Before you migrate data to a local machine, ensure that the name of the destination file is unique in the related directory.

  • <YourEsHost> / <OtherEsHost>: The endpoint (internal or public) of your Elasticsearch cluster (e.g., es-cn-abc.public.elasticsearch.aliyuncs.com:9200). See View the basic information of a cluster.

  • <UserName> / <OtherName>: The cluster username (default is elastic).

  • <YourPassword> / <OtherPassword>: The cluster password.

  • --type: The type of content to migrate: settings, mapping, or data.

  • --searchBody: Filter the exported documents with a Query DSL query. Example: {"query":{"term":{"user":"admin"}}}
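A malformed --searchBody value only fails once elasticdump runs, so validating the Query DSL JSON first is cheap. A sketch using python3 (the query shown is the example from the parameter reference):

```shell
# Validate a Query DSL snippet before passing it to --searchBody (sketch).
QUERY='{"query":{"term":{"user":"admin"}}}'
echo "$QUERY" | python3 -c 'import json,sys; json.load(sys.stdin); print("valid JSON")'
```

Note that this only checks JSON syntax; whether the query matches your mapping is still up to Elasticsearch at run time.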

FAQ and troubleshooting

Q: Error: getaddrinfo ENOTFOUND elastic

Cause: This usually occurs when your password contains special characters (such as #, !, or @) that break the URL structure.

Solution: Use an authentication file.

  1. Create a file named auth.ini:

    user=elastic
    password="Your#Complex$Password"
    Note

    The password must be enclosed in a pair of double quotation marks (").

  2. Run elasticdump using the --httpAuthFile flag:

    elasticdump --input=http://es-*****.public.elasticsearch.aliyuncs.com:9200/customers --output=/root/customers.json --httpAuthFile=/root/auth.ini --type=settings
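The auth file from step 1 can also be created non-interactively instead of with an editor. A sketch (the path and password are placeholders; the quoted heredoc delimiter keeps $ in the password literal):

```shell
# Create the authentication file without an editor (sketch).
AUTH_FILE=/tmp/auth.ini        # the guide uses /root/auth.ini; any path works
cat > "$AUTH_FILE" <<'EOF'
user=elastic
password="Your#Complex$Password"
EOF
chmod 600 "$AUTH_FILE"         # credentials should not be world-readable
cat "$AUTH_FILE"
```

Restricting the file to mode 600 is a general precaution, since the file holds the cluster password in plain text.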

Q: URIError: URI malformed

Cause: elasticsearch-dump cannot parse usernames and passwords that contain special characters.

Solutions:

  • Method 1: Remove the special characters in the username or password.

  • Method 2: Log on to the Kibana console of the cluster, create a user, grant the required permissions to the user, and then use the new user for data migration.