Object Storage Service: Use Hadoop Shell commands to access OSS-HDFS

Last Updated: Aug 14, 2024

If you want to use the CLI to perform operations such as uploading, downloading, and deleting objects in a bucket for which OSS-HDFS is enabled, you can use Hadoop Shell commands.

Environment preparation

You can use one of the following methods to access OSS-HDFS:

  • If you want to access OSS-HDFS by using an Alibaba Cloud EMR cluster, make sure that an EMR cluster of version 3.46.2 or later, or version 5.12.2 or later, is created. EMR clusters that meet these version requirements are integrated with OSS-HDFS by default. For more information, see Create a cluster.

  • If you do not want to use an Alibaba Cloud EMR cluster to access OSS-HDFS, make sure that JindoSDK 4.6.x or later is installed and deployed. For more information, see Deploy JindoSDK in an environment other than EMR.
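
    After you deploy JindoSDK, you can run a quick sanity check to confirm that the JindoSDK JAR is visible to the Hadoop client. This is only a sketch: it assumes that the hadoop command is on your PATH and that the JAR file name contains the string jindo.

    hadoop classpath --glob | tr ':' '\n' | grep -i jindo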

Commands and examples

The following section provides examples on how to use Hadoop Shell commands to access OSS-HDFS.

  • Upload a local file

    Run the following command to upload a local file named examplefile.txt from the local root directory to a bucket named examplebucket:

    hdfs dfs -put examplefile.txt oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/
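
    By default, the -put command fails if an object with the same name already exists at the destination. Assuming that your Hadoop version supports the -f option for -put, you can overwrite the existing object:

    hdfs dfs -put -f examplefile.txt oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/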
  • Create a directory

    Run the following command to create a directory named dir/ in a bucket named examplebucket:

    hdfs dfs -mkdir oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/dir/
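
    To create nested directories in a single command, you can add the -p option, which also creates any missing parent directories along the path. The subdirectory name subdir/ below is used only for illustration:

    hdfs dfs -mkdir -p oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/dir/subdir/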
  • Query objects or directories

    Run the following command to query the objects or directories in a bucket named examplebucket:

    hdfs dfs -ls oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/
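
    To list the contents of all subdirectories recursively instead of only the top level, you can add the -R option:

    hdfs dfs -ls -R oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/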
  • Query the size of objects or directories

    Run the following command to query the size of all objects or directories in a bucket named examplebucket:

    hdfs dfs -du oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/
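
    To display a single summarized total in a human-readable format instead of one line per object or directory, you can combine the -s and -h options:

    hdfs dfs -du -s -h oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/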
  • Query the content of an object

    Run the following command to query the content of an object named localfile.txt in a bucket named examplebucket:

    hdfs dfs -cat oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/localfile.txt
    Important

    The content of the queried object is displayed on the screen in plain text. If the content is encoded, use the HDFS API for Java to read and decode the content.
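
    For a large object, you can preview only the beginning of the content by piping the output to a standard tool such as head. This sketch assumes a Unix-like shell on the client:

    hdfs dfs -cat oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/localfile.txt | head -n 10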

  • Copy an object or a directory

    Run the following command to copy a directory named subdir1 from the root directory of a bucket named examplebucket to a directory named subdir2 in the same bucket. The source directory subdir1, the objects it contains, and the structure and content of its subdirectories remain unchanged.

    hdfs dfs -cp oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/subdir1  oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/subdir2/subdir1
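
    You can also copy a single object instead of an entire directory. The following sketch copies an object named exampleobject.txt, a name used only for illustration, into the subdir2/ directory of the same bucket:

    hdfs dfs -cp oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/exampleobject.txt oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/subdir2/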
  • Move an object or a directory

    Run the following command to move a directory named srcdir, together with the objects and subdirectories it contains, from the root directory of a bucket named examplebucket to another directory named destdir in the root directory of the bucket:

    hdfs dfs -mv oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/srcdir  oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/destdir
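
    The -mv command can also be used to rename a single object in place. The object names oldname.txt and newname.txt in the following sketch are hypothetical and used only for illustration:

    hdfs dfs -mv oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/oldname.txt oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/newname.txt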
  • Download an object

    Run the following command to download an object named exampleobject.txt from a bucket named examplebucket to the local directory /tmp on your computer:

    hdfs dfs -get oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/exampleobject.txt  /tmp/
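
    To save the downloaded object under a different local file name, specify the file name in the destination path. The local file name localcopy.txt in the following sketch is hypothetical and used only for illustration:

    hdfs dfs -get oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/exampleobject.txt /tmp/localcopy.txt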
  • Delete objects or directories

    Run the following command to delete a directory named destfolder/ and all objects in the directory from a bucket named examplebucket:

    hdfs dfs -rm -r oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/destfolder/
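
    To delete a single object instead of an entire directory, omit the -r option. The object name exampleobject.txt in the following sketch is used only for illustration:

    hdfs dfs -rm oss://examplebucket.cn-hangzhou.oss-dls.aliyuncs.com/exampleobject.txt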