Replicating Jobs

This section describes how to replicate MRS jobs.

Background

Currently, jobs of all types except Spark SQL and DistCp can be replicated.

Procedure

  1. Log in to the MRS management console.
  2. In the upper-left corner of the management console, select Region and Project.
  3. Choose Clusters > Active Clusters, select a running cluster, and click its name to switch to the cluster information page.
  4. Click Job Management.
  5. In the Operation column corresponding to the to-be-replicated job, choose More > Copy.

    The Copy Job dialog box is displayed.

  6. Set job parameters, and click OK.

    Table 1 describes job configuration information.

    After being successfully submitted, a job changes to the Running state by default. You do not need to manually execute the job.

    Table 1 Job configuration information

    Type

    Job type. Possible types include:

    • MapReduce
    • Spark
    • Spark Script
    • Hive Script
    NOTE:

    To add Spark or Hive jobs, you must select the Spark and Hive components when creating the cluster, and the cluster must be running. Spark Script jobs support Spark SQL only, whereas Spark jobs support both Spark Core and Spark SQL.

    Name

    Job name. The name contains 1 to 64 characters, consisting of letters, digits, hyphens (-), and underscores (_). It cannot be empty.

    NOTE:

    Identical job names are allowed but not recommended.
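The job-name rule above can be checked client-side before submission. The following is a minimal sketch of that rule; the helper name and regex are illustrative and not part of any MRS SDK.

```python
import re

# Documented rule: 1 to 64 characters drawn from letters, digits,
# hyphens, and underscores; the name cannot be empty.
JOB_NAME_RE = re.compile(r"^[A-Za-z0-9_-]{1,64}$")

def is_valid_job_name(name: str) -> bool:
    """Return True if the name satisfies the documented constraints."""
    return JOB_NAME_RE.fullmatch(name) is not None
```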

    Program Path

    Path of the program JAR file used to execute the job.

    NOTE:

    When configuring this parameter, click OBS or HDFS, specify the file path, and click OK.

    This parameter cannot be null.

    This parameter must meet the following requirements:

    • A maximum of 1023 characters are allowed. Special characters (*?<">|\) are not allowed, and the path cannot be empty or consist only of spaces.
    • The path varies depending on the file system:
      • OBS: The path must start with s3a://, for example, s3a://wordcount/program/hadoop-mapreduce-examples-2.7.x.jar.
      • HDFS: The path must start with /user.
    • The file name extension must match the job type: Spark Script files must end with .sql, while MapReduce and Spark files must end with .jar. The .sql and .jar extensions are case-insensitive.
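The Program Path requirements above can be sketched as a single validation routine. This is a hypothetical helper illustrating the documented rules (length limit, forbidden characters, OBS/HDFS prefixes, per-type extensions); it is not part of any MRS SDK.

```python
def is_valid_program_path(path: str, job_type: str) -> bool:
    # Hypothetical validator mirroring the documented Program Path rules.
    if not path or path.isspace() or len(path) > 1023:
        return False                      # empty/blank or too long
    if any(ch in path for ch in '*?<">|\\'):
        return False                      # forbidden special characters
    if not (path.startswith("s3a://") or path.startswith("/user")):
        return False                      # must be an OBS or HDFS path
    suffix = path.lower()                 # extension check is case-insensitive
    if job_type in ("MapReduce", "Spark"):
        return suffix.endswith(".jar")
    if job_type == "Spark Script":
        return suffix.endswith(".sql")
    return True
```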

    Parameter

    Key parameters for executing the job. These parameters are specified by a function in your program; MRS only passes them through. Separate multiple parameters with spaces.

    Format: package name.class name

    A maximum of 2047 characters are allowed, but special characters (;|&>',<$) are not allowed. This parameter can be empty.

    NOTE:

    If a parameter contains sensitive information, such as a login password, you can add an at sign (@) before the parameter to encrypt its value and prevent the sensitive information from being persisted in plaintext. When you view job information on the MRS management console, the sensitive information is displayed as asterisks (*).

    Example: username=admin @password=admin_123
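For illustration, masking an @-prefixed parameter for display could look like the sketch below. The function is hypothetical and only mimics the behavior described above; the console's actual encryption and masking logic may differ.

```python
def mask_sensitive(params: str) -> str:
    # Sketch: hide the value of any @-prefixed key=value token,
    # mimicking the asterisk display described in the documentation.
    masked = []
    for token in params.split():
        if token.startswith("@") and "=" in token:
            key, _, _value = token[1:].partition("=")
            masked.append(key + "=******")
        else:
            masked.append(token)
    return " ".join(masked)
```

For the documented example, `mask_sensitive("username=admin @password=admin_123")` hides only the @-marked value.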

    Import from

    Path of the input data.

    NOTE:

    When configuring this parameter, click OBS or HDFS, specify the file path, and click OK.

    The path varies depending on the file system:
    • OBS: The path must start with s3a://.
    • HDFS: The path must start with /user.

    A maximum of 1023 characters are allowed, but special characters (*?<">|\) are not allowed. This parameter can be empty.
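The Import from, Export to, and Log path parameters share the same path rule (optionally empty; otherwise at most 1023 characters, no special characters, OBS or HDFS prefix). A minimal sketch of that shared check, with a hypothetical helper name not found in any MRS SDK:

```python
def is_valid_data_path(path: str) -> bool:
    # Shared documented rule for Import from / Export to / Log path:
    # the value may be empty; otherwise it is at most 1023 characters,
    # contains no forbidden special characters, and starts with
    # s3a:// (OBS) or /user (HDFS).
    if path == "":
        return True                       # these parameters can be empty
    if len(path) > 1023 or any(ch in path for ch in '*?<">|\\'):
        return False
    return path.startswith("s3a://") or path.startswith("/user")
```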

    Export to

    Path of the output data.

    NOTE:

    When configuring this parameter, click OBS or HDFS, specify the file path, and click OK.

    The path varies depending on the file system:
    • OBS: The path must start with s3a://.
    • HDFS: The path must start with /user.

    A maximum of 1023 characters are allowed, but special characters (*?<">|\) are not allowed. This parameter can be empty.

    Log path

    Path for storing job logs that record the job running status.

    NOTE:

    When configuring this parameter, click OBS or HDFS, specify the file path, and click OK.

    The path varies depending on the file system:
    • OBS: The path must start with s3a://.
    • HDFS: The path must start with /user.

    A maximum of 1023 characters are allowed, but special characters (*?<">|\) are not allowed. This parameter can be empty.