
Adding a Jar or Script Job

You can submit programs you have developed to MRS, execute them, and obtain the execution results. This section describes how to create a job.

Prerequisites

You have completed the procedure described in Background.

Procedure

  1. Log in to the MRS management console.
  2. Click in the upper-left corner of the management console and select Region and Project.
  3. Choose Clusters > Active Clusters, select a running cluster, and click its name to switch to the cluster information page.
  4. Click Job Management and go to the Job Management tab page.
  5. On the Job tab page, click Create and go to the Create Job page.

    Table 1 describes the job configuration parameters.
    Table 1 Job configuration information

    Type: Job type.

    Possible types include:

    • MapReduce
    • Spark
    • Spark Script
    • Hive Script
    NOTE:

    To add Spark or Hive jobs, the Spark and Hive components must have been selected during cluster creation, and the cluster must be in the running state. Spark Script jobs support only Spark SQL, while Spark jobs support both Spark Core and Spark SQL.

    Name: Job name.

    This parameter consists of 1 to 64 characters, including letters, digits, hyphens (-), and underscores (_). It cannot be left blank.

    NOTE:

    Identical job names are allowed but not recommended.
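    The naming rule above can be checked before submission. The following is a minimal sketch derived from the stated constraints (the helper and pattern are illustrative, not an MRS API):

    ```python
    import re

    # 1-64 characters: letters, digits, hyphens, underscores
    # (pattern transcribed from the rule stated above).
    NAME_PATTERN = re.compile(r"[A-Za-z0-9_-]{1,64}")

    def is_valid_job_name(name: str) -> bool:
        """Return True if the job name satisfies the documented constraints."""
        return NAME_PATTERN.fullmatch(name) is not None
    ```

    For example, `is_valid_job_name("wordcount_job-01")` returns True, while an empty string or a name containing spaces is rejected.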

    Program Path: Address of the JAR or script file of the program used to execute the job.

    NOTE:

    When configuring this parameter, click OBS or HDFS, specify the file path, and click OK.

    This parameter cannot be null.

    This parameter must meet the following requirements:

    • A maximum of 1023 characters are allowed, but special characters (*?<">|\) are not. The address cannot be empty or consist only of spaces.
    • The path varies depending on the file system:
      • OBS: The path must start with s3a://, for example, s3a://wordcount/program/hadoop-mapreduce-examples-2.7.x.jar.
      • HDFS: The path must start with /user.
    • Spark Script files must end with .sql, while MapReduce and Spark JAR files must end with .jar. The .sql and .jar extensions are case-insensitive.
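    The Program Path rules above can be summarized as a client-side sanity check. The following sketch transcribes the stated rules; the function itself is illustrative, not an official MRS API:

    ```python
    # Characters forbidden in a Program Path, per the rule above.
    FORBIDDEN_CHARS = set('*?<">|\\')

    # Extension rules stated above; other job types have no stated rule.
    REQUIRED_SUFFIX = {"Spark Script": ".sql", "MapReduce": ".jar", "Spark": ".jar"}

    def check_program_path(path: str, job_type: str) -> bool:
        """Return True if path satisfies the documented Program Path rules."""
        if not path or path.isspace() or len(path) > 1023:
            return False
        if any(ch in FORBIDDEN_CHARS for ch in path):
            return False
        # OBS paths start with s3a://; HDFS paths start with /user.
        if not (path.startswith("s3a://") or path.startswith("/user")):
            return False
        suffix = REQUIRED_SUFFIX.get(job_type)
        # The .sql/.jar extensions are case-insensitive.
        return suffix is None or path.lower().endswith(suffix)
    ```

    For example, `check_program_path("s3a://wordcount/program/hadoop-mapreduce-examples-2.7.x.jar", "MapReduce")` returns True, while a path with an unsupported scheme or a forbidden character is rejected.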

    Parameter: Key parameters for executing the job.

    The parameters are defined by a function in your program; MRS only passes them in. Separate multiple parameters with spaces.

    Format: package name.class name

    A maximum of 2047 characters are allowed, but special characters (;|&>',<$) are not. This parameter can be left empty.

    NOTE:

    If a parameter contains sensitive information, such as a login password, add an at sign (@) before it to encrypt the value and prevent the sensitive information from being persisted in plaintext. When you then view job information on the MRS management console, the sensitive information is displayed as asterisks (*).

    Example: username=admin @password=admin_123
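    The @ convention above can be illustrated with a short sketch of how such parameters might be masked for display. Only the @ prefix comes from this page; the helper and the four-asterisk mask are illustrative assumptions:

    ```python
    # Illustrates masking of @-prefixed parameters for display.
    # Only the @ prefix convention comes from the documentation; the
    # function and the "****" mask are assumptions for illustration.
    def mask_sensitive(param_string: str) -> str:
        """Replace values of @-prefixed key=value tokens with asterisks."""
        parts = []
        for token in param_string.split():
            if token.startswith("@") and "=" in token:
                key, _, _value = token[1:].partition("=")
                parts.append(f"{key}=****")  # value hidden as asterisks
            else:
                parts.append(token)  # non-sensitive token kept as-is
        return " ".join(parts)
    ```

    Applied to the example above, `mask_sensitive("username=admin @password=admin_123")` yields `username=admin password=****`.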

    Import from: Address of the input data.

    NOTE:

    When configuring this parameter, click OBS or HDFS, specify the file path, and click OK.

    The path varies depending on the file system:
    • OBS: The path must start with s3a://.
    • HDFS: The path must start with /user.

    A maximum of 1023 characters are allowed, but special characters (*?<">|\) are not allowed. This parameter can be empty.

    Export to: Address of the output data.

    NOTE:

    When configuring this parameter, click OBS or HDFS, specify the file path, and click OK.

    The path varies depending on the file system:
    • OBS: The path must start with s3a://.
    • HDFS: The path must start with /user.

    A maximum of 1023 characters are allowed, but special characters (*?<">|\) are not allowed. This parameter can be empty.

    Log path: Address for storing job logs that record the job running status.

    NOTE:

    When configuring this parameter, click OBS or HDFS, specify the file path, and click OK.

    The path varies depending on the file system:
    • OBS: The path must start with s3a://.
    • HDFS: The path must start with /user.

    A maximum of 1023 characters are allowed, but special characters (*?<">|\) are not allowed. This parameter can be empty.

    NOTE:
    • OBS paths use the s3a:// scheme, which is used by default.
    • Files and programs encrypted by KMS cannot be imported when an OBS path is used.
    • The full path of an HDFS or OBS address contains a maximum of 1023 characters.
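    The path constraints shared by Import from, Export to, and Log path can be sketched as one check. The rules below are transcribed from Table 1; the function itself is illustrative, not an MRS API:

    ```python
    # Characters forbidden in data/log paths, per Table 1.
    FORBIDDEN_CHARS = set('*?<">|\\')

    def check_data_path(path: str) -> bool:
        """Return True if path satisfies the Import from / Export to /
        Log path rules stated in Table 1."""
        if path == "":
            return True  # these parameters may be left empty
        if len(path) > 1023 or any(ch in FORBIDDEN_CHARS for ch in path):
            return False
        # OBS paths start with s3a://; HDFS paths start with /user.
        return path.startswith("s3a://") or path.startswith("/user")
    ```

    For example, `check_data_path("s3a://bucket/input/")` and `check_data_path("/user/output")` both pass, while an `hdfs://` URI does not match either stated prefix.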

  6. Confirm job configuration information and click OK.

    After jobs are added, you can manage them.

    NOTE:

    By default, each cluster supports a maximum of 10 running jobs.