Batch Processing  |  AdWords API  |  Google Developers

While most services provide synchronous APIs, requiring you to make a request
and then wait for a response, BatchJobService
provides a way to perform batches of operations on multiple services without
waiting for the operations to complete.

Unlike service-specific mutate operations, a single job in BatchJobService can operate
against a mixed collection of campaigns, ad groups, ads, criteria,
labels, and feed items. Submitted jobs run in parallel, and BatchJobService
automatically retries operations that fail due to transient errors.
In addition, BatchJobService allows you to use temporary IDs
within your requests so you can submit dependent operations in a single job.

Supported operations

BatchJobService supports mutate operations against campaigns, ad groups, ads,
criteria, labels, budgets, extension settings, and feed items; the result
types table later in this guide lists the full set of supported operand types.

Although each client library contains utilities
that handle the XML serialization of uploaded operations
and XML deserialization of downloaded results, the complete schema for batch
job requests and responses is defined in BatchJobOpsService.wsdl.

Batch job flow

The steps for using a batch job are:

  1. Create the BatchJob and capture its uploadUrl from the mutate() response.
  2. Upload the list of operations you want executed to the uploadUrl.
  3. Poll the batch job’s status periodically until it is CANCELED or DONE.
  4. Download the results of the job from its downloadUrl and check for processingErrors.

In addition, you can cancel a BatchJob in the
AWAITING_FILE or ACTIVE state by setting its
status to CANCELING.
Create a batch job

Create a batch job by sending an ADD operation containing a new BatchJob object.

// Create a BatchJob.
BatchJobOperation addOp = new BatchJobOperation();
addOp.setOperand(new BatchJob());

BatchJob batchJob = batchJobService.mutate(new BatchJobOperation[] {addOp}).getValue(0);

// Get the upload URL from the new job.
String uploadUrl = batchJob.getUploadUrl().getUrl();

System.out.printf("Created BatchJob with ID %d, status '%s' and upload URL %s.%n",
    batchJob.getId(), batchJob.getStatus(), uploadUrl);

At this point in the process, the status
of the job will be AWAITING_FILE.

Create operations for the batch job

For this step, create the operations for your batch job using the same approach
you would use for the synchronous API services. For example, the following
snippet creates CampaignOperation objects for adding new campaigns.

List<CampaignOperation> operations = new ArrayList<>();
for (int i = 0; i < NUMBER_OF_CAMPAIGNS_TO_ADD; i++) {
  Campaign campaign = new Campaign();
  campaign.setName(String.format("Batch Campaign %s.%s", namePrefix, i));

  // Recommendation: Set the campaign to PAUSED when creating it to prevent
  // the ads from immediately serving. Set to ENABLED once you've added
  // targeting and the ads are ready to serve.
  campaign.setStatus(CampaignStatus.PAUSED);

  Budget budget = new Budget();
  budget.setBudgetId(budgetOperation.getOperand().getBudgetId());
  campaign.setBudget(budget);

  BiddingStrategyConfiguration biddingStrategyConfiguration =
      new BiddingStrategyConfiguration();
  biddingStrategyConfiguration.setBiddingStrategyType(BiddingStrategyType.MANUAL_CPC);

  // You can optionally provide a bidding scheme in place of the type.
  ManualCpcBiddingScheme cpcBiddingScheme = new ManualCpcBiddingScheme();
  biddingStrategyConfiguration.setBiddingScheme(cpcBiddingScheme);
  campaign.setBiddingStrategyConfiguration(biddingStrategyConfiguration);

  CampaignOperation operation = new CampaignOperation();
  operation.setOperand(campaign);
  operation.setOperator(Operator.ADD);
  operations.add(operation);
}
return operations;

If you're creating dependent objects such as a complete campaign consisting of
a new campaign and corresponding ad groups, ads, and keywords, you can
use temporary IDs in your ADD operations.

// Create a temporary ID generator that will produce a sequence of descending negative numbers.
Iterator<Long> tempIdGenerator =
    new AbstractSequentialIterator<Long>(-1L) {
      @Override
      protected Long computeNext(Long previous) {
        return Long.MIN_VALUE == previous ? null : previous - 1;
      }
    };
// Use a random UUID name prefix to avoid name collisions.
String namePrefix = UUID.randomUUID().toString();

// Create the mutate request that will be sent to the upload URL.
List<Operation> operations = new ArrayList<>();

// Create and add an operation to create a new budget.
BudgetOperation budgetOperation = buildBudgetOperation(tempIdGenerator, namePrefix);
operations.add(budgetOperation);

// Create and add operations to create new campaigns.
List<CampaignOperation> campaignOperations =
    buildCampaignOperations(tempIdGenerator, namePrefix, budgetOperation);
operations.addAll(campaignOperations);

// Create and add operations to create new negative keyword criteria for each campaign.
operations.addAll(buildCampaignCriterionOperations(campaignOperations));

// Create and add operations to create new ad groups.
List<AdGroupOperation> adGroupOperations =
    new ArrayList<>(buildAdGroupOperations(tempIdGenerator, namePrefix, campaignOperations));
operations.addAll(adGroupOperations);

// Create and add operations to create new ad group criteria (keywords).
operations.addAll(buildAdGroupCriterionOperations(adGroupOperations));

// Create and add operations to create new ad group ads (text ads).
operations.addAll(buildAdGroupAdOperations(adGroupOperations));

Upload operations to the upload URL

Once you've collected the set of operations for your job, the next step is to
send them to the upload URL.

If you are using the utility in one of the client libraries,
then you don't need to worry about all of the underlying details. The utility
will handle constructing and sending the requests for you, and will provide
methods for both of the following options:

  1. Uploading all operations at once.
  2. Uploading operations using multiple calls to the utility.

Option 1: Upload all operations at once

The example below uses the BatchJobHelper utility from the Java client library
to upload all operations at once.

// Use a BatchJobHelper to upload all operations.
BatchJobHelper batchJobHelper = adWordsServices.getUtility(session, BatchJobHelper.class);

batchJobHelper.uploadBatchJobOperations(operations, uploadUrl);

Option 2: Upload operations using multiple calls to the utility

The example below uses the BatchJobHelper utility from the Java client
library to upload operations incrementally via multiple calls to the
utility's uploadIncrementalBatchJobOperations() method.

// Use a BatchJobHelper to upload operations incrementally.
BatchJobHelper batchJobUploadHelper = new BatchJobHelper(session);
BatchJobUploadStatus startingUploadStatus =
    new BatchJobUploadStatus(0, URI.create(batchJob.getUploadUrl().getUrl()));
BatchJobUploadResponse uploadResponse;

// Create and upload the first operation to create a new budget.
BudgetOperation budgetOperation = buildBudgetOperation(tempIdGenerator, namePrefix);
uploadResponse = batchJobUploadHelper.uploadIncrementalBatchJobOperations(
    Lists.newArrayList(budgetOperation),
    false, // isLastRequest
    startingUploadStatus);
System.out.printf("First upload response: %s%n", uploadResponse);

// Create and upload intermediate operations to create new campaigns.
List<CampaignOperation> campaignOperations =
    buildCampaignOperations(tempIdGenerator, namePrefix, budgetOperation);
uploadResponse = batchJobUploadHelper.uploadIncrementalBatchJobOperations(
    campaignOperations,
    false, // isLastRequest
    uploadResponse.getBatchJobUploadStatus());
System.out.printf("Intermediate upload response: %s%n", uploadResponse);

// Upload more intermediate requests...

// Create and upload operations to create new ad group ads (text ads).
// This is the final upload request for the BatchJob.
uploadResponse = batchJobUploadHelper.uploadIncrementalBatchJobOperations(
    buildAdGroupAdOperations(adGroupOperations),
    true, // isLastRequest
    uploadResponse.getBatchJobUploadStatus());
System.out.printf("Last upload response: %s%n", uploadResponse);

Poll the batch job status

After you upload your operations, the batch job is submitted to the job
queue, so you'll want to periodically check the job's status until it is
CANCELED or DONE.
Use an exponential backoff policy to avoid polling too aggressively. The
snippet below waits 30 seconds on the first attempt, 60 seconds on the
second attempt, 120 seconds on the third attempt, and so on.
int pollAttempts = 0;
boolean isPending;
Selector selector =
    new SelectorBuilder()
        .fields(BatchJobField.Id, BatchJobField.Status, BatchJobField.DownloadUrl,
            BatchJobField.ProcessingErrors, BatchJobField.ProgressStats)
        .equalsId(batchJob.getId())
        .build();
do {
  long sleepSeconds = (long) Math.scalb(30, pollAttempts);
  System.out.printf("Sleeping %d seconds...%n", sleepSeconds);
  Thread.sleep(sleepSeconds * 1000);

  batchJob = batchJobService.get(selector).getEntries(0);
  System.out.printf(
      "Batch job ID %d has status '%s'.%n", batchJob.getId(), batchJob.getStatus());

  pollAttempts++;
  isPending = PENDING_STATUSES.contains(batchJob.getStatus());
} while (isPending && pollAttempts < MAX_POLL_ATTEMPTS);

Download the batch job results and check for errors

At this stage, your job should be in one of two states.

Status Description Actions to take
DONE BatchJobService successfully parsed and attempted each of the uploaded operations.
  • Download the results for each operation from the batch job's downloadUrl.
CANCELED The job was canceled as requested, BatchJobService could not parse the uploaded operations, or an unexpected error occurred during job execution.
  • Inspect the list of processingErrors on the batch job.
  • Download the results for any attempted operations from the batch job's downloadUrl, if present.

The download URL will return a mutateResult element for every uploaded operation. Each result will have the following
attributes as defined in BatchJobOpsService.wsdl:

Attribute Type Description
result Operand If the operation was successful, this will have exactly one of the following child elements:

  • Ad
  • AdGroup
  • AdGroupAd
  • AdGroupAdLabel
  • AdGroupBidModifier
  • AdGroupCriterion
  • AdGroupCriterionLabel
  • AdGroupExtensionSetting
  • AdGroupLabel
  • Budget
  • Campaign
  • CampaignCriterion
  • CampaignExtensionSetting
  • CampaignLabel
  • CampaignSharedSet
  • CustomerExtensionSetting
  • ExtensionFeedItem
  • FeedItem
  • Label
  • Media
  • SharedCriterion
  • SharedSet

The element and object returned will correspond to the type of the operation at that
index. For example, if the operation was a successful CampaignOperation, a Campaign element will be returned here.

errorList ErrorList If the operation failed, this will have one or more errors elements, each of which will be an instance of ApiError or one of its subclasses.
index long The 0-based operation number of the operation. Use this to correlate this
result to the corresponding operation in your upload.

The code below shows one approach for processing the results retrieved from a
download URL.

if (batchJob.getDownloadUrl() != null && batchJob.getDownloadUrl().getUrl() != null) {
  BatchJobMutateResponse mutateResponse =
      batchJobHelper.downloadBatchJobMutateResponse(batchJob.getDownloadUrl().getUrl());
  System.out.printf("Downloaded results from %s:%n", batchJob.getDownloadUrl().getUrl());
  for (MutateResult mutateResult : mutateResponse.getMutateResults()) {
    String outcome = mutateResult.getErrorList() == null ? "SUCCESS" : "FAILURE";
    System.out.printf("  Operation [%d] - %s%n", mutateResult.getIndex(), outcome);
  }
} else {
  System.out.println("No results available for download.");
}

The batch job's processingErrors field
will contain all errors encountered while pre-processing your uploaded
operations, such as input file corruption errors. The code below shows one
approach for processing such errors.

if (batchJob.getProcessingErrors() != null) {
  int i = 0;
  for (BatchJobProcessingError processingError : batchJob.getProcessingErrors()) {
    System.out.printf(
        "  Processing error [%d]: errorType=%s, trigger=%s, errorString=%s, fieldPath=%s"
        + ", reason=%s%n",
        i++, processingError.getApiErrorType(), processingError.getTrigger(),
        processingError.getErrorString(), processingError.getFieldPath(),
        processingError.getReason());
  }
} else {
  System.out.println("No processing errors found.");
}

Using temporary IDs

A powerful feature of BatchJobService is that it supports the use of temporary IDs.
A temporary ID is a negative number (long) that allows operations in a batch job
to reference the result of an ADD operation from a previous operation in the same
batch job. Simply specify a negative number for the ID in the ADD operation for
the parent object, and then reuse that ID in subsequent operations for
dependent objects in the same batch job.

A common use case for temporary IDs is to create a complete campaign in a
single batch job. For example, you could create a single job containing ADD
operations that assign a temporary ID to each new parent operand and reference
that ID from its children. Such a sequence would:

  • Add a campaign with (temporary) ID -1
    • Add an ad group with temporary ID -2 for campaign -1
      • Add an ad group ad for ad group -2
      • Add multiple ad group criteria (keywords) for ad group -2
  • Apply a label to campaign -1
  • Add a campaign negative criterion (keyword) for campaign -1

Use a new temporary ID when creating a new object. If you don't, you will
receive a TaskExecutionError.TEMP_ID_ALREADY_USED error.
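The ID bookkeeping itself needs no API classes. As a minimal sketch (the class name here is hypothetical, not part of the AdWords client library), an allocator for the descending negative sequence could look like:

```java
import java.util.concurrent.atomic.AtomicLong;

/** Hands out the descending negative numbers (-1, -2, -3, ...) used as temporary IDs. */
final class TempIdAllocator {
  private final AtomicLong counter = new AtomicLong(-1L);

  /**
   * Returns a fresh, unused temporary ID. Reusing an ID in a second ADD
   * operation would trigger TaskExecutionError.TEMP_ID_ALREADY_USED.
   */
  long next() {
    return counter.getAndDecrement();
  }
}
```

You would assign `alloc.next()` as the ID of each new parent object (for example, a campaign) and reuse that same value in the child operations (ad groups, criteria) that reference it.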

Canceling a batch job

A BatchJob can be canceled if its
status is
AWAITING_FILE or ACTIVE. Simply issue a
mutate() request and pass a BatchJobOperation with:

  • operator = SET
  • operand = a BatchJob with:
    • id = the batch job ID
    • status = CANCELING

If the status of the BatchJob at the time of the above request is neither
AWAITING_FILE nor ACTIVE, the request will fail.
Canceling a job is an asynchronous operation, so after your mutate() request,
poll the batch job status until it reaches
either CANCELED or DONE. Make sure you
download the results and check for errors
as well, since some of the operations in your job may have been attempted
before the job was canceled.

Upload requirements

Non-incremental uploads

Each client library utility provides a convenience method for uploading
operations in a single step. However, if you're not using a client library,
single-step (non-incremental) uploads are not supported; use the incremental
upload protocol described in the next section.

Incremental uploads

Incremental uploads allow you to send multiple upload requests to the batch job
uploadUrl. Your job will only start executing once you've uploaded the last
set of operations.

Exchange the BatchJob uploadUrl for a resumable upload URL

The upload process follows the Google Cloud Storage guidelines for resumable
uploads with the XML API.

The uploadUrl
of your BatchJob must first be exchanged for a resumable upload URL.
To exchange your uploadUrl for a resumable upload URL,
send a request to the uploadUrl that meets the following
specifications.
Request attributes
Request method POST
URL Upload URL returned by BatchJobService.mutate
Content-Type HTTP header application/xml
Content-Length HTTP header 0
x-goog-resumable HTTP header start
Request body no request body required

If your request is successful, the returned response will have a status
of 201 Created, and a Location header whose value
is the resumable upload URL.
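For illustration, the required initialization headers can be assembled ahead of the HTTP call; the helper class below is hypothetical and simply mirrors the request attributes table above (the actual POST would be sent with any HTTP client, reading the resumable upload URL from the `Location` header of the `201 Created` response):

```java
import java.util.LinkedHashMap;
import java.util.Map;

/** Builds the headers for the POST that exchanges the uploadUrl for a resumable upload URL. */
final class ResumableUploadInit {
  static Map<String, String> initHeaders() {
    Map<String, String> headers = new LinkedHashMap<>();
    headers.put("Content-Type", "application/xml");
    headers.put("Content-Length", "0"); // no request body
    headers.put("x-goog-resumable", "start");
    return headers;
  }
}
```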

Upload operations to the resumable upload URL

Once you have the resumable upload URL, you can start uploading your operations.
Each request sent to the resumable upload URL must meet the following specifications.

Request attributes
Request method PUT
URL Resumable upload URL from the initialization step above.
Content-Type HTTP header application/xml
Content-Length HTTP header The number of bytes in the contents of the current request.
Content-Range HTTP header

Range of bytes in the request, followed by the total bytes. Total bytes will
be * for the first and intermediate requests, but should be
set to the final total bytes when sending the last request. For example:

  bytes 0-262143/*
  bytes 262144-524287/*
  bytes 524288-786431/786432

Request body The next chunk of the serialized operations XML.

Request body for the resumable upload URL

The BatchJobService will ultimately concatenate all of the XML uploaded to the uploadUrl and parse
it as a single request, so you must take care to include only the start and end mutate elements
on the first and last request, respectively.

In addition, since BatchJobService will parse the uploaded XML as a single document,
the request body for a single request need not contain a complete XML document. For example,
if you were uploading 524305 bytes (256K + 256K + 17), your requests could be as follows:

Request 1



Content length of 262144, where the t in the last line is the 262144th byte.

Request 2




Content length of 262144, where the e in the last line is the 262144th byte.

Request 3

... (padded to 262144 bytes)

Content length without padding of 17 bytes, where the closing > of the
final element is the 17th byte. Total content length with padding is 262144 bytes.
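The Content-Range values follow a mechanical pattern, so for illustration a small (hypothetical) formatter can produce them; the three header values shown earlier serve as its examples:

```java
/**
 * Formats a Content-Range header value for a resumable upload chunk.
 * Pass totalBytes as null (rendered as "*") for all but the last request;
 * set it to the final total on the last request.
 */
final class ContentRange {
  static String of(long firstByte, long lastByte, Long totalBytes) {
    return String.format("bytes %d-%d/%s",
        firstByte, lastByte, totalBytes == null ? "*" : totalBytes.toString());
  }
}
```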

Best practices

Consider these guidelines when using BatchJobService:

  • For better throughput, fewer large jobs are preferred over many smaller jobs.
  • When submitting multiple concurrent jobs for the same
    clientCustomerId, try to
    reduce the likelihood of jobs operating on the same objects at the same time, while
    maintaining large job sizes. Many unfinished jobs (with status of ACTIVE or CANCELING)
    that try to mutate the same set of objects may lead to deadlock-like conditions resulting
    in severe slow-down and even job failures.
  • Don't submit multiple operations that mutate the same object in the same job. The result will be unpredictable if you do so.
  • For better throughput, order uploaded operations by operation type. For
    example, if your job contains operations to add campaigns, ad groups, and ad group criteria,
    order the operations in your upload so that all of the CampaignOperations are first, followed
    by all of the AdGroupOperations, and finally all AdGroupCriterionOperations.
  • Do not poll the job status too frequently or you will risk hitting rate limit errors.
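As an illustration of the ordering guideline, a sort by type priority might look like the sketch below. The string type names stand in for the real Operation subclasses; production code would sort Operation instances by class instead:

```java
import java.util.Arrays;
import java.util.Comparator;
import java.util.List;

/** Sorts operations so parents precede children: campaigns, then ad groups, then criteria. */
final class OperationOrdering {
  private static final List<String> TYPE_ORDER = Arrays.asList(
      "CampaignOperation", "AdGroupOperation", "AdGroupCriterionOperation");

  /** Reorders the list in place by type priority (unknown types sort first, since indexOf is -1). */
  static void sortByType(List<String> operationTypes) {
    operationTypes.sort(Comparator.comparingInt(TYPE_ORDER::indexOf));
  }
}
```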

When updating product partition trees using BatchJobService, the following
restrictions apply:

  1. If a list of operations on a product partition tree results in a structurally
    invalid product partition tree (for example, a node is subdivided without
    creating an "other" node), the entire list of operations on that
    product partition tree will fail.

  2. Operations that don't introduce structural changes in a product partition
    tree (for example, bid changes on an existing node) are executed independently.

  3. A BatchJob can contain operations that introduce product partition tree
    structural changes for more than two ad groups. The
    two ad group limit does not
    apply to batch jobs.

  4. When removing a product partition node, set the AdGroupCriterion
    object's criterion
    field to a ProductPartition
    instance. Setting this field to a base Criterion
    instance will cause the operation to fail with an
    AdGroupCriterionError.CONCRETE_TYPE_REQUIRED error.

Limits

  • At any given time, a Google Ads account can have up to 1 GB of uploaded
    operations across all of its batch jobs that have not yet completed. Once
    your account reaches this limit, you will get an error
    with reason DISK_QUOTA_EXCEEDED if you try to add new batch jobs. If you
    encounter this error, wait until the size of pending uploaded operations falls
    below the limit before creating new jobs.

  • Test accounts are limited to 250 new jobs per day.

  • Batch jobs are retained for 60 days after creation. After that, they won't
    be retrievable through either get() or query() requests.

Code examples

The following client libraries contain a complete code example that illustrates all of
the features above.
