
Kibo Import/Export API Developer Guide

Introduction to Import/Export

Understand import workflows and file formats

Understanding Import/Export in Kibo

In Kibo, the Import/Export domain is the primary tool for asynchronous, bulk data management. Unlike standard REST API calls that create or update a single record in real-time, the Import/Export system is designed to handle thousands or even millions of records at once from a file (like a CSV). What makes Kibo’s approach different is its job-based, asynchronous workflow. You don’t just “upload data”; you follow a distinct, multi-step process:
  1. Upload a file to a secure, temporary storage location in Kibo.
  2. Create an Import Job, telling Kibo to process that file.
  3. Monitor the job’s status as it runs in the background.
This design is robust and scalable, preventing API timeouts and ensuring that large data operations don’t impact storefront performance. It’s the standard Kibo pattern for data migration, bulk updates, and integrating with external systems that work with flat files.

How This Domain Fits Into Kibo

The Import/Export domain is a utility that supports almost every other part of the Kibo platform. It’s the backbone for large-scale data operations.
  • Catalog: Used to import new products, update pricing for thousands of SKUs, or add inventory information from a supplier’s feed.
  • Orders: Used to export daily orders for an external reporting or analytics system.
  • Customers: Used for migrating customer lists from a previous e-commerce platform.
  • Locations: Used to bulk-create or update a list of retail store locations.

Prerequisites

  • Kibo API credentials and basic setup
  • Node.js 16+ with TypeScript and the ability to read local files (fs module)
  • Familiarity with REST APIs and the concept of asynchronous jobs

What You’ll Learn

After completing this guide, you’ll understand:
  • Kibo’s asynchronous, job-based approach to bulk data.
  • The key patterns for uploading files and managing import/export jobs (documented at apidocs.kibocommerce.com).
  • The complete workflow for importing products and exporting orders.
  • How to avoid the most common beginner mistakes, like trying to process an import in a single, synchronous call.
  • How to read and navigate the official Import/Export and File Management API documentation effectively.

Kibo Import/Export Fundamentals

How Kibo Organizes Import/Export Data

The system is built around a few core concepts that manage the workflow:
  • File: A data file (usually a CSV, or a ZIP containing CSVs) that you upload to Kibo’s temporary storage. The key piece of information you get back after an upload is the uploaded file’s id, returned in a DropLocation object.
  • ImportJob / ExportJob: A record that tracks the status of your bulk data operation. It has a unique id and a status field; the terminal values you will check for in code are complete and errored. This is the central object you’ll interact with to monitor your operation.
  • ImportSettings / ExportSettings: A JSON object you provide when creating a job. It tells Kibo what type of data you’re working with (e.g., Products, Orders), where to find the input file (for imports), and other configuration details.
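For orientation, these concepts can be sketched as simplified TypeScript shapes. These are illustrative only, not the SDK's generated models; the field names follow the example response shown later in this guide.

```typescript
// Simplified, illustrative shapes for the core Import/Export concepts.
// Not the SDK's exact model definitions.
interface UploadedFile {
    id: string;                           // returned after upload; links the file to a job
    locationType: "internal" | "external";
    fileName: string;                     // e.g. "Products.zip"
    fileType: "import" | "export" | "log";
}

interface ImportResource {
    format: string;                       // e.g. "Legacy"
    resource: string;                     // e.g. "Products"
    deleteOmitted: boolean;
}

interface ImportJobSketch {
    id?: string;                          // assigned by Kibo on creation
    name: string;
    domain: string;                       // e.g. "catalog"
    status?: string;                      // terminal values: "complete" or "errored"
    isComplete?: boolean;
    resources: ImportResource[];
    files: UploadedFile[];
}
```

Treat these as mental models while reading the response payloads below; the SDK ships its own generated types under @kibocommerce/rest-sdk/clients/ImportExport/models.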

Key Kibo Patterns You’ll See Everywhere

Authentication Pattern: The Kibo SDK manages authentication for you. You create a single Configuration object containing your credentials (Client ID, Shared Secret, etc.). This object is then passed to the constructor of specific API clients (e.g., new ImportApi(configuration)).

Request/Response Structure: When you request the status of a job, you get back the complete job object. When requesting a list of jobs, the response is paginated, with the data in an items array.
// Example response body for GET /platform/data/import/:id
    {
        "name": "Products Import",
        "id": "b16e3e69-b002-4274-a24e-5daf8eb4377d",
        "requester": "3264d539c46642b38f2abcbd515adcbf",
        "domain": "catalog",
        "resources": [
            {
                "format": "Legacy",
                "resource": "Products",
                "deleteOmitted": false,
                "status": "complete",
                "isComplete": true,
                "stateDetails": "Duration: 0.0498721 seconds",
                "allowSyscalcValueUpdates": false
            }
        ],
        "isComplete": true,
        "auditInfo": {
            "updateDate": "2025-10-03T18:15:37.734Z",
            "createDate": "2025-10-03T18:15:37.051Z"
        },
        "tenant": 31271,
        "status": "complete",
        "files": [
            {
                "id": "e5d7cc84-d0e7-455b-be8a-1a0276b19382",
                "locationType": "internal",
                "fileName": "Products.zip",
                "fileType": "import"
            },
            {
                "id": "a4a16338-c5a9-4c1a-a9f1-a980c7188a61",
                "locationType": "internal",
                "fileName": "products_import_log.csv",
                "fileType": "log"
            }
        ]
    }
Error Handling Approach: API call failures (like providing a bad job ID) throw a standard structured error. A failed job is different: the API call to fetch the job will succeed (HTTP 200), but the status field in the response will be errored. You must check this field to determine the outcome of the operation.

API Documentation Reference: Throughout this guide, we’ll reference specific endpoints. Find complete specs at: /api-overviews/openapi_importexport_overview
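The failed-job-versus-failed-call distinction can be captured in a small helper. This is a sketch: the JobSummary shape mirrors the response fields shown above, and the helper name is ours, not the SDK's.

```typescript
// Minimal view of the job fields we need, mirroring the example response above.
interface JobSummary {
    id?: string;
    status?: string;      // terminal values: "complete" or "errored"
    isComplete?: boolean;
}

// An HTTP 200 only means the job record was found; the job itself may still
// have failed. Success means the job reached the terminal status "complete".
function didJobSucceed(job: JobSummary): boolean {
    return job.isComplete === true && job.status === "complete";
}
```

Use this after fetching a job's status: a thrown error means the API call itself failed, while a false return here means the job ran and did not succeed.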

Common Import/Export Workflows

  1. Product Data Migration: Uploading a CSV of products from a previous platform and starting an import job to create them in Kibo.
  2. Bulk Price List Updates: Importing a CSV containing just pricing data to update prices for thousands of items without affecting other product data.
  3. Bulk Customer Attribute Updates: Importing a CSV containing just customer attribute data for mass updates to customer data.
Let’s explore the primary import workflow step by step.

Creating an Import Job: The Kibo Way

The import process is a perfect example of Kibo’s asynchronous philosophy. It’s a two-part process: you first upload the file, then you create the job.

When You Need This

You need this workflow whenever you want to create or update a large number of records in Kibo from a file, such as migrating products, updating inventory, or adding customer accounts.

Part 1: Uploading the Data File

API Documentation Reference:
  • Endpoint: POST /platform/data/files
  • Method: POST
  • API Docs: Upload Files
Understanding the Kibo Approach: Kibo intentionally separates the file upload from the job creation, which is more robust for large payloads. You upload your potentially large file to a dedicated, temporary storage service. Once the upload is complete, Kibo returns a DropLocation containing a unique file id. That id is the key that links your file to the import job you’re about to create.

Step-by-Step Implementation (Part 1)

// Essential imports. We need clients for both file management and import/export jobs.
import { Configuration } from "@kibocommerce/rest-sdk";
import { FilesApi, ImportApi } from "@kibocommerce/rest-sdk/clients/ImportExport";
import { DropLocation, ImportJob } from "@kibocommerce/rest-sdk/clients/ImportExport/models";
import * as fs from "fs";
import * as path from "path";

const configuration = new Configuration({
    tenantId: process.env.KIBO_TENANT_ID || 'test-tenant',
    siteId: process.env.KIBO_SITE_ID || 'test-site',
    clientId: process.env.KIBO_CLIENT_ID || 'test-client',
    sharedSecret: process.env.KIBO_SHARED_SECRET || 'test-secret',
    authHost: process.env.KIBO_AUTH_HOST || 'https://home.mozu.com',
});
// This function takes a local file path and uploads it to Kibo.
async function uploadFileToKibo(localFilePath: string): Promise<DropLocation> {
    const fileBasedClient = new FilesApi(configuration);
  
    try {

        const zipFileContent = await fs.promises.readFile(localFilePath);
        // The upload method takes the destination file name and the file content as a buffer.
        const uploadedFile = await fileBasedClient.upload({
            fileName: path.basename(localFilePath),
            body: zipFileContent
        });

        console.log("File uploaded successfully.");
        // The DropLocation response is what we need for the next step.
        return uploadedFile;
    } catch (error) {
        console.error("API Error uploading file:", JSON.stringify(error, null, 2));
        throw error;
    }
}

Part 2: Creating and Monitoring the Import Job

API Documentation Reference: see the Import/Export specs at /api-overviews/openapi_importexport_overview.
Understanding the Kibo Approach: Once the file is uploaded, you create the import job. This call is lightweight and returns almost instantly; it simply puts your job into a queue. Kibo returns a job id. It is then your responsibility to periodically check the status of this job until it is complete.

Step-by-Step Implementation (Part 2)

import { ImportJob } from "@kibocommerce/rest-sdk/clients/ImportExport/models";

// This function creates an import job and polls for its completion.
async function createAndMonitorImportJob(importJobName: string, uploadedFile: DropLocation): Promise<ImportJob> {
    const importClient = new ImportApi(configuration);
    let jobId: string;

    const importJobConfig: ImportJob = {
      "name": importJobName,
      "tenant": 11111, // replace with your own tenant ID
      "domain": "catalog",
      "resources": [
          {
              "format": "Legacy",
              "resource": "Products",
              "deleteOmitted": false
          }
      ],
      "files": [
         uploadedFile
      ]
    }

    /**
     * There is an optional `contextOverride` property that overrides the context from the URL and site headers.
     * 
     *  "contextOverride": {
            "locale": "en-CA",
            "currency": "CAD",
            "masterCatalog": 7,
            "catalog": 6,
            "site": 12345
        }
     */

    // --- Create the Job ---
    try {
        const job = await importClient.create({
           importJob: importJobConfig
        });
        jobId = job.id as string;
        console.log(`Successfully created import job with ID: ${jobId}`);
    } catch (error) {
        console.error("API Error creating import job:", JSON.stringify(error, null, 2));
        throw error;
    }

    // --- Monitor the Job ---
    console.log("Polling for job completion...");
    while (true) {
        const jobStatus = await importClient.get({ id: jobId });

        if (jobStatus.status === "complete" || jobStatus.status === "errored") {
            console.log(`Job finished with status: ${jobStatus.status}`);
            if (jobStatus.status === 'errored') {
                console.error('Job failed. Check the Kibo Admin UI for details on the errors.');
            }
            return jobStatus;
        }

        // Wait for 15 seconds before checking again.
        await new Promise(resolve => setTimeout(resolve, 15000)); 
    }
}
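The while (true) loop above never gives up on a stuck job. Here is a sketch of a bounded-polling variant with a maximum attempt count. The fetchStatus callback stands in for a call like importClient.get({ id }) and is injected so the helper stays dependency-free; the names are ours, not the SDK's.

```typescript
// Bounded polling: give up after maxAttempts checks instead of looping forever.
async function pollWithLimit(
    fetchStatus: () => Promise<string | undefined>,
    maxAttempts: number,
    delayMs: number
): Promise<string> {
    for (let attempt = 1; attempt <= maxAttempts; attempt++) {
        const status = await fetchStatus();
        // "complete" and "errored" are the terminal statuses used elsewhere in this guide.
        if (status === "complete" || status === "errored") {
            return status;
        }
        // Wait before the next check.
        await new Promise(resolve => setTimeout(resolve, delayMs));
    }
    throw new Error(`Job did not reach a terminal status after ${maxAttempts} attempts`);
}
```

You could call this as `pollWithLimit(() => importClient.get({ id: jobId }).then(j => j.status), 40, 15000)` to poll every 15 seconds for up to 10 minutes.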

Multiple Real-World Examples

Example 1: The Complete Import Workflow. This combines the two parts into a single, reusable function.
async function runFullImport(localFilePath: string) {
    try {
        // Step 1: Upload the file
        const uploadedFile = await uploadFileToKibo(localFilePath);

        // Step 2: Create and monitor the job
        const finalJobStatus = await createAndMonitorImportJob("Products Import", uploadedFile);

        console.log("Import process complete.", finalJobStatus);
    } catch (error) {
        console.error("The import workflow failed.", error);
    }
}

// To run it:
// runFullImport("./my-product-data.zip");
Example 2: Downloading an Exported File
import * as fs from 'fs';
import { pipeline } from 'stream/promises';

async function downloadKiboFile(fileId: string, localDestination: string): Promise<void> {
    const fileBasedClient = new FilesApi(configuration);
    
    try {
        const fileStream = await fileBasedClient.download({ id: fileId });
        const writer = fs.createWriteStream(localDestination);

        await pipeline(
            (fileStream as any), 
            writer
        );
        
    } catch (error) {
        // Clean up any partially written file; ignore errors if nothing was written.
        try {
            await fs.promises.unlink(localDestination);
        } catch (unlinkError) {
            // nothing to clean up
        }
        
        throw error;
    }
}
Example 3: Getting a List of Recent Import Jobs
async function getRecentImportJobs() {
    const importExportClient = new ImportApi(configuration);
    try {
        const jobs = await importExportClient.list({ pageSize: 10 });
        // List responses are paginated; the job records are in the items array.
        console.log("Last 10 import jobs:", jobs.items?.length ?? 0);
        return jobs;
    } catch (error) {
        console.error("Failed to get import jobs:", error);
        throw error;
    }
}

Integrating Import/Export with Other Kibo Domains

Import/Export + PIM Integration

This is a classic use case. An external Product Information Management (PIM) system can be configured to generate a CSV file of product updates on a schedule. A separate process can then pick up this file and use the import workflow described here to keep the Kibo catalog synchronized.

Troubleshooting Your Import/Export Implementation

Reading Job Status

Remember, the API call to get a job’s status can succeed (HTTP 200) even though the job itself failed. Always check the status property in the response body. If the status is errored, log into the Kibo Admin UI, navigate to System > Import/Export, and view the job details to see the specific row-by-row errors.

Common Error Codes for the API calls themselves:
  • JOB_NOT_FOUND: The importJobId or exportJobId you provided is incorrect.
  • VALIDATION_ERROR: The settings object you provided when creating the job was malformed or missing required fields.
  • UNAUTHORIZED: Your API credentials do not have permission to access the Import/Export system.
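One way to act on these codes is a small lookup helper. This is a sketch only: the exact shape of the error object returned by the SDK may differ, so treating the code as a plain string parameter is an assumption.

```typescript
// Maps the common API error codes listed above to actionable guidance.
// The string-code parameter is an assumption about the SDK's error shape.
function describeApiError(errorCode: string): string {
    switch (errorCode) {
        case "JOB_NOT_FOUND":
            return "Check the job ID — it may be mistyped or belong to another tenant.";
        case "VALIDATION_ERROR":
            return "Review the settings object for missing or malformed fields.";
        case "UNAUTHORIZED":
            return "Verify your API credentials include Import/Export permissions.";
        default:
            return `Unrecognized error code: ${errorCode}`;
    }
}
```

You might call this from a catch block to log a friendlier message alongside the raw error payload.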

Common Development Issues

Issue 1: My import job failed, but the API call didn’t throw an error.
  • Why it happens: This is the expected behavior of an asynchronous system. The API call to create the job was successful, but the job failed later during processing.
  • How to fix it: Your monitoring logic must check for jobStatus.status === "errored". When this happens, the root cause is almost always a problem with the data file itself.
  • How to avoid it: Before attempting an API-based import, try uploading your CSV file manually in the Kibo Admin UI. The UI provides direct feedback on formatting errors. Ensure your file’s columns and data types perfectly match the templates provided by Kibo.
Issue 2: The file upload call is failing.
  • Why it happens: The uploaded file must be a compressed ZIP file. Another common reason is that the file content (body) is not being sent correctly as a stream or buffer.
  • How to fix it: Verify the file is zipped before uploading, and make sure you pass the content to the SDK as a Buffer (e.g., from fs.promises.readFile) or a readable stream (e.g., from fs.createReadStream()).