Kibo Import/Export API Developer Guide
Introduction to Import/Export
Understand import workflows and file formats
Understanding Import/Export in Kibo
In Kibo, the Import/Export domain is the primary tool for asynchronous, bulk data management. Unlike standard REST API calls that create or update a single record in real-time, the Import/Export system is designed to handle thousands or even millions of records at once from a file (like a CSV). What makes Kibo’s approach different is its job-based, asynchronous workflow. You don’t just “upload data”; you follow a distinct, multi-step process (sketched after this list):
- Upload a file to a secure, temporary storage location in Kibo.
- Create an Import Job, telling Kibo to process that file.
- Monitor the job’s status as it runs in the background.
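The sketch below shows the shape of that three-step flow in TypeScript. The helper names (uploadImportFile, createImportJob, waitForJob) are placeholders invented for this overview, not Kibo SDK methods; Parts 1 and 2 later in this guide sketch how each step maps to the documented endpoints.

```typescript
// Overview sketch only -- the helpers are declared, not implemented here.
declare function uploadImportFile(localPath: string): Promise<string>; // returns the remote filePath
declare function createImportJob(filePath: string): Promise<string>;   // returns the job id
declare function waitForJob(jobId: string): Promise<string>;           // resolves with the final status

async function runBulkImport(localCsvPath: string): Promise<void> {
  // 1. Upload the file to Kibo's temporary file storage.
  const filePath = await uploadImportFile(localCsvPath);

  // 2. Create an import job telling Kibo to process that file.
  const jobId = await createImportJob(filePath);

  // 3. Monitor the job as it runs in the background.
  const finalStatus = await waitForJob(jobId);
  console.log(`Import job ${jobId} finished with status: ${finalStatus}`);
}
```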
How This Domain Fits Into Kibo
The Import/Export domain is a utility that supports almost every other part of the Kibo platform. It’s the backbone for large-scale data operations.
- Catalog: Used to import new products, update pricing for thousands of SKUs, or add inventory information from a supplier’s feed.
- Orders: Used to export daily orders for an external reporting or analytics system.
- Customers: Used for migrating customer lists from a previous e-commerce platform.
- Locations: Used to bulk-create or update a list of retail store locations.
Prerequisites
- Kibo API credentials and basic setup
- Node.js 16+ with TypeScript and the ability to read local files (the fs module)
- Familiarity with REST APIs and the concept of asynchronous jobs
What You’ll Learn
After completing this guide, you’ll understand:
- Kibo’s asynchronous, job-based approach to bulk data (based on official API specs).
- The key patterns for uploading files and managing import/export jobs (verified from apidocs.kibocommerce.com).
- The complete workflow for importing products and exporting orders.
- How to avoid the most common beginner mistakes, like trying to process an import in a single, synchronous call.
- How to read and navigate the official Import/Export and File Management API documentation effectively.
Kibo Import/Export Fundamentals
How Kibo Organizes Import/Export Data
The system is built around a few core concepts that manage the workflow:
- File: A data file (usually a CSV, or a ZIP containing CSVs) that you upload to Kibo’s temporary storage. The key piece of information you get back after an upload is the filePath.
- ImportJob/ExportJob: A record that tracks the status of your bulk data operation. It has a unique id and a status field (e.g., Pending, Processing, Succeeded, Failed). This is the central object you’ll interact with to monitor your operation.
- ImportSettings/ExportSettings: A JSON object you provide when creating a job. It tells Kibo what type of data you’re working with (e.g., Products, Orders), where to find the input file (for imports), and other configuration details.
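As a rough mental model, the concepts above can be expressed as TypeScript types. These shapes are illustrative only; apart from id, status, and the ideas called out above (resource type, file path), the exact field names come from the OpenAPI spec, not from this sketch.

```typescript
// Illustrative shapes -- confirm exact field names against the Import/Export OpenAPI spec.
type JobStatus = "Pending" | "Processing" | "Succeeded" | "Failed";

interface ImportJob {
  id: string;        // unique identifier for the job
  status: JobStatus; // the field you check to learn the outcome
}

interface ImportSettings {
  name: string;         // a label for the job (assumed field name)
  resourceType: string; // the kind of data, e.g. "Products" or "Orders" (assumed field name)
  filePath: string;     // the remote path returned by the file upload
}
```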
Key Kibo Patterns You’ll See Everywhere
Authentication Pattern: The Kibo SDK manages authentication for you. You create a single Configuration object containing your credentials (Client ID, Shared Secret, etc.). This object is then passed to the constructor of specific API clients (e.g., new ImportExportApi(configuration)).
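A minimal sketch of that pattern is shown below. The package name and the exact Configuration options vary by SDK version, so treat the identifiers here as assumptions and copy the real ones from the SDK’s documentation.

```typescript
// Sketch of the SDK authentication pattern; identifiers are illustrative.
import { Configuration, ImportExportApi } from "@kibocommerce/rest-sdk"; // assumed package name

const configuration = new Configuration({
  tenantId: process.env.KIBO_TENANT_ID,
  clientId: process.env.KIBO_CLIENT_ID,         // your application's Client ID
  sharedSecret: process.env.KIBO_SHARED_SECRET, // your application's Shared Secret
});

// The same configuration object is passed to each API client you need.
const importExportApi = new ImportExportApi(configuration);
```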
Request/Response Structure:
When you request the status of a job, you get back the complete job object. When requesting a list of jobs, the response is paginated with the data in an items array.
Note that a job that fails during processing does not produce an API error: a call to getImportJob will succeed (HTTP 200), but the status field in the response will be Failed. You must check this field to determine the outcome of the operation.
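The snippet below illustrates handling both response shapes. getImportJob and getImportJobs stand in for whichever SDK methods or HTTP calls you actually use, and the totalCount field is an assumption about the collection shape.

```typescript
// Response-shape sketch; the fetch functions are placeholders.
interface ImportJobSummary { id: string; status: string; }
interface ImportJobCollection { items: ImportJobSummary[]; totalCount?: number; } // totalCount assumed

declare function getImportJob(jobId: string): Promise<ImportJobSummary>;
declare function getImportJobs(): Promise<ImportJobCollection>;

async function inspectJobs(jobId: string): Promise<void> {
  // Single-job request: the complete job object comes back directly.
  const job = await getImportJob(jobId);
  console.log(`Job ${job.id} is currently ${job.status}`); // HTTP 200 does not mean the job succeeded

  // List request: the jobs live inside the paginated items array.
  const collection = await getImportJobs();
  collection.items.forEach((j) => console.log(`${j.id}: ${j.status}`));
}
```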
API Documentation Reference:
Throughout this guide, we’ll reference specific endpoints. Find complete specs at:
/api-overviews/openapi_importexport_overview
Common Import/Export Workflows
- Product Data Migration: Uploading a CSV of products from a previous platform and starting an import job to create them in Kibo.
- Bulk Price List Updates: Importing a CSV containing only pricing data to update pricing for thousands of items without affecting other product data.
- Bulk Customer Attribute Updates: Importing a CSV containing just customer attribute data for mass updates to customer data.
Creating an Import Job: The Kibo Way
The import process is a perfect example of Kibo’s asynchronous philosophy. It’s a two-part process: you first upload the file, then you create the job.
When You Need This
You need this workflow whenever you want to create or update a large number of records in Kibo from a file, such as migrating products, updating inventory, or adding customer accounts.
Part 1: Uploading the Data File
API Documentation Reference:
- Endpoint: POST /platform/data/files
- Method: POST
- API Docs: Upload Files
A successful upload returns the file’s remote path (filePath) and its fileId. This path is the key that links your file to the import job you’re about to create.
Step-by-Step Implementation (Part 1)
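Here is a sketch of the upload step using plain HTTP against the documented endpoint. It assumes the global fetch available in Node 18+ (or a polyfill on Node 16); the base URL, auth token, fileName query parameter, and response field names are placeholders you should replace with the values from the File Management API docs.

```typescript
import { readFile } from "fs/promises";

const KIBO_API_BASE = process.env.KIBO_API_BASE!; // your tenant's API host (placeholder)
const AUTH_TOKEN = process.env.KIBO_AUTH_TOKEN!;  // OAuth token obtained with your app credentials

// Uploads a local CSV/ZIP to Kibo's temporary file storage and returns the
// remote path that the import job will reference.
export async function uploadImportFile(localPath: string, remoteName: string): Promise<string> {
  const body = await readFile(localPath); // send the raw bytes, not a JSON wrapper

  const url = `${KIBO_API_BASE}/platform/data/files?fileName=${encodeURIComponent(remoteName)}`; // query param assumed
  const response = await fetch(url, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${AUTH_TOKEN}`,
      "Content-Type": "application/octet-stream",
    },
    body,
  });

  if (!response.ok) {
    throw new Error(`File upload failed: ${response.status} ${await response.text()}`);
  }

  // Field names below are assumptions -- check the actual upload response schema.
  const file = (await response.json()) as { filePath?: string; fileId?: string };
  return file.filePath ?? file.fileId!;
}
```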
Part 2: Creating and Monitoring the Import Job
API Documentation Reference:
- Endpoint: POST /platform/data/import
- Method: POST
- API Docs: Create Import Job
Creating the job returns immediately with a jobId. It is now your responsibility to periodically check the status of this job until it is complete.
Step-by-Step Implementation (Part 2)
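Below is a sketch of creating the job and polling it to completion, again using plain HTTP against the documented create endpoint. The settings field names, the GET path used for polling, and the exact status strings are assumptions based on this guide’s descriptions; confirm them against the Import/Export OpenAPI spec.

```typescript
const KIBO_API_BASE = process.env.KIBO_API_BASE!; // your tenant's API host (placeholder)
const AUTH_TOKEN = process.env.KIBO_AUTH_TOKEN!;
const headers = {
  Authorization: `Bearer ${AUTH_TOKEN}`,
  "Content-Type": "application/json",
};

// Creates an import job that points at a previously uploaded file.
export async function createImportJob(filePath: string): Promise<string> {
  const response = await fetch(`${KIBO_API_BASE}/platform/data/import`, {
    method: "POST",
    headers,
    body: JSON.stringify({
      name: "product-import",   // label for the job (illustrative)
      resourceType: "Products", // kind of data in the file (assumed field name)
      filePath,                 // remote path returned by the Part 1 upload
    }),
  });
  if (!response.ok) throw new Error(`Create import job failed: ${response.status}`);
  const job = (await response.json()) as { id: string };
  return job.id;
}

// Polls the job until it leaves its in-progress states, then returns the final status.
export async function waitForJob(jobId: string, pollMs = 10_000): Promise<string> {
  for (;;) {
    const response = await fetch(`${KIBO_API_BASE}/platform/data/import/${jobId}`, { headers }); // GET path assumed
    if (!response.ok) throw new Error(`Get import job failed: ${response.status}`);
    const job = (await response.json()) as { status: string };

    if (job.status !== "Pending" && job.status !== "Processing") {
      return job.status; // e.g. Succeeded or Failed -- the caller must check which
    }
    await new Promise((resolve) => setTimeout(resolve, pollMs));
  }
}
```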
Multiple Real-World Examples
Example 1: The Complete Import Workflow
This combines the two parts into a single, reusable function (the sketches in Parts 1 and 2 above show the individual pieces to combine).
Integrating Import/Export with Other Kibo Domains
Import/Export + PIM Integration
This is a classic use case. An external Product Information Management (PIM) system can be configured to generate a CSV file of product updates on a schedule. A separate process can then pick up this file and use the import workflow described here to keep the Kibo catalog synchronized.
Troubleshooting Your Import/Export Implementation
Reading Job Status
Remember, the API call to get a job’s status might succeed (HTTP 200), but the job itself could have failed. Always check the status property in the response body. If the status is Failed, you must log into the Kibo Admin UI, navigate to System > Import/Export, and view the job details to see the specific row-by-row errors.
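The two failure modes are easy to conflate, so the sketch below separates them: a thrown error means the API request itself failed, while a 200 response with a Failed status means the job ran and rejected your data. getImportJob is a stand-in for whichever status call you use (see Part 2).

```typescript
declare function getImportJob(jobId: string): Promise<{ id: string; status: string }>;

async function reportJobOutcome(jobId: string): Promise<void> {
  try {
    const job = await getImportJob(jobId);
    if (job.status === "Failed") {
      // The HTTP call succeeded, but the job did not. Row-level errors are only
      // visible in the Admin UI under System > Import/Export.
      console.error(`Job ${jobId} failed during processing -- inspect it in the Admin UI.`);
    } else {
      console.log(`Job ${jobId} status: ${job.status}`);
    }
  } catch (err) {
    // Only transport, authorization, or validation problems end up here.
    console.error("The status request itself failed:", err);
  }
}
```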
Common Error Codes for the API calls themselves:
- JOB_NOT_FOUND: The importJobId or exportJobId you provided is incorrect.
- VALIDATION_ERROR: The settings object you provided when creating the job was malformed or missing required fields.
- UNAUTHORIZED: Your API credentials do not have permission to access the Import/Export system.
Common Development Issues
Issue 1: My import job failed, but the API call didn’t throw an error.
- Why it happens: This is the expected behavior of an asynchronous system. The API call to create the job was successful, but the job failed later during processing.
- How to fix it: Your monitoring logic must check the job’s status field for a failure value (e.g., jobStatus.status === "errored"). When this happens, the root cause is almost always a problem with the data file itself.
- How to avoid it: Before attempting an API-based import, try uploading your CSV file manually in the Kibo Admin UI. The UI provides direct feedback on formatting errors. Ensure your file’s columns and data types perfectly match the templates provided by Kibo.
Issue 2: My file upload fails, or the import job can’t read the uploaded file.
- Why it happens: The uploaded file must be a compressed ZIP file. Another common reason is that the file content (body) is not being sent correctly as a stream or buffer.
- How to fix it: Double-check the construction of your remote filePath. Ensure you are using a library like Node.js’s fs.createReadStream() to provide the file content to the SDK method (see the sketch below).
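A short sketch of the fix: stream the actual file bytes from disk rather than passing a path string or JSON-wrapped content. uploadImportFile here is a hypothetical stream-accepting variant of the Part 1 upload helper; substitute whichever SDK method or HTTP call you actually use.

```typescript
import { createReadStream } from "fs";
import type { ReadStream } from "fs";

// Hypothetical upload helper that accepts a readable stream.
declare function uploadImportFile(file: ReadStream, remoteName: string): Promise<string>;

async function uploadZip(localZipPath: string): Promise<string> {
  // Stream the compressed ZIP from disk so large files are not buffered in memory.
  const fileStream = createReadStream(localZipPath);
  return uploadImportFile(fileStream, "products-import.zip");
}
```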

