Import
The Import module is designed to help users upload bulk data into the system quickly and accurately. Instead of entering records manually one by one, you can use CSV files to import large volumes of information such as customers, products, price lists, stock details, categories, orders, and more.
This feature is especially useful during store setup, system migration, bulk corrections, or periodic updates.
Why Use Import
Importing data provides major benefits:
- Saves time compared to manual entry
- Reduces repetitive data entry work
- Ensures consistent formatting and structure
- Makes large-scale updates easier
- Helps when migrating data from another software
- Allows administrative users to manage large datasets efficiently
Whether you are creating a new store, updating stock or pricing, adding new product lines, or switching from a legacy system, the Import module ensures fast and structured data entry.
Where to Find the Import Tool
Open the Back Office and navigate to:
Tools > Import
The Import page gives access to the following functions:
- Import
- Import Log
- Import Entity Log
Each section provides different controls, depending on whether you are performing a new import or reviewing past results.
Import Section Overview
On the Import screen, you can:
- Select the data entity you want to upload
- Choose the CSV file to import
- Map CSV fields to system fields
- Import the data
- Load existing mappings from a previous import
This section is the main working area for creating new imports.
Supported Import Entities
The system allows bulk import for multiple sections of the application. Depending on your business needs, you can upload:
- Customer
- Product
- Product Attribute
- Product Price List
- Product Category
- Stock Availability
- Product Volume Price
- Category
- Brand
- Order
Each import type has specific fields and formatting rules that must be followed when preparing your CSV file.
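As a rough illustration of what a prepared file can look like, the sketch below writes a small Customer CSV with Python's csv module. The column names (Customer_Name, Phone_Number, Email) are only examples; check the fields your system expects, ideally by exporting existing data first.

```python
import csv

# Hypothetical column names; confirm against your system's field list
# (or an export) before preparing a real file.
rows = [
    {"Customer_Name": "Jane Doe", "Phone_Number": "555-0101", "Email": "jane@example.com"},
    {"Customer_Name": "John Roe", "Phone_Number": "555-0102", "Email": "john@example.com"},
]

with open("customers.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["Customer_Name", "Phone_Number", "Email"])
    writer.writeheader()   # the first row must contain the column headers
    writer.writerows(rows)
```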
How to Perform an Import
Step 1: Open the Import Page
- Open Back Office
- Go to Tools
- Select Import
The Import screen will appear, displaying fields to select the entity and upload a file.
Step 2: Select the Data Entity
In the Entity box, type or select the type of data you want to import. For example:
- Customer for adding or updating customer records
- Product for adding product details
- Category for product categories
- Order for bulk order uploads
Step 3: Select the CSV File
Click the Choose File button and select a CSV file from your computer.
Before importing, make sure:
- The file is saved in CSV format
- The first row includes column headers
- The column names match the expected fields or are ready for mapping
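If you want to verify these points before uploading, a quick check like the following sketch can help. The file name is an example, and the checks are basic rather than exhaustive.

```python
import csv
from pathlib import Path

def quick_check(path: str) -> None:
    """Basic pre-upload sanity checks for a CSV file (sketch, not exhaustive)."""
    p = Path(path)
    if p.suffix.lower() != ".csv":
        raise ValueError(f"{path} is not saved as a .csv file")
    with p.open(newline="", encoding="utf-8") as f:
        reader = csv.reader(f)
        headers = next(reader, None)
        if not headers or any(h.strip() == "" for h in headers):
            raise ValueError("First row must contain non-empty column headers")
    print(f"OK: {len(headers)} columns found:", headers)

quick_check("customers.csv")
```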
Step 4: Load Existing Mapping (Optional)
If you have previously imported the same entity, you can save time by reusing past field mapping.
Click:
Get Existing Mapping
This retrieves saved field pairings from earlier imports, allowing a one-click setup.
Step 5: Map Fields
If no mapping is available, click the Map button.
You will now match:
- Columns from your CSV
- Fields stored in the system
For example:
- CSV column Customer_Name → System field Customer Name
- CSV column Phone_Number → System field Customer Phone
This step ensures the data is placed into the correct fields during import.
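The mapping itself is done on the Import screen, but if you prefer to rename your CSV columns in advance so they already match the system fields, a small script like this sketch can do it. The mapping dictionary simply mirrors the example above; the file names are hypothetical.

```python
import csv

# Illustrative mapping of CSV columns to system fields, mirroring the example above.
field_map = {
    "Customer_Name": "Customer Name",
    "Phone_Number": "Customer Phone",
}

with open("customers.csv", newline="", encoding="utf-8") as src, \
     open("customers_mapped.csv", "w", newline="", encoding="utf-8") as dst:
    reader = csv.DictReader(src)
    out_fields = [field_map.get(col, col) for col in reader.fieldnames]
    writer = csv.DictWriter(dst, fieldnames=out_fields)
    writer.writeheader()
    for row in reader:
        writer.writerow({field_map.get(k, k): v for k, v in row.items()})
```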
Step 6: Start Import
Click Import to begin the upload.
Depending on the number of rows in your file:
- Small files may import instantly
- Large imports may take some time to process
Once completed, the results will be available in the import logs.
Understanding Import Logs
Two logs are available for tracking results:
Import Log
Shows:
- Total records processed
- Successfully imported entries
- Failed rows
- Error messages
- Upload timestamp
- User who performed the import
This helps verify data quality and identify where corrections are needed.
Import Entity Log
This log is useful when working on repeated imports of a specific entity. It helps track import history for a single data type in more detail.
Preparing the CSV File
To avoid errors during import, keep these guidelines in mind:
- Always use CSV format
- Include column headings in the first row
- Do not use special characters in the header names
- Ensure number and date formats match the system standard
- Do not leave required fields empty
- Avoid merged cells, formulas, stray spaces, or hidden characters
If unsure about field names:
- First export data from the system
- Use the exported structure as a template
This helps create a reliable CSV with correctly named columns.
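Once you have an exported file to use as a template, you can compare your prepared CSV against it before uploading. The sketch below assumes hypothetical file names and a required-field list; adjust both to the entity you are importing.

```python
import csv

def validate_against_template(template_path: str, data_path: str, required: list[str]) -> list[str]:
    """Compare a prepared CSV against the header row of a file exported from the system."""
    problems = []
    with open(template_path, newline="", encoding="utf-8") as f:
        expected = next(csv.reader(f))          # header row of the exported template
    with open(data_path, newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f)
        unknown = [h for h in reader.fieldnames if h not in expected]
        if unknown:
            problems.append(f"Columns not found in export template: {unknown}")
        for i, row in enumerate(reader, start=2):   # row 1 is the header
            for field in required:
                if not (row.get(field) or "").strip():
                    problems.append(f"Row {i}: required field '{field}' is empty")
    return problems

for issue in validate_against_template("product_export.csv", "products.csv", ["Product Name"]):
    print(issue)
```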
Common Use Cases
New Store Setup
When launching a new store, you can quickly upload:
- All products
- Categories
- Brand lists
- Customers
This reduces the time required for initial configuration.
Bulk Stock Updates
If stock levels or purchase prices change regularly, the import tool provides a fast method to:
- Update stock quantities
- Change purchase and selling prices
- Modify tax and pricing rules
- Refresh availability data
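If your stock adjustments come from another system or a script, building the update file programmatically keeps it consistent. In this sketch the column names "SKU" and "Stock Quantity" are assumptions; use the fields your Stock Availability import actually expects (an export is the safest template).

```python
import csv

# Sketch: build a stock-update CSV from an in-memory list of adjustments.
adjustments = [
    {"SKU": "TSHIRT-M-BLU", "Stock Quantity": 42},
    {"SKU": "TSHIRT-L-BLU", "Stock Quantity": 17},
]

with open("stock_update.csv", "w", newline="", encoding="utf-8") as f:
    writer = csv.DictWriter(f, fieldnames=["SKU", "Stock Quantity"])
    writer.writeheader()
    writer.writerows(adjustments)
```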
Multi-System Migration
When switching from another eCommerce or POS system, you can export records into CSV and upload them into this system, greatly reducing onboarding time.
Troubleshooting Common Errors
Typical issues that may occur include:
- Column header not recognized
- Required field missing
- Incorrect data type (for example, text in a numeric field)
- Special characters causing format issues
- CSV saved in incorrect encoding
To fix these issues:
- Confirm column names match system fields
- Remove symbols, embedded commas, and formatting
- Save CSV in UTF-8 format
- Reimport after correcting data
The Import Log will show specific error messages to help you correct the file.
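If the log points to an encoding problem, re-saving the file in UTF-8 usually resolves it. The sketch below assumes the original export used Windows-1252; substitute the actual encoding of your source file.

```python
# Sketch: re-save a CSV in UTF-8 when the original was exported in another encoding.
with open("export_legacy.csv", encoding="cp1252", newline="") as src, \
     open("export_utf8.csv", "w", encoding="utf-8", newline="") as dst:
    dst.write(src.read())
```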
Best Practices
To ensure smooth imports, follow these recommendations:
- Test with a small CSV file before full import
- Keep backups of uploaded data
- Save mappings for repeated imports
- Validate data formatting before uploading
- Check the logs after every import
- Use Export first to understand field structure
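For the test-first recommendation, one simple approach is to copy the header plus the first few data rows into a separate file and run a trial import with it. The file names and row count below are only examples.

```python
import csv
from itertools import islice

# Sketch: copy the header plus the first 20 data rows into a small test file,
# so a trial import can be run before uploading the full dataset.
with open("products.csv", newline="", encoding="utf-8") as src, \
     open("products_test.csv", "w", newline="", encoding="utf-8") as dst:
    reader = csv.reader(src)
    writer = csv.writer(dst)
    for row in islice(reader, 21):   # header + 20 rows
        writer.writerow(row)
```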
Summary
The Import tool is a powerful feature that helps you upload bulk data efficiently. By preparing your CSV correctly, mapping fields accurately, and reviewing logs after import, you can manage large-scale data updates with minimal effort.