
Import / Export Tool

Importing and exporting your FieldAware data is now easier than ever. Read on to find out more.

Written by Product Education Team
Updated over a year ago

What’s new in this release?

This release contains a major improvement to how you import and export your data to and from the FieldAware platform.

Our development team has reengineered and overhauled how the Import / Export Tool works. Not only does the tool perform much faster, it also gives you greater flexibility and better control of your data, enhancing your data migration workflows.

In addition, the tool provides a historical review of your data imports in case you need to rectify or audit a previous import operation, as well as a few other features and improvements that we are sure will delight you, such as:

  • Out-of-the-box templates to support your import or export

  • Customised field mapping of your CSV data to FieldAware entities

  • Auto mapping & saved mappings to speed up your import workflows

  • Custom fields support

  • Improved validation & error reporting

Fig 1: Step 3 of the Import workflow: Mapping CSV columns when importing a file of customers into FieldAware


Overview

The FieldAware Import / Export Tool allows you to import and export records to and from your FieldAware business. To do so, you will need to upload a CSV file into the tool, which will then guide you through the process of custom-mapping your CSV columns to data keys in the FieldAware API schema.

We have provided a help centre within the Import Tool itself that explains our API schema, as well as any validation criteria your data needs to meet for a successful mapping and upload.

Note: You can create new records or update existing records via the Import / Export Tool.


Templates

You can download template CSV files for the supported entities from within the Import Tool. The Templates section is accessible from the main navigation bar. Each template includes the default attributes for the entity, plus its corresponding custom fields (if any).
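
As a purely illustrative example, a downloaded template for a customer entity might contain a header row along these lines (the exact columns depend on the entity and on your business's custom fields; "Region" here stands in for a hypothetical custom field):

Name,Phone Number,Street Name,Town/City,Country,Region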


Custom Mapping

The FieldAware Import Tool imports records into a FieldAware business via the FieldAware API. When a CSV file is uploaded, the next step is to map each column name to a data field in the API schema for the selected entity.

The CSV file you upload must have a header row with unique column names. Otherwise, the columns can't be mapped to data keys in the schema.

For illustration purposes, consider the mock schema below:

{
  "name": "string",
  "tel": "number",
  "archived": false,
  "address": {
    "street": "string",
    "city": "string",
    "country": "string"
  },
  "device": {
    "make": "iPhone",
    "model": "6S"
  }
}
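
A CSV file with a valid header row that maps onto this mock schema might look like this (the column names and data are illustrative):

Name,Phone Number,Street Name,Town/City,Country
Jane Smith,5550100,1 High Street,London,UK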


Flat Schema Keys

Simple values on the API schema are mapped directly to column names. For example, the data field name could be mapped to the column "Name", and tel could be mapped to "Phone Number" or "Mobile".

Nested Schema Keys

Some values in the API schema have a nested structure. We can map these data keys to column names using dot notation in the mapping.

For example, the data field address.street could be mapped to the CSV column "Street Name", address.city could be mapped to "Town/City", and address.country could be mapped to "Country".
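
To make the idea concrete, here is a minimal sketch in Python of what a dot-notation mapping produces; the function and the sample data are our own illustration, not the Import Tool's actual implementation:

def build_payload(row, mapping):
    # Build a nested API payload from a flat CSV row using
    # dot-notation schema keys (e.g. "address.city").
    payload = {}
    for schema_key, column in mapping.items():
        target = payload
        parts = schema_key.split(".")
        for part in parts[:-1]:          # walk/create nested objects
            target = target.setdefault(part, {})
        target[parts[-1]] = row[column]  # assign the leaf value
    return payload

mapping = {
    "name": "Name",
    "tel": "Phone Number",
    "address.street": "Street Name",
    "address.city": "Town/City",
    "address.country": "Country",
}
row = {
    "Name": "Jane Smith",
    "Phone Number": "5550100",
    "Street Name": "1 High Street",
    "Town/City": "London",
    "Country": "UK",
}
print(build_payload(row, mapping))
# -> {'name': 'Jane Smith', 'tel': '5550100',
#     'address': {'street': '1 High Street', 'city': 'London', 'country': 'UK'}}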


Auto mapping & saved mappings

Auto Mapping

When you first upload a CSV for an entity, the Import Tool will try to automatically map the CSV column names to the entity schema keys.

For example, name in the schema will map automatically to a column called "Name" if it finds one, and someLongKey will automatically map to the column "Some Long Key" if it exists in the CSV.

These automatic mappings can be easily overwritten or removed using the Unmap or Unmap All buttons.
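
For a rough idea of how such name-based matching can work, here is an illustrative Python sketch (our own, not the tool's actual algorithm):

import re

def to_column_name(schema_key):
    # Split a camelCase key into Title Case words:
    # "someLongKey" -> "Some Long Key"
    return re.sub(r"(?<!^)(?=[A-Z])", " ", schema_key).title()

def auto_map(schema_keys, csv_columns):
    # Map each schema key to a CSV column whose name matches
    # the key's Title Case form (case-insensitive).
    columns = {c.lower(): c for c in csv_columns}
    mapping = {}
    for key in schema_keys:
        candidate = to_column_name(key).lower()
        if candidate in columns:
            mapping[key] = columns[candidate]
    return mapping

print(auto_map(["name", "someLongKey"], ["Name", "Some Long Key"]))
# -> {'name': 'Name', 'someLongKey': 'Some Long Key'}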

Saved Mappings

User-defined mappings of schema keys to CSV column names are saved for the next import session, so you don't have to keep mapping the same columns repeatedly.
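
Conceptually, a saved mapping is just a record of schema keys to column names per entity, along these lines (illustrative only, not how the tool stores it):

{
  "customer": {
    "name": "Name",
    "tel": "Phone Number",
    "address.city": "Town/City"
  }
}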

If you wish, you can clear these mappings using the menu on the right-hand side of the Navigation Bar at the top of the page.


Custom fields

Although Custom Fields are sent to the API as nested keys under the parent key customFields (see Mapping CSV Columns), we represent Custom Fields in the Import Tool as flat data fields (i.e. not nested fields).

Custom Fields appear in the mapping table after regular fields, and can be identified by the CF badge next to the name.
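
For example, a custom field that appears as a single flat field in the mapping table, say a hypothetical "Region" custom field, would end up nested in the API payload roughly like this:

{
  "name": "Jane Smith",
  "customFields": {
    "Region": "South East"
  }
}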


Improved validation & error reporting

Each data field in the API schema for the entity has an associated validation function. For example, a data field may require the value to be a number, or a string that represents a boolean such as 'Yes', 'No', 'yes', 'no'. Some validators have max length requirements, and some expect exact values in the CSV.

When you map a schema key to a CSV column and parse the CSV file, each value in the column is validated using the validation function.

Any values that fail the validation will appear as errors which you can view in the CSV Validation Preview modal. CSV rows with validation errors will not be sent to the API during the import run.
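
As an illustration of how such per-field validation works, here is a minimal Python sketch (the field names and rules are examples, not the tool's actual validators):

def is_number(value):
    # Accept digits with an optional single decimal point.
    return value.replace(".", "", 1).isdigit()

def is_yes_no(value):
    # Accept strings that represent a boolean.
    return value.lower() in ("yes", "no")

def max_length(limit):
    # Build a validator that enforces a maximum string length.
    return lambda value: len(value) <= limit

validators = {
    "tel": is_number,
    "archived": is_yes_no,
    "name": max_length(50),
}

def validate_row(row, mapping):
    # Return (schema_key, bad_value) pairs for values that fail
    # their field's validation function.
    errors = []
    for schema_key, column in mapping.items():
        check = validators.get(schema_key)
        if check and not check(row[column]):
            errors.append((schema_key, row[column]))
    return errors

print(validate_row({"Phone Number": "not-a-number"}, {"tel": "Phone Number"}))
# -> [('tel', 'not-a-number')]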


Handling Failed Import Requests

During the import run, when the records are being sent to the API, some requests may fail or be rejected by the API. This can happen due to network connectivity issues, or because the payload is not valid for the API schema.

Failed records will appear in the import report summary and table. You can view the request payload data that was sent and, in most cases, the API error that caused the request to fail.

Exporting Errors

After the import run, you can export the failed records to a new CSV file. This file contains all the rows that failed validation, as well as the rows that failed during the import run.

The exported error CSV file includes an additional column with the reason for the import failure. In the case of a failure over the network during import, the request error is shown in the column. In the case of a validation failure before import, each data field validation error message is shown. There may be one or several validation errors for the invalid row.
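
As an illustration, an exported error file might look something like this, with the extra column carrying the failure reason (the column names and messages are examples only):

Name,Phone Number,Country,Import Errors
Jane Smith,not-a-number,UK,tel: value must be a number
Acme Ltd,5550100,UK,Request failed: 400 Bad Request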
