Mapping and Preparation

Mapping ensures that the data from the source file is imported to the correct destination in Gospel. Gospel supports importing multiple source files of the same extension (for example, XML files) to the specified record definitions. This is achieved by correctly mapping the source and destination components in the definitions.properties file.

This page provides a step-by-step guide to importing data with the LedgerBridge connector from the following sources: XML files, JSON files, CSV files, flat files, and a database (JDBC).

Before you begin:

  • Read through the Import data from External Data Sources and LedgerBridge Connector sections to familiarise yourself with the feature and the available parameters
  • Ensure that the LedgerBridge Connector is connected to the Gospel UI at all times during the import
  • The ID field must begin with a letter (A-Za-z)

  • Provide source files in a valid XML, CSV, JSON, or flat file format

Importing an XML file 

  1. Define the LedgerBridge Connector, file collector/API collector and XML transformer parameters in your application.properties file and place the file in the directory containing the JAR files (LedgerBridge connector, collector and transformer). 

    If the data source is a local file, configure the file collector; if the data source is available at an online location, configure the API collector.

  2. Provide the file containing the definitions (for example, definitions.properties). The definitions file contains the source-to-target mapping. During the import, the LedgerBridge connector maps the elements in the source to fields in the record definition. If the mapping is incorrect or missing, an error may be reported.

    record.<recorddefinition-id>.base = <element>
    #to automatically generate the id 
    record.<recorddefinition-id>.id = %AUTOID%
    #to use the id information from an element
    record.<recorddefinition-id>.id = <element>
    #to map element to fields
    record.<recorddefinition-id>.fields.<field-name> = <element>
    #to upload a file to a field
    record.<recorddefinition-id>.fields.<field-name> = FILENAME(<element>)
    #to return a new value by concatenating strings or elements
    record.<recorddefinition-id>.id = concat('<string>', <element>, <element>)
    • We support XPath transformation for extracting data from XML files. For more information, see extracting data from a file using XPath.
    • If the Record Definition ID or field name contains a space, replace the space with the Unicode escape \u0020. For example, if the Record Definition ID is London Employees and the field name is First Name, you can map it as:
      record.London\u0020Employees.fields.First\u0020Name = FirstName
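
    As a minimal, hypothetical illustration of the mapping options above: the record definition ID (Employee), the element names and the values below are examples only, and whether plain element names or XPath expressions are required depends on how the XML transformer is configured (see extracting data from a file using XPath). Given a source XML file such as:

      <employees>
        <employee>
          <EmployeeId>E100</EmployeeId>
          <FirstName>Ada</FirstName>
          <Surname>Lovelace</Surname>
        </employee>
      </employees>

    the corresponding entries in definitions.properties might look like:

      #each employee element becomes one record
      record.Employee.base = employee
      #the EmployeeId element supplies the record id (note that it begins with a letter, as required)
      record.Employee.id = EmployeeId
      #map elements to fields of the Employee record definition
      record.Employee.fields.FirstName = FirstName
      record.Employee.fields.Surname = Surname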

Importing a JSON file

  1. Provide a valid JSON file containing your data
  2. Define the LedgerBridge Connector, file collector/API collector and JSON transformer parameters in your application.properties file and place the file in the directory containing the JAR files (LedgerBridge connector, collector and transformer)

    • If the data source is a local file, configure the file collector; if the data source is available at an online location, configure the API collector.
    • If the Record Definition ID or field name contains a space, replace the space with the Unicode escape \u0020. For example, if the Record Definition ID is London Employees and the field name is First Name, you can map it as:
      record.London\u0020Employees.fields.First\u0020Name = $.FirstName
  3. Provide the file containing the definitions (for example, definitions.properties). The definitions file contains the source-to-target mapping. During the import, the LedgerBridge connector maps the elements in the source to fields in the record definition. If the mapping is incorrect or missing, an error may be reported.

    #base record is at top level node
    record.<recorddefinition-id>.base = $
    #or base record is a node in the hierarchy
    record.<recorddefinition-id>.base = $.<jsonPath-expression>
    #for automatically generating the id
    record.<recorddefinition-id>.id = %AUTOID%
    #or for using id information from a jsonPath-expression
    record.<recorddefinition-id>.id = $.<jsonPath-expression>.<jsonPath-expression>
    #to map json-path expressions to fields
    record.<recorddefinition-id>.fields.<field-name> = $.<jsonPath-expression>.<jsonPath-expression>
    #to upload a file to a field
    record.<recorddefinition-id>.fields.<field-name> = FILENAME($.<jsonPath-expression>.<jsonPath-expression>)

    We support JSONPath transformation for extracting data from JSON files. For more information, see extracting data from a file using JSONPath.
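
    As a minimal, hypothetical illustration: the record definition ID (Employee) and the JSONPath expressions below are examples only; adjust them to your own record definition and file structure. Given a source JSON file such as:

      {
        "EmployeeId": "E100",
        "FirstName": "Ada",
        "Surname": "Lovelace"
      }

    the corresponding entries in definitions.properties might look like:

      #the base record is the top-level node
      record.Employee.base = $
      #the EmployeeId value supplies the record id
      record.Employee.id = $.EmployeeId
      #map JSONPath expressions to fields of the Employee record definition
      record.Employee.fields.FirstName = $.FirstName
      record.Employee.fields.Surname = $.Surname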

Importing a CSV file 

  1. Provide a valid CSV file containing your data
  2. Define the LedgerBridge Connector, file collector/API collector and CSV transformer parameters in your application.properties file and place the file in the directory containing the JAR files (LedgerBridge connector, collector and transformer)

    If the data source is a local file, configure the file collector; if the data source is available at an online location, configure the API collector.

  3. For CSV, the definitions file is optional. If the file contains headers, the headers are mapped to the field names during the import.
    You can, however, provide the source-to-target mapping manually in a definitions file (for example, definitions.properties). During the import, the LedgerBridge connector maps the elements in the source to fields in the record definition. If the mapping is incorrect or missing, an error may be reported.
    In the definitions file, you can map: 

    #for automatically generating the id
    record.<recorddefinition-id>.id = %AUTOID%
    #for using id information from a header/field
    record.<recorddefinition-id>.id = <header-label>
    #to map header-labels to fields
    record.<recorddefinition-id>.fields.<field-name> = <header-label>
    #to upload a file to a field
    record.<recorddefinition-id>.fields.<field-name> = FILENAME(<header-label>)
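
    As a minimal, hypothetical illustration (the record definition ID and header labels are examples only): for a CSV file whose header row is EmployeeId,FirstName,Surname, the definitions.properties entries might look like:

      #the EmployeeId column supplies the record id
      record.Employee.id = EmployeeId
      #map header labels to fields of the Employee record definition
      record.Employee.fields.FirstName = FirstName
      record.Employee.fields.Surname = Surname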

Importing a flat file

  1. Provide a valid text file containing your data
  2. Define the LedgerBridge Connector, file collector/API collector and FlatFile transformer parameters in your application.properties file and place the file in the directory containing the JAR files (LedgerBridge connector, collector and transformer)

    • If the data source is a local file, configure the file collector; if the data source is available at an online location, configure the API collector.
    • Ensure that the record definition contains the following fields: Filename of type String and Data of type File. The name of the imported file is used as the Filename, and the file is uploaded to the Data field (see the sketch after these steps).
  3. Start the import  
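
    As a sketch of the behaviour described in step 2 (the file name is an example only): if the imported file is invoices_2024.txt, the resulting record is populated as follows.

      Filename (String)  =  invoices_2024.txt
      Data (File)        =  the uploaded contents of invoices_2024.txt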

Importing from the Database (JDBC) 

  1. Define the LedgerBridge Connector, JDBC file collector/API collector and JDBC transformer parameters in your application.properties file and place the file in the directory containing the JAR files (LedgerBridge connector, collector and transformer)

    If the data source is a local file, configure the JDBC file collector; if the data source is available at an online location, configure the API collector.

  2. Ensure that the field names in the record definition match the elements in the database (see the illustrative example after these steps)
  3. Start the import
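
    As a hypothetical illustration of the matching requirement in step 2 (the table and column names are examples only): if the source table is defined as below, the record definition should contain fields named EmployeeId, FirstName and Surname.

      CREATE TABLE employees (
        EmployeeId VARCHAR(10),
        FirstName  VARCHAR(50),
        Surname    VARCHAR(50)
      );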