# Upload Bulk Data to a Data Source

Uploads a chunk (batch) of data from a client resource to the storage reserved for a particular Data Source. The uploaded data is not immediately transferred to the Data Source database table. Instead, it is held ("buffered") in a staging area until an explicit flush command is issued (refer to the `/datamart.rundataload` endpoint with `"type": "DS_FLUSH"`) or a Datamart that depends on the Data Source data is refreshed.

Endpoint: `POST /datamart.loaddata/{datasourceUniqueName}`

Security: basic, X-PriceFx-jwt

## Path parameters:

- `datasourceUniqueName` (string, required) The unique name of the Data Source to which you want to upload the data. You can also use the typedId or the source name.

## Request fields (application/json):

- `data` (object, required)
- `data.header` (array, required) The header field names (table columns) of the record in the target Data Source.
- `data.options` (object) Options for the bulk data insertion.
- `data.options.detectJoinFields` (boolean)
- `data.data` (array, required) The data as a list of lists. Each inner list represents one row, with its field values in the same order as the names in the header list.

## Response 200 fields (application/json):

- `response` (object)
- `response.node` (string)
- `response.data` (array)
- `response.data.loaded` (integer)
- `response.data.failed` (integer)
- `response.status` (integer)
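
---

As a rough sketch of how a client might call this endpoint, the snippet below assembles the documented request body (`data.header`, `data.data`, optional `data.options.detectJoinFields`) and prepares the POST with Python's standard library. The base URL, partition path, Data Source name, and JWT value are all hypothetical placeholders, not part of the specification above.

```python
import json
import urllib.request

def build_payload(header, rows, detect_join_fields=None):
    """Assemble the request body for POST /datamart.loaddata.

    `header` lists the target Data Source column names; each entry in
    `rows` is a list of field values in the same order as `header`.
    """
    data = {"header": header, "data": rows}
    if detect_join_fields is not None:
        data["options"] = {"detectJoinFields": detect_join_fields}
    return {"data": data}

payload = build_payload(
    header=["sku", "region", "listPrice"],
    rows=[
        ["MB-0001", "EMEA", 129.90],
        ["MB-0002", "EMEA", 154.50],
    ],
)

# Hypothetical host, partition path, Data Source name, and token --
# substitute the values for your own environment.
req = urllib.request.Request(
    "https://example.pricefx.com/datamart.loaddata/TransactionsDS",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Content-Type": "application/json",
        "X-PriceFx-jwt": "<token>",
    },
    method="POST",
)
# resp = urllib.request.urlopen(req)            # uncomment to actually send
# body = json.load(resp)
# print(body["response"]["data"][0]["loaded"])  # count of buffered rows
```

Remember that a successful call only stages the rows; they reach the Data Source table after a `DS_FLUSH` via `/datamart.rundataload` or a dependent Datamart refresh.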