Uploads a chunk (batch) of data from a client resource to the storage reserved for a particular Data Source.
The uploaded data is not immediately transferred to the Data Source database table. Instead, it is held ('buffered') in a staging area until an explicit flush command is issued (see the /datamart.rundataload endpoint with "type": "DS_FLUSH") or a Datamart that depends on the Data Source data is refreshed. A client-side sketch of the full upload-and-flush sequence follows the examples below.
{- "data": {
- "header": [
- "sku",
- "label",
- "attribute1",
- "attribute2"
], - "options": {
- "detectJoinFields": true,
- "maxJoinFieldsLengths": [ ]
}, - "data": [
- [
- "11111",
- "Label One",
- "EA",
- "USD"
], - [
- "22222",
- "Label Two",
- "EA",
- "EUR"
], - [
- "33333",
- "Label Three",
- "EA",
- "CZK"
]
]
}
}
{- "response": {
- "node": "string",
- "data": null,
- "status": 0
}
}
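For orientation, below is a minimal client-side sketch of the upload-and-flush sequence using Python's requests library. It is an illustration under stated assumptions, not the documented client: the base URL, credentials, upload path, Data Source identifier, and the "targetName" field of the flush payload are placeholders, and only the /datamart.rundataload call with "type": "DS_FLUSH" is taken from the description above.

import requests

# Placeholder connection details; replace with your server, partition, and credentials.
BASE_URL = "https://<server>/pricefx/<partition>"
session = requests.Session()
session.auth = ("<apiUser>", "<apiPassword>")

# 1) Upload one chunk of rows to the Data Source staging area.
#    The upload path and Data Source identifier below are placeholders,
#    not a documented endpoint name.
chunk = {
    "data": {
        "header": ["sku", "label", "attribute1", "attribute2"],
        "options": {"detectJoinFields": True, "maxJoinFieldsLengths": []},
        "data": [
            ["11111", "Label One", "EA", "USD"],
            ["22222", "Label Two", "EA", "EUR"],
            ["33333", "Label Three", "EA", "CZK"],
        ],
    }
}
upload = session.post(BASE_URL + "/<uploadEndpoint>/<dataSourceId>", json=chunk)
upload.raise_for_status()

# 2) Flush the buffered rows into the Data Source table.
#    "type": "DS_FLUSH" comes from the description above; "targetName" is an
#    assumed field identifying the target Data Source.
flush = {
    "data": {
        "type": "DS_FLUSH",
        "targetName": "DMDS.<dataSourceName>",
    }
}
result = session.post(BASE_URL + "/datamart.rundataload", json=flush)
result.raise_for_status()
print(result.json())

In practice a client would repeat step 1 for each chunk of a large data set and issue the flush once at the end, since all uploaded chunks remain buffered until the flush (or a dependent Datamart refresh) runs.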