Pricefx REST API Reference
- Import a File
The Pricefx Backend API
Request
Creates a default Data Source (Product, Customer, Units of Measure, Currencies, or Calendar). Use this endpoint if the default Data Sources were not created when the partition was set up.
Note: When no path parameter is passed, all default Data Sources are dropped and recreated - any data will be lost. Otherwise, only the Data Source indicated by the path parameter is dropped and recreated.
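The optional path parameter can be sketched as a small URL builder. This is an illustrative helper, not part of the Pricefx API: the base URL and the exact `dataSourceName` values accepted by your partition are assumptions to verify against your configuration.

```python
from urllib.parse import quote

# Assumed partition base URL - replace with your own node/partition.
BASE_URL = "https://companynode.pricefx.com/pricefx/companypartition"

def restore_default_ds_url(data_source_name=None):
    """Build the datamart.restoredefaultds URL.

    Omitting the name targets ALL default Data Sources, which drops and
    recreates every one of them (any existing data is lost).
    """
    url = f"{BASE_URL}/datamart.restoredefaultds"
    if data_source_name is not None:
        url += "/" + quote(data_source_name)
    return url

print(restore_default_ds_url())           # recreate all default Data Sources
print(restore_default_ds_url("Product"))  # recreate a single Data Source
```

The request itself is a plain POST with basic authentication, as shown in the curl example below.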
- Mock serverhttps://api.pricefx.com/_mock/openapi/reference/pricefx-server_openapi/datamart.restoredefaultds/{dataSourceName}
- URL:https://companynode.pricefx.com/pricefx/companypartition/datamart.restoredefaultds/{dataSourceName}
curl -i -X POST \
-u <username>:<password> \
'https://api.pricefx.com/_mock/openapi/reference/pricefx-server_openapi/datamart.restoredefaultds/{dataSourceName}'
Request
Uploads a file to the Data Manager (specified by the typedId path parameter).
To import data into a FieldCollection (DMT or DMDS), use the datamart.loadfc/{typedId} endpoint.
Follow these steps to upload a file:
- Create an upload slot and retrieve the slotId using the /uploadmanager.newuploadslot endpoint (see Product Image > 1. Create an Upload Slot).
- Upload the file using the /datamart.importfile/{slotId}/{typedId} endpoint.
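The two-step flow above can be sketched by building the two POST requests with the standard library. This is a hedged sketch: the base URL, credentials handling, and the response shape of /uploadmanager.newuploadslot (from which the slotId is read) are assumptions, and the requests are constructed but not sent here.

```python
import urllib.request

# Assumed partition base URL - replace with your own node/partition.
BASE = "https://companynode.pricefx.com/pricefx/companypartition"

def new_upload_slot_request():
    # Step 1: POST to /uploadmanager.newuploadslot; the response is
    # expected to contain the slotId to use in step 2.
    return urllib.request.Request(
        f"{BASE}/uploadmanager.newuploadslot", method="POST"
    )

def import_file_request(slot_id, typed_id, file_bytes):
    # Step 2: POST the file content to /datamart.importfile/{slotId}/{typedId}.
    return urllib.request.Request(
        f"{BASE}/datamart.importfile/{slot_id}/{typed_id}",
        data=file_bytes,
        method="POST",
    )

req = import_file_request("abc123", "2147483847.DMDS", b"a,1\n")
print(req.get_method(), req.full_url)
```

In a real client you would send each request with `urllib.request.urlopen` (or an HTTP library of your choice) and add basic-auth headers, as the curl examples on this page do with `-u <username>:<password>`.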
When importing a CSV file, the header can include more information than just a column name: the field type and, optionally, a role such as KEY or DIM.
Example:
columnName1:NUMBER:KEY,columnName2:TEXT:DIM,columnName3:NUMBER
a,1,0.5
b,1,0.6
c,2,1.2
d,3,2.0
In the example above, the columnName1 column should be the NUMBER field type and also the key (KEY).
The columnName2 column should be the TEXT type and a dimension (DIM).
The columnName3 column should be a NUMBER.
Possible field types: BOOLEAN, CURRENCY, DATE, DATETIME, INTEGER, LOB, MONEY, NUMBER, QUANTITY, TEXT, UOM
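Assembling such a typed header programmatically is straightforward. A minimal sketch, where the column names and rows are purely illustrative:

```python
import csv
import io

# Each column: (name, field type, optional role such as KEY or DIM).
columns = [
    ("columnName1", "NUMBER", "KEY"),
    ("columnName2", "TEXT", "DIM"),
    ("columnName3", "NUMBER", None),
]

def typed_header(cols):
    """Join each column spec into name:TYPE[:ROLE] header cells."""
    return [
        ":".join(part for part in (name, ftype, role) if part)
        for name, ftype, role in cols
    ]

buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(typed_header(columns))
writer.writerow(["a", 1, 0.5])
print(buf.getvalue())
```

The resulting first line matches the example header shown above.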
Information: When using datamart.importfile to upload a file with very large numbers (20+ significant decimal digits), it is recommended to use the Avro format. In CSV and XLSX files, you can use numbers within the double or long type ranges.
- Mock serverhttps://api.pricefx.com/_mock/openapi/reference/pricefx-server_openapi/datamart.importfile/{slotId}/{typedId}
- URL:https://companynode.pricefx.com/pricefx/companypartition/datamart.importfile/{slotId}/{typedId}
- curl
- JavaScript
- Node.js
- Python
- Java
- C#
- PHP
- Go
- Ruby
- R
- Payload
curl -i -X POST \
-u <username>:<password> \
'https://api.pricefx.com/_mock/openapi/reference/pricefx-server_openapi/datamart.importfile/{slotId}/2147483847.DMDS'
Request
Creates a DMFieldCollection from a multipart/form-data body containing the schema (JSON – the DMFieldCollection DTO) as the first part and the data (in the Avro format) as the second part. Fails if the specified DMFieldCollection already exists.
The first part contains information about the schema of the Field Collection to be created:
- for DMDataSource: uniqueName, label, *fields
- for DMT: name, label, fields, *owner – the ModelObject typedId that owns the table
*fields is a list of the DMField specifications.
The second part contains the table data in the AVRO format. The data schema must comply with the one given in the first part.
An existing table will be overwritten (the existing table is deleted and a new one is created).
This endpoint is dedicated to remote services (such as the Optimization Engine or Python Engine). It is very strict by design to ensure that the uploaded data ends up in exactly the same format on the platform.
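The two-part body described above can be assembled by hand with the standard library. This is a sketch only: the form part names ("schema", "data"), the filename, and the Avro part's content type are assumptions not stated on this page; consult the endpoint's OpenAPI specification for the exact part layout.

```python
import json
import uuid

def build_multipart(schema: dict, avro_bytes: bytes):
    """Assemble a multipart/form-data body: schema JSON first, Avro data second."""
    boundary = uuid.uuid4().hex
    crlf = b"\r\n"
    body = b""
    # Part 1: the DMFieldCollection DTO as JSON.
    body += b"--" + boundary.encode() + crlf
    body += b'Content-Disposition: form-data; name="schema"' + crlf
    body += b"Content-Type: application/json" + crlf + crlf
    body += json.dumps(schema).encode() + crlf
    # Part 2: the table data in the Avro format (generic binary content type
    # here; the server may expect a more specific one).
    body += b"--" + boundary.encode() + crlf
    body += (
        b'Content-Disposition: form-data; name="data"; filename="data.avro"' + crlf
    )
    body += b"Content-Type: application/octet-stream" + crlf + crlf
    body += avro_bytes + crlf
    body += b"--" + boundary.encode() + b"--" + crlf
    content_type = f"multipart/form-data; boundary={boundary}"
    return content_type, body

# Illustrative DMDataSource schema with the fields listed above.
ct, body = build_multipart(
    {"uniqueName": "DS_Example", "label": "Example", "fields": []}, b"\x00"
)
print(ct)
```

The Avro bytes themselves must match the schema given in the JSON part; in practice you would serialize them with an Avro library rather than pass raw bytes.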
- Mock serverhttps://api.pricefx.com/_mock/openapi/reference/pricefx-server_openapi/datamart.createfc/{fcType}
- URL:https://companynode.pricefx.com/pricefx/companypartition/datamart.createfc/{fcType}
curl -i -X POST \
-u <username>:<password> \
https://api.pricefx.com/_mock/openapi/reference/pricefx-server_openapi/datamart.createfc/DMDS \
-H 'Content-Type: multipart/form-data'
See the Key-Value Database Storage Knowledge Base article for more details.
Here you can find all fields of the corresponding entity (represented by the type code).
Use the /metadata.describe endpoint to find out the correct data type of the field that is used in your partition.