Data Integration
This guide covers how to integrate external data sources with Point — both inbound (data flowing into Point) and outbound (data flowing out of Point to other systems).
Inbound Integration (Data into Point)
SFTP File Drop
The simplest integration method. Your data provider drops files in a standard format to a Point-managed SFTP server.
Supported file formats:
- CSV (comma or pipe-delimited)
- Excel (.xlsx)
- XML
- JSON
- FIX (Financial Information eXchange)
Setup process:
- Contact support@pointgroup.io to request SFTP credentials
- Provide the file format specification from your data source
- Point's team will configure the mapping and validation rules
- Test with sample files
- Go live
Typical use cases: Custodian position files, transaction files, NAV files from fund administrators
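As a concrete illustration of the file-drop flow, here is a minimal sketch that produces a pipe-delimited position file ready for upload. The column layout is purely illustrative — your actual layout is whatever you agree with Point's team during the mapping step.

```python
import csv
import io

def build_position_file(positions):
    """Render position records as a pipe-delimited CSV string.

    The columns below are illustrative only; use the layout agreed
    in your file format specification.
    """
    buf = io.StringIO()
    writer = csv.writer(buf, delimiter="|", lineterminator="\n")
    writer.writerow(["portfolio_id", "isin", "quantity", "as_of_date"])
    for p in positions:
        writer.writerow([p["portfolio_id"], p["isin"], p["quantity"], p["as_of_date"]])
    return buf.getvalue()

content = build_position_file([
    {"portfolio_id": "PF001", "isin": "GB0002634946",
     "quantity": 1500, "as_of_date": "2026-02-17"},
])
# Write `content` to a file, then upload it to the Point-managed
# SFTP server with the credentials issued during setup.
```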
REST API Push
Your system pushes data to Point's REST API in real time or on a schedule.
Endpoint: `POST /api/v1/import/{dataType}`
Supported data types:
- `positions` — portfolio positions
- `transactions` — transaction records
- `prices` — security prices
- `fx-rates` — foreign exchange rates
- `securities` — security master data
Example — pushing a price:
```http
POST /api/v1/import/prices
Authorization: Bearer {token}
Content-Type: application/json

{
  "prices": [
    {
      "isin": "GB0002634946",
      "date": "2026-02-17",
      "price": 152.45,
      "currency": "GBP",
      "type": "close",
      "source": "bloomberg"
    }
  ]
}
```
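The request above can be issued from any HTTP client. The sketch below, using only the Python standard library, validates a price batch and posts it; it is not an official client, and the `base_url` and validation rules shown are assumptions.

```python
import json
import urllib.request

def build_price_import(prices):
    """Validate and serialise a price batch for POST /api/v1/import/prices.

    The required-field check mirrors the fields shown in the example
    payload; it is a client-side convenience, not Point's own schema.
    """
    required = {"isin", "date", "price", "currency", "type", "source"}
    for p in prices:
        missing = required - p.keys()
        if missing:
            raise ValueError(f"price record missing fields: {sorted(missing)}")
    return json.dumps({"prices": prices})

def push_prices(base_url, token, prices):
    """POST a price batch to Point. base_url and token come from your
    own configuration."""
    req = urllib.request.Request(
        f"{base_url}/api/v1/import/prices",
        data=build_price_import(prices).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {token}",
            "Content-Type": "application/json",
        },
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status
```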
Azure Blob Storage
For organisations already on Azure, Point can read data directly from your Azure Blob Storage containers.
Setup:
- Grant Point's managed identity read access to your storage container
- Configure the container path and file pattern in Point's System Settings
- Set the refresh schedule
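Step 1 is typically done with an Azure RBAC role assignment. A sketch using the Azure CLI is below — the subscription, resource group, account, and container names are placeholders, and the principal ID of Point's managed identity would be supplied by Point during setup:

```shell
# Grant read-only blob access on a single container to Point's
# managed identity (placeholder IDs throughout).
az role assignment create \
  --assignee "<point-managed-identity-object-id>" \
  --role "Storage Blob Data Reader" \
  --scope "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>/blobServices/default/containers/<container>"
```

Scoping the assignment to the container, rather than the whole storage account, keeps the access grant as narrow as possible.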
Outbound Integration (Data from Point)
REST API Pull
The most flexible option. Your system queries Point's API to retrieve data on demand.
See API Reference for the full endpoint documentation.
Common use cases:
- Feeding portfolio data into a CRM
- Populating a client portal
- Sending data to a BI tool (Power BI, Tableau)
- Triggering downstream workflows
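For the BI-tool use case, a pull usually ends with flattening the JSON response into tabular form. The helper below sketches that step; the record shape and field names are illustrative, not Point's documented schema.

```python
import csv
import io

def to_csv(records, columns):
    """Flatten a list of JSON records (e.g. the body of a Point API
    response) into CSV text that a BI tool can ingest.

    Columns not listed are silently dropped (extrasaction="ignore").
    """
    buf = io.StringIO()
    writer = csv.DictWriter(buf, fieldnames=columns,
                            extrasaction="ignore", lineterminator="\n")
    writer.writeheader()
    writer.writerows(records)
    return buf.getvalue()

# Records as they might come back from a portfolios endpoint
# (field names here are assumptions for the example).
rows = to_csv(
    [{"portfolioId": "port_abc123", "name": "Growth Fund", "currency": "GBP"}],
    ["portfolioId", "name", "currency"],
)
```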
Scheduled Exports
Point can automatically export data to SFTP or Azure Blob Storage on a schedule.
Configuring a scheduled export:
- Go to System Settings → Data Exports
- Click + New Export
- Configure:
  - Data type (portfolios, holdings, transactions, etc.)
  - Format (CSV, Excel, JSON)
  - Schedule (daily, weekly, monthly)
  - Destination (SFTP or Azure Blob)
  - File naming convention
- Click Save
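The file naming convention in step 3 is free-form. One common pattern is `<dataType>_<YYYY-MM-DD>.<ext>`, sketched below — this is just one possible convention, not a Point default:

```python
from datetime import date

def export_filename(data_type, as_of, fmt="csv"):
    """Build an export file name following a <dataType>_<date>.<ext>
    pattern. Purely illustrative; configure whatever pattern you need
    in the export's file naming field."""
    return f"{data_type}_{as_of.isoformat()}.{fmt}"

name = export_filename("holdings", date(2026, 2, 17))
```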
Webhooks
Point can send real-time event notifications to your systems via webhooks.
Supported events:
| Event | Description |
|---|---|
| portfolio.valued | A portfolio has been valued |
| transaction.settled | A transaction has settled |
| exception.created | A new data quality exception was created |
| refresh_job.completed | A data refresh job completed |
| refresh_job.failed | A data refresh job failed |
Webhook payload format:
```json
{
  "event": "portfolio.valued",
  "timestamp": "2026-02-17T18:30:00Z",
  "data": {
    "portfolioId": "port_abc123",
    "valuationDate": "2026-02-17",
    "totalValue": 12500000.0,
    "currency": "GBP"
  }
}
```
Configuring webhooks:
- Go to System Settings → Webhooks
- Click + Add Webhook
- Enter your endpoint URL
- Select the events to subscribe to
- Point will send a test event to verify the endpoint
- Click Save
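A receiving endpoint just needs to accept the POSTed JSON, acknowledge quickly, and route by event name. A minimal standard-library sketch, assuming the payload format shown above (the handler logic is a placeholder for your own):

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

def handle_event(payload):
    """Route a webhook payload by its event name.

    The branches below are placeholders for your own logic.
    """
    event = payload.get("event")
    if event == "portfolio.valued":
        return f"portfolio {payload['data']['portfolioId']} valued"
    if event == "refresh_job.failed":
        return "alert the on-call engineer"
    return f"ignoring {event}"

class WebhookReceiver(BaseHTTPRequestHandler):
    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length))
        handle_event(payload)
        # Acknowledge promptly; do any slow work asynchronously so the
        # sender does not see timeouts.
        self.send_response(200)
        self.end_headers()

# To run: HTTPServer(("", 8080), WebhookReceiver).serve_forever()
```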
Read-Only Database Replica
For BI tools that need direct database access (Power BI, Tableau, custom SQL queries), Point can provide a read-only replica of the analytics database.
The read-only replica is available on the Enterprise plan. Contact support@pointgroup.io for details.
Connection details (provided after setup):
- Server: `[your-org]-replica.database.windows.net`
- Database: `point_analytics`
- Authentication: Azure AD or SQL authentication
- Port: 1433
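From Python, the connection details above translate into an ODBC connection string like the sketch below. The driver name assumes Microsoft's ODBC Driver 18 for SQL Server is installed, and the org/user values are placeholders:

```python
def replica_connection_string(org, user, password):
    """Build an ODBC connection string for the read-only replica,
    following the server/database names given above.

    Assumes SQL authentication and ODBC Driver 18; for Azure AD
    authentication the string would differ.
    """
    return (
        "Driver={ODBC Driver 18 for SQL Server};"
        f"Server=tcp:{org}-replica.database.windows.net,1433;"
        "Database=point_analytics;"
        f"Uid={user};Pwd={password};"
        "Encrypt=yes;"
    )

conn_str = replica_connection_string("acme", "bi_reader", "example-password")
# With the third-party pyodbc package installed:
#   import pyodbc
#   conn = pyodbc.connect(conn_str)
```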