Data Integration

This guide covers how to integrate external data sources with Point — both inbound (data flowing into Point) and outbound (data flowing out of Point to other systems).

Inbound Integration (Data into Point)

SFTP File Drop

This is the simplest integration method: your data provider drops files in a standard format onto a Point-managed SFTP server.

Supported file formats:

  • CSV (comma or pipe-delimited)
  • Excel (.xlsx)
  • XML
  • JSON
  • FIX (Financial Information eXchange)

Setup process:

  1. Contact support@pointgroup.io to request SFTP credentials
  2. Provide the file format specification from your data source
  3. Point's team will configure the mapping and validation rules
  4. Test with sample files
  5. Go live

Typical use cases: Custodian position files, transaction files, NAV files from fund administrators
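As an illustration, a pipe-delimited custodian position file might look like the sketch below. The column names and layout here are hypothetical — the real file specification is agreed with Point's team during setup (step 2 above). The snippet parses the sample with Python's csv module:

```python
import csv
import io

# Hypothetical pipe-delimited position file (column names are illustrative
# only; the actual layout is defined by your file format specification).
sample_file = """\
account_id|isin|quantity|market_value|currency|as_of_date
ACC001|GB0002634946|1000|152450.00|GBP|2026-02-17
ACC001|US0378331005|500|98200.00|USD|2026-02-17
"""

def parse_positions(text: str) -> list[dict]:
    """Parse a pipe-delimited position file into a list of row dicts."""
    reader = csv.DictReader(io.StringIO(text), delimiter="|")
    return list(reader)

positions = parse_positions(sample_file)
print(positions[0]["isin"])  # → GB0002634946
```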


REST API Push

Your system pushes data to Point's REST API in real time or on a schedule.

Endpoint: POST /api/v1/import/{dataType}

Supported data types:

  • positions — portfolio positions
  • transactions — transaction records
  • prices — security prices
  • fx-rates — foreign exchange rates
  • securities — security master data

Example — pushing a price:

POST /api/v1/import/prices
Authorization: Bearer {token}
Content-Type: application/json

{
  "prices": [
    {
      "isin": "GB0002634946",
      "date": "2026-02-17",
      "price": 152.45,
      "currency": "GBP",
      "type": "close",
      "source": "bloomberg"
    }
  ]
}
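The same call can be issued from Python's standard library. The sketch below only constructs the request without sending it; the host name and token are placeholders, not real Point values:

```python
import json
import urllib.request

# Placeholder values — substitute your Point host and API token.
BASE_URL = "https://api.example.com"
TOKEN = "your-api-token"

payload = {
    "prices": [
        {
            "isin": "GB0002634946",
            "date": "2026-02-17",
            "price": 152.45,
            "currency": "GBP",
            "type": "close",
            "source": "bloomberg",
        }
    ]
}

request = urllib.request.Request(
    url=f"{BASE_URL}/api/v1/import/prices",
    data=json.dumps(payload).encode("utf-8"),
    headers={
        "Authorization": f"Bearer {TOKEN}",
        "Content-Type": "application/json",
    },
    method="POST",
)
# urllib.request.urlopen(request) would perform the call; omitted here.
```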

Azure Blob Storage

For organisations already on Azure, Point can read data directly from your Azure Blob Storage containers.

Setup:

  1. Grant Point's managed identity read access to your storage container
  2. Configure the container path and file pattern in Point's System Settings
  3. Set the refresh schedule
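The file pattern in step 2 is typically a glob over blob names. The container layout and pattern below are purely illustrative; the sketch shows how such a pattern selects files:

```python
import fnmatch

# Hypothetical container layout and glob pattern (illustrative only).
pattern = "inbound/positions/positions_*.csv"

blob_names = [
    "inbound/positions/positions_2026-02-17.csv",
    "inbound/positions/positions_2026-02-16.csv",
    "inbound/archive/positions_2026-01-31.csv",
]

# Only blobs under inbound/positions/ match the pattern.
matches = [name for name in blob_names if fnmatch.fnmatch(name, pattern)]
print(matches)
```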

Outbound Integration (Data from Point)

REST API Pull

This is the most flexible option: your system queries Point's API to retrieve data on demand.

See API Reference for the full endpoint documentation.

Common use cases:

  • Feeding portfolio data into a CRM
  • Populating a client portal
  • Sending data to a BI tool (Power BI, Tableau)
  • Triggering downstream workflows
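A pull from your side is an authenticated GET. The endpoint path and query parameters below are hypothetical placeholders — consult the API Reference for the real ones; the sketch only builds the request:

```python
import urllib.parse
import urllib.request

# Placeholder values — substitute your Point host and API token.
BASE_URL = "https://api.example.com"
TOKEN = "your-api-token"

# Hypothetical endpoint and query parameters, for illustration only.
params = urllib.parse.urlencode({"asOfDate": "2026-02-17", "currency": "GBP"})
request = urllib.request.Request(
    url=f"{BASE_URL}/api/v1/portfolios?{params}",
    headers={"Authorization": f"Bearer {TOKEN}"},
)
# response = urllib.request.urlopen(request)  # would perform the call
```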

Scheduled Exports

Point can automatically export data to SFTP or Azure Blob Storage on a schedule.

Configuring a scheduled export:

  1. Go to System Settings → Data Exports
  2. Click + New Export
  3. Configure:
    • Data type (portfolios, holdings, transactions, etc.)
    • Format (CSV, Excel, JSON)
    • Schedule (daily, weekly, monthly)
    • Destination (SFTP or Azure Blob)
    • File naming convention
  4. Click Save
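A common file naming convention embeds the data type and run date in the name. The pattern below is an example, not a Point default:

```python
from datetime import date

def export_filename(data_type: str, run_date: date, fmt: str = "csv") -> str:
    """Build a dated export file name, e.g. holdings_2026-02-17.csv."""
    return f"{data_type}_{run_date.isoformat()}.{fmt}"

print(export_filename("holdings", date(2026, 2, 17)))  # → holdings_2026-02-17.csv
```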

Webhooks

Point can send real-time event notifications to your systems via webhooks.

Supported events:

  Event                   Description
  portfolio.valued        A portfolio has been valued
  transaction.settled     A transaction has settled
  exception.created       A new data quality exception was created
  refresh_job.completed   A data refresh job completed
  refresh_job.failed      A data refresh job failed

Webhook payload format:

{
  "event": "portfolio.valued",
  "timestamp": "2026-02-17T18:30:00Z",
  "data": {
    "portfolioId": "port_abc123",
    "valuationDate": "2026-02-17",
    "totalValue": 12500000.0,
    "currency": "GBP"
  }
}
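On the receiving side, a handler typically parses the body and branches on the event field. A minimal sketch (the handler logic is illustrative; any request-signing or verification scheme Point applies is not shown here):

```python
import json

def handle_webhook(raw_body: bytes) -> str:
    """Parse a webhook payload and dispatch on its event type."""
    payload = json.loads(raw_body)
    event = payload["event"]

    if event == "portfolio.valued":
        data = payload["data"]
        return (
            f"Portfolio {data['portfolioId']} valued at "
            f"{data['totalValue']} {data['currency']}"
        )
    if event in ("refresh_job.completed", "refresh_job.failed"):
        return f"Refresh job event: {event}"
    return f"Unhandled event: {event}"

# Example payload matching the format shown above.
body = json.dumps({
    "event": "portfolio.valued",
    "timestamp": "2026-02-17T18:30:00Z",
    "data": {
        "portfolioId": "port_abc123",
        "valuationDate": "2026-02-17",
        "totalValue": 12500000.0,
        "currency": "GBP",
    },
}).encode("utf-8")
print(handle_webhook(body))
```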

Configuring webhooks:

  1. Go to System Settings → Webhooks
  2. Click + Add Webhook
  3. Enter your endpoint URL
  4. Select the events to subscribe to
  5. Point will send a test event to verify the endpoint
  6. Click Save

Read-Only Database Replica

For BI tools that need direct database access (Power BI, Tableau, custom SQL queries), Point can provide a read-only replica of the analytics database.

Enterprise Feature

The read-only replica is available on the Enterprise plan. Contact support@pointgroup.io for details.

Connection details (provided after setup):

  • Server: [your-org]-replica.database.windows.net
  • Database: point_analytics
  • Authentication: Azure AD or SQL authentication
  • Port: 1433
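From a client, the ODBC connection string for the replica follows the details above. The driver name and credentials in this sketch are assumptions for illustration; the snippet only assembles the string and does not open a connection:

```python
def replica_connection_string(org: str, user: str, password: str) -> str:
    """Assemble an ODBC connection string for the read-only replica.

    Server and database names follow the pattern given above; the driver
    name and SQL-authentication credentials are illustrative assumptions.
    """
    return (
        "Driver={ODBC Driver 18 for SQL Server};"
        f"Server={org}-replica.database.windows.net,1433;"
        "Database=point_analytics;"
        f"Uid={user};Pwd={password};"
        "Encrypt=yes;"
    )

print(replica_connection_string("your-org", "reporting_user", "********"))
```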