Data Export is available as an early access feature. Contact your Minoa account team to enable it for your organization.
## Supported Warehouses

### Google BigQuery

Authenticate with a GCP service account JSON key file. Data is written directly to a BigQuery table in your project.

### Snowflake

Authenticate with RSA key-pair credentials. Data is written via Snowflake’s SQL REST API to a table in your database.
## Exportable Objects
| Object Type | Description | Columns |
|---|---|---|
| Business Cases | All business cases in your workspace, flattened to one row per case | 24 columns including name, owner, account, CRM linkage, total value, ROI, cost, status, dates |
| Value Realization Trackers | All value trackers, flattened to one row per tracker | 18 columns including name, account, linked business case, value realized, status, contract terms |
## Export Methods
| Method | Behavior | Best For |
|---|---|---|
| Incremental | Only exports records created or updated since the last successful run | Large datasets where you want to minimize transfer volume |
| Full | Truncates the table and re-exports all records on every run | Smaller datasets or when you need a complete refresh |
| Upsert | Inserts new records and updates existing rows in-place using the record ID | Maintaining a single up-to-date copy of each record |
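To illustrate the Upsert behavior, the warehouse-side operation is typically a MERGE keyed on the record ID. The sketch below is illustrative only — the table and column names are hypothetical, not from this doc:

```sql
-- Hypothetical sketch of upsert semantics: match staged export rows
-- against the target table on the record ID, update rows that already
-- exist, and insert rows that don't.
MERGE INTO business_cases AS target
USING staged_export AS source
  ON target.id = source.id
WHEN MATCHED THEN
  UPDATE SET
    name       = source.name,
    status     = source.status,
    updated_at = source.updated_at
WHEN NOT MATCHED THEN
  INSERT (id, name, status, updated_at)
  VALUES (source.id, source.name, source.status, source.updated_at);
```

This is why the Snowflake permissions table below requires SELECT, INSERT, UPDATE, and DELETE: a MERGE touches all of them.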
## Setting Up BigQuery

### 1. Create a Service Account

**Open the GCP Console**
Go to the Google Cloud Console and select the project where your target dataset lives.
**Create a service account**

Navigate to IAM & Admin → Service Accounts → Create Service Account. Give it a descriptive name like `minoa-data-export` and click Create and Continue.

**Grant permissions**
Assign the following roles to the service account:
- **BigQuery Data Editor** (`roles/bigquery.dataEditor`) — allows creating tables and writing data
- **BigQuery Job User** (`roles/bigquery.jobUser`) — allows running queries
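If you prefer the command line, the same steps can be scripted with `gcloud`. This is a sketch, not from the doc: the project ID and key filename are placeholders, and the service-account name follows the doc's example.

```shell
# Placeholder project ID -- replace with your own.
PROJECT_ID="my-project"
SA="minoa-data-export@${PROJECT_ID}.iam.gserviceaccount.com"

# Create the service account.
gcloud iam service-accounts create minoa-data-export \
  --project="${PROJECT_ID}" --display-name="Minoa Data Export"

# Grant the two required roles.
gcloud projects add-iam-policy-binding "${PROJECT_ID}" \
  --member="serviceAccount:${SA}" --role="roles/bigquery.dataEditor"
gcloud projects add-iam-policy-binding "${PROJECT_ID}" \
  --member="serviceAccount:${SA}" --role="roles/bigquery.jobUser"

# Download the JSON key file you will later upload to Minoa.
gcloud iam service-accounts keys create minoa-data-export-key.json \
  --iam-account="${SA}"
```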
### 2. Create the Target Dataset
If you don’t already have a dataset for Minoa data, create one. You do not need to create the table manually — Minoa will create it automatically on the first export run using the correct schema.
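One way to create the dataset is BigQuery DDL (in BigQuery, a dataset is created with `CREATE SCHEMA`). The dataset name `minoa_export` and the location are assumptions, not from the doc:

```sql
-- Create a dataset for Minoa exports if it doesn't exist yet.
-- Dataset name and location are placeholders; choose your own.
CREATE SCHEMA IF NOT EXISTS minoa_export
OPTIONS (location = 'US');
```

The `bq` CLI equivalent is `bq mk --dataset my-project:minoa_export`.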
### Minimum Permissions Summary
| Permission | Role | Why |
|---|---|---|
| `bigquery.tables.create` | BigQuery Data Editor | Create the export table on first run |
| `bigquery.tables.updateData` | BigQuery Data Editor | Insert, update, and truncate rows |
| `bigquery.tables.get` | BigQuery Data Editor | Read table metadata |
| `bigquery.jobs.create` | BigQuery Job User | Execute queries |
## Setting Up Snowflake

### 1. Generate an RSA Key Pair
Snowflake’s SQL REST API uses key-pair authentication. You’ll generate an RSA key pair, register the public key with your Snowflake user, and provide the private key to Minoa.

**Generate a private key**

Generate the key with OpenSSL in your terminal; the result should be an unencrypted PKCS#8 private key file called `minoa_rsa_key.p8`.

### 2. Create a Snowflake User and Assign the Key
**Assign the public key**
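The exact commands for these key steps are not shown in this excerpt. A common OpenSSL approach is sketched below — the 2048-bit key size is an assumption; the filename matches the doc, and the user name `minoa_export_user` appears in the doc's troubleshooting section:

```shell
# Generate an unencrypted PKCS#8 private key (filename from the doc).
openssl genrsa 2048 | openssl pkcs8 -topk8 -inform PEM -nocrypt -out minoa_rsa_key.p8

# Derive the matching public key; its body (the text between the PEM
# header and footer lines) is what gets registered on the Snowflake user.
openssl rsa -in minoa_rsa_key.p8 -pubout -out minoa_rsa_key.pub
```

In Snowflake, create the user if needed (`CREATE USER minoa_export_user;`) and register the public key with `ALTER USER minoa_export_user SET RSA_PUBLIC_KEY='MII...';`, pasting the key body without the `-----BEGIN/END PUBLIC KEY-----` lines.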
### Minimum Permissions Summary
| Permission | Granted On | Why |
|---|---|---|
| USAGE | Warehouse | Execute queries |
| USAGE | Database | Access the database |
| USAGE | Schema | Access the schema |
| CREATE TABLE | Schema | Create the export table on first run |
| SELECT | Tables | Read data for MERGE operations |
| INSERT | Tables | Write new rows |
| UPDATE | Tables | Update existing rows (upsert) |
| DELETE | Tables | Required for MERGE operations |
| TRUNCATE | Tables | Clear table for full exports |
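These grants can be expressed as Snowflake SQL. In this sketch the role, warehouse, database, and schema names are placeholders (only the privileges themselves and the user name `minoa_export_user` come from the doc):

```sql
-- Placeholder identifiers throughout; substitute your own objects.
CREATE ROLE IF NOT EXISTS minoa_export_role;

GRANT USAGE ON WAREHOUSE my_wh TO ROLE minoa_export_role;
GRANT USAGE ON DATABASE my_db TO ROLE minoa_export_role;
GRANT USAGE ON SCHEMA my_db.my_schema TO ROLE minoa_export_role;
GRANT CREATE TABLE ON SCHEMA my_db.my_schema TO ROLE minoa_export_role;

-- Cover tables created later (Minoa creates the export table on first run).
GRANT SELECT, INSERT, UPDATE, DELETE, TRUNCATE
  ON FUTURE TABLES IN SCHEMA my_db.my_schema TO ROLE minoa_export_role;

GRANT ROLE minoa_export_role TO USER minoa_export_user;
```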
## Configuring an Export in Minoa

**Name your export**
Enter a descriptive name, e.g., `BigQuery - Business Cases` or `Snowflake - Value Trackers`.

**Provide credentials**
- BigQuery: Upload the service account JSON key file you downloaded earlier.
- Snowflake: Enter your Snowflake username and paste the contents of the private key file (`minoa_rsa_key.p8`).

**Configure the destination**

- BigQuery: Enter your GCP Project ID, Dataset ID, and a Table ID (e.g., `business_cases`).
- Snowflake: Enter your Account identifier (e.g., `orgname-accountname`), Database, Schema, Table, and Warehouse.

**Choose what to export**
Select the object type (Business Cases or Value Realization Trackers), set the export interval, and pick an export method.
Once saved and enabled, your export will run automatically on the schedule you configured.
## Monitoring Exports

### Runs Tab

The Runs tab shows a history of all export executions:

- Status — Green (completed), yellow (completed with errors), red (failed), or blue (running)
- Rows Exported — Number of records written to the warehouse
- Duration — How long the export took
- Errors — Count of any errors encountered
### Logs Tab

The Logs tab provides detailed log entries across all runs. You can filter by:

- Level — INFO, WARN, or ERROR
- Search — Search log messages or filter by run ID
## Troubleshooting

### Export fails with 'permission denied'
Verify that your service account (BigQuery) or Snowflake user has the minimum permissions listed above. For BigQuery, ensure both BigQuery Data Editor and BigQuery Job User roles are assigned. For Snowflake, confirm the role grants include `USAGE` on the warehouse, database, and schema.

### Table is not being created automatically
Minoa creates the target table on the first export run. Ensure your credentials have `CREATE TABLE` permission on the target dataset (BigQuery) or schema (Snowflake). If the table already exists with a different schema, the export may fail — drop or rename the existing table and let Minoa recreate it.

### Snowflake authentication errors
- Confirm the private key is in PKCS#8 PEM format (starts with `-----BEGIN PRIVATE KEY-----`).
- Verify the matching public key is registered on the Snowflake user: run `DESC USER minoa_export_user` and check the `RSA_PUBLIC_KEY` property.
- Ensure the account identifier matches exactly (e.g., `orgname-accountname`). You can find this in your Snowflake URL: `https://<account>.snowflakecomputing.com`.
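A further check, not from the doc but standard for Snowflake key-pair auth: `DESC USER` also reports the public key's SHA-256 fingerprint in the `RSA_PUBLIC_KEY_FP` property, and you can compute the same value locally from your private key file to confirm the two keys actually match:

```shell
# Compute the SHA-256 fingerprint of the public key derived from your
# local private key; it should equal the base64 portion of the
# RSA_PUBLIC_KEY_FP value shown by DESC USER.
openssl rsa -in minoa_rsa_key.p8 -pubout -outform DER 2>/dev/null \
  | openssl dgst -sha256 -binary \
  | openssl enc -base64
```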
### Exports show 'completed with errors'
Some records may fail to flatten or write due to unexpected data formats. Check the Logs tab for specific error messages. Common causes include invalid date values or extremely long text fields. The export will still write all records that succeed.
### Data appears stale or not updating
- Check that the export status is Enabled in the Configuration tab.
- Verify the interval matches your expectations (e.g., daily exports run once per day).
- For Incremental exports, only records modified since the last successful run are included. If no records changed, the export will complete with 0 rows — this is expected.