Minoa’s Data Export feature lets you automatically sync your business cases and value realization trackers to your own data warehouse. Once configured, exports run on a schedule you choose — hourly, daily, weekly, or monthly — keeping your warehouse up to date with the latest Minoa data. This enables you to join Minoa data with your existing datasets, build custom dashboards, and run advanced analytics in your own BI tools.
Data Export is available as an early access feature. Contact your Minoa account team to enable it for your organization.

Supported Warehouses

Google BigQuery

Authenticate with a GCP service account JSON key file. Data is written directly to a BigQuery table in your project.

Snowflake

Authenticate with RSA key-pair credentials. Data is written via Snowflake’s SQL REST API to a table in your database.

Exportable Objects

| Object Type | Description | Columns |
| --- | --- | --- |
| Business Cases | All business cases in your workspace, flattened to one row per case | 24 columns including name, owner, account, CRM linkage, total value, ROI, cost, status, dates |
| Value Realization Trackers | All value trackers, flattened to one row per tracker | 18 columns including name, account, linked business case, value realized, status, contract terms |
You can preview the exact column schema during configuration by clicking View object schema below the object type selector.

Export Methods

| Method | Behavior | Best For |
| --- | --- | --- |
| Incremental | Only exports records created or updated since the last successful run | Large datasets where you want to minimize transfer volume |
| Full | Truncates the table and re-exports all records on every run | Smaller datasets or when you need a complete refresh |
| Upsert | Inserts new records and updates existing rows in-place using the record ID | Maintaining a single up-to-date copy of each record |

Setting Up BigQuery

1. Create a Service Account

1. Open the GCP Console
   Go to the Google Cloud Console and select the project where your target dataset lives.
2. Create a service account
   Navigate to IAM & Admin → Service Accounts → Create Service Account. Give it a descriptive name like minoa-data-export and click Create and Continue.
3. Grant permissions
   Assign the following roles to the service account:
   • BigQuery Data Editor (roles/bigquery.dataEditor) — allows creating tables and writing data
   • BigQuery Job User (roles/bigquery.jobUser) — allows running queries
   Click Continue → Done.
4. Generate a JSON key
   Click your new service account, go to the Keys tab, and click Add Key → Create new key → JSON. Download the JSON key file — you’ll upload this to Minoa in the next step.
Keep this file secure. It grants write access to your BigQuery dataset. Do not commit it to version control or share it publicly.
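For reference, the downloaded key file is a standard GCP service-account key: a JSON document shaped roughly like this (every value below is a placeholder, not a real credential):

```json
{
  "type": "service_account",
  "project_id": "my-project",
  "private_key_id": "<key id>",
  "private_key": "-----BEGIN PRIVATE KEY-----\n...\n-----END PRIVATE KEY-----\n",
  "client_email": "minoa-data-export@my-project.iam.gserviceaccount.com",
  "client_id": "<numeric id>",
  "token_uri": "https://oauth2.googleapis.com/token"
}
```

The client_email field is the identity you granted the BigQuery roles to, which is handy when auditing IAM bindings later.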

2. Create the Target Dataset

If you don’t already have a dataset for Minoa data, create one:
1. Go to BigQuery
   In the GCP Console, navigate to BigQuery → SQL Workspace.
2. Create a dataset
   Click your project name in the explorer panel, then click Create Dataset. Choose a dataset ID (e.g., minoa_exports), select a data location, and click Create Dataset.
You do not need to create the table manually — Minoa will create it automatically on the first export run using the correct schema.
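If you prefer the command line, the same dataset can be created with the bq tool from the Google Cloud SDK. A sketch, using example project and dataset IDs:

```shell
# Hypothetical CLI equivalent of the dataset-creation step above. The bq tool
# ships with the Google Cloud SDK; project ID, dataset ID, and location are
# examples -- replace them with your own values.
PROJECT_ID=my-project
DATASET_ID=minoa_exports

if command -v bq >/dev/null 2>&1; then
  bq mk --dataset --location=US "${PROJECT_ID}:${DATASET_ID}"
else
  echo "bq CLI not found; create the dataset in the console instead"
fi
```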

Minimum Permissions Summary

| Permission | Role | Why |
| --- | --- | --- |
| bigquery.tables.create | BigQuery Data Editor | Create the export table on first run |
| bigquery.tables.updateData | BigQuery Data Editor | Insert, update, and truncate rows |
| bigquery.tables.get | BigQuery Data Editor | Read table metadata |
| bigquery.jobs.create | BigQuery Job User | Execute queries |

Setting Up Snowflake

1. Generate an RSA Key Pair

Snowflake’s SQL REST API uses key-pair authentication. You’ll generate an RSA key pair, register the public key with your Snowflake user, and provide the private key to Minoa.
1. Generate a private key
   Run the following command in your terminal:
   openssl genrsa 2048 | openssl pkcs8 -topk8 -inform PEM -out minoa_rsa_key.p8 -nocrypt
   This creates an unencrypted PKCS#8 private key file called minoa_rsa_key.p8.
   Keep this private key secure. It authenticates write access to your Snowflake database. Do not commit it to version control or share it publicly.
2. Extract the public key
   openssl rsa -in minoa_rsa_key.p8 -pubout -out minoa_rsa_key.pub
3. Get the public key value
   Copy the public key content without the header and footer lines:
   grep -v "PUBLIC KEY" minoa_rsa_key.pub | tr -d '\n'
   Copy the output — you’ll need it in the next step.
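The three steps above can be combined into one script, with a final check that the extracted public key actually belongs to the private key (requires the openssl CLI; pubkey.txt is an arbitrary scratch file name of our own, not part of the guide):

```shell
# Steps 1-3 from this guide in one pass, plus a sanity check. Key file names
# match the guide; pubkey.txt (scratch output) is our own addition.
openssl genrsa 2048 2>/dev/null | openssl pkcs8 -topk8 -inform PEM -out minoa_rsa_key.p8 -nocrypt
openssl rsa -in minoa_rsa_key.p8 -pubout -out minoa_rsa_key.pub 2>/dev/null
grep -v "PUBLIC KEY" minoa_rsa_key.pub | tr -d '\n' > pubkey.txt

# Re-derive the public key from the private key; it should be identical to
# minoa_rsa_key.pub if the pair belongs together.
openssl rsa -in minoa_rsa_key.p8 -pubout 2>/dev/null | diff - minoa_rsa_key.pub \
  && echo "key pair matches"
```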

2. Create a Snowflake User and Assign the Key

1. Create a dedicated user (recommended)
   Connect to your Snowflake account as an admin and run:
   CREATE USER minoa_export_user
       DEFAULT_ROLE = minoa_export_role
       DEFAULT_WAREHOUSE = COMPUTE_WH;
2. Assign the public key
   ALTER USER minoa_export_user SET RSA_PUBLIC_KEY='<paste your public key here>';
   Paste the public key value (without header/footer) that you copied in the previous section.
3. Create a role with minimum privileges

CREATE ROLE minoa_export_role;

-- Grant usage on warehouse, database, and schema
GRANT USAGE ON WAREHOUSE COMPUTE_WH TO ROLE minoa_export_role;
GRANT USAGE ON DATABASE MINOA_DB TO ROLE minoa_export_role;
GRANT USAGE ON SCHEMA MINOA_DB.PUBLIC TO ROLE minoa_export_role;

-- Grant table-level permissions on the target schema
GRANT CREATE TABLE ON SCHEMA MINOA_DB.PUBLIC TO ROLE minoa_export_role;
GRANT SELECT, INSERT, UPDATE, DELETE, TRUNCATE ON ALL TABLES IN SCHEMA MINOA_DB.PUBLIC TO ROLE minoa_export_role;
GRANT SELECT, INSERT, UPDATE, DELETE, TRUNCATE ON FUTURE TABLES IN SCHEMA MINOA_DB.PUBLIC TO ROLE minoa_export_role;

-- Assign the role to the user
GRANT ROLE minoa_export_role TO USER minoa_export_user;
Replace COMPUTE_WH, MINOA_DB, and PUBLIC with your actual warehouse, database, and schema names.
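Before wiring the credentials into Minoa, you can optionally smoke-test them yourself with Snowflake's snowsql CLI, if you have it installed. A sketch, reusing the example account identifier, user, and key path from this guide:

```shell
# Optional connectivity check (sketch only). snowsql is Snowflake's CLI
# client; the account identifier, user, and key path are this guide's
# example values -- substitute your own.
QUERY='SELECT CURRENT_USER(), CURRENT_ROLE();'
if command -v snowsql >/dev/null 2>&1; then
  snowsql -a orgname-accountname -u minoa_export_user \
    --private-key-path minoa_rsa_key.p8 -q "$QUERY"
else
  echo "snowsql not installed; skipping connectivity check"
fi
```

If the grants above are in place, the query should report minoa_export_role as the current role.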

Minimum Permissions Summary

| Permission | Granted On | Why |
| --- | --- | --- |
| USAGE | Warehouse | Execute queries |
| USAGE | Database | Access the database |
| USAGE | Schema | Access the schema |
| CREATE TABLE | Schema | Create the export table on first run |
| SELECT | Tables | Read data for MERGE operations |
| INSERT | Tables | Write new rows |
| UPDATE | Tables | Update existing rows (upsert) |
| DELETE | Tables | Required for MERGE operations |
| TRUNCATE | Tables | Clear table for full exports |

Configuring an Export in Minoa

1. Navigate to Data Export
   In Minoa, go to Settings → Data Export and click New Export.
2. Name your export
   Enter a descriptive name, e.g., BigQuery - Business Cases or Snowflake - Value Trackers.
3. Select your warehouse
   Choose BigQuery or Snowflake.
4. Provide credentials
   • BigQuery: Upload the service account JSON key file you downloaded earlier.
   • Snowflake: Enter your Snowflake username and paste the contents of the private key file (minoa_rsa_key.p8).
5. Configure the destination
   • BigQuery: Enter your GCP Project ID, Dataset ID, and a Table ID (e.g., business_cases).
   • Snowflake: Enter your Account identifier (e.g., orgname-accountname), Database, Schema, Table, and Warehouse.
6. Choose what to export
   Select the object type (Business Cases or Value Realization Trackers), set the export interval, and pick an export method.
7. Test and save
   Use the Start test button to verify your configuration before saving. Check the Runs tab to confirm the test completed successfully.
Once saved and enabled, your export will run automatically on the schedule you configured.

Monitoring Exports

Runs Tab

The Runs tab shows a history of all export executions:
  • Status — Green (completed), yellow (completed with errors), red (failed), or blue (running)
  • Rows Exported — Number of records written to the warehouse
  • Duration — How long the export took
  • Errors — Count of any errors encountered

Logs Tab

The Logs tab provides detailed log entries across all runs. You can filter by:
  • Level — INFO, WARN, or ERROR
  • Search — Search log messages or filter by run ID

Troubleshooting

If runs fail with permission errors, verify that your service account (BigQuery) or Snowflake user has the minimum permissions listed above. For BigQuery, ensure both BigQuery Data Editor and BigQuery Job User roles are assigned. For Snowflake, confirm the role grants include USAGE on the warehouse, database, and schema.
If the export cannot create or write the table: Minoa creates the target table on the first export run, so ensure your credentials have CREATE TABLE permission on the target dataset (BigQuery) or schema (Snowflake). If the table already exists with a different schema, the export may fail — drop or rename the existing table and let Minoa recreate it.
If Snowflake authentication fails:
  • Confirm the private key is in PKCS#8 PEM format (starts with -----BEGIN PRIVATE KEY-----).
  • Verify the matching public key is registered on the Snowflake user: run DESC USER minoa_export_user and check the RSA_PUBLIC_KEY property.
  • Ensure the account identifier matches exactly (e.g., orgname-accountname). You can find this in your Snowflake URL: https://<account>.snowflakecomputing.com.
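One concrete way to compare keys: compute the SHA-256 fingerprint of your public key locally and compare it with the RSA_PUBLIC_KEY_FP property that DESC USER reports (Snowflake shows it as SHA256:<base64>). The sketch below generates a throwaway key pair purely for illustration; point the final command at your real minoa_rsa_key.pub instead:

```shell
# Illustration: make a throwaway key pair, then fingerprint the public key.
# Compare the base64 output (with a "SHA256:" prefix) against the
# RSA_PUBLIC_KEY_FP value from DESC USER in Snowflake. Requires openssl.
openssl genrsa 2048 2>/dev/null | openssl pkcs8 -topk8 -inform PEM -out demo_rsa_key.p8 -nocrypt
openssl rsa -in demo_rsa_key.p8 -pubout -out demo_rsa_key.pub 2>/dev/null

openssl rsa -pubin -in demo_rsa_key.pub -outform DER 2>/dev/null \
  | openssl dgst -sha256 -binary | openssl enc -base64
```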
If a run completes with errors, some records may have failed to flatten or write due to unexpected data formats. Check the Logs tab for specific error messages; common causes include invalid date values or extremely long text fields. The export will still write all records that succeed.
If an export does not run when you expect it to:
  • Check that the export status is Enabled in the Configuration tab.
  • Verify the interval matches your expectations (e.g., daily exports run once per day).
  • For Incremental exports, only records modified since the last successful run are included. If no records changed, the export will complete with 0 rows — this is expected.