Snowflake Integration

Integrating Snowflake with Planful allows you to load data directly from Snowflake to Planful for more efficient financial planning and analysis. This integration requires setting up a profile within Planful to establish a connection with Snowflake.

Note:

Contact your Account Manager to enable the Snowflake Integration feature.

Once the profile is created, you can define Data Load Rules (DLRs) to transfer data from Snowflake to Planful for further processing.

How to Configure Snowflake in Planful?

Configuring Snowflake in Planful begins with creating a profile. To create a new profile, do the following:

  1. Navigate to Maintenance > Administration > Configuration Tasks > Data Integration Configuration > Cloud Services.

  2. Click Snowflake.

  3. Click New Profile.

    Notes:

    • You can create multiple profiles.

    • Created profiles are displayed under Existing Profiles (visible only if at least one profile exists).

  4. Enter the Profile Name.

  5. Enter the Account Name - provide the account identifier as defined in Snowflake.

  6. Enter the Database Name - provide the name of the Snowflake database you want to connect to.

  7. Enter the Warehouse Name - provide the name of the warehouse that will run the compute for the data load.

  8. Select an authentication method:

    1. Basic: Enter Username and Password.

    2. Key Pair: Enter Username, Private Key, and Key Phrase (see the sketch after these steps for registering the matching public key in Snowflake).

  9. Click Create.

    Once the profile is created, click Add Data Load Rule to create a new Data Load Rule to load data from Snowflake to Planful. Read more about adding a DLR to load data from Snowflake.
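
If you choose Key Pair authentication, the public half of the key pair must also be registered on the Snowflake user that Planful connects as. The following is a minimal sketch of that Snowflake-side step; the user name PLANFUL_SVC and the key value are hypothetical placeholders:

    -- Run as a role with sufficient privileges on the user (e.g., SECURITYADMIN).
    -- PLANFUL_SVC and the key value are placeholders; use your own service user
    -- and the public key generated alongside the private key entered in Planful.
    ALTER USER PLANFUL_SVC SET RSA_PUBLIC_KEY = 'MIIBIjANBgkqh...';

    -- Confirm the key was registered by inspecting the user's properties.
    DESC USER PLANFUL_SVC;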

How to Edit an Existing Snowflake Profile?

Editing an existing Snowflake profile allows users to update credentials or connection details as needed. To edit an existing Snowflake profile, do the following:

  1. Navigate to Maintenance > Administration > Configuration Tasks > Data Integration Configuration > Cloud Services.

  2. Click Snowflake.

  3. Hover over an existing profile and click the Edit icon.

    Note:

    To delete a profile, click the Delete icon and then click Delete. A profile cannot be deleted while any DLRs are associated with it.

  4. On the Profile page, you can edit/update the details as required.

  5. Once updated, click Save.

How to Create a DLR to Load Data from Snowflake?

Users can load GL, transaction, ESM, and segment hierarchy data from Snowflake to Planful using DLRs. To create a Data Load Rule (DLR) to load data from Snowflake, do the following:

  1. Navigate to Maintenance > Data Integration > Data Load Rules.

  2. Go to New Data Load Rule.

  3. Enter Name and Description.

  4. Select Load Type as Snowflake and select a Profile.

  5. Select Load Item.

  6. Select Load Sub Item and click Next.

  7. In the Query section, enter an SQL query that selects the data you want to load (a sample query follows this procedure).

  8. Fill in the details as required on the Define Overall Rule Settings, Manipulate Input File, and Define Data Mappings steps.

    To learn more about the field explanations, click here.

  9. Click Save.

    To load the data, go to Cloud Scheduler and run the DLR task or set up a recurring schedule. To learn about scheduling a DLR process, click here.
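
For step 7, the query can be as simple as a filtered SELECT. The sketch below pulls recent GL activity; the database, schema, table, and column names are hypothetical placeholders for your own Snowflake objects:

    -- Hypothetical GL extract; replace names with your own objects.
    SELECT
        account_code,
        department_code,
        posting_date,
        amount
    FROM FIN_DB.GL.GL_TRANSACTIONS                            -- fully qualified name
    WHERE posting_date >= DATEADD(month, -2, CURRENT_DATE)    -- last 2 months only
      AND entry_type <> 'INTERCOMPANY';                       -- exclude inter-company entries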

Best Practices

Managing and Validating SQL Queries:

  • Testing Queries: It’s best to test your SQL outside Planful first. Use Snowflake’s worksheet/web UI or any SQL editor to run the query and inspect results. Verify that the data returned looks correct (e.g., correct date range, no duplicates, expected number of records). This helps catch issues early (like a filter that’s excluding too much or not enough).

  • Performance Considerations: GL tables can be large (millions of rows). Use filters (WHERE clauses) to limit data whenever possible. Also, select only the columns you truly need (for example, if Planful only needs the amount and account, there is no need to select every column). This reduces the data volume, speeding up the transfer. Snowflake’s warehouses can handle large queries, but unneeded large result sets will slow down the import.

  • Use of Views or Stored Logic: If your GL data requires joins or calculations (for example, joining account codes to descriptions or converting currencies), consider the following (see the sketch after this list):

    • One convenient approach is to create a view in Snowflake that encapsulates this logic (note that Snowflake materialized views do not support joins, so a standard view is the right choice when joins are involved). For example, a view v_gl_transactions_filtered joins the chart of accounts and filters out specific system entries. You can then SELECT * FROM v_gl_transactions_filtered in Planful.

    • The benefit is twofold:

      • Snowflake can optimize the view’s query, and

      • If you need to change the logic, you can update the view in Snowflake (which is often version-controlled via SQL scripts or a tool like dbt) rather than modifying the query in the Planful UI. This provides better change management for complex logic.

  • Version Control and Documentation: Treat the SQL query powering your Planful import as a piece of source code for your data pipeline.

    • Document what each query does (e.g., This query pulls the last 2 months of GL transactions, excluding inter-company entries). This helps future maintainers and auditors understand the data lineage.

  • Validation: After Planful imports the data, do a quick reconciliation against Snowflake. For instance, if you imported a sum of amounts by month, cross-check that the total matches between Planful and a Snowflake query (the reconciliation query after this list shows one way to do this). This ensures the data was transferred correctly and completely.
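
To make the view and validation practices concrete, here is a sketch under assumed names (FIN_DB.GL.GL_TRANSACTIONS, CHART_OF_ACCOUNTS, and the view itself are hypothetical):

    -- A view that encapsulates the join and filter logic in Snowflake.
    CREATE OR REPLACE VIEW FIN_DB.GL.V_GL_TRANSACTIONS_FILTERED AS
    SELECT
        t.account_code,
        a.account_description,
        t.posting_date,
        t.amount
    FROM FIN_DB.GL.GL_TRANSACTIONS t
    JOIN FIN_DB.GL.CHART_OF_ACCOUNTS a
      ON a.account_code = t.account_code
    WHERE t.entry_type <> 'SYSTEM';    -- filter out system entries

    -- The Planful DLR query can then stay trivial:
    -- SELECT * FROM FIN_DB.GL.V_GL_TRANSACTIONS_FILTERED;

    -- Reconciliation: monthly totals and row counts to cross-check against
    -- what Planful reports after the load.
    SELECT
        DATE_TRUNC('month', posting_date) AS period,
        SUM(amount)                       AS total_amount,
        COUNT(*)                          AS row_count
    FROM FIN_DB.GL.V_GL_TRANSACTIONS_FILTERED
    GROUP BY 1
    ORDER BY 1;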

SQL Query Management in Planful:

Planful provides an interface for entering the SQL. When you save the import configuration, Planful validates the SQL (checking syntax or running a small preview). If there is an error, such as a syntax error or a privilege issue, Planful displays it.

Common mistakes include missing a semicolon (if required) and not fully qualifying table names. It's best to use database.schema.table in cross-database queries, since Planful's session might not have a context set unless you configure a default database/schema in the connection, as illustrated below.
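
For example, when testing in a Snowflake worksheet (object names hypothetical):

    -- Unqualified: depends on the session's current database/schema.
    -- SELECT * FROM GL_TRANSACTIONS;

    -- Fully qualified: unambiguous regardless of session defaults.
    SELECT * FROM FIN_DB.GL.GL_TRANSACTIONS;

    -- Or set the context explicitly before using unqualified names:
    USE SCHEMA FIN_DB.GL;
    SELECT * FROM GL_TRANSACTIONS;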

In summary, writing SQL for GL data is about selecting the necessary data and filtering out the rest. Keep queries as simple as possible, test them thoroughly, and leverage Snowflake’s power (through views or aggregations) to lighten the load on Planful. By managing these queries carefully and possibly version-controlling them, you ensure that the data import process is reliable and transparent.

