Multi-Cloud provides a native integration with cross-cloud systems to unlock data products at the speed your business needs them.
- IT alone cannot realize the full potential of data in Snowflake. A broad, inclusive approach that involves every line of business (LOB) is required, and not all valuable data is in Snowflake or controlled by IT yet.
- Increased stress on, and demands of, data teams. Business users expect access to data products in a timely manner, increasing the pressure to deliver.
- Rigidity and friction in data processes. Rigid processes and tooling prevent databases from being built quickly and flexibly, taking too much time and effort.
Multi-Cloud can help with these needs and more.
To enable Domo's Multi-Cloud Data Fabric, contact your Customer Service Representative or Account Executive. You'll then be able to connect your existing data warehouses from these providers:
- Other data providers coming soon
Each data platform has its own unique requirements. See the associated articles for more details.
Multi-Cloud Data Fabric allows you to specify a default computation cloud account for Adrenaline DataFlows and a default storage account for all DataSets.
- Default storage account: Used as a default for all new DataSets including Connectors and DataFlow outputs.
- Default computation account: Used as the default account for computation of Adrenaline DataFlows. See Creating an Adrenaline DataFlow for more details.
Changing these settings does not change existing DataSets or DataFlows. These defaults are only used for new assets.
Security & Roles
After Multi-Cloud has been enabled in your instance, you can assign new Multi-Cloud grants to manage access to these features:
Manage Cloud Accounts
Users with the Manage Cloud Accounts grant can add cloud accounts and add tables to a cloud account, with the ability to remove both coming soon. This grant also allows changing the default compute and storage clouds.
Override Default Cloud
Users with the Override Default Cloud grant can choose any compute or storage cloud account that has been shared with them. This lets users who understand your third-party compute cost structure and know how to optimize queries set the storage or compute cloud account that works best for each asset.
Working with Data
Tables that already exist in your Snowflake warehouse can be registered in Domo's Data Center through 'Add Accounts'. Only users with the Manage Cloud Accounts grant can use this feature.
In 'Add Accounts', select a database or warehouse, a schema, and the table(s) to add to the Data Center. These tables are not copied to Domo; rather, they are registered in Domo's Data Center so they're immediately available for DataFlows and cards.
Tables that have already been registered cannot be selected again. To remove an existing table from your Domo Data Center, delete the DataSet using one of Domo's existing methods.
Domo assumes all date/time columns are stored in UTC. If they are not, dates and times will display incorrectly in Domo.
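Because registered tables are read in place, timestamps should be converted to UTC before they land in the source table. A minimal sketch of that conversion in Python, using a hypothetical source time zone (America/Denver) for illustration:

```python
from datetime import datetime
from zoneinfo import ZoneInfo  # Python 3.9+ standard library

# Hypothetical example: a row timestamp recorded in US Mountain Time.
local_ts = datetime(2023, 6, 1, 9, 30, tzinfo=ZoneInfo("America/Denver"))

# Convert to UTC before the value is written to the table you register,
# so Domo displays the correct date/time.
utc_ts = local_ts.astimezone(ZoneInfo("UTC"))

print(utc_ts.isoformat())  # 2023-06-01T15:30:00+00:00
```

The same normalization can be done in your warehouse's SQL layer instead; the key point is that whatever lands in the registered column is already UTC.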
When creating a DataSet via a connector, users with the Override Default Cloud grant can choose the cloud account in which to store the data. Users without this grant always write to the default storage cloud in effect at the time the DataSet is created.
When creating a DataFlow, users with the Override Default Cloud grant can choose a cloud account to write the output set to. Users without this grant will always write to the default storage cloud.
The storage cloud is set independently for each DataSet and can only be specified when creating a DataFlow or adding a new output DataSet.
Since Adrenaline DataFlows execute directly against your data where it's hosted, users with the Override Default Cloud grant can also specify a compute cloud. Users without this grant will use the Default Compute Cloud specified at the time the Adrenaline DataFlow is created.
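The grant behavior described above amounts to a simple resolution rule: a holder of the Override Default Cloud grant may pick a shared cloud account, and everyone else falls back to the default in effect when the asset is created. A sketch with hypothetical names (not Domo's actual implementation):

```python
def effective_cloud(user_grants, requested_cloud, default_cloud):
    """Resolve which cloud account an asset uses at creation time.

    Illustrative only: users holding the Override Default Cloud grant may
    pick any cloud account shared with them; all other users get the
    default that is configured when the asset is created.
    """
    if "Override Default Cloud" in user_grants and requested_cloud is not None:
        return requested_cloud
    return default_cloud

# A user without the grant always lands on the default storage cloud.
print(effective_cloud(set(), "snowflake-acct-1", "domo"))
# A grant holder's explicit choice wins.
print(effective_cloud({"Override Default Cloud"}, "snowflake-acct-1", "domo"))
```

The same rule applies to both the storage cloud (connectors and DataFlow outputs) and the compute cloud (Adrenaline DataFlows).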
The compute cloud account determines where your transformations are computed. For example, if your compute cloud account is a Snowflake account, transformation is performed in Snowflake. This is valuable when the input DataSets to your Adrenaline DataFlow are hosted in the same cloud. Using that cloud for computation reduces execution time by eliminating the need to query and stage data across systems. See the table below for general guidelines. Note that computation on an external cloud account, like a Snowflake account, may be subject to compute charges from that provider. Please contact your account representative from the provider for more details.
The compute cloud account can be changed at any time and will impact the next execution of the DataFlow. Only admin users and DataFlow owners can change this setting. Users without the admin role must have the Override Default Cloud grant.
As a general practice, we recommend choosing a compute cloud based on the following:
|Input DataSet storage cloud|Output DataSet storage cloud|Recommended compute cloud for optimal performance|
|---|---|---|
|All in Domo|In Domo|Domo|
|All in Snowflake|In Snowflake|Snowflake|
|Mix of Domo and Snowflake|Domo or Snowflake|The cloud account that hosts the majority of your input data, or your output data cloud|
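The guidelines in this table can be expressed as a small selection rule. This is an illustrative sketch of the recommendation, not product behavior; the function name and cloud labels are assumptions:

```python
from collections import Counter

def recommended_compute_cloud(input_clouds, output_cloud):
    """Suggest a compute cloud per the guidelines above: if all inputs and
    the output live in one cloud, compute there; for a mix, prefer the
    cloud hosting the majority of input data, else the output cloud."""
    distinct = set(input_clouds) | {output_cloud}
    if len(distinct) == 1:
        return output_cloud
    counts = Counter(input_clouds)
    top_cloud, top_count = counts.most_common(1)[0]
    # A strict majority of input DataSets decides; ties fall back to
    # the output DataSet's storage cloud.
    if top_count > len(input_clouds) / 2:
        return top_cloud
    return output_cloud

print(recommended_compute_cloud(["Domo", "Domo"], "Domo"))                        # Domo
print(recommended_compute_cloud(["Snowflake", "Snowflake", "Domo"], "Snowflake")) # Snowflake
print(recommended_compute_cloud(["Domo", "Snowflake"], "Domo"))                   # Domo
```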
Other DataFlows compute in the Domo cloud and will query your hosting cloud for data. Depending on the operations performed in a DataFlow, this may result in multiple queries being executed against your hosting cloud.
Does Multi-Cloud work with Bring Your Own Key (BYOK)?
Not at this time, although we are exploring integration for a future release.