Data sitting in a cloud data warehouse such as Snowflake, BigQuery, or Redshift can be sent to MadKudu via an Amazon S3 bucket.
To send data from Redshift, please refer to this dedicated article.
Prerequisites
- You have an AWS account
- You can create and write to an S3 bucket, and manage IAM roles for that bucket
To let MadKudu use your data in Snowflake or BigQuery, follow the steps below.
Step 1: Identify the data in Snowflake or BigQuery to send to MadKudu
Depending on what you are trying to do with MadKudu, you may need to send us behavioral data or enrichment attributes for your users.
Please follow these instructions to understand which data to send and in which format; a rough sketch of a typical behavioral data file is shown below. Don't hesitate to reach out to our support team for help with this setup.
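As an illustration only (the exact columns MadKudu expects depend on your setup, so treat the schema here as an assumption), behavioral data is usually one row per user event, keyed by email and timestamp. A minimal Python sketch that produces such a file:

```python
import csv

# Illustrative schema only: one row per event, keyed by user email.
# Confirm the exact columns with MadKudu's data format instructions.
sample_events = [
    {"email": "jane@acme.com", "event": "signed_up", "timestamp": "2023-05-01T09:12:00Z"},
    {"email": "jane@acme.com", "event": "created_project", "timestamp": "2023-05-02T14:30:00Z"},
    {"email": "john@initech.io", "event": "invited_teammate", "timestamp": "2023-05-03T11:05:00Z"},
]

with open("events.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["email", "event", "timestamp"])
    writer.writeheader()
    writer.writerows(sample_events)
```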
Step 2: Create an Amazon S3 bucket
- Go to your AWS Management Console
- Go to S3 service from Services > Storage > S3
- Click Create bucket (Amazon S3 > Create bucket)
- Fill in the form and make note of your bucket name and region
- You can use any bucket name, for example: my-madkudu-shared-bucket (if you prefer to script this step, see the sketch below)
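Here is a minimal scripted alternative using boto3; the bucket name and region are placeholders to replace with your own values.

```python
import boto3

# Placeholders: use your own bucket name and AWS region.
BUCKET_NAME = "my-madkudu-shared-bucket"
REGION = "us-west-2"

s3 = boto3.client("s3", region_name=REGION)

# Create the bucket. Note: for us-east-1, omit CreateBucketConfiguration.
s3.create_bucket(
    Bucket=BUCKET_NAME,
    CreateBucketConfiguration={"LocationConstraint": REGION},
)
print(f"Created bucket {BUCKET_NAME} in {REGION}")
```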
Step 3: Copy your data warehouse data to your S3 bucket
Use a native function from your data warehouse (sketched below for Snowflake) or a user-friendly, no-code tool like Hightouch or Census to send the relevant data to your S3 bucket.
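For the native-function route, Snowflake can unload a table straight to S3 with COPY INTO <location>. Below is a minimal sketch using the snowflake-connector-python package, assuming a storage integration is already configured in Snowflake; all names and credentials are placeholders.

```python
import snowflake.connector

# Placeholders: substitute your own Snowflake account and credentials.
conn = snowflake.connector.connect(
    user="YOUR_USER",
    password="YOUR_PASSWORD",
    account="YOUR_ACCOUNT",
    warehouse="YOUR_WAREHOUSE",
    database="YOUR_DATABASE",
    schema="YOUR_SCHEMA",
)

# Unload a table of user events to the shared S3 bucket as headered CSV.
# Assumes a Snowflake storage integration (here: my_s3_integration) already
# grants Snowflake write access to the bucket.
unload_sql = """
COPY INTO 's3://my-madkudu-shared-bucket/events/'
FROM user_events
STORAGE_INTEGRATION = my_s3_integration
FILE_FORMAT = (TYPE = CSV COMPRESSION = NONE)
HEADER = TRUE
"""
conn.cursor().execute(unload_sql)
conn.close()
```

BigQuery has no equivalent native export directly to S3: data is typically exported to Google Cloud Storage first and then transferred to S3, or moved in one step with a tool like Hightouch or Census.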
Step 4: Set up the S3 integration with MadKudu
Follow MadKudu's instructions here to grant MadKudu access to your S3 bucket via an IAM role; a sketch of the kind of bucket policy such a setup involves is shown below.
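For orientation only: the authoritative steps (including the exact principal and any external ID) are in MadKudu's instructions linked above. The sketch below is a hypothetical read-only bucket policy applied with boto3; the role ARN is a placeholder, not MadKudu's real identity.

```python
import json
import boto3

# Hypothetical sketch: the actual principal ARN (and any external ID)
# comes from MadKudu's IAM setup instructions, not from this example.
BUCKET_NAME = "my-madkudu-shared-bucket"
MADKUDU_ROLE_ARN = "arn:aws:iam::123456789012:role/madkudu-access"  # placeholder

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowMadKuduRead",
            "Effect": "Allow",
            "Principal": {"AWS": MADKUDU_ROLE_ARN},
            # Read-only access: list the bucket and fetch objects.
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                f"arn:aws:s3:::{BUCKET_NAME}",
                f"arn:aws:s3:::{BUCKET_NAME}/*",
            ],
        }
    ],
}

s3 = boto3.client("s3")
s3.put_bucket_policy(Bucket=BUCKET_NAME, Policy=json.dumps(policy))
```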
Once set up, please reach out to our support team to finalize the setup and start pulling data from your S3 bucket.
Any questions? Check out the FAQ for our S3 integration.