Snowflake connector (DataStage)

Use the Snowflake connector to connect to a Snowflake data warehouse and perform data access operations such as reading, writing, bulk loading, and metadata import.

When to use the Snowflake connector

Use the Snowflake connector to perform the following operations:
  • Read data from or write data to tables in the Snowflake data warehouse.
  • Bulk load data into a table in the Snowflake data warehouse (a sketch of the staged load flow follows this list).
  • Insert or bulk load data into multiple tables at a time by using the Multiple input links functionality.
  • Look up records from a table in the Snowflake data warehouse.
  • Import metadata from the Snowflake data warehouse through InfoSphere Metadata Asset Manager (IMAM).
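
Bulk load is not a row-by-row write: the data is first staged as files and then loaded in one set-based operation with Snowflake's COPY INTO command. The following is a minimal sketch of that staged flow using the Snowflake Python connector; the connection parameters, table name, file path, and file format are illustrative assumptions, not values from this documentation.

  import snowflake.connector

  conn = snowflake.connector.connect(  # placeholder credentials
      account="myorg-myaccount", user="MY_USER", password="MY_PASSWORD",
      warehouse="MY_WAREHOUSE", database="MY_DATABASE", schema="PUBLIC",
  )
  cur = conn.cursor()

  # Stage a local CSV file into the table's internal stage (@%SALES),
  # then bulk load it in a single COPY INTO operation.
  cur.execute("PUT file:///tmp/sales.csv @%SALES")
  cur.execute(
      "COPY INTO SALES FROM @%SALES "
      "FILE_FORMAT = (TYPE = CSV SKIP_HEADER = 1)"
  )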

Prerequisite

Create the connection. For instructions, see Connecting to a data source in DataStage® and the Snowflake connection.
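
The connection asset stores the account, credentials, warehouse, database, and schema that the connector uses. For orientation, a minimal direct connection with the Snowflake Python connector looks roughly like the sketch below; every parameter value is a placeholder, not a value taken from this documentation.

  import snowflake.connector

  # Minimal connection sketch -- all values are placeholders.
  conn = snowflake.connector.connect(
      account="myorg-myaccount",   # Snowflake account identifier
      user="MY_USER",
      password="MY_PASSWORD",
      warehouse="MY_WAREHOUSE",
      database="MY_DATABASE",
      schema="MY_SCHEMA",
  )

  # Sanity check that the session is live.
  print(conn.cursor().execute("SELECT CURRENT_VERSION()").fetchone()[0])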

Writing data to the Snowflake connector as a target

To configure the Snowflake connector for DataStage as a target, complete the following steps:

  1. Double-click the Snowflake connector icon in the project canvas.
  2. On the Stage tab, in the Properties section, set your connection details (if you did not create the connection earlier). In the Advanced section, set the execution and combinability modes, and select whether to preserve the data partitioning.
  3. On the Input tab, choose the Input name. Under the Columns section, edit the input columns. Under the Partitioning section, choose the Partition type option. Under the Advanced section, choose the Buffering mode option.
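
For orientation, the row-based write that step 3 configures corresponds to a parameterized batch insert. A minimal sketch with the Snowflake Python connector follows; the connection parameters, table, columns, and rows are hypothetical.

  import snowflake.connector

  conn = snowflake.connector.connect(  # placeholder credentials
      account="myorg-myaccount", user="MY_USER", password="MY_PASSWORD",
      warehouse="MY_WAREHOUSE", database="MY_DATABASE", schema="PUBLIC",
  )

  # Rows that would arrive on the input link (hypothetical data).
  rows = [(1, "alpha"), (2, "beta"), (3, "gamma")]

  cur = conn.cursor()
  # Parameterized batch insert: one round trip per batch rather than
  # one statement per row. MY_TABLE is a hypothetical target table.
  cur.executemany("INSERT INTO MY_TABLE (ID, NAME) VALUES (%s, %s)", rows)
  conn.commit()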

Reading data from the Snowflake connector as a source

The available properties for reading data depend on whether you select Use DataStage properties.

To configure the Snowflake connector for DataStage as a source when the Use DataStage properties checkbox is selected, complete the following steps:

  1. Double-click the Snowflake connector icon in the project canvas.
  2. On the Stage tab, in the Properties section, set your connection details (if you did not create the connection earlier). In the Advanced section, set the execution and combinability modes, and select whether to preserve the data partitioning.
  3. On the Output tab, choose the Output name. Under the Usage section:
    1. In General source properties, select the Generate SQL at run time checkbox. When it is set to Yes, provide the name of the table that you want to access in the data source. In Additional properties, provide the JDBC properties. Set the Enable quoted identifiers checkbox to Yes if you want the specified table and column names to be enclosed in quotation marks (see the sketch after this procedure).
    2. In the Transaction, Begin and end, Session, Before and after, and Limits sections, specify the required information and select the checkboxes that you need.
  4. Under the Columns section, edit the output columns. Under the Partitioning section, choose the Partition type option. Under the Advanced section, choose the Buffering mode option.
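
To illustrate what step 3 generates, the sketch below builds the SELECT statement both without and with quoted identifiers, using a hypothetical table and column names. Unquoted identifiers are folded to upper case by Snowflake, so quoting matters for mixed-case names or names that contain spaces.

  # Hypothetical table and column names.
  table = "MySchema.MixedCase"
  columns = ["id", "Full Name"]

  def quote(name: str) -> str:
      # Snowflake quoted identifier: wrap in double quotes and
      # double any embedded double-quote characters.
      return '"' + name.replace('"', '""') + '"'

  # Enable quoted identifiers = No: names pass through unquoted, so
  # Snowflake folds them to upper case ("Full Name" would not parse).
  plain_sql = "SELECT " + ", ".join(columns) + " FROM " + table

  # Enable quoted identifiers = Yes: every name part is double-quoted,
  # which preserves case and allows spaces.
  quoted_sql = (
      "SELECT " + ", ".join(quote(c) for c in columns)
      + " FROM " + ".".join(quote(p) for p in table.split("."))
  )

  print(plain_sql)   # SELECT id, Full Name FROM MySchema.MixedCase
  print(quoted_sql)  # SELECT "id", "Full Name" FROM "MySchema"."MixedCase"
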
To configure the Snowflake connector for DataStage as a source when the Use DataStage properties checkbox is cleared, complete the following steps:
  1. Double-click the Snowflake connector icon in the project canvas.
  2. On the Stage tab, in the Properties section, set your connection details (if you did not create the connection earlier). In the Advanced section, set the execution and combinability modes, and select whether to preserve the data partitioning.
  3. On the Output tab, choose the Output name. Under the Usage section:
    1. In General source properties, select the Generate unicode type columns checkbox if you use runtime column propagation. Choose the Read method option and provide the rest of the required information.
    2. In the Limits, Sampling, and Before and after SQL sections, specify the required information and select the checkboxes that you need.
  4. Under the Columns section, edit the output columns. Under the Partitioning section, choose the Partition type option. Under the Advanced section, choose the Buffering mode option.
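
For orientation, a direct read comparable to choosing a Select read method with a row limit is sketched below with the Snowflake Python connector, fetching in fixed-size batches; the connection parameters, table, columns, and limits are hypothetical.

  import snowflake.connector

  conn = snowflake.connector.connect(  # placeholder credentials
      account="myorg-myaccount", user="MY_USER", password="MY_PASSWORD",
      warehouse="MY_WAREHOUSE", database="MY_DATABASE", schema="PUBLIC",
  )

  cur = conn.cursor()
  # Select read capped at 1000 rows, comparable to a Select read
  # method plus a Limits setting. MY_TABLE is hypothetical.
  cur.execute("SELECT ID, NAME FROM MY_TABLE LIMIT 1000")

  while True:
      batch = cur.fetchmany(200)  # stream rows in fixed-size batches
      if not batch:
          break
      for row in batch:
          print(row)  # stand-in for downstream link processing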