Snowflake Modules

Integrating industrial and operational data with Snowflake enables powerful analytics, reporting, and scalable data storage. However, handling real-time ingestion, batch uploads, and database operations consistently and efficiently can be complex. Crosser simplifies this with dedicated Snowflake modules that support both streaming and batch data pipelines, as well as direct query execution and data retrieval.

To address these needs, we provide a set of Snowflake modules designed for different data handling patterns:

  • Snowflake Streaming Publisher: Streams data directly into Snowflake tables in real time using the Snowpipe Streaming API, making it suitable for high-throughput, low-latency ingestion.
  • Snowflake Publisher: Triggers Snowpipe-based ingestion from staged files in AWS S3 or Azure Data Lake Gen2, supporting efficient batch data loading workflows.
  • Snowflake Insert: Inserts flow message data into Snowflake tables as individual row entries for straightforward database writes.
  • Snowflake Select: Retrieves rows from Snowflake tables and returns them as arrays within flow messages for downstream processing.
  • Snowflake Executer: Executes raw SQL statements, enabling advanced database operations and full control over query execution.
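To make the row-oriented patterns concrete, the sketch below approximates in Python what modules like Snowflake Insert and Snowflake Select do conceptually: turning a flow message into a parameterized INSERT, and wrapping query results back into a flow message. This is a hedged illustration only; the function names, the `SENSOR_READINGS` table, and the flow-message shape are hypothetical, not Crosser's actual implementation or API.

```python
# Illustrative sketch (not Crosser code): how a flat flow-message dict
# could map to a parameterized INSERT, and how query rows could be
# wrapped back into a message for downstream modules.

def build_insert(table, message):
    """Build a parameterized INSERT statement from a flat flow-message dict."""
    columns = list(message.keys())
    placeholders = ", ".join(["%s"] * len(columns))
    sql = f'INSERT INTO {table} ({", ".join(columns)}) VALUES ({placeholders})'
    return sql, [message[c] for c in columns]

def rows_to_message(rows, columns):
    """Wrap fetched rows as an array of dicts inside a flow message,
    mirroring how a select-style module returns data downstream."""
    return {"data": [dict(zip(columns, row)) for row in rows]}

# Example: a flow message arriving from an upstream module.
sql, params = build_insert("SENSOR_READINGS", {"machine_id": "M-01", "temp_c": 71.4})
# sql    -> 'INSERT INTO SENSOR_READINGS (machine_id, temp_c) VALUES (%s, %s)'
# params -> ['M-01', 71.4]
```

Using parameterized statements rather than string-concatenated values is the standard way to keep row writes safe and efficient, which is why the example binds values instead of embedding them in the SQL text.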

About the author

Syed Gillani | Support & Post Sales Manager

Syed is the Support & Post Sales Manager at Crosser, bringing over 13 years of experience in support and pre-sales. Passionate about helping people find solutions, Syed enjoys sharing tips and tricks to make technology more accessible and improve daily workflows.