Azure Data Lake Storage

Stream data to Azure Data Lake Storage Gen2 for scalable cloud storage, with real-time CDC to ADLS powering analytics and data lakehouse architectures.

Stream Data to Azure Data Lake

Key Features

  • Scalable Storage - Enterprise-grade cloud storage for big data
  • Multiple Formats - Write as Parquet, Avro, JSON, or CSV
  • Hierarchical Namespace - Organize data with directories
  • Azure Integration - Native integration with Azure analytics services

How It Works

Streamkap streams CDC events directly to Azure Data Lake Storage:

  1. Changes are captured from your source database
  2. Events are batched and converted to your chosen format
  3. Files are written to your ADLS container (see the write-path sketch after these steps)
  4. Data is ready for analytics with Synapse, Databricks, or HDInsight
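
As a rough illustration of steps 2 and 3, the Python sketch below converts a batch of change events to Parquet with pyarrow and uploads the file to an ADLS Gen2 container using the azure-storage-file-datalake SDK. The account URL, container name, file path, and credential are placeholder assumptions; this shows the underlying Azure write path, not Streamkap's internal implementation.

    import io

    import pyarrow as pa
    import pyarrow.parquet as pq
    from azure.storage.filedatalake import DataLakeServiceClient

    # Placeholder values - substitute your own storage account and container (assumptions).
    ACCOUNT_URL = "https://<storage-account>.dfs.core.windows.net"
    CONTAINER = "cdc-landing"

    def write_batch_as_parquet(events, path, credential):
        # Step 2: batch the change events and convert them to the chosen format (Parquet here).
        table = pa.Table.from_pylist(events)
        buffer = io.BytesIO()
        pq.write_table(table, buffer)

        # Step 3: write the file into the ADLS Gen2 container at the given path.
        service = DataLakeServiceClient(account_url=ACCOUNT_URL, credential=credential)
        file_system = service.get_file_system_client(CONTAINER)
        file_client = file_system.get_file_client(path)
        file_client.upload_data(buffer.getvalue(), overwrite=True)

    # Example: a small batch of insert events captured from an "orders" table.
    events = [
        {"op": "c", "order_id": 1, "amount": 42.50},
        {"op": "c", "order_id": 2, "amount": 19.99},
    ]
    write_batch_as_parquet(events, "orders/2024/01/15/batch-000001.parquet",
                           credential="<account-access-key>")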

Getting Started

  1. Create an Azure Data Lake Storage Gen2 account
  2. Configure authentication (Shared Key, SAS, or Azure AD; see the sketch after these steps)
  3. Set up file format and partitioning preferences
  4. Start streaming data to Azure
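
For step 2, the snippet below sketches the three authentication options against an ADLS Gen2 endpoint using the Azure SDK for Python: a shared account key, a SAS token, and Azure AD via DefaultAzureCredential. The account URL and secrets are placeholders, and this illustrates the Azure-side mechanisms rather than the Streamkap configuration screen.

    from azure.identity import DefaultAzureCredential
    from azure.storage.filedatalake import DataLakeServiceClient

    ACCOUNT_URL = "https://<storage-account>.dfs.core.windows.net"  # placeholder

    # Option 1: Shared Key - authenticate with the storage account access key.
    shared_key_client = DataLakeServiceClient(ACCOUNT_URL, credential="<account-access-key>")

    # Option 2: SAS - authenticate with a SAS token scoped to the target container.
    sas_client = DataLakeServiceClient(ACCOUNT_URL, credential="<sas-token>")

    # Option 3: Azure AD - authenticate as a service principal or managed identity.
    aad_client = DataLakeServiceClient(ACCOUNT_URL, credential=DefaultAzureCredential())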

Supported Features

  • Azure Data Lake Storage Gen2
  • Parquet, Avro, JSON, CSV formats
  • Custom partitioning schemes (example below)
  • Integration with Azure Synapse Analytics
  • Integration with Azure Databricks
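
To illustrate what a custom partitioning scheme can look like, the helper below builds a Hive-style, date-partitioned path so engines such as Synapse and Databricks can prune partitions when querying. The layout is an example assumption, not a fixed Streamkap convention.

    from datetime import datetime, timezone

    def partition_path(table: str, event_time: datetime, file_name: str) -> str:
        # Hive-style layout: <table>/year=YYYY/month=MM/day=DD/<file>
        return f"{table}/year={event_time:%Y}/month={event_time:%m}/day={event_time:%d}/{file_name}"

    # e.g. orders/year=2024/month=01/day=15/part-0001.parquet
    print(partition_path("orders", datetime(2024, 1, 15, tzinfo=timezone.utc), "part-0001.parquet"))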

Why Streamkap?

Reliable

Serverless platform providing enterprise reliability and scale.

Affordable

Pay only for the GB of data ingested into Streamkap.

Long-term

Retain data for as long as you need.

Flexible

Read once, write to many destinations.