Azure Data Factory v2 and its available components in Data Flows

Many of you (including me) have wondered about it.
Namely: is it possible to move my ETL process from SSIS to ADF? How can I reflect my current SSIS Data Flow business logic in Azure Data Factory? It turns out that a new feature in ADF, Data Flow, comes to the rescue. Furthermore, such a solution is scalable, as Azure Databricks works under the hood. Don't worry: you don't have to know Databricks or an extra language (Scala, Python) at all. How is that possible? Carry on reading.

In this post, I would like to show you what kinds of actions you can perform and what their equivalents are in SQL and SSIS.


New world: Data Flow in Azure Data Factory

The big benefit here is that you will not write a single line of code. You can design the whole business logic from scratch using the Data Flow UX, and the appropriate Scala code will be prepared, compiled and executed in Azure Databricks behind the scenes. This lets you focus on the business logic and data transformations, such as data cleaning, aggregation and data preparation, and build code-free data flow pipelines.
Additionally, the process can scale out automatically if you allow it to.

ADF Data Flow vs SSIS vs T-SQL

The main purpose of this post is to bring the capabilities of ADF Data Flow closer and compare them to their counterparts in SSIS and the relevant T-SQL code.
Why? Because it's far easier to understand something new by comparing it to something we already know very well.
Furthermore, tables and icons speak to us much more, which makes such new knowledge easier to acquire.
With these fundamentals, and a clear picture of how components map between SSIS and ADF Data Flow, you can re-design your current ETL process in Azure Data Factory. To complete the picture, I have added a column that shows T-SQL code that does the same or a similar thing in SQL.
So, no matter which technology your current process uses, whether Stored Procedures in SQL or SSIS, you are able to sit down and recreate that process to uncover new opportunities.


| Operation / Activity | Description | SSIS equivalent | SQL Server equivalent |
|---|---|---|---|
| New Branch | Create a new flow branch with the same data | Multicast (+ icon) | – |
| Join | Join data from two streams based on a condition | Merge Join | JOIN |
| Conditional Split | Route data into different streams based on conditions | Conditional Split | CASE / separate WHERE clauses |
| Union | Collect data from multiple streams | Union All | UNION ALL |
| Lookup | Look up additional data from another stream | Lookup | Subselect, function, JOIN |
| Derived Column | Compute new columns based on the existing ones | Derived Column | Expressions in the SELECT column list |
| Aggregate | Calculate aggregations on the stream | Aggregate | GROUP BY with aggregate functions |
| Surrogate Key | Add a surrogate key column to the output stream from a specific value | Script Component | Incremental Primary Key (with limited capabilities) |
| Exists | Check the existence of data in another stream | Lookup / Merge Join | EXISTS / NOT EXISTS |
| Select | Choose columns to flow to the next stream | OUTPUT in components, mapping columns | SELECT column list |
| Filter | Filter rows in the stream based on a condition | Conditional Split | WHERE |
| Sort | Order data in the stream based on column(s) | Sort | ORDER BY |
| (custom logic) | Use any custom logic from an external library | Script Component | – |
| Source | Source for your data flow. Obligatory first element of every Data Flow in ADF. | OLE DB Source and more… | FROM clause |
| Sink | Destination for your data flow | OLE DB Destination and more… | INSERT / SELECT … INTO |
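To make the T-SQL column more concrete, the snippets below sketch what several of the listed transformations look like in plain T-SQL. All table and column names (`dbo.Sales`, `dbo.Product`, `dbo.SalesArchive`) are hypothetical, used only for illustration.

```sql
-- Source + Derived Column + Lookup/Join + Filter in one query:
SELECT
    s.OrderId,
    s.Quantity,
    s.Quantity * s.UnitPrice AS LineTotal   -- Derived Column: new column from existing ones
    , p.ProductName                         -- Lookup: additional data from another stream
FROM dbo.Sales AS s                         -- Source: first element of the flow
JOIN dbo.Product AS p                       -- Join: combine two streams on a condition
    ON p.ProductId = s.ProductId
WHERE s.Quantity > 0;                       -- Filter: keep only matching rows

-- Aggregate: calculate aggregations on the stream
SELECT ProductId, SUM(Quantity) AS TotalQty
FROM dbo.Sales
GROUP BY ProductId;

-- Exists: check the existence of data in another stream
SELECT s.OrderId
FROM dbo.Sales AS s
WHERE EXISTS (SELECT 1 FROM dbo.Product AS p WHERE p.ProductId = s.ProductId);

-- Conditional Split: route rows into different streams
SELECT OrderId,
       CASE WHEN Quantity >= 100 THEN 'Bulk' ELSE 'Retail' END AS Stream
FROM dbo.Sales;

-- Union: collect data from multiple streams
SELECT OrderId FROM dbo.Sales
UNION ALL
SELECT OrderId FROM dbo.SalesArchive;

-- Sort + Surrogate Key: order the stream and add an incremental key
SELECT ROW_NUMBER() OVER (ORDER BY OrderId) AS SurrogateKey,
       OrderId, Quantity
FROM dbo.Sales
ORDER BY Quantity DESC;
```

These are only rough equivalents: in ADF Data Flow each of these steps is a separate transformation in the graph, while T-SQL lets you combine several of them in a single statement.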

Update 04/01/2022
Do you think the above table is useful? Download the updated version (PDF) as a two-page cheat sheet.
More interesting materials like this can be found in the following free course: Cheat sheets for Data Engineers


This new feature has huge capabilities. I'm very excited to have the opportunity to use it more.
An automatically scalable process like this can be very efficient for Big Data processing. Hence, it's worth starting to design new processes with Azure Data Factory, or even migrating existing ones when your enterprise suffers from performance degradation due to the amount of data being processed.
Please be aware that there is another Data Flow among Microsoft solutions: it exists in Power BI. Do not confuse the two.

In the next posts of this series, I will explain each activity of ADF Data Flow in more depth.
Let me know your thoughts or leave a comment if you have any questions.
Thanks for reading!

Useful links

ADF Data Flow’s documentation
ADF Data Flow’s videos
Follow this tag on the blog: ADFDF


About author

Kamil Nowinski

Blogger, speaker. Data Platform MVP, MCSE. Senior Data Engineer & data geek. Member of Data Community Poland, co-organizer of SQLDay, Happy husband & father.

