In this article, I will explain what Azure Data Factory is by walking through an example.
What is Azure Data Factory?
Azure Data Factory is a fully managed service for composing data storage, processing, and movement services into streamlined, scalable, and reliable data pipelines.
It provides access to on-premises data in SQL Server and cloud data in Azure Storage (Blobs
and Tables) and Azure SQL Database.
In this article, I am going to show how we can
transfer data from one blob folder to another blob folder using Data Factory.
Below are the steps we need to follow:
Step 1: Create a Storage Account
Step 2: Create a Blob (Container) => Source
Step 3: Create another Blob => Destination
Step 4: Create a Data Factory
Step 5: Create a Pipeline
Step 6: Create 2 Datasets
Step 7: Create a Linked Service
Step 8: Validate Pipeline
Step 9: Debug Pipeline
Step 10: Trigger Pipeline Manually
Step 11: Add Schedule Trigger
I am assuming that you already have a Storage Account in your
Azure subscription. Open this Storage Account.

Image 1.
Here, select Blobs -> Create a Container (adfdemo).

Image 2.
Now upload a .txt file into this container, as shown below:

Image 3.

Image 4.

Image 5.
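If you prefer to script these storage steps instead of clicking through the portal, here is a minimal sketch using the azure-storage-blob Python package (pip install azure-storage-blob). The connection string, the destination container name (adfdemooutput), and the file name are placeholder values I am assuming for illustration:

```python
from azure.storage.blob import BlobServiceClient

# Hypothetical connection string -- copy yours from the Storage Account's
# "Access keys" blade in the portal.
conn_str = "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>;EndpointSuffix=core.windows.net"
svc = BlobServiceClient.from_connection_string(conn_str)

# Source container from the walkthrough, plus an assumed destination container.
svc.create_container("adfdemo")         # Step 2: source
svc.create_container("adfdemooutput")   # Step 3: destination (name assumed)

# Upload a sample .txt file into the source container.
with open("input.txt", "rb") as data:
    svc.get_blob_client(container="adfdemo", blob="input.txt").upload_blob(data)
```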
Now it is time to create a Data Factory.
Go to the Azure Portal: https://portal.azure.com/
Click on:
+Create a Resource -> Select Analytics -> Select Data Factory

Image 6.
Here, enter all the required details:

Image 7.
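As an aside, the same Data Factory can also be created from Python with the azure-mgmt-datafactory package (pip install azure-identity azure-mgmt-datafactory). This is only a sketch; the subscription ID, resource group, and region below are placeholder values:

```python
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import Factory

# Placeholder values -- substitute your own subscription and resource group.
SUB_ID, RG, DF = "<subscription-id>", "<resource-group>", "AzureDataFactoryDemoOne"

adf = DataFactoryManagementClient(DefaultAzureCredential(), SUB_ID)

# Create the factory; the region here is an assumption, pick one close to you.
factory = adf.factories.create_or_update(RG, DF, Factory(location="eastus"))
print(factory.provisioning_state)  # Prints "Succeeded" once the factory is ready
```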
After creating the new Data Factory, "AzureDataFactoryDemoOne",
click "Author & Monitor".

Image 8.
A new window will open. On the left side, click the Author
option.

Image 9.
Now it is time to create a Linked Service so that we can link
our Azure Storage Account to the Data Factory.
Click Connections in the left-side menu.

Image 10.

Image 11.
Below is our newly created Linked Service.

Image 12.
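For reference, the equivalent Linked Service can be created with the same SDK. The sketch below reuses the adf client and the RG and DF variables from the factory sketch above; "AzureStorageLinkedService1" is simply an assumed name, and the connection string is a placeholder:

```python
from azure.mgmt.datafactory.models import (
    AzureStorageLinkedService, LinkedServiceResource, SecureString)

# Connection string for the storage account (placeholder values).
conn = SecureString(
    value="DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>")

ls = LinkedServiceResource(
    properties=AzureStorageLinkedService(connection_string=conn))
adf.linked_services.create_or_update(RG, DF, "AzureStorageLinkedService1", ls)
```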
Now it is time to create the datasets. Since we are going to move data
from a source blob container to a destination blob container, we will create two
datasets of the Azure Blob type, named:
1. InputDataSet (Azure Blob type) => Source
2. OutputDataSet (Azure Blob type) => Destination
To add a dataset, click + in the Factory Resources pane:

Image 13.

Image 14.
Click on Continue -> Select Format -> Binary

Image 15.

Image 16.
Click on Continue:

Image 17.
Following the same steps as above, we now create the
OutputDataSet as well.

Image 18.
Select Dataset -> Azure Blob Storage -> Binary:

Image 19.

Image 20.
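The two datasets can also be defined in code. The sketch below (reusing the client and names from the earlier sketches) uses the generic AzureBlobDataset model, the same approach as Microsoft's Python quickstart; the destination container name is my assumption:

```python
from azure.mgmt.datafactory.models import (
    AzureBlobDataset, DatasetResource, LinkedServiceReference)

ls_ref = LinkedServiceReference(
    type="LinkedServiceReference", reference_name="AzureStorageLinkedService1")

# InputDataSet points at the source container and file.
ds_in = DatasetResource(properties=AzureBlobDataset(
    linked_service_name=ls_ref, folder_path="adfdemo", file_name="input.txt"))
adf.datasets.create_or_update(RG, DF, "InputDataSet", ds_in)

# OutputDataSet points at the destination container (name assumed).
ds_out = DatasetResource(properties=AzureBlobDataset(
    linked_service_name=ls_ref, folder_path="adfdemooutput"))
adf.datasets.create_or_update(RG, DF, "OutputDataSet", ds_out)
```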
Now it is time to create a Pipeline with a Copy activity.

Image 21.
Click Pipeline -> in the General tab, name it "CopyDataPipeLine".
From the Activities section, under Move & Transform,
drag Copy Data onto the pipeline design surface.

Image 22.
Now click on the Source tab:

Image 23.
Sink Tab:

Image 24.
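The pipeline we just assembled on the design surface boils down to a single Copy activity whose source and sink are our two datasets. A minimal code equivalent, again reusing the adf client from the earlier sketches, would look like this:

```python
from azure.mgmt.datafactory.models import (
    BlobSink, BlobSource, CopyActivity, DatasetReference, PipelineResource)

# One Copy activity: read from InputDataSet, write to OutputDataSet.
copy = CopyActivity(
    name="CopyData",
    inputs=[DatasetReference(type="DatasetReference", reference_name="InputDataSet")],
    outputs=[DatasetReference(type="DatasetReference", reference_name="OutputDataSet")],
    source=BlobSource(),
    sink=BlobSink())

adf.pipelines.create_or_update(
    RG, DF, "CopyDataPipeLine", PipelineResource(activities=[copy]))
```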
Now validate your Pipeline:

Image 25.
Now we need to Debug our Pipeline:

Image 26.

Image 27.
Now check your Storage Account. The data has been copied.

Image 28.

Image 29.
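You can also verify the copy from code by listing the blobs in the destination container. A small sketch, reusing the connection string from the upload sketch earlier:

```python
from azure.storage.blob import BlobServiceClient

# conn_str: the same storage connection string used in the upload sketch above.
svc = BlobServiceClient.from_connection_string(conn_str)

# List whatever the Copy activity wrote to the (assumed) destination container.
for blob in svc.get_container_client("adfdemooutput").list_blobs():
    print(blob.name, blob.size)
```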
Now it is time to trigger the Pipeline.
This will deploy all the entities (Linked Service,
datasets, and pipeline) to the Data Factory.

Image 30.
We can manually trigger our pipeline as shown below:
Click Add Trigger => Trigger Now

Image 31.

Image 32.
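A Trigger Now run corresponds to a single on-demand pipeline run; with the SDK, this is one call (sketch, reusing the adf client from above):

```python
# Kick off a one-off run -- the code equivalent of Add Trigger => Trigger Now.
run = adf.pipelines.create_run(RG, DF, "CopyDataPipeLine", parameters={})
print(run.run_id)  # Keep the run ID to look the run up in Monitor
```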
Now click Monitor in the left-side menu:

Image 33.
Here you can view your activity runs:

Image 34.

Image 35.
Click on Detail View:

Image 36.
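The same run details shown in the Monitor view can be queried programmatically. The sketch below reuses the adf client and the run object from the Trigger Now sketch to check the pipeline run and its activity runs:

```python
from datetime import datetime, timedelta
from azure.mgmt.datafactory.models import RunFilterParameters

# Status of the pipeline run started above: InProgress / Succeeded / Failed.
pipeline_run = adf.pipeline_runs.get(RG, DF, run.run_id)
print(pipeline_run.status)

# The individual activity runs inside that pipeline run (our Copy activity).
filters = RunFilterParameters(
    last_updated_after=datetime.utcnow() - timedelta(hours=1),
    last_updated_before=datetime.utcnow() + timedelta(hours=1))
for a in adf.activity_runs.query_by_pipeline_run(RG, DF, run.run_id, filters).value:
    print(a.activity_name, a.status)
```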
Now add a new
trigger to run the pipeline on a schedule:

Image 37.

Image 38.
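For comparison, the same kind of schedule trigger can be created and started from the SDK. The recurrence below (every 15 minutes for one day) and the trigger name are values I chose for illustration:

```python
from datetime import datetime, timedelta
from azure.mgmt.datafactory.models import (
    PipelineReference, ScheduleTrigger, ScheduleTriggerRecurrence,
    TriggerPipelineReference, TriggerResource)

# Fire every 15 minutes for the next day (illustrative values).
recurrence = ScheduleTriggerRecurrence(
    frequency="Minute", interval=15,
    start_time=datetime.utcnow(),
    end_time=datetime.utcnow() + timedelta(days=1),
    time_zone="UTC")

trigger = TriggerResource(properties=ScheduleTrigger(
    recurrence=recurrence,
    pipelines=[TriggerPipelineReference(
        pipeline_reference=PipelineReference(
            type="PipelineReference", reference_name="CopyDataPipeLine"),
        parameters={})]))

adf.triggers.create_or_update(RG, DF, "ScheduleTriggerDemo", trigger)
# A trigger does nothing until it is started.
adf.triggers.begin_start(RG, DF, "ScheduleTriggerDemo").result()
```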
Now publish everything so that the new trigger takes effect. :)