Problem
How do you send an email from an Azure Data Factory pipeline with a custom message body, using Gmail as the email provider?
Azure Data Factory does not have a built-in activity for sending emails. It is possible to send alerts (emails) in ADF, but these offer no customization of the message subject or body. I recently came across a scenario that required sending a pipeline completion email to a data consumer when a data load process finished. I initially tried the built-in alerts, but they allowed no customization and added confusion because there was no way to distinguish failure alerts from completion alerts.
To send a custom alert, I developed a solution using a Logic App and a Web activity in Azure Data Factory that calls the Logic App to send an email with a custom subject and body. In this blog we will build a solution that sends an automatic email when a pipeline fails, succeeds, or completes.
Solution
First you will need to create a Logic App. Click Create a resource in the upper-left corner of the Azure Portal, then search for Logic App and click Create.
Choose a name, subscription, resource group and location. You can turn Log Analytics on or off depending on your requirements.

How do you decide on the resource type, Consumption or Standard? Here are some differences between the two:
Resource Type | Logic Apps (Consumption) | Logic Apps (Standard) |
---|---|---|
Environment | Runs in a multi-tenant environment or a dedicated ISE | Single-tenant model, meaning no sharing of resources such as compute with Logic Apps from other tenants |
Pricing Model | Pay for what you use | Based on a hosting plan with a selected pricing tier |
# of Workflows | One | Multiple |
Operational Overhead | Fully managed | More control and fine-tuning capability around runtime and performance settings |
Limit Management | Azure Logic Apps manages the default values for these limits; you can change some of them where that option exists for a specific limit. | You can change the default values for many limits, based on your scenario’s needs. |
Supports Containerization | No | Yes |
My recommendation is to use the Consumption resource type for this particular Logic App, since we will trigger it only on a pipeline failure, success, or completion.
When the Logic App is deployed and ready for use, you will receive a notification from which you can navigate to it. To develop your Logic App, click Logic app designer under Development Tools and a new blade for designing your app will open.

Search for the “When a HTTP request is received” template and select the trigger as your first step.

In the trigger, select Method from the Add new parameter drop-down.

Choose POST as the method.

Paste the JSON below into the Request Body JSON Schema field:
{
  "properties": {
    "EmailTo": {
      "type": "string"
    },
    "Subject": {
      "type": "string"
    },
    "FactoryName": {
      "type": "string"
    },
    "PipelineName": {
      "type": "string"
    },
    "ActivityName": {
      "type": "string"
    },
    "Message": {
      "type": "string"
    }
  },
  "type": "object"
}
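If you switch to the code view, the completed trigger should look roughly like the sketch below. This is illustrative only; the trigger name and exact structure are generated by the designer.

"triggers": {
  "manual": {
    "type": "Request",
    "kind": "Http",
    "inputs": {
      "method": "POST",
      "schema": {
        "type": "object",
        "properties": {
          "EmailTo": { "type": "string" },
          "Subject": { "type": "string" },
          "FactoryName": { "type": "string" },
          "PipelineName": { "type": "string" },
          "ActivityName": { "type": "string" },
          "Message": { "type": "string" }
        }
      }
    }
  }
}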
The following properties are passed in the JSON body:
Property | Description |
---|---|
EmailTo | The email address of the recipient. Passed as a user-defined parameter from ADF. |
Subject | Custom subject for the email. Passed as a user-defined parameter from ADF. |
FactoryName | Data Factory name. Passed as a built-in system variable from ADF. |
PipelineName | Pipeline name. Passed as a built-in system variable from ADF. |
ActivityName | Name of the activity in the pipeline that failed or completed. Passed as a user-defined parameter from ADF. |
Message | Custom body of the email. Passed as a user-defined parameter from ADF. |
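For reference, a request body matching this schema might look like the following (all values here are placeholders):

{
  "EmailTo": "data.consumer@example.com",
  "Subject": "Reload for PL_DataLoad-pipeline completed",
  "FactoryName": "adf-demo",
  "PipelineName": "PL_DataLoad",
  "ActivityName": "Copy Data",
  "Message": "The data load completed successfully."
}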
Next, we will add a new step to the Logic App to send an email.

Search for Send email (V2) as the new operation and locate the provider below. I will be using Gmail, but if you want to use another email provider, pick that one instead, e.g. Office 365 Outlook.

Enter a Connection Name and set the Authentication Type to Use default shared application.

Click Sign In and you will be redirected to authenticate with your Google Account credentials. Once you sign in and click Allow, the connection will be established successfully.

Add new parameters for Subject, Body and Importance using the Add new parameter drop-down. You will fill in the details using dynamic content based on the JSON schema we added in the start trigger. To do this, click See more.

Now you will see all the available variables, as shown below.

Once you have added all the variables and parameters, your step should look like the image below.

Once you save, the HTTP POST URL will be generated. Copy the HTTP POST URL and make a note of it, as it will be used as the Web activity URL in the ADF pipeline to send the email.
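For illustration, the Body field you populated in the Send email (V2) step might combine the trigger properties like this, written in Logic Apps expression syntax (the dynamic-content picker generates these expressions for you; the surrounding wording is just an example):

Pipeline @{triggerBody()?['PipelineName']} in factory @{triggerBody()?['FactoryName']} reported: @{triggerBody()?['Message']} (activity: @{triggerBody()?['ActivityName']})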

Now we will create a sample ADF pipeline that sends out custom alerts by triggering the Logic App. For this we will use Activities > General > Web activity.

We will add parameters to the pipeline that will be passed in the HTTP POST request. Below is the list of parameters you will need to add; the remaining values are populated from built-in system variables.
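As a sketch, the user-defined parameters (EmailTo, ActivityName and Message, all of type String) might appear in the pipeline's JSON definition roughly as follows; the default values here are placeholders:

"parameters": {
  "EmailTo": {
    "type": "String",
    "defaultValue": "data.consumer@example.com"
  },
  "ActivityName": {
    "type": "String",
    "defaultValue": "Copy Data"
  },
  "Message": {
    "type": "String",
    "defaultValue": "The data load completed successfully."
  }
}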

Navigate to the Settings tab of the Web activity and paste the HTTP POST URL copied from the Logic App trigger step.
Select Method – POST.
For Headers, use the following settings:
Name | Value |
---|---|
Content-Type | application/json |
For the Body, use the following dynamic content. As you can see below, we make use of both system variables and parameters, which are passed to the Logic App trigger.
{"EmailTo": "@{pipeline().parameters.EmailTo}","Subject": "Reload for @{pipeline().Pipeline}-pipeline completed","FactoryName": "@{pipeline().DataFactory}","PipelineName": "@{pipeline().Pipeline}","ActivityName": "@{pipeline().parameters.ActivityName}","Message": "@{pipeline().parameters.Message}"}
After adding all the details in the Settings tab, select the integration runtime you wish to use; your populated settings should look like the image below.

Save and Debug, and you should receive the custom message in your email.

Now that we have successfully tested the activity, we can copy and paste the activity and its parameters into our data pipelines and use them to send out custom alerts from Azure Data Factory.
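To distinguish failure alerts from completion alerts, attach the Web activity to the preceding activity's Failure, Success, or Completion output in the pipeline designer. In the pipeline JSON this dependency is expressed roughly as below (a sketch; "Copy Data" is a placeholder activity name):

"dependsOn": [
  {
    "activity": "Copy Data",
    "dependencyConditions": [
      "Failed"
    ]
  }
]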