GitHub Repository:
Follow along with the video tutorial:
We will build a serverless application that uses Artificial Intelligence to perform facial recognition and uses Courier to send alerts based on the results. We just launched our first hackathon and are giving away over $1K in prizes! Join us in building a cool project for a chance to win any of the following prizes. 🏆
Additionally, everyone who submits a project successfully integrating the Courier API will receive a $20 Amazon gift card! Submissions close on September 28th. Register now to submit this project for a chance to win some cool prizes.
Register for the Hackathon:
Not sure where to start? In this tutorial, we will build a serverless lie detector that uses Artificial Intelligence for Facial Recognition.
Let’s get started. We are secret agents and headquarters is telling us that one of our spies has betrayed us and has been leaking sensitive, top-secret information to our enemies. Our goal is to build a lie detector that will alert our spy network when we identify the mole. We will use Azure's Cognitive Services to perform facial recognition on everyone on our team. When the Face API recognizes that one of our spies is being deceitful, we will use Courier to broadcast the identity of the mole to our spy network.
Some spies are in an area where they can only guarantee secure messaging through emails, and others prefer quick and secure SMS, so we need to ensure our app can accommodate all spy preferences.
Note: The first three Secret Agents to successfully complete this tutorial and task will receive a top-secret gift from HQ via Courier.
In Part 1, we will create our serverless application using Azure Functions. In Part 2, we will first integrate the Gmail and Twilio APIs, which Courier will use to send emails and text messages. In Part 3, we will demonstrate how to send single messages and set up routing to send multi-channel notifications from the Azure function. And finally, in Part 4, we will explore Azure Cognitive Services and integrate the Face API to analyze emotions and use the Courier API to send alerts when specific deceiving emotions are detected.
We first need to set up our local development environment so we can use Azure and test our code.
Once the two extensions have been installed successfully, check the left menu for Azure’s A symbol. Completely close and reopen VS Code if the symbol does not automatically appear.
To build an HTTP Trigger Function:
If the subscription is not showing up locally, follow the instructions in the Troubleshooting Azure Account Setup guide (linked below).
Once the location has been selected, we are prompted to make a few decisions about the type of function we want to create.
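The choices we make here end up in the generated function.json file. For a JavaScript HTTP trigger it typically looks something like the sketch below; the exact authLevel and methods depend on the options we pick during setup.
{
  "bindings": [
    {
      "authLevel": "anonymous",
      "type": "httpTrigger",
      "direction": "in",
      "name": "req",
      "methods": ["get", "post"]
    },
    {
      "type": "http",
      "direction": "out",
      "name": "res"
    }
  ]
}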
You can edit settings later within the function.json file. Let’s open the index.js file and take a moment to understand the boilerplate function.
module.exports = async function (context, req) {
    context.log('JavaScript HTTP trigger function processed a request.');

    const name = (req.query.name || (req.body && req.body.name));
    const responseMessage = name
        ? "Hello, " + name + ". This HTTP triggered function executed successfully."
        : "This HTTP triggered function executed successfully. Pass a name in the query string or in the request body for a personalized response.";

    context.res = {
        // status: 200, /* Defaults to 200 */
        body: responseMessage
    };
}
Line #4 demonstrates how we can get data from the function query or request body. Line #5 contains a string assigned to a variable named responseMessage, which uses the data from line #4. On lines #9-12, this variable is then passed into the response object, context.res.
Courier sends messages by consolidating multiple API calls into one. In this second part, we will need to authorize our API to send messages via the Gmail and Twilio APIs.
Once you see Agent Pigeon dancing, we are ready to use Courier to communicate with our spies. Before we build out our application, we need to set up the Twilio provider to enable text messages.
Lastly, you need to locate the Messaging Service SID, which you create in the Messaging tab on the left menu. Check out Twilio’s docs on the Messaging Service SID (linked below) if you need help finding it.
We can use Courier to send single messages or set up routing to send multi-channel notifications from within the Azure function. In this third part, we will start sending messages and will refer to the Courier Node.js quick start (linked below), which outlines how to get started with the SDK.
If you would rather use the Courier API directly, check out the Send API docs and the API reference (linked below). Find the API key in your Courier account's Settings.
Save the API key in the local.settings.json file and bring it into the index.js file as const apiKey = process.env["API_KEY"]. This application can now be authorized to use our Courier account.
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "",
    "FUNCTIONS_WORKER_RUNTIME": "node",
    "API_KEY": "replace-with-your-key"
  }
}
Install the Courier SDK by running the following command in the terminal: npm install @trycourier/courier
Use the require function to import the SDK into the index.js file.
const { CourierClient } = require("@trycourier/courier");
The last step is to walk through the API call within the API docs and integrate it into our codebase.
import { CourierClient } from "@trycourier/courier";
// alternatively:
// const { CourierClient } = require("@trycourier/courier");

const courier = CourierClient({ authorizationToken: "YOUR_AUTH_TOKEN_HERE" });

const { requestId } = await courier.send({
  message: {
    to: {
      email: "[email protected]",
    },
    content: {
      title: "Welcome!",
      body: "Thanks for signing up, {{name}}",
    },
    data: {
      name: "Peter Parker",
    },
    routing: {
      method: "single",
      channels: ["email"],
    },
  },
});
The code on lines #8-23 defines the message object, which provides data to Courier about the message: the to object describes the user receiving the notification, the content object describes what the message contains, the data object holds any variables that affect the content object or the conditions for the outgoing notifications, and the routing object describes the types of notifications being sent.
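Since some spies can only receive email while others prefer SMS, the routing object is where we can fan the same alert out across channels. The snippet below is only a sketch, not part of the final function: the email address and phone number are placeholders, and it assumes the Gmail and Twilio providers from Part 2 are configured.
// Sketch: deliver the same alert over every listed channel.
// The recipient details below are placeholders.
const { requestId } = await courier.send({
  message: {
    to: {
      email: "[email protected]",
      phone_number: "+15555555555",
    },
    content: {
      title: "Mole Identified!",
      body: "Beware! The mole is {{name}}.",
    },
    data: {
      name: "Secret Agent Pigeon",
    },
    routing: {
      // "all" sends on every channel; "single" stops after the first
      // channel that can deliver to this recipient.
      method: "all",
      channels: ["email", "sms"],
    },
  },
});
For now, the examples below keep method "single" with email only; the routing block is where we would switch once the Twilio channel is ready.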
Update the email that this message is being sent to. To protect the identities of our spies, we will use fake contact information here.
Update the message here. For example, we can change the title (email subject) to Mole Identified and the body to Beware! The mole is {{name}}. In this case, we can either hardcode the name or get it from the HTTP trigger function body.
Update the responseMessage and log it to the console. This new responseMessage will indicate that the HTTP triggered function ran successfully by outputting the requestId returned from the Courier API call.
To run this function locally, first run the func start command in the terminal, which enables the trigger for this function (and all functions within this project, if there were others). This command also returns the corresponding local endpoint that we can use to trigger the function.
const { requestId } = await courier.send({
  message: {
    to: {
      email: "[email protected]",
    },
    content: {
      title: "Mole Identified!",
      body: "Beware! The mole's name is {{name}}.",
    },
    data: {
      name: name,
    },
    routing: {
      method: "single",
      channels: ["email"],
    },
  },
});
We could use Postman or Insomnia to test this function, but here we will use the REST Client VS Code extension, which we installed earlier.
Create a request.http file.
To create a new test call, type ### at the top of the request.http file. Below it, define the type of request, in this case POST, and paste the endpoint next to it.
The body of this function call still needs to be defined under the endpoint. Create an object that contains a name parameter and define it as Secret Agent Pigeon.
###
POST http://localhost:7071/api/LieDetector
Content-Type: application/json

{
  "name": "Secret Agent Pigeon"
}
Azure Cognitive Services enables us to add cognitive capabilities to apps through APIs and AI services.
To access the Face API, navigate to the Azure Portal, select “Create a Resource”, and locate “AI + Machine Learning” in the list of categories on the left.
Select “Face” from the list of services that appear and update the settings based on our preferences and account.
Hit “Review + Create” and then “Create” to begin the deployment process (this takes a few minutes to complete).
Once deployment is complete, head over to the resource, navigate to “Keys and Endpoints” on the left menu under “Resource Management”, copy one of the keys and the endpoint, and save them within our project in the local.settings.json file.
{
  "IsEncrypted": false,
  "Values": {
    "AzureWebJobsStorage": "",
    "FUNCTIONS_WORKER_RUNTIME": "node",
    "API_KEY": "replace-with-your-key",
    "FACE_API_KEY": "replace-with-your-azure-key",
    "FACE_ENDPOINT": "replace-with-your-azure-endpoint"
  }
}
These values are treated as secret keys and this file is included in the .gitignore.
Just as we used the Courier SDK, we will use the Face service through Azure’s SDK (see the Azure Cognitive Services SDK link below).
Run npm install @azure/cognitiveservices-face in the terminal to install the Face service SDK.
Run npm install @azure/ms-rest-azure-js in the terminal to install the REST Azure Client.
Copy the sample Face API call from the sample code and place it into our codebase.
Move the import statements to the top, above the Azure Function.
const { FaceClient, FaceModels } = require("@azure/cognitiveservices-face");
const { CognitiveServicesCredentials } = require("@azure/ms-rest-azure-js");

async function main() {
  const faceKey = process.env["faceKey"] || "<faceKey>";
  const faceEndPoint = process.env["faceEndPoint"] || "<faceEndPoint>";
  const cognitiveServiceCredentials = new CognitiveServicesCredentials(faceKey);
  const client = new FaceClient(cognitiveServiceCredentials, faceEndPoint);
  const url =
    "https://pbs.twimg.com/profile_images/3354326900/3a5168f2b45c07d0965098be1a4e3007.jpeg";
  const options = {
    returnFaceLandmarks: true
  };
  client.face
    .detectWithUrl(url, options)
    .then(result => {
      console.log("The result is: ");
      console.log(result);
    })
    .catch(err => {
      console.log("An error occurred:");
      console.error(err);
    });
}

main();
Now, we can update the variables within the code snippet.
Update the name of the function on line #43 and the associated function call on line #65 to analyze_face().
Fix the key names on lines #44 and #45 to match the names we created in the local.settings.json file.
Line #49 contains the URL of the image this API will analyze. Replace it with a link to an image of our own; to protect the identities of our spies, we will use IU’s image here.
Change the options object between lines #50 and #52 to returnFaceAttributes and add an array with emotion as an element.
async function analyze_face() {
  const faceKey = process.env["FACE_API_KEY"];
  const faceEndPoint = process.env["FACE_ENDPOINT"];
  const cognitiveServiceCredentials = new CognitiveServicesCredentials(faceKey);
  const client = new FaceClient(cognitiveServiceCredentials, faceEndPoint);
  const url =
    "https://www.allkpop.com/upload/2021/12/content/231225/web_data/allkpop_1640280755_untitled-1.jpg";
  const options = {
    returnFaceAttributes: ["emotions"]
  };
  client.face
    .detectWithUrl(url, options)
    .then(result => {
      console.log("The result is: ");
      console.log(result);
    })
    .catch(err => {
      console.log("An error occurred:");
      console.error(err);
    });
}

analyze_face();
Finally, we need to be able to manipulate the response from this API call.
Save the response in a variable called result. Convert the response into a JSON string using the stringify method.
const result = await client.face.detectWithUrl(url, options);
const resultJSON = JSON.stringify(result, null, 2);
We run into an error when we use the REST Client to test our function. The result is not displayed, which means that, for some reason, analyze_face() is not returning the correct response. We can check the Face API reference to determine the cause of the error. We can first attempt to narrow down the issue by extracting a specific emotion from the result object.
const result = await client.face.detectWithUrl(url, options);
const anger = result[0].faceAttributes.emotion.anger;
const angerJSON = JSON.stringify(anger, null, 2);
The actual error stems from a typo on line #51, where the attribute returned is not plural and should be called emotion. When we test the code again, we see that the anger emotion has a value of 0, which matches the selected image.
const options = {
  returnFaceAttributes: ["emotion"]
};
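For reference, with the corrected attribute name the detect call resolves to an array with one entry per detected face, each carrying an emotion object roughly shaped like the sketch below (the values are illustrative, not from a real call).
// Approximate shape of a detectWithUrl result (illustrative values only).
[
  {
    faceId: "xxxxxxxx-xxxx-xxxx-xxxx-xxxxxxxxxxxx",
    faceAttributes: {
      emotion: {
        anger: 0,
        contempt: 0,
        disgust: 0,
        fear: 0,
        happiness: 0.999,
        neutral: 0.001,
        sadness: 0,
        surprise: 0
      }
    }
  }
]
This is why result[0].faceAttributes.emotion in the next step gives us the full set of emotion scores for the first detected face.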
Update the analyze_face() function to return the entire emotion object. This will enable us to compare multiple emotions' values and determine whether the face being analyzed is deceitful.
const result = await client.face.detectWithUrl(url, options);
return result[0].faceAttributes.emotion;
Following instructions from Headquarters, we know that our questions should only invoke specific reactions. If a face shows any hint of anger, neutral, or contempt emotions, we will have to assume that the person being questioned is the mole.
const emotions = await analyze_face();

const anger = emotions.anger;
const angerJSON = JSON.stringify(anger, null, 2);
const neutral = emotions.neutral;
const neutralJSON = JSON.stringify(neutral, null, 2);
const contempt = emotions.contempt;
const contemptJSON = JSON.stringify(contempt, null, 2);

let deceptive = false;
if ((angerJSON > 0) || (neutralJSON > 0) || (contemptJSON > 0)) {
  deceptive = true;
}
Send out alerts to our entire spy network if and when these values are larger than 0.
const { CourierClient } = require("@trycourier/courier");
const { FaceClient, FaceModels } = require("@azure/cognitiveservices-face");
const { CognitiveServicesCredentials } = require("@azure/ms-rest-azure-js");

const apikey = process.env["API_KEY"];
const courier = CourierClient({ authorizationToken: apikey });

module.exports = async function (context, req) {
  context.log('JavaScript HTTP trigger function processed a request.');

  const name = (req.query.name || (req.body && req.body.name));

  const emotions = await analyze_face();

  const anger = emotions.anger;
  const angerJSON = JSON.stringify(anger, null, 2);
  const neutral = emotions.neutral;
  const neutralJSON = JSON.stringify(neutral, null, 2);
  const contempt = emotions.contempt;
  const contemptJSON = JSON.stringify(contempt, null, 2);

  let deceptive = false;
  if ((angerJSON > 0) || (neutralJSON > 0) || (contemptJSON > 0)) {
    deceptive = true;
  }

  if (deceptive) {
    const { requestId } = await courier.send({
      message: {
        to: {
          email: "[email protected]",
        },
        content: {
          title: "Mole Identified!",
          body: "Beware! The mole's name is {{name}}.",
        },
        data: {
          name: name,
        },
        routing: {
          method: "single",
          channels: ["email"],
        },
      },
    });
  }

  const responseMessage = "The HTTP trigger function ran successfully.";

  context.res = {
    // status: 200, /* Defaults to 200 */
    body: {
      responseMessage,
      "anger": angerJSON,
      "neutral": neutralJSON,
      "contempt": contemptJSON
    }
  };
}

async function analyze_face() {
  const faceKey = process.env["FACE_API_KEY"];
  const faceEndPoint = process.env["FACE_ENDPOINT"];
  const cognitiveServiceCredentials = new CognitiveServicesCredentials(faceKey);
  const client = new FaceClient(cognitiveServiceCredentials, faceEndPoint);
  const url =
    "https://www.allkpop.com/upload/2021/12/content/231225/web_data/allkpop_1640280755_untitled-1.jpg";
  const options = {
    returnFaceAttributes: ["emotion"]
  };
  const result = await client.face.detectWithUrl(url, options);
  return result[0].faceAttributes.emotion;
}
Our lie detector is ready and will alert our spies anytime a captive tries to mess with us. Try building a lie detector of your own and alerting [email protected], and we will send a gift to the first three Secret Agents to complete this task! Head to the hackathon registration link below to submit your project.
🔗 GitHub Repository:
🔗 Courier:
🔗 Register for the Hackathon:
🔗 Courier's Get Started with Node.js:
🔗 Courier Send API Docs:
🔗 Twilio Messaging Service SID Docs:
🔗 Courier API Reference:
🔗 Azure for Students:
🔗 Troubleshooting Azure Account Setup: https://github.com/microsoft/vscode-azure-account/wiki/Troubleshooting#setup-your-azure-account
🔗 Azure Cognitive Services:
🔗 Azure Portal:
🔗 Azure Cognitive Services SDK: