We recently built the Story Licensing application using some of the most powerful development tools available. By offloading the non-core development functionality onto available API services, we were able to build this application faster than I ever could have imagined. I’m excited to share with you how we did it.

TL;DR

Check out the live Story Licensing app.

The Requirements

Every application starts with a goal and requirements. For this application, the goal was to acquire leads interested in purchasing the rights to repost online content from publications in the network. Our requirements:

- Stories (for our app, stories published on AMI Publications sites) need to be quickly and easily searchable.
- Stories need to indicate a ranking of publicly available social validation (Claps) to the interested lead.
- Ability to capture the lead’s email (before the search action).
- Ability to capture the bid request data (story title, lead email, search term, bid) and send an email to the admin.
- Ability to manage copy content by a content manager via a content management system.

The Services

For this app we knew we didn’t want to build and manage *everything*. Instead, we wanted to offload non-core development to the best available services, only building what we needed. Any web project with a time crunch (pretty much every project) would be well-advised to do this so you don’t unnecessarily reinvent functionality that already exists at a high level. So for this app we decided to use:

- Cosmic JS for content management and data storage; this is also where our Medium articles are stored, using the Medium Backup Application.
- Algolia to search stories. It definitely satisfies our speedy search requirement (it’s FAST!).
- SendGrid for sending email notifications.
- AMI Publications for supplying the story library.

So with all of these, we’ve got best-in-class services for the respective functionality, and building our application would involve locking these services together.
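A quick note on configuration: all of the scripts in this post load their credentials from environment variables via dotenv. As a sketch, here is what a `.env` file at the project root might look like — the variable names are taken from the scripts, but every value is a placeholder, and which Algolia credential each variable holds is an assumption based on how they are passed to the Algolia client:

```
# Algolia application ID (assumption: first argument to algoliasearch)
ALGOLIA_ACCOUNT=your-algolia-app-id
# Algolia admin API key (assumption: second argument to algoliasearch)
ALGOLIA_INDEX=your-algolia-admin-api-key
# Cosmic JS Bucket write key
COSMIC_WRITE_KEY=your-cosmic-write-key
# SendGrid API key
SENDGRID_API_KEY=your-sendgrid-api-key
# Admin address that receives bid notifications
BID_EMAIL=admin@example.com
```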
Building the Search Functionality

For this app I knew I wanted to use React (mainly for personal preference). So in keeping with the “build only what you have to” theme, the natural framework choice for me was Next.js. Their value proposition is “React Applications Made Simple”. It makes it easy to get a React app up and running without a lot of boilerplate development while providing a nice developer experience.

To integrate Algolia, I needed to implement the search bar in the React app. Luckily, they’ve got an excellent InstantSearch React Component, which allows for easy integration into React apps. Just set your appId, apiKey and indexName. You can find these in your Algolia dashboard.

Now that I had the search bar implemented, I needed to add records to Algolia. To do this, I needed to add the Cosmic JS-saved Medium articles into Algolia. I created a script to add Objects from Cosmic JS into our Algolia search index. Here’s add-records.js:

```javascript
require('dotenv').config()
const Cosmic = require('cosmicjs')
const api = Cosmic()
const async = require('async')
const algoliasearch = require('algoliasearch')
const client = algoliasearch(process.env.ALGOLIA_ACCOUNT, process.env.ALGOLIA_INDEX)
const index = client.initIndex('Stories')
const buckets = ['your-bucket-slug', 'another-bucket-slug'] // Add all Bucket slugs here
async.eachSeries(buckets, (bucket_slug, bucketEachCallback) => {
  const bucket = api.bucket({ slug: bucket_slug })
  let added_count = 0
  const addRecords = (skip = 0) => {
    console.log(skip)
    const locals = {}
    async.series([
      callback => {
        bucket.getObjects({ type: 'posts', limit: 1000, skip }).then(data => {
          locals.objects = data.objects
          locals.total = data.total
          console.log('Total:', data.total)
          console.log('Object Length', locals.objects.length)
          callback()
        }).catch(err => {
          console.log(err)
        })
      },
      () => {
        async.eachSeries(locals.objects, (object, eachCallback) => {
          // Save Algolia record
          delete object.content
          index.addObject(object, (err, content) => {
            console.log('objectID=' + content.objectID)
            console.log('ADD', object.slug)
            added_count++
            eachCallback()
          })
        }, () => {
          console.log('Added ' + added_count, 'Total:' + locals.total)
          if (added_count !== locals.total) {
            addRecords(added_count)
          } else {
            bucketEachCallback()
            console.log('All done FOR REAL!')
          }
        })
      }
    ])
  }
  addRecords()
})
```

Next I needed to get claps for stories, because the Medium XML feed for stories (example: https://hackernoon.com/feed) doesn’t include this vital information for our app. So to do this, I needed to create a worker script that could run daily, updating all of our story clap counts. Here’s get-claps.js:

```javascript
require('dotenv').config()
const async = require('async')
const axios = require('axios')
const algoliasearch = require('algoliasearch')
const client = algoliasearch(process.env.ALGOLIA_ACCOUNT, process.env.ALGOLIA_INDEX)
const index = client.initIndex('Stories')
let hit_count = 0
const getClaps = () => {
  index.browse('', {}, function browseDone (err, content) {
    if (err) throw err
    const hits = content.hits
    async.eachSeries(hits, (hit, callback) => {
      const medium_url = hit.metadata.medium_link
      if (!medium_url) return callback()
      axios.get(medium_url).then(response => {
        const str1 = '"totalClapCount":'
        const str2 = ',"sectionCount'
        const claps = Number(response.data.split(str1).pop().split(str2).shift())
        index.partialUpdateObject({ claps, objectID: hit.objectID }, function (err, content) {
          if (err) throw err
          // console.log(medium_url, claps, hit.objectID)
          callback()
        })
      }).catch(err => {
        console.log(err)
        callback()
      })
    }, () => {
      hit_count = hit_count + content.hits.length
      if (content.cursor) {
        index.browseFrom(content.cursor, browseDone)
      } else {
        console.log('DONE!', hit_count)
      }
    })
  })
}
getClaps()
```
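The clap extraction above is plain string splitting on the serialized state embedded in a story’s HTML. A minimal sketch of just that parsing step — the `html` string here is a fabricated stand-in for a real Medium page, and the presence of a `"totalClapCount":` field is the assumption the worker script relies on:

```javascript
// Fabricated stand-in for a Medium story page's HTML; the real page embeds
// JSON state that is assumed to contain a "totalClapCount" field.
const html = 'prefix,"totalClapCount":1234,"sectionCount":7,suffix'

// Split-based extraction: take everything after the marker,
// then everything before the next key, and coerce to a number.
const str1 = '"totalClapCount":'
const str2 = ',"sectionCount'
const claps = Number(html.split(str1).pop().split(str2).shift())

console.log(claps) // 1234
```

If Medium ever renames either key, `Number(...)` yields `NaN`, so a production worker would want to guard against that before writing the record back.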
This script does the following: it gets all of the saved records in Algolia, hits each story URL, gets the JSON data, parses it for the clap count, and then updates the record in Algolia. This process can run in the background to get and save the latest clap count for each story. It will increase our record actions per day in Algolia, increasing our costs, but it’s worth it to get the latest social validation for each story.

With the search results coming from Algolia, search term relevance gets priority, but we can also reorder the stories to show the highest clap count on top.

Saving the Lead Emails and Bid Requests

Since the goal of this app is to save bid requests from potential customers, we chose SendGrid as a reliable email service provider. I created two endpoints: one to save the lead email prior to search, and one to save the bid request information after a story has been found. Here are leads.js and bids.js:

```javascript
module.exports = function (req, res) {
  const Cosmic = require('cosmicjs')
  const api = Cosmic()
  const bucket = api.bucket({ slug: 'app-bucket-slug', write_key: process.env.COSMIC_WRITE_KEY })
  bucket.addObject({
    title: 'Lead - ' + (new Date()),
    type_slug: 'leads',
    metafields: [{
      title: 'Email',
      key: 'email',
      type: 'text',
      value: req.body.email
    }],
    options: {
      content_editor: false,
      slug_input: false
    }
  }).then(data => {
    res.json(data)
  })
}
```

```javascript
module.exports = async function (req, res) {
  const Cosmic = require('cosmicjs')
  const sgMail = require('@sendgrid/mail')
  const async = require('async')
  sgMail.setApiKey(process.env.SENDGRID_API_KEY)
  const api = Cosmic()
  const bucket = api.bucket({ slug: 'story-licensing', write_key: process.env.COSMIC_WRITE_KEY })
  const bid = await bucket.addObject({
    title: 'Bid - ' + (new Date()),
    type_slug: 'bids',
    metafields: [
      { title: 'Email', key: 'email', type: 'text', value: req.body.email },
      { title: 'Bid', key: 'bid', type: 'text', value: req.body.bid },
      { title: 'Post Title', key: 'post_title', type: 'text', value: req.body.post_title },
      { title: 'Post Link', key: 'post_link', type: 'text', value: req.body.post_link }
    ],
    options: {
      content_editor: false,
      slug_input: false
    }
  })
  async.series([
    async callback => {
      // Send to Admin
      const subject = 'A Bid has been received'
      const html_body = '<div>A bid has been received for <strong>' + req.body.post_title + '</strong> for <strong>$' + req.body.bid + '</strong></div>'
      const message = {
        from: { email: req.body.email }, // sender info
        to: process.env.BID_EMAIL, // Comma separated list of recipients
        subject: subject, // Subject of the message
        text: subject, // plaintext body
        html: html_body, // HTML body
      }
      sgMail.send(message)
        .then(() => {
          // Celebrate
          callback()
        })
        .catch(error => {
          // Log friendly error
          console.error(error.toString())
          res.json({ success: false })
        })
    },
    () => {
      // Send to Bidder
      const subject = 'Your bid has been received'
      const html_body = '<div>A bid has been received for <strong>' + req.body.post_title + '</strong> for <strong>$' + req.body.bid + '</strong>. Thank you.</div>'
      const message = {
        from: { email: process.env.BID_EMAIL, name: 'Story Licensing' }, // sender info
        to: req.body.email, // Comma separated list of recipients
        subject: subject, // Subject of the message
        text: subject, // plaintext body
        html: html_body, // HTML body
      }
      sgMail.send(message)
        .then(() => {
          // Celebrate
          res.json({ success: true })
        })
        .catch(error => {
          // Log friendly error
          console.error(error.toString())
          res.json({ success: false })
        })
    }
  ])
}
```

Notice that for both of these actions (lead email and bid request), we are also storing the data in Cosmic JS. This was done so we can use the data for admin record keeping. Cosmic JS now has this data available to view, query and deliver via the Cosmic API into any future applications.
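The payload the bid endpoint stores in Cosmic JS is just an array of text metafields built from the request body. As an illustration, here is a hypothetical helper (`buildBidMetafields` is not part of the original code) that produces the same shape bids.js sends:

```javascript
// Hypothetical helper illustrating the metafield shape the bids.js endpoint
// stores in Cosmic JS; the titles and keys match the original payload.
function buildBidMetafields (body) {
  return [
    { title: 'Email', key: 'email', type: 'text', value: body.email },
    { title: 'Bid', key: 'bid', type: 'text', value: body.bid },
    { title: 'Post Title', key: 'post_title', type: 'text', value: body.post_title },
    { title: 'Post Link', key: 'post_link', type: 'text', value: body.post_link }
  ]
}

// Example request body (fabricated values):
const metafields = buildBidMetafields({
  email: 'lead@example.com',
  bid: '500',
  post_title: 'My Story',
  post_link: 'https://medium.com/my-story'
})
console.log(metafields.length) // 4
```

Keeping each field as a `text` metafield means the admin can view and edit every bid directly in the Cosmic JS dashboard without any custom tooling.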
In Conclusion

The finished app is now available to help you find high-quality content from the AMI Publications network to license for your website or blog. Check it out here.

Resources Used

- Cosmic JS NPM module
- Algolia React InstantSearch NPM module
- SendGrid NPM module

I’m happy with the way this project came together. The development was fast thanks to using the best services and developer tools available (Algolia, SendGrid, Next.js and Cosmic JS) to deliver a fast and scalable application. Let me know your thoughts, join the conversation on Slack, and follow Cosmic JS on Twitter.