
Cloud Infrastructure Can Set Legacy Data Free

By Browser London (@browserlondon), a London-based dev house specialising in UX design, bespoke web app development and service design.

For a long time, it’s been widely accepted that startup businesses can gain an edge over larger, established rivals due to their lack of legacy tech baggage. For example, modern challenger banks have – in terms of features and UX at least – run rings around the traditional stalwarts thanks to their modern IT and data systems.

As a consequence, many established firms have become concerned about the lack of flexibility in their legacy data infrastructure, viewing it as a barrier to competing with nimble new rivals. To combat this, we’ve seen many big businesses pushing to completely revise their technology stacks.

After TSB was bought by Sabadell in 2015, for instance, the bank decided it needed to push hard to move its systems onto a newer platform. It didn’t go well. Indeed, when you look at the history of big IT transitions (NHS Connecting for Health, anybody? Or maybe Hertz…) it’s easy to see why onlooking organizations may feel nervous about making the big changes to their IT systems that they think they need.

It doesn’t need to be so scary

Here at Browser, we believe that in many cases there is a less risky way to make use of all that difficult-to-access data cooped up in ‘legacy’ or ‘enterprise’ systems. In fact, it’s becoming easier and easier to help our clients with this problem, simply by exploiting one of the biggest shifts in corporate IT in the last decade: cloud computing.

Most large companies have already moved (or are in the process of moving) a lot of their day-to-day IT infrastructure to cloud-based services. The best known are Microsoft Azure, Google Cloud and Amazon Web Services (AWS).

These changes have usually been cost, reliability or floor-space driven, and have simply involved a straight swap from an on-premises, legacy IT system to the equivalent, cloud-based stack.

However, while this change may be billed primarily as an efficiency gain, it also brings an added benefit – new technology.

This is because most suites of cloud services are extremely interoperable, not only within their own brand ecosystem but also with competitors’ ecosystems, through jointly developed common integration standards. This is our route in.

Many modern systems and services support these access and integration standards extremely well, meaning there are myriad opportunities to interrogate previously difficult-to-access data in new ways. Let’s give you a real-life example.

Using Dynamics 365 to exploit this change for a client

One of our clients – a large exhibition management company – provides a customer portal for exhibitors using a service from our sister company, Twine. This has been in place for several years and helps coordinate customers and provides the information they may require, such as floor layouts, delivery instructions and so forth.

What was missing was the ability to show data personalised to the specific exhibitor viewing the portal, such as which particular stand on the floorplan was theirs and which additional services that exhibitor may have purchased. The client did have this data but it was locked up on an on-site legacy system and could not be surfaced in the customer portal, meaning exhibitors were not able to see everything they needed in one place.

Helpfully, however, our client recently migrated its IT stack onto Microsoft’s cloud-based Azure platform, including moving to Azure Active Directory and porting legacy CRM data onto Microsoft Dynamics 365.

Microsoft, of course, makes all of these modern, cloud-based systems accessible via standardised processes, meaning all our client’s customer data – old and new – can now be accessed and interrogated in new ways. Thus, with a few simple, modern Dynamics 365 tools, it’s surprisingly easy to update their portal to provide a better UX.

Remember, the client hasn’t made any disruptive changes to their technology stack here. They’ve simply updated their IT infrastructure to the cloud-based version of a technology they were already using.

Building the Dynamics 365 widget

For maximum flexibility, we decided to build an integration tool using modern web development technologies: Go for an integration server, and React to produce a re-usable widget that would output the user’s personalised information.

This approach meant we could embed the widget into Twine in any way the client wanted, and it used the existing single sign-on provider session already implemented within the platform. From this, we can prove to the integration server which user is logged in, and the personalised data can be fetched from Dynamics 365. This is surprisingly easy given the standardised approach Microsoft has taken, using the OAuth2 flow to log in and OData requests to fetch data predictably.

The first step to extracting the data was to log into the Azure AD instance using the OAuth2 client credentials flow and obtain an access token. This token will expire after a certain period, but since we’re using modern tools we don’t need to worry about this – the Go OAuth2 library will handle that for us.

// Client credentials config for Azure AD; the token endpoint is
// built from the tenant ID.
config := clientcredentials.Config{
	ClientID:     c.ClientID,
	ClientSecret: c.ClientSecret,
	TokenURL:     Authority + c.TenantId + "/oauth2/token",
	// Dynamics 365 expects the target resource URL as an
	// extra token-request parameter.
	EndpointParams: url.Values{
		"resource": []string{c.ResourceUrl},
	},
}
// The returned *http.Client fetches and refreshes tokens as needed.
httpClient := config.Client(context.Background())

Above, we’re using the “golang.org/x/oauth2/clientcredentials” package, meaning we can easily create a normal Go HTTP client that will automatically fetch OAuth2 tokens as required for each request. We don’t need to track when a token has expired and renew it before issuing our API call, as the library does this for us.

We will then use that HTTP client to make whatever API calls we need, and of course the response is JSON – or rather OData, to be exact: a standardised protocol for JSON web APIs that means we know, without reading a single document, how to search for and extract information.

This means requesting a specific account’s details is as simple as requesting a predictable URL that follows the OData format.

// Build the entity URL in the standard OData form, e.g.
// <BasePath>/accounts(<account GUID>).
path := fmt.Sprintf("%s/accounts(%s)", BasePath, accountId)
res, err := client.Get(path)
if err != nil {
	return
}
defer res.Body.Close()

// Read the full OData response body for unmarshalling.
data, err := ioutil.ReadAll(res.Body)
if err != nil {
	return
}

We can then simply unmarshal the JSON response which is, again, in a standardised format.

The future

Now, I won’t pretend that all modern cloud versions of legacy software will be as well standardised as Dynamics 365, but suppliers that go down this route can only make themselves more attractive.

Wholesale IT changes do not always need to result in a debacle. If we exploit the trends that are occurring in the IT marketplace already, it can be surprisingly easy to improve user experience, even when big “enterprise” software systems are still being used.

The point of this post is simply to illustrate that when market-leading enterprise software companies embrace modern standards that lean towards integration, they open up competitive advantages to their clients. We strongly encourage this behaviour. Ultimately, we hope that this kind of thing will trickle down and result in better UX for all end users.

Previously published at https://www.browserlondon.com/blog/2020/03/18/cloud-dynamics-365-infrastructure-set-legacy-data-free/
