
Ready… set… integrate!

As we continue our knowledge-sharing journey, this blog is about how to prepare for success when it comes to integrations. We have assisted a fair few clients in recent months who learned the hard way that preparation is key to success with integrations. This blog should help you consider the important steps in integration planning, so that the integration drives value and operates without issues from day one after go-live!


Oh wait. Darn data quality. Tangled processes. People doing things they shouldn't be doing. Never worry free 🙂. Pun intended.

As it turns out, setting up an integration is sometimes easy, other times harder, but nearly always worthwhile in the end, if executed well. The reason we say that is that the amount of time and frustration a seamlessly working integration saves is invaluable!

What is the secret sauce for success?

So, after implementing hundreds of integrations for our customers, we felt we should share some insights into how to set up your integration project for success. A well-planned integration project can save you a tremendous amount of time in execution as well.

Using the learnings below means you don’t have to reinvent the wheel on some of the obstacles commonly met along the way when configuring an integration.

The data points

We know this seems obvious, but there are some specific areas to consider when thinking about integrations.

The two most important concepts are:

  • source of truth
  • system of record

Often these terms get thrown about loosely. But what do they really mean? In the context of integrations, we often ship data from system A to system B, with varying levels of conversion and other logic in the middle. In order to do that, we need to know what data points to take, from which system to which, and how.

In making this conceptual design, there needs to be a source of truth for each data point: the ultimately most reliable version of that data. There is no guidance or best-practice list to turn to for determining the source of truth; it really depends on the processes in your organisation.

For a set of data points that you plan to integrate, the sources of truth could even sit in different systems (or: systems of record), meaning there is likely not one system you would label as the source of truth for your entire organisation, or for a planned integration, for that matter. One source of truth is easy, but multiple sources of truth, like four instances of Marketing Cloud in one company (a live example from one of our clients), are a lot harder. Therefore it is important to know about this early on in the planning phase.

So the core idea is to identify all relevant sources of truth and the system of record each is kept in, before even thinking about the processes and corresponding flows in your planned integration.
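To make this concrete, below is a minimal sketch (in Python, with made-up data points and system names) of what such a data-point inventory could look like:

```python
# A minimal sketch of a data-point inventory; all field and system names
# here are hypothetical placeholders for your own landscape.
DATA_POINTS = {
    "customer_email":  {"source_of_truth": "CRM",
                        "also_held_in": ["Marketing Cloud", "ERP"]},
    "invoice_total":   {"source_of_truth": "ERP",
                        "also_held_in": ["CRM"]},
    "campaign_opt_in": {"source_of_truth": "Marketing Cloud",
                        "also_held_in": []},
}

# How many distinct sources of truth does this integration touch?
# More than one means extra care is needed in the flow design.
sources = {dp["source_of_truth"] for dp in DATA_POINTS.values()}
print(f"{len(sources)} source-of-truth system(s): {sorted(sources)}")
```

Even a simple inventory like this makes multiple sources of truth visible early, before any flows are designed.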

The process and flow

Once the data points, the origins (system/s of record), destinations and transformations are known, it is time to map the processes that the integration will support. Most processes become a lot simpler after the integration is implemented, since that is often one of the rationales to do an integration project in the first place. Therefore, some level of change management is required as part of the integration project, including communications to all teams involved in the processes affected.

In our work, we find that the common surface-level understanding of a process does not always match reality. A detailed and very specific walk-through is therefore an important investment of your time in the planning phase of the integration. It will give you a realistic picture of the current process, and of the new process after the integration is implemented. In addition, you will be in an informed position regarding all possible downstream effects, risks, associated mitigation strategies, and the opportunities that come as a result of implementing the integration.

Is the data synchronised?

So now we know the data points, where the source of truth is, the system/s of record, how the business process will change, and how we will manage this change. The next step is to determine whether the data is currently actually synchronised at all.

Often when customers initially indicate their data is synchronised, they are referring to a manual process, and in our world the data is in fact not synchronised if this is the case. Because people make mistakes and input information in different ways, it can be hard to work out which records match which, especially in the absence of unique identifiers that link records together.

Therefore, the next stage in the planning process entails determining whether the data points currently in the systems are actually synchronised or not. If not, one approach can be to overwrite the data in the destination system using Harmonizer. This is an especially good idea if the dataset is robust in the source of truth and a unique identifier exists. If there is no identifier, this may have to be a manual data cleanup process. Another approach may be to accept that historical data is not synchronised. All approaches can work, but it needs to be a conscious decision about what best fits the process and data involved.
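As an illustration, here is a minimal sketch of such a synchronisation check in Python. It assumes both systems can expose their records as dictionaries sharing a unique identifier and a comparable field; all names are hypothetical, and this is not a description of how Harmonizer itself works:

```python
# A minimal synchronisation check between two systems, assuming records
# share a unique identifier ("external_id") and a comparable field
# ("email"). Both field names are hypothetical.
def sync_report(source_records, destination_records, key="external_id"):
    src = {r[key]: r for r in source_records}
    dst = {r[key]: r for r in destination_records}

    missing_in_destination = sorted(src.keys() - dst.keys())
    orphaned_in_destination = sorted(dst.keys() - src.keys())
    mismatched = sorted(
        k for k in src.keys() & dst.keys()
        if src[k].get("email") != dst[k].get("email")
    )
    return missing_in_destination, orphaned_in_destination, mismatched

# Toy example:
source = [{"external_id": "1", "email": "a@x.com"},
          {"external_id": "2", "email": "b@x.com"}]
dest = [{"external_id": "1", "email": "a@x.com"},
        {"external_id": "3", "email": "c@x.com"}]
print(sync_report(source, dest))  # -> (['2'], ['3'], [])
```

Without a unique identifier, the `key` lookup above falls apart, which is exactly why the absence of one often forces a manual cleanup.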

A word of warning

If you opt not to clean the data for historical records in your destination system, you will have to accept a level of exception management once the integration starts synchronising the new data. This is a result of the destination system holding new data that potentially looks different to the historic data. It can also lead to a more complex process and integration flow, and to duplicate records in the destination system. The integration can be more complex to troubleshoot as well.
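To give a flavour of what that exception management could look like, here is a minimal Python sketch that parks conflicting records for manual review instead of overwriting them or creating duplicates; the matching rule and field names are hypothetical:

```python
# Minimal exception management during sync: conflicting records are parked
# in an in-memory queue for manual review. Field names are hypothetical.
exceptions = []

def upsert(record, destination, key="external_id"):
    existing = destination.get(record[key])
    if existing is None:
        destination[record[key]] = record   # clean insert
    elif existing.get("email") == record.get("email"):
        destination[record[key]] = record   # safe update
    else:
        # Historic data looks different: park for review rather than
        # overwrite or duplicate.
        exceptions.append({"incoming": record, "existing": existing})

destination = {"1": {"external_id": "1", "email": "old@x.com"}}
upsert({"external_id": "1", "email": "new@x.com"}, destination)
print(len(exceptions))  # -> 1 conflicting record parked for review
```

Every record that ends up in that queue is manual work for someone, which is the hidden cost of skipping the cleanup.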

Therefore, we tend to recommend cleaning data during the planning phase of an integration. It might seem like a lot of upfront work, but it will save a lot of effort down the track, especially in testing and in ongoing maintenance of the process and integration.

Technical considerations

These depend a lot on the applications you are integrating with. Some systems are really mature; at the other end of the scale, there are systems that are just starting out. Generally speaking, the latter category of systems might not work in all the ways you expect, or the API documentation may be lacking, incomplete or inaccurate.

A second benefit of a more mature system can be that it has in-built checks for data quality, although this is not always the case. The better the data quality, the smoother the integration will run from the outset.

To investigate and mitigate technical challenges, it is most insightful to see the integration in action. It would be dangerous to do this straight on a production environment, though, in case anything breaks or does severe damage to the data or to a system. This means that after fully understanding and documenting the data points, processes/flows and level of synchronisation, the integration needs to be set up in a test environment.

The key questions for us are whether the data comes out of the system/s of record in the expected format and quantity, whether our code runs over it smoothly, whether we can write to destination systems without issues, and whether upstream and downstream APIs are otherwise behaving as they should.
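As an illustration, here is a minimal Python sketch of that kind of smoke test; the expected fields and minimum count are placeholder assumptions, not fixed rules:

```python
# A minimal smoke test for a test-environment run: does the extracted data
# arrive in the expected format and quantity? All expectations here are
# hypothetical placeholders.
EXPECTED_FIELDS = {"external_id", "email", "updated_at"}
EXPECTED_MIN_COUNT = 100

def smoke_test(extracted_records):
    # Expected quantity?
    assert len(extracted_records) >= EXPECTED_MIN_COUNT, \
        f"only {len(extracted_records)} records extracted"
    # Expected format?
    for r in extracted_records:
        missing = EXPECTED_FIELDS - r.keys()
        assert not missing, f"record {r.get('external_id')} missing {missing}"
    print(f"OK: {len(extracted_records)} records, all expected fields present")
```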

A final and very generic technical ‘rule of thumb’ is, as always: the simpler you can set something up, the better. There are many ways to approach a challenge, and it pays off to stop, think and find the simplest solution. This enables a better customer experience and also means less technical work to maintain and monitor the integration.

Conclusion

We hope this helps you to better plan your integration projects. We know it may seem like a lot of work, but it is worth it for your integrations and beyond: it will do wonders for your data management, data governance, and business processes in general. All these areas get uplifted as you work through this.

And finally, having an excellent understanding of data benefits your cyber security posture and enhances data privacy practices as well.

Photo by Braden Collum on Unsplash
