Yes, APIs are taking over the world. But are they new? Yes and no. APIs have existed for as long as I can remember, because that is how any software component exposes its data and services to other software components. But they have always been pretty difficult to deal with. Then came the app and web developers who wanted an easier way to get what they needed to build the next killer app. And what do they know best? The protocol that made the Internet as big as it is now: HTTP. REST APIs in particular have become popular because of their stateless nature.

Although great for external APIs and for some internal integration scenarios, REST APIs are not fit for every job. Just like home construction with a full toolset, integration remains a craft where the right pattern or technique should be applied to the right integration job. Exchanging nanosecond transactions where keeping state is important doesn't really work well over HTTP. Doing complex mappings on your API management platform is generally not a good idea either. And REST APIs can become chatty when set up too granularly, and because of their request/reply nature.

I've noticed many times that companies treat API management and application integration as synonyms, focusing all their integration plans on setting up an API management platform. Maybe they do this because API management is new and seen as a way to get rid of the complexity of traditional application integration. But I am afraid API management is not the silver bullet that makes integration easy and straightforward. Application integration is still a vital capability of your organization, so please treat it as such. And no, by application integration I don't mean implementing a classical Enterprise Service Bus architecture. If you have the opportunity to set up a new integration strategy, please look into an event-based microservices architecture with a shared logging and streaming infrastructure, such as Apache Kafka.
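To make the contrast with request/reply concrete, here is a minimal in-memory sketch of the append-only log idea behind event streaming platforms such as Apache Kafka. This is not the Kafka API; the `EventLog` class, the "orders" topic, and the consumer names are all illustrative assumptions to show how producers and consumers become decoupled.

```python
class EventLog:
    """An append-only topic; each consumer tracks its own read offset."""

    def __init__(self):
        self._events = []   # the shared, ordered log of events
        self._offsets = {}  # consumer name -> next offset to read

    def publish(self, event):
        # Producers only append; they know nothing about consumers.
        self._events.append(event)

    def consume(self, consumer):
        """Return all events this consumer has not yet seen."""
        start = self._offsets.get(consumer, 0)
        new_events = self._events[start:]
        self._offsets[consumer] = len(self._events)
        return new_events


orders = EventLog()
orders.publish({"order_id": 1, "status": "created"})
orders.publish({"order_id": 1, "status": "paid"})

# Two independent consumers each replay the full history at their
# own pace, decoupled from the producer and from each other.
billing = orders.consume("billing")
shipping = orders.consume("shipping")
```

Unlike a chatty request/reply API, the producer publishes once and any number of consumers can subscribe later, which is what makes this style attractive for internal integration.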
The common thread running through all these pitfalls seems to be a lack of business focus. Exposing open APIs, too, is typically seen as a technical project. In practice, however, it turns out to trigger many business-related questions, like:
- Who will be the consumer of the API?
- What kind of data or service will it provide?
- Which business domain and application will be the source of the API?
- Should the API be specific for one consumer with one goal? Or should it be more generic?
- Should it expose data from one source application, or compose an aggregate from multiple sources?
- Is the API to be exposed to trusted parties only, or as a public API?
- How many calls and what size payloads do you expect?
- How secure should the API be?
- Does the API expose privacy related data?
- Do the consumers expect or need support?
- What kind of QoS (reliability, traceability, performance, etc.) do the consumers expect or need?
- Will you charge the consumers? If yes, what kind of pricing model will you apply?
- How will you manage version control?
- Have you covered liability with terms and conditions?
All these questions need to be properly answered to make your open API initiatives succeed. In fact, open APIs should be seen as new business services with a new target audience. Your new customers are external developers working for existing customers or partners, or even completely new customers you never knew existed. Oh, and by the way, these new customers expect to consume high-quality data. So if you haven't tackled pitfall 4 yet, please think twice before deciding to expose your data to the outside world…
Almost all organizations (in fact, I've not yet seen one that doesn't) have problems with their data. And those problems really become apparent when collecting and aggregating data for reporting and analytics purposes. Which company doesn't have a data warehouse full of inconsistent and undefined data? And which organization isn't trying to get cool predictive analytics models to work, only to find out that they produce unreliable outcomes or take forever to validate due to data quality issues? But the fact that data quality issues become apparent in the reporting and analytics capabilities of an organization doesn't mean that these issues are best solved there. Even worse, all those BI and big data tools claim to have excellent facilities for cleansing or enhancing data, so why not go for it? Because it doesn't make the problem go away. Fix data quality issues at the root, the source where the data is created, and you will work towards the end of the tunnel of misery. This is what data architecture is truly about. Unfortunately, I see too many organizations fall into the trap of focusing their data architecture activities on building data-lake-like platforms with cool new tools, only to stay stuck and never get anything into production.
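"Fixing it at the root" can be as simple as validating a record at the moment it is created in the source system, instead of cleansing it later in the data warehouse. A hedged sketch of that idea follows; the field names and validation rules (`email`, `country`, `birth_year`) are illustrative assumptions, not a prescription.

```python
from datetime import date


def validate_customer(record):
    """Return a list of data quality problems; an empty list means valid."""
    errors = []
    if not record.get("email") or "@" not in record["email"]:
        errors.append("invalid email")
    if not record.get("country"):
        errors.append("missing country")
    birth_year = record.get("birth_year")
    if birth_year is not None and not (1900 <= birth_year <= date.today().year):
        errors.append("implausible birth year")
    return errors


def create_customer(record, store):
    """Reject bad data at creation time, so it never pollutes downstream."""
    errors = validate_customer(record)
    if errors:
        raise ValueError(f"rejected at source: {errors}")
    store.append(record)
```

The point is where the check lives: in the application that creates the data, so reporting and analytics inherit clean input instead of compensating for dirty input.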
Managing data as a technical issue
Extending the previous pitfall: managing data is not just a nice-to-have in making your digital ambitions come true. Data is traditionally seen as a technical issue IT should solve. However, many businesses are becoming almost entirely digital, making data processing and integration more of a core business capability. A customer is often just a set of data (a user account) that initiates digital transactions. And if all goes well, there is less and less actual human contact with customers, often only when things go wrong and customer care is required. So if data is so important to the success of the business, why is it not treated as such? Data is difficult and complex. It also often becomes a political topic because of the combination of data pollution problems and unclear data ownership. In practice, I also often see a disconnect between data governance and managing data in the application landscape.

I see managing data in your application landscape in the following way. The core applications define the heartbeat of your company: they run the transactions that keep your business alive. Keeping master and reference data synchronized makes sure that heartbeat runs smoothly, without producing pollution. Integration, then, is the blood running through the veins, making sure the data carrying the essential nutrients for the business arrives at the right places. Are no humans needed anymore, then? Of course humans are still needed. People are less efficient than software systems at producing and operating the most common cases, but are much better at supervision and handling exceptions in complex environments. The only thing left is making sure the external data your company consumes is validated before it enters, and you are all set!
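Keeping master and reference data synchronized starts with detecting where systems disagree. Below is a minimal sketch of a drift check between two applications, assuming both expose their reference data (here: country codes) as simple key/value mappings; the system names and data are made up for illustration.

```python
def find_drift(system_a, system_b):
    """Return keys whose values disagree, including keys missing on one side."""
    drift = {}
    for key in set(system_a) | set(system_b):
        if system_a.get(key) != system_b.get(key):
            # Record both sides' values; None marks a missing entry.
            drift[key] = (system_a.get(key), system_b.get(key))
    return drift


crm_countries = {"NL": "Netherlands", "DE": "Germany"}
erp_countries = {"NL": "The Netherlands", "DE": "Germany", "BE": "Belgium"}

# "NL" disagrees between the systems and "BE" is missing from the CRM:
# both are candidates to fix at the source before they pollute reports.
drift = find_drift(crm_countries, erp_countries)
```

A periodic check like this turns "our reference data is polluted" from a vague complaint into a concrete, assignable list of mismatches.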
This is always the most touchy subject in my personal experience. When engaging in a digital transformation, everyone is thinking about new ways to engage with customers and other stakeholders through apps, gadgets and APIs. Nobody really wants to talk about the soundness of the data being used and created via those channels. But what kind of customer experience are you trying to offer when the data is incorrect or unclear? The foundation of a solid customer experience is the soundness of the data. And what seems, in practice, to be the biggest cause of companies not having sound data? It is not the technology, but the lack of a clear and consistent business language. At the end of the day, technology only translates what the business wants it to do. If the business doesn't define these business concepts clearly, who will? Do you trust your developers to define business concepts correctly for you? And yes, establishing such a business language is a hard, maybe even unreachable goal, because it can become very theoretical and philosophical. But there is also a practical approach to setting up a business language that really makes the difference. So, if your organization wants to truly distinguish itself from all the other digital transformations taking place, focus on what (almost) all of them are neglecting.
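One practical way a business language shows up in software is when each business concept gets its own explicitly named type, rather than everything being a generic "client" record. The sketch below assumes two illustrative concepts, `Prospect` and `Customer`, and a business event, "a prospect becomes a customer"; the names are hypothetical examples of a shared vocabulary, not a proposed model.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Prospect:
    """Someone who has shown interest but has not bought anything yet."""
    name: str
    email: str


@dataclass(frozen=True)
class Customer:
    """Someone with at least one completed purchase."""
    name: str
    email: str
    customer_id: str


def convert_to_customer(prospect, customer_id):
    """The business event 'prospect becomes customer', named as such in code."""
    return Customer(prospect.name, prospect.email, customer_id)
```

When the business and the developers use the same words for the same concepts, a misunderstanding like "is a prospect already a customer?" surfaces in a design review instead of in production data.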
Being truly agile is the dream of any organization. In practice, the term 'agile' is often translated into: being able to deliver new customer features fast. This, however, is not exactly what the original Agile Manifesto meant. Many things about agile are in fact wrongly interpreted. 'Working software over documentation' is in practice often translated into 'working software, no documentation'. Another example is the idea that agile made architecture obsolete, because agile teams are self-steering. A while ago I had the opportunity to talk to Arie van Bennekum about some of the misconceptions of agile. He acknowledged to me that architecture is vital for making agile work in a complex enterprise environment, and that agile means teams are self-organizing instead of self-steering: they still have to play by the policies and rules of the organization. But besides the misinterpretations of what an agile way of working should look like, your architecture should of course also facilitate or enable agility. If your architecture is made up of an enormous monolithic application with enormous internal dependencies (a.k.a. a big ball of mud), then good luck to you. And even if you don't have such a limiting architecture, how do you ensure you constantly keep cleaning up technical debt so you don't create one in the future?
Keeping your customer journeys too vague
Creating customer journeys is a powerful way to explore new forms of interaction with your customers. Oftentimes, these customer journeys are the outcome of externally facilitated accelerator workshops with predominantly business stakeholders. Although very valuable, these customer journeys often stay too abstract and greenfield-oriented. They are not yet concrete enough for digital development teams to implement. Also, no feasibility check has been done during the customer journey ideation phase. As a result, the rather vague requirements are functionally misinterpreted by the digital development teams, or turn out to be technically infeasible during the first few sprints. The high expectations of business owners are then not met, leading to disappointment during the crucial starting phase of the digital transformation journey.
Are you in the midst of a digital transformation? Or thinking about starting one? Then you'd better avoid these common pitfalls. By reading this blog post, you will learn how to spot them.
1) Keeping your customer journeys too vague
2) Thinking becoming agile is only about delivering cool new features fast
3) Not bothering to create a ubiquitous business language
4) Managing data as a technical issue
5) Thinking big data technologies are the answer to your data problems
6) Exposing open APIs just because everyone is doing that nowadays
7) Mixing up API management with application integration
8) Worrying about security and privacy later
Will follow later
9) Not having a clear plan how to manage dependencies
Will follow later
10) Trying to find problems for cool new technologies
Will follow later