By Remi Poujeaux, SVP of Innovation at Odaseva
This first appeared as a series of LinkedIn posts
Part 1: A story of battleships
Having worked in IT for the past 20+ years, I’ve seen the ineluctable globalization of companies, reflected in the centralisation of IT.
Global systems, global data, global processes, Centers of Excellence. Worldwide scale, no borders. One size fits all (or at least a few sizes fit most…)
There was somewhat of a balance between local and global approaches – but overall there was a clear trend: Globalization.
But then a few years ago, the pendulum began swinging in the other direction. The globalization of IT architecture went into reverse: central governments created “information borders” to restrict data movement between countries.
How did IT leaders react to this change? How did it impact today’s approach to architecture?
We can answer these questions through the lens of a historical movement – the evolution of naval warfare.
The requirements for a warship are simple, but they need the right balance for the right purpose: operating weapons at sea, travelling fast over long distances, and carrying the necessary crew and supplies.
The solutions evolved over the centuries, strongly influenced by the limitations of the available technologies.
- First, ships were made of wood. This limited their overall size, while propulsion by sail or oar limited their speed and food preservation limited the distance they could travel. Communication between ships was only visual.
- Then steel ships came along. These enabled navies to build larger and larger ships with bigger and bigger weaponry. But you can’t fight physics: doubling a ship’s length multiplies its surface by 4x and its volume by 8x, resulting in costly monsters that were each a single point of failure.
- A movement called “Jeune École” then appeared in France. Driving this movement was the idea that instead of building a few battleships, the French navy would build a fleet of small boats, providing agility and resilience by design. But one major flaw rendered the idea impractical: the radius of action. A small boat cannot safely travel a long distance. So the strategy of building bigger and bigger battleships prevailed.
- The change ultimately came from the air: war planes made the battleships obsolete within a few years. The naval pattern changed to one aircraft carrier with its jet fighters, complemented by a connected fleet of ships of different sizes… but no more gigantic vessels.
So, does this ring a bell yet in the context of IT architecture? Let’s continue.
Part 2: The changing IT landscape
So how do these lessons apply to the deglobalization of IT architecture?
A parallel can be made with the different steps we’ve seen in architecture:
- Mainframes are the wooden ships and battleships: leveraging the best of their era’s technology but not very agile, forcing centralisation with a single point of failure and no resilience. Mainframes are limited by storage capacity, memory size and network speed.
- The “fleet of small ships” is the microservice architecture. Pushed to the extreme, it becomes unmanageable and unrealistic for companies whose core business is not IT and who must rely on proven technologies such as SAP or Salesforce, which provide the foundation for their digitization journey.
- The aircraft carrier represents the platform approach, and can inspire a modern architecture in our defragmenting world. By keeping shared resources such as the ERP or the CRM central, you can have agile deployments (the jet fighters) complemented by peripheral systems (the fleet of specialized ships) connected through APIs (the radio communication between the ships). This is the best trade-off between over-centralisation and over-distribution: it leverages existing investment and lets teams focus on the differentiators at the edge.
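The carrier-and-fleet pattern above can be sketched in a few lines of code. This is a minimal illustration, not a real integration: all class, method, and record names here are hypothetical. The point is the shape of the design: a central platform (the carrier) owns the shared records, and lightweight peripheral systems (the fleet) touch it only through a narrow API surface (the radio link).

```python
# Hypothetical sketch of the platform pattern: one central system of
# record, several peripheral services that use only its public API.

class CentralPlatform:
    """Stands in for a shared system of record such as a CRM or ERP."""
    def __init__(self):
        self._records = {}

    # The API surface: peripheral systems never touch storage directly.
    def get(self, record_id):
        return self._records.get(record_id)

    def upsert(self, record_id, data):
        self._records[record_id] = data
        return record_id


class PeripheralService:
    """A specialized edge system that depends only on the platform API."""
    def __init__(self, name, platform):
        self.name = name
        self.platform = platform

    def enrich_customer(self, record_id, extra):
        # Read through the API, add local knowledge, write back.
        record = self.platform.get(record_id) or {}
        record.update(extra)
        return self.platform.upsert(record_id, record)


# Usage: two independent peripheral services share one central platform.
crm = CentralPlatform()
billing = PeripheralService("billing", crm)
support = PeripheralService("support", crm)

billing.enrich_customer("cust-1", {"plan": "enterprise"})
support.enrich_customer("cust-1", {"tickets_open": 2})

print(crm.get("cust-1"))  # both enrichments land on the shared record
```

The design choice worth noticing: the peripheral services can be added, replaced, or deployed independently, because their only coupling is the small API of the central platform.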
We see that for performance, resilience, and regulatory reasons, the trend is towards a more distributed architecture.
Operational processing and storage need to be as close as possible to the operations, at the edge.
Analytics can be centralized, since it benefits from ingesting as much data as possible.
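A minimal sketch of that split, with hypothetical names throughout: each edge node processes and stores its own operational events locally, and only aggregates flow to a central analytics store. Raw data stays at the edge; the global view is built from summaries.

```python
# Hypothetical sketch: operational data stays at the edge,
# only aggregates are shipped to central analytics.

class EdgeNode:
    """Operational processing and storage, close to the operations."""
    def __init__(self, region):
        self.region = region
        self._events = []  # raw events never leave the edge

    def record(self, amount):
        self._events.append(amount)

    def aggregate(self):
        # Only this summary crosses the "information border".
        return {"region": self.region,
                "count": len(self._events),
                "total": sum(self._events)}


class CentralAnalytics:
    """Ingests aggregates from every edge node for a global view."""
    def __init__(self):
        self.summaries = []

    def ingest(self, summary):
        self.summaries.append(summary)

    def global_total(self):
        return sum(s["total"] for s in self.summaries)


# Usage: events are recorded regionally; analytics sees only summaries.
eu, us = EdgeNode("eu"), EdgeNode("us")
eu.record(10)
eu.record(20)
us.record(5)

analytics = CentralAnalytics()
analytics.ingest(eu.aggregate())
analytics.ingest(us.aggregate())
print(analytics.global_total())  # 35
```

This is of course a toy, but it captures the trade-off: the central side gets the breadth it needs for analytics, while the detailed, potentially regulated data stays where the operations happen.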
This approach is centered on the applications… but what about the data? And especially sensitive data? I’ll discuss this in the next post!