The ingenuity and perseverance in developing and implementing technical workarounds during the initial stages of the pandemic are already finding their way into technology folklore. From plugging gaps in workflows and processes, to providing access and opportunity to share information, solutions had to be found - which forced new ways of working and, often, the abandonment of long-standing enterprise technology strategies.
But new challenges are emerging. The post-Covid patchwork of workarounds is turning into an ungoverned information sprawl, at a time of unprecedented need for resilient systems and technology that can adapt in response to unexpected stress.
At the centre are monolithic systems - in some cases serving the company well for the last decade - that hamper cohesion across technology platforms.
The risk is that this monolith - let's call it Legacy Systems 2.0 - will force organisations to deliver content and processes within the limitations of existing technologies. Rather than creating a foundation for future agility and competitive strength, such technologies will erode customer experience, productivity and profitable growth.
Take for example a content management platform, which may have been in place for a decade. In reality, there’s a good chance it has become a disparate collection of overlapping technologies that have led to an amorphous sprawl of information, inadequate functionality, and the potential for non-compliance and security failures.
Dismantling the monolith
With more content being created than ever before, adding more apps to the monolith merely makes it harder for users within the business to work effectively, whether individually or as a cohesive whole.
Arguably, there is limited value in reinventing the organisational structure to make it fit for the digital age, unless the technology can support your new ambitions.
The alternative is to develop a strategy that delivers future flexibility and modularity.
This should address the inherent opportunities of democratising software development, to empower non-technical staff to take ownership of the process. By introducing low- or no-code tools, new applications can be created at, for example, departmental level, without the need for expensive, time-consuming and detailed code.
Not surprisingly, a growing number of organisations see this as the way forward, with some analysts predicting that nearly three quarters of new enterprise-developed applications will be built this way, by the middle of this decade.
While a devolved approach to technology development often becomes a catalyst for further business process innovation and automation, users will still need access to data from multiple applications. By harnessing software components, or Application Programming Interfaces (APIs), the flow of data between individual applications can be streamlined, benefiting productivity, reducing errors, and improving data security and compliance.
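The value of that API layer is easiest to see as a single translation point between two applications' data formats. The sketch below illustrates the idea in Python; the two "apps", their field names and the mapping are hypothetical, not taken from any real product.

```python
from dataclasses import dataclass

# Hypothetical record formats for two departmental apps.
# All names here are illustrative assumptions.

@dataclass
class CrmContact:            # as exported by an imagined CRM API
    full_name: str
    email: str

@dataclass
class InvoiceRecipient:      # as expected by an imagined invoicing API
    name: str
    email_address: str

def crm_to_invoicing(contact: CrmContact) -> InvoiceRecipient:
    """Single translation point between the two apps' schemas.

    Centralising the mapping means a schema change in either app
    is fixed once here, not in every consumer - fewer errors, and
    one place to enforce data-handling and compliance rules.
    """
    return InvoiceRecipient(name=contact.full_name,
                            email_address=contact.email)

record = crm_to_invoicing(CrmContact("Ada Lovelace", "ada@example.com"))
print(record)
```

In a real deployment the dataclasses would be replaced by calls to each application's actual API, but the design point is the same: the integration logic lives in one narrow, testable seam.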
In my experience, this modernisation process is likely to accelerate the move away from centralised content repositories that are unwieldy to host and expensive to run.
As far back as 2017, IT research and advisory firm Gartner declared that ECM (Enterprise Content Management) was "dead" and "kaput" and that "content services" was now the name of the game, as new cloud technologies enabled working across multiple systems and content repositories.
Fast forward five years and today’s most advanced content services platforms dynamically connect content and streamline workflows.
Migrating away from the monolith
So how do you go about this process of modernisation, and what lessons have been learned by those that have already gone down this path?
Start by auditing the content you have and its use across departments.
This assessment process will not only help identify and prioritise issues but also provide you with a 'requirements list', which will be helpful when vetting potential content services partners. And you do need to find the right partner because it will make the transition that much easier.
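An audit like this often starts as nothing more than a tally of where each department's content actually lives. The following minimal sketch assumes a hypothetical inventory - departments, systems and document counts are all invented for illustration - and surfaces the overlaps that feed the requirements list.

```python
from collections import Counter

# Hypothetical audit inventory: (department, system, doc_count) rows,
# e.g. exported from a file-share crawl. All figures are illustrative.
inventory = [
    ("Sales",   "SharePoint",   1200),
    ("Sales",   "Shared drive",  800),
    ("Finance", "DMS",          3000),
    ("Finance", "Shared drive",  150),
    ("HR",      "SharePoint",    400),
]

# Tally volume per department and note which systems each one uses;
# departments spread across several systems are prime candidates
# for consolidation on the requirements list.
per_dept = Counter()
systems_per_dept = {}
for dept, system, count in inventory:
    per_dept[dept] += count
    systems_per_dept.setdefault(dept, set()).add(system)

for dept, total in per_dept.most_common():
    print(f"{dept}: {total} items across "
          f"{len(systems_per_dept[dept])} system(s)")
```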
Establish a set of appropriate KPIs and gather relevant historic metrics to act as a baseline. You will want to be able to evaluate the solution's impact on the business, including the softer benefits, such as how new tools might be helping employees do specific jobs better.
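Once baseline figures exist, evaluating the new solution is a simple before-and-after comparison. Here is a small sketch of that calculation; the metric names and numbers are invented for illustration, not drawn from any real rollout.

```python
# Hypothetical KPI baselines vs. post-rollout readings.
baseline = {"avg_retrieval_secs": 42.0, "duplicate_rate_pct": 18.0}
current  = {"avg_retrieval_secs": 12.0, "duplicate_rate_pct":  6.0}

def pct_change(before: float, after: float) -> float:
    """Signed percentage change relative to the baseline value."""
    return (after - before) / before * 100.0

# Negative values mean the metric fell after rollout - an
# improvement for both of these example KPIs.
report = {k: round(pct_change(baseline[k], current[k]), 1)
          for k in baseline}
print(report)
```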
Once this is done, let users start testing the new solution. This is a good way to reveal any critical issues and process gaps before full deployment.
As new systems are introduced, work out how to move away from old systems, without disrupting daily workflows or customer experience.
Develop a plan for wider implementation of any successful solution. For instance, what will be the nature of future iterations and when will they be rolled out? If you have chosen your partner wisely, they will be able to provide advice and direction if you find a solution that is not working quite as expected.
At this stage, by monitoring progress against the KPIs, you should be able to start seeing the impact of your implementation and determine the performance of the new solution.
Seek feedback from peers and external stakeholders not originally involved in implementing the new system.
Scale up the new solution to other teams and locations as required, keeping in mind your longer-term content strategy, which ideally your partner will share.
As the volume and diversity of data continue to grow, the shift away from yesterday's monolithic, self-contained systems and repositories is set to gain further momentum.
The future is one of shared repositories and common APIs, which can best be achieved with responsive cloud content services technology platforms. With the capacity to shapeshift to the needs of both the few and the many, these will be at the centre of digital workplace architecture, providing employee-centricity while anticipating and adapting to a changing world of work.