
Each Technology page is effectively a mini site for a particular technology solution. Each page contains contextual information about the topic, commentary from our analysts, links to product information and additional educational resources if you wish to dig deeper.

  • All
  • Action
  • Data
  • Infrastructure
  • Trust

5G, IoT and Edge Computing

Revolutions are by their nature disruptive: they create winners and losers. Businesses need to implement IoT projects now, and to develop their 5G network and edge computing strategies, if they are to win in the 4th Industrial Revolution.

Advanced threat protection

Organisations increasingly have to defend against highly targeted attacks and threats that use ever more sophisticated tools and methods.


Analytics

Analytics supports the complex multi-variant decision-making that is now taking place in the volatile global markets in which companies, and most governmental bodies, now operate.

Big Data

Big Data refers to the ability to analyse any type of data, not limited to (but including) data in relational tables. There may be a lot of such data …

Big Software

Big Software delivers real competitive advantage through significantly lower IT infrastructure costs and the ability to deploy more new business applications faster.


Blockchain

A blockchain is a distributed ledger, implemented across many servers, consisting of a continuously growing list of records, called blocks, which are linked across the whole network and secured using cryptography.
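The linking and securing described above can be sketched in a few lines: each block's hash covers both its own contents and the previous block's hash, so altering any earlier record invalidates everything after it. This is a minimal single-machine illustration, not a distributed implementation.

```python
import hashlib
import json

def make_block(index, data, prev_hash):
    """Create a block whose hash covers its contents and the previous block's hash."""
    block = {"index": index, "data": data, "prev_hash": prev_hash}
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

def verify_chain(chain):
    """Recompute each hash and re-check the links; any tampering breaks the chain."""
    for i, block in enumerate(chain):
        body = {k: block[k] for k in ("index", "data", "prev_hash")}
        payload = json.dumps(body, sort_keys=True).encode()
        if block["hash"] != hashlib.sha256(payload).hexdigest():
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

genesis = make_block(0, "genesis", "0" * 64)
chain = [genesis, make_block(1, "payment A->B", genesis["hash"])]
```

Because each block embeds its predecessor's hash, editing one record forces an attacker to recompute every later hash — across every server holding a copy.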

Business Intelligence & CPM

From being a rather neglected area tacked on to the end of ERP solutions, Business Intelligence has become central to the offering of all of the major vendors.


Collaboration

A social business uses computer-mediated networks of people to create business outcomes. The greatest value comes when these networks take on a life and culture of their own, producing ‘social collaboration’.

Complex Event Processing

Complex event processing, also known as event, stream, or event stream processing, is a technique for querying data before it is stored in a database or, in some cases, without it ever being stored at all.
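The idea of querying data before (or instead of) storing it can be sketched as follows — a hypothetical sensor scenario where a sliding window over the live stream raises alerts without the readings ever touching a database:

```python
from collections import deque

def detect_bursts(events, window=3, threshold=250):
    """Scan a stream of (sensor_id, reading) events and flag any sensor whose
    last `window` readings sum past `threshold` -- the stream itself is
    never stored, only a small per-sensor window."""
    recent = {}   # sensor_id -> deque of its most recent readings
    alerts = []
    for sensor, reading in events:
        buf = recent.setdefault(sensor, deque(maxlen=window))
        buf.append(reading)
        if len(buf) == window and sum(buf) > threshold:
            alerts.append(sensor)
    return alerts

stream = [("s1", 90), ("s2", 10), ("s1", 95), ("s1", 100), ("s2", 20)]
print(detect_bursts(stream))  # s1's last three readings total 285 > 250
```

Production systems apply the same pattern at scale, with richer query languages over the moving window.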

Data Archival

Data archival is the process of moving data that is no longer actively used to a separate data storage device for long-term retention of that data.

Data Catalogues

If you care about analytics – and who doesn’t – then a data catalogue is a fundamental requirement for creating a data-driven business and supporting self-service.

Data Centres

Data Centres are the dedicated spaces used to locate and run servers, storage and networking equipment.

Data Governance

Data governance is about ensuring that your organisation’s information assets meet corporate policies for accuracy, timeliness, completeness and compliance.

Data Integration

Data integration is a set of capabilities that allow data that are in one place to be used in another place, regardless of how they are formatted.

Data Masking

Developers and data scientists may not be allowed to see detailed personal information but at the same time they may need that data, or its equivalent.
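Two common masking techniques resolve exactly this tension — giving developers data that behaves like the real thing without revealing it. The sketch below (illustrative only; real tools offer many more formats and policies) shows deterministic tokenisation, which preserves joins, and partial redaction, which preserves field shape:

```python
import hashlib

def mask_email(email, salt="demo-salt"):
    """Replace the local part with a deterministic token: the same input always
    maps to the same token, so records can still be joined, but the real
    address is never revealed. (Toy scheme for illustration.)"""
    local, _, domain = email.partition("@")
    token = hashlib.sha256((salt + local).encode()).hexdigest()[:8]
    return f"user_{token}@{domain}"

def mask_card(number):
    """Keep only the last four digits, preserving the field's length and type."""
    digits = [c for c in number if c.isdigit()]
    return "*" * (len(digits) - 4) + "".join(digits[-4:])
```

Deterministic masking is what lets a test database keep its referential integrity: every occurrence of the same customer masks to the same value.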

Data Preparation (self-service)

Data preparation tools allow business analysts/users (sometimes data scientists) to prepare data for analysis without having to rely on IT.

Data Profiling and Discovery

Data profiling collects statistics about the validity of data, while data discovery identifies relationships between different data elements.
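The statistics a profiler collects are typically simple per-column measures of completeness and variety. A minimal sketch, using made-up sample values:

```python
from collections import Counter

def profile_column(values):
    """Basic profiling statistics for one column: how complete it is,
    how varied, and which value is most common."""
    non_null = [v for v in values if v is not None]
    return {
        "rows": len(values),
        "nulls": len(values) - len(non_null),
        "distinct": len(set(non_null)),
        "top": Counter(non_null).most_common(1),
    }

stats = profile_column(["UK", "US", None, "UK", "FR"])
```

Even these four numbers quickly surface problems — a supposedly mandatory column with nulls, or a "country" column with thousands of distinct spellings.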

Data Quality

Data quality is about ensuring that data is fit for purpose; that it is accurate, timely and complete enough for use in business.

Data Virtualisation

Data virtualisation makes all data, regardless of where it is located and regardless of what format it is in, look as if it were in one place, in a consistent format.
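The "one place, consistent format" effect can be sketched with a facade that answers queries by reading heterogeneous sources on demand, rather than copying them into a single store (real products add query pushdown, caching and optimisation on top of this idea):

```python
import csv
import io
import json

class VirtualTable:
    """Present rows from heterogeneous sources (here CSV text and JSON text)
    as one uniform stream of dicts, without materialising a combined copy."""
    def __init__(self):
        self.sources = []

    def add_csv(self, text):
        self.sources.append(lambda: list(csv.DictReader(io.StringIO(text))))

    def add_json(self, text):
        self.sources.append(lambda: json.loads(text))

    def rows(self):
        # Each query re-reads the underlying sources on demand.
        for fetch in self.sources:
            yield from fetch()

vt = VirtualTable()
vt.add_csv("name,city\nAda,London\n")
vt.add_json('[{"name": "Lin", "city": "Oslo"}]')
```

The consumer sees one table of dicts; where and how each row is actually stored is invisible.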

Data Warehousing

A data warehouse is a database implementation that supports the storage and analysis of historic data, with the aim of understanding what happened in the past or predicting what will happen in the future.

Database Management Systems

A Database Management System provides a way of storing data and managing its security, availability, quality etc.


DevOps

DevOps is about encouraging a culture of collaboration across development, operations and business silos (and, by implication, removing these silos).


Encryption

Encryption – and its opposite, decryption – is the process of converting data into code that is unreadable by a human who does not possess the necessary key for decrypting it.
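The key-dependence described above can be illustrated with a toy symmetric stream cipher — a keystream derived from the key is XORed with the data, and applying the same operation again restores the plaintext. This construction is for illustration only and is NOT a vetted cipher; real systems use standards such as AES.

```python
import hashlib
from itertools import count

def keystream(key, length):
    """Derive a pseudo-random keystream from the key by hashing it with a
    counter. (Toy key-derivation -- illustrative only, not secure.)"""
    out = b""
    for i in count():
        out += hashlib.sha256(key + i.to_bytes(4, "big")).digest()
        if len(out) >= length:
            return out[:length]

def xor_cipher(key, data):
    """XOR the data with the keystream: applying it twice with the same key
    restores the plaintext -- the essence of symmetric encryption."""
    return bytes(a ^ b for a, b in zip(data, keystream(key, len(data))))

secret = xor_cipher(b"k3y", b"meet at noon")   # encrypt
plain = xor_cipher(b"k3y", secret)              # decrypt with the same key
```

Without the key, `secret` is unreadable; with it, decryption is the same cheap operation as encryption.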

Endpoint Detection and Response

EDR technologies vastly improve visibility into not only which endpoints are on the network, but also their security posture, allowing for advanced threat detection and response to counter even the most sophisticated threats.

Graph Databases

Graph databases store data in terms of entities and the relationships between entities.
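That entity-and-relationship model can be sketched with a minimal in-memory store — a hypothetical illustration of the data model, not of how production graph databases index or scale:

```python
from collections import defaultdict

class TinyGraph:
    """Store entities (nodes) and named relationships (edges) between them --
    the core data model of a graph database."""
    def __init__(self):
        self.edges = defaultdict(list)   # node -> list of (relationship, node)

    def relate(self, a, rel, b):
        self.edges[a].append((rel, b))

    def neighbours(self, node, rel):
        """One-hop traversal: which entities are linked to `node` by `rel`?"""
        return [b for r, b in self.edges[node] if r == rel]

g = TinyGraph()
g.relate("Ada", "WORKS_AT", "AcmeCo")
g.relate("Ada", "KNOWS", "Lin")
g.relate("Lin", "WORKS_AT", "AcmeCo")
```

Queries like "who does Ada know?" become direct traversals rather than the multi-table joins a relational database would need.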

Hybrid Infrastructure Management

Hybrid Infrastructure Management is the monitoring and analysis of your IT infrastructure in terms of its impact on the performance and availability of your business applications.

Information Governance and Data Security

Adequate and appropriate security controls must be applied to all information and data for effective information governance purposes and accountability.


Infrastructure

Our infrastructure practice aims to articulate the infrastructure story and make it meaningful for the CIO.

Low-Code/No-Code Development

Low- and no-code development allows you to build applications far more quickly and efficiently than before, while enabling citizen development, collaboration and self-service.

MaaS – Mainframe as a Service

MaaS is the simplest and most effective way to modernise mainframe legacy and to make mainframe levels of performance, concurrency, security and reliability available to everyone.

Machine Learning and Artificial Intelligence

At its simplest, it means using computers to run algorithms that give them the ability to undertake reasoning previously seen as the preserve of humans.


Mainframes

Many enterprises still run on mainframes, but start-ups that value growth without early over-provisioning should also consider mainframe platforms.


Master Data Management

Large organisations have many systems which store data, and the management of multiple and competing definitions of this data is known as Master Data Management (MDM).

Network and Endpoint Security

Network security refers to those technologies and processes that are used to keep networks in good, secure working order.

NoSQL Databases

NoSQL (“not only SQL”) databases are non-relational databases.

Predictive and Prescriptive Analytics

Predictive analytics refers to the ability to predict events before they occur, allowing you to take remedial action in advance. Prescriptive analytics extends this concept by either suggesting the actions you should take or automating those actions.
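The predictive step, at its simplest, is fitting a model to history and extrapolating; the prescriptive step is acting on the forecast. A minimal sketch with made-up support-ticket numbers, using ordinary least squares on one variable:

```python
def fit_line(xs, ys):
    """Ordinary least squares for one variable: the simplest predictive model."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

# Monthly support-ticket counts (hypothetical data): predict next month's load
# so capacity can be adjusted in advance -- the prescriptive step.
months = [1, 2, 3, 4]
tickets = [100, 110, 120, 130]
slope, intercept = fit_line(months, tickets)
forecast = slope * 5 + intercept   # predicted tickets for month 5
```

Real predictive tools use far richer models, but the shape is the same: learn from the past, forecast the future, and attach an action to the forecast.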

Security analytics

Analysing the vast swathes of security information that the average organisation generates has enormous potential for reducing the risks that they face.

Security Response Orchestration

Security response orchestration arms organisations with the tools that they need to automate responses where possible, supplementing this with human knowledge and experience where required.

Service Virtualisation

Service virtualisation is a technique used to virtualise external services used by a developing system, so that it can be developed and tested without regard to the stability or availability of services it depends on.

Spreadsheet Governance

Spreadsheet governance is a specialised aspect of governance – dealing with the error-prone nature of spreadsheets and their ungoverned proliferation.

Streaming Analytics Platforms

Once there was complex event processing; now the major market for event processing is in streaming analytics.

Test Data Management

This is the use of tools to make a representative subset or copy of a whole database that lets you adequately test relevant applications.

Test Design Automation

Test design automation provides a potentially massive boost to testing speed and efficiency. Properly leveraged, it can also dramatically improve the end-user experience.

Transaction Processing DBMSs

Transaction processing is designed to maintain database integrity (the consistency of related data items) in a known, consistent state.


Trust

Trust can be defined as the ability to believe in the reliability and integrity of something or someone.

User and Entity Behavior Analytics

Context is king when defending against sophisticated, advanced threats. UEBA technologies provide that context and are a key addition to any security analytics and intelligence capability.
