Each Technology page is effectively a mini site for a particular technology solution. Each page contains contextual information about the topic, commentary from our analysts, and links to product information and additional educational resources, should you wish to dig deeper.
A data warehouse is a database implementation that supports the storage and analysis of historic data, with the aim of understanding what happened in the past or predicting what will happen in the future.
Analytics supports the complex, multivariate decision-making that now takes place in the volatile global markets in which companies, and most governmental bodies, operate.
Application Assurance comprises the culture, processes and technologies that enable Outcome Automation to deliver business outcomes.
Big Data refers to the ability to analyse any type of data, not limited to (but including) data in relational tables. There may be a lot of such data …
A blockchain is a distributed ledger, implemented across many servers, consisting of a continuously growing list of records, called blocks, which are linked across the whole network and secured using cryptography.
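The hash-linking that secures those records can be sketched in a few lines. This is an illustrative toy, not any particular blockchain implementation; the record contents and field names are made up:

```python
import hashlib
import json

def make_block(records, prev_hash):
    """Create a block whose hash depends on its records and on its predecessor's hash."""
    body = json.dumps({"records": records, "prev": prev_hash}, sort_keys=True)
    return {"records": records, "prev": prev_hash,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

# A two-block chain: tampering with the genesis block would change its hash,
# breaking the "prev" link stored in the next block.
genesis = make_block(["alice pays bob 5"], prev_hash="0" * 64)
block1 = make_block(["bob pays carol 2"], prev_hash=genesis["hash"])
assert block1["prev"] == genesis["hash"]
```

Because each block's hash covers the previous block's hash, altering any historic record invalidates every later block, which is what makes the ledger tamper-evident.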
Cloud provides a business with the option of having its applications, and the compute and storage technology that underpins them, delivered as a service.
The boundaries of what we view, broadly, as the Cloud Management market have expanded.
Treating data as an asset essentially requires four things: knowing what data (or content) you have and where it is; understanding that data and how it relates to…
Data is either in a particular place (or places) or it is being moved between places. The former is referred to as “Data at rest”.
Both data discovery and data cataloguing are designed to allow you to know what you have and how it is related.
Data governance is about ensuring that your organisation’s information assets meet corporate policies for accuracy, timeliness, completeness and compliance.
Data is either in a particular place (or places) or it is being moved between places. The latter is referred to as “Data in motion”.
Developers and data scientists may not be allowed to see detailed personal information but at the same time they may need that data, or its equivalent.
Data movement does what its name suggests, and there are multiple ways of doing it, as well as of avoiding doing it.
Data quality is about ensuring that data is fit for purpose; that it is accurate, timely and complete enough for use in business.
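The kinds of rules such tools automate can be illustrated with a small sketch. The record fields and thresholds here are hypothetical examples of completeness, accuracy and timeliness checks:

```python
from datetime import date

def quality_issues(record, today=date(2024, 1, 1)):
    """Return a list of data quality problems found in one customer record."""
    issues = []
    if not record.get("email"):
        issues.append("incomplete: missing email")          # completeness check
    elif "@" not in record["email"]:
        issues.append("inaccurate: malformed email")        # accuracy (format) check
    if (today - record["last_updated"]).days > 365:
        issues.append("stale: not updated in over a year")  # timeliness check
    return issues

rec = {"email": "jo@example.com", "last_updated": date(2023, 6, 1)}
assert quality_issues(rec) == []  # this record is fit for purpose
```

Real data quality tools apply thousands of such rules, profile the data to suggest new ones, and track the results over time.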
DevOps is about encouraging a culture of collaboration across development, operations and business silos (and, by implication, removing these silos).
Edge computing is a key enabler for Industry 4.0, along with SDNs, the global ubiquity of smartphones, 5G mobile technology, and IoT platforms and frameworks.
Encryption is the process of converting data into a form that is unreadable by anyone who does not possess the necessary key; decryption, its reverse, converts it back.
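The key-dependence at the heart of this can be shown with a deliberately simple toy cipher. This XOR-with-a-keystream sketch is for illustration only; production systems use vetted algorithms such as AES:

```python
import hashlib

def xor_stream(key: bytes, data: bytes) -> bytes:
    """Toy cipher: XOR data against a SHA-256-derived keystream.
    NOT secure -- illustrates key-dependence, nothing more."""
    keystream = bytearray()
    counter = 0
    while len(keystream) < len(data):
        keystream.extend(hashlib.sha256(key + counter.to_bytes(8, "big")).digest())
        counter += 1
    return bytes(b ^ k for b, k in zip(data, keystream))

secret = xor_stream(b"key", b"meet at noon")
# XOR is its own inverse, so the same function decrypts -- but only with the right key:
assert xor_stream(b"key", secret) == b"meet at noon"
assert xor_stream(b"wrong key", secret) != b"meet at noon"
```

Without the key, the ciphertext is just noise; with it, decryption is trivial, which is exactly the asymmetry of effort that encryption provides.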
FinOps is an essential component of Cloud Management. There's a Foundation for it, and practitioners can earn certification badges in it.
Graph databases store data in terms of entities and the relationships between entities.
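A minimal in-memory sketch shows the model: nodes are entities, edges are typed relationships, and queries traverse those relationships rather than joining tables. The people, employer and relationship types here are invented for illustration:

```python
# Each edge is (source entity, relationship type, target entity).
edges = [
    ("Alice", "WORKS_FOR", "Acme"),
    ("Bob", "WORKS_FOR", "Acme"),
    ("Alice", "KNOWS", "Bob"),
]

def neighbours(node, rel):
    """Follow edges of one relationship type outward from a node."""
    return [t for (s, r, t) in edges if s == node and r == rel]

# "Who do Acme's employees know?" -- answered by traversal, the operation
# graph databases optimise, rather than by relational joins.
employees = [s for (s, r, t) in edges if r == "WORKS_FOR" and t == "Acme"]
known = {p for e in employees for p in neighbours(e, "KNOWS")}
assert known == {"Bob"}
```

In a real graph database the traversal cost depends on local connectivity rather than total data size, which is why relationship-heavy queries scale so much better there.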
Identity and access management tools are key weapons in any organisation’s arsenal for protecting the confidentiality, integrity and availability of sensitive information.
Adequate and appropriate security controls must be applied to all information and data for effective information governance purposes and accountability.
Low-code and no-code development allows you to build applications far more quickly and efficiently than traditional coding, while enabling citizen development, collaboration and self-service.
At its simplest, it is the use of computers to run algorithms that give them the ability to undertake reasoning previously seen as the preserve of humans.
Many enterprises still run on Mainframes, but start-ups that value growth without early over-provisioning should also consider Mainframe platforms.
Large organisations have many systems that store master data, and the management of multiple and competing definitions of this data is known as Master Data Management (MDM).
Network security refers to those technologies and processes that are used to keep networks in good, secure working order. Network security encompasses the prime…
An operational database is designed to run the day-to-day operations or transactions of your business.
Outcome Automation is based on Design Thinking, culture and IT, and it provides a capability for automating the delivery of mutable business outcomes.
Application and system feedback, and particularly the insights that can be gleaned from it, is a fundamental part of almost all business systems. The question is how to acquire the right insights.
Service virtualisation is a technique used to virtualise external services used by a developing system, so that it can be developed and tested without regard to the stability or availability of services it depends on.
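The idea can be sketched as a stand-in dependency with canned responses. The payment service, method names and responses here are hypothetical; real service virtualisation tools record and replay far richer behaviour:

```python
class RealPaymentService:
    """The live dependency, which may be unstable or unavailable during development."""
    def charge(self, amount):
        raise ConnectionError("live service unavailable in this environment")

class VirtualPaymentService:
    """A virtual service: stands in for the real one with predictable responses."""
    def charge(self, amount):
        return {"status": "approved", "amount": amount}

def checkout(payment_service, amount):
    """The system under development, written against the service interface."""
    result = payment_service.charge(amount)
    return result["status"] == "approved"

# Development and testing proceed even though the real service is down:
assert checkout(VirtualPaymentService(), 9.99)
```

Because `checkout` depends only on the interface, the virtual service can be swapped out for the real one at deployment time without code changes.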
Spreadsheet governance is a specialised aspect of governance – dealing with the error-prone nature of spreadsheets and their ungoverned proliferation.
Once there was complex event processing, but now the major market for event processing is in streaming analytics.
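The core idea of streaming analytics, computing aggregates continuously over a moving window of events rather than querying stored data, can be sketched briefly. The window size and values are illustrative:

```python
from collections import deque

class WindowAverage:
    """Rolling average over the most recent events in a stream."""
    def __init__(self, size):
        self.window = deque(maxlen=size)  # older events fall out automatically

    def push(self, value):
        """Ingest one event and return the up-to-date aggregate."""
        self.window.append(value)
        return sum(self.window) / len(self.window)

avg = WindowAverage(size=3)
results = [avg.push(v) for v in [10, 20, 30, 40]]
assert results[-1] == 30.0  # average of the last three events: 20, 30, 40
```

The aggregate is updated as each event arrives, so the answer is always current, which is the point of processing data in motion rather than at rest.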
This is the use of tools to make a representative subset or copy of a whole database that lets you adequately test relevant applications.
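The tricky part of subsetting is preserving referential integrity, so the smaller database still joins correctly. This sketch uses two invented in-memory "tables" to show the principle:

```python
# Illustrative tables: orders reference customers by customer_id.
customers = [{"id": 1, "name": "Ann"}, {"id": 2, "name": "Ben"}, {"id": 3, "name": "Cy"}]
orders = [{"id": 100, "customer_id": 1}, {"id": 101, "customer_id": 3}]

def subset(orders_slice):
    """Take a slice of orders, then pull only the customers those orders
    reference, so the subset contains no orphan foreign keys."""
    wanted = {o["customer_id"] for o in orders_slice}
    return [c for c in customers if c["id"] in wanted], orders_slice

small_customers, small_orders = subset(orders[:1])
assert small_customers == [{"id": 1, "name": "Ann"}]  # only the referenced customer
```

Real subsetting tools do this across whole schemas, walking foreign-key chains in both directions so every table in the subset remains consistent.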
Test design automation provides a potentially massive boost to testing speed and efficiency. Properly leveraged, it can also dramatically improve the end-user experience.
Efficient threat detection and response is imperative in combating today’s complex threat landscape. This webpage describes the options available.
Trust can be defined as the ability to believe in the reliability and integrity of something or someone.
There are various data tools that enable us to trust our data, specifically data profiling, data quality and data preparation.