BigPanda’s use of Generative AI points the way to autonomous IT operations

BigPanda has generated a lot of interest in the press and amongst the IT analyst community following the July launch announcement of its Generative AI capability. I have just caught up with Blair Sibille, BigPanda’s Field Chief Technology Officer (CTO), to find out how early customer trials have been going and to understand how changing technologies are reshaping the company’s own strategy and direction.

What is very clear is that BigPanda is not just bolting on Generative AI to make slightly more effective chatbots: it wants to integrate Generative AI into the event correlation and automation heart of its solution to help improve business outcomes.

It’s worth stepping back to the beginning of the year, when BigPanda realised that the emergence of Large Language Models (LLMs) had the potential to change not just the AIOps market, but the whole IT operations sphere in general. Understanding that this might put a kink in its technical roadmaps, the company ran a series of tests on the OpenAI, Google Bard and AI21 Labs LLMs. OpenAI produced the best results and was then used in trials on real monitoring and alert data. I reported on this in my 2nd August blog, so I won’t go into detail here. Suffice it to say, Generative AI proved outstanding at quickly identifying the actual root cause of problems when it was fed enriched, contextualised and correlated data.
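
To make that idea concrete, here is a minimal sketch, in very broad strokes, of how a correlated and enriched incident might be handed to an LLM for a root-cause suggestion. This is not BigPanda’s implementation: the incident fields, prompt and model name are purely illustrative assumptions, and only the standard OpenAI chat completions call is a real API.

```python
import json
from openai import OpenAI  # assumes the openai Python package (v1+) and an API key

# A hypothetical correlated incident: several raw alerts rolled up into one,
# enriched with topology and change context. All field names and values are invented.
incident = {
    "service": "checkout-api",
    "alerts": [
        {"source": "Prometheus", "check": "p99_latency_high", "host": "web-07"},
        {"source": "Nagios", "check": "db_connection_pool_exhausted", "host": "db-03"},
        {"source": "CloudWatch", "check": "5xx_error_rate", "host": "alb-prod"},
    ],
    "topology": "web-07 -> checkout-api -> db-03",
    "recent_change": "Connection pool size reduced in release 2024.07.3",
}

client = OpenAI()  # reads OPENAI_API_KEY from the environment

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model; the blog does not name the model used
    messages=[
        {
            "role": "system",
            "content": "You are an IT operations assistant. Given a correlated, "
                       "enriched incident, suggest the most likely root cause "
                       "and a first remediation step, in two short sentences.",
        },
        {"role": "user", "content": json.dumps(incident, indent=2)},
    ],
)

# Print the model's root-cause suggestion
print(response.choices[0].message.content)
```

The point of the sketch is simply that the quality of the answer depends far more on the enriched, correlated input than on the prompt itself, which is exactly the argument the trial results appear to support.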

This is the core of BigPanda’s approach to AIOps, and it precedes the arrival of LLMs by some way. I didn’t ask the question that, apparently, most other analysts and journalists have asked: why BigPanda and why now? To me it was obvious. BigPanda is as much about data quality and data organisation as it is about event correlation and automation. It had already made great strides in reducing the amount of alert noise from monitoring tools by adding contextual data around applications and topology, along with a variety of CMDB data where available, to provide more effective event correlation and large reductions in system-generated support tickets. Blair likens this to providing a three-dimensional view, rather than a flat, one-dimensional stream of data. A toy sketch of that kind of contextual correlation follows below.
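
For readers who like to see the principle in code, the following is a deliberately simple illustration of the general technique, not BigPanda’s product logic: a handful of raw alerts are grouped into per-application incidents using a hypothetical CMDB-style host-to-application mapping. All names and data are invented.

```python
from collections import defaultdict

# Hypothetical CMDB-style lookup mapping hosts to the business application they
# support; in practice this context would come from topology and CMDB feeds.
HOST_TO_APP = {
    "web-07": "checkout",
    "web-08": "checkout",
    "db-03": "checkout",
    "mq-01": "payments",
}

# A flat, one-dimensional stream of raw monitoring alerts.
raw_alerts = [
    {"host": "web-07", "check": "p99_latency_high"},
    {"host": "web-08", "check": "p99_latency_high"},
    {"host": "db-03", "check": "connection_pool_exhausted"},
    {"host": "mq-01", "check": "queue_depth_high"},
]

def correlate(alerts):
    """Group raw alerts into per-application incidents using the CMDB context."""
    incidents = defaultdict(list)
    for alert in alerts:
        app = HOST_TO_APP.get(alert["host"], "unknown")
        incidents[app].append(alert)
    return incidents

incidents = correlate(raw_alerts)
print(f"{len(raw_alerts)} raw alerts -> {len(incidents)} correlated incidents")
for app, alerts in incidents.items():
    print(app, [a["check"] for a in alerts])
```

Four raw alerts collapse into two application-level incidents; scale that up across thousands of alerts and the reduction in system-generated tickets becomes clear.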

Currently BigPanda is running alpha trials with a number of customers. Early responses from those customers look very encouraging, with all support levels identifying benefits derived from the use of the LLM. It’s early days, but I am looking forward to seeing how the trials pan out and to being able to report on them fully.

It feels like BigPanda has entered a new phase in its history and development. Significant new hires to the senior management team, across marketing, product and customer management, have added experience in both startup and larger corporate environments. There is now a focus on providing continuity of account management across the whole customer lifecycle, and the company is ramping up its Centre of Excellence, focused in particular on AI Delivery and Research, and Customer Delivery. A bit like LLM technology itself, it feels as though BigPanda is reaching a new stage of maturity.

Fully autonomous IT operations are some way off. Indeed, there may always be a need for a guiding human hand on the steering wheel. However, BigPanda is taking a big step towards validating the potential of LLMs to automate and resolve IT operations problems that have been exacerbated by a volume and complexity that are impossible to keep up with manually.