Why the Leading Gartner AIOps Analyst Joined Moogsoft

Will Cappelli | Wednesday August 1 2018

From IT industry observer to industry insider, Will Cappelli recounts his journey in defining the AIOps space. Now, he’s focused on changing the world.

Hello world! I recently started as CTO of EMEA and Global VP of Product Strategy at Moogsoft, and thought I’d share with you the key reasons why I opted to leave industry research and analysis behind and join the company that I consider to be the leader in AIOps. I will begin with some reflections on my time at Gartner, go on to discuss the promise of AIOps in general, and then lay out, in some detail, my reasons for thinking that Moogsoft is uniquely positioned to deliver on that promise.

The Importance of Gartner and IT Industry Research

While few doubt the importance of Gartner and its competitors to enterprise IT decision making, I do not believe the historical role that IT industry research and analysis has played in shaping the deep structure of the IT market is fully appreciated. If one goes back to the late 1970s and early 1980s, the IT market was dominated by a handful of large players, including IBM and its competitors nicknamed the “BUNCH.” There were certainly competitive dynamics at play. Digital Equipment Corporation was growing rapidly as a result of its introduction of computer systems with a smaller form factor, and Japanese companies like Fujitsu and Hitachi were introducing IBM plug-compatible machines to the market. But as far as enterprises were concerned, they were buying and deploying a more or less mysterious box of integrated hardware and software that crunched numbers and printed text. There was a massive asymmetry in knowledge between the supply side and the demand side, and IT was rapidly evolving into a utility which buyers neither understood nor controlled.

If that trajectory had continued, the IT market today would most likely look like the telecommunications or energy market: enterprises largely taking what was on offer, as long as the cost was right, without truly grasping the functionality of what they were consuming. On the one hand, there would have been minimal need for in-house IT infrastructure, operations, or application development. On the other hand, the world would have experienced little of the innovation and the business-transforming impact that we have come to associate with IT.

That trajectory did not continue, however. Some out-of-the-box thinking about how to repackage information and research originally created for consumption by financial analysts eventually transformed Gartner into a provider of research and analysis whose primary goal was to inform and support enterprise users in their hardware and software buying decisions. (It did not happen overnight, of course, but it did eventually happen. I was there at the creation.) With Gartner and competitors and spinoffs like Forrester and Meta at work, the asymmetry of information began to be corrected in favor of the buyer.

Enterprises, armed with analysts’ advice and writings, were able to question and critique vendors, drive down prices, and directly shape product development. From one perspective, this was all a very good thing: lower prices, more innovation, greater control. On the other hand, it led to an anomalous industrial structure that some might argue burdens businesses even today. Whereas the general path of economic evolution is to segregate stages of production and to transform the interfaces between those stages of production into markets, the emergence of Gartner et al. prevented that “normal” evolutionary step from taking place.

So we now live with the curious circumstance that almost all businesses occupy at least two successive stages of production and are geared to create and deliver at least two kinds of product: the product which is the nominal point of their existence, and the IT services and functionality product. The exact analogue would be if enterprises still, to this day, generated their own electricity using power plants that they themselves built and energy sources which they owned directly.

Since the goal was to diminish the asymmetry of information, many analysts interpreted their role in terms of user advocacy. We were the ones supplying a user community of Davids with the slingshots and rocks that could be used against a vendor community of Goliaths. In my own practice, however, I did not think this was an accurate understanding of the value we brought to the economy in the wake of our destabilization of what would have been its normal evolution. Instead, I came to view the analyst’s role as that of a meta-entrepreneur who finds missed opportunities for value on both sides of the market divide and, through vision, analysis, and persuasion, moves both sides to see and realize those opportunities.

Look at it this way. At least at the level of microeconomics, most markets will move to some kind of equilibrium, a point where the supply and demand sides match their respective strategies in such a way that, given the strategy the vendor has chosen to pursue, the buyer has no incentive to modify his or her own strategy, and vice versa. However, there is no guarantee that this particular match of strategies is the best one, the match that delivers the most value to both vendor and user. The reasons for settling on such a suboptimal equilibrium largely come down to lack of trust induced by lack of information and, most importantly, lack of information about what the other party knows. In other words, we are dealing with a classic Prisoner’s Dilemma.
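To make the structure of that dilemma concrete, here is a minimal Python sketch with purely hypothetical payoffs (the strategy names and numbers are my own illustration, not market data). “Withholding” information stands in for the low-trust strategy on each side of the market:

```python
# Prisoner's Dilemma sketch with hypothetical payoffs.
# payoffs[(vendor, buyer)] = (vendor_value, buyer_value)
payoffs = {
    ("share", "share"):       (3, 3),  # maximum joint value
    ("share", "withhold"):    (0, 5),
    ("withhold", "share"):    (5, 0),
    ("withhold", "withhold"): (1, 1),  # where low-trust markets settle
}
strategies = ["share", "withhold"]

def is_nash(v, b):
    """True if neither side gains by unilaterally switching strategy."""
    v_pay, b_pay = payoffs[(v, b)]
    vendor_stays = all(payoffs[(alt, b)][0] <= v_pay for alt in strategies)
    buyer_stays = all(payoffs[(v, alt)][1] <= b_pay for alt in strategies)
    return vendor_stays and buyer_stays

for v in strategies:
    for b in strategies:
        tag = "  <- Nash equilibrium" if is_nash(v, b) else ""
        print(f"{v:8} / {b:8} -> {payoffs[(v, b)]}{tag}")
```

Running this shows that only withhold/withhold is a Nash equilibrium, even though share/share pays more to both sides. Closing that gap, moving both sides to the higher-value pair of strategies, is precisely the meta-entrepreneurial work I have in mind.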

As an analyst, I came to see my task as providing the information and analysis required to ensure that the market on both sides moves away from that suboptimal Prisoner’s Dilemma equilibrium and towards a strategic equilibrium which delivers maximum value to all. My own strategy for accomplishing this end was pretty simple: find big user requirements that were currently not being met and look at how vendors could easily combine existing but separately maintained technological capabilities to meet those needs. Hence my “discovery” and encouragement of Application Performance Monitoring (APM) and, later, of Artificial Intelligence for IT Operations (AIOps).

Although my strategy was moderately successful (and I do believe my colleagues and I managed to shift the market equilibrium in both cases to points of higher value), I also became aware that the effectiveness of my strategy, and maybe even the overall salience of IT industry research and advisory, was beginning to decline.

There were three reasons for this. First, the velocity with which ideas moved from the academic research stage to viable commercialization had increased by an order of magnitude since the early 1990s. Things were moving just too fast for analysts not in the thick of academic research themselves to track innovation in the market. Second, until the late 1990s, new ideas tended to commercialize first in the business-to-business world. In other words, our clients, the people we talked with and supported directly on the demand side, experienced the impact of IT innovation where and as they worked. Then came the consumerization of IT, which set in motion an era where the primary vector of innovation ran through the interface between consumer and business. Research in a consumerized IT era needed to be conducted via rapid-fire surveys and digital interactions, not through intense and lengthy professional dialogue. Third, and most importantly, as late as 2007, at best 20% of the events that constitute the execution of an average business process were digital, in the literal sense that those events themselves were actually IT system state changes. Now, a bit more than 10 years later, that percentage has grown to something like 60% overall, and among digitally intensive industries like financial services or media and entertainment, it is closer to 80%.

IT and the business are merging even if they are not always willing to acknowledge it. Put another way, the sharp break between supply side and demand side is becoming ever more dull and even dissolving. And with that dissolution, the game changes.

Over the next few years, the asymmetry of information between vendor and unassisted business buyer will vanish and the IT market will come to resemble the closing scene of George Orwell’s Animal Farm: ‘The creatures outside looked from pig to man, and from man to pig, and from pig to man again; but already it was impossible to say which was which.’ In a word, if I still want to increase value in the market for all participants — and I do! — the only effective way is to do it as an insider. So, with fond memories and no regrets, I am leaving mediation between the demand and supply sides and starting a life of partnership and participation with the integrated digital business community as we maximize value across the economy at large.

The Importance of AIOps

As to why AIOps? Well, first of all, naturally I believe in AIOps as a trend and business endeavor. My analysis of the market indicated that in 2017, the worldwide spend on AIOps (the application of automated pattern discovery, anomaly detection, and root cause analysis to the large, complex data or signal sets generated by IT system state changes, known as events) topped $2 billion and was growing at a rate of close to 25% a year. That figure was split with about 70% spent on the data/signal capture and ingestion piece of the puzzle and 30% on the systems driving the automated pattern discovery, anomaly detection, and root cause analysis algorithms. But many conversations with decision makers and discussions of future spending plans led me to believe that spending on the latter would grow faster than spending on the former. In fact, I predict that by 2023, the spend on the algorithms will exceed the spend on collecting and storing the signals and the data.

There are some, including my former colleagues at Gartner, who try to make the case that while AIOps has a great future and, indeed, will become pervasive, it is still somewhat immature and works against existing corporate cultures. Immaturity is, of course, an aesthetic judgement, but this much is true: Moogsoft’s AIOps Platform has an extremely stable code base and has been deployed around the world to great effect. Furthermore, $2 billion (more than half the size of the much older APM market) does not sound to me like the kind of spend that characterizes an immature market. With regard to the fixity of corporate culture, the truth is that IT operations teams do not have a choice. The number of events is increasing overall at a rate close to 50% a year; compounded, that is a more than sevenfold increase over five years. Furthermore, with the increased modularity, dynamism, and geographical spread of IT systems, the overall entropy of the data being generated by infrastructure and application stacks is likewise growing at blistering speed.

What this means for IT operations professionals is that the ability to infer the global health of the technology, and therefore of the digital business process, from a small subset of events has been fundamentally compromised. AIOps is their only hope of regaining visibility into their estate, through the deployment of automated inferencing and pattern discovery functionality. It is not only a question of planning for the future and delivering new value; it is a question of being able to continue to tread water in a modern environment. Not to put too fine a point on it, but the reason why the world is already spending $2 billion on AIOps is that it is already a necessity.

The Importance of Moogsoft

There are nearly 50 vendors that claim they are doing AIOps, and many of those claims are certainly legitimate. So why Moogsoft? When I began to think about where I wanted to go, I naturally wanted to work with a company that, according to my lights, had the most accurate understanding of market requirements and the most visionary technology. Now, there has been a lot of debate among vendors, users, analysts, and investors about the right approach to AIOps, and few of the issues can be resolved with absolute certainty. But five years into the market, I think it is possible to make up one’s mind about where things should stand along four dimensions.

The first dimension is scope. Some argue that AIOps functionality should be deployed as an extension or enhancement to domain-specific monitoring tools or process-specific technology platforms. Application, infrastructure, and network monitoring tools on the one hand, or service desks or CMDBs on the other, are already gathering the requisite data. Why not, then, just add AI algorithms to these technologies and reap the advantages of automated pattern discovery, anomaly detection, and root cause analysis without occasioning the disruption and, in many cases, the redundancy of bringing in a new domain/process-independent platform? While reasonable and comfortably conservative, this approach has a big problem. The web of causality now spreads across components and levels and respects no domain or process boundaries, and incidents typically involve multiple root causes situated in multiple domains. A domain- or process-specific approach will be, at best, myopic and, at worst, blind. So in examining the first dimension, it became clear to me that only a domain/process-independent approach of the sort realized in the Moogsoft AIOps Platform holds real promise. Domain-specific and process-specific AIOps may play a role in some overall AIOps architecture, but it will always be subsidiary.
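To illustrate the point about domain boundaries, here is a deliberately naive Python sketch, entirely my own construction and not a description of Moogsoft’s algorithms. It groups hypothetical events from different monitoring domains into candidate incidents by temporal proximity; a domain-specific tool would see only its own slice of the first incident, while the cross-domain view reveals that it spans network, infrastructure, and application:

```python
from datetime import datetime, timedelta

# Hypothetical events from separate monitoring domains.
events = [
    {"time": datetime(2018, 8, 1, 9, 0, 5),   "domain": "network", "msg": "packet loss on core switch"},
    {"time": datetime(2018, 8, 1, 9, 0, 40),  "domain": "infra",   "msg": "db node failover"},
    {"time": datetime(2018, 8, 1, 9, 1, 10),  "domain": "app",     "msg": "checkout latency spike"},
    {"time": datetime(2018, 8, 1, 14, 30, 0), "domain": "app",     "msg": "cache miss ratio high"},
]

def correlate(events, window=timedelta(minutes=5)):
    """Naive temporal clustering: an event within `window` of the previous
    event joins the same candidate incident."""
    incidents, current = [], []
    for ev in sorted(events, key=lambda e: e["time"]):
        if current and ev["time"] - current[-1]["time"] > window:
            incidents.append(current)
            current = []
        current.append(ev)
    if current:
        incidents.append(current)
    return incidents

for i, incident in enumerate(correlate(events), start=1):
    domains = sorted({ev["domain"] for ev in incident})
    print(f"incident {i}: {len(incident)} events spanning {domains}")
```

Real correlation draws on far richer signals than timestamps, but even this toy makes the structural point: the incident only becomes visible when the events are viewed across domains.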

The second dimension is process. Some argue that AIOps is best deployed as an extension of a big data platform. Let’s assume that we are going with a domain/process-independent approach and trying to cope with vast quantities of data from multiple sources. Just describing the problem this way suggests that what an enterprise needs is the ability to deploy a big data lake into which to pour the data; once it is situated there, one applies the algorithms to help IT operations professionals “navigate” the lake. There are, however, two problems with this approach. First, the increasing ephemerality of the objects that constitute the IT estate and its interconnecting topologies means that enterprises simply don’t have the leisure to wait for the data to make its way to the lake before it gets analyzed. The analysis needs to begin as close as possible to the data’s point and time of origin. Otherwise, critical elements will be missed and insights will not be retrieved quickly enough. Second, the data lake approach requires one to ingest and store ALL of the data generated by the IT estate. Unfortunately, not all of that data is ultimately useful. Despite the high degree of entropy I mentioned above, there still remains a lot of redundancy in the data generated, redundancy which can, in many cases, be spotted before any further algorithm is applied. It makes much more sense to set up redundancy filters close to the point where the data originates, saving time as well as the costs of a) applying algorithms to redundant data and b) storing significant quantities of junk data. In other words, the data lake approach is fundamentally flawed. A real-time approach of the sort deployed by the Moogsoft AIOps Platform, which applies automated pattern discovery, anomaly detection, and root cause analysis to minimally redundant streaming data, is far better suited to the ephemeral stacks that characterize modern IT and support digital business. This is not to say that data lakes have no use. Historical analysis is and will remain important, but rather than filling the lake with redundant, unanalyzed data, why not fill it only with essential data that has already gone through a process of real-time analysis?
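As an illustration of what a redundancy filter close to the point of origin might look like, here is a minimal Python sketch, again a toy of my own rather than Moogsoft’s implementation. It suppresses events whose source and message repeat within a time-to-live window, so that only novel signals travel downstream to be analyzed or stored:

```python
class RedundancyFilter:
    """Suppress events whose (source, message) signature was already seen
    within the last `ttl` seconds; forward only novel signals."""

    def __init__(self, ttl=60.0):
        self.ttl = ttl
        self.last_seen = {}  # signature -> timestamp of last forwarded event

    def admit(self, source, message, now):
        sig = (source, message)
        prev = self.last_seen.get(sig)
        if prev is not None and now - prev < self.ttl:
            return False  # redundant: drop before analysis or storage
        self.last_seen[sig] = now
        return True

# Hypothetical stream: (seconds, source, message)
stream = [(0.0, "web-01", "disk full"), (2.5, "web-01", "disk full"),
          (5.0, "db-02", "replication lag"), (70.0, "web-01", "disk full")]

f = RedundancyFilter(ttl=60.0)
for t, src, msg in stream:
    if f.admit(src, msg, now=t):
        print(f"t={t:>5}: forward {src}: {msg}")
```

The duplicate at t=2.5 is dropped, while the repeat at t=70.0 passes because the window has expired. Placed at the point of origin, a filter like this cuts both compute and storage costs before any heavier algorithm runs.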

The third dimension is the approach to AI itself. AI has been a subject of academic study since the late 1950s and, from the inception of that study, there have been two approaches. The first approach saw AI as being based on the automation of logical inference. Basic rules (mostly of an “if-then-else” form) were fed to an AI program as “axioms” and then, in a manner reminiscent of high school Euclidean geometry, the program would apply logic-based inference rules to such axioms and generate conclusions. This approach was commercialized in the 1980s (SDI, Thinking Machines, Symbolics, the Japanese 5th Generation Computing Project, CYC) and, once the Cold War ended, found its final home in the IT operations management software industry. In fact, most of today’s service desk platforms descend from logic-based AI programs and, in the event management world, of course, the Tivoli Management Environment was fundamentally a supersized expert system written in the logic-programming language Prolog. One could argue that ITOps has, without knowing it, been AIOps since 1987, if not earlier!
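For readers who have never met an expert system, here is a minimal forward-chaining sketch in Python. The rules and facts are hypothetical and of my own invention, but the mechanism, repeatedly firing hand-written if-then rules until no new conclusions emerge, is the essence of the logic-based approach:

```python
# Each rule: (set of conditions, conclusion), in the spirit of 1980s expert systems.
rules = [
    ({"db_unreachable", "network_ok"}, "db_process_down"),
    ({"db_process_down"},              "recommend_db_restart"),
    ({"app_errors", "db_unreachable"}, "root_cause_is_db"),
]

def forward_chain(facts, rules):
    """Apply rules to the known facts until no new conclusions can be drawn."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(forward_chain({"db_unreachable", "network_ok", "app_errors"}, rules))
# adds: db_process_down, recommend_db_restart, root_cause_is_db
```

Everything such a system can ever conclude is already implicit in its hand-written axioms, which is both its strength (predictability) and, as we will see, its limitation.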

The second approach, however, saw AI as a reflection of the mathematical pattern discovery processes which appear to be embedded in the neural mesh of our brains. The algorithms sought under this approach consumed data sets and tried to generate statistical and other mathematically describable patterns capable of accounting for the data. This is, of course, how science works. Someone like Newton examines large sets of data about planetary movements and projectile trajectories and, through a leap of intellectual imagination, an insight, comes up with the equations capable of describing both motion on the surface of the earth and the movement of the planets in the solar system. The costs and architectural limitations of compute power had rendered this second approach to AI commercially problematic through the early 2000s, but since then it has exploded onto the market, driving many of the innovations behind the web-scale service industry and consumer IT.
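Here, by contrast, is an equally simple instance of the mathematical approach, once again my own sketch rather than anything from the Moogsoft codebase. No rules are written down; the pattern (a normal operating range) is learned from the data itself, and anomalies are whatever deviates from it:

```python
import statistics

def zscore_anomalies(series, threshold=3.0):
    """Flag points more than `threshold` standard deviations from the mean."""
    mean = statistics.mean(series)
    stdev = statistics.pstdev(series)
    return [(i, x) for i, x in enumerate(series)
            if stdev > 0 and abs(x - mean) / stdev > threshold]

# Hypothetical per-minute event counts with one burst.
counts = [52, 49, 51, 50, 48, 53, 50, 47, 51, 250, 52, 49]
print(zscore_anomalies(counts))  # [(9, 250)]: only the burst stands out
```

A z-score test is the crudest possible example, but it shows the shape of the method: the model comes out of the data, not out of an engineer’s head, which is why it keeps working as the data changes.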

Now, given that the challenge of modern IT operations is primarily a challenge of seeing, anticipating, and responding to patterns in complex data sets, the mathematical approach to AI seems at this stage in the game to be far more fundamental than the logic-based approach. The core algorithms deployed in the Moogsoft AIOps Platform are mathematics-based, not logic-based, and hence, if I am right about what is truly critical, then Moogsoft, on this score, has the right idea. Once again, the logic-based approach still has its uses and is, in fact, widely deployed across the IT operations world, but since the relevant data streams show no sign of becoming easier to interpret and understand, the logic-based approach will prove increasingly marginal.

The fourth dimension is innovation. Many vendors that identify themselves as players in the AIOps space have focused on applying well-understood, textbook-sourced machine learning or pattern discovery algorithms to either streaming or historical IT operations data. In one way or another, they appear to have concluded that the value to be delivered is a consequence of marrying what amounts to public domain AI to the right data sets. Now, there is no question that lots of value can be delivered in this manner if the project is carried out properly. However, there is still a lot of room and a lot of need for innovation. In fact, I think a good case can be made that most public domain, mathematically oriented AI misses the AIOps mark because the algorithms are targeted at generic data sets. The data sets relevant to AIOps, instead, are generated by observable systems and hence require a different flavor of algorithm. Now, the Moogsoft AIOps Platform already features a lot of algorithms that you just won’t find elsewhere, many of which represent a real departure from the textbook way of thinking about mathematical AI. Furthermore, the core culture of the company is built around the continued creation of innovative IP in this space. So, in the end, when considering the fourth dimension, I concluded that not only is an emphasis on new rather than textbook algorithms the right way to go, but also that such a culture would be the most fulfilling and the most fun!

If you will, please allow me to summarize. Along the dimension of scope, Moogsoft has made the right choice to support cross-domain observation and analysis; along the dimension of process, Moogsoft has made the right choice to focus on real-time observation and analysis; along the dimension of approach, Moogsoft has rightly chosen to concentrate on mathematics-based as opposed to logic-based AI; and along the dimension of innovation, Moogsoft has made the right choice to focus on building up a base of new IP rather than relying on textbook, public domain algorithms. There is no other vendor, at least among the 50 or so I am aware of and have studied, that gets it exactly right across all four dimensions.

I believe that the AIOps market is about to embark upon a period of significant growth and world-changing impact. I believe that I can generate more value as a market participant than I can as a market observer. And, finally, I believe that Moogsoft has the best approach to AIOps across multiple dimensions.

Karl Marx is buried in the Highgate Cemetery not far from where I live in London and carved on his grave are the following words taken from the 11th and last of his Theses on Feuerbach: ‘Philosophers have, hitherto, interpreted the world in various ways; the point, however, is to change it.’

As an analyst for more years than I care to remember, I was a bit of a philosopher, and I did indeed interpret the world. I think I often interpreted it in interesting and helpful ways but, now, as part of the Moogsoft team, I am really looking forward to changing it.

Moogsoft is a pioneer and leading provider of AIOps solutions that help IT teams work faster and smarter. With patented AI analyzing billions of events daily across the world’s most complex IT environments, the Moogsoft AIOps platform helps the world’s top enterprises avoid outages, automate service assurance, and accelerate digital transformation initiatives.

Will Cappelli

About the Author

Will studied math and philosophy at university, has been involved in the IT industry for over 30 years, and for most of his professional life has focused on both AI and IT operations management technology and practices. As an analyst at Gartner, he is widely credited with having been the first to define the AIOps market, and he has recently joined Moogsoft as CTO, EMEA and VP of Product Strategy. In his spare time, he dabbles in ancient languages.
