Big Data is not the goal, it’s how we reach the goal. Do you know what the goal is?
A conversation I seem to be having a lot these days is about Big Data. People will ask me to come in and present to them what we can do for their Big Data project, and how we compare with some of the dedicated platforms in that space.
The problem is that very few people seem to have thought through what they want to achieve with all of the Big Data that they gather.
The thought process seems to go something like this:
- Big Data
- ???
- Profit!
The thing is, gathering data will not deliver any value unless and until you actually do something with it. At the risk of giving away my conclusion, getting value out of data is hard work.
The hardest part is right at the beginning: you have to decide what you want to achieve. Surely there’s a reason why you are gathering all that data in the first place, right?
The problem I see in the typical NOC is not a lack of data; in fact, it’s the opposite problem, with operators drowning in data and unable to figure out what is actually important.
Quality > Quantity
So let’s work through this scenario from the beginning. Moogsoft is about improving the users’ experience of an IT service. This is the goal that we help our users to achieve. All of the data gathering and processing are in service of this goal.
Now that we have that end goal defined and agreed, how do we go about achieving it? The firehose approach, where we simply gather every piece of information, does not help; it only overwhelms people. Human operators can’t drink from that firehose, so they try to deal with the excessive volume of data any way they can, filtering and triaging it with whatever tools they have to hand. Often, this devolves to email folders and rules!
These operators do not need more information — in fact, that is the last thing they need. What they do need is for the information to be meaningful.
By adding intelligence, Moogsoft technology can eliminate the noise and reduce the event volume to a manageable level. It can then further help operations teams by correlating events in real time to identify what is actually happening in the environment and how that is affecting the services that users rely on.
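To make the two steps concrete, here is a deliberately simplified sketch in Python. It is not Moogsoft’s actual algorithm, just a toy illustration of the general pattern: first deduplicate repeated events to cut noise, then correlate the survivors into incidents by grouping events that affect the same service within a time window. All field names (`source`, `check`, `service`, `time`) are invented for this example.

```python
from collections import defaultdict

def deduplicate(events):
    """Noise reduction: collapse repeated (source, check) pairs into one event with a count."""
    seen = {}
    for e in events:
        key = (e["source"], e["check"])
        if key in seen:
            seen[key]["count"] += 1
        else:
            seen[key] = {**e, "count": 1}
    return list(seen.values())

def correlate(events, window=60):
    """Correlation: group deduplicated events into incidents per service per time window."""
    incidents = defaultdict(list)
    for e in events:
        bucket = (e["service"], e["time"] // window)
        incidents[bucket].append(e)
    return list(incidents.values())

raw = [
    {"source": "web-1",  "check": "cpu",  "service": "shop", "time": 10},
    {"source": "web-1",  "check": "cpu",  "service": "shop", "time": 12},
    {"source": "web-1",  "check": "cpu",  "service": "shop", "time": 15},
    {"source": "db-1",   "check": "disk", "service": "shop", "time": 20},
    {"source": "mail-1", "check": "ping", "service": "mail", "time": 25},
]

unique = deduplicate(raw)      # 5 raw events -> 3 unique events
incidents = correlate(unique)  # 3 events -> 2 incidents
print(len(raw), len(unique), len(incidents))  # prints: 5 3 2
```

Even in this toy version, five raw events become two actionable incidents; real environments see far larger reductions because the same alarms repeat thousands of times.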
This is a fully modern approach, but because we start from the goal of improving service availability and user experience, we are not just chasing the Big Data buzzword. Rather, we leverage those techniques to build a solution that is able to scale dramatically, to roll with the changes that growth inevitably brings, and to interoperate with all of the other tools that are also involved with delivering the expected results.
Interoperation between tools is one thing, but an even more important requirement is collaboration between the different people and roles that are involved in achieving that goal of service availability. In particular, this means streamlined communication and knowledge sharing across the boundaries between teams and departments, and even between different companies in the case of outsourced infrastructure.
These three aspects together are what deliver the massive increases in operational efficiency that our users expect.
Because we have moved Big Data from a goal in itself to a part of the process, and because we fleshed out that intermediate step instead of just leaving a question mark, we can actually deliver on the final step, by protecting the organization’s profits and reducing revenue risk.
- 99% Noise Reduction
- 75% MTTD Reduction
- 10,000s of events/sec
- 90+ Integrations
- 75% MTTR Reduction
- Fewer Resources/Incident
About the author
Dominic Wellington is the Director of Strategic Architecture at Moogsoft. He has been involved in IT operations for a number of years, working in fields as diverse as SecOps, cloud computing, and data center automation.