The Evolution Of Metrics And Monitoring: A Conversation with Silent IT’s Tim Stone
We recently sat down with Silent IT’s Tim Stone to chat about not only how measurement and monitoring are changing as businesses continue to move to the cloud, but also how those metrics are being tied more closely to business outcomes than ever.
Tim, let’s talk a little bit about how you might set the table for success within a project. How do you decide what metrics should enter the equation and in turn, how do you get all parties on the same page with those metrics?
TIM STONE: What I find has dramatically changed in the world of metrics and monitoring is that as more people are sharing items from the cloud or even within their on-prem environment, companies are moving toward measuring the things that matter to the business, all while monitoring for data integrity.
It’s a far cry from those “old school” IT departments that still seek to troubleshoot the same things they always have – CPU, memory, disk space and more, which elicit some fairly routine metrics. Those types of departments are certainly still out there, but as we meet with a business, we’re diving deeply into what the business really needs. What should happen at the click of a button and what do they expect to happen? Does it, in fact, happen? That’s really the evolution of the monitoring world.
We hear, “When I click ‘submit,’ I need the ERP data to be returned in thirty seconds.” You’re measuring the time it took for the data to arrive in the payroll system and other metrics that matter to the business. If a web server is running at 60% utilization and a database server is running at 45% utilization, why should the business care?
What other key pieces are entailed with monitoring?
Verifying data integrity is always important. As we move to the cloud, you start seeing more loosely coupled applications, which brings the risk that some data could be lost or altered. What happens when everything is supposed to move smoothly in a perfect world…but doesn’t? That’s where monitoring pays off, to ensure that you see a hundred pieces of data at the front end of the pipe and a hundred pieces of data at the back end of the pipe. You’re monitoring to ensure that each step is getting the same hundred transactions.
If you have a piece of data that goes through six steps and you monitor at each step or you put in instrumentation at each step to count it, you can also say, “Well, from step one to step two took three seconds, but from step four to step five took eleven seconds. Let’s look at our architecture and figure out how we can get eleven seconds down to three.” That gives you the next step, so even if you’re not chasing that goal now, you’re planning ahead for that.
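The step-by-step counting and timing Tim describes can be sketched in a few lines of Python. This is a minimal illustration, not a production tool; the class and step names are invented for the example:

```python
import time
from collections import defaultdict

class PipelineMonitor:
    """Counts records and notes a timestamp at each named step, so you can
    verify nothing was lost and see where the time is going."""

    def __init__(self):
        self.counts = defaultdict(int)
        self.timestamps = {}

    def record(self, step, n_records=1):
        self.counts[step] += n_records
        self.timestamps[step] = time.monotonic()

    def count_mismatches(self, steps):
        """Steps whose record count differs from the first step's count."""
        expected = self.counts[steps[0]]
        return [s for s in steps if self.counts[s] != expected]

    def step_durations(self, steps):
        """Seconds elapsed between each pair of consecutive steps."""
        return {
            f"{a}->{b}": self.timestamps[b] - self.timestamps[a]
            for a, b in zip(steps, steps[1:])
        }

# A hundred records enter the pipe; one step silently drops two.
monitor = PipelineMonitor()
monitor.record("ingest", 100)
monitor.record("validate", 100)
monitor.record("load", 98)   # the data loss surfaces here
print(monitor.count_mismatches(["ingest", "validate", "load"]))  # ['load']
```

The same `step_durations` output is what lets you say "step one to step two took three seconds, step four to step five took eleven" and decide where to spend your architecture effort.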
As businesses move to the cloud, how much education is required about the process changes to come? Is that all happening upfront at the beginning of the journey?
It really depends on the project, but many times we do have to shed light on what the operational impact to the organization will be. Part of our process involves talking to the stakeholders and sharing how we’re going to monitor this project so everyone understands how it brings value to the business.
One of the metrics you’ve talked about is utilization, but there’s a unique way you define that. Can you elaborate?
For me, it’s not so much about utilization in terms of how much CPU is being used, for example, but rather utilization in terms of how many people are using your application at one time. Let’s say you have a thousand people in your company. If someone spends an hour with the application, they may love it. But if they spend five seconds with it, they probably hate it.
Those types of metrics also allow us to go back to the business and say either:
A) “We’re very successful because there are a thousand people in our company and half of them log in every day.”
B) “There are a thousand people in our company and nobody’s logged in for a week. Why aren’t they using the application?”
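The adoption measure behind those two statements reduces to a simple ratio of distinct daily users to headcount. A hypothetical sketch (the function name and data shape are assumptions for illustration):

```python
def adoption_rate(daily_logins, headcount):
    """Fraction of the company that used the application on a given day.
    daily_logins: set of distinct user IDs that logged in that day."""
    return len(daily_logins) / headcount

# A 1,000-person company where 500 distinct users logged in today.
logins = {f"user{i}" for i in range(500)}
rate = adoption_rate(logins, 1000)
print(f"{rate:.0%}")  # 50%
```

Tracking that one number over time is what turns "did we ship it?" into "did anyone adopt it?"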
So besides developing an application, we have to ask questions such as:
· How are we going to get this application institutionalized in the company?
· How are we going to get it operationalized in IT so they can support it?
· How do we train the product owners and bring IT up to speed so they understand what they have to be prioritizing?
As you move the business from point A to point B, how often should they be looped in with the metrics you’re using and what other metrics do you focus on besides utilization?
It’s actually more important that we educate them about a new way of monitoring, as opposed to talking to them about individual metrics. That said, besides utilization, we also focus on the business’ Service-Level Agreement (SLA). What does the business want from this application? What response times do they need? What throughput do they need? There may even be a situation where Marketing is engaging in an activity that the rest of the business needs to be aware of.
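Checking an application against the kind of response-time SLA Tim mentions can be as simple as computing the share of requests under the target and a tail percentile. A minimal sketch, with invented sample data and a hypothetical 30-second target:

```python
def sla_report(response_times_s, target_s=30.0):
    """Share of requests meeting the target, plus the 95th percentile."""
    times = sorted(response_times_s)
    within = sum(t <= target_s for t in times) / len(times)
    p95 = times[min(len(times) - 1, int(0.95 * len(times)))]
    return {"within_sla": within, "p95_seconds": p95}

samples = [12.0, 18.5, 22.0, 29.0, 31.5, 45.0]
print(sla_report(samples))
```

Reporting "we met the 30-second SLA on 67% of requests" speaks the business's language in a way that CPU graphs never will.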
Speaking of that instance where you’re referring to Marketing and the need to involve others, are there any other stakeholders in the business that you find you have to regularly bring in when talking about metrics such as utilization and the business SLA?
It’s about the business users. If you’re building a marketing application, it should be the marketing people in the room. Still, it’s not typical for a single group to have sole ownership of an application. The conversation should be, “Who are my users? Who am I going to go talk with?”
So we’re always incorporating monitoring in the overall application design and the operationalization of the solution into the architecture.
It’s rare to understand the intricate details of IT while possessing a keen business perspective too. But Tim Stone clearly has that combination, with career accomplishments for clients over the course of 20 years that include increasing annual sales by 25%, supporting 66 operational stores and delivering $3.5 million across 7 lines of technology.
Through the complex projects he takes on at Silent IT for healthcare organizations and in post-merger integration, Tim transforms internal IT departments beyond the traditional cost centers they so often tend to be.