This is the seventh chapter in a series of blog posts exploring the role that intelligent observability plays in the day-to-day life of smart teams. In this chapter, our DevOps Engineer, Sarah, experiments with low-code and Moogsoft in her team’s DevOps toolchain to rush a new feature out the door and keep up with a competitor.
I’m kind of happy I became a DevOps engineer, as it increasingly seems that developers are losing their cachet. Well, maybe that’s a bit strong, but last time I built a bit of the Animapanions product, I barely wrote any code at all – I composed most of it from open-source binary artifacts from online repositories.
Actually, even as a DevOps engineer, while I write some scripts, a lot of what I need to do is in the tools. But last week, we were under the cosh. One of our competitors had released a new feature and their app reviews were going wild. Our product owner, Pat, said we needed to match it forthwith.
The feature was a pet breed wishlist. All human slaves of, say, Russian blue cats can form a community in their app and make recommendations based on what their felines really like, or what they think they’d really want. I get it. I am a slave to PrincessPeachyMelba. She’s a Russian blue and it’s a constant battle to keep her crazy little mind entertained. I blow a good chunk of my salary on toys for her. Specific breeds do have particular characteristics. What’s more, humans get pretty passionate about the breeds they love and cohabit with. And as Pat said – all that customer data! Imagine what we can do with it in terms of product placement, marketing, buying, and cross-selling.
In this morning’s stand-up, we broke the rule about making no changes mid-sprint and pushed the tech-debt user stories I’d been working on out to the next one. To be fair, this is the first time that’s happened, and I’m confident Pat won’t be doing it again in a hurry.
It was time for me to put my developer hat on once more. But when I sat down to code after that stand-up, Jason, our lead developer, rolled his chair over to mine and said:
“You want to try something new?”
I’m all in on experimentation. You might say it’s in my blood; both my parents are scientists. My mother’s a science teacher, retired now. But I had the rare pleasure of being educated at the school where she taught. Not as bad as you might imagine; she was pretty popular as teachers go. But rarely a lunchtime passed without guffaws from my peers, just out of her class, at yet another failed experiment. I understand now that it doesn’t matter so much if experiments fail. The important thing is that they have a hypothesis, that they are empirical, and that we respond to the feedback. In our case, that means developing a small new feature, testing it out with a small number of users (we practice feature toggling, A/B testing, and canary releases extensively), and comparing the results to our original hypothesis to decide whether to pivot or persevere. It’s the classic plan-do-check-act loop – also known as the Deming cycle.
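If you’ve never looked under the hood of a percentage-based canary, it can be as simple as deterministically bucketing each user (this sketch is purely illustrative – the function name and rollout mechanics are invented for this post, not taken from our actual product or any particular feature-flag tool):

```python
import hashlib

def in_canary(user_id: str, feature: str, rollout_pct: float) -> bool:
    """Deterministically bucket a user into a canary cohort.

    Hashing the feature name together with the user ID means the same
    user always gets the same answer for a given feature, so their
    experience stays consistent between sessions.
    """
    digest = hashlib.sha256(f"{feature}:{user_id}".encode()).hexdigest()
    bucket = int(digest[:8], 16) % 10000   # a stable bucket in 0..9999
    return bucket < rollout_pct * 100      # e.g. 5.0% -> buckets 0..499

# Route roughly 5% of users to the new wishlist feature
if in_canary("user-42", "breed-wishlist", 5.0):
    pass  # serve the new feature to this user
```

Widening the rollout is then just a matter of raising `rollout_pct`; users already in the cohort stay in it, because their buckets don’t change.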
It makes me wonder what our competitors had hypothesized for their new feature and what success criteria they had defined in terms of value outcomes. We can only see what they are getting from their app reviews and only imagine what it’s done for their market share, new customer count, basket size, and all the other things that positively correlate with sublime customer experience and make for a high-performing organization.
“You’re on,” I replied to Jason. “What’s cooking?”
“A couple of weeks ago I was rootling about in the AWS Marketplace and stumbled across this low-code solution,” he said.
“Is that how you’ve been crushing your story points?” I asked. We don’t like to measure developer performance on story points or productivity – it’s team performance and value flow that matter after all. We use story points so that Pat can properly set expectations at the portfolio level about what we are likely to deliver when. But Jason’s productivity had spiked in the last sprint and he hadn’t been going home any later than usual. It hadn’t gone unnoticed in the sprint review, but he’d just laughed it off and blamed it on the new coffee machine. He nodded.
“How come this is the first we’ve heard about it?” I asked. We create plenty of opportunities to share – the daily standups and the reviews and retrospectives for starters, and we’re colocated again so it’s not like we have to Slack everything. Water cooler conversations happen.
“I guess I was a little embarrassed,” he replied. “You know, I’ve always been a bit of a snob about low-code, no-code. It’s for people that can’t code, right?” I nodded now. We developers can have a bit of an ego about our coding skills. And quite right too – it’s not a skill everyone has, and it’s hard-earned.
“Same, same,” I said. “The general view is that they are also quite restrictive though… difficult to customize.”
“You notice any problems with the feedback on my user stories from the last sprint?” Jason asked. I shook my head.
“This tool does let you get into the underlying code if you want to. I didn’t need to, as it happens, but it’s an option. A couple of things you’ll really like though. Firstly, knowing how much you like AI, it uses AI to test and manage the dependencies.”
“That darn warehousing integration,” I murmured, looking at his screen where he was showing me the IDE.
“Exactly. And it has its own CI/CD capabilities, but it also integrates with all the tools in your DevOps toolchain. Including Moogsoft – so we can get performance data and feedback way before we even try to go into production.” He took a sip of his coffee.
“Our DevOps toolchain,” I corrected.
“Team tool, your masterpiece.” We smiled at each other.
“You have been a busy bee.”
“I sure have. Shall we have a go at the breed-based community wishlist feature now? I thought we could do some pair programming today. Actually, it’s more like trio programming – Pat’s joining us too.”
“I’m sorry?” I said. Pat’s the product owner, as I said, not a developer.
“I know what you’re thinking,” said Jason. “But she said herself it’s all hands on deck. She studied computer science and tinkers away with a bit of Python on the side for a charity project she’s involved with. She’s got enough skill to do some damage with low code. Good damage, that is,” he qualified.
They weren’t the shortest few days, but I did get to know both Jason and Pat a lot better. And we had a hoot. Most importantly, in the sprint review we were already looking at the results from the release of the new feature. We’d only done a canary release to around 5% of users less than 24 hours earlier, but we could already see via Moogsoft that it was holding up well, just as our pre-production analysis had suggested. There was a definite spike in users, and the feature itself was seeing strong session times and transaction conversion rates. Can’t wait to tell my mother how our experiment worked.
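Under the hood, that pivot-or-persevere call on a canary often reduces to comparing the canary cohort’s metrics against the baseline cohort. Here’s a minimal sketch of the idea (the metric names and thresholds are invented for illustration – they aren’t from Moogsoft’s API or our real dashboards):

```python
def canary_verdict(baseline: dict, canary: dict,
                   max_error_increase: float = 0.005,
                   min_conversion_ratio: float = 0.95) -> str:
    """Compare canary cohort metrics against the baseline cohort.

    Returns "persevere" when the canary looks healthy enough to widen
    the rollout, and "pivot" otherwise.
    """
    error_delta = canary["error_rate"] - baseline["error_rate"]
    conversion_ratio = canary["conversion_rate"] / baseline["conversion_rate"]
    if error_delta <= max_error_increase and conversion_ratio >= min_conversion_ratio:
        return "persevere"
    return "pivot"

# Canary errors are roughly flat and conversion is actually up:
baseline = {"error_rate": 0.010, "conversion_rate": 0.040}
canary = {"error_rate": 0.012, "conversion_rate": 0.047}
print(canary_verdict(baseline, canary))  # persevere
```

In practice an observability platform feeds those numbers in continuously, so the verdict can be re-evaluated on every data point rather than once at the sprint review.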
Want to learn more?
Register and attend the live webinar Intelligent Observability: Automation in DevOps Toolchains this Thursday, June 10 at 10am PT | 1pm ET | 6pm BST.
About the author
Helen Beal is a DevOps and Ways of Working coach, Chief Ambassador at DevOps Institute and an Ambassador for the Continuous Delivery Foundation. She provides strategic advisory services to DevOps industry leaders and is an analyst at Accelerated Strategies Group. She hosts the Day-to-Day DevOps webinar series for BrightTalk, speaks regularly on DevOps topics, is a DevOps editor for InfoQ and also writes for a number of other online platforms. Outside of DevOps she is an ecologist and novelist.