A Changing Mindset

Experience Reports, Software Testing

Ever since my short stint at the meat factory, I’ve been a Software Testing Consultant for all of my modest career. That is, until a few months ago, when fate threw me into a Product Owner role.
Five months in, I feel my priorities, my thinking, my mindset… change.

This is not necessarily a good thing, but it is a necessary thing. At first, I was Product Owner of Test Automation. But when that team disbanded, the overhead being too much for a reasonably small team, I became Product Owner of an eight-headed Scrum team: developer-architects, a tester, a test automation specialist, a DevOps specialist and, soon, a new junior developer.

My previous two blog posts were about helping a relatively small team learn more, move forward and become confident.
My new role is different again, and it’s giving me insights into myself and how I adapt to these dynamics.

Mindset

My mindset has changed drastically. Where before I was focused on risks, oversights and possible problems, I am now looking at ‘good enough’ and moving forward with the things ‘that matter’. Because of my testing background and my new PO role, I realise that those two things mean something very different to me than to other team members. I don’t know the risks well enough, I don’t know the scope too well (the product is very new to me) and I can only guess at the value our changes bring.

Yet, this doesn’t seem to stop me from forming opinions and making decisions.

It frightens me to take steps forward into this vast uncertainty of unknown unknowns, knowing that I’m probably on top of the Dunning-Kruger ‘Mount Stupid’.
I caught myself disregarding several risks people mentioned, just because they interfered with my plans…
When I was a tester, I criticised many Product Owners because I could see they had no clue what they were doing or where they were going.

I’m beginning to believe that this uncertainty is a big part of the role.
I need a tester to keep my feet on the ground.
I need this done as early as possible.

My priorities lie with keeping the team happy and delivering business value to the stakeholders. Not with risks, maintenance or changes…

Because of that, I’m not thinking about three of the four types of work.


Four Types of Work

When you find yourself in a situation where you don’t know enough or feel inadequate, start learning, reading and discussing. That’s what I do, at least. I needed to ‘up my game’.

One extremely important finding for me was the four types of work featured in ‘The Phoenix Project’: Business Projects, Internal IT, Changes and the highly destructive Unplanned Work.

This connected several frustrations of mine into one model.
My current customer is quite good at pinning down Business Projects. At the very beginning, we do a three-amigos kind of thing where we lay out the fundamental vision for the project and immediately try to cut down all the surrounding waste.

Internal IT is handled reasonably, though the responsible people seem to live on a well-frequented island. We have two Admins who seem to troubleshoot and fix several major problems a day.

Changes happen frequently, but they are largely unmanaged. I’ve added a blank User Story to our sprints to capture ‘surprise tasks’. This should create a good baseline for seeing where these change requests come from and how much time they soak up. From there on out, we can create procedures to mitigate, ignore, prioritise, escalate… What exactly we’ll do with the data, I don’t know yet, but we’ll have a better idea of how to tackle these changes.
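
As a rough sketch of what that captured data could tell us, assuming a hypothetical CSV export with ‘source’ and ‘hours’ columns (the file name and field names are my own illustration, not our actual tracker), tallying the time soaked up per requester might look like this:

```python
# Toy sketch: tallying captured 'surprise tasks' by where they came from.
# 'surprise_tasks.csv', 'source' and 'hours' are hypothetical placeholders;
# any tracker export with a requester and a time-spent column would do.
import csv
from collections import defaultdict

def hours_by_source(path):
    """Sum the hours soaked up by surprise tasks, grouped by requester."""
    totals = defaultdict(float)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            totals[row["source"]] += float(row["hours"])
    return dict(totals)

if __name__ == "__main__":
    tallies = hours_by_source("surprise_tasks.csv")
    for source, hours in sorted(tallies.items(), key=lambda kv: -kv[1]):
        print(f"{source}: {hours:.1f}h")
```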

I can finally put into words why, as a tester, I was often a source of frustration for a Product Owner: Unplanned Work. This type of work disrupts your whole flow, motivation and plans, and can ultimately destroy your project. Call it bugs, risks, oversights,… it’s everything that suddenly requires someone’s attention, so that they can no longer do anything that was planned. It eats your plans. It tears apart your flow and energy. It makes sure people get frustrated.

While Work In Progress is often called the silent killer, Unplanned Work is the loud, bloody zombie apocalypse that comes to exterminate your project. It terrifies me.
… enter the jolly testers who tell us we forgot about something important.

We just had two sprints torn up by the walking dead. Project management: ‘Oh, we forgot to include these highly crucial features that need to be in production by the end of the month.’
Neither I nor the team was amused.

A Change in Thinking

A year ago, when I was a tester in this situation, I would raise many bugs, make them visible and be loud about the frustrations I noticed in the team.
On similar occasions, I’d have given up, watched the train ride into a wall (again) and then seen what we could make of the pieces.

Being in this situation as a Product Owner, I try to make the best of it. Hope for the best and try to plan for the worst.
As a contingency, I’m putting mechanisms in place that will bring more insight:

  1. We will capture the ‘surprise tasks’ that weigh on the team, to manage Changes
  2. We will analyse the bugs found after development (and initial testing) from the past 6 months to build a checklist that can help us identify Unplanned Work
  3. I need to keep a buffer to allow for Unplanned Work

The data from 1 will be a baseline for coming up with Change procedure(s).
From the data in 2, we can build automated checks, monitoring, alerts and ‘have you thought about/talked to X’ checklists for management.

I’m now in a role where I don’t have to be the 20-something-year-old screaming bloody murder anymore. It might sound strange or unfair, but my words have more impact these days. I won’t complain.
This phenomenon has given me the power to actually strategise and bring change while being very obvious about it. I’m not trying to persuade people to follow my ideas anymore. I’m gathering them by being direct.


I want to avert future disasters like the one we’re in now. I want the team to be on top of things. Maybe in the future, we’ll simulate our own disasters while we’re still in control. Just for fun. And learning.


Product Owner of Test Automation 2

Experience Reports, Software Testing

In my previous post, I explained the strategy I envisioned for the team, comparing it to a board game.

What that post thoroughly lacked was a clear focus on team learning. I feel like an idiot for not noticing this earlier.

Learning Objectives as part of the sprint

What has become abundantly clear to me is that the team members are the heart of a team. They need to be nurtured and grown.
In our team, we’ve been actively investing in people to help them become more confident and knowledgeable. ‘Learning objectives’ have come to make up about 50% of our sprint stories.

I add in Spikes, Proofs of Concept, Blog Posts, Challenges,… to have people work through material and produce reports, concepts, demos or anything else that reproduces the acquired knowledge. After that, they ask for feedback from other team members, discuss or teach. The aim is to achieve two things: new learnings and something valuable for the team, project or product. This keeps our stakeholders happy and our team in learning mode.

But 50% is a lot of time… how do you explain this to stakeholders?
Test automation is a valuable endeavour, though in uncertain conditions it can be rendered useless, time-consuming or even time-wasting. That’s where we are now with the team. Many different things are changing. The application, the architecture and the development teams are all getting a good shake. This is not a good moment to invest heavily in UI or even API checks. Instead, I’m shifting the team’s focus in a different direction.

Whereto now?

I see a lot of opportunities to coach, train and pioneer automation strategies, as a team.
Once the dust settles from all the management decision-making and architecture workshops, it’ll fall to the automation people to substantially improve our release pipeline.
To achieve this, we need to become better at what we do, need to become more confident in what we say and become more respected for the value we bring.
As a team.

Instead of building more automation, our focus shifts to coaching, training and knowledge sharing. The issue, however, is that we first need to do knowledge gathering and train ourselves. The good thing is, we’re more of a team now than before and we can help each other out. We also have some time to invest in ourselves, which will pay off tenfold in the future. Hopefully.

In parallel with the team building up their skills, I’m monitoring the progress of these changes and looking for opportunities to help out. Whether it’s now or in the future, I want to know where we can add business value fast. Additionally, I’m collecting examples of good practices in our context and using those as the basis for an automation strategy.

Changes are coming our way, but we’re preparing to deal with them.

The Automation team, at sprint kickoff

Session Based Learning

Software Testing

Last week at BREWT1, a peer conference in Belgium, I was talking to Simon Tomes about an idea his new tool TestBuddy had triggered:

Session Based Learning Management.

Let me first introduce you to Simon and his project. Simon is a wonderful human being whose mission in life is to raise the spirits of everyone around him just a little bit higher. With years of experience, he’s become pretty darn effective at it. #GoEncourage is his mission to see things positively and communicate as such. As a tester, he’s found it especially helpful for his developer-tester relationships.

TestBuddy is Simon’s and Rajit Singh’s brainchild. From the way things are looking, this will become the go-to application for Session Based Test Management: centralising your charters, your missions and your team, and giving an overview of what’s done and what’s still on the stack. I’ve had the privilege of watching their journey and am eager to see it evolve further.


What do we have to learn

The idea that sparked me to talk to Simon was that their tool could very effectively be ‘abused’ to guide team learning.
Imagine being on a team that kept a backlog of ‘what do we have to learn’, the same way we have charters that guide us in ‘what do we have to test’. Same concept, different goals.

Imagine sitting in a planning meeting that outlines skills, information, books, videos,… that must/should/could be explored, and having those ‘learnings’ split up into charters that work the same way as the ones you’d use to test an application:

Plan your learnings:

  1. As a team, pinpoint a skill, piece of knowledge,… needed by the team
  2. Explore the skill to get a basic overview (a first charter?)
  3. Outline what the absolute basis is to start building from
  4. Identify what outside help/tools/resources you need
  5. Try to plan a step-by-step pathway of learning consisting of several charters

For every charter:

  1. State your mission of learning and describe what a successful session would look like
  2. Whenever you see an opportunity for a sidetrack, create a new charter for it
  3. Use the time of the session to learn in service of the charter’s mission.
  4. Debrief with the team: What have you learnt and what can you teach?

Debrief:
Using Jon Bach‘s PROOF model:

P: How did you go about your learning journey?
R: What can you identify as having learnt?
O: What stood in the way of learning?
O: Did you see sidetracks or uncover new steps to explore?
F: Does the learning path still feel valuable? Would you abandon/change/evolve the pathway?

Notice how the process guides you through different learning phases?
Explore, Draw, Internalise, Debate. 
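
To make the idea a little more concrete, here’s a toy sketch of how such a learning-charter backlog might be represented. These names are purely my own invention for illustration, not TestBuddy’s API:

```python
# Toy model of a learning-charter backlog, mirroring session-based
# test management. All names are illustrative inventions.
from dataclasses import dataclass, field

@dataclass
class LearningCharter:
    mission: str                     # what we want to learn this session
    success_looks_like: str          # agreed up front, before the session
    debrief: dict = field(default_factory=dict)     # PROOF notes afterwards
    sidetracks: list = field(default_factory=list)  # charters spawned mid-session

    def spawn_sidetrack(self, mission, success):
        """Park a sidetrack as a new charter instead of chasing it now."""
        charter = LearningCharter(mission, success)
        self.sidetracks.append(charter)
        return charter

# The team's backlog is simply a list of charters, planned and debriefed together.
backlog = [LearningCharter(
    mission="Get a basic overview of contract testing",
    success_looks_like="Can explain consumer-driven contracts to the team",
)]
```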

I could see this method creating a wonderful learning path for the whole team. In the long term, there’s nothing that makes us happier than learning something new.

Working together to gather new insights, collaborating on setting learning goals, sharing acquired knowledge,… I imagine it would be an incredibly strong psychological, emotional and fulfilling journey for any team.

What do you think? Could this be something you could apply in your project? How much time would the team be able to invest in this per day, week, sprint?


Tired of Answers

Complaining, Software Testing

During the past few years, I’ve interacted with many experts, thought leaders and emerging talents, and boy, there are plenty of them. I mean this in a non-comical way: there is tons of talent out there. Keep your eyes open and your ears cleaned (that part was meant comically).

Regardless of where I am in the food chain of conference-goers, I can’t help but observe, learn and… among other feelings, get frustrated.
The thing that has been baffling me for the longest time is that people come to conferences, fora, Slacks and: ask questions. That’s not the crazy part. This is:

People expect simple answers and, crazier yet, they get them.

Here’s why that’s preposterous: you are working in a complex situation. Everything is subject to change. Nothing is certain to be static. Assuming it will be is risky.
That’s the only given we have.

But the way many people deal with that given feels so very wrong.

We cut the problem into many smaller ones. We like that. It’s easier to understand, less brain-intensive, more straightforward to plan for and very manageable.
It’s a common engineering pattern. Imagine you’re on the team tasked with building the Golden Gate Bridge. You look at designs of other bridges, work out how many bolts, nuts and bars you need and set about putting them together.
That sounds easy, right? Except that something in the back of your mind is telling you it probably isn’t. Nobody has built a bridge of this magnitude before. This task might be more complicated than initially thought…

On top of that, add the following parameters:

  • The water it crosses is not a river, but a bay. Subject to the tides and rapid weather changes.
  • The political and territorial landscape might change.
  • Vehicles crossing the bridge will inevitably evolve, maybe even much sooner than anticipated.

Compared to other bridges, this one is proving to be a complex beast. As soon as we take the holistic view, we realise how wrong we were.


Here’s a secret. We’re all building complex beasts. We don’t understand the fundamentals, we can’t foresee the future, we don’t know our present well enough and we have no idea which parameters will change our situation today, tomorrow or just after the next major release.

Therefore I would advise everyone to quit looking for simple answers. They are false.
Therefore I would advise everyone to quit giving simple answers. You are not helping.


There are principles, models, sets of guiding rules, methodologies, approaches, tactics, heuristics, practices,… and many different ideas to help you do a better job.

Some of these can help you greatly, and most of them were created and defined with good intentions, but ALL of them have been misunderstood and/or misused by malicious or irresponsible actors (myself included, at times).

The next time you’re faced with the need to give or receive simple, easy answers, repress that urge. Instead, try to trigger thinking, within yourself and within others. Think deeper, broader and in different directions. Offer alternative ways of thinking, provide helpful information and ask questions that may lead down a path.
The path of learning.


Applied heuristics

Experience Reports, Software Testing

This is an experiment. I’m trying to figure out my understanding of what the word ‘Heuristics’ means to me and whether it’s helpful to me and my craft: Testing.

Therefore, I’ll rehash an old story of how I found a bug, annotate the heuristics used in bold, and see what I can analyse from that afterwards.

Any guidance, ideas, intuitions or sources are very much appreciated.


Finding a Memory Leak

It was past noon. I’d been testing a not-too-complex application for quite a long time. I knew its ins and outs, and I felt myself getting bored.
I remember going through the customer files rather methodically, one by one, click,… click,… click, without a clear aim. You could say I was randomly exploring.

Until something didn’t feel right. Call it surprise or intuition: something was wrong.
I noticed a pattern of loading times slowing down almost unnoticeably. I can’t put into words what triggered it for me. Was it an observer-expectancy effect? Was it my negative mindset that sharpened my senses? I haven’t got a clue, but I felt I had to focus, zoom in.

I chose a tactic that was less boring than meticulously clicking my way through. Selenium IDE isn’t the best of tools, but it fit my purpose perfectly. I recorded my click and copied it a thousand times. I monitored the behaviour with the developer tools, pressed ‘play’ and went for a coffee. After a while, I could see the latency increasing, which explained why everything had gone terribly slow. Eventually I saw it ramp up until the application crashed. Pairing with a developer, we concluded that there was a serious memory leak.
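
For illustration only, here’s a minimal sketch of that repeat-and-measure tactic using Selenium WebDriver in Python rather than a recorded Selenium IDE script. The URL and locator are placeholders I made up, not the application from the story:

```python
# Repeat a click many times and watch the response time climb.
# A steadily increasing latency across iterations is the smell of a leak.
import time
from selenium import webdriver
from selenium.webdriver.common.by import By

driver = webdriver.Chrome()
driver.get("https://example.test/customers")  # placeholder URL

timings = []
for i in range(1000):  # the IDE script copied the click a thousand times
    start = time.perf_counter()
    driver.find_element(By.ID, "next-customer").click()  # placeholder locator
    timings.append(time.perf_counter() - start)
    if i % 50 == 0:
        print(f"click {i}: {timings[-1]:.3f}s")

driver.quit()
```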


Implicit and Explicit heuristics?

A heuristic is “a fallible approach to solving a problem”.

The teachings of RST and BBST and possibly how ‘heuristics’ were meant in the science books would identify all the bold phrases as some variation of a heuristic. I’m sure many other heuristics played in my mind that I don’t have words for yet.

However, notice how the first part of my story is almost completely based on hunches and feelings. Something directed me to find this bug and it wasn’t intentional.

In the second part, I took matters into my own hands. I had a goal, I chose a tactic and I was able to measure my results. Other tactics could have been clicking the button myself for the umpteenth time and missing out on a coffee. I might have asked a developer to look into it straight away and maybe had time left for two or three coffees.

I can’t know for sure whether I chose the correct heuristic or tactic for that situation, and I’m sure as hell that “# of cups of coffee drunk” isn’t the correct way to measure that success.

We can’t go back in time and compare results of heuristics used vs. the ones not used. Though I’m pretty happy with the results of the one I set up.


The Question Remains

Are the ‘implicit heuristics’ (i.e. feelings, patterns, biases, hunches), which can’t really be controlled or measured, useful to call heuristics? They fail, yes. They help you notice things, yes. But do they solve problems?

Are ‘explicit heuristics’ (i.e. repetition to find boundaries, monitoring for abnormalities, pairing to increase understanding) worth a dime without their implicit counterparts?


It’s not up to me to answer these questions. Yet I find it intriguing to ponder the matter.
