The Need for Creativity

The other day I was watching a show on TV about how food is made and I started thinking about software testing (because that’s what normal people do, right?).

So much of what we have in our society is made on assembly lines and food is no exception.  That candy you bought?  Produced in an assembly line style environment.  The loaf of bread and container of milk?  Same thing.  We still have artisanal food, but the reality is that most of what we eat is at least partially made in a factory somewhere.

How do you find meaning in what you do in the assembly line world?  How do you find meaning in just being a step in a process? Do you actually feel like you have made something when, at the end of the day, you’ve rolled out 240 candy canes per hour? I think that we as humans have an innate desire to create, but does a process like this allow you to create? Can we find meaning and fulfillment in a job like this? Can we meet our need to create things when most of what we build as a society requires the input of many different people?

I think we need to think about these questions, and not just in the assembly line context. These kinds of questions matter in the software world as well. It isn’t just physical goods that are too complex for most of us to make on our own.  The software systems we build require teams as well.  We may call this creative work, but very few of us create entire products. We work on one small piece that fits in with other pieces. Can we fulfill our need to create in this context?

What about those of us whose primary role on the team is to test? What do we create?  How do we scratch that creative itch?

I think this is why a common complaint among testers is that they get left out of decisions about what the product will do or look like. It’s why we talk about the need to shift left. It isn’t just because those things are helpful (although they usually are). It’s because we have a need to create something and to be included in the creative process. If all we are doing is checking someone else’s work to see if they did it correctly, we have no job fulfillment.

Before I started in the testing job I have now, I worked in a couple of QA jobs in factories. I hated them. Take this batch of paint. Measure it for various properties.  Compare to the specification table the customer had provided and accept or reject. Boring work with nothing to point to. At least if I worked at a factory that made candy canes, I could look at a candy cane in the store and be proud of the fact that I had twisted two colors together perfectly. If I looked at a can of paint, all I could say was that I had told everyone they did their job correctly. Not only was the job boring, it felt like it wasn’t tied to something I was creating.

Now back to software testing. If my job as a tester is to compare the work someone is doing to a specification and accept or reject it, I’m not going to feel like I am producing something of value. We want to be involved in making something. We want to be able to point to the system proudly and say “I helped make that!” The human need for creativity is why we testers want to be involved in more than just checking if the product does x or y, but – and this is important – it is also why your team needs a great tester. All members of the team want to be able to point to the software and be proud of the fact that we made something great.  If we can’t do that we will be demoralized as a team and we aren’t going to produce a lot of value in the long run. Testers help with this! Value your testers. Listen to them. They share the same desires you do.  The desire to make something we can all be proud of. Let’s do this together!

What Hume Can Teach Us About Automation

David Hume’s Guillotine tells us that we cannot derive an ought from an is.

Abstract philosophical musings, you might think. But then again, you work in software, so you know that philosophy matters, right? I want to muse on this for a while, and not in an abstract realm.  I want to take this down into the real world.  I’m going to take the law at face value. If it really is true that an ought cannot be derived from an is, what should we do differently?

“How can I automate this?”

“Should I automate this?”

Two similar sounding questions, but which one you start with makes a world of difference.  How often do we start with the is? We start with the fact that we have a framework in front of us that lets us easily add tests, or the fact that we are told to automate everything, or even the ‘fact’ that manual testing is too slow or too cumbersome for the modern software shop.  But aren’t we forgetting something?  Aren’t we forgetting the law our friend Hume told us about?  Just because we can do something doesn’t mean that we should.

This applies at every level of software development (because what is software development except automation?).

Should you make this product?

A number of years back the company I work at started a new initiative. We had acquired two previously competing products and after the acquisitions we decided to try and combine the best of both products and re-architect everything into a new product. It seemed like a great idea and we put a lot of effort into it for several years, but recently the product was canceled.  Why?  Because we had decided that this product did not need to be made.  Some people were interested in it, but market analysis and other factors indicated that it was not worth further effort. We might have been right or wrong about the market forces at play and the strategy employed, but this does illustrate something.  Could we make this product? Sure.  Should we?  Not if it won’t help people. Not if it will lose us money.

Just because you can do something doesn’t mean you should.

Should you automate this?

What about an example that is a little more related to test automation?

At one point I worked with a team that did workflow tests.  These were scripts that we would run through every morning to check if the nightly builds were ok.  As you can imagine they were boring to do and so we decided to automate them. Makes sense right? Automate away the boring stuff. We don’t want to spend half an hour every morning working through the same boring scripts. So we went ahead and did it. We automated them, but should we have?

I’ve talked before about how boredom can be an indicator that you should automate something, but that it can also be an indicator that you are doing something useless.  So which is it here?  I think a bit of both actually.  We automated them all at that time, but since then we have removed or changed many of them.  There are still a few that have stayed around because they are valuable, but we certainly didn’t need as many as we had.

This once again illustrates the point that just because you can automate something doesn’t mean you should.  If we start from the is – is this possible?  Does a framework exist that allows me to do this? – we never get to the ought –  does this make sense?  Will this be valuable?  If we had started with ought questions we would have just gotten rid of half the scripts we had and automated the few that were actually valuable. Instead we started with the is questions and ended up spending a lot of time creating and maintaining tests that we really didn’t need.

We need to start with the ought.  In the software world we (understandably) love using software to do things.  Sometimes that is the wrong answer.  Sometimes there is a non-software solution to something. Sometimes we are just doing the wrong thing and so setting up systems to do it faster isn’t helping anyone.

Start with ought.  Know why you do what you do.  Take a minute to think things through before you plunge in and start doing.  If you know the ought, figuring out what to actually do becomes much easier.

Book Review – Systemantics

I recently read through Systemantics by John Gall (or the Systems Bible as the newer editions are called). Although presented in a very humorous and entertaining way, this book is packed with ideas that make you stop and think.

Why don’t things work the way you expect them to? Well, this book will tell you. It might seem discouraging to know that a “Complex System cannot be ‘made’ to work. It either works or it doesn’t,” but when you think about it, it is easier to (principle 31) “align your system with human motivational vectors,” than it is to keep banging your head against fundamental systems laws.

And never forget that “systems will display antics.” Don’t be surprised when the system doesn’t do what it is designed to do.

This book might make you a little cynical, but you’ll probably get further ahead in the world if you understand why you are feeling frustrated by the systems you are in. Sometimes a dose of realism is good for us, right? I would highly recommend this book to anyone who works with systems (i.e. all of us).

Writing Automation is Easy

“It’s easy to write automation”

As the thought jumped through my mind, I had to tip my chair back, look up at the ceiling and ponder.  Is that true?  Is it easy to write automation? As I stared at the grey painted ceiling and chewed on the thought, some examples came to mind that illustrated the truth of this.  It really is easy to write automation.

I had put together a new test in a half an hour and even while I was doing it, I was thinking about how much of the test creation work could be automated.  With a simple template and a few shared functions we should be able to make it even easier to create an automated test.  Creating a test is easy.

I had a new co-op student and after a bit of training on our automation system he was able to add several new tests in a day.  Creating a test is easy.

We have thousands of tests in our automation system.  How did we get so many? Well, creating tests is easy.

The Struggle

But, if it is so easy, why do so many people struggle with test automation?  Why is it an ever popular subject of books, blogs and conference talks?  Why do we spend so much time and effort on this?

Well, the answer is simple.  Test automation is really hard.

Wait, what?  What kind of weird world do I live in?  Didn’t I just say it was easy?  Well, not quite.  Adding a new test is pretty straightforward.  Adding a test that is going to be both valuable and low cost – that’s a whole different story.  Adding tests is easy.  Adding good tests?  Not so much.

Remember that test I added in a half an hour?  Easy, right?  Well, I ended up making several major edits and changes to it, based on review feedback.  Why?  Because adding a test is easy, but adding a good test is hard.

Those tests the co-op student put in?  We spent a lot of time together on figuring out what kinds of things to check in these and how to make them work well.  Adding the tests was easy, making them good tests was hard.

Our automation system does indeed have thousands of tests and we are struggling with how to keep up with them.  How do we change and tweak them so that they are more valuable?  Once again, adding tests is easy, but adding good tests is hard.
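The contrast can be sketched in a few lines of code. This is a toy example of mine, not from my actual test suite: a hypothetical `apply_discount` function with an “easy” test that only touches the happy path, next to a “good” test that pins down the boundaries we actually rely on.

```python
# Hypothetical function under test (illustrative, not real product code).
def apply_discount(price, percent):
    """Apply a percentage discount to a price, rounded to cents."""
    return round(price * (1 - percent / 100), 2)

# Easy to write: exercises one happy path and nothing else.
def test_discount_easy():
    assert apply_discount(100, 10) == 90.0

# Harder to write: checks the edges where bugs actually hide.
def test_discount_good():
    assert apply_discount(100, 0) == 100.0   # no discount is a no-op
    assert apply_discount(100, 100) == 0.0   # full discount bottoms out at zero
    assert apply_discount(10, 25) == 7.5     # fractional results survive rounding
    assert apply_discount(0, 25) == 0.0      # free items stay free

test_discount_easy()
test_discount_good()
```

Both tests are a few lines long and both “pass,” but only the second one earns its maintenance cost – which is the whole point.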

How do we fix automation problems?  Do we need to hire more automation engineers? Do we need a new framework? Do we need to get more developers involved?  Maybe. But there is one thing that we really need.  We need skill. Changing tools and processes can help, but to get good automation you need skilled automators.

We need automation craftsmen and women.  We need people who know and understand how to write high quality automation.  People who know how to use automation to actually lower the costs of  testing.  People who know how to write the kinds of tests that actually add value.

You want to stand out from the crowd?  Don’t just write automation – write good automation. Study automation.  Learn how to leverage it.  Learn how to clean up automation messes. Learn how to move automated tests from mediocre to great. Figure out what makes for good automation and then find the tools you need to help you achieve that.  If you are a student of automation and you can craft useful and valuable tests, you’ll have nothing to fear. Machine learning and AI are just additional tools in your toolbox.  If all you do is turn test cases into automated scripts – well, I’m sorry, but that’s easy and your job security is low. Learn how to write good automation before the machines come for you.

Writing Automation is easy. Writing good automation is hard.  Do hard things!

30 Days of Agile Testing – RED BUILD!

Note that this post is part of a series where I am ‘live blogging’ my way through the Ministry of Testing’s 30 days of Agile Testing challenge.

What actions do we take when there is a red build?  Well, what a timely question!  I’ve just spent the last couple of days trying to figure out a few different build issues.  The story illustrates one set of responses to a red build, but it also shows that there isn’t just one answer to the question of what we do when there is a red build.

Thursday I noticed that one of my builds, used to check a lower-level package, was red. It wasn’t just one or two tests failing.  Every single test was failing. Clearly something was going badly wrong.  I spent some time digging into it and finally realized that part of the package wasn’t getting extracted correctly during the setup.  After some more time (and frustration), I finally figured out that the issue boiled down to the regression VM using an older version of 7zip.  Apparently the build machine creating the package had been updated to a newer version and so now the old version on the machine I was using couldn’t properly extract the package.  I updated the version of 7zip and re-ran the build. Everything was passing, so I posted an artifact to get picked up in the final build process. Everything is good now, right?

Wrong.

Friday morning I came in to find that instead of the build picking up an artifact from Thursday (as it should have), it had picked one up from May?!?! Stop the presses! More sleuthing required!  We stopped the build from progressing and started digging into it. The problem ended up being another machine that had the wrong version of 7zip installed. This machine had also not cleaned up properly at some point and so had an old file hanging around that it could (and did) use.  We fixed the 7zip version and updated the scripts to make sure they were correctly cleaning things up and now *touch wood* everything is running smoothly again.
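The cleanup side of that fix can be sketched simply. This is a minimal illustration of mine (the directory layout, file pattern, and cutoff are all hypothetical, not our real build scripts): before a build picks up an artifact, sweep the drop directory and delete anything older than the current build window, so a stale file from May can never be used by mistake.

```python
import time
from pathlib import Path

MAX_AGE_HOURS = 24  # assumption: one artifact per nightly build


def clean_stale_artifacts(artifact_dir, max_age_hours=MAX_AGE_HOURS):
    """Delete artifacts older than the cutoff; return the paths removed."""
    cutoff = time.time() - max_age_hours * 3600
    removed = []
    for path in Path(artifact_dir).glob("*.zip"):
        if path.stat().st_mtime < cutoff:
            path.unlink()  # stale: left over from an earlier build
            removed.append(path)
    return removed
```

Running this as the first step of artifact pickup means a machine that skipped its cleanup still can’t hand the pipeline an old file.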

The point of this story is to show that the things we do to deal with red builds vary. Normally we wouldn’t stop all other work and focus all energy on fixing the build, but in this case the red build was of the ‘Nothing Works!’ category and so the steps taken were more drastic.  In the ‘normal’ day to day red build where a test or two is failing, our approach would be different.  We would look into it and follow up, but if the issue was small enough we would let the build pipeline continue and just follow up with a fix. Or if we caught the issue early enough, we might just quickly revert a change and things could continue on as expected.  The approach to a red build can’t be strictly prescribed and often requires exploration to figure out.
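That triage can be thought of as a rough decision rule. A toy sketch (the function and its responses are mine, not a real pipeline policy) of how the scope of the failure drives the response:

```python
def triage_red_build(failed, total):
    """Pick a response based on how red the build is (illustrative only)."""
    if total and failed == total:
        # 'Nothing Works!' category: something systemic is broken
        return "stop the pipeline and investigate"
    if failed > 0:
        # A test or two failing: follow up with a fix or a quick revert
        return "let the pipeline continue and follow up"
    return "build is green, carry on"
```

The real judgment is messier than three branches, of course – that’s exactly why it takes exploration rather than a prescription.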

The lesson? Even when it comes to red builds, the context matters!

30 Days of Agile Testing – Work Tracking

Note that this post is part of a series where I am ‘live blogging’ my way through the Ministry of Testing’s 30 days of Agile Testing challenge.

Today’s challenge asks what columns we use on our work tracker or kanban board.  To be honest we don’t use columns at all….

I know, I know, bad us right? Perhaps so. This probably would be something worth trying, but for some reason we have never gone down this road.  I’m not sure why, but it hasn’t risen up as a high priority thing to try.  Perhaps those of you who do use a kanban style of work management could share: does this transform the way you work? Please leave a comment with your experiences!  We are trying to move towards a more agile way of working from a process that is, frankly, quite waterfall in many ways.  Is this something that would be helpful for us in this journey? Trying new things takes time and energy.  Is this something that is worth the time and energy it would take?

30 Days of Agile Testing – Learning Culture

Note that this post is part of a series where I am ‘live blogging’ my way through the Ministry of Testing’s 30 days of Agile Testing challenge.

Today’s challenge is about contributing to the learning culture in my company.  I work at a pretty big company that is split into a lot of different divisions so I don’t know that I can speak to the learning culture of the company as a whole.  Instead I will focus in on the learning culture of the testing team I am a part of.

Testing Team Learning Culture

On our testing team, we approach learning in a few different ways.  We are currently a distributed team spread across 4 offices and so we try to foster learning in ways that accommodate this distributed nature. For example, occasionally during team meetings we will discuss an article that we have all read.  This helps us think about things that might be outside of what we usually do and lets us discuss different viewpoints and approaches to testing.

We also use retrospectives as a learning opportunity to try and see how we can grow and learn from shared problems.  Looking back on problems we have faced as individuals and discussing together ways to address them or think about them is a very helpful learning tactic.

Another thing we have tried recently is sharing ‘tips and tricks’ at our weekly team meeting.  This is a way to share a quick little tip or tool that you have come across that might be helpful to other testers.

One other, very important way we foster a learning culture is through group testing sessions.  We use these sessions as an opportunity to learn new things about the product and to help each other get better at interacting with and testing the product in various ways.  This also gives us the opportunity to observe other people testing (we use screen sharing during these sessions) and thus to learn from their actions in that way as well.

As a testing team we realize that an ongoing commitment to learning is an important part of becoming an ever better tester and so we invest time into this.  Don’t get complacent with where you are.  Keep on learning!