Monday, October 29, 2018

Automation Pain Points I: Synchronization

Let me say first that I love what I can do with test automation. It has definitely become an art over the years I've been doing it.

One of the very first pain points I encountered was with synchronization. In those days, I had to build my own tools; this was just before SilkTest went Beta 1. I tried sleeps. They worked poorly.

And when I started at Home Depot, I got into quite the argument about using sleep statements. A co-worker, Clay, got bent out of shape because I had put a sleep of a third of a second into a routine.

Now, mind you, what this routine did was poll for several different controls, every third of a second, until one of a set of conditions was met. He insisted, "No sleep statements!" So I took him for a walk inside WebDriver, where it does exactly the same thing. It was a good example of somebody following a rule because there's a rule, not because the rule was warranted in that case.

Sleep statements are almost never a good solution in test automation. My situation was unusual because there was no other way to perform all of those checks at the same time.
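For illustration, here's a minimal sketch of that kind of polling routine in Python. The name, signature, and 30-second default are mine for the example, not lifted from any framework I've used: it walks a list of checks, sleeps a third of a second between passes, and gives up after a timeout.

    import time

    def wait_for_any(conditions, timeout=30, interval=1.0 / 3):
        """Poll a list of no-argument callables until one returns True.

        Returns the index of the first condition satisfied; raises
        TimeoutError if none are met before the timeout expires.
        """
        deadline = time.monotonic() + timeout
        while time.monotonic() < deadline:
            for index, condition in enumerate(conditions):
                if condition():
                    return index
            time.sleep(interval)  # the contested 1/3-second sleep between passes
        raise TimeoutError("none of the expected conditions were met")

WebDriver's own explicit waits do essentially the same thing: check, sleep a short poll interval, check again, until the timeout runs out.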

But when anybody else puts a sleep in a test, I call them on it. I know just how bad careless use of sleep statements can be: I have made tests take longer than necessary by trying to stretch a single wait to cover all the various response times from the application under test (AUT).

So instead, I ask folks to consider how they, as humans, know the AUT is ready to continue. Is it because a field has become populated? We have assert_not_empty() to cover that condition. Is it because a control exists? We have assert_exist() for that. Are we waiting for it to have a specific value? We have assert_value() for that one. All of these validation routines take a timeout as an argument.
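As a sketch only (these are not our framework's actual implementations; I'm borrowing Selenium's WebDriverWait in Python to show the shape of the idea), each of those checks becomes an explicit wait with a timeout instead of a fixed sleep:

    from selenium.webdriver.support.ui import WebDriverWait
    from selenium.webdriver.support import expected_conditions as EC

    # `locator` below is a Selenium locator tuple, e.g. (By.ID, "order-total").

    def assert_exist(driver, locator, timeout=10):
        # Wait until the control is present in the DOM; fail after `timeout` seconds.
        WebDriverWait(driver, timeout).until(
            EC.presence_of_element_located(locator))

    def assert_value(driver, locator, expected, timeout=10):
        # Wait until the control's value attribute contains the expected text.
        WebDriverWait(driver, timeout).until(
            EC.text_to_be_present_in_element_value(locator, expected))

    def assert_not_empty(driver, locator, timeout=10):
        # Wait until the field has been populated with a non-empty value.
        WebDriverWait(driver, timeout).until(
            lambda d: (d.find_element(*locator).get_attribute("value") or "").strip() != "")

Each call answers the same question a human tester would ask: is the thing I'm waiting for actually there yet? And it fails loudly if the answer is still no when the timeout expires.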

I don't believe in rules for their own sake. In fact, I think the fewer rules we have, the faster development goes. Everything about our framework is about velocity: reducing the time it takes to build and maintain tests. But "don't use sleep except in the most unusual of circumstances" is one rule I do keep.

Keeping Your Perspective

For those of us who've been in IT for many years, the number of new and exciting technologies, paradigms, methodologies and philosophies around us right now can seem overwhelming. Ideas that started in small shops, startups and incubators are now reaching into even the most conservative of industries. Even my own industry, dental insurance, is starting to adopt agile practices.

It can be difficult to find your way through this barrage of new ideas: We want continuous integration and continuous delivery and your framework needs to support our particular flavour of agile but it also needs to support our legacy waterfall apps but we're not going to call them waterfall any more and also we need better reporting and traceability and and and ...

At times like this, I find it very helpful to take a step back and re-orient myself around a single, simple tenet:
   My job, as an automation engineer, is to support quality.

Let's unpack that a little. Quality means different things to different people and organizations. You might measure quality with defect metrics, you might have federally mandated guidelines, and you hopefully have a set of functional and non-functional requirements you are gauging against. Regardless of how you measure it, your job, as an automation engineer, is to do everything you can to help ensure quality.

Functionally, test automation is a component of overall QA. If your core QA practices are shaky, the best automation in the world will not save you. Everything you do as an automation engineer needs to ultimately serve to bolster QA. Whether you are part of a small team or have an impact across the enterprise, this holds true.

I use this tenet every day. We are in the midst of developing a new enterprise-wide automation framework. Keeping my perspective on "support quality" helps me filter through the options when choosing technologies and methodologies. It helps me remember who the stakeholders are when I'm designing elements of the framework, such as reporting. It helps me figure out the how when tasked with something like adding testing to our CI setup. Hopefully it can help you too.

Sunday, October 21, 2018

The Pain Of Test Automation

"So what do YOU think are the biggest pain points for test automation?" Someone asked me. I've been thinking about that for the last several weeks. This is the list I came up with:

Synchronization - Making sure the test doesn't get ahead of itself, and making sure that it can continue.

Resilience - Recovering quickly after the application under test changes.

Interface Proliferation - When test automation libraries get too big.

Networking Problems - Possibly my number one issue at my current gig.

Looking Stuff Up - How much time is spent looking up how to do things. A lot more than most people think.

Data Management - Getting consistent data into the application under test, mocking external interfaces, and so on.

Artifact Aging - What test artifacts to hold on to and for how long.

Reading and Maintaining Other People's Code - Coding standards, training, and so on.

The next few posts will look at each of those in a little more depth, including how I deal with them in my current work.

Stay tuned!