Saturday, July 21, 2018

Tulsa Techfest 2018 - DevOps and Drive Thrus

On July 20, Tulsa TechFest was held at OSU-Tulsa.

There were 16 different tracks and four different sessions, plus a keynote and lunch.  Not bad for a free conference in town.

The themes of the day seemed to be C#, Agile, Feature Flags, and AI.
Image Credit: Marc Carlson

The keynote was given by Donovan Brown, DevOps guru at Microsoft.  It was inspiring, despite the fact that he was super excited about the decision to dissolve the manual QA team on their VSTS project (some fired, some transferred to other areas within Microsoft).  He did clarify that it was a special case: the team itself was both the subject matter expert on the product and acting as end users before a controlled rollout to concentric circles of customers, who are presumably more and more risk-averse as you move outward.

He spoke about automating all of the things, focusing on the most offensive bottlenecks first, and letting developers bear the burden of whatever bad code they write, both by being responsible for automating their own tests and by taking the 3am phone calls when something blows up in production.

It heavily resembled the recent discussions on Modern Testing, specifically no longer using QA as a safety net that enables less-than-careful behavior from developers, and identifying and removing bottlenecks.

My top takeaways, now that I've had a day to process:

1) Ed Eckenstein had the same experience I had with someone cutting him off at the McDonald's double drive-through.  But since he has done a lot of work coaching on mindfulness, connection, and gratitude since surviving the Oklahoma City bombing, he had a better spin on his drive-through experience: "What is the kindest interpretation of this behavior?"  It's an interesting thought exercise.  I bet his McDonald's opponent didn't get out of their car and start cursing at him with two children in earshot...so my grudge-holding doesn't seem quite so petty.  Still, his point was well taken for all non-fast-food-related matters.

2) Even if developers write the highest-quality code to the best of their ability, QA provides value by acting as subject matter experts.  Donovan Brown said that if his team were writing software for truckers, they would for sure still have some manual QA around.

3) I don't like the term "manual QA", which seems to be the label for anyone who lays eyes on the software without the sole purpose of creating a script.  What's a better term that doesn't make us sound like early primates happily clicking their keyboards at random?  Subject Matter Engineers?  Air Traffic Control?  Pre-Crime Investigators?

4) Continuous improvement is a team effort.  We have to support one another, both to promote a connected, low-stress working environment and to encourage experimentation. Getting better doesn't always work out on the first try.


Monday, July 2, 2018

30 Days of Automation in Testing - #1 & #2

Organized by the Ministry of Testing.

1. Look up some definitions for 'Automation', compare them against definitions for 'Test Automation'.

I googled a bit and found a nice rant that is probably not completely relevant here, since it has to do with the testing vs. checking distinction.  I did find its point about regression testing as a form of version control interesting.  Regression testing is table stakes: it needs to be done, and it's boring and time-consuming.  Automate it.

Call it a check if the person you're talking with cares about the distinction.  Call it a check if the acceptance criteria for creating an automated regression check make it a clear programming task that doesn't need a tester mindset to ensure it was created correctly.  Once it's coded and checks what it's supposed to check, then and only then shall we call it a check.  But ensuring that the check checks what it's supposed to check, either with clear acceptance criteria or by double-checking the check...that's a testing activity.
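To make "double-checking the check" concrete, here's a minimal Python sketch (every name here is hypothetical, invented for illustration): a regression check, plus a meta-check that swaps in deliberately broken code to prove the check is capable of failing.

```python
# Hypothetical example: a tiny regression check for a discount
# calculator, plus a meta-check that proves the check can fail.

def apply_discount(price, percent):
    """Stand-in for the production code under test."""
    return round(price * (1 - percent / 100.0), 2)

def check_discount_regression():
    """The automated regression check."""
    assert apply_discount(100.00, 10) == 90.00

def check_the_check():
    """Double-check the check: swap in a deliberately broken
    implementation and confirm the check notices."""
    global apply_discount
    good = apply_discount
    apply_discount = lambda price, percent: price  # broken on purpose
    try:
        check_discount_regression()
    except AssertionError:
        pass  # good: the check caught the broken behavior
    else:
        raise RuntimeError("check_discount_regression can never fail!")
    finally:
        apply_discount = good  # restore the real implementation

check_discount_regression()  # passes against the real code
check_the_check()            # proves the check can actually fail
```

If the meta-check ever complains, the regression check was passing for reasons that had nothing to do with the software being correct.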

My thoughts on automation in general: use whatever makes your job easier or less boring.  Don't turn off your brain just yet, though.  You have to know what your automation is DOING.

The tool may be making pretty graphs and tables of load test results, but don't trust the numbers unless you know that it's really timing what you intend it to be timing.
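As an illustration of that pitfall, here's a hypothetical sketch in plain Python (not JMeter, and the endpoint is made up): the first measurement quietly counts client-side work as "response time".

```python
import json
import time
import urllib.request

URL = "http://localhost:8080/api/items"  # hypothetical endpoint

# Misleading: this "response time" also counts downloading the whole
# body AND parsing JSON on the client.
start = time.perf_counter()
body = urllib.request.urlopen(URL).read()
data = json.loads(body)
naive_ms = (time.perf_counter() - start) * 1000

# Closer to what you probably intend: stop the clock when the server
# starts responding, and treat the client-side work separately.
start = time.perf_counter()
resp = urllib.request.urlopen(URL)  # returns once headers arrive
response_ms = (time.perf_counter() - start) * 1000
data = json.loads(resp.read())

print(f"naive: {naive_ms:.1f} ms, server response: {response_ms:.1f} ms")
```

Load tools generally report several of these numbers (connect time, latency to first response, total elapsed time); the point is knowing which one your pretty graph is actually plotting.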

Know whether your checks are passing because your software is still awesome or because the logic of your test code always says "everything's fine!"
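Here's a minimal sketch of that kind of always-green check (hypothetical names, pytest-style Python): a loop over an empty result means the first check literally cannot fail.

```python
# Hypothetical example of a check that can only ever pass.

def get_active_users():
    """Stand-in for a call to the system under test; imagine a
    regression made this start returning an empty list."""
    return []

def test_active_users_always_green():
    # Bug in the check: the loop body never executes on an empty
    # list, so this passes even though the feature is broken.
    for user in get_active_users():
        assert user["active"] is True

def test_active_users_honest():
    # Fails (correctly) when the feature breaks.
    users = get_active_users()
    assert users, "expected at least one active user"
    for user in users:
        assert user["active"] is True
```

Run under pytest, the first check stays green even though the feature is broken, while the second one correctly fails.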

2. Begin reading an automation-related book and share something you've learnt by day 30.

I have a work project going on related to JMeter, so why not Performance Testing with JMeter 3 - Third Edition?  I'm an ACM member whose benefits include the O'Reilly/Safari/Packt collection of books, so at least I won't lose any money :-)