[u-u] On testing Re: jobs @ rakuten/kobo

Ken Burtch ken at pegasoft.ca
Sun Aug 6 09:20:40 EDT 2017


Hi Folks,

Regarding automated testing in general, there are a lot of myths that 
don't seem to be supported by evidence.

* As Dave suggests, unit tests are often written to get the test 
coverage green, rather than to ensure the program works as intended.  
This is one of the risks of hiring someone other than the programmer 
to write the tests, as the tester may not know what the program is 
supposed to do.  (A sketch of the difference appears below.)
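
To illustrate, here is a minimal sketch in Python (the function and 
both tests are hypothetical).  Both tests execute every line of 
discount(), so a coverage tool reports 100% either way, but only the 
second test would catch a broken calculation:

    def discount(price, percent):
        return price - price * percent / 100

    def test_coverage_green_only():
        # Touches the code but asserts nothing meaningful, so
        # coverage goes green even if discount() is completely wrong.
        discount(100, 10)

    def test_intended_behaviour():
        # Encodes what the program is actually supposed to do.
        assert discount(100, 10) == 90
        assert discount(100, 0) == 100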

* Unit testing 100% of the program is often seen as a good thing, 
when it can be a liability.  Parts of a program may exist for future 
features, or as defensive checks that are good style for catching 
bugs during development, and are not meant to run under normal 
conditions.  Yet they show up as apparently dead code in coverage 
reports (as sketched below).
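
A minimal sketch (hypothetical function): the defensive branch below 
is good style, but no normal test should reach it, so a coverage tool 
flags it as unexecuted.  Tools such as coverage.py let you exclude it 
explicitly instead of contriving a test for it:

    def state_name(state):
        if state == 0:
            return "stopped"
        if state == 1:
            return "running"
        # Defensive "can't happen" branch: kept as good style to
        # catch bugs during development, but never run normally.
        raise ValueError("impossible state: %r" % state)  # pragma: no cover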

* Some types of programming are cost-prohibitive to unit test, such 
as features that are temporary or in constant flux; such things may 
not be worth the effort.  Other features, like handling out-of-memory 
or out-of-disk-space errors, may not be worth simulating because the 
operating environment itself becomes unstable and unpredictable under 
these conditions, and the test results may not mean much (see the 
sketch below).
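
A minimal sketch in Python (the function and test are hypothetical): 
a full disk can be faked by patching the I/O call, but that only 
exercises the handler inside a healthy process.  It says little about 
a machine that is genuinely out of disk, where logging, temp files 
and the test runner itself may also be failing:

    import errno
    from unittest import mock

    def save_report(path, data):
        # Returns False when the disk is full, True on success.
        try:
            with open(path, "w") as f:
                f.write(data)
            return True
        except OSError as e:
            if e.errno == errno.ENOSPC:
                return False
            raise

    def test_disk_full_is_handled():
        # Fake "No space left on device" by patching open().
        err = OSError(errno.ENOSPC, "No space left on device")
        with mock.patch("builtins.open", side_effect=err):
            assert save_report("/tmp/report.txt", "data") is False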

* Too many managers assume 100% unit test coverage means a flawless 
program, and they don't consider other forms of testing, such as team 
walkthroughs (e.g. to ensure all exceptions are caught and all 
possibilities are identified and handled).

* Integration testing (i.e. requirements testing, black-box testing) 
is often neglected.  Even if program components test successfully 
individually, it doesn't mean that they function correctly when 
assembled together (as sketched below).  Ideally, integration testing 
should be separately evaluated with its own test coverage, but it 
seldom is.
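
A minimal sketch (both functions hypothetical): each component passes 
its own unit tests, yet they disagree about a date format, so the 
assembled pipeline fails.  Only a test across the seam catches it:

    from datetime import datetime

    def render_timestamp(dt):
        # Component A: writes dates as DD/MM/YYYY.
        return dt.strftime("%d/%m/%Y")

    def parse_timestamp(s):
        # Component B: expects ISO dates, YYYY-MM-DD.
        return datetime.strptime(s, "%Y-%m-%d")

    # Each component's unit tests pass in isolation:
    assert render_timestamp(datetime(2017, 8, 6)) == "06/08/2017"
    assert parse_timestamp("2017-08-06").year == 2017

    # But the integration fails: B cannot read what A writes.
    # parse_timestamp(render_timestamp(datetime(2017, 8, 6)))  # ValueError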

* Many claims are made for TDD itself, such as better design, faster 
development and eliminating documentation (since the tests themselves 
theoretically describe the project requirements), but I have not 
found these to be true.

In regards to documentation, in my last couple of jobs the developers 
argued that documentation slowed them down (deflected them from 
development) and was pointless when they were the only programmers on 
a project.  They had little interest in having their projects ready 
to hand off to another developer should they leave.

When I introduced unit testing, the developers took offense, saying 
that testing questioned their skills and slowed them down.  Their 
concerns about speed were not entirely unjustified, as management 
often measures job performance by short-term speed in closing work 
tickets, not by quality, adherence to industry standards, or ease of 
maintenance, which can be bigger performance factors in the long term.

Robert L. Glass, in "Facts and Fallacies of Software Engineering", 
cautions that there is no single good approach to testing, as each 
method has different strengths and weaknesses: a person must examine a 
project from different angles, using different approaches, to ensure 
good quality.  Unit testing, unfortunately, makes pretty Jenkins graphs 
that impress management, so developers often use that as an excuse to 
stop testing early and get back to development.

IIRC, Glass also suggests that requirements-oriented testing (i.e. 
BDD) is not sufficient on its own for thorough testing, for the same 
reasons.

Being a test engineer takes a special personality, as many people 
would find debugging other people's code all day long immensely 
boring.

Ken B.

On 2017-08-05 9:14 AM, David Collier-Brown wrote:
> The wording suggests a group doing "code first, test whenever".
>
> And with people other than the authors writing the tests, they may 
> easily test for a particular implementation rather than a desired 
> result. That was endemic at WG.
>
>  My recent experiments with BDD suggest black-box testing at a high 
> level for success first, then cautiously for regressions that were 
> user-visible.
>
> --dave
>
> On 05/08/17 09:02 AM, D'Arcy Cain wrote:
>> On 08/04/2017 10:22 PM, Anthony de Boer wrote:
>>> William Park wrote:
>>>> I couldn't help notice that this position says "Automation 
>>>> Engineer" in title but says "Software Developer in Test" in 
>>>> description.  I assume it's similar to "Test Automation Engineer"?
>>>
>>> Welcome to the 21st century, where more time and effort goes into 
>>> coding
>>> the automated regression testing of a new feature than needs to go into
>>> coding the feature itself.  :-)
>>
>> Which is a good thing.  Better than "Welcome to the 20th century 
>> where more time and effort goes into debugging regressions than went 
>> into coding the feature itself."
>>
>

-- 
Ken O. Burtch
Open Source Entrepreneur, Linux Architect & Author
905.562.0848 / ken at pegasoft.ca  / www.pegasoft.ca/homes/kburtch.html


