Cambridge Lean Coffee (Hiccupps)

On November 26, 2015, in Syndicated, by Association for Software Testing

Yesterday’s Lean Coffee was hosted by Jagex.  Here’s a brief note on the topics that made it to discussion in the group that I was in.

Automated testing.

  • A big topic but mostly restricted this time to the question of screenshot comparison for web testing.
  • Experience reports say it’s fragile.
  • Understanding what you want to achieve with it is crucial because maintenance costs will likely be high.
  • Aiming to test at the lowest level possible, for the smallest testable element, can reduce the number of screenshots you need to take.
  • For example, to check that a background image is visible on a page, you might verify at a lower level that the image is served, and trust the browser to render it, rather than screenshotting the whole page, which includes much more than the background image (a sketch of such a check follows this list).
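
By way of illustration, here's a minimal sketch of that lower-level check, assuming pytest and the requests library; the URL and test name are placeholders, not anything from the session:

```python
# Minimal sketch: assert the background image asset is served correctly,
# rather than screenshot-comparing the rendered page. URL is a placeholder.
import requests

IMAGE_URL = "https://example.com/static/img/background.png"  # hypothetical asset

def test_background_image_is_served():
    response = requests.get(IMAGE_URL, timeout=10)
    # Trust the browser to render what it is given; just check the asset
    # is reachable, looks like an image, and is not empty.
    assert response.status_code == 200
    assert response.headers.get("Content-Type", "").startswith("image/")
    assert len(response.content) > 0
```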

Why go to a testing conference?

  • It builds your confidence as a tester to find that other people think similar things, make similar decisions, have similar solutions.
  • It also reassures you when other people have similar problems.
  • You are exposed in a short space of time, in a sympathetic environment, to new ideas or new perspectives on old ideas.
  • You can meet people that you’ve only previously followed or tweeted at, and deepen the connection with them. “Testing royalty” is accessible!
  • When you come back, sharing what you found can clarify it for you and hopefully make positive changes to the way you work.

Strategies for compatibility testing.

  • Experience reports suggest reasonable success with online device services (avoiding a stack of devices in-house), although not when high data throughput is required.
  • Reduce the permutations with a risk analysis.
  • Reduce the permutations by taking guidance from the business: what is important in your context, to your customers? (One such reduction is sketched after this list.)
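
To make that concrete, here's a hedged sketch of one way to combine a risk analysis with business guidance: weight each browser/platform combination by estimated usage share and only test those above a threshold agreed with the business. All the names and numbers below are invented for illustration:

```python
# Minimal sketch: rank compatibility permutations by estimated usage share
# and cut those below an agreed threshold, instead of testing the full matrix.
from itertools import product

# Hypothetical shares; real numbers would come from your own analytics.
browsers = {"Chrome": 0.5, "Firefox": 0.2, "Safari": 0.2, "Opera": 0.1}
platforms = {"Windows": 0.6, "macOS": 0.2, "Android": 0.2}

THRESHOLD = 0.05  # agreed with the business: skip combinations below 5% share

combinations = sorted(
    ((b, p, bw * pw)
     for (b, bw), (p, pw) in product(browsers.items(), platforms.items())),
    key=lambda row: row[2],
    reverse=True,
)

for browser, platform, weight in combinations:
    marker = "TEST" if weight >= THRESHOLD else "skip"
    print(f"{marker}: {browser} on {platform} (estimated share {weight:.0%})")
```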

How do you know which automated tests to remove?

  • Some tests have been running for years and have never failed; running them wastes time and resources.
  • Perhaps you shouldn't remove them if the impact of the failures they could one day catch would be too serious.
  • Perhaps there are other ways to save time and resources. Do you even need to save them?
  • Can you run them differently? e.g. prioritise each test and run higher priority tests with greater frequency?
  • Can you run them differently? e.g. run only those that could be affected by a code change?
  • Can you run them differently? e.g. use randomisation to run subsets and build coverage over time? (This one is sketched after the list.)
  • Can you run them differently? e.g. run every suite frequently, but some configurations less frequently?
  • Chris George has a good talk on legacy tests.
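
As a worked example of the randomisation idea, here's a minimal sketch that runs a different slice of a deterministically shuffled suite on each run, so the whole suite is covered once per cycle. The suite, cycle length, and names are all invented:

```python
# Minimal sketch: run a different slice of the (deterministically shuffled)
# suite on each run, guaranteeing full coverage every RUNS_PER_CYCLE runs.
import random

ALL_TESTS = [f"test_case_{i:03d}" for i in range(100)]  # hypothetical suite
RUNS_PER_CYCLE = 5  # the whole suite is covered once every 5 runs

def select_subset(run_number: int) -> list:
    cycle, position = divmod(run_number, RUNS_PER_CYCLE)
    # Seed the shuffle with the cycle number so every run in a cycle sees
    # the same ordering, then take this run's slice of it.
    shuffled = ALL_TESTS[:]
    random.Random(cycle).shuffle(shuffled)
    slice_size = -(-len(shuffled) // RUNS_PER_CYCLE)  # ceiling division
    return shuffled[position * slice_size:(position + 1) * slice_size]

if __name__ == "__main__":
    for run in range(RUNS_PER_CYCLE):
        subset = select_subset(run)
        print(f"run {run}: {len(subset)} tests, starting with {subset[0]}")
```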

Why isn’t testing easier?

  • We’ve been testing software for decades now. Why hasn’t it become easy?
  • It’s bespoke to each solution.
  • Teams often want to reinvent the wheel (and all the mistakes that go into invention).
  • You can’t test everything.
  • Complexity of the inputs and deployment contexts increases at least as fast as advances in testing.
  • Systems are so interconnected these days, and pull in dependencies from all over the place.
  • People don’t like to change and so get stuck with out-of-date ideas about testing that don’t fit the current context.