Agile Up to Here: an experience report
If you haven’t heard the term “Agilistry,” don’t worry. It’s not a new development methodology you have to learn to stay current, but there is a good chance you will be hearing more about it.
Agilistry is the name for a training space in Pleasanton, CA, opened by Agile luminary and long-time software development consultant Elisabeth Hendrickson. Known for her immersive and practical software development exercises, Elisabeth has opened a space for software professionals to learn the “true spirit of Agile software development.”
Last week, I had a chance to see if her studio lived up to her claim of “a place where Agile software development professionals come to sharpen their saws and practice their craft.”
I’ve known Elisabeth since 2000, when she came to Satisfice, the company (and training space) my brother created in 1999 to give testers a chance to practice their craft. Ten years later (and partly inspired by her experience at Satisfice), she turned the tables and invited me to see her space in action. Actually, I was just one of 11 guests summoned to Pleasanton to see what she had in mind for her workshop idea called “Agile Up to Here.” (Search #au2h on Twitter for threads.)
As Manager for Corporate Intellect here at Quardev, part of my job is to put myself in places where I can learn new things about software so we can stay competitive. Agile principles and practices continue to come up on more and more of the projects we are asked to bid on.
When she invited me, my main concern was what value I would add to an Agile workshop. In my experience, Agile was about programmers doing all of the testing, and I’m not a programmer. Also, Agile proponents always seem to imply that there are no defined roles for testers because developers do all the testing through unit and acceptance tests.
I expressed this concern to Elisabeth and she was adamant. “Not only is exploratory testing part of Agile, it is a crucial component of it. You are required to be here.” That made me feel better. I trusted Elisabeth because she had demonstrated that although a very fervent fan of Agile, she hadn’t lost her passion for testing.
I’m not a newbie to Agile, but there are tons of people who know a lot more about it than I do. Sure, I’m familiar with the Agile Manifesto and know about story cards, backlogs, refactoring, sprints, Scrum boards, big visible charts, and Test-Driven Development. I was also a stage producer at the Agile2008 Conference in Toronto, hosting the “Questioning Agile” track, and I have worked as a test manager on projects that used facets of Agile.
At Agilistry last week, I was first to arrive (a bag of Seattle coffee in hand to brew for the crew) and found Elisabeth setting up. There were 7 pairing stations, a big rolling whiteboard, index cards of every color everywhere, a few small couches to sit on, a monitor on the wall for the Hudson continuous-integration server to advertise its results, a small fridge and sink area, a printer, a wireless network… and that was about it. A pleasant space in Pleasanton, not over-complicated, but resembling what the Agile conventions suggest – no cubes, no walls, maximized for pairing, transparency, and communication.
Leading up to the workshop, there had been a wiki for us to get to know each other, post our bios and expectations, take advantage of the Twitter hashtag (#au2h), etc., but as people arrived, it wasn’t clear to me what our mission was.
We had our first stand-up – introductions. Everybody was a programmer except for me. Just as I had feared, I was sure I was going to be made obsolete, but I trusted what Elisabeth had told me: that I was a required component, that I would add value by being there.
Alan Cooper from Cooper Interaction Design, author of The Inmates Are Running the Asylum and About Face, told us the mission: He was a word nut. For years he had collected homophones – words that sound alike but are spelled differently and mean different things (e.g. ere, air, and heir). He had a website that listed some of his collection, but a lot of it was tucked away on his hard drive. Furthermore, his site was old – vintage 1997, Web 0.5 (not even 1.0) – and the list was hardcoded HTML.
As Product Owner (not designer), his main objective for us was “Get me out of 1997!”
He didn’t elaborate much beyond telling us what homophones were, but he made enough of an introduction for me to get the gist: we would be building a site for him from scratch in these 5 days. I love challenges like that, especially when they are authentic – a real problem for a real person. Abstract exercises can be fun, too, but I’d much rather provide value to a real person.
Part facilitator, part host, and part programmer, Elisabeth announced that she would need some help configuring the machines. In seconds, she got two of the programmer-types to volunteer — Pat Maddox and BJ Clark helped her configure the pairing stations with the tools we needed: Hudson CI, GitHub, Rspec, Ruby on Rails, and Cucumber.
Jeff Patton, an independent consultant and Agile coach, was also in attendance and emerged as a natural ScrumMaster, suggesting that the rest of us meet with Alan to get an idea of the kinds of things we wanted to see in a new site.
And just like that, without fanfare or ceremony, we broke from our huddle like a team taking the field.
It felt weird. No specs, no design docs, no budget, no buy-in, no high-level meetings, no executives, no paperwork to fill out. Just go and DO.
So as Jeff Patton took the lead to interview Alan Cooper about his ideas for the new site, Dale Emery, Matt Barcomb, Katrina Owen, and I gathered around to listen. Index cards were plentiful and Jeff used them like a sculptor uses clay.
Two hours later, the machines were set up and my group was done talking with Alan – we had enough to get an idea of what he wanted and the board was full of Backlog.
The standup we had after that was simple. After a quick status report, Alan did a brief chalk talk on design, then we set to work, picking the few stories that we’d do the rest of that day – no bickering, no dissension, no turmoil. It just flowed. There was no confusion, no chaos, no tension. It reminded me of that scene in Apollo 13 where the ground crew had to build a filter out of spare parts. Yes, there was urgency and energy around the mission, but there was no clumsiness. People worked together: all anyone had to do was say or suggest something, and a natural affinity formed among the people who agreed. Those who wanted to do something different simply did it and found someone to pair with.
What struck me when I paired with Elisabeth was that TDD seemed like hacking. She would write a test, watch it fail, and then write code to make it pass. The failing test was a good thing, she said. Then she did trial-and-error fixing until the tests passed. She admitted when she was stuck or didn’t know how to do something, and she’d just ask the pair next to her for advice or look it up online or in the API help docs, and a solution would emerge. But I rolled my eyes, because this was just hacking. She was trying different things, not knowing if they would work. That was TDD?!? Come on, really?!?
When I questioned Elisabeth about this, she said something that instantly hit me.
“Yes, experimentation is OK in TDD, but it’s not just trying *anything* – it’s thoughtful experimentation.” In one phrase, Elisabeth caught me judging TDD the same way people attack exploratory testing as just reckless “banging on the keys.” There was a method to her trials, and I didn’t see it because I didn’t know what to look for. Much in the same way, test managers and execs aren’t hip to the language of skills and tactics that testers use when they explore – things like modeling, conjecturing, observing, branching, backtracking, questioning. These words describe what many people walking by would call “playing around,” but when the right language is used to describe what exploration really is, it’s more apt to be understood and taken seriously.
Another thing I chided Elisabeth about was how she found a bug and fixed it in about 30 seconds. The finding and fixing part was cool, but then she took 30 minutes to write TDD tests around it! I thought that was a waste of effort. The bug was found and fixed – why spend all that time writing a regression test for such a little thing?!? Then she explained it to me: it’s not just about regression, but the *process* of creating the test that’s important. The lessons learned in building that test may come in handy later.
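For readers who haven’t seen that rhythm up close, here is a minimal sketch of the test-first loop in plain Ruby. (The team actually used RSpec on Rails; the `homophone_sets` function here is invented for illustration and was not part of the real site.)

```ruby
# A minimal sketch of the test-first rhythm described above, in plain Ruby
# rather than the RSpec the team actually used. The homophone_sets method
# is invented for illustration; it was not part of the real site.

# Step 1: write the test first. Run on its own, it fails with a NameError
# until the method below exists -- and that failure is the "good thing":
# it proves the test can detect the missing behavior.
def test_homophone_sets_are_normalized
  sets = homophone_sets([%w[Ere air heir], %w[pair Pear]])
  raise "expected 2 sets" unless sets.size == 2
  raise "words should be lowercased and sorted" unless sets.first == %w[air ere heir]
end

# Step 2: write just enough code to make the test pass.
def homophone_sets(raw_sets)
  raw_sets.map { |words| words.map(&:downcase).uniq.sort }
end

# Step 3: run the test. Green means move on; red means experiment again --
# thoughtfully -- until it passes.
test_homophone_sets_are_normalized
```

The same loop applies to her 30-second bug fix: the test written around it captures what was learned, not just a guard against regression.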
Again, I felt sheepish. Sometimes I go down a rat hole with a test and it may seem like a waste of time to a stakeholder, but it is what I learn from that “wasteful” test that stays with me. That seemed to me to be a big part of Agile development – learning. In fact, I was surprised (happily so) to learn that when developers do a spate of programming in this trial-and-error way, they call it a learning “spike.” I liked that. I have a word for it, too – “session” – but I didn’t have a word for a smaller period of time, so “spike” is one I can borrow from them.
For the first three days, I did not feel that the site or any of its functions was ready for me to test using my favorite testing approach. I didn’t feel that I would have added *value* by testing what was there. The components were simple, they worked, and testing them in the ways I had in mind did not seem to serve anyone, even me. The risk was low and it was all still under construction anyway.
When developers finished a story and some TDD tests, they would ring a bell and everyone would want to know what was implemented. That turned out to be an important component of feeling we were providing value — a mini-celebration. The bell rang more frequently than I had expected. Progress was very fast, but not sloppy. The confirmatory tests we wrote were working, but I was ready to try something more sinister to expose risks.
By Wednesday, enough of the pieces were coming together where I felt it would be worth it to the team to see what could be wrong with it. So I started pairing.
Then I paired with Pat on a session to explore risks in how homophone sets were presented.
I got to show exploratory testing in action — questioning, adapting, chartering, note-taking, and learning *outside* of TDD. And the programmers were open and receptive. I bounced ideas off them, and they bounced ideas off me. When we found bugs, I was happy, but instead of ringing a bell, all the celebration I needed was to write each one on a red card and put it on the board, making the point Elisabeth knew all along — exploratory testing has an important place in Agile development. And no one complained about that. On the contrary, they reacted with purpose and curiosity to what I found.
I learned that the synergy of Agile programming and testing was not meant to make testers extinct after all. It was a means to learn both sides of two important components of development. In fact, I’d say it was the fun part of the studio environment. It was, as Elisabeth might say, “Agilistry in action.”
Most importantly, in 5 days, we did what Alan asked – we got him out of 1997, replacing his old hardcoded site with a brand-new one.
“You just have to try it for yourself” is a conversation-stopper. It’s usually said when the person trying to persuade you of something has given up on you. But if you set aside that baggage and take them up on the invitation, it might be a profound experience.
After what I went through at #au2h, I was honored to have been invited. I wanted the chance to see if Elisabeth’s studio was indeed a place where “Agile software development professionals come to sharpen their saws and practice their craft” and I left convinced that she had hit a home run in designing the perfect space to emphasize these experiences.
Oh, by the way… did you remember that Alan Cooper was Product Owner? If you want to read his lessons on what happened for him, here it is: http://www.cooper.com/journal/2010/05/agile_up_to_here.html