Friday, January 15, 2010

Testing Legacy Apps

I'm making the slides from my talk today available here for all to see and slam. The slides themselves are fairly content-sparse, because I really do hate having a bunch of text on slides when I'm talking. People try to read while I'm talking, or I end up reading text that's clearly on the slides; neither seems productive. So the slides are clipart-heavy. In addition, I'm also posting the mind map from when I was creating this presentation.

The presentation centers on my most recent few years of experience: how to target a test strategy at a legacy system. I've seen a surprisingly large number of companies that either "go agile" after having only manual testing for a while, or that have been around for a while and just never had a test team at all.

In these cases, trying to create a test team or convert a team of manual testers to more "agile" testers, oftentimes while the whole organization is transitioning, is an incredibly daunting task. Personally, I've gotten overwhelmed at times with all of the things that need to be done -- how do you automate when there are still all of those manual tests to run? My presentation relies heavily on the expert advice (via books and blogs) of Mike Cohn and Michael Feathers. In any case, here are the slides and the mind map (in PDF format).

Please provide feedback ... I think this is a talk I may try to give more often, and I would love to ensure that I have the broadest base of input I can get. Thanks in advance :)

Welcome to the virtual Agile world!

So, today I gave my very first conference presentation. It was really my first presentation outside of my own team or business scope. I gave a talk on "Testing Legacy Apps" with a high-level strategy for how to tackle organizing the testing effort for a legacy (read little to no automated test harnessing) application. I've been on teams undergoing this effort several times in the past few years. Sometimes, it's a company that has never had a test team at all. Sometimes, it's a company that has had testing, but all testing was done manually and there was no automation at all. (In this case, by "automation", I am talking about things like unit tests, automated GUI tests, and/or all of the computer-executed tests in between.)

In any case, the point of this post is to talk about SecondLife, which is where I gave my talk. SecondLife is a virtual world, where people are represented by avatars, and they interact with a virtual world, created by others in that world (or themselves). I've said that I was giving a talk in SecondLife many times in the past few weeks, since AgileBill4D (Bill Krebs) asked me to speak.

The response is always the same facial expression. The look that says "Ohhhhhh, you're living an alternate, orgy-filled, sexually deviant life on the side, eh?" Being me, I tend to launch immediately into a defense of SecondLife, now that I have been in it. I have to admit that when I first met Bill and he talked about SecondLife, I thought the same thing. But over time, his descriptions of what he was doing in there have kind of grown on me. Let me tell you what Bill is doing there in SecondLife.

Bill has created an entire area in SecondLife for Agile training. He uses the space to hold training classes around the world, where he can create a space that looks much like an ideal immersion space (I'm thinking Agilistry Studios, via Elisabeth Hendrickson). Without having to spend money to travel to client sites around the world, he holds classes in SecondLife and teaches the concepts of agile. With a headset, he can talk as if on a conference call, or even an individual phone call (as if through Skype). With the ability to converse and share things in the virtual world, he can even pair program with someone sitting anywhere else in the world!

This picture shows my SecondLife avatar, TestyDawn Darkmatter, in front of a scrum board at the Agile space.

If I understand correctly, in the text chats, there is a translator available, so that the two people speaking don't even have to be typing in the same language! The translator will translate between the languages, so that each person can type in their native language. I can imagine that this would really help with some common cross-language communication issues!

I understand that some colleges and universities are using SecondLife to teach classes, too. It seems much like the distributed learning classes I have seen in webinar format, or like GoToMeeting. Since SecondLife offers both text and voice chat, classes can be held for a broad audience. Something I heard today that I had never thought of was that people who are deaf can still participate through the chat portion. It really was great to realize that this virtual world doesn't shut out people with disabilities.

The conference looked much like any other small to moderate sized conference. There was a large room with many rows of big blocky black chairs, and a HUGE screen in front. There was even a podium in the corner for the presenter to stand near, all real-life-like.

In preparation for the conference, I became accustomed to navigating through SecondLife. It takes a bit to get used to moving your avatar versus controlling camera angles and views. You can zoom, pan and tilt the camera in order to get a better look at things; after a few hours, it starts to feel natural. Someone also asked me about customizing the avatar, and I'd like to address that as well. When you sign up, you choose a generic avatar to represent you in SecondLife. There are free things you can pick up to customize that avatar, and there is a whole built-in component for changing your appearance. The appearance covers the actual avatar body -- things like skin, hair, eyes, body type, etc. It reminded me of the Sims, honestly. Then, out in the world, you go and find (or buy!) clothes, maybe hairstyles, jewelry, shoes, accessories, etc. This is something you can spend as little or as much time on as you want. I did spend a few hours shopping for some professional clothing in order to give my talk.

After a hard evening of shopping, I spent a little bit of time relaxing by the campfire. The crackling of the fire sounded very relaxing.

I am sure that there are places inside SecondLife that are not things I want to see or people I want to interact with (tho I bet it would help if I could find a skirt that actually covered my avatar's tush!). I can say that I have not run into these places or people. I have kept to the Agile space and Rockcliffe University (next door, I think) for the most part (other than clothes shopping), and there is a lot of neat stuff in those places.

As far as giving the talk went, I bet it was very similar to giving an in-person talk. The major thing I noticed (and that made me nervous) was that I couldn't see anyone's facial expressions or body language! Because of that, I had *no idea* whether people were paying attention or interested. It made the interactivity slightly more difficult, though it picked up at the end of my talk. I had a PowerPoint slide deck, and though Bill made it so I was able to click through the slides, I asked for his help and had him click the slides for me (I was not confident enough in my camera navigation ability!).

Overall, it was a pretty neat experience and I'd totally be willing to do it again. I suggest checking it out. Look for me (TestyDawn Darkmatter) or Bill (AgileBill4D) for a little guidance if you want.

Here are a few links for some more information:

SecondLife (the software requires a download, but there are multiple platform versions)

Agile Dimensions (in SecondLife)

Agile Bill Krebs (the guy behind the curtain)

Tuesday, January 12, 2010

A funny story

You have to have a sense of humor to be a tester. Seriously, you have to laugh at things that **nobody** else would **ever** laugh at. You have to be able to see the most crazy, off-the-wall things and rather than lose your mind, laugh at them, and then deal with them.

If you're a developer, especially a developer who has worked for years without a QA department or a professional tester at all on your team, you don't *have to* have a sense of humor, but it sure makes everyone's lives easier and makes for a room full of awesomeness if you do.

Let me tell you a funny story.

I had this bug fix to test. The bug was that when submitting a commerce-type action, if the name of the group was greater than 50 characters, the commerce transaction failed. The solution was to truncate the name (in sending the transaction) if it was greater than 50 characters.
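The fix itself is easy to sketch. Here's a hypothetical illustration (the 50-character limit comes from the bug report; the names are mine, not the team's actual code):

```python
# Hypothetical sketch of the fix: truncate the group name before it is
# sent in the commerce transaction, so names over the limit can't make
# the transaction fail.
MAX_GROUP_NAME_LEN = 50  # the limit from the bug report

def prepare_group_name(name: str) -> str:
    """Return the group name, truncated to the transaction's limit."""
    return name[:MAX_GROUP_NAME_LEN]

long_name = "x" * 120
print(len(prepare_group_name(long_name)))  # prints 50
```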

Still in learning mode, with lots of manual testing aiding that, I went into the UI of the QA server so that I could create a group name that was > 50 characters. The field in the UI prevented me from doing that. So I pulled the developer over and asked him what was going on. I figured if the UI prevented it, great, then the problem wouldn't happen -- but what about records in the database that already had group names greater than 50 chars? Had we run a db script to modify them? Had the UI change just been done to prevent this bug from happening again?

He explained to me that the UI change had been done some time ago, but that we had not modified original records in the database. In those cases, the name would be truncated when the commerce transaction went through. So I went into the database to create myself a record with this quality and then I planned on going back into the UI to attempt the commerce transaction.

SQL Server told me no go on creating a group name like this: "Test One this is a really long group name because I want to test greater than 50 characters and I had to make a long name". It turns out, the database constraint on this field was a varchar(50). WTF? The database would not let me have a value that exceeded 50 characters, so how the heck could this issue even EXIST??

So this time I slid my chair over to the dev desk and asked how that could *ever* happen, and why were we fixing this bug that could not possibly exist. He said to me "Wait, here's how we are going to make it happen." I sat and watched him right-click on the table in SQL Server Mgmt Studio, choose Design..., and then go in and change the field to varchar(100).


He laughed too and then tried to explain to me that the production database really had that column set to varchar(100) so it was okay. I started laughing harder and said "WAIT! WHAT? The QA database doesn't even MATCH the production database?!?!?"

(as completely crazy as this story sounds, and please take your jaw off the floor, the following is an example of why I really love this team)

At about this point, the dev team lead came over and asked what all the hubbub was about. After we laughed through the explanation, during which I was described as being "a bit dramatic", which I was (I can kinda get a bit OMFGWTFLOL when I encounter these types of things), we agreed that we needed a QA web site backed by a QA-only database that was an exact copy of what was on production. We also agreed that from this point on, all changes to the database would be made on development first, where dev/test would check them out, and they would be scripted such that when dev code was promoted to the QA site (let's call this the "exploratory" site), the database changes would be promoted along with the code.
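For what it's worth, a simple automated schema comparison between environments would have flagged the mismatch before it cost anyone an afternoon. Here's a minimal sketch (the column data is made up; in practice you'd pull it from each server's INFORMATION_SCHEMA.COLUMNS):

```python
# Compare column definitions across two environments and report any
# that differ. Each dict maps "Table.Column" -> type definition.

def schema_diff(env_a: dict, env_b: dict) -> list:
    """Return (column, a_def, b_def) tuples for every mismatch."""
    diffs = []
    for col in sorted(set(env_a) | set(env_b)):
        if env_a.get(col) != env_b.get(col):
            diffs.append((col, env_a.get(col), env_b.get(col)))
    return diffs

qa = {"Groups.Name": "varchar(50)"}
prod = {"Groups.Name": "varchar(100)"}
print(schema_diff(qa, prod))  # flags exactly the mismatch in this story
```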

It was as easy as that. The three of us laughed about this craziness, figured out a way to set things up so that we weren't doing that one small thing in a crazy way any more, and just like that, we solved the issue and moved forward. As funny as the example is, I think it's a really great example of how little by little, a cooperative, respectful team can get things done.

And it's so much more fun when we can all laugh about it. Good thing I have a sense of humor!

Saturday, January 9, 2010

Let's not forget what makes a good tester a good tester

In the past year or so, I've spent more time than not trying to find really good "agile testers" to hire. In this search, I have also had many conversations with other people about what I am looking for, what they are looking for in hiring, and how we can find what we are looking for.

Lanette Creamer asked a question recently about whether people searching for agile testers seem to be starting to focus too much on technical skills over *testing* skills. (By the way, Lanette will be looking for a job soon -- if you have an opening or might during this year, I suggest taking a look at her blog, Testy Redhead, and contacting her for further discussions.) As someone who has and will continue to hire testers to work on agile teams, I'd like to lay down my priorities in what I look for, and make an appeal to other hiring managers not to forget the qualities that make a really great tester.

So, if I had to lay out a bulleted list of what I am looking for in a person to join my team, it would look something like this:
  • Passion: I am looking first and foremost for someone who is passionate about what they do. I want someone who enjoys testing, someone who will be on my team for the long haul and not just use it as a stepping stone.
  • "The testing mindset": This one is hard to quantify, and I am relatively certain that I won't do this description justice. But here's a shot at it: I believe that many great testers are naturally curious, naturally analytical, and critical thinkers. I believe they are particularly good at seeing through a problem down to its root issue. I believe they usually enjoy puzzles and logic problems. Many very good testers I have known are the kinds of people who see a typo or grammatical error in a publication and it sticks out to them like a sore thumb.
  • Technical ability: Coding ability is icing on the cake for me. I love to have it, but if it's not there explicitly, something that indicates an ability to learn to code is all I really need. I've said for a while that I think it's *much* easier to teach coding skills than "the testing mindset".
So when I am sitting in front of a pile of resumes, what process do I go through to try to find these people? It is by far not a perfect process, but here is *roughly* how I try to find great testers.
  • Before I have resumes, I must have posted a job description. I try to make my job descriptions stand out from the norm. My classic role model for this is the Atlassian QA Engineer job description. I do not require experience on an agile team, automation experience, or coding skills.
  • A resume that stands out: (Let's assume I always do resume scanning/filtering on my own rather than using a recruiter.) I found last year that a large percentage (just over 50%?) of the resumes I received were very bland. Many of them appeared to be nothing more than a broad list of technology keywords squished onto a few pages. Unfortunately, listing a whole bunch of technologies you have heard of (you can't possibly be a master of all of them!) tells me *nothing* about what you actually *do* or how you do it. I look for a resume that describes your accomplishments and tells me what initiatives you are going to bring to my team. I also look specifically for signs of continuous improvement, both in yourself and in your roles.
  • During phone screening, I can usually get a good feel for whether passion exists or not. I tend to ask candidates about themselves, what they do, what they like about testing, and what they don't like, and what made them apply for my position. I tend to ask the typical biggest accomplishment and biggest disappointment type questions. I learned this year to ask questions about what they do outside of work (favorite blogs, books, websites, etc) and to pause long enough to let them say more than just their answer. Some people make it pretty clear at this point that they really want to do something else (like be a developer) or that they just need a job, period.
  • For those that I want to talk to in person, I've started giving them a testing exercise (thanks to Patrick Wilson-Welsh) to do between the phone interview and in-person interview. I've chosen to do it this way because I don't really want to filter out people just because they are nervous in an in-person interview. In this example, if they can follow and test Java code, that's great, but if they don't have that particular skill, I send them to just the target website of that exercise for testing, and ask them to jot down a list of what they found and questions they have. I encourage them to communicate with me and ask questions freely. I have actually had people "self-select" at this point and refuse to do it.
It's worth noting that this exercise is what gives me the most information about their analytical and testing skills. Their responses to this will tell me a few things. I will get to see how well they think through the problem and which things they have questions on. I'll get to see how comfortable they are asking questions (I want someone who is not afraid to ask questions). Hopefully, I will also get to see them find some bugs and evaluate how they report those bugs.
  • The in-person interview: It has been suggested to me that I work through the testing exercise in more detail with them in person -- kind of like a pair-testing exercise. In the future, I believe I will incorporate this more, because it makes sense. In general, though, by this point, I am fairly comfortable with the major qualities and fundamentals. During this stage, I tend to describe the current environment, along with the biggest challenges and hurdles to jump. I ask them how they would approach some of the real problems or tackle some of the actual challenges.
Obviously, there are many, many resources available out there on how to find good people. Social networking has become a great source of specific recommendations for job applicants and shouldn't be overlooked. Johanna Rothman has an entire blog dedicated to Hiring Technical People. Books have been written on interviewing, and lots of websites offer tips and advice on questions to ask. I'd like to hear from you about particularly good or bad interview experiences (from either side of the table) that you've had. What have I missed here?

Thursday, January 7, 2010

My first published article

Oh yay!! My first article is out! Lisa Crispin co-authored this with me, and it's kinda cool to see it :)

(you do have to register, but it's free)

I can't possibly post this, however, without also giving a shoutout to all of the awesome women in the latest ST&P Magazine: Women of Influence. Find it at

Sunday, January 3, 2010

Nervous about an interview? Try this!

I think for most of us, being on the interviewee side of an interview is *hard*. Personally, I am always really nervous when I feel like I am being judged. I don't like being the center of attention. I have trouble with public speaking for these same reasons. An interview is a judgment of my professional abilities, and not being offered a position after an interview is a *rejection* that hits a nerve. It's enough to make anybody nervous.

In public speaking, many people have little tips and tricks to help with being nervous. They may tell you to picture everyone naked (does anyone actually *do* that??), or to focus on a single person, make eye contact, and pretend like you are speaking only to them.

For interviewing, I found a way of looking at the interview that seemed to really help alleviate my nervousness before and during an interview: I pretended like I was just chatting with someone I had met at a conference.

I went to Agile 2009 in August, and I met a great many people who were in all kinds of different positions. Some were testers, some were developers, some were coaches, some were UX experts. Some were consultants, some worked for very large companies, some very small. The range of experiences and situations was pretty big. I found that conversations were easy to get into with every single person I met. I'd ask a bit about where they worked and what they did, and what they were hoping to get out of the conference. I'd tell them a bit about me and what I was hoping to get out of the conference, too.

Sometimes someone would say something about an issue they were trying to solve that I had experience with, and I would be able to share some of my experience with them. Of course, this was a two way street, and I got a lot of great advice too. The point is that I did this enough times that in just a few short days it became pretty natural. If you've been to a large professional gathering, I'd guess that you've done much of the same thing.

So the next time I found myself on the interviewee side of the table, I told myself that I was just talking to another person I had met at a conference. I think that this idea addresses two main points of an interview: it helps me to learn about the company I am interviewing for (after all, an interview *is* (or should be) two-sided), *and* it helps them to get insight into what *I* can do for their team should they hire me.

When I ask them about what they do, I learn more about the company, the team, and the role than a job description alone provides. I can ask specific questions, like what a day in the life of this role looks like. When I ask them about their biggest pain points, I get insight into where their weaknesses are, and whether these are things I want to deal with or tackle.

Asking about biggest pain points is doubly beneficial, though, because then I can also look for things I can relate to. I look for things similar to what I have experienced in the past. When I find them, I can then discuss my own similar experience and what happened in that situation. In doing that, I provide the interviewer with information about how I can help them, how their hiring me will benefit them and make me the best candidate for the position.

Is it possible that I won't find anything I can relate to? I suppose, but I don't think it's all that frequent. If it happens, it may mean that I'm not a good fit for that position anyway.

I also think that to whatever degree I can pull this attitude off, it takes some of the pressure off of me too. I learned many years ago that I interviewed the best when I didn't really *need* the job, likely because I wasn't paralyzed with terror for my job status.

So, take a deep breath, relax, and just chat. Your passion and expertise will come out naturally. :)

How well should testers know the customer?

In an ideal world, our whole team (all of those people involved in creating a product) would all be on the same page, and the team would include customers. The customers on the team would be able to tell us immediately when something doesn't make sense from a domain context, and we would not waste any time on the things that don't make sense.

In my experience, few teams are ideal. So let's assume that you don't have the cell phone number of your customer, *and* let's assume that much of the development team is not intimate with the deepest details of the domain of their product. Are there still ways to minimize waste? Sure there are ... Here is a story I tell people a lot, and I figured I would share it here.

I worked in a job once where I was in charge of running the QA team of a small start-up. Their software focused on the bioinformatics researcher. We made heavy use of statistical algorithms for genetic comparison -- sometimes pairwise searching, sometimes aligning, sometimes tracking experimental data over time. Our software did a lot of visualization of this data, so that these researchers could easily track down the information they were looking for to guide their next step in research.

The company was small enough that it didn't really have anyone who represented a BA kind of role. We had a "Project Manager", and "Software Leads", but QA really ended up representing the customer in most things. At times, we ended up actually being customer support as well. I had noticed while we were growing that the new developers lacked domain knowledge, and many times, when we would submit bugs ("These results can't possibly be correct"), they had no idea what was incorrect because they didn't immediately see *why* it was incorrect.

So one of the things I did to combat this issue was train my QA team in the depths of domain knowledge. We learned how to perform the algorithms BY HAND. We split the algorithms between us, and each became an "expert" in a group of these algorithms. We spent a lot of time learning the command-lines and working through examples manually.
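To give a flavor of what "performing the algorithms by hand" looked like: pairwise alignment scoring, for example, can be worked through cell by cell on paper. This toy sketch is a generic Needleman-Wunsch-style global alignment score -- not our product's actual algorithm, and the match/mismatch/gap values are arbitrary assumptions:

```python
# Toy global-alignment score (Needleman-Wunsch style): fill a dynamic
# programming table where each cell holds the best score for aligning
# the prefixes a[:i] and b[:j].

def alignment_score(a: str, b: str, match=1, mismatch=-1, gap=-2) -> int:
    rows, cols = len(a) + 1, len(b) + 1
    dp = [[0] * cols for _ in range(rows)]
    for i in range(1, rows):  # aligning a prefix against nothing costs gaps
        dp[i][0] = i * gap
    for j in range(1, cols):
        dp[0][j] = j * gap
    for i in range(1, rows):
        for j in range(1, cols):
            diag = dp[i - 1][j - 1] + (match if a[i - 1] == b[j - 1] else mismatch)
            dp[i][j] = max(diag, dp[i - 1][j] + gap, dp[i][j - 1] + gap)
    return dp[-1][-1]

print(alignment_score("GATTACA", "GATTACA"))  # identical sequences score 7
```

Working a small table like this by hand is exactly the kind of exercise that builds the intuition to look at a results chart and say "these can't possibly be correct."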

As a result, I believe that my little circle of influence was pretty efficient. The testers could look at a visualization chart and immediately flag the parts that were problematic. We could understand what the customer was talking about when they came to us and reported an issue. We could even ask them the necessary questions to pin down the source of the issue (most of the time!).

Since that time, I've worked on a few other teams where many people on the team (dev, tester, etc) did not have depth of domain knowledge. I have seen how many issues slip right on through, because nobody in the technical team realized there was a problem (and we all know how expensive things are to fix once they're out in the wild). Troubleshooting issues that *are* identified takes longer, because the root of the issue may not be immediately clear.

So, how well should testers know the customer? Well, testers and developers alike should have as much domain knowledge as possible, and should be as intimate with the customer as possible. Especially in the absence of an actual customer (or of constant access to one), the more domain knowledge the technical team has, the more likely they are to find issues early and track them down easily.