Wednesday 22 March 2017

NWEWT #2 Growing Testers

Introduction

Last weekend I attended the second edition of the North West Exploratory Workshop on Testing (NWEWT). If you don’t know what an exploratory workshop is or want to know more about NWEWT, read my previous blog post here:

Attendees

The attendees were as follows. The content of this blog post should be attributed to their input as much as mine; the thoughts here were brought together through collaboration:
Ady Stokes
Ash Winter
Callum Hough
Claire Reckless
Dan Ashby
Duncan Nisbet
Emma Preston
Gwen Diagram
Jit Gosai
Marc Muller
Vernon Richards
Vishnu Priya

Growing testers

This year’s theme was “growing testers”, looking to spark discussion on our own experiences of growing as testers and how we help other testers grow. We had a mix of new faces, some of whom were speaking in public for the first time, and experienced people, which led to a nice mix of discussions exploring the topic from one end to the other.

I’m not going to go through all of the talks and everything that was discussed here; I’d just like to blog briefly about the discussions that really struck a chord with me and where my thoughts are on the subject.

Main takeaways

The major takeaway for me was Ash Winter’s ‘wheel of testing’. I really liked this idea, and I think it struck a chord with me because I’m relatively new to managing testers and trying to guide them in their career progression. The more ideas I can explore and adapt into my own, the better, I feel.
Ash explained that the wheel came from his dislike of competency frameworks and the typical talk of growth as a linear path, when really it’s quite a chaotic and winding one. So he came up with a wheel to visualise the different areas a tester could focus on to improve. I’ll let Ash publish and explain his wheel himself, but effectively it contained different core areas of testing, with specialised or more focused subjects radiating outwards. The idea was not to tick off particular areas or push people down any one path, but to demonstrate what paths are available and engage testers in a discussion.

I also liked Marc Muller’s model, which took five areas of testing skills and mapped them onto a radar chart. He asked testers to score themselves from 0 to 10 in each area and used this to get a picture of his team. I liked the simple visual nature of this chart, and just as with Ash’s model it’s a useful tool to open up the conversation with testers on what the different skills mean to them and what they would like to improve.
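To make the idea a little more concrete, here’s a minimal sketch of what such a radar chart might look like in Python with matplotlib. The five skill areas and the scores are placeholders I’ve invented for illustration; they aren’t Marc’s actual categories.

    import matplotlib.pyplot as plt
    import numpy as np

    # Hypothetical skill areas and 0-10 self-assessed scores; Marc's actual five
    # categories weren't captured here, so these are placeholders for illustration.
    skills = ["Exploration", "Automation", "Domain knowledge", "Communication", "Reporting"]
    scores = [7, 4, 8, 6, 5]

    # Spread the skills evenly around the circle and close the polygon by
    # repeating the first point at the end.
    angles = np.linspace(0, 2 * np.pi, len(skills), endpoint=False).tolist()
    angles += angles[:1]
    values = scores + scores[:1]

    fig, ax = plt.subplots(subplot_kw={"polar": True})
    ax.plot(angles, values, linewidth=2)
    ax.fill(angles, values, alpha=0.25)
    ax.set_xticks(angles[:-1])
    ax.set_xticklabels(skills)
    ax.set_ylim(0, 10)
    ax.set_title("Tester skills self-assessment")
    plt.show()

Plot one of these per tester (or overlay a few) and you get the kind of at-a-glance picture of a team Marc described, which then becomes a prompt for conversation rather than a score to hit.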

Several people gave experience reports of what it was like for them to grow as a tester, and I recognised so many familiar aspects of my own career. It seems things still haven’t changed in that respect: people are still falling into testing and only stumbling across the testing community by accident.

Naturally the topic of growing testers eventually led to the topic of “the future of testers”. While we didn’t go too far into this, as it’s a huge topic in itself, it was clear there was a fairly large difference of opinion, and my takeaway is that I’d love to get into it more!

My talk

My interpretation of growing testers had two aspects to it. One was an introspective look at how I’ve grown as a tester and how I manage and attempt to help testers within my team grow. The other was how to improve the growth of testers across the industry. I didn’t feel I was making any interesting points on the former, so in hindsight I wish I had dropped that part. The latter, though, I’ve realised I’m quite interested in and curious about.

I argued that to help grow more and better testers in the software industry, we (society in general, not just the testing community) could be doing more to improve awareness about testing through education. I referred to the example of Scratch which is used to educate children on programming at school - could we be doing something similar for testing or somehow bringing elements of testing into those exercises?

I believe we can. I believe we could be improving how software development in general is taught (or not taught!) throughout education. I don’t mean testing degrees or testing qualifications, though. Bringing testing into education and making people more aware of it could take many forms:
  • The obvious option: degrees or qualifications like GCSEs.
  • Supplemental modules or specialisms within existing computer science, software development or engineering courses.
  • A change in the way programming is taught in existing modules or courses. Rather than focusing on pure coding problems, could we focus on the delivery of software? We don’t have to call it “testing”, but we could be helping programmers get used to the wider challenges of software development and become better advocates for testing. If a programmer recognises the need for a critical eye on their work, even if they don’t call that “testing”, aren’t they more likely to ask for it?
  • A better promoted option in careers discussions. Careers discussions at university were generally quite poor in my experience (from 2010). We all wondered what the hell we could be other than programmers but had no idea. Simply having someone talk to us about the different roles in the real world would have made a difference.
  • A one-off talk from an experienced tester, maybe tied in with the careers discussions.
  • Including assignments where programmers build software that other students will test and project manage. Maybe not very practical, but maybe there is a way to make this work. The best way to demonstrate the effectiveness of testing is to actually try to produce software for somebody else.
  • Introducing ideas and techniques such as pairing, mobbing, code reviews, TDD, BDD, continuous delivery, logging and monitoring. These are not about testing as such, but can be discussed quite easily in the context of testability, and through these subjects we could discuss testing. I also feel these ideas can be introduced even at a young age, at least to get people used to the people skills and communication challenges. Making people more aware of this before they enter work would help, I think.
  • Sandwich courses, where students take a year out from their course to work in industry. If I had understood testing better I think I would definitely have taken this option because testing is a great way to learn about development just as much as it’s a career in itself.

After this conference I’m pretty damn motivated to do more research into how software development in general is taught at the various levels of education. I’m well aware that it may be a large time sink and require some commitment, but I’ve thought about pursuing this avenue for a while now. Having spent the majority of my life in education, I really enjoyed it, and I believe it can be much better and much more inspiring.

Through the Q&A session we had after my talk, it felt like there were mixed feelings on this subject. I think it’s fair to say some people felt that education isn’t the best place to learn about testing, while others agreed with the sentiment around Scratch as a way to perhaps find more testers and spread awareness. I definitely feel there is more to research and discuss here, and that there is something in helping academia improve.

The other side of the interview table

Introduction

I’ve recently been in the privileged position of being on the other side of the interview table for several interviews over the past year. I’ve decided I’d like to share my experience and get some ideas written down.

Reading CVs

So before an interview, you usually need to review CVs and pick the ones you feel warrant pursuing. Why do we pick out CVs? Because interviewing is a costly process: it takes time and focus away from our daily work, particularly in my case at a mid-sized company where we don’t tend to interview on a regular basis. We simply don’t have the time to interview everyone we receive a CV for, so we are forced to filter them down.
My general approach for this was the following:
  • I read through the CV thoroughly - everything on the CV is a small clue about the person.
  • I looked first for some sign of personality in the CV, something that told me why this person was looking for work and what motivates them to work.
  • I noted any skill that I thought may be relevant, not just programming skills. For example, skills with Business Analysis tools or experience on a Support team. Anything that could be valuable and bring something different to my test team.
  • Depending on the role we were looking for, I would review the years of experience.
  • I would make a note of any certifications. I personally don’t put a great amount of value on ISTQB certifications, but I considered them just the same as any other training a candidate might mention.
  • I always looked for some mention that the person attended meetups, conferences, workshops or is somewhat actively engaged with the testing community. While this doesn’t rule people out (as it’s pretty rare that I see it on CVs), when people do mention it, it makes them stand out.
  • I would carefully analyse the wording chosen, especially when talking about skills or previous employment. While I wouldn’t necessarily reject a CV because of a typo, it’s pretty embarrassing when people have them in sentences such as “I have a keen eye for qaulity”.

My experience so far hasn’t included much of the initial CV collation and filtering; I have only done it once or twice, with sets of 10 or 12 CVs. If I were filtering a stack of 100 CVs, I probably wouldn’t be as thorough reading them and might be more arbitrary about the criteria I reject them on.

My general experience with this part of interviewing is that there is not much right and wrong here. Only you can decide what a “good” CV is and what matches your criteria for the role. I have my own personal preference for people who add a little personality to their CV, with opinions and motivations, but other people may value lists of skills or abilities more highly.

I will say though that many, many people seem to have very, very similar CVs, which makes it hard to pick a few to take forward to interview. This is why you may end up using pretty arbitrary rules for filtering and it also biases you towards those CVs that look a bit different. As an interviewee you can use this to your advantage, but as an interviewer I feel you need to be careful not to let this bias lead you too much. Sometimes a dull CV hides a gem of a candidate!

Preparing for the interview

Who is the person? What do I want to find out?
If it’s been quite a while, or if I’ve been quite busy with other work between reading the CV the first time and the date of the interview, I will start by refreshing my memory of the CV. I will try to think about what I like about this person from the CV that I want to see more of, and try to think of questions that will give them the opportunity to impress in these areas. Equally, I will also look for areas that I dislike and try to think of questions that explore these. Some examples of these I’ve had in the past:
  • A tester mentioned working closely with developers and managing the relationships with them - I’ve asked them to expand on that, what’s worked well, what hasn’t etc.
  • Some CVs have simply listed skills without describing their level of experience or confidence with them, or how they’ve used them. So I’ve targeted questions at those skills to try and explore where they really are with them. “I know Java” would usually prompt questions from me about how they’ve used it and how confident they are with it, even some specific questions about it.
  • Some CVs have described their previous testing experience mainly in terms of “producing Test Cases and Test Plans according to the specifications”, which prompts me to probe quite a bit about the candidate’s feelings on exploratory testing and how they would handle an environment without many written test cases.
Because everyone’s CV is different, I end up with a different set of questions each time. Currently I feel this is a little inadequate, because I end up with inconsistent or biased opinions on candidates when I’ve asked better questions of some than others.
Interview format
Something I’ve not had much chance to experiment with yet is scripting or planning the interview format. But I feel there are several variables that could change and that I could experiment with:
  • How many people are going to be involved in the interview?
  • How long will the interview be?
  • Will we include a technical test?
  • How many interviews will we conduct with each candidate (e.g. 2nd stage or 3rd stage interviews)?
  • Do we ask different questions or the same questions to each candidate? Do we stick to a script?
  • Do we ask the candidate to perform homework or a task before the interview?
  • Do we ask the candidate to conduct a task (such as a presentation) during the interview?
I’ve been in various interviews with a mix of the above and I’m undecided on what does and doesn’t work. However, it’s worth considering and planning these things before the candidate walks through the door! I also feel I can improve how I learn from each interview and compare them. I would like to spend more time in future making sure the experience for each candidate is more consistent, and to keep better notes on them. In other words, I feel I need to plan how I am going to make the decision on which candidate to choose, rather than leaving it to gut feeling and all of its biases.
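As one hypothetical way of making that decision less about gut feeling, here’s a small sketch of a post-interview scoring sheet in Python. The criteria, weights and candidates are all invented for illustration; the point is only that agreeing the criteria up front and scoring straight after each interview makes candidates easier to compare consistently.

    # Hypothetical post-interview scoring sheet. The criteria and weights are
    # invented for illustration; agree your own with the other interviewers
    # before the interviews, and score each candidate straight afterwards.
    WEIGHTS = {
        "testing mindset": 3,
        "ability to learn": 3,
        "communication": 2,
        "relevant experience": 1,
    }

    def weighted_total(scores):
        """Combine 0-5 scores per criterion into a single weighted total."""
        return sum(WEIGHTS[criterion] * score for criterion, score in scores.items())

    # Example scores for two made-up candidates.
    candidates = {
        "Candidate A": {"testing mindset": 4, "ability to learn": 5, "communication": 3, "relevant experience": 2},
        "Candidate B": {"testing mindset": 3, "ability to learn": 3, "communication": 5, "relevant experience": 4},
    }

    for name, scores in sorted(candidates.items(), key=lambda c: weighted_total(c[1]), reverse=True):
        print(f"{name}: {weighted_total(scores)}")

The totals shouldn’t replace your notes or the discussion between interviewers, but they do force everyone to score against the same criteria, which is exactly the consistency I feel I’m currently missing.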

The interview itself

Think about your performance
Whether you are the interviewer or the interviewee, my number one rule is to think of interviews as a two-way conversation. Both parties are interviewing each other to figure out if they like each other. As the interviewer I feel it’s important to respect this even if the candidate doesn’t, and to give them plenty of opportunities to ask questions. Not only that, but I try to keep discussions as honest, informal and friendly as possible. The more it can feel like chatting casually in a cafe or a bar, the better, because both interviewer and interviewee are going to think of better questions and answers.

With this in mind, I try to be cautious not to assault the candidate with lots of questions one after another. It’s not easy to describe when it makes sense to hold off and give the candidate space; it depends on several factors:
  • The personalities of everyone in the interview.
  • The mental state of the candidate.
  • How difficult the questions being asked are.
  • How the conversation has been going (i.e. sometimes the flow is so natural that we may be chatting fairly casually and rattling through lots of questions and that’s ok).
  • How much time we have.

I’ve noticed that people very rarely ask questions after the interview, despite being told they can. While I still encourage this, I’ve taken it to mean it’s very important that the interviewee gets the chance to ask as much as they can during the interview. If possible, I try to see what I can learn from their questions, not just from the answers they have for mine.

Multiple interviewers
All of the interviews I’ve conducted have been with other interviewers in the room asking questions too. The worst thing that can happen is when you trip over each other, interrupting or awkwardly looking at each other to ask the next question. This is why preparing the interview format and discussing a script or questions beforehand is important to me. You get so little time with candidates that you have to spend every minute, every second, very carefully. For this reason, I absolutely hate it when an interviewer pursues a line of questioning that has already been covered or that I don’t consider very useful.

What would a script look like? Would it be a set of strict questions, one after another, that we would follow to the letter? No, of course not. As I said earlier, it’s important to keep the interview casual and informal, letting it flow with the candidate and adapting all of the time. I would like to try scripts in future where we plan out what kinds of questions and discussions we would like to have and assign an interviewer to “lead” each one. So someone would conduct the introduction, outro and facilitation of the interview, another would ask deeper questions on a topic, and so on. I would still allow each interviewer to interrupt or go off script, but the key is to try and make sure we get the most out of the interview while keeping it natural.

It’s all about opportunities, not tests
If you are thinking of including some kind of task, examination or test of the candidate to assess their skills, bear this in mind - do not look for failure. What do I mean by this? Interviews are very compromised things: there is a lot of pressure involved, and people don’t perform anywhere near how they do when they work normally. It is rarely an accurate representation of what the person is like to work with. With this in mind, I try to view questions and tests as opportunities for the candidate to impress me. If the candidate misses or messes up these opportunities, I try to keep in mind that this may be due to the unusual pressure. If I view the interview as a series of opportunities to impress, I avoid placing too much emphasis on particular parts of it and look for more well-rounded candidates. It also means people have a chance to recover: they may mess up the start of an interview but relax and impress later, or they may impress in their preparation but fluff their performance because they are not comfortable with interviews. I’m also open to my own questions being terrible and the candidate impressing me in a way that I didn’t expect, on something I didn’t ask them about.

Life is continuous learning and lessons
Even if you don’t hire them, make sure to always give feedback to the candidate, and if there are areas they didn’t know or understand, take the opportunity to teach them if possible. You may not be hiring them, but it is almost impossible for a candidate to improve if they never receive feedback. I used to find it very frustrating when no-one ever told me why I didn’t get the job; even if I had done nothing wrong, it would have been helpful for my confidence to know the reasons.

Interviewing testers

So what about testers? What do we talk about and discuss, what is important for testing? My first reference for this is Dan Ashby’s excellent interview mindmap found here:
‘Nuff said! But some additional thoughts for me:
  • Discussing definitions of “testing” and why people like testing is important, because everyone has different ideas and understanding. This is as much about making the candidate feel comfortable with what they are applying for as it is about establishing that they are the right fit for us.
  • Discussing “agile” or “devops” is also an opportunity to make clear how we work. I’m not looking for people to rattle off dictionary definitions of these words; I want to understand what they think they mean and how they adapt to topics that affect testing. It’s also a chance for me to explain what I believe they mean and how the company has interpreted or implemented those ideas. The discussion and shared understanding is the important part, not testing the candidate on definitions.
  • In terms of technical tests or exams, I’m very sceptical. While there may be certain contexts where you are looking to hire testers with programming experience, I personally don’t view programming as a key testing skill. However, if I could design a technical test that gives a good picture of how capable a candidate is of learning technical subjects, I would try that! I value testers with the right attitude and approach and the ability to learn a great deal; already knowing programming is useful but not critical. The critical ability is the capacity to learn. I’ve worked with and hired great testers who knew little about programming and contributed a lot of value, if not more value than those who knew programming.
  • I’ve experimented with tests of candidates’ testing abilities and seen different ideas, but again I’m unconvinced how much you can judge from them. You can try to assess them on bugs they find in an application, or explore their lateral thinking skills with a task such as mind-mapping a pencil. I’ve seen some interesting results from these tasks, but I’m concerned that they bias us towards candidates who are great on the spot. I suspect there are great testers who don’t perform very well in these situations but are excellent given more time and less pressure.

Summary

  • It’s rare that we are trained how to interview, so it’s worth spending time planning how you are going to learn and improve, because it is an area that has particular skills and considerations like any other.
  • I’ve got several areas I’d like to focus on improving or learning more about in future, particularly around planning and facilitating interviews.
  • Based on your own experience as an interviewee, it’s easy to feel interviews are about asking lots of questions and testing the candidate. But the best interviews are the ones you make into a more natural and informal chat.
  • Opportunities to impress, not testing for failure!
  • Make sure to always take the time to give feedback, especially if you don’t hire the candidate. Tell them why you are not hiring them, so they can improve.