No liars please, we’re testers.

I’ve been prompted to write this after sifting through another load of dross CVs. I say another because I was recruiting heavily last year: a recruitment drive that took me twelve months to find just two test analysts. Hold that thought.

It’s not that I didn’t get many applicants the first time round. I did: hundreds, quite literally. I duly read and scored every CV personally, and gave feedback too. Not only that, I had two of my peers review each CV as well. You can’t ask for more than that. If two out of three of us agreed, the decision was final.

The reason it took a year was down to the quality of the CVs. At least half of the CVs we read didn’t state the daily tasks that you would expect a career tester to be doing. Only if you were playing at being a tester would you make that mistake. Unfortunately, sometimes the CV would tick all the boxes and we would invite the applicant in for interview. Imagine their horror when they were faced with a simple exercise to test their SQL skills, even though on their CV they had waxed lyrical about how they practically spoke to their friends in SQL because they had used it so much they were fluent.

“So, I have a table called customer, with the fields ID, Name and Address. How would I get all of the records from the table?” You should see some of the answers. It’s often hard for me not to shout “just give up, I know you haven’t got a bloody clue despite what it says on your CV” as they muddle on: “GET ALL RECORDS FROM TABLE WHERE NAME == CUSTOMERS AND ID…”
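For the record, the answer we were fishing for is a one-liner. A minimal sketch using Python’s built-in sqlite3 module, with the table and fields from the interview question (the sample rows are made up):

```python
import sqlite3

# In-memory database standing in for the interview scenario.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (ID INTEGER, Name TEXT, Address TEXT)")
conn.executemany(
    "INSERT INTO customer VALUES (?, ?, ?)",
    [(1, "Alice", "1 High St"), (2, "Bob", "2 Low Rd")],
)

# The answer to "how would I get all of the records from the table?":
rows = conn.execute("SELECT * FROM customer").fetchall()
print(rows)  # every record, every field
```

That’s it. Anyone who was genuinely “fluent” would reach for `SELECT * FROM customer` before the question was finished.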

Imagine them also recoiling in their seats when I ask them to draw out the V-model and label it.
“What’s that you say? You don’t know how? But here on your CV you state you are a senior test analyst and have an ISEB Foundation certificate. How do you not know the V-model?” They shuffle their feet and mumble that they studied at home; well, not so much studied as bought a guide on how to pass that’s full of example questions.

I watch in wonder as their faces contort when I ask them “so what is the difference between white box testing and black box testing?” I let them fumble through telling me how they have used both of those “methodologies”, and I follow up with “can you give me some examples of where you white box tested?”

Then come the questions on web testing (it’s what we do, after all). “So, what’s a cookie?” I ask. The smile: easy, they think. “It’s a virus you get from visiting sex sites.” Oh my! And what should I do as a tester? “Never, ever accept cookies, they track all your movements, like little spies in your computer.”
I follow with a simple exercise about shopping carts and sessions to see if the candidate understands why a cookie may be important here. “The system gets all that info from the cookie.” But how did it get in the cookie? “From the internet.” Can I see it? I want to see my cookie. “Oh no, you can’t see them, they are secret.”
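For anyone still wondering: a cookie is just a named value the server asks the browser to store and echo back on every request, which is exactly why a session ID in a cookie can tie a shopping cart to a visitor. A small sketch with Python’s standard http.cookies module (the cookie name and value are illustrative):

```python
from http.cookies import SimpleCookie

# What the server sends: a Set-Cookie header carrying a session ID.
server_cookie = SimpleCookie()
server_cookie["session_id"] = "abc123"
header = server_cookie["session_id"].OutputString()
print(header)  # session_id=abc123

# What the browser sends back on the next request. The server parses it
# and uses the session ID to look up that visitor's shopping cart.
parsed = SimpleCookie("session_id=abc123")
print(parsed["session_id"].value)  # abc123
```

Not secret, not a virus: plain text, visible in any browser’s settings or developer tools.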

I have even had to terminate several interviews because it became apparent very quickly that the applicant in the chair didn’t actually know what was on their CV because they had just copied it off a (delete as appropriate) friend / colleague / LinkedIn profile.

It was so bad in the past that we set up an online quiz. It was very simple: multiple choice, some questions around testing, some around our domain, and some very easy questions like “Which of the following is a search engine?” with an obvious answer: Google. We discovered a side effect of an easy question like that: we could see how fast the candidate answered when they knew it straight off the cuff (about 9 seconds for that question) and compare it with a testing-related question that took them 3 minutes to answer (did they have to search for the answer?). The test was very easy for a career web tester, but not so easy for an IT support person or a BA, or even a developer who fancied a move into testing. Its only real purpose was to filter out the complete time wasters.
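The timing heuristic is trivial to apply. A sketch of the idea, with thresholds that are illustrative rather than the ones we actually used:

```python
# Flag candidates whose answer times suggest they searched for the answers:
# near-instant on the giveaway question, suspiciously long on the rest.
def looks_like_a_googler(answer_times, easy_question, threshold=120):
    """answer_times maps question text -> seconds taken to answer."""
    knew_the_easy_one = answer_times[easy_question] < 15
    stalled_elsewhere = any(
        t > threshold for q, t in answer_times.items() if q != easy_question
    )
    return knew_the_easy_one and stalled_elsewhere

times = {"Which is a search engine?": 9, "What is boundary value analysis?": 180}
print(looks_like_a_googler(times, "Which is a search engine?"))  # True
```

A crude filter, but then its only job was to catch the complete time wasters.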

So here I am again: I’m hiring, I’m inundated with CVs, and again 50% are pure wasted bandwidth (I don’t give them the luxury of printing them out). But this time I don’t have the online test, and I’m gnashing my teeth at some of the unbelievable stuff in these CVs. Some of them read like horrible blog postings: “on this job we had this challenge and so we had to X because of Y, but then Z happened and so we used plan A…” blah blah blah, “then the business wanted B but I wrote the very detailed spec of C”. Bletch, grrr, spit, pfft. It’s all I can do to stop myself posting these fetid monologues online for no other reason than ridicule, and I hate myself for it.

So, faced with the prospect of interviewing a load of (let’s be blunt here) bullshit artists only to show them the door at the end of it, I’m not overjoyed. I don’t want to spend two hours of my life demonstrating why a candidate is a liar. I don’t want to be associated with these bottom feeders in a professional sense either. I loathe them, and I loathe the arseholes that gave them a “consultant” badge at Logica (or any other faceless body shop), because now they think they are god’s gift and we should roll out the red carpet for them.

I will continue to sift through the dross. The cream always floats, and that’s what I’m after: the cream, the crème de la crème.

So if you are interviewing a tester who tells you that you gave them a much easier ride than a previous interview they attended, you know I rejected them, and you may want to make use of that probationary period.

But if you’re a tester whose CV isn’t straight up and down, you may want to rethink applying for a job with me.

Oh, and by the way: don’t put “I have a keen eye for attention to detail” and then litter your CV with spelling mistakes, poor grammar and mixed styling!

Hokey Cokey or Hocus Pocus

Back in September 2007 we released a new version of our search application.
The new version was a step change for us. At that time we were powering the core of our search offering with an Oracle database and a Java application that returned flat HTML. It was all very Web 1.0, and we had begun to see issues with the performance of the site; we discovered that throwing 8 more servers into the Oracle grid didn’t give us 8x more power. We took the Oracle database out of the mix and brought in Endeca search.

The Endeca API allowed us to show visitors how many of the things they were searching for were available before they submitted the search form. For example, if you were searching for a BMW 5-Series, the fuel type drop-down on the search form would list the number available next to each option [Petrol (5), LPG (2)]. So, a big change from the “build your search, submit it and hope it returns results” model we had previously used. To make this feature work we had to use Ajax, or more specifically JSON: as the user changed their criteria, the relevant drop-downs were updated without refreshing the form. So, like I said, a step change for the front end, the back end and user behaviour.
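The mechanics are simple to sketch: each change to the form fired an Ajax request, and the JSON that came back carried the counts used to redraw the drop-downs. Something like this, where the payload shape is illustrative rather than our actual API:

```python
import json

# Illustrative JSON payload returned after the user picks "BMW 5-Series".
response = '{"fuel_type": {"Petrol": 5, "LPG": 2, "Diesel": 0}}'
counts = json.loads(response)["fuel_type"]

# Rebuild the drop-down: label each option with its count, and drop
# anything the back end has contracted out of the possible results.
options = [f"{fuel} ({n})" for fuel, n in counts.items() if n > 0]
print(options)  # ['Petrol (5)', 'LPG (2)']
```

In the real application a JavaScript handler did this on every change event, which is where the animated counters the users ended up waiting for came from.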

The new version was released in stages, inviting visitors to the site to try the new version. This tactic has its own associated problems (for example, only a certain type of person will follow a “try our new X” link, so your new application doesn’t get exposure to a good representation of your audience). Once visitors had interacted with the new search form, we invited them to give us some feedback so that we could improve on what we had done. Below is a selection of that feedback:

It crashed 5 times and slow.
Takes longer, too complicated, should be kept simple
Too slow!!
Not as easy to use
Very slow to bring up menus, Spent time waiting.
It doesn’t work – my searches keep spontaneously disapearing (Cars)
is slow. maybe is my broadband problem.
I don’t want to know how many cars in the uk, I just want to know how many in my local area
It’s silly to have to click ‘show your results’ it was better on the previous version where it showed the results.
Too slow in uploads.
More criteria = more time.
Too many details to put in .
More options, as not 100% encyclopaedic knowledge of cars, the sub model option was difficult .


So, pretty damning stuff. But something didn’t make any sense. We had rigorously tested the performance of the system and were confident that it was faster than the old system. The market-leading browser back then was IE6, and given that we had engineered and built it for IE6, it positively flew in Opera or Firefox. So we were perplexed. That is, until we did some usability testing (I won’t discuss the fact that the usability testing came too late in the project to be really beneficial).

The usability testing did allow us to understand why we got so many “slow” comments in the feedback. Faced with all the feedback the new search form gave them as they refined their search, users believed two things: 1. that they had to fill in all of the options, and 2. that they couldn’t interact with the form until the animated counters stopped moving.

Manufacturer, Model, Variant, Trim, Colour, Fuel Type, Mileage, Age, Min Price, Max Price, Distance from the visitor. As the user slowly changed each of the drop-down controls on the search form, some options would become unavailable (greyed out), because the back end had contracted them out of the possible results. If no red BMWs were available, Red would not be selectable on the colour drop-down. So the user would change, say, Model to 3-Series, find there wasn’t any Red available on the drop-down, back up and change 3-Series to 5-Series, and so on. They didn’t realise they could just search for all the red cars within 20 miles of their house and drill down from there. To some extent they still don’t, 2 years on.
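The contraction the users were fighting is, at heart, just filtering the remaining stock and recomputing each facet’s available values. A toy sketch (the inventory is made up):

```python
# Toy inventory standing in for the search index.
cars = [
    {"model": "3-Series", "colour": "Blue"},
    {"model": "3-Series", "colour": "Black"},
    {"model": "5-Series", "colour": "Red"},
]

def available(facet, criteria):
    """Values of `facet` still present after applying the user's criteria."""
    return sorted({
        car[facet]
        for car in cars
        if all(car[k] == v for k, v in criteria.items())
    })

# Pick 3-Series and Red vanishes from the colour drop-down...
print(available("colour", {"model": "3-Series"}))  # ['Black', 'Blue']
# ...but starting from Red would have found the car immediately.
print(available("model", {"colour": "Red"}))  # ['5-Series']
```

The order you apply criteria in doesn’t change the final result set, which is exactly what the users never discovered.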

It reminds me a little of when I was working on a project with BT and the then-new System X exchanges. The switches could support loads of (then) new features (things we take for granted today, like 1471 in the UK). Being a geek, I was amazed at what I could do with a DTMF (touch tone) phone, and went out immediately and bought one. The next day I asked why BT hadn’t publicised any of the features and capabilities. Their response was immediate, dry and serious: “Our users won’t understand them.” I can still remember how I felt, almost like I had stumbled into some great conspiracy. BT wanted to keep people in the dark, and protect them from the nasty technology that might confuse them.
It was several years later that I received a booklet with my phone bill that explained the features and how to access them. Having used the features for some time at that point, I had great difficulty understanding the booklet. Maybe BT were right; maybe it was all too confusing.

Fast forward to now and my current project. Again, another release and another step change. This time the look and feel of the site has been overhauled. The back end is still Endeca-powered, but the Java app has been completely rewritten, and in rewriting the application we have taken the opportunity to bake testing in from the start. The JavaScript, the cascading style sheets and the HTML are all tested automatically. Regressions should be a thing of the past (but that’s another blog post): the application has unit testing, plus functional and non-functional testing applied at every check-in. The functional testing has been expanded into “user journey” testing, in which likely user scenarios are played out. All of this happens automatically before the application reaches QA. Then the QA team goes to town, with time to exercise their real skill: exploratory testing. So there you have it; never in the history of our company has a product been so well inspected. We felt pretty confident when we were ready for beta.

This time round, instead of inviting users to try out our new site, we employed A/B testing: 5% of our traffic was diverted to the new site, and once again users were invited to leave feedback. I also set up a Google Alert to spot the beta URL in forum or blog posts, so I could keep track of what the community was saying.
Once again the feedback came in…

The used car search, the old one is much clearer to use and a lot better, . The new “improved” one is poor.
Preffered old site looked more proffesional and was easier to use.
The search criteria should be your main focus and keep that in a clear box format like your old site and allow people to search quickly but also as specifically as they want.
The old site is much better the new site is more complicated to use in the end I shut it down and went on to ebay please change back.
It looks much better than the previous website, but since I dont live in UK, I usually have to copy and paste the London postcode from the FAQ page. Unfortunately, I cannot find the page.
Bad design. Not as easy to use and selct options, not as clear and concise. the old one was perfect.

Erm, what? The old one was better? Perfect? Now we are confused.

So again we tackle the perceived issues of our users. We keep seeing comments about missing images, and we start pulling apart the application, the infrastructure and the network. It turns out it was an ad blocker that had decided the way we format our image URLs (cache busting) means they are adverts, and blocks them.
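For the curious: cache busting just means stamping a token into an asset URL so that a new release forces browsers to re-fetch it. The trouble is that some ad-blocker filter rules match on URL patterns, and a stamped image URL can look enough like an ad-server URL to be blocked. A sketch of the general scheme (the URL format is illustrative, not our real one):

```python
# Append a release/version token so browsers re-fetch assets after a deploy.
def cache_busted(url, version):
    separator = "&" if "?" in url else "?"
    return f"{url}{separator}v={version}"

print(cache_busted("/images/car123.jpg", "20090914"))
# /images/car123.jpg?v=20090914
# An overzealous filter rule that matches query-stringed image URLs
# can mistake a URL like this for an advert and block it.
```

Same page, same images, and a class of user who will only ever report it as “the pictures are missing”.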
People complain of slow loading times, so I begin to conduct some testing around that. I conclude they may be right, so we engage with Gomez to find out for sure. Gomez shows something alarming: users on a half-decent (2Mb and above) broadband connection will get a decent experience, but people on anything less are going to be pulling their hair out. The Digital Britain report suggests that most of the UK has 3Mb broadband, so do our users just have slow connections? Regardless, I have begun some work on improving perceived page load times, and will roll those requirements into cross-cutting requirements in the same way as we do for SEO and DDA compliance. We are going to lighten the page weight and strip out the heavy jQuery that is only used to titillate. We are going to build our own analytics into the front end to let us see the users’ experience in real time (current render times etc.), and we are moving some of the content under a new host so the browser can fetch it in parallel. All of this should help the users with slow connections.

But what about the “it crashes my browser” comments? Our in-page analytics trap JavaScript errors and report them, and while our users suffer at the hands of errant JavaScript squirted into the page by third-party advert traffickers, our own code is solid. So what’s this crash?

We contacted a customer who had left his details and asked if he could walk us through the “crash” while we followed along step by step in the office. At the point where he claimed his browser had crashed, we were watching the light box “melt away”, something we had designed in. His expectation was that the light box would work like a tab, and that he could tab between the photos and the detailed specification of the vehicle, not melt away to the bottom of the screen. So now we will remove the animations on the light boxes (and other objects).

What have I learnt?

Three things:

1. Next project, I’m running the usability testing, with real scenarios and everything.
2. Perceived performance is more damaging than actual performance.
3. BT may have been right…
