Internationalization was only the beginning…

November 2, 2009

Just a brief (and, unfortunately, rare) post to share a terrific piece on QA Hates You about the upcoming changes to web domains. Here is the post. Go read it. Go ahead, I’ll wait here…

This is the kind of thing that just gives me nightmares. Not only will it be fun worrying about validation on email address fields and the like, but trust me – it is SO MUCH FUN trying to explain the difference between “language,” “locale,” and “script” to the uninitiated. Especially when said uninitiated is in upper management and laboring under the delusion that “Mandarin” is an actual written language (for reference, it is a spoken dialect), and he is always right. Not that I’m bitter.
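For the uninitiated, a BCP 47 language tag actually keeps those concepts separate: language, script, and region (the locale’s territory part) each get their own subtag. Here’s a minimal sketch that pulls a simple tag apart; the subtag names are real IETF codes, but the parsing is deliberately simplified and ignores variants and extensions:

```python
def parse_language_tag(tag):
    """Split a simple BCP 47 tag like 'zh-Hant-TW' into its parts.

    Simplified: assumes a well-formed tag of the shape
    language[-script][-region], with no variants or extensions.
    """
    parts = tag.split("-")
    result = {"language": parts[0].lower(), "script": None, "region": None}
    for part in parts[1:]:
        if len(part) == 4 and part.isalpha():    # script subtag, e.g. 'Hans'
            result["script"] = part.title()
        elif len(part) == 2 and part.isalpha():  # region subtag, e.g. 'CN'
            result["region"] = part.upper()
    return result

# Same language ('zh'), two different scripts, two different regions:
print(parse_language_tag("zh-Hans-CN"))  # Simplified script, mainland China
print(parse_language_tag("zh-Hant-TW"))  # Traditional script, Taiwan
```

Which is exactly the point: “Chinese” as a written system splits along script and region lines, not along the lines management assumes.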

Other folks in the organization (management, developers, etc.) seem not to worry about issues like this until it’s too late. But here we are in QA, the old worry-warts, fretting away. Well, at least when everyone is in a tizzy over it sometime next year, we can say “told you so!”

Snow Leopard and a Hard Drive Hissy Fit

September 4, 2009

I upgraded my MacBook Pro 17″ to Snow Leopard this week. It went pretty well. What they say is true – startup times are noticeably faster. For all the apparent tweaks that went into Snow Leopard, what it ended up costing me ($24.99, I think, on Amazon) was definitely worth it. Except I also had to spend $49-ish to upgrade my Parallels to version 4. But I probably should have done that a long time ago anyway. So I guess this time we’ll call it even.

In a story with not such a happy ending, my dev laptop at work went belly-up on Tuesday morning. I arrived to a BSOD, then it wouldn’t boot at all. IT took it and analyzed it for me, and while the hard drive was OK physically, the file system was apparently corrupted. IT had to re-image my machine, which means I lost everything. (To give my IT dude credit, he did try to hook my drive up as a slave to another one, in order to recover any data he could, but it just wouldn’t work.)

THANK GOD FOR SUBVERSION. All my deliverable test documents, both final and draft versions, were checked in; along with my test logs, notes, JMeter scripts, and every other thing I thought might be important. However, I did have to re-install all of my software. I took the extra step of adding some free backup software and scheduling a daily backup of all my important data and settings to my network share, just in case. Apparently IT makes sure this is happening on my corporate machine, where I naturally do no important work at all; but when it comes to protecting my dev machine where the real work happens, I’m on my own.
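The backup step is simple enough to sketch. Everything below is hypothetical for illustration – the paths, the folder layout, and the plain shutil-based copy (my real setup used scheduled backup software, not a hand-rolled script):

```python
# A minimal sketch of a daily "copy my work to the network share" backup.
# Paths are made up; a real setup would point dest_root at a mounted share.
import shutil
from datetime import date
from pathlib import Path

def backup(src, dest_root):
    """Copy src into a date-stamped folder under dest_root."""
    dest = Path(dest_root) / f"backup-{date.today().isoformat()}"
    # dirs_exist_ok lets the same day's backup be re-run without failing.
    shutil.copytree(src, dest, dirs_exist_ok=True)
    return dest
```

Scheduling something like this daily (Task Scheduler, cron, whatever your OS offers) is the part that actually saves you; the copy itself is trivial.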

OK then. Lesson learned.

Why Won’t “Sneakernet” Go Away?

July 26, 2009

A few weeks ago I was testing some code that still wasn’t completely finished. The functionality was mostly there, but it still needed a lot of polish, and some things just didn’t work at all. The developer I was working with at the time brought me an installer…on a thumb drive. Why didn’t he check it into Subversion, where I could easily find it? I don’t know. Why didn’t he email it to me on our secure internal network? I don’t know. Whatever, I copied the installer and ran it and got started.

It wasn’t long before I ran into problems, and had to call the developer. He fixed these problems, so I could continue, by…bringing me some additional files and things on a thumb drive. Grrr!! Now I’m getting frustrated. This started a day-long cycle of me trying to test, things not working, and the dev coming to my desk with more stuff on a thumb drive, sitting down and taking over my computer, and copying/tweaking/hacking to make things work.

I thought I was going to kill someone.

I hate it when someone decides he has to “drive” my computer in order to fix something. If it needs that much developer intervention, it’s not ready for testing, end of story. Take it back to YOUR desk, troubleshoot it, unit test it, fix the damn code, and bring it back when it’s ready. Strike that! When it’s ready, check it into Subversion and tell me where it is, so I can get it. Or at least email me a package that contains everything I need.

This isn’t the first time I’ve been given things to test or work with via a thumb drive. Sometimes it’s on a CD-R, but whatever the medium, the point is – why won’t this “sneakernet” method of sharing data and code go away? It drives me crazy! I worked on a project last year for a while that had a single developer. He wasn’t checking stuff into Subversion at all. When I asked him about it, he said everything was on his local drive (*shudder*), and backed up to a – you guessed it – thumb drive. This wasn’t a small, personal project, either – it was a major IRAD effort which he had been working on mostly by himself up to that point. But it had requirements and a spec, and was soon to have three other developers (and a tester – me!) working on it as well.

Obviously, priority one was to get all the code checked into Subversion in an organized way. But why was a professional developer working that way in the first place? WHY WHY WHY? These kinds of questions drive me crazy. I hate it when an organization or project calls itself “quick and agile,” when it’s really just “lazy and undisciplined.”

And that is my tester’s rant for today.

Quite a Threesome

May 5, 2009

The developer, the project manager, and the tester. On most of the projects I’ve worked on, this is the dynamic. Sometimes there is more than one developer; sometimes the project manager is also the developer, or one of the developers. Larger projects might also include a requirements analyst or manager, a technical writer, and an interface designer. But on all but the very largest projects there’s usually only one tester (that’s me!). And on most of the projects that I’ve worked on in the last few years, which tend to be smaller and more agile, it’s just the unholy threesome: developer(s), project manager, tester.

It’s interesting how the different personalities of these roles take shape and present themselves. Since I can only speak to my own experiences, I am the constant in my observations and anecdotes. I have a certain philosophy of testing and it rarely varies, though it has certainly grown and changed subtly over the years. What’s surprising is that despite having worked with many different individuals, a lot of them seem to share a world view when it comes to software projects, based on what roles they fulfill.

The Project Manager: Some are good, some are bad. The best are organized, and aware of the importance of a schedule (which can be flexible) and project milestones. Occasionally you get a really awful one who is completely unwilling to commit to a schedule, work breakdown, milestones, or anything like that. In terms of their attitudes towards testing, a lot of project managers seem to consider only testing against the requirements, or verifying the basic functionality of a system. The concept of negative testing (attempting to create error conditions, less-than-perfect environments, unexpected user behavior, user error) seems novel to them.

The Developer: A study in contrasts. On one hand, most developers are very good at thinking of negative tests, and give suggestions for possible negative test cases (some useful, some completely labyrinthine and over-the-top). What often surprises me about developers is that they’re so good at thinking of negative tests for ME to run, but so bad at putting error handling into their own code. I guarantee you, you can almost always trip up even the best developer with the simple trick of testing field constraints. Enter 1,000 characters into a username field that’s probably limited in the database to 15 or so, and watch the application puke. Fun!
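That field-constraint trick is easy to sketch. The validate_username function and its 15-character limit below are assumptions for illustration, not any real application’s code – the point is just what a boundary check, and the negative test that exercises it, look like:

```python
# Hypothetical username validator with an explicit length constraint,
# mirroring a database column limited to 15 characters.
MAX_USERNAME_LEN = 15

def validate_username(name):
    """Reject usernames that are empty or longer than the field allows."""
    if not name:
        return False
    if len(name) > MAX_USERNAME_LEN:
        return False
    return True

# Positive test: a normal username passes.
assert validate_username("alice") is True
# Negative test: 1,000 characters should be cleanly rejected,
# not passed straight through to the database to blow up.
assert validate_username("x" * 1000) is False
```

Code without that guard “works” in every demo and pukes the first time a tester leans on the keyboard – which is exactly why the negative test earns its keep.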

The Tester (yours truly): I have to do both – test to requirements and verify functionality, but also test negative conditions and check for error handling and try to break the darn thing to see what happens. I have to point out the bugs in a polite and supportive way, so nobody’s feelings are hurt. I have to nudge the project manager to put together the schedule, or at least commit to some informal milestones, then keep track of them myself. Sometimes I have to beg for more hours.

Once, a project manager asked me to create test documentation for a system, and then test it, using only the assumed correct use case scenario. That is – assume a perfect environment as specified, a perfectly-trained user, and no error conditions, hardware failures, quirks or abnormalities. To me, that kind of testing is completely useless! What does it prove? It proves that in a perfect world, your software functions exactly as described. But there is no perfect world! We all know this, yes? Heck, in a perfect world, all code would be bug-free and all systems would be perfectly designed and built, and I’d be out of a job completely!

So until that perfect world comes about (and pigs fly, and hell freezes over, and I vote Republican, etc., etc.), I keep doing both kinds of testing, and fulfilling my role in the unholy threesome.


March 30, 2009

How can you tell when something has gained enough traction and acceptance in the software development field that it can no longer be ignored or marginalized? When it’s popular enough to be the basis for a scam.

Here, Michael Bolton blogs about a supposed “agile testing certification.” It looks like a total scam to me right from the start – lack of details or links to other organizations, use of copyrighted images, and mostly the horrifically botched English spelling, grammar, and punctuation. But to his credit, Michael attempts to do some research into who the “World Agile Qualifications Board” is, and just why they have any authority to offer certifications such as “Agile Practitioner Certificate” and “Agile Master Certificate.” I won’t link to the scam site here; please read Michael’s post, where he provides a link.

Here’s what I predict will happen to anyone who actually tries to sign up for the first “training” offered:

1. Response will include instructions for paying the £990 fee, possibly via credit card but most likely via some obscure method such as money order or wire service.

2. Once they’ve tricked enough suckers into paying up front, right around May 2009, the website will disappear.

Pathetic? Yes. But also kind of cool – if someone has bothered to make a scam based on agile testing, it must be pretty well accepted in our industry, right? At least that’s my positive spin on it. 😉

QA or…?

January 30, 2009

Here is an interesting post on Evil Tester called, “Don’t call me a QA!” Fortuitous, as I was just thinking about this very issue, that of testers being referred to as “QA” or quality assurance.

I understand that in traditional models of software development, QA is NOT just testing, though testing can be a part of it. Thing is, at almost every place I’ve worked, 99% of what I have done is software testing, yet my function/team/group/title almost always uses the “QA” terminology. What this leads me to believe is that a lot of people in the IT space that I work in (specifically Federal contracting, for the most part, as well as some smaller commercial projects) misunderstand what QA is supposed to mean according to those traditional models.

Heck, maybe I misunderstand what it means myself. I have no problem being called QA (you can call me whatever you want, just don’t call me late for dinner), test, v&v (verification and validation), or whatever you want. There are a lot of terms that seem to be interpreted pretty subjectively in the field of testing, at least as it is understood by the people that hire and manage testers. “Regression testing,” “integration testing,” and “system testing” are some of the other ones. In my own experience, this only starts to matter when you’re out there interviewing for a new job! Because once you’re in a position, as long as you (as a tester) understand what’s expected of you, it doesn’t really matter all that much what it’s called.

Maybe at my next job, I’ll lobby to have my job called something else altogether. Something more exciting. Bug Wrangler? Software Abuser? Requirements Punisher? LOL!

Schools of Testing?

January 25, 2009

So did you see all this hoo-ha last month about the various “schools” of testing? You’ve got your context-driven school, your…um…not-context-driven school (I guess), and so on, ad nauseam. Whenever I see these kinds of philosophical debates, my most frequent reaction is, “Wow…how come everybody’s arguing and nobody’s actually testing?”

I mean, seriously. It must be nice to have time to argue about which testing approach is best and what it should be called. Meanwhile, the vast majority of testers that I know are busy trying to justify our very existence to a software industry that still mostly misunderstands us and our role. What “school” of testing you consider yourself a member of really doesn’t mean squat when you’re told that “the client is refusing to pay for any more testing,” and when testing does need to occur, “we’ll just let the marketing guys pound on it for a few hours.” I’d love to know where these mythical projects and companies are, where testing is seen as integral to the effort, and there’s always time and money budgeted for it. Enough time, apparently, for the practitioners to engage in a war of words over whose school is best!

Call me cranky, but these days, I’m kind of just happy to have a job.