To search for specific articles you can use Google's advanced search features. Go to Google and enter the "site:" operator, followed by this blog's domain, before your search terms, e.g.

site:<this blog's domain> CSS selectors

will search for "CSS selectors" but only on my site.

Sunday, October 7, 2007

Good list of books you should read

A group of people and companies got together and created the Software Engineering Body of Knowledge, or SWEBOK. It is basically a reference for software engineering, covering everything from requirements to maintenance, with chapters on all the stages of the software development life cycle.

A lot of that book is based on traditional, proven methodologies, so much of what you see in Agile or eXtreme Programming environments is not well reflected in the SWEBOK. It is still a good starting point for anyone wanting to understand things like Testing or Software Quality Assurance.

The book is a list of known definitions and descriptions plus, and this is the important part, a reference to where the definition/description came from. It is the list of reference material at the back of each chapter which makes the SWEBOK most valuable. It is a great place to build a good library of books from.

When I started out in programming, I was self-taught. I only knew things from trial and error, or from whatever books I could find in stores or the public library. Books like The Art of Computer Programming by Donald Knuth or even Algorithms by Robert Sedgewick were not available in computer stores. Once I found these books I started to realize how little I knew. To paraphrase Socrates, "I am smarter than most because I know I know nothing." Read the SWEBOK and you'll know just how much there is to know, and hopefully you'll realize how much more there is for you to learn.

Interview Help Sites

I recently went to Wikipedia and searched for Software Quality Assurance. The page for SQA contained two external links, both of which pointed to the same site, just different pages.

I followed the external links and found something I've run into before while trying to grow my knowledge of SQA: a site full of interview questions.

The general idea behind these types of sites is a mish-mash of interview questions with answers. The site has some semblance of organization, but as you go through it you will find the same questions with different answers. If I had to guess, someone set up a site for people to post questions they have been asked in interviews. The poster is trying to remember the question and often forgets important details, so the questions are not well formed. On top of that, the answers often come from the people who failed the interview, or from various people trying to help the interviewee answer the question.

For example, there is a section on the C language (NOT C++). The first question is "1.What is polymorphism?". Obviously not a C language question.

In some cases I wonder if the person who created the original question really knows what they are doing. For example,
10. What will be the output of the following code?

void main() {
    int i = 0, a[3];
    a[i] = i++;
    printf("%d\n", a[i]);   /* the printf the posted answer refers to */
}

The answer posted notes that a[0] will be assigned the value 0, then i will be incremented to 1. The printf will then attempt to reference a[1], but since nothing has been assigned to it, you will get back a garbage value.

This is true as far as it goes (strictly speaking, an expression like a[i] = i++ is undefined behaviour in C, because i is both modified and read to index the array with no intervening sequence point). What should also be noted, if this is a C language question, is that the ANSI C standard requires main to return an int for defined behaviour. Declaring main as "void main()" is tolerated by some compilers but it is not standard C. In pre-ANSI C the keyword void did not even exist. When you see something like:
main()
{
    printf("Hello world.\n");
    return 0;
}

The default return type, when not explicitly indicated, is int. So the above snippet is the equivalent of:
int main()
{
    printf("Hello world.\n");
    return 0;
}

Many people wrongly assume no explicit return type means it returns void.

The questions on the interview web site have a lot of wrong answers. Oddly enough, I have conducted technical interviews for hundreds of people on various languages and operating systems. I find a fair number of candidates seem to either have no knowledge of what they claim to know, or they frequent these interview web sites and have bad knowledge of things they claim to know.

If you are surfing the web looking for answers to interview questions, think twice about the source of the information. As you go through a site, ask yourself: are there things about the questions which are questionable? Is the same question posted twice but with different answers? Are questions in the wrong section? Are there questions without answers? If the answer is yes to these questions, then the person putting up the site probably knows as much as, or less than, you.

Additionally, when I run a whois on the site, the owner of the site is hidden. If you don't know who owns the site, how do you know you can trust the information? Why don't they want you to know who they are?

Bottom line: if you try using these interview sites to make it through an interview, you might get the job but you will not keep it. These sites are good for collecting questions, but you want to find out the answers for yourself and not trust the answers posted. I hang out on various forums and newsgroups. If you seem like someone who really wants to learn, I'll help you out. If you just want to pass an interview, I'll know it.

Tuesday, October 2, 2007

Is automated testing development?

I'm not talking about unit testing. I am talking about regression testing. There are a number of automation tools out there, and for some applications you can just use the record and playback feature. WinRunner, SilkTest, Rational Robot, etc. all have a feature where you can turn on a recorder, manually walk through an application, then save the script. Later you can play the script back; if nothing has changed, the script should execute without error.

This is the theory. The reality is that most projects change and the scripts fail. You then have to take the time to re-record the script or edit the code so it matches the changes in the application. Additionally, the scripts tend to make the application do things, but the tester still needs to add code to the script to confirm the right things happened, e.g. assert statements or capture points.
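The capture points the recorder leaves out can be as simple as comparing actual output against an expected value. Here is a minimal sketch in shell; the assert_equals helper and the fake "export" step are invented for illustration (real tools like WinRunner have their own capture-point syntax), but the idea is the same:

```shell
#!/bin/sh
# assert_equals: stop the test run if actual output differs from expected.
assert_equals() {
    expected="$1"
    actual="$2"
    message="$3"
    if [ "$expected" != "$actual" ]; then
        echo "FAIL: $message (expected '$expected', got '$actual')"
        exit 1
    fi
    echo "PASS: $message"
}

# The recorded portion of a script only drives the application...
line_count=$(printf 'row1\nrow2\nrow3\n' | wc -l | tr -d ' ')

# ...the tester must add the check that the right thing actually happened.
assert_equals 3 "$line_count" "export produced 3 rows"
```

Without that last line the script "passes" no matter what the application produced, which is exactly the trap record and playback sets for you.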

So testers are creating, maintaining, enhancing and debugging source code. This sounds a lot like development work. Yet in most of the places I've seen people doing automation, and with most of the people I've interviewed (and some I hired), very few have knowledge of software development.

Yesterday I was talking to someone using an automated script. The script worked fine for the person who developed it but not for the person I was talking to. It turns out that the script assumes relative paths to other things; if you don't run it from the right directory (not the directory the script is in), it fails to work. To fix this flaw the 'developer' added a command line option to the script. The logic was: "If there is a $1 parameter, cd $1; else assume you are in the correct directory."

There were no comments in the script, they did not assign $1 to a variable with a more sensible name, and they checked for $1 deep in the script, i.e. not at the top.

The person I spoke with spent an hour trying to figure out what was wrong. She even spoke with the creator of the script, and he couldn't figure out what she was doing wrong.

A good development practice is a coding style guideline: use appropriate comments, parse input parameters near the beginning of the script, and consider writing the logic as functions. Developers working on a team have learned that a consistent style makes it easier for everyone to take over someone else's code. At first a new developer might want everyone to switch to their standard, but once they come around, everyone benefits.
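The script described above could be restructured along those lines: comments, the $1 parameter parsed at the top and given a meaningful name, and the logic wrapped in a function. A sketch in shell follows; all the names are invented, since the original script isn't shown:

```shell
#!/bin/sh
# run_tests.sh -- drive the regression suite.
# Usage: run_tests.sh [test-dir]

# resolve_test_dir: decide which directory the tests should run from.
# Uses the optional first command-line argument; falls back to the
# directory containing this script, NOT the caller's current directory.
resolve_test_dir() {
    if [ -n "$1" ]; then
        printf '%s\n' "$1"
    else
        dirname "$0"
    fi
}

test_dir=$(resolve_test_dir "$1")

# Fail loudly and early instead of mysteriously later.
if [ ! -d "$test_dir" ]; then
    echo "error: '$test_dir' is not a directory" >&2
    exit 1
fi

cd "$test_dir" || exit 1
echo "running tests from $(pwd)"
```

The hour the tester lost came from a silent assumption; a two-line existence check and a usage comment would have surfaced the problem immediately.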

Creators of automated regression tests never create a coding standard. In many cases they don't use source control. Additionally, they will pick automation tools that have poor or no debugging capabilities. Developers like Visual C++ or Java because the IDEs are so advanced. Once you get familiar with Eclipse or NetBeans, you could never imagine using Java from the command line again.

If developers are using powerful tools like Eclipse to develop their code, how is an automated tester going to keep up? Every time the developer makes a required change/enhancement to the application, the tester will have to maintain their scripts. If the developer can make the change in minutes but the tester takes hours, the cost of maintaining the automation will not be worth it.

I like the idea of pair programming where one person does the development and the other person codes the tests. Agile programmers are thinking more about unit testing when they describe this concept, but why not have an integration or system level tester be a person with development skills?

I'm not saying that developers should start doing automation testing. Developers have a different mindset than testers. I'm suggesting that testers should have some fundamental development training. If you hire QA or testing staff with development experience you will probably get better automation.

Additionally, if you are an automation tester, learn development techniques and apply them to your script development. Become more efficient. In many companies, they get automation tools but end up abandoning them because they become a maintenance nightmare. Maybe you will be the person who saves the tools and keeps the company using them.

Finally, automated testing is not WHAT you want to test. It is HOW you want to test. If I'm creating an application, I'll first come up with a list of requirements and design WHAT I want to create. If I decide I'm going to write the application in Java or C++, the language does not, for the most part, dictate WHAT I'm going to create. The automation tool you use comes at the implementation stage. You still need a test strategy, test plans and a list of test cases. Only then should you be looking at HOW you are going to automate the test cases.