I am intrigued by the increasing flow of surveys that come across the CNBR list. I wonder how many practitioners subscribe to academic construction research lists? Worse, how do you write up a survey when you have not defined a sampling frame? What would you state about the population, the sample and the return rate? I recall recently seeing a CNBR correspondent getting quite irate with us because not enough people had completed his survey.
One recent survey sent to the CNBR list came from someone called Cameron Grogan (who signed himself Salman). It was an interesting case in point. His first question, to an international list of construction academics, is: “Have you ever worked on a construction project outside of the United States?” It is clear from the context of this question that he equates working overseas with working outside the United States. But even if I had never worked overseas, I would have to answer yes to this question, since every project I have worked on at home, in the UK, is outside the United States. So, having answered yes, the next screen asks a whole load of questions with drop-down options that make little sense to me, but clearly make a great deal of sense to the researcher.
I particularly wondered about the question “Have you worked in any country which was affected by terrorism?” I guess that the UK has been affected by terrorism – but is this really what the researcher is looking for, given that his next question asks whether my family travelled with me while I worked abroad? Very confusing. Up to this point I had been answering questions about working on sites in the UK, a country that has been affected by terrorism (as has the USA). Now, faced with a question about taking my family with me while I worked abroad, I have to think about the occasions when my family accompanied me on academic trips to countries that were not affected by terrorism. In other words, all of the questions about my working abroad elicit answers that have nothing to do with what the survey is trying to find out. So I aborted it.
I think it is important that, before sending surveys out to mailing lists, students and researchers should be encouraged to think about the traditional steps in designing a survey: defining the population, constructing a sampling frame, drawing the sample, piloting the questions and reporting the return rate. I wonder what we are teaching our students that leads them to make so many errors in the design of a simple survey. There are plenty of good books on this topic, but, for now, here are a couple of web resources to help researchers get to grips with the basics of survey design:
(This last one has a long list of useful links at the end)
The reason I am going into this detail and posting these links is that so many of the surveys we receive in papers submitted to Construction Management and Economics are so badly designed that they contribute nothing to our collective knowledge. I frequently think that these poorly designed surveys do nothing other than confirm what the researcher thought in the first place, and as such they are simply a waste of paper.
I look forward to the day when we see fewer badly designed surveys...