Wednesday 14 March 2007

4 /8 "User Surveys and Evaluation " The Survey

We have come along nicely with our design process and reached a point where we are happy with the design. It fulfils its specification, as it were, and we have been struggling to find things left to improve, so we are ready to move on from our own scrutiny of it and get an outside perspective to guide us further. This is to be done by surveying the marketplace (as far as we have access to it), in order to get a real insight into how the public rate our ideas.

Designing the survey had to be done with care so that we get the right sort of information back from our audience. We do not want to waste people's time with unnecessary questions, but we do not want to miss important information either. Our target is the parents of children within the age bracket we are designing for (18 to 49 months), to whom we fortunately have quite good access through family members who teach. This is a useful link: not only does it give us access to the market, but because the respondents are friends and colleagues of family, the survey should be better received than a cold approach would have been. The downside is that it limits the sample pool to one region (Buckinghamshire) and only two schools (on the plus side, one public and one private).

We wanted to keep it simple and avoid any leading questions, where the question implies that some answer is expected. We wanted to ask about every aspect of the product, from the physical design to the software on board. What is important at this point is whether the design is going to be suitable for children of this age range at all, as our decisions and ideas have so far been formulated from secondary research with no real input from parents or children. We want to know if the size is right, the shape appealing, the layout intuitive, the input usable. We want to know if the software will be entertaining, whether it will be educational, and whether the child would enjoy it. We want to know if the parents like the idea of being able to decide what is on the console and how they want to control it. In addition, we want to know if there is anything the parents would like to see added, removed or adapted. At the same time we don't want to ask too much, or ask any long-winded questions that would frustrate or bore the parents while they fill out the survey, as it is important that they keep an open mind and a positive demeanour throughout.

Here is a link to what we came up with…

SURVEY

To accomplish this we split the survey into progressive sections, asking only for what we saw as the relevant information in each. We tried to keep the questions as simple as possible, limiting them largely to three different types (sketched out more concretely just after the list):

Option questions - Questions with a definite answer from a known, finite set (yes or no, 'this one' or 'that one') were asked with the possible answers following in a list, for one to be circled. The format of these questions was kept consistent throughout the survey.

Quantifiable questions - Where a question could be reduced to rating an individual property, we did so, to keep the answers as quantifiable as possible. This was done by providing a scale for the property running between its two opposite extremes. Each property used a scale of the same length, running from the negative extreme to the positive, to maintain consistency throughout the survey.

Qualitative questions - Where opinions were needed in a way that could not be quantified, such as when asking for suggestions or criticism, the answers are taken as open-ended responses in text boxes, allowing them to be as expressive and unrestricted as possible. These kinds of questions take longer to answer, and being asked to do a lot of them can really put a person off completing a survey, so all questions of this format were phrased conditionally, i.e. "Please specify if there is anything in particular you like about this application…" rather than "What did you like about this particular part of the application?".
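As a purely illustrative aside (the survey itself is a paper form), the three question types can be thought of as a small data model. The sketch below is in Python, and every name in it (OptionQuestion, ScaleQuestion, OpenQuestion, Section) is invented here for the sake of the example rather than being part of the survey:

    # Illustrative sketch only: the paper survey's question types modelled as data.
    # All class and field names are invented for this example.
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class OptionQuestion:
        """Definite answer from a known, finite set - one option is circled."""
        prompt: str
        options: List[str]            # e.g. ["Yes", "No"]

    @dataclass
    class ScaleQuestion:
        """One property rated on a fixed-length scale, negative end to positive end."""
        prompt: str
        negative_label: str           # e.g. "Not at all appealing"
        positive_label: str           # e.g. "Very appealing"
        points: int = 5               # the same length of scale for every property

    @dataclass
    class OpenQuestion:
        """Open-ended text box, phrased conditionally so it never feels compulsory."""
        prompt: str                   # e.g. "Please specify if there is anything..."

    @dataclass
    class Section:
        """One progressive section of the survey, covering a single aspect of the product."""
        title: str
        questions: List[object] = field(default_factory=list)

    # Example of a section mixing the three types:
    shape_section = Section(
        title="Physical design",
        questions=[
            OptionQuestion("Would your child be able to hold the console comfortably?", ["Yes", "No"]),
            ScaleQuestion("How appealing is the shape?", "Not at all appealing", "Very appealing"),
            OpenQuestion("Please specify if there is anything you would change about the shape..."),
        ],
    )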

Consistency was treated as a very important part of creating the survey. Keeping the question format consistent means the questions quickly become familiar, and the user knows exactly how to answer each one as they read it rather than having to work out what kind of answer is wanted. It matters most when repetitive questions are required: when a user sees that the questions on one page are the same as those on the previous page, just applied to a different aspect, they barely need to read them and can simply fill them out. This is a good way to avoid frustrating the user and to speed up the task of filling out the survey. In our case we had to ask repetitive questions, as we wanted to know the same things about each of our different pieces of software, so we kept exactly the same format for each, which should make the repetitive sections easy to get through with minimal irritation.
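To show what we mean by the repetition, here is another rough sketch (again in Python, again with made-up application names; the actual survey simply repeats the same printed page per application) of one question template being stamped onto each piece of software:

    # Illustrative only: the same block of questions repeated for every application,
    # so each section of the survey looks identical apart from the application name.
    applications = ["Drawing application", "Counting application", "Story application"]  # hypothetical names

    question_template = [
        "How entertaining do you think your child would find {app}? (rate on the scale)",
        "How educational do you think {app} would be? (rate on the scale)",
        "Please specify if there is anything in particular you like about {app}...",
    ]

    for app in applications:
        print("--- " + app + " ---")
        for prompt in question_template:
            print(prompt.format(app=app))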
