Analysis: Delib's guide to preparation and activity design

Getting started with analysis

Whether you're an analyst by profession or analysing an activity for the first time, this guide aims to give you an overview of how to run a successful process, alongside tips and tricks for getting the right mix. We've developed these articles to provide guidance for our customers and to share key lessons we've learnt. If you are an analysis expert and feel this guide is missing anything, please feel free to get in touch – we love continuous improvement.

What is Analysis?

Analysis involves taking the data you have collected and turning it into meaningful, actionable information, which can then be used to make effective adjustments and improvements. Analysis is one of the most important parts of the consultation process: respondents have taken valuable time to respond to an activity and are owed an effective response back.

We’ve split our analysis success guide into three parts to break down the process:

1. Preparation and activity design

2. Getting to know your data

3. Producing analytical reports

There's also our quick start analysis guide, which is a good first port of call if you need to turn the results of a recently closed activity into meaningful data and don't quite know where to start.

Preparing for your activity

Some people feel they don't get on with analysis because the planning stage is often missed out. Effective planning and preparation make it easier to produce useful analysis. It's important to approach the task of analysis in a structured way: think carefully about who will actually be doing the analysis, what they're going to do and when they're going to do it. Time can be a constraint, and good preparation will help you focus on doing what needs to be done in the time available. Another part of good preparation is the design of the activity itself – if you don't ask the right questions in the right way, you'll find it very hard to get useful data out of your responses. Lastly, there are a lot of things you can do to improve your capacity to analyse, and to get yourself and colleagues in a good condition to deal with data.

Taking a structured approach

Figure out what you have available for your analysis

In this context, we mean people and time. Analysis will always involve an element of people time. If you know you are not going to have much time available to analyse, and it's appropriate for your activity, you might have to try and maximise your quantitative data collection. Do you have the skills to analyse the data you'll get in your responses, or is there someone in your department, such as a social researcher, who may be able to advise?

Top Tip: Depending on the length of your activity and types of questions asked, we have found that for up to 300 responses, it will on average take two people two days to do the analysis.  If you are looking at more than 500 responses, it will be a larger piece of work and you may want to consider approaching an external company, allocating more time, or drafting in additional people.


If you are analysing the data yourself or within a small team, think about what tools you have at your disposal. If you have access to a second screen, use it – Excel can become mind-boggling after a while. Keep a clear record of everything you are doing – via an audit log, for example. If your system permits it (it can often be blocked by security protocols), Google Docs can be a great tool for collaborative working or for assigning different colleagues to different tasks.

Be clear about what you want to achieve, and establish success metrics

Before you start your activity, you should be clear about what you want to get from the exercise. These considerations should play a big part in how you design the activity. Remember that an unnecessary or poorly-thought out question will not only reduce the chances of a respondent completing the activity, it will also create more work for you at the analysis stage.

To guide and focus your work, it’s useful to establish some success metrics for your activity. Examples could include:

  • Number of responses: Think about whether you need a very high volume of responses or a smaller number of high-quality responses.
  • Quality of response: If you are aiming to reach a very large audience, you may want to focus your analysis on quite high-level figures (e.g. 80% of 750 respondents were in favour of the plans, except for group x, of whom only 40% of 55 respondents approved). If you want to focus on a smaller group, you might want to get above a certain number of qualitative statements from each (see 'Quantitative vs Qualitative questions' later in this article).
  • If you have one or two ‘key’ questions in the activity, how many people answered these questions?
  • How well did the responses reflect the demographics of the audience?
  • How well planned was the activity and what could be improved upon for next time?
  • How long did the analysis take and how did this compare to the planned allocated time? If you audited each other's work (such as reviewing any codes applied), how did this go?
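The high-level figures mentioned above are simple proportions. As an illustration only (the numbers below mirror the hypothetical 80%-of-750 example, not real data), the calculation might look like:

```python
# Hypothetical headline figures: share of respondents in favour of the plans,
# overall and for one subgroup. These counts are invented for illustration.
overall_in_favour, overall_total = 600, 750
group_x_in_favour, group_x_total = 22, 55

def pct(part, whole):
    """Percentage of `part` out of `whole`, rounded to a whole number."""
    return round(100 * part / whole)

print(f"{pct(overall_in_favour, overall_total)}% of {overall_total} respondents in favour")
print(f"Group x: {pct(group_x_in_favour, group_x_total)}% of {group_x_total} respondents in favour")
```

Keeping even simple calculations like this in a script or spreadsheet formula (rather than done by hand) makes the figures easy to re-check during an audit.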

Designing an effective activity

Make sure the activity fits your purposes

It is important to make sure your questions really match what you need to get out of the exercise. Avoid unnecessary questions: everything you ask should be geared towards producing useful data outputs that really answer your organisation's needs. It can be tempting to 'fill out' an activity with extra questions if it feels a bit short – avoid this. Your respondents will not enjoy answering questions they don't need to, and it's just more work to analyse. If there are only a few questions you need to ask, focus on making them as high quality as possible. If the person or team designing the activity and the person or team producing the analysis are not the same, make sure you are joined up and talking clearly about this.

When asked, you should be able to easily articulate why an activity is being run. Has anything already been run in this area before which could be included as a reference point? If it is part of a phase in the policymaking cycle, make this clear. How much difference will this activity actually make? Citizen Space guides you to think about this as part of the design by providing space in 'Edit Activity Details' for an overview of the context of the activity and a 'Why your views matter' section.

Be aware of your audience – and use analyst-only questions to your advantage

Be open about the possibility of bias in your results, and seek to address it. You may not get a representative selection of responses through a consultation: those with strong views on your policy are more likely to take the time to respond. Alongside being mindful of your population sample, using analyst-only questions in Citizen Space can help bring out key information once you have read and digested a response. This type of question allows you to add additional data to help you catalogue responses, for example, if you get a disproportionate amount of data from one group in your audience. These questions can also help you with your overall analysis process, such as using them to log who analysed the response, whether there is follow-up work to do, and so on. This article gives more information about analyst-only questions.

Quantitative vs Qualitative questions

When designing your activity, it's important to distinguish whether you want to get quantitative or qualitative data out of a question. Think about the different responses you will get to the same question if it is posed using a quantitative component (radio buttons, checkboxes, etc.) or a qualitative one (comments box, text field, etc.):

An example question with both a checkboxes quantitative component and a qualitative comments box.

In your responses to the first question (the quantitative checkboxes), you'd be able to clearly see which is the preferred choice of respondents, as well as the correlation between each of the choices (as you can tick more than one). The second question, on the other hand (the qualitative comments box), will get free-text responses that will need to be read by a human to be interpreted. They could all say completely different things.
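To make the quantitative side concrete, here is a minimal sketch of tallying checkbox answers, including how often pairs of options were ticked together. The option names and responses are invented for illustration:

```python
from collections import Counter
from itertools import combinations

# Hypothetical checkbox responses: each respondent may tick several options.
responses = [
    {"Walking", "Cycling"},
    {"Cycling"},
    {"Walking", "Bus"},
    {"Walking", "Cycling"},
]

# Tally how many respondents ticked each option.
option_counts = Counter(opt for r in responses for opt in r)

# Tally how often each pair of options was ticked together
# (the 'correlation' between choices mentioned above).
pair_counts = Counter(
    pair for r in responses for pair in combinations(sorted(r), 2)
)

print(option_counts)
print(pair_counts)
```

Counts like these drop straight into charts and tables; free-text answers offer no equivalent shortcut, which is the practical difference between the two question types.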

This is essentially what we mean here by 'qualitative' and 'quantitative'.

Sometimes the choice is very clear, but at other times it can be hard to decide whether a question should be given a quantitative or qualitative component.

Things to consider are:

  • In general, qualitative questions are not as well-suited to producing statistical data, such as charts, graphs, and numerical figures.
  • However, qualitative questions give the respondent much more control over their answer, and allow for a much wider range of responses and sentiment.
  • Using the tagging feature of Citizen Space (which performs what is called ‘coding’ in statistics), qualitative data can effectively be turned into quantitative data. However, you should be aware that:
    • This can require a lot of time and effort from analysts.
    • There is the risk of introducing bias in the analysis process through your coding. Getting a clear method in place before you start can help.
  • Quantitative questions are suited to questions that have a limited number of answers. Therefore, before making a question qualitative and 'open', think about whether it really is open, or whether there are only a few answers that you will get. This can save you time tagging responses after they come in.
  • In activities where you are looking for general and open responses (for example on detailed documents, plans or new regulations), qualitative questions can be good for drawing out a nuanced response. This is particularly the case where you have an audience of 'experts'. In these cases, it may help your analysis to give some guidelines in the question to guide responses – for example, ask respondents to answer specific questions rather than just 'do you have any comments on x?'
  • At their best, qualitative questions will let you tell a compelling story of what your respondents think. The best quantitative questions will produce numerical data that can be turned into compelling visualisations and subjected to detailed statistical analysis.
  • Ultimately, when reporting back on the results of an activity, you are using the data collected to tell the story of what was asked and what was said. A mixture of stats to give clarity, and explanation to provide nuance often works well.
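The coding step described above – turning free text into countable tags – is done by a human analyst reading each response. Purely as an illustration of how applied tags then become quantitative data, here is a sketch with an invented code frame and invented responses (the keyword matching stands in for the analyst's judgement, not a real method):

```python
from collections import Counter

# Hypothetical code frame: tag names mapped to indicative keywords.
code_frame = {
    "cost": ["expensive", "price", "afford"],
    "safety": ["unsafe", "danger", "accident"],
}

def apply_codes(text, frame):
    """Return the set of codes whose keywords appear in the response text."""
    lowered = text.lower()
    return {code for code, words in frame.items()
            if any(word in lowered for word in words)}

# Hypothetical free-text responses.
responses = [
    "The new route feels unsafe at night.",
    "Too expensive - I can't afford the fares.",
    "Danger for cyclists, and the price is high.",
]

# Once every response is tagged, the tags can be counted like any
# quantitative answer.
tag_counts = Counter(
    code for r in responses for code in apply_codes(r, code_frame)
)
print(tag_counts)
```

Agreeing the code frame before analysis starts, as the bullet above suggests, is what keeps this step consistent between analysts.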

Building Capacity

There are a lot of things you can do to make sure you and your organisation are as well-equipped as possible to analyse your responses:

Audit your existing capacity to analyse

  • Could it be that someone in your team already has the skills that you are looking for, or is there someone who has an interest in the topic and could be made a 'lead' on data analysis?
  • Have a look around the likely places to see if anyone in your organisation has produced useful documents or guidance on analysis. 

Document your work

We do this at Delib all the time: we rely on repeatable processes, and we save time by writing down how we did something tricky so that our teammates don't have to repeat the same investigations.

  • If they are not already available, create some useful documents that can be shared around your organisation.
  • Produce an analysis guidance booklet, which references the internal tools and resources available to help. Widely promote this.
  • Create a standard report template, if you don’t have one already. This can help if people without an analytical background will be in charge of reporting, and they don’t know where to start.

Run drop-in sessions or promote online tutorials

We hope these articles are a useful starting point, but there is plenty more information online: YouTube videos, Microsoft tutorials, etc. For a start, in terms of general best practice, you can have a look at this guidance from the Scottish Government.

Seminars or drop-in sessions can be a great way of passing what you have learned on to colleagues. They may even offer to teach you something in return! You can promote a 'skills swap' in your organisation, where people volunteer to teach each other a skill. Alternatively, a 'learning hour' – where you take an hour every week for one team member to teach everyone about something they are an expert in – can be both useful and fun (depending on your team members' areas of expertise…).

Work collaboratively, audit each other's work and evaluate

  • Having at least two people working on data analysis is a good way of avoiding your analysis becoming too subjective, especially if you have a lot of qualitative data. Getting a second pair of eyes on the data will make your interpretation more objective, and help avoid errors.
  • Analyst-only questions can be used in Citizen Space to indicate who has analysed a particular response, creating accountability.
  • Analysis is an inherently subjective process and can take a large amount of time to complete. Ensuring more than one person in the organisation is familiar with the data can help reduce this risk.

Top Tip: As a quality and objectivity check, ensure a colleague who has not done the analysis, but is involved in your project, is available to check every 10th response analysed.
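Selecting every 10th response for that audit is easy to do programmatically, which avoids anyone cherry-picking which responses get re-checked. A minimal sketch, using invented response IDs:

```python
# Hypothetical list of analysed response IDs: R001 .. R100.
response_ids = [f"R{n:03d}" for n in range(1, 101)]

# Every 10th response, for a colleague who did not do the original
# analysis to re-check.
audit_sample = response_ids[9::10]  # R010, R020, ..., R100

print(f"{len(audit_sample)} responses selected for audit")
```

Recording the audit outcome against each sampled response (for example in an analyst-only question) closes the loop on the accountability point above.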