Data collection and analysis follow the planning of the evaluation, which can be laid out in an evaluation plan and/or in the evaluation chart.
Before you develop data collection tools such as questionnaires and interview guidelines yourself (which requires considerable resources), you should first check whether data collection tools can be borrowed from other projects, adapted if necessary, and used for your own project. Where possible, and when appropriate to the question at hand, validated data collection tools should be used (that is, tools whose suitability has been demonstrated in scientific testing).
It is recommended that you show self-developed data collection tools to key people in the field of investigation (contracting party, experts) before using them, in order to obtain their expert opinion.
Before beginning data collection, newly created tools (questionnaires, interview guidelines, observation grids, document analysis grids) should be tested (pre-test) by representatives of the people to be surveyed. This allows any problems with comprehensibility and procedural issues to be cleared up. This also provides indications concerning the cost and time required. There is still time to adjust the tools based on the findings of the pre-test.
The process and timing of data collection are generally dictated by the chosen methodology and depend on the course of the project. Each data collection requires clear, written instructions. This is especially important when multiple people are collecting data, so that distortions caused by differing individual procedures are kept to a minimum. Putting the data collection process into writing clarifies it and ensures that it is transparently documented. With qualitative methods, written or personal contact is usually made with the respondents and interviewees before data is collected, to inform them of the purpose of data collection, motivate them to participate and obtain their consent. The respondents are usually guaranteed anonymised use of their data and, where possible, access to the results (unless otherwise agreed).
Once data have been collected, e.g. in interviews, questionnaires or observations, they must be processed, structured and analysed. According to the chosen methodology, the data are summarised, counted, calculated, interpreted, and presented and discussed in a clearly laid-out form. This requires the corresponding expertise in research methods.
Numeric data are first summarised in corresponding tables to prepare for analysis: in the case of questionnaire analyses, the columns contain the answers to each question, and each row contains one completed questionnaire. Computer programs such as Excel, R and SPSS facilitate the analysis and representation of numeric data.
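As a purely illustrative sketch (the questions, answer codes and values below are invented), the row-per-questionnaire layout described above can be expressed in a few lines of Python:

```python
# Each row is one completed questionnaire, each column one question;
# answers are coded numerically (all names and values are invented examples).
header = ["q1_age_group", "q2_satisfaction", "q3_would_recommend"]
rows = [
    [2, 4, 1],  # questionnaire 1
    [3, 5, 1],  # questionnaire 2
    [1, 3, 0],  # questionnaire 3
]

# Turn the table into one list of values per question, the shape that
# counting and averaging (as in Excel, R or SPSS) typically works on.
columns = {name: [row[i] for row in rows] for i, name in enumerate(header)}
print(columns["q2_satisfaction"])  # all answers to question 2
```

In practice this table would normally live in a spreadsheet or statistics package; the sketch only illustrates the layout.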
Statistical methods for representing data in the form of diagrams, simple tables and individual variables are known as descriptive statistics. Descriptive statistics require quantitative measurements or the counting of statements and observations. Frequency tables, percentage distributions and mean values are described in words, and notable results are highlighted.
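The three descriptive measures just named (frequencies, percentage distribution, mean) can be sketched with Python's standard library; the answer values are invented for illustration:

```python
from collections import Counter
from statistics import mean

# Invented example: coded answers to one question on a 1-5 scale.
answers = [4, 5, 3, 4, 4, 2, 5, 4, 3, 4]

freq = Counter(answers)                              # absolute frequencies
n = len(answers)
percent = {k: 100 * v / n for k, v in freq.items()}  # percentage distribution
avg = mean(answers)                                  # mean value

print("value  count  percent")
for value in sorted(freq):
    print(f"{value:>5}  {freq[value]:>5}  {percent[value]:>6.1f}")
print(f"mean = {avg:.2f}")
```

A spreadsheet or statistics package produces the same figures; the point is only to show how little is behind a basic frequency table.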
However, quantitative analyses do not rely exclusively on descriptive statistics. Statistical tests and analyses are used to investigate the relationship between individual, independent variables (age, gender, intervention intensity, etc.) and dependent variables (effect parameters such as structural developments, changes in settings and behaviour). The use of statistical tests requires a great deal of expertise in research methods and experience and is therefore usually not suitable for self-evaluations.
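To make the idea of relating an independent variable to a dependent one concrete, here is a minimal sketch of one common two-group test (Welch's t statistic) in Python. All values are invented, and a real analysis would also need degrees of freedom and a p-value, usually obtained from statistical software such as R or SPSS, which is precisely why such tests call for methodological expertise:

```python
from math import sqrt
from statistics import mean, variance

# Invented example: an effect score for two groups
# (independent variable: intervention yes/no; dependent variable: score).
intervention = [6.1, 5.8, 6.4, 6.0, 6.3]
control      = [5.2, 5.5, 5.1, 5.4, 5.0]

# Welch's t statistic: difference in group means divided by the
# standard error of that difference (unequal variances allowed).
n1, n2 = len(intervention), len(control)
se = sqrt(variance(intervention) / n1 + variance(control) / n2)
t = (mean(intervention) - mean(control)) / se
print(f"t = {t:.2f}")
```

Whether the resulting value is statistically significant depends on sample size and the chosen significance level, a judgement the sketch deliberately leaves out.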
Qualitative analyses mainly focus on generating and investigating textual, image and film data (documents, interview protocols, etc.) with regard to specific questions. Interviews are usually recorded (audio or video) and then written up in a log. In simple evaluations, interviews do not have to be transcribed verbatim, nor do the contents of the transcripts have to be analysed afterwards. Good logs containing the key statements, which can be given to the interviewees to cross-check or validate, are in this case sufficient to render the important contents manageable for further analysis.
The available data are structured and summarised to work out answers to the evaluation questions.
Qualitative analysis methods that go beyond this kind of simple content analysis, and that systematically code and interpretively analyse verbatim interview transcripts, require specific research expertise and experience, which is why such interpretive processes are usually not suitable for self-evaluations. Nowadays there are good computer programs for qualitative analyses which facilitate coding, categorising and analysing verbal or visual data (e.g. ATLAS.ti, MAXQDA, HyperRESEARCH, NVivo).
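The basic mechanics behind such coding and tallying can be sketched in Python (dedicated software like those named above offers far more, such as hierarchical code systems and retrieval). The statements and codes here are invented for illustration:

```python
from collections import Counter

# Invented example: key statements from interview logs, each manually
# assigned one or more codes during content analysis.
coded_statements = [
    ("The new opening hours help a lot.",      ["accessibility"]),
    ("Staff were friendly but hard to reach.", ["staff", "accessibility"]),
    ("I would like more evening courses.",     ["programme"]),
    ("The counselling felt rushed.",           ["staff"]),
]

# Tally how often each code occurs across all statements.
code_counts = Counter(code for _, codes in coded_statements for code in codes)
for code, count in code_counts.most_common():
    print(f"{code}: {count}")
```

The intellectually demanding step, assigning the codes in the first place, is exactly the part no program automates, which is why the surrounding text advises professional support.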
Interpretation involves not only describing verbal data, tables, figures and images, but also placing them into the context of the overall project and a broader context (experience with other projects, study results, theories) in order to assess and comment on them.
In quantitative methods, data interpretation must be kept strictly separate from analysis. In qualitative methods, analysis and interpretation are linked in a cyclical investigation process.
Quantitative and qualitative methods attempt to typify the studied aspects, target groups, settings, etc. These typologies can later be used to develop specific intervention approaches for the various types.
Consider carefully in advance how the collected data are going to be analysed. Who is going to do it, when and how? Obtain professional support for data analysis if you do not possess the expertise yourself.