How we developed the quality criteria for resources

Francisco Sanz
April 30, 2020, 3:38 p.m.

One of the major Work Packages (WP3) within the EU-Citizen.Science project has been to determine how we can ensure that the resources for citizen science projects that are shared and profiled on the platform are indeed of good quality, and valuable to the community. 

The central task therein has been to develop a framework to identify and facilitate the collection and sharing of high-quality resources and best practices for citizen science. This has meant developing a set of quality criteria, and a system for applying them in a way that supports a well-balanced and sustainable repository of resources. This work has been led by Dilek Fraisl and the team at IIASA, and has involved the entire consortium throughout. We have also consulted with a number of other currently running projects (namely, Research and Innovation Actions (RIAs) within the Science with and for Society (SwafS) funding programme of Horizon 2020).

In order to build on existing knowledge and experience in the field, we started our initial research phase by conducting a literature review of what is available regarding “citizen science resources” and any criteria to define or evaluate their quality. Then we looked at other platforms that could be used as examples, such as the Citizen Science Network Austria’s Catalogue for quality criteria for the Österreichforscht platform, RRI Tools, and the SciStarter Tool Finder platform. You can find our deliverable report about the whole process in ‘Framework Report Describing Criteria and Rationale for Sharing and Selecting State of the art Citizen Science Resources’.

The best indication of whether a resource is truly useful, and therefore of good value, should come from the community of practitioners who use it. For that reason, all of the resources profiled on the platform invite you to rate them according to a 5-star system, and to leave more detailed and contextual feedback on their use in practice.

Defining what resources are

The first step in identifying quality criteria for citizen science resources was to agree on what we mean by “citizen science resources” and “good-quality citizen science resources”.


We defined citizen science resources as “resources and practices that could be used for help and support in the context of citizen science”. Citizen science resources can help an individual, a project or an organization to understand, plan, implement and evaluate citizen science and citizen science practices, and demonstrate the value of citizen science to different audiences.

Good-quality citizen science resources are “resources that are easy to access, implement and adapt; well structured; clearly described; written in clear language; and that ideally have an impact (e.g. on science, policy or society) and are therefore useful to the citizen science community and beyond”.


Once we agreed on the definitions, the next step was to identify the categories of these resources, and to define them:


Tools: Any software or hardware to help perform a particular task or work in citizen science initiatives or initiatives relevant to citizen science (e.g. water quality equipment, air quality sensors, etc.).


Guidelines: A set of rules and instructions that could be helpful in designing, implementing or evaluating citizen science or initiatives relevant to citizen science. Guidelines are written texts such as reports, deliverables, briefings, etc.


Training Resources: Some form of instructional material relating to citizen science, often about ‘how to do’ citizen science. Examples include MOOCs, (online) workshops, webinars, gamified training, quizzes, etc. Training resources are not specific to any medium.


Other Materials: These are resources other than “tools”, “guidelines” and “training resources” that are about or relevant to citizen science. Other materials include websites, podcasts, videos (e.g. promotional videos, instructions for projects), libraries, toolboxes, figures, diagrams and texts, among others. The subcategories of other materials are libraries, scientific publications, websites (platforms), reports, audio, visuals and miscellaneous. Here are the working definitions of these subcategories:


  1. Libraries: An organized set of resources, such as databases, repositories, toolkits or toolboxes, that brings together relevant documents for a particular purpose in citizen science initiatives, such as A collection of Citizen Science guidelines and publications, the Living Knowledge Toolbox, etc.

  2. Scientific publications: Publications where scientific knowledge on citizen science is shared, such as this article.

  3. Websites: Websites, platforms and webpages where citizen science related content is published, such as citsci.org, scistarter.org and geo-wiki.org.

  4. Reports: A document that presents information on citizen science or on topics relevant to citizen science.

  5. Audio: Any resource with sound that includes citizen science related content such as podcasts, audio books, radio broadcasts, etc. 

  6. Visuals: Any resource that includes visual content such as videos, diagrams, figures and illustrations (for example, the FotoQuest Go promotion video).

  7. Miscellaneous: Any resource that does not fit the definitions of the first 6 subcategories under “other materials”.


(Please note that these definitions are not designed to be exclusive, but serve as guidance to help decide on the category for each resource, as one resource may fall under several categories and terms.)
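
To make the taxonomy above easier to scan, here is a minimal sketch that encodes the categories and their subcategories as a simple Python mapping. The structure is purely illustrative - it mirrors the working definitions in this post, not the platform's actual data model.

```python
# Illustrative encoding of the resource categories defined above.
# This mirrors the working definitions in this post; it is not the
# platform's actual data model.

RESOURCE_CATEGORIES = {
    "Tools": [],
    "Guidelines": [],
    "Training Resources": [],
    "Other Materials": [
        "Libraries",
        "Scientific publications",
        "Websites",
        "Reports",
        "Audio",
        "Visuals",
        "Miscellaneous",
    ],
}

def subcategories(category: str) -> list[str]:
    """Return the subcategories of a category (empty list if it has none)."""
    return RESOURCE_CATEGORIES.get(category, [])

print(subcategories("Other Materials"))  # the seven subcategories above
```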

The need for quality criteria

The core purpose of the resources section of the platform is to gather the wide range of citizen science resources available in the literature in one place, to bring to light resources that may be buried on project websites or not yet formally published, and to help both newcomers and experienced practitioners navigate those resources, so that they can find what they need and know where to start.


Anyone using our platform must therefore be able to trust the quality of what they find here: that they can read it (or hear it, or view it), understand its content, and apply it to their own situation or context. We therefore need a filter and moderation mechanism for the resources that are shared here.

The different levels of criteria

In our consortium discussions about the quality criteria, it was clear that we would need to differentiate between mandatory criteria and suggested or guidance criteria. Also, some criteria should be overarching for all resources shared on the platform, and some criteria should be very specific to the resource itself.

Overarching Required Criteria

Our overarching criteria that are required for all resources are that they must be:

  1. About citizen science or relevant to citizen science, and

  2. Accompanied by sufficient information to enable users of the platform to see whether the resource is useful or relevant to them.

What this means in practice for the latter requirement is that there are a number of mandatory fields in the resource profiles that must be filled in before the resource is shared (a minimal sketch of such a check follows the list below). The mandatory data fields are:

  • Title of the Resource

  • URL

  • Abstract

  • Resource category (e.g. guideline, tool, training resource, etc.)

  • Resource audience

  • Keywords

  • Author (or project, or leading institution)

  • Language

  • Theme (e.g. engagement, communication, data quality, etc.)
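
As an illustration of what these mandatory fields mean in practice, the sketch below shows how such a completeness check might look in Python. The field names follow the list above, but the function and the example profile are hypothetical - this is not the platform's actual code.

```python
# Minimal sketch of a mandatory-field check for a resource profile.
# Field names follow the list above; the logic and the example data are
# illustrative assumptions, not the platform's implementation.

MANDATORY_FIELDS = [
    "title", "url", "abstract", "category", "audience",
    "keywords", "author", "language", "theme",
]

def missing_fields(profile: dict) -> list[str]:
    """Return the mandatory fields that are absent or empty in a profile."""
    return [field for field in MANDATORY_FIELDS if not profile.get(field)]

# A hypothetical, incomplete submission:
profile = {
    "title": "A collection of Citizen Science guidelines and publications",
    "url": "https://example.org/resource",  # placeholder URL
    "abstract": "An organized set of citizen science guidelines.",
    "category": "Other Materials",
    "keywords": ["guidelines", "library"],
}

print(missing_fields(profile))  # ['audience', 'author', 'language', 'theme']
```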


The first requirement is more difficult to evaluate, because there are no hard and fast definitions of what is citizen science and what is not - nor should there be; citizen science should always remain a broad and inclusive concept.

For guidance on what to consider citizen science, we therefore turn to the recent work conducted by the citizen science practitioner community (created by a task group of the European Citizen Science Association, and led by Muki Haklay of UCL) to describe the Characteristics of Citizen Science. These will soon be published, and we will share them here on the platform (both as guidance, and as a resource in its own right).

What this means in practice is that when a resource profile is submitted and goes through our moderation process, the moderator will use the Characteristics of Citizen Science as their guide in assessing whether the resource is indeed relevant to citizen science (see Our Moderation Process for more information).

Overarching Suggested Criteria

In our consortium discussions about the quality criteria, we wanted to use the 10 Principles of Citizen Science as guidance for what can determine the quality of a resource, but recognised that it is not possible to review a resource against those principles as if they were a checklist. The 10 Principles will simply not be applicable in many cases - for example, water quality monitoring equipment in the ‘Tools’ category may not relate directly to any of the Principles.

During the moderation process we therefore keep in mind that, where applicable, resources should ‘engage’ with the relevant Principles in such a way that they are compatible and consistent with the aims and ethos of the Principles.

Resource Specific Criteria

We consider good quality citizen science resources to be those that are easy to access, implement and adapt, are well structured, are clearly described and written in clear language, and ideally improve or support the desired impact of the initiative (e.g. on science, policy or society).


The following resource-specific criteria have been developed to cover and assess those aspects of the quality of the resources:

  1. Access to the resource

    • The resource should be easy to access; for example, it does not require registration and is not behind a paywall.


  2. Readability and Legibility

    • The resource should be clearly structured according to its type. For example, a scientific paper or report should include an introduction, methodology, results, discussion and/or conclusions, and a methodology document should include an introduction, audience description, step-by-step methodology, and an example.

    • The resource should be written in clear language that is easy to read and understand for the intended target audience, and should be concise, unambiguous, and avoid the use of unusual words and jargon. Where technical terms are used, their meaning should be explained clearly. 

    • The resource should pay attention to basic formatting, such as clear titles and paragraphs, correct grammar and spelling, a legible font of large enough size to read, and clearly marked references.  


  3. Content

    • The resource should clearly describe its aims, goals and methods, so that it is easy for readers to understand how to apply the resource in their own context.


  4. Applicability

    • The resource should be easy to implement, ideally with descriptions of how it can be implemented, the contexts that it is useful for, and recommendations for further use or development.

    • The resource should be easy to adapt to different cases, ideally with an explanation of any limitations of the resource and the contexts in which it could be useful, and with guidelines or recommendations for its adaptation to different cases.


  5. Object

    • If the resource is an audio object, it should be clearly audible, with no interruptions or background noise.

    • If the resource is a video, an image or illustration, the quality should be good enough to see clearly, with a sharp focus. 


The way that these Specific Criteria work in practice is that the criteria above have been set up as a checklist, with a 5-point rating for each question of whether the criterion is met.

When a resource profile is submitted to the platform, it will go through our moderation process to be rated in relation to the relevant criteria. If the total rating exceeds 50% of the total possible points for that resource (for example, if all aspects are applicable, the highest score is 40 points and the threshold is 20 points), then it will be listed on the platform as a good quality resource.
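
To make that arithmetic concrete, here is a minimal sketch of the threshold rule in Python. The 5-point scale, the 50% threshold and the 8-criteria example come from the description above; the function name and structure are illustrative assumptions, not the platform's implementation.

```python
# Illustrative sketch of the moderation score threshold described above.
# Each applicable criterion is rated on a 5-point scale; a resource passes
# if its total exceeds 50% of the maximum possible score for the criteria
# that apply to it.

MAX_RATING = 5  # 5-point rating per checklist question

def passes_threshold(ratings: list[int]) -> bool:
    """ratings holds one 1-5 score per *applicable* criterion."""
    total = sum(ratings)
    maximum = MAX_RATING * len(ratings)
    return total > 0.5 * maximum

# Example from the text: 8 applicable criteria -> maximum 40, threshold 20.
ratings = [4, 3, 5, 2, 3, 4, 3, 2]  # total: 26
print(passes_threshold(ratings))    # True, since 26 > 20
```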


Resource Supporting Criteria

And finally, we have developed a set of supporting criteria that describe the use of the resource in practice and the existence of an evaluation of the resource, which can enhance the case for including a resource during the moderation process.

These aspects enhance the value and the quality of the resources a great deal, and we would encourage anyone producing such a resource to include that information when possible, but we do not make them mandatory because they simply will not be available in many instances.

These criteria are:


  1. Evaluation

    • Has the resource been used in the context of citizen science or in a relevant initiative, or is it currently being used in a citizen science initiative - and what is the outcome of that use?

    • Has the resource been evaluated in terms of the content, methods and/or results - and what is the outcome of that evaluation?


  2. Impact

    • Does the resource refer to any impact that it could have (or has had) on science, policy, society, etc.?

    • If the resource refers to an impact, has this been measured somehow?


