Norstat’s answers to the ESOMAR37 questions

37 questions to help buyers of online samples. The questions identify the key issues to consider, introduce consistent terminology, explain why each question should be asked, and note the issues buyers should expect to be covered in an answer.

Company profile

  1. What experience does your company have in providing online samples for market research? How long have you been providing this service? Do you also provide similar services for other uses such as direct marketing? If so, what proportion of your work is for market research?

We have been offering data collection and fieldwork services to marketing professionals since 1997 and started building our first online panel in 2002. Our emphasis has always been on mediating between the market researcher and the panel member, as the expectations of both sides are important to us. Our focus is solely on research and data collection; we are data collection specialists and do not offer any other services, such as direct marketing.

 

  2. Do you have staff with responsibility for developing and monitoring the performance of the sampling algorithms and related automated functions who also have knowledge and experience in this area? What sort of training in sampling techniques do you provide to your frontline staff?

Our team of developers is highly experienced and has developed and tuned our sampling algorithms over time, enabling highly efficient sampling with our in-house sampling system. All our project managers are trained to use the Norstat sampler optimally to ensure the best possible fit with the target group and representativity, including basing sampling on response-rate data for subsegments.

 

  3. What other services do you offer? Do you cover sample-only, or do you offer a broad range of data collection and analysis services?

We offer a broad range of data collection services. While we are happy to run sample-only studies, we are proud of our programming capabilities and can also offer advice on questionnaire design. We do not analyse data, but we can provide all kinds of data collection, including passive data collection, mystery shopping, CAPI, CATI, and even self-service solutions for exceptionally quick responses to simple surveys.

Sample sources and recruitment

  4. Using the broad classifications above, from what sources of online sample do you derive participants?

We own proprietary panels across 19 European countries. In the countries where panel size allows, we derive all participants from our own proprietary panels; this is generally the case in the Nordic countries as well as the Baltics, Poland and Germany. Where a planned study has sample size requirements exceeding the feasibility of our nationally representative sample, we may collaborate with high-quality panel partners to fulfil quota and target group requirements. We will always be able to tell you which will be the case for any given study.

 

  5. Which of these sources are proprietary or exclusive and what is the percent share of each in the total sample provided to a buyer?

All our panel members are members of our proprietary panels.

In cases where we need to use panel partners, we will always inform clients about the share of the sample comprised by external respondents, which will vary from study to study.  

 

  6. What recruitment channels are you using for each of the sources you have described? Is the recruitment process ‘open to all’ or by invitation only? Are you using probabilistic methods? Are you using affiliate networks and referral programs and in what proportions? How does your use of these channels vary by geography?

Our recruitment channels are by invitation only. We utilize a diverse range of recruitment sources to ensure that the resulting panel composition is representative of all segments of the population in the countries where we operate. Recruitment happens via computer-assisted telephone interviewing (CATI), social media, and affiliate networks. In the Nordics, Baltics and Poland, where we have CATI interviewing departments, CATI is an important sample source. To supplement the main recruitment channels and reach a wider variety of target demographics, we also use other sources from time to time, such as SMS recruitment campaigns or collaboration with clients.

 

  7. What form of validation do you use in recruitment to ensure that participants are real, unique, and are who they say they are? Describe this both in terms of the practical steps you take within your own organization and the technologies you are using. Please try to be as specific and quantify as much as you can.

We validate new signups by mobile phone verification to prevent fictitious or duplicate accounts, as well as to keep user accounts secure. We always use two-step verification.

Additionally, we have both manual and automated processes in place to check for fraudulent or duplicate accounts based on e-mail addresses and similar data. Our panel managers also review lists of new signups for plausibility: the system highlights possibly suspicious e-mail addresses or names based on logical rules, and the highlighted signups are then reviewed by a human.

This is in addition to the continuous monitoring of response quality that happens throughout the panelist lifecycle (see below).

 

  8. What brand (domain) and/or app are you using with proprietary sources? Summarise, by source, the proportion of sample accessing surveys by mobile app, email or other specified means.

Norstatpanel.com is the brand of our proprietary panel, with members in 19 countries across Europe. Norstatpanel members are invited to surveys via e-mail, and also have the option to download the Norstatpanel mobile app and answer surveys through the app.

App usage varies across our panels, but on average we see around one third of completes being collected through our app – the other two thirds are completed from links in e-mail invitations. We encourage adoption of the app, as we notice that especially among younger participants the attention to e-mail is waning, and this is an important target demographic to reach in order to have a representative panel.

In general, the same surveys are available from both e-mail and the mobile app; however, if a survey is not device agnostic, we will mark it as “desktop only” to allow only responses from desktop computers.

 

  9. Which model(s) do you offer to deliver sample? Managed service, self-serve, or API integration?

We deliver sample using all these models, depending on the project requirements. The default model is managed service, but we can also offer highly efficient self-service solutions for shorter, less complex projects and API integration for clients that we work closely with.

 

  10. If offering intercepts, or providing access to more than one source, what level of transparency do you offer over the composition of your sample (sample sources, sample providers included in the blend)? Do you let buyers control which sources of sample to include in their projects, and if so how? Do you have any integration mechanisms with third-party sources offered?

Intercepts are not part of our offering. If we utilize sample from third party providers, this will always happen with the client’s pre-approval. We do not have any integration mechanisms in place with third-party sources.

 

  11. Of the sample sources you have available, how would you describe the suitability of each for different research applications? For example, is there sample suitable for product testing or other recruit/recall situations where the buyer may need to go back again to the same sample? Is the sample suitable for shorter or longer questionnaires? For mobile-only or desktop-only questionnaires? Is it suitable to recruit for communities? For online focus groups?

Norstatpanel sample is suited to all of these research applications. Recontacting the same sample over multiple waves is possible, as long as this happens with pre-approval from the participant. We can run both shorter and longer studies, as well as qualitative studies.

We have a large number of profiling variables in place to help us identify participants that are suitable for participation in e.g. online focus groups, IHUT studies or product testing.

Norstatpanel sample can be used to recruit to communities that are hosted by Norstat, but we do not allow recruiting Norstatpanel members to other panel pools.

Sampling and project management

  12. Briefly describe your overall process from invitation to survey completion. What steps do you take to achieve a sample that “looks like” the target population? What demographic quota controls, if any, do you recommend?

We typically start by sending a first wave of invites to a sample where the composition of the sample is as similar as possible to the composition of the target population (for instance, a nationally representative sample), based on gender, age and region. Later, we send one or more additional waves of invites where we take into account which groups have lower representation, so that we send relatively more invites to segments with lower response rates. This adjustment is handled by our sampling algorithms, and helps avoid over-inviting groups that are already well represented. We may also send reminder invites to boost response rates, especially if we are looking for a narrower segment of the population – or if the client would like us to.

Typical demographic quota controls would be based on age, gender and geographical location.
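The wave-based allocation described above can be sketched in a few lines. This is an illustrative example only, with hypothetical segment names, targets and response rates; it is not Norstat's actual sampling algorithm, just the general idea of inviting proportionally more people in segments with lower response rates.

```python
# Illustrative sketch of response-rate-adjusted invite allocation.
# All segment names and numbers are hypothetical examples.

def plan_wave(targets, completes, response_rates, max_invites):
    """Return the number of invites per segment for the next wave.

    targets: desired completes per segment
    completes: completes collected so far per segment
    response_rates: expected response rate per segment (0..1)
    max_invites: cap per segment, e.g. remaining uninvited panel members
    """
    plan = {}
    for seg, target in targets.items():
        shortfall = max(target - completes.get(seg, 0), 0)
        rate = max(response_rates.get(seg, 0.1), 0.01)  # guard against divide-by-zero
        needed = round(shortfall / rate)                # more invites where rates are low
        plan[seg] = min(needed, max_invites.get(seg, needed))
    return plan

wave = plan_wave(
    targets={"18-29": 100, "30-59": 200, "60+": 100},
    completes={"18-29": 40, "30-59": 150, "60+": 90},
    response_rates={"18-29": 0.15, "30-59": 0.30, "60+": 0.40},
    max_invites={"18-29": 1000, "30-59": 1000, "60+": 1000},
)
# The 18-29 segment, with the lowest response rate, receives the most invites.
```

In this toy run the under-represented, low-response segment gets far more invites than its raw shortfall, which is exactly the over-inviting-the-hard-to-reach behaviour the answer describes.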

 

  13. What profiling information do you hold on at least 80% of your panel members plus any intercepts known to you through prior contact? How does this differ by the sources you offer? How often is each of those data points updated? Can you supply these data points as appends to the data set? Do you collect this profiling information directly or is it supplied by a third party?

We hold information about age, gender and geographical location for all of our panel members. When signing up, all members are invited to complete a profile survey collecting data on frequently used background variables: household composition, number and age of children living at home, educational background, occupational situation, industry, and personal and household income. Most panel members complete the profile survey; for the minority who do not, we have an additional process in place for collecting the same information over time, and for updating it periodically. The update frequency depends on the type of variable: household income, for instance, will typically be updated annually, whereas variables related to consumer behavior may be updated more frequently. We can also use this same process to pre-screen participants for future surveys and to determine the incidence of a phenomenon in the population.

Data points about panel members can be included in our delivery, as long as this is in an unidentifiable form.

Our profiling information is collected directly from our panel members.

 

  14. What information do you need about a project in order to provide an estimate of feasibility? What, if anything, do you do to give upper or lower boundaries around these estimates?

In order to provide an estimate of feasibility, we need to know the number of completes needed, the length of interview, and the estimated incidence of the target group in the overall population. If the incidence is unknown, we can seek to establish it using our prescreening tools. The field period and information about the quota structure are also relevant when we are estimating feasibility. We do not have fixed upper or lower boundaries for acceptable feasibility, but we always seek an open discussion with our client about feasibility, what is and is not possible to accomplish, and the sample definition.

If incidence is highly uncertain, we will be open about any doubts about feasibility and enter into a dialogue with the client to seek ways to ensure that the project can be completed – whether this means looking again at the target group definition, target size of the project or quota structure, the field period or LOI.

 

  15. What do you do if the project proves impossible for you to complete in field? Do you inform the sample buyer as to who you would use to complete the project? In such circumstances, how do you maintain and certify third party sources/sub-contractors?

Most typically, we will enter into a dialogue with the client to see if the target group definition can be changed, or if there are other ways to reach the desired study outcome – for instance, by employing a different method. If there are very specific target groups where we know we will not be able to provide the desired sample, we may recommend other providers that specialize in that area. Sub-contractors are primarily used in those cases where we are conducting multinational studies and sample is needed from markets other than those where Norstat has panels. In such cases we apply additional quality control checks to make sure that the quality of responses is sufficient.

 

  16. Do you employ a survey router or any yield management techniques? If yes, please describe how you go about allocating participants to surveys. How are potential participants asked to participate in a study? Please specify how this is done for each of the sources you offer.

Norstat does not employ a survey router, but we can utilize our prescreening process, whereby we determine whether panel members belong to a target group by posing a target group question at the end of a different survey. We can then use this information when inviting participants, to avoid unnecessary screenouts.

Potential participants are asked to participate in a study by means of an e-mail invite, which will also appear in the Norstatpanel app for those sampled. Selection is made by random draw from within the target demographics needed for the study.

 

  17. Do you set limits on the amount of time a participant can be in the router before they qualify for a survey?

Not applicable, since we do not employ a survey router.

 

  18. What information about a project is given to potential participants before they choose whether to take the survey or not? How does this differ by the sources you offer?

We always inform potential participants about the length of the interview in minutes, and the incentive they will receive for participation. We also include information about whether or not the study is device agnostic.

With quantitative studies, we will typically not reveal the topic of the study, because of the concern that this might influence representativity and lead only those with a special interest in the topic to enter the survey. When we approach a potential participant about screening for a qualitative study, on the other hand, we will typically share more information, such as the topic and context of the study: a participant who is not interested in the topic is unlikely to agree to take part in a qualitative study, which is more time-consuming and requires more effort than responding to a quantitative questionnaire.

 

  19. Do you allow participants to choose a survey from a selection of available surveys? If so, what are they told about each survey that helps them to make that choice?

We invite participants to surveys based on random draws within population segments, but for each survey the participant is invited to, they are free to choose whether or not they want to participate in the study. In most cases their decision will have to be based on survey length alone for quantitative studies, although for some studies where method allows we may also inform participants about the topic of the study. In projects recruiting participants for qualitative studies we will typically divulge more information early in the recruitment process.

Participants that are eligible and have been sampled will be able to choose between all the studies they have been sampled for. Their choice will then be made based on interview length and the incentives offered. They can also answer more than one study, as long as they are in the sample for each study.

 

  20. What ability do you have to increase (or decrease) incentives being offered to potential participants (or sub-groups of participants) during the course of a survey? If so, can this be flagged at the participant level in the dataset?

Incentive size can be changed over the course of the survey as needed – for instance if the LOI is longer than predicted, response rates are lower than predicted or the field period is short and there is some urgency in fulfilling targets. It will be possible to flag in the dataset on a participant level what incentives have been received.

In the majority of cases, though, there is a close correlation between length of interview (LOI) and incentive size. For particularly long projects incentive size may be increased to prevent survey dropouts.

 

  21. Do you measure participant satisfaction at the individual project level? If so, can you provide normative data for similar projects (by length, by type, by subject, by target group)?

Participant satisfaction is measured at the individual project level. After each completed questionnaire, the participant is asked to rate the survey they just completed on survey design, survey length and topic, as well as giving an overall satisfaction score. Whenever a participant gives a low rating to a study, we ask them for feedback on what should be improved. We benchmark the satisfaction score against other projects in the same country. Whenever a project's satisfaction score is lower than expected, we use the feedback to check for issues such as errors in the questionnaire or a survey length that deviates from the prediction, and see what can be done to improve the situation in collaboration with our clients.

We are able to provide normative data for similar projects.

 

  22. Do you provide a debrief report about a project after it has completed? If yes, can you provide an example?

We are able to report on the following parameters, as needed: response rate, dropout rate, screenouts, and survey satisfaction scores on length, topic, design and overall satisfaction.

We do not, however, routinely provide a debrief report, although we are happy to help provide answers to any questions the client may have about the fieldwork, according to their specification.

While a debrief report is not standard practice, we do take a proactive approach by contacting the client as early as possible in the data collection phase if there are any unexpected trends in the results. Typical issues that may come up from time to time include a high dropout rate, a low response rate, an LOI that differs from what was anticipated, or low satisfaction scores. We follow the progress of data collection and get in touch with the client if there are particular issues that need to be addressed.

Data quality and validation

  23. How often can the same individual participate in a survey? How does this vary across your sample sources? What is the mean and maximum amount of time a person may have already been taking surveys before they entered this survey? How do you manage this?

We are mindful of and track survey load, which varies a lot across different countries and demographic segments, and we run recruitment campaigns designed specifically to boost the demographic segments where survey load is highest. There are, however, no hard limits on how often any individual panel member can participate in a survey.

To ensure that the survey load on each panel member stays sustainable, we put a lot of recruitment effort towards ensuring that our panels are nationally representative, boosting those segments where representation is lower or utilization is particularly high. The average survey load on a Norstatpanel member is around 7-8 invites per month, although this differs across target demographics.

 

  24. What data do you maintain on individual participants such as recent participation history, date(s) of entry, source/channel, etc? Are you able to supply buyers with a project analysis of such individual level data? Are you able to append such data points to your participant records?

We maintain metadata over the whole lifespan of the panel membership: we know the recruitment source and signup date, and we have a full history of participation for each panel member, including which surveys they were invited to and the outcomes of those invites, as well as a history of incentives assigned and redeemed. Should the project require it, we can provide such data points as needed, as long as this is done in a way that preserves participant anonymity.

 

  25. Please describe your procedures for confirmation of participant identity at the project level. Please describe these procedures as they are implemented at the point of entry to a survey or router.

Invites to our panel members always take the form of unique links connected to the panel member's identity. This means that it is not possible for a panel member to respond more than once to the same survey. Participant identity is confirmed during the signup process by two-step verification, and the unique survey link is sent to their verified e-mail address. When answering surveys from the app or the Norstatpanel page, panel members need to be logged in to their user account.

As a further security mechanism preventing the same person from entering a survey twice, we run automated checks to see if more than one panel member is responding to surveys from the same browser session, and flag panel members for further investigation where suspicious patterns are observed.

 

  26. How do you manage source consistency and blend at the project level? With regard to trackers, how do you ensure that the nature and composition of sample sources remain the same over time? Do you have reports on blends and sources that can be provided to buyers? Can source be appended to the participant data records?

Not applicable, as we do not blend sample sources.

 

  27. Please describe your participant/member quality tracking, along with any health metrics you maintain on members/participants, and how those metrics are used to invite, track, quarantine, and block people from entering the platform, router, or a survey. What processes do you have in place to compare profiled and known data to in-survey responses?

We track the quality of our members on many different parameters, for the purpose of identifying members who leave low-quality responses and removing them from the panel. The main concerns we focus on are speeding, random responding and duplicate accounts, and we take a multi-pronged approach to these issues.

If we determine that a panel member has left low-quality responses on several occasions, we will suspend their panel account; the system will then no longer permit sending them invites or redeeming their incentives, and they will no longer be allowed to log in to the Norstatpanel website.

We track response rates, completion rates and response times for all panel members over time.

 

  28. For work where you program, host, and deliver the survey data, what processes do you have in place to reduce or eliminate undesired in-survey behaviours, such as (a) random responding, (b) illogical or inconsistent responding, (c) overuse of item non-response (e.g., “Don’t Know”), (d) inaccurate or inconsistent responding, (e) incomplete responding, or (f) too rapid survey completion?

When it comes to speeding, we track response times and compare them to the median response time on the project. Consistently responding much more quickly than everyone else on the same survey will raise a red flag and is grounds for exclusion. We also have in-survey quality control questions designed to detect whether the respondent is paying attention.

Additionally, we manually review open-ended responses, and if a panel member leaves nonsensical responses to open-ended questions, that too may lead to exclusion. Some checks, like survey speeding, are handled automatically by the system, but the local panel managers responsible for each of our panels also evaluate borderline cases holistically.
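A speeding check of the kind described above can be sketched as follows. The 0.33 cutoff and the participant ids are hypothetical examples for illustration; this is not the actual quality system, just the idea of flagging completions far below the project median.

```python
# Illustrative sketch of a speeding check: flag participants whose
# completion time is far below the project median. The threshold of
# 0.33 is a hypothetical example, not a real production cutoff.
import statistics

def flag_speeders(durations_sec, threshold=0.33):
    """Return the ids of participants whose duration is below threshold * median."""
    median = statistics.median(durations_sec.values())
    cutoff = threshold * median
    return sorted(pid for pid, t in durations_sec.items() if t < cutoff)

flagged = flag_speeders({"p1": 600, "p2": 540, "p3": 120, "p4": 630})
# median = 570 s, cutoff ≈ 188 s → only "p3" is flagged
```

Comparing against the project's own median, rather than a fixed number of seconds, makes the check self-calibrating across surveys of different lengths.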

Policies and compliance

  29. Please provide the link to your participant privacy notice (sometimes referred to as a privacy policy) as well as a summary of the key concepts it addresses.

You can find our privacy policy here.

The privacy policy explains what types of personal data we collect, and how we use it. It also addresses the rights of the data subject and how to exercise these rights.

 

  30. How do you comply with key data protection laws and regulations that apply in the various jurisdictions in which you operate? How do you address requirements regarding consent or other legal bases for the processing of personal data? How do you address requirements for data breach response, cross-border transfer, and data retention? Have you appointed a data protection officer?

We stay informed about key data protection laws and regulations by cooperating with legal offices in the countries where we operate. For our panel members, the legal basis for contacting them with invites is consent that we collect from them as part of the signup process. If necessary, for instance in the case of special category data or in the case of re-using survey data across multiple waves of a study, we ensure that additional consent to this special processing is collected as part of the study.

We have a GDPR incident response team ready to respond if an incident involving personally identifiable data should happen. Data transfers are regulated by data protection agreements. When it comes to data retention, we have automated systems that ensure the deletion of personally identifiable data once it is no longer required.

Norstat has appointed a Data Protection Officer who is responsible for following up on all GDPR-related topics.

 

  31. How can participants provide, manage and revise consent for the processing of their personal data? What support channels do you provide for participants?

Consent is collected as part of the sign-up process for our panel members, and when logged in to their panel membership page they can see, change or revoke all consents they have given, directly on the membership page. We offer support in the local language for all our panel members; support can be reached via a contact form on our panel pages as well as via a link in each survey invite. For requests about data subject access rights or questions about privacy, it is also possible to email our DPO directly via the e-mail address listed in the privacy policy.

 

  32. How do you track and comply with other applicable laws and regulations, such as those that might impact the incentives paid to participants?

The managing director of each Norstat subsidiary is responsible for staying up to date on all regulations that pertain to the market where they work.

 

  33. What is your approach to collecting and processing the personal data of children and young people? Do you adhere to standards and guidelines provided by ESOMAR or GRBN member associations? How do you comply with applicable data protection laws and regulations?

When it comes to collecting and processing the personal data of children and young people, we have internal guidelines based on ESOMAR guidelines, to ensure that special care is taken to protect the rights and needs of minors.

The age limit for membership in our panels is in line with national recommendations and is 15 or 16 years of age, depending on the country of residence. It is still possible to reach respondents younger than 15 by addressing surveys to parents with children in younger target groups and interviewing the children with the help and consent of their parents.

When it comes to data protection laws and regulations, the data protection officer has an overall responsibility for ensuring that applicable laws and regulations are both known and complied with.

 

  34. Do you implement “data protection by design” (sometimes referred to as “privacy by design”) in your systems and processes? If so, please describe how.

All our systems implement privacy by design.

  • We limit access to data to those in our organization that have a professional need for it, using the principle of least privilege to guide how we set up access rights.
  • We collect and use only the information we need.
  • We have systems in place that ensure deletion of personal information once it is no longer needed.
  • By default, all our deliveries consist of anonymous data only.

  35. What are the key elements of your information security compliance program? Please specify the framework(s) or auditing procedure(s) you comply with or certify to. Does your program include an asset-based risk assessment and internal audit process?

We have an internal control system for IT security in place to ensure that our technical and organizational measures follow state-of-the-art best practices. Our system is built on the best-practice recommendations found in ISO 27001 and guidelines from data protection authorities.

 

Key elements of the program are:

  • A security organization with clearly defined responsibilities, including the appointment of a GDPR compliance team, a DPO and an incident response team.
  • Risk assessments of all assets, performed by the GDPR group.
  • Internal control checks that are scheduled and followed up on a regular basis.
  • Internal audits.
  • Training of employees.
  • A business continuity plan.

Our system for internal control of IT security is subject to an independent auditor's assessment based on the ISAE 3000 framework; the assessment is available on request.

 

  36. Do you certify to or comply with a quality framework such as ISO 20252?

Norstat has been certified according to the ISO 9001:2015 standard for quality management systems since 2011, covering the Nordic region and our back-office functions that work across the organization, such as IT, data processing, finance and quality management.

Furthermore, Norstat Finland, Norstat Netherlands and Norstat Germany are all certified to the ISO 20252:2019 standard for market, opinion and social research.

Metrics

  37. Which of the following are you able to provide to buyers, in aggregate and by country and source? Please include a link or attach a file of a sample report for each of the metrics you use.

 

| Metric | Yes | No | Not applicable |
| --- | --- | --- | --- |
| Average qualifying or completion rate, trended by month | X | | |
| Percent of paid completes rejected per month/project, trended by month | | X | |
| Percent of members/accounts removed/quarantined, trended by month | X | | |
| Percent of paid completes from 0-3 months tenure, trended by month | | X | |
| Percent of paid completes from smartphones, trended by month | X | | |
| Percent of paid completes from owned/branded member relationships versus intercept participants, trended by month | | | X |
| Average number of dispositions (survey attempts, screenouts, and completes) per member, trended by month (potentially by cohort) | X | | |
| Average number of paid completes per member, trended by month (potentially by cohort) | X | | |
| Active unique participants in the last 30 days | X | | |
| Active unique 18-24 male participants in the last 30 days | X | | |
| Maximum feasibility in a specific country with nat rep quotas, seven days in field, 100% incidence, 10-minute interview | X | | |
| Percent of quotas that reached full quota at time of delivery, trended by month | | X | |