37 questions to help buyers of online samples. The questions identify the key issues to consider, introduce consistent terminology, explain why each question should be asked, and note the issues buyers should expect to be covered in an answer.
We have been offering data collection and fieldwork services to marketing professionals since 1997 and started building our first online panel in 2002. Our emphasis has always been on mediating between the market researcher and the panel member, as the expectations of both sides are important to us. Our focus is solely on research and data collection; we are data collection specialists and do not offer any other services, such as direct marketing.
Our team of developers is highly experienced and has developed and tuned our sampling algorithms over time, enabling highly efficient sampling with our own sampling system. All our project managers are trained to use the Norstat sampler optimally to ensure the best possible fit with the target group and representativity, including basing sampling on response rate data for subsegments.
We offer a broad range of data collection services. While we are happy to do sample-only studies, we are proud of our programming capabilities and can also offer advice on questionnaire design. We do not analyse data, but we can provide all kinds of data collection, including passive data collection, mystery shopping, CAPI and CATI, and even self-service solutions for exceptionally quick responses to simple surveys.
We own proprietary panels across 19 European countries. In the countries where panel size allows, we derive all participants from our own proprietary panels. This would generally be the case in the Nordic countries as well as the Baltics, Poland and Germany. In cases where a planned study has sample size requirements exceeding the feasibility of our nationally representative sample, we may collaborate with high quality panel partners to fulfil quota and target group requirements. We will always be able to tell you which will be the case for any study.
All our panel members are members of our proprietary panels.
In cases where we need to use panel partners, we will always inform clients about the share of the sample comprised by external respondents, which will vary from study to study.
Our recruitment channels are by invitation only. We utilize a diverse range of recruitment sources to ensure that the resulting panel composition is representative of all segments of the population in the countries we operate in. Recruitment happens via computer-assisted telephone interviewing (CATI), social media and affiliate networks. In the Nordics, Baltics and Poland, where we have CATI interviewing departments, CATI is an important sample source. To supplement the main recruitment channels and reach a wider variety of target demographics, we also use other sources from time to time, such as SMS recruitment campaigns or collaboration with clients.
We validate new signups by mobile phone verification to prevent fictitious or duplicate accounts, as well as to keep user accounts secure. We always use two-step verification.
Additionally, we have both manual and automated processes in place to check for fraudulent or duplicate accounts based on e-mail addresses and similar data points. Our panel managers also review lists of new signups to check for plausibility, with system support that highlights suspicious e-mail addresses or names based on logical rules, followed by human review of the highlighted signups.
This is in addition to the continuous monitoring of response quality that happens throughout the panelist lifecycle (see below).
Norstatpanel.com is the brand of our proprietary panel, with members in 19 countries across Europe. Norstatpanel members are invited to surveys via e-mail, and also have the option to download the Norstatpanel mobile app and answer surveys through the app.
App usage varies across our panels, but on average we see around one third of completes being collected through our app – the other two thirds are completed from links in e-mail invitations. We encourage adoption of the app, as we notice that especially among younger participants the attention to e-mail is waning, and this is an important target demographic to reach in order to have a representative panel.
In general, the same surveys are available from both e-mail and the mobile app. However, if a survey is not device agnostic, we mark it as “desktop only” to allow responses from desktop computers only.
We deliver sample using all these models, depending on the project requirements. The default model is managed service, but we can also offer highly efficient self-service solutions for shorter, less complex projects and API integration for clients that we work closely with.
Intercepts are not part of our offering. If we utilize sample from third party providers, this will always happen with the client’s pre-approval. We do not have any integration mechanisms in place with third-party sources.
Norstatpanel sample is suited to a wide range of research applications. Recontacting the same sample over multiple waves is possible, as long as this happens with pre-approval from the participant. We can do shorter and longer studies, and qualitative studies as well.
We have a large number of profiling variables in place to help us identify participants that are suitable for participation in e.g. online focus groups, IHUT studies or product testing.
Norstatpanel sample can be used to recruit to communities that are hosted by Norstat, but we do not allow recruiting Norstatpanel members to other panel pools.
We typically start by sending a first wave of invites to a sample where the composition of the sample is as similar as possible to the composition of the target population (for instance, a nationally representative sample), based on gender, age and region. Later, we send one or more additional waves of invites where we take into account which groups have lower representation, so that we send relatively more invites to segments with lower response rates. This adjustment is handled by our sampling algorithms, and helps avoid over-inviting groups that are already well represented. We may also send reminder invites to boost response rates, especially if we are looking for a narrower segment of the population – or if the client would like us to.
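The wave-based logic described above can be illustrated with a small sketch. This is a simplified, hypothetical version of response-rate-aware invite allocation, not Norstat's actual sampling algorithm; all names, segments and numbers are assumptions for illustration only.

```python
# Illustrative sketch: for each segment, estimate how many additional invites
# are needed to close the gap to target, given observed response rates.
# This is NOT the production algorithm; the safety buffer is hypothetical.

def plan_next_wave(targets, completes, response_rates, safety=1.1):
    """Return the number of new invites per segment for the next wave.

    targets:        {segment: target number of completes}
    completes:      {segment: completes collected so far}
    response_rates: {segment: observed response rate, 0..1}
    safety:         small buffer to account for rate uncertainty
    """
    invites = {}
    for segment, target in targets.items():
        gap = target - completes.get(segment, 0)
        if gap <= 0:
            invites[segment] = 0  # segment filled: avoid over-inviting
            continue
        rate = max(response_rates.get(segment, 0.10), 0.01)  # floor the rate
        invites[segment] = round(gap / rate * safety)
    return invites

wave = plan_next_wave(
    targets={"male 18-34": 100, "female 18-34": 100},
    completes={"male 18-34": 40, "female 18-34": 80},
    response_rates={"male 18-34": 0.15, "female 18-34": 0.40},
)
# the lower-responding male segment gets relatively more invites
```

The key property matches the text: segments with lower response rates receive relatively more invites, and segments that are already filled receive none.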
Typical demographic quota controls would be based on age, gender and geographical location.
We hold information about age, gender and geographical location for all of our panel members. When signing up, all members are invited to complete a profile survey collecting data on frequently used background variables. The profile survey has questions about household composition, number and age of children living at home, educational background, occupational situation, industry, and personal and household income. Most panel members complete the profile survey. For the minority that do not, we have an additional process in place for collecting the same information over time, and also for updating this information periodically. The update frequency depends on the type of variable; for instance, household income will typically be updated annually, whereas variables related to consumer behavior may be updated more frequently. We can also use this same process to pre-screen participants for participation in future surveys and determine the incidence of a phenomenon in the population.
Data points about panel members can be included in our delivery, as long as this is in an unidentifiable form.
Our profiling information is collected directly from our panel members.
In order to provide an estimate of feasibility, we need to know how many completes are needed, length of interview and the estimated incidence of the target group in the overall population. If the incidence is unknown, we can seek to find out based on our prescreening tools. We do not have a fixed upper or lower boundary for acceptable feasibility, but we always seek an open discussion with our client about feasibility, what is possible to accomplish or not, and about the sample definition. Field period and information about quota structure is also relevant information when we are estimating feasibility.
If incidence is highly uncertain, we will be open about any doubts about feasibility and enter into a dialogue with the client to seek ways to ensure that the project can be completed – whether this means looking again at the target group definition, target size of the project or quota structure, the field period or LOI.
Most typically, we will enter into a dialogue with the client to see if the target group definition can be changed or if there are other ways to reach the desired study outcome – for instance, by employing a different method. If there are very specific target groups where we know we will not be able to provide the desired sample, we may recommend other providers that specialize in that area. Sub-contractors are primarily used in cases where we are conducting multinational studies requiring sample from markets other than those where Norstat has panels. In such cases we apply additional quality control checks to make sure that the quality of responses is sufficient.
Norstat does not employ a survey router, but we can utilize our prescreening process, whereby we determine whether a participant belongs to a target group by posing a target group question at the end of a different survey. We can then use this information when inviting participants to avoid unnecessary screenouts.
Potential participants are asked to participate in a study by means of an e-mail invite, which will also appear in the Norstatpanel app for those sampled. Selection is made by random draw from within the target demographics needed for the study.
Not applicable, since we do not employ a survey router.
We always inform potential participants about the length of the interview in minutes, and the incentive they will receive for participation. We also include information about whether or not the study is device agnostic.
With quantitative studies, we will typically not reveal the topic of the study, out of concern that this might influence representativity and lead only those with a special interest in the topic to enter the survey. When we approach a potential participant to screen them for a qualitative study, on the other hand, we will typically share more information, such as the topic and context of the study. A participant who is not interested in the topic is unlikely to agree to take part in a qualitative study, which is more time consuming and requires more effort than answering a quantitative questionnaire.
We invite participants to surveys based on random draws within population segments, but for each survey the participant is invited to, they are free to choose whether or not they want to participate in the study. In most cases their decision will have to be based on survey length alone for quantitative studies, although for some studies where method allows we may also inform participants about the topic of the study. In projects recruiting participants for qualitative studies we will typically divulge more information early in the recruitment process.
Participants that are eligible and have been sampled will be able to choose between all studies they have been sampled for. Their choice will then be made based on interview length and the incentives offered. They will also be able to answer more than one study, as long as they are in the sample for each study.
Incentive size can be changed over the course of the survey as needed – for instance if the LOI is longer than predicted, response rates are lower than predicted or the field period is short and there is some urgency in fulfilling targets. It will be possible to flag in the dataset on a participant level what incentives have been received.
In the majority of cases, though, there is a close correlation between length of interview (LOI) and incentive size. For particularly long projects incentive size may be increased to prevent survey dropouts.
Participant satisfaction is measured at the individual project level. After each completed questionnaire, the participant is asked to rate the survey they just completed on the factors of survey design, survey length and topic, as well as an overall satisfaction score. Whenever a participant gives a low rating to a study, we ask them for feedback on what should be improved. We benchmark the satisfaction score against other projects in the same country. Whenever a project's satisfaction score is lower than expected, we use the feedback to check for issues such as errors in the questionnaire, a survey length that deviates from the predicted length, or other problems the participants point out, and see what can be done to improve the situation in collaboration with our clients.
We are able to provide normative data for similar projects.
We are able to report on the following parameters, as needed: Response rate, dropout rate, screenouts, survey satisfaction scores on length, topic, design and overall.
We do not, however, routinely provide a debrief report, although we are happy to help provide answers to any questions the client may have about the fieldwork, according to their specification.
While it is not standard practice to provide a debrief report, it is our standard practice to take a more proactive approach by contacting the client as early as possible in the data collection phase if there are any unexpected trends in the results. Typical examples of issues that may come up from time to time are a high dropout rate, a low response rate, an LOI that differs from what was anticipated, or low satisfaction scores. We follow the progress of data collection and get in touch with the client if there are particular issues that need to be addressed.
We are mindful of and track survey load, which varies a lot across different countries and demographic segments, and we run recruitment campaigns designed specifically to boost the demographic segments where survey load is highest. There are, however, no hard limits on how often any individual panel member can participate in surveys.
To ensure that the survey load on each panel member stays sustainable, we put a lot of recruitment effort into keeping our panels nationally representative, boosting those segments where representation is lower or utilization is particularly high. The average survey load on a Norstatpanel member is around 7-8 invites per month, although this differs among target demographics.
We maintain meta data over the whole lifespan of the panel membership; we know the recruitment source and signup date, and we have a full history of participation for each panel member including what surveys they were invited to, and the outcomes of the invite, as well as a history of incentives assigned and redeemed. Should the project require it, we can provide such data points as needed, as long as this is done in a way that preserves participant anonymity.
Invites to our panel members always take the form of unique links connected to the panel member identity. This means that each panel member can respond to a given survey only once. Participant identity is confirmed in the signup process by two-step verification, and the unique survey link is sent to their verified e-mail address. When answering surveys from the app or the Norstatpanel page, panel members need to be logged in to their user account.
As a further security mechanism preventing the same person from entering a survey twice, we run automated checks to see if more than one panel member is responding to surveys from the same browser session, and flag panel members for further investigation where suspicious patterns are observed.
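The session check described above can be sketched in a few lines. This is a simplified illustration of the idea (flagging accounts that respond from the same browser session), not Norstat's production check; the data shapes and identifiers are hypothetical.

```python
# Illustrative sketch: flag panel members who respond from the same browser
# session, as one signal of possible duplicate accounts. Hypothetical example,
# not the production implementation.
from collections import defaultdict

def flag_shared_sessions(responses):
    """responses: iterable of (session_id, member_id) pairs.

    Returns the set of member ids that share a browser session with at
    least one other member, for further manual investigation."""
    members_by_session = defaultdict(set)
    for session_id, member_id in responses:
        members_by_session[session_id].add(member_id)

    flagged = set()
    for members in members_by_session.values():
        if len(members) > 1:  # more than one account seen in the same session
            flagged |= members
    return flagged

suspects = flag_shared_sessions([
    ("sess-1", "member-A"),
    ("sess-1", "member-B"),  # two accounts, one session: both flagged
    ("sess-2", "member-C"),
])
```

In this sketch, flagged members are only surfaced for investigation, mirroring the text's point that suspicious patterns trigger review rather than automatic exclusion.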
Not applicable, as we do not blend sample sources.
We track the quality of our members on many different parameters, with the aim of identifying and removing from the panel any members who leave low-quality responses. The main concerns we focus on are speeding, random responding and duplicate accounts, and we have a diverse approach to these issues.
If we determine that a panel member has left low-quality responses on several occasions, we will suspend their panel account; the system will no longer send them invites or allow redemption of incentives, and they will no longer be able to log in to the Norstatpanel website.
We track response rates, completion rates and response times for all panel members over time.
When it comes to speeding, we track response times and compare them to the median response times on the project. Consistently responding much quicker than everyone else on the same survey will raise a red flag and be grounds for exclusion. We also have in-survey quality control questions designed to detect whether the respondent is paying attention.
Additionally, we manually review open-ended responses, and if a panel member leaves nonsensical answers to open-ended questions, that too may lead to exclusion. Some checks, like survey speeding, are handled by the system, but the local panel managers responsible for each of our panels also evaluate holistically any cases where there is doubt about the quality of responses.
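The speeding check described above (comparing response times to the project median) can be sketched as follows. The threshold value is a hypothetical example for illustration; the actual cutoff and surrounding holistic review are not captured here.

```python
# Illustrative sketch of a median-based speeding check. The 0.33 threshold
# is a hypothetical example, not Norstat's actual cutoff.
from statistics import median

def flag_speeders(durations_sec, threshold=0.33):
    """durations_sec: {member_id: completion time in seconds} for one project.

    Flags members whose completion time falls below
    threshold * (project median completion time)."""
    if not durations_sec:
        return set()
    project_median = median(durations_sec.values())
    cutoff = project_median * threshold
    return {m for m, t in durations_sec.items() if t < cutoff}

speeders = flag_speeders({"A": 600, "B": 580, "C": 90, "D": 640})
# median of (90, 580, 600, 640) is 590; cutoff is about 195 -> "C" is flagged
```

Using the project median rather than a fixed time limit makes the check relative to each survey's actual length, which matches the comparison described in the text.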
You can find our privacy policy here
The privacy policy explains what types of personal data we collect, and how we use it. It also addresses the rights of the data subject and how to exercise these rights.
We stay informed about key data protection laws and regulations by cooperating with legal offices in the countries where we operate. For our panel members, the legal basis for contacting them with invites is consent that we collect from them as part of the signup process. If necessary, for instance in the case of special category data or in the case of re-using survey data across multiple waves of a study, we ensure that additional consent to this special processing is collected as part of the study.
We have a GDPR incident response team ready to respond if an incident involving personally identifiable data should happen. Data transfers are regulated by data protection agreements. When it comes to data retention, we have automated systems that ensure the deletion of personally identifiable data once it is no longer required.
Norstat has appointed a Data Protection Officer that is responsible for following up all GDPR related topics.
Consent is collected as part of the sign-up process for our panel members, and when logged in to their panel membership page they can see, change or revoke all consents they have given directly on that page. We have support in the local language for all our panel members, and support can be reached via a contact form on our panel pages as well as via a link in each survey invite. For requests about data subject access rights or questions about privacy, it is also possible to email our DPO directly via the e-mail address listed in the privacy policy.
The managing director of each Norstat subsidiary is responsible for staying up to date on all regulations that pertain to the market where they work.
When it comes to collecting and processing the personal data of children and young people, we have internal guidelines based on ESOMAR guidelines, to ensure that special care is taken to protect the rights and needs of minors.
The age limits for membership in our panels are in line with national recommendations: 15 or 16 years of age, depending on the country of residence. It is still possible to reach respondents younger than 15 by addressing surveys to parents of children in younger target groups and interviewing the children with the help and consent of their parents.
When it comes to data protection laws and regulations, the data protection officer has an overall responsibility for ensuring that applicable laws and regulations are both known and complied with.
All our systems implement privacy by design.
We have an internal control system for IT security in place to ensure that our technical and organizational measures follow state-of-the-art best practices. Our system is built on the best practice recommendations found in ISO 27001 and guidelines from data protection authorities.
Key elements of the program are:
Our system for internal control of IT security is subject to independent auditor’s assessment based on the ISAE3000 framework, and the independent auditor’s assessment is available on request.
Norstat is certified according to the ISO9001:2015 standard for quality systems in the Nordic region and for our back-office functions that work across the organization such as IT, data processing, finance and quality management, and has been certified since 2011.
Furthermore, Norstat Finland, Norstat Netherlands and Norstat Germany are all certified to the ISO 20252:2019 standard for market, opinion and social research.
| | Yes | No | Not applicable |
|---|---|---|---|
| Average qualifying or completion rate, trended by month | X | | |
| Percent of paid completes rejected per month/project, trended by month | | X | |
| Percent of members/accounts removed/quarantined, trended by month | X | | |
| Percent of paid completes from 0-3 months tenure, trended by month | | X | |
| Percent of paid completes from smartphones, trended by month | X | | |
| Percent of paid completes from owned/branded member relationships versus intercept participants, trended by month | | | X |
| Average number of dispositions (survey attempts, screenouts, and completes) per member, trended by month (potentially by cohort) | X | | |
| Average number of paid completes per member, trended by month (potentially by cohort) | X | | |
| Active unique participants in the last 30 days | X | | |
| Active unique 18-24 male participants in the last 30 days | X | | |
| Maximum feasibility in a specific country with nat rep quotas, seven days in field, 100% incidence, 10-minute interview | X | | |
| Percent of quotas that reached full quota at time of delivery, trended by month | | X | |