What is your definition of a digital service?
The Digital Trust Label denotes the trustworthiness of a digital service. Our definition of digital service is aligned with the official definition by the European Commission. Digital services include a large category of online services, from simple websites to internet infrastructure services and online platforms. Digital services come in many forms and are omnipresent. We are expecting certifiable services in three categories:
- Low complexity services (e.g. newsletter subscription)
- Medium complexity services (e.g. instant messaging apps)
- High complexity services (e.g. Blockchain infrastructure and banking services)
Not every digital service needs a label; the label is especially relevant for applications where sensitive data are shared and/or decisions are taken by an algorithm.
What services should the Digital Trust Label cover?
The Digital Trust Label is especially suitable for digital services that involve sensitive data and where automated decisions are taken with significant consequences. Industries that are of particular relevance for the label are healthcare, public, media, HR, education, as well as banking & insurance. The label primarily targets B2C services.
What do I have to do to get the label?
Step 1) Check out our label criteria catalogue and obtain a first overview of the label audit process, including costs. The label generally costs between CHF 22’000 and 45’000, depending heavily on the complexity of the service to be audited.
Step 2) Internally identify a suitable digital service matching the criteria catalogue. A B2C service with a broad user impact and/or an automated decision-making process is recommended.
Step 3) Contact the DTL Project Manager, Diana (email@example.com) to set up a scoping call with the auditor (SGS) to discuss your chosen digital service and plan the next steps of the auditing process.
Is a code review part of the audit?
No code review will be conducted as part of the audit. However, code reviews conducted by well-recognized third-party providers can be accepted as evidence. If no such documentation exists, “source code spot checks” on snippets of the code can be conducted as part of the audit.
How do you ensure the safety of a digital service as part of the audit?
The security of a digital service is primarily assessed within the categories of security, data protection and reliability. As part of the audit, checks are performed on, for example, state-of-the-art password protection, best practices for user authentication, and processes for defending against cyber-attacks. Penetration tests, monitoring, coding standards, test plans and SLAs with third-party providers serve as evidence. However, it is important to highlight that 100% security can never be guaranteed.
How do you assess criteria on AI / Fair User Interaction?
The check for the fair user criteria focuses on fair and unrestricted access to a digital service across all populations, genders, etc. In short, no person may be systematically disadvantaged by or excluded from the digital service. This is assessed by analyzing, for example, the decision flow of the digital service, user tests already conducted, analyses or certifications from credible third parties, or a live demonstration of the service during the audit.
Will the label become obsolete once new regulations are being introduced on the European Level?
The Digital Trust Label clearly signals to users that the provider of the certified digital service is willing “to go the extra mile”, going beyond what is legally required. As such, the label will not necessarily be made obsolete by new regulations, though they will probably mean significant changes to the label over time. Of course, SDI very much welcomes a standardized approach that regulates digital services and increases transparency and trust for consumers. Until standardized legislation is in place, we believe the Label can serve this important cause as a soft-law instrument, and we look forward to working together with other initiatives to advance digital trust.
How did you end up with the label catalogue criteria? Do you plan on further developing the catalogue criteria?
- A high-level Label Expert Committee has been responsible for defining the label catalogue criteria and two public consultation processes have given all interested stakeholders the possibility to give feedback and inputs in defining “trustworthy digital services”.
- Currently, our 35 defined label criteria are spread across four categories (Security, Data Protection, Reliability and Fair User Interaction). Adjustments or additions to the categories and/or criteria are possible as the label develops. We are keen to provide a challenging criteria catalogue. The label must carry a strong purpose; however, we also need to consider the trade-off between developing a challenging label and keeping it practical and affordable enough for companies to undergo an audit.
- The label is understood as an ongoing and collaborative effort for strengthening transparency, trustworthiness and understandability of digital applications.
- The release of the first version is a starting point and the Label needs to continuously develop.
Why do you focus on four criteria dimensions instead of doing one thoroughly? Is it even doable to audit this complex and diverse number of criteria?
- The criteria catalogue and the dimensions of the label are based on various studies and the work of the Label Expert Committee.
- We conducted research on the factors determining digital trust (all results published here); the findings showed that the four categories strike an appropriate balance.
- Drawing from these findings, we decided for a holistic approach for the label criteria. Four key dimensions constitute the core of the label criteria. The operationalisation of the core principles is done through precise technical, legal or administrative specifications that can be externally verified.
- The catalogue is built on existing standards, such as ISO 27001 (information security management system), ISO 22301 (business continuity management system) and the GDPR (European data protection legislation). It does not cover the Plan-Do-Check-Act (PDCA) system, but it does cover all topics directly related to the security of the product.
How long does an audit usually take?
The length of the audit depends on several parameters, such as the completeness of the evidence provided to the auditor, resource availability, and the findings of the audit itself, e.g., whether there are any non-conformities that must be resolved before the audit can proceed. We recommend planning 14 full working days for the audit, which includes gathering all relevant documents and evidence, interviews with the auditor, and any further clarifications the auditor may request.
What happens when digital service providers are violating the label, can this be reported to the SDI?
We plan to involve an independent Ombudsperson who will be in charge of taking on any potential violations and examining them on a case-by-case basis. Until then, we take feedback and reports of potential violations via the contact form.
What is the role of SDI in the auditing process?
SDI is the label owner: it defines the label criteria and the steps to obtaining the label, and makes the final label award decision. Throughout the label process, your main interactions will be with SDI and the auditor. Once the audit report of the candidate company has been received, independent experts conduct a technical review before the label certificate is granted. SDI has decided to conduct this technical review to add a further layer of trustworthiness to the auditing process and underline the credibility of the label.
What happens if not all criteria are applicable to a digital service?
Not all of the label criteria may be applicable, depending on the digital service to be audited. In such cases, the auditor will evaluate and discuss the list of applicable criteria with the organization directly. If a majority of the label criteria are not applicable to the digital service to be audited, the label may be deemed unsuitable for that service. In this case, SDI has the right to discontinue the auditing process to safeguard the credibility and standard of the label.
Which evidence will be requested during the Digital Trust Label audit?
For the first phase of the Digital Trust Label audit, primarily documents on policies and procedures describing the general implementation of the respective requirements of the Digital Trust Label standard are required.
This can include, for example, the internal Information Security Standard, Data Privacy Standard, or the risk assessment. Disaster Recovery plans or legal approvals can also be of relevance. If such documentation is not available, informal descriptions can be provided instead.
In the second phase, i.e. the actual audit, the implementation of the policies and procedures must be proven with respect to the service in scope. For this, the auditors will rely as much as possible on existing evidence, e.g., other existing audit reports (such as ISO 27001), penetration test reports, or test reports. Detailed checks and demonstrations on the live system, or checks of configurations (e.g., for servers), are also possible. For external services, for example, SLAs or contracts with suppliers can be reviewed.
What is the USP of the Digital Trust Label?
With the Label, we bring trust back into tech. By using a clear, visual, plain, and non-technical language, the Digital Trust Label denotes the trustworthiness of digital services in a way that everyone can understand.
By combining the dimensions of security, data protection, reliability and fair user interaction, the DTL takes a holistic approach when it comes to addressing the complex question of digital trust. Furthermore, the Label was developed under a multi-stakeholder approach with representatives from the private and public sector as well as civil society.
The Digital Trust Label acts as a concrete soft law instrument and is built on existing standards such as ISO and GDPR.
Therefore, the Digital Trust Label enables companies to combine renowned standards with an additional user-centered approach which is achieved primarily through the Fair User Interaction category.