TDL members exchange ideas about leading-edge technology, methodologies and services. These exchanges sometimes lead to bilateral business contacts and can also provide leads for new insights and innovation. The association has created several vehicles to fulfil these expectations: one is the sprints; another is a forum for sharing and evaluating innovative software solutions.

As part of an EC-funded project (ATTPS), TDL developed a generic trust architecture that defines requirements, functionalities and a set of building blocks and core components to deliver the main targeted functionalities for trustworthy services, including mobile service and platform integrity, trusted stack and data life-cycle management.

TDL members can offer, use and validate trustworthy elements (e.g. technology components): they can “play around” with the technology on offer and provide feedback to the element provider (i.e. the publisher). To deploy trust elements for download, publishers must provide a stand-alone service element and first-level support, and may optionally add online questionnaires for users to fill in so that further insights can be gained. Each element is evaluated by a TDL committee for final deployment approval.

The motivation for this activity comes from three complementary perspectives (technology, legal and business):

  • Technology parties can deploy their applications or concepts in a regulated and trusted environment and receive feedback on different aspects of functionality.
  • Developers can identify the bottlenecks (technical, legal, business models) that hamper adoption. Identifying these in a generic fashion makes it easier to improve their usability in trustworthy ICT solutions.
  • Element providers can gain insights that support their investment decisions and receive valuable feedback on user requirements.

The overall concept is to validate the technology, concepts and generic architecture, and thereby provide input for European-level standardisation bodies such as ITU-T SG 17, ISO/IEC JTC 1/SC 27, ETSI, ENISA and 3GPP SA3.

Access Control: enables the management of specific permissions and policies on resources, allowing different access levels for users. XACML defines a language to express, administer and manage access control policies, and defines the following components: Policy Enforcement Point (PEP), Policy Decision Point (PDP), Policy Information Point (PIP), Policy Retrieval Point (PRP) and Policy Administration Point (PAP). Access control is always complex to implement, especially for non-security developers, because it involves advanced security concepts (identity-based access control, RBAC, ABAC, etc.). Authorization logic is most often embedded in application code; it is highly recommended to externalise it, making it easier to maintain, evolve and integrate with external services that provide extra authorization attributes.
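As a minimal sketch of this externalisation, the following Python fragment separates the application call site (the PEP's role) from a PDP that evaluates attribute-based rules with XACML-style Permit/Deny effects and a first-applicable combining strategy. The attribute names and policies are purely illustrative assumptions, not part of any TDL component or the XACML wire format.

```python
# Hypothetical externalized authorization: the application (PEP) only asks
# "is this allowed?", while the rules live in the PDP, outside application code.

def pdp_decide(policies, request):
    """Return the effect of the first rule whose attributes all match the
    request (in the spirit of XACML's 'first-applicable' combining
    algorithm); default-deny if no rule applies."""
    for rule in policies:
        if all(request.get(k) == v for k, v in rule["match"].items()):
            return rule["effect"]
    return "Deny"

# Illustrative ABAC policies expressed over subject/resource/action attributes.
policies = [
    {"match": {"role": "clinician", "resource": "record", "action": "read"},
     "effect": "Permit"},
    {"match": {"resource": "record", "action": "delete"},
     "effect": "Deny"},
]

# PEP side: the application forwards request attributes and enforces the answer.
request = {"role": "clinician", "resource": "record", "action": "read"}
print(pdp_decide(policies, request))  # Permit
```

Because the rules live in one data structure outside the application, they can be changed, audited or fetched from an external PAP without touching application code.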

Identity Management: covers a number of aspects involving users’ access to networks, services and applications, including secure and private authentication from users to devices, networks and services, authorization and trust management, user profile management, privacy-preserving disposition of personal data, Single Sign-On (SSO) to service domains and Identity Federation towards applications. The Identity Manager is the central component that provides a bridge between IdM systems at connectivity level and application level. Furthermore, Identity Management is used to authorize foreign services to access personal data stored in a secure environment. Usually the owner of the data must give consent before the data can be accessed; the consent-giving procedure also implies a certain level of user authentication.

Security Monitoring: active observation of the security state of an ICT system, detecting potential attacks or…

Trustworthy Data Storage: guarantees that the original data is not altered. It relies on relevant techniques, especially automatic data encryption with secure key management.
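One part of such a guarantee, detecting whether stored data has been altered, can be sketched with a keyed integrity tag. This is only a sketch: the encryption and secure key management mentioned above are assumed to be handled elsewhere, and the `seal`/`unseal` names are hypothetical.

```python
import hmac
import hashlib

def seal(key: bytes, data: bytes) -> bytes:
    """Prepend an HMAC-SHA256 tag so later alteration becomes detectable."""
    tag = hmac.new(key, data, hashlib.sha256).digest()
    return tag + data

def unseal(key: bytes, blob: bytes) -> bytes:
    """Verify the tag before returning the data; raise if it was altered."""
    tag, data = blob[:32], blob[32:]
    expected = hmac.new(key, data, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):  # constant-time comparison
        raise ValueError("stored data was altered")
    return data

key = b"demo-key-managed-elsewhere"  # in practice, from secure key management
blob = seal(key, b"original data")
print(unseal(key, blob))  # b'original data'
```

Any modification of the stored blob, whether of the tag or the payload, makes `unseal` raise instead of silently returning corrupted data.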

Trustworthy Data Processing: guarantees that the hardware and software in use actually process the data in the way they are supposed to, and as such offers mechanisms to prove that the data is processed in a trustworthy manner. It can rely either on trust outsourced to the data processing entity or on computation over encrypted data only, which can be achieved, for example, by homomorphic encryption or multi-party computation.
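Computation over encrypted data can be illustrated with a toy Paillier cryptosystem, which is additively homomorphic: multiplying two ciphertexts yields an encryption of the sum of the plaintexts, so a processor can add values it never sees in the clear. The sketch below uses deliberately tiny primes and is not secure; it only demonstrates the principle (Python 3.8+ is assumed for `pow(x, -1, n)`).

```python
import random
from math import gcd

# Toy parameters -- real Paillier keys use primes of ~1024 bits or more.
p, q = 17, 19
n, n2 = p * q, (p * q) ** 2
g = n + 1                                       # standard choice of generator
lam = (p - 1) * (q - 1) // gcd(p - 1, q - 1)    # lcm(p - 1, q - 1)
mu = pow(lam, -1, n)                            # inverse of lambda mod n

def encrypt(m: int) -> int:
    """E(m) = g^m * r^n mod n^2, for a random r coprime to n."""
    while True:
        r = random.randrange(2, n)
        if gcd(r, n) == 1:
            break
    return (pow(g, m, n2) * pow(r, n, n2)) % n2

def decrypt(c: int) -> int:
    """D(c) = L(c^lambda mod n^2) * mu mod n, where L(x) = (x - 1) / n."""
    return (pow(c, lam, n2) - 1) // n * mu % n

# Homomorphic addition: the product of two ciphertexts decrypts to 5 + 7.
c = (encrypt(5) * encrypt(7)) % n2
print(decrypt(c))  # 12
```

The processing entity here only ever manipulates ciphertexts; correctness of the result is guaranteed by the scheme rather than by trusting the processor.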

Data Privacy: terms and conditions should be presented to the end user in simple terms, so that they understand how their privacy is safeguarded and how and where their personal information is used.

Consent: obtaining explicit permission from a user for a specific request to share personal attributes, formulated in a clear and easy-to-understand way, improves trust in the system. The user can then be sure that he/she is in control of the data, and attribute release is additionally minimised.
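The attribute-minimisation side of this can be sketched as a filter that releases only those attributes a user has explicitly consented to share with a given service. All names and data below are hypothetical illustrations, not part of any TDL component.

```python
# Hypothetical per-service consent register: (user, service) -> allowed attributes.
consents = {
    ("alice", "webshop"): {"email"},             # Alice shares only her e-mail
    ("alice", "bank"): {"email", "birthdate"},   # and somewhat more with her bank
}

# Hypothetical stored user profile.
profile = {"email": "alice@example.org",
           "birthdate": "1990-01-01",
           "address": "Main Street 1"}

def release_attributes(user: str, service: str, requested: set) -> dict:
    """Release only attributes that are both requested AND consented to,
    so the user stays in control and attribute release is minimised."""
    allowed = consents.get((user, service), set())
    return {k: v for k, v in profile.items() if k in requested and k in allowed}

# The webshop asks for more than it is entitled to; only e-mail is released.
print(release_attributes("alice", "webshop", {"email", "address"}))
# {'email': 'alice@example.org'}
```

A service with no recorded consent receives nothing at all, which is the safe default.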

E-signing: a scheme for demonstrating the authenticity of a digital message or document, indicating it has…
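The hash-then-sign idea behind such schemes can be sketched with textbook RSA over deliberately small numbers. This is a toy: no padding, a tiny key, and the digest reduced mod n; a real e-signing service would use a vetted library and a standardised scheme such as RSA-PSS or ECDSA.

```python
import hashlib

# Classic textbook RSA toy key: n = 61 * 53, e = 17, d = e^-1 mod phi(n).
n, e, d = 3233, 17, 2753

def digest(msg: bytes) -> int:
    # Reduce the SHA-256 digest mod n -- acceptable only in a toy example.
    return int.from_bytes(hashlib.sha256(msg).digest(), "big") % n

def sign(msg: bytes) -> int:
    """Sign the message digest with the private exponent."""
    return pow(digest(msg), d, n)

def verify(msg: bytes, sig: int) -> bool:
    """Recover the digest with the public exponent and compare."""
    return pow(sig, e, n) == digest(msg)

sig = sign(b"I approve this transaction")
print(verify(b"I approve this transaction", sig))  # True
# Any altered message fails verification (with overwhelming probability):
print(verify(b"I approve this transaction!", sig))
```

The signature demonstrates both who signed (only the private-key holder could produce it) and that the signed content was not modified afterwards.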

Off-boarding: for the end user, de-provisioning of credentials and, for the service provider, revocation of access privileges when the user is no longer part of the system, for example due to a change of role or subscription status.

On-boarding: obtaining correct user credentials to enrol in the service in an easy and secure manner, leveraging existing user information to simplify the on-boarding process. For example, some fields in a registration form can be pre-filled with the user’s existing account information from a social log-in. In addition, leveraging online user verification services significantly improves the on-boarding process; this can be achieved through attribute exchange services. For example, a service provider can verify a specific attribute of a user, or an attribute from a device can trigger a fraud alarm. Having such a service in place clearly improves trust in the system. The process of identity proofing and validation defines the level of assurance of the user credential, which is a significant factor in the trust architecture.

Strong Authentication: a strong multi-factor authentication mechanism establishes the validity of an attribute or a single piece of data and enhances the security and trust of the platform.
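A common building block for one of the factors (“something you have”) is the HOTP/TOTP one-time password of RFC 4226/6238, which can be implemented with the Python standard library alone. This sketch illustrates the mechanism only; it is not a complete authentication service.

```python
import hmac
import hashlib
import struct
import time

def hotp(key: bytes, counter: int, digits: int = 6) -> str:
    """RFC 4226 HOTP: HMAC-SHA1 over the counter, dynamically truncated."""
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F                     # dynamic truncation offset
    code = int.from_bytes(mac[offset:offset + 4], "big") & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def totp(key: bytes, step: int = 30, digits: int = 6) -> str:
    """RFC 6238 TOTP: HOTP over the current 30-second time window."""
    return hotp(key, int(time.time()) // step, digits)

# RFC 4226 Appendix D test vector: ASCII key "12345678901234567890".
print(hotp(b"12345678901234567890", 0))  # 755224
```

Combined with a password (“something you know”), the time-based code gives the second, independent factor that makes the authentication strong.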

Trustworthy Infrastructure / Cloud: an additional set of components on top of the basic infrastructure/cloud, working together to enable key features such as users trusting that their virtual machines are deployed on computing nodes that satisfy their integrity requirements. This also calls for enhancing existing cloud computing platforms, such as the free and open-source OpenStack, in further dimensions such as confidentiality, audit, isolation and resilience. Other aspects, such as verifiable integrity of remote resources, protection against insider attacks, trustworthy isolation of virtual computing, storage and networking resources, and pervasive information flow, should also be covered and enabled.

Trustworthy Factory: focuses on processes offered to stakeholders (designers, developers, customers), but also on control mechanisms that assure customers that the delivered hardware and/or software includes only what they initially asked for (and can be certified as such).