ATTPS

(Achieving The Trust Paradigm Shift)

ATTPS was a 40-month project, initiated by TDL and co-funded by the European Commission, that finished in October 2015 and involved a majority of TDL member organisations in its 12-partner consortium. Trust is an essential prerequisite for effective digital transactions. It builds on elements such as security, privacy, transparency, accountability and reputation. ATTPS supported TDL in addressing the relationship and balance between the business, legal, social and technical aspects of a public trust platform, as well as pragmatic actions such as developing and testing generic trust architectures and integration pilots. ATTPS strengthened TDL’s road-mapping by implementing and supporting the SRA, and actively contributed to raising awareness of trustworthy ICT solutions.

Priority was given to stimulating and organising the interplay between technology development and legal, social and economic research through multi-disciplinary research communities; to promoting standards, certification and best practices; and to coordinating national RTD activities.

Generic Trust Architecture

As part of the EC-funded ATTPS project, TDL developed a generic trust architecture that defines requirements, functionalities and a set of building blocks and core components to deliver the main targeted functionalities for trustworthy services, including mobile service and platform integrity, trusted stack and data life cycle management. Each software component in the overall architecture corresponds to at least one, and sometimes more, of the building blocks.

Projects can use the architecture as a reference to comply with, to extend further and to contribute to: from a research perspective, to advance and/or develop enabling technologies, and from a development perspective, to deliver components that are highly desired but so far reported as missing or as offering only a limited set of features. This calls for components to be interoperable, packaged, documented and easy to use, which allows concrete feedback to be gathered from usage, these components to be further improved and/or new components to be developed.

Core Components


Access Control: enables the management of specific permissions and policies for resources, allowing different access levels for users. XACML defines a language to express, administer and manage access control policies and defines the following components: Policy Enforcement Point (PEP), Policy Decision Point (PDP), Policy Information Point (PIP), Policy Retrieval Point (PRP) and Policy Administration Point (PAP). Access control is always complex to implement, especially for non-security developers, because it involves advanced security concepts (identity-based, RBAC, ABAC, etc.). Because it is most of the time embedded in application code, it is highly recommended to externalize the authorization logic to make it easier to maintain, evolve and integrate with external services that provide extra authorization attributes.
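The snippet below is a minimal sketch of that externalised, XACML-style flow. The policy structure, attribute names and in-memory stores are illustrative assumptions, not the actual ATTPS components.

```python
# Minimal illustration of externalised, XACML-style authorization.

# PRP: policies are kept outside the application code.
POLICIES = [
    {"resource": "medical-record", "action": "read",
     "condition": lambda attrs: attrs.get("role") == "doctor"},
    {"resource": "medical-record", "action": "read",
     "condition": lambda attrs: attrs.get("role") == "patient"
                                and attrs.get("owner") == attrs.get("subject")},
]

# PIP: supplies extra subject attributes (e.g. from a directory service).
def pip_lookup(subject):
    directory = {"alice": {"role": "doctor"}, "bob": {"role": "patient"}}
    return directory.get(subject, {})

# PDP: evaluates a request against the policies and returns Permit/Deny.
def pdp_decide(subject, action, resource, context):
    attrs = {"subject": subject, **pip_lookup(subject), **context}
    for policy in POLICIES:
        if policy["resource"] == resource and policy["action"] == action:
            if policy["condition"](attrs):
                return "Permit"
    return "Deny"

# PEP: the application only enforces the decision; it contains no rules.
if __name__ == "__main__":
    print(pdp_decide("alice", "read", "medical-record", {}))              # Permit
    print(pdp_decide("bob", "read", "medical-record", {"owner": "bob"}))  # Permit
    print(pdp_decide("bob", "read", "medical-record", {"owner": "eve"}))  # Deny
```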

Identity Management: covers a number of aspects involving users' access to networks, services and applications, including secure and private authentication from users to devices, networks and services, authorization and trust management, user profile management, privacy-preserving disposition of personal data, Single Sign-On (SSO) to service domains and Identity Federation towards applications. The Identity Manager is the central component that provides a bridge between IdM systems at connectivity level and application level. Furthermore, Identity Management is used for authorizing foreign services to access personal data stored in a secure environment. Here the owner of the data must usually give consent before the data can be accessed, and the consent-giving procedure itself implies a certain level of user authentication.
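As one illustration of the application-level side of this, the sketch below validates a federated SSO token (an OIDC-style ID token) using the third-party PyJWT package; the issuer, audience and key are hypothetical placeholders, not ATTPS values.

```python
# Sketch: an application-level Identity Manager verifying a federated SSO token.
import jwt  # pip install PyJWT

def validate_id_token(token, idp_public_key):
    """Return verified identity claims, or raise if the token is invalid."""
    claims = jwt.decode(
        token,
        idp_public_key,
        algorithms=["RS256"],
        audience="https://example-service",   # hypothetical relying party
        issuer="https://idp.example.org",     # hypothetical identity provider
    )
    # After verification, the claims (subject, e-mail, etc.) can be mapped
    # to an application-level identity and to authorization attributes.
    return claims
```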

Security Monitoring: active observation of the security state of an ICT system, detecting potential attacks or intrusions.
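A toy illustration of such monitoring is sketched below: it flags a source address that produces a burst of failed logins within a short window. The thresholds and event format are arbitrary examples.

```python
# Toy security-monitoring sketch: detect bursts of failed logins per source IP.
from collections import defaultdict
from datetime import datetime, timedelta

WINDOW = timedelta(minutes=5)
THRESHOLD = 5

def detect_bruteforce(events):
    """events: iterable of (timestamp, source_ip, success) tuples."""
    failures = defaultdict(list)
    alerts = []
    for ts, ip, success in sorted(events):
        if success:
            continue
        # Keep only failures inside the sliding window, then add this one.
        failures[ip] = [t for t in failures[ip] if ts - t <= WINDOW] + [ts]
        if len(failures[ip]) >= THRESHOLD:
            alerts.append((ip, ts))
    return alerts

if __name__ == "__main__":
    now = datetime.now()
    evs = [(now + timedelta(seconds=i), "10.0.0.1", False) for i in range(6)]
    print(detect_bruteforce(evs))  # alerts for 10.0.0.1
```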

Trustworthy Data Storage: guarantees that the original data is not altered. It relies on techniques such as automatic data encryption with secure key management.
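A minimal sketch of encryption at rest with integrity protection is shown below, using the third-party "cryptography" package. Key management is reduced to a single in-memory key here; a real deployment would use an HSM or key-management service.

```python
# Sketch: authenticated encryption at rest. Any alteration of the stored
# ciphertext is detected on decryption.
from cryptography.fernet import Fernet, InvalidToken

key = Fernet.generate_key()   # in practice: fetched from a key manager
fernet = Fernet(key)

stored = fernet.encrypt(b"patient record #42")   # what goes to storage

try:
    original = fernet.decrypt(stored)            # integrity verified here
    print(original)
except InvalidToken:
    print("stored data was altered or the key is wrong")
```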

Trustworthy Data Processing: guarantees that the hardware and software used are in fact processing the data in the way they are supposed to, and as such offers the mechanisms to prove that the data is processed in a trustworthy manner. It can rely either on trust outsourced to the data-processing entity or, alternatively, on computation performed only on encrypted data, which can be achieved, for example, by homomorphic encryption or multi-party computation.
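As a hedged illustration of computing on encrypted data, the sketch below uses additively homomorphic (Paillier) encryption via the third-party "phe" (python-paillier) package; the values are arbitrary examples.

```python
# Sketch: the processing entity adds two values without ever seeing them
# in the clear; only the data owner can decrypt the result.
from phe import paillier

public_key, private_key = paillier.generate_paillier_keypair()

# Data owner encrypts the inputs.
enc_a = public_key.encrypt(17)
enc_b = public_key.encrypt(25)

# Untrusted processor works only on ciphertexts.
enc_sum = enc_a + enc_b

# Only the data owner can decrypt the result.
print(private_key.decrypt(enc_sum))   # 42
```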

Data Privacy: terms and conditions should be explained to the end user in simple terms so that they understand how their privacy is safeguarded and how, where and for what purposes their personal information is used.

Consent: obtaining explicit permission from a user for a specific request to share personal attributes, formulated in a clear and easy-to-understand way, will improve trust in the system. This way, the user can be sure that he/she is in control of the data, and attribute release is additionally minimised.
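The sketch below shows one possible way to implement consent-scoped attribute release: only attributes the user has explicitly consented to share with a given service are released, which also minimises attribute release. The data model is illustrative only.

```python
# Sketch: release only the attributes that were both requested and consented to.
USER_ATTRIBUTES = {
    "name": "Alice Example",
    "email": "alice@example.org",
    "date_of_birth": "1980-01-01",
    "address": "1 Example Street",
}

# Consent records: per requesting service, the attributes the user agreed to share.
CONSENTS = {
    "webshop.example": {"name", "email"},
}

def release_attributes(service, requested):
    consented = CONSENTS.get(service, set())
    return {k: v for k, v in USER_ATTRIBUTES.items() if k in requested & consented}

print(release_attributes("webshop.example", {"name", "email", "address"}))
# {'name': 'Alice Example', 'email': 'alice@example.org'}  -- address withheld
```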

E-signing: a scheme for demonstrating the authenticity of a digital message or document, giving the recipient reason to believe that it was created by a known sender and was not altered afterwards.
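A minimal sketch of such a scheme, using Ed25519 signatures from the third-party "cryptography" package, is shown below; the keys and message are illustrative.

```python
# Sketch: the signature proves the message came from the holder of the
# private key and was not altered.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

private_key = Ed25519PrivateKey.generate()
public_key = private_key.public_key()

document = b"I agree to the terms of service."
signature = private_key.sign(document)

try:
    public_key.verify(signature, document)       # raises if tampered with
    print("signature valid")
except InvalidSignature:
    print("document or signature was altered")
```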

Off-boarding: for the end user, de-provisioning of credentials and, for the service provider, revoking of access privileges when the user is no longer part of the system due to a change of role or subscription status.
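A toy sketch of the off-boarding step is given below; the in-memory stores stand in for a real credential directory, role registry and session store.

```python
# Sketch: de-provision credentials, revoke privileges and end active sessions.
credentials = {"bob": "hashed-password"}
role_assignments = {"bob": {"editor", "reviewer"}}
active_sessions = {"bob": ["session-123", "session-456"]}

def offboard(user):
    credentials.pop(user, None)            # de-provision credentials
    role_assignments.pop(user, None)       # revoke access privileges
    for session in active_sessions.pop(user, []):
        print(f"revoked {session}")        # invalidate existing sessions

offboard("bob")
```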

On-boarding: obtaining correct user credentials to enrol into the service in an easy and secure manner, leveraging the availability of existing user information to simplify the on-boarding process. For example, some of the fields in a registration form can be pre-filled with the user’s existing account information from a social log-in. In addition, leveraging online user verification services will significantly improve the user on-boarding process. This can be achieved by having attribute exchange services: for example, a service provider can verify a specific attribute of a user, or an attribute from a device can trigger a fraud alarm. Having such a service in place will improve trust in the system. The process of identity proofing and validation defines the level of assurance of the user credential, which is a significant factor in the trust architecture.
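The sketch below illustrates this idea: registration fields are pre-filled from an existing (e.g. social log-in) profile, and an external attribute-verification call is represented by a placeholder; both data sources are hypothetical.

```python
# Sketch: simplified on-boarding with pre-filled fields and attribute verification.
REGISTRATION_FIELDS = ["name", "email", "phone"]

def prefill_registration(social_profile):
    """Copy whatever the user's existing account already provides."""
    return {f: social_profile.get(f, "") for f in REGISTRATION_FIELDS}

def verify_attribute(attribute, value):
    """Placeholder for an external attribute-verification service."""
    trusted_records = {"email": {"alice@example.org"}}
    return value in trusted_records.get(attribute, set())

profile = {"name": "Alice Example", "email": "alice@example.org"}
form = prefill_registration(profile)
form["email_verified"] = verify_attribute("email", form["email"])
print(form)
```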

Strong Authentication: a strong multi-factor authentication mechanism establishes the validity of a claimed identity or attribute and enhances the security and trustworthiness of the platform.
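As a hedged example, the sketch below adds a second factor based on time-based one-time passwords (TOTP), using the third-party "pyotp" package; the password check is reduced to a placeholder and the shared secret is generated ad hoc.

```python
# Sketch: both factors must succeed (something you know + something you have).
import pyotp

shared_secret = pyotp.random_base32()      # provisioned to the user's authenticator app
totp = pyotp.TOTP(shared_secret)

def authenticate(password_ok, otp_code):
    return password_ok and totp.verify(otp_code)

print(authenticate(True, totp.now()))      # True  (valid code)
print(authenticate(True, "000000"))        # almost surely False (wrong code)
```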

Trustworthy Infrastructure / Cloud: an additional set of components on top of the basic infrastructure/cloud, working together to enable key features such as users trusting that their virtual machines are deployed on computing nodes that satisfy their integrity requirements. This also calls for enhancing existing cloud computing software platforms, such as the free and open-source OpenStack platform, in other dimensions like confidentiality, audit, isolation and resilience. Other aspects, such as verifiable integrity of the remote resources, protection against insider attacks, trustworthy isolation of virtual computing, storage and networking resources, as well as pervasive information flow, should also be covered and enabled.
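A toy sketch of integrity-aware VM placement follows: a virtual machine is only scheduled on compute nodes whose attested measurements match known-good values. The measurement registry and node data are illustrative; a real system would rely on TPM-based remote attestation.

```python
# Sketch: schedule VMs only on nodes that pass an integrity (attestation) check.
KNOWN_GOOD_MEASUREMENTS = {"sha256:aaa111", "sha256:bbb222"}

NODES = {
    "node-1": {"measurement": "sha256:aaa111", "free_cpus": 8},
    "node-2": {"measurement": "sha256:deadbeef", "free_cpus": 16},  # fails attestation
}

def pick_trusted_node(required_cpus):
    for name, node in NODES.items():
        trusted = node["measurement"] in KNOWN_GOOD_MEASUREMENTS
        if trusted and node["free_cpus"] >= required_cpus:
            return name
    return None

print(pick_trusted_node(4))   # node-1: node-2 has more capacity but is not trusted
```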

Trustworthy Factory: focuses on processes offered to stakeholders (designers, developers, customers), but also on control mechanisms that assure customers that the hardware and/or software delivered includes only what they initially asked for (and can be certified as such).
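One possible control mechanism of this kind is sketched below: the delivered software is checked against an approved manifest of component hashes, so that nothing beyond what the customer ordered is included. The file names and hash values are illustrative.

```python
# Sketch: verify a delivery against an approved manifest of component hashes.
import hashlib
from pathlib import Path

APPROVED_MANIFEST = {
    "app.bin": "2c26b46b68ffc68ff99b453c1d30413413422d706483bfa0f98a5e886266e7ae",
}

def verify_delivery(directory):
    delivered = {p.name: hashlib.sha256(p.read_bytes()).hexdigest()
                 for p in Path(directory).iterdir() if p.is_file()}
    # Reject both altered components and anything that was not ordered.
    return delivered == APPROVED_MANIFEST

# verify_delivery("./delivery")  # True only if the contents match the manifest exactly
```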