4.1.2 Interactions between Cloud Actors Related to Accounts
First we consider project framing (to set out the general context in which accounts are produced in areas of focus for the A4Cloud project) and then the process of generating and verifying an account.
4.1.2.1 Project framing
As discussed further within A4Cloud deliverable D:32.1 [1], a cloud actor (accountor) is accountable to certain other cloud actors (accountees) within a cloud ecosystem for:
- Norms: the obligations and permissions that define data practices; these can be expressed in policies and they derive from law, contracts and ethics.
- Behaviour: the actual data processing behaviour of an organisation.
- Compliance: the comparison of an organisation's actual behaviour with the norms.
For the project scope, the accountors are cloud actors that are organisations (or individuals with certain responsibilities within those) acting as data stewards for other people's personal and/or confidential data. The accountees are other cloud actors, which may include private accountability agents, consumer organisations, the public at large and entities involved in governance.
Contracts express legal obligations and business considerations. Policies may also express business considerations that do not end up in contracts. Enterprise policies are one way in which norms are expressed, and are influenced by the regulatory environment, stakeholder expectations and the business appetite for risk. By exposing, via an account, the norms it subscribes to and what it actually does, the accountor enables an external agent to check compliance.
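The norms/behaviour/compliance relationship described above can be sketched in code. The following is a minimal, purely illustrative sketch (all names and the example norm are hypothetical, not part of any A4Cloud specification): norms are modelled as predicates over observed behaviour records, and compliance checking is the comparison of the two.

```python
# Illustrative sketch only: norms as predicates over behaviour records.
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class BehaviourRecord:
    """One observed data-processing action by the accountor."""
    action: str          # e.g. "store", "transfer"
    data_category: str   # e.g. "personal", "confidential"
    location: str        # e.g. "EU", "US"

# A norm permits or forbids a behaviour record.
Norm = Callable[[BehaviourRecord], bool]

def no_personal_data_outside_eu(record: BehaviourRecord) -> bool:
    """Example (hypothetical) norm derived from a contractual obligation."""
    return not (record.data_category == "personal" and record.location != "EU")

def check_compliance(norms: List[Norm],
                     behaviour: List[BehaviourRecord]) -> List[BehaviourRecord]:
    """Return the behaviour records that violate at least one norm."""
    return [r for r in behaviour if not all(n(r) for n in norms)]

behaviour = [
    BehaviourRecord("store", "personal", "EU"),
    BehaviourRecord("transfer", "personal", "US"),
]
violations = check_compliance([no_personal_data_outside_eu], behaviour)
print([r.action for r in violations])  # prints ['transfer']
```

In this sketch, the account would expose both the norms (the predicates) and the behaviour records, so that an external agent can perform the comparison itself.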
4.1.2.2 Accounts shown to different data protection roles
Generally speaking, the sort of information that an organisation needs to measure and demonstrate in such an account includes: policies; executive oversight; staffing and delegation; education and awareness; ongoing risk assessment and mitigation; programme risk assessment oversight and validation; event management and compliance handling; internal enforcement; and redress [22] [12]. Existing organisational documents can often be used to support this analysis [13]. Measurement of what has been achieved needs to be carried out in conjunction with the organisation and with the external agents that judge it (which will depend upon the circumstances), as well as with other entities that may need to be notified. Some examples of accounts that may be provided to cloud actors fulfilling certain data protection roles in a given context are shown in Table 16.
| Type of Account | Data Protection Roles | Example Cloud Actor producing the Account |
| --- | --- | --- |
| Account for self-certification/verification | Data Controller (DC), for Data Protection Authorities (DPAs) and their customers | Organisational Cloud Customer |
| Periodic internal reviews (to check that mechanisms are operating as needed and update if required) | DC or Data Processor (DP), for themselves or auditors | Organisational Cloud Customer, Cloud Provider |
| Evidence provided by risk analysis, PIAs and DPIAs (including assessment along the CSP chain and how this was acted upon) | DC, for DPAs and their customers | Organisational Cloud Customer |
| External certification, e.g. BCRs, CBPRs, CSA OCF level 3, privacy seals, accountability certifications, security certifications | DC or DP, for certification bodies (evidence for certification) or for customers (evidence of certification) | Organisational Cloud Customer, Cloud Provider |
| External audit (ongoing) | DC or DP, for auditors (evidence) or customers (audit output) | Organisational Cloud Customer, Cloud Provider |
| Verification by accountability agents | DC to agent, output to DPA | Organisational Cloud Customer |
| Evidence about fault if data breach | DC to DS, DC to DPA, DP to DC, DP to DP | Organisational Cloud Customer, Cloud Provider |
Table 16: Accounts provided by whom to whom and in what circumstances.
4.1.2.3 Verification of accounts
It is not just a question of interaction between actors in the provision of accounts, but also in the verification of accounts. Verification methods may differ across the different forms of account in the cloud, as considered further below. As briefly mentioned in section 3, the company Nymity [13] has provided an example structure for evidence, and an associated scoring mechanism for accountability based on existing documentation, that can form some of these types of accounts; however, some organisations may want to take a different approach, so this should not be regarded as a standard. The Nymity accountability evidence framework is intended for collecting evidence in a single organisation and for demonstrating accountability, and is structured around 13 privacy management processes [13].
There are different levels of verification for accountability, as proposed by Bennett [23], which correspond to policies (the level at which most seals programmes operate), practices and operations. Carrying out verification at only the first of these levels is very weak; instead, mechanisms should be provided that allow verification across all levels. Most privacy seal programmes just analyse the wording of privacy policies without looking at the other levels, and thus provide verification only at this first level (of policies). The second level relates to internal mechanisms and procedures, and verification here can determine whether the key elements of a privacy management framework are in place within an organisation. Few organisations, however, currently subject themselves to a verification of practices, which would prove whether or not the organisational policies really work and whether privacy is protected in the operational environment. To do this, it seems necessary to involve regular privacy auditing, which may need to be external and independent in some cases.
In terms of the verification process, there are various options for how this may be achieved. There could, for example, be a push model, in which the account is produced proactively by organisations, or a pull model driven from the regulatory side; the production of accounts could be continuous, periodic or triggered by events such as breaches. In general, there should be spot checking by enforcement agencies (properly resourced and with the appropriate authority) that comprehensive programmes are in place in an organisation to meet the objectives of data protection. There could in some cases be certification based on verification, to allow organisations greater flexibility in meeting their goals.
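The production models just named (push vs. pull; continuous, periodic or event-triggered) can be distinguished in a small decision sketch. This is purely illustrative; the enumeration and function names are assumptions, not part of the deliverable's terminology.

```python
# Illustrative sketch of account production models; all names hypothetical.
from enum import Enum

class ProductionModel(Enum):
    PUSH = "accountor publishes accounts proactively"
    PULL = "accountee (e.g. a regulator) requests accounts"

class Trigger(Enum):
    CONTINUOUS = "account updated as evidence is generated"
    PERIODIC = "account produced on a fixed schedule"
    EVENT = "account produced on an event such as a breach"

def should_produce_account(model, trigger, *, requested=False,
                           event_occurred=False, period_elapsed=False):
    """Decide whether an account is due under a given model and trigger."""
    if model is ProductionModel.PULL and not requested:
        return False  # pull model: nothing is due until the accountee asks
    if trigger is Trigger.CONTINUOUS:
        return True
    if trigger is Trigger.PERIODIC:
        return period_elapsed
    return event_occurred  # Trigger.EVENT

# A breach under a push/event-triggered regime requires an account:
print(should_produce_account(ProductionModel.PUSH, Trigger.EVENT,
                             event_occurred=True))  # prints True
```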
It is often regarded as underpinning an accountability-based approach that organisations should be allowed greater control over the practical aspects of compliance with data protection obligations, in return for an additional obligation to prove that they have put privacy principles into effect (see for example [22]). Hence, the whole approach relies on the accuracy of the demonstration itself. If that is weakened into a mere tick-box exercise, weak self-certification and/or connivance with an accountability agent that is not properly checking what the organisation is actually doing, then the overall effect could in some cases be very harmful in terms of privacy protection. As Bennett points out ([23], p. 45), due to resource issues regulators will need to rely upon surrogates, including private sector agents, to be agents of accountability, and it is important within this process that they are able to have a strong influence over the acceptability of different third party accountability mechanisms.
In particular, it is important that the verification is carried out by a trusted body that does not collude with the accountor, and that this body is given sufficient resources to carry out the checking. There must also be enough business incentive (for example, via large fines) that organisations wish to provide appropriate evidence to this body, and indeed to implement the right mechanisms in the first place.
The overall process around verification of an account is summarised within Figure 14.
First of all, there is a certain context in which the 'start' - labelled (1) in Figure 14 - would apply, in other words the context in which an organisation might need to give an account, or might wish to do this voluntarily. Broadly speaking, these situations requiring or involving production of an account may be characterised as follows:
- Regulatory obligation: The most typical situation where there is a legal obligation to produce an account is where governmental bodies or regulatory agencies enforce rights or obligations, by means of an investigation, a request for information or a spot check by a Data Protection Authority (DPA).
- Contractual undertaking: A legal obligation could instead come from the organisation itself, for instance from a contractual obligation to give an account. The cloud service provider may have undertaken, in its terms of service or in an SLA, to provide an account (for example, a data breach notification procedure) or to demonstrate compliance in some way. Another situation may be that the Cloud Service Provider (CSP) has undertaken to obtain third party certification for compliance or for some process, and so is required by the third party to give an account of certain processes in order to obtain certification.
- Voluntary undertaking to give an account: The CSP may simply state (in a policy published on its website, for example) that it will provide an account in certain circumstances, or make best efforts to do so. Many policies published in this way are not legally binding or may not be incorporated into the contract between the CSP and the customer, so the CSP can refuse to give the account, or may claim that it cannot do so and has made a 'best effort'.
Next, supposing this context is in place, the organisation (as accountor) is supposed to give an account not only of its actions, but also of its results and intentions, to the accountee - cf. (2) in Figure 14. Exactly what must be provided will vary according to the context; for example, specific information will be expected in the case of an accountor wishing to be certified.
If an organisation gives no account in the first place, there should be repercussions, which might include the obligation to give a refined account, defined according to the accountee's or assessor's needs - cf. (3) in Figure 14. For example, in the case of regulatory requests, the consequences could be fines. In the case of contractual undertakings, failure to produce an account would be a breach of contract that entitles the customer to damages or service credits (for breach of SLA), or gives the customer a right to terminate the contract without notice. Failure to produce an account needed for a third party certification of compliance would mean that the CSP could not obtain the certification. This may have direct legal consequences for the relationship between the CSP and its customer (depending on whether this was a condition of the contract), because the customer may decide to terminate or not to renew the contract. In the case of a voluntary undertaking, although there would be no legal redress for the customers, the consequences of refusal to give an account may include damage to the CSP's reputation by disgruntled customers.
If the organisation does provide an account, this can result in one or more documents being provided, or information being captured by other means, as the account provided by the organisation could be written or oral - cf. (4) in Figure 14. For further information, see for example [27], which expands upon real life cases in which multiple accounts can be created by a Data Controller for presentation to a regulator.
The accountee then assesses the account (5), potentially making reference to additional information (6). The level of satisfaction with the account is gauged (7): the account may be judged to show that the organisation is compliant (if appropriate), or to provide a satisfactory explanation of a data breach event. On the other hand, the accountee may judge the organisation not to be compliant (and hence, for example, not issue a certificate of compliance) (9), or may wish to have additional information about the event. Especially in the case of a data protection report, the accountee probably requires more than just information: clarification, explanation, updating and, most probably, corrective action. Hence, even if the account process is complete in the sense that the accountee accepts the account as accurate and is satisfied with it, the account may nevertheless show that some action or omission has caused, and is causing, harm that needs additional action. For this reason, the ‘End of account’ (10) may only be the start of another process, even if the accountee is satisfied with the account. ‘Next steps based on account’ reflects that this process may follow; it could include, for example, remediation, actions based on the account, or further investigation. After all, an account of a breach should contain something about ongoing corrective action.
Accountability agents or other third parties could be used to provide verification of accounts, and serve as an intermediary to the ultimate accountees, some of whom may impose sanctions (8). If, as considered within D:C-2.1, there is a good trust relationship between such an agent and the accountee, then the agent's account is likely to be directly accepted by the other accountees.
The account process is taken to finish (10) if either an account has been provided that is found satisfactory by the accountee (or an agent acting on its behalf), or the account is not found adequate and appropriate actions are taken by the accountee against the accountor. However, this notion of finishing is too coarse-grained, as discussed above. Furthermore, accountability is not a binary state, but has a certain level of maturity; correspondingly, accounts have a certain effectiveness and appropriateness. Depending on the maturity, an accountee may or may not be satisfied, and the maturity threshold might differ depending on the accountees or the event about which one is asked to give an account. Hence, a more mature account might be provided, or different accounts for different accountees, events, etc. This is another reason why 'End of account' is not necessarily an end state: the process might be repeated from the start with a different degree of maturity or threshold.
Sanctions might be applied at several points, notably if the organisation does not provide an account in the first place (3), if it fails to respond adequately to the dialogue with the assessor, or if the assessor is not satisfied with the accounts produced (9). In fact, the word sanction, here meaning a consequence of an inadequate account or of non-provision of an account, is avoided within Figure 14: in legal terms, sanction refers to a punishment imposed by a legal or regulatory authority, for example a fine, imprisonment or penalties for disobedience, whereas we also want to include non-regulatory actions imposed by the accountee (which is perhaps the customer), such as contract termination or a contractual penalty for failure to produce a report. Such consequences or repercussions are therefore represented quite broadly in Figure 14 as actions by the accountee against the accountor.
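The steps of the Figure 14 flow discussed above can be summarised as a small state-machine sketch. This is only an illustration of the flow's logic (the numbers in comments refer to the labelled steps); the function names and return values are hypothetical, not taken from the deliverable.

```python
# Illustrative sketch of the Figure 14 account process; names hypothetical.
def account_process(give_account, assess, accountee_satisfied):
    """Run one pass of the account process and return the outcome.

    give_account()         -> the account, or None if none is given (3)
    assess(account)        -> an assessment, possibly drawing on
                              additional information (5), (6)
    accountee_satisfied(a) -> True if the assessment is judged adequate (7)
    """
    account = give_account()                 # context (1) leads to an account (2), (4)
    if account is None:
        return "actions against accountor"   # (3): fines, termination, lost certification
    assessment = assess(account)             # (5)-(6)
    if accountee_satisfied(assessment):      # (7)
        return "end of account"              # (10): possibly followed by next steps
    return "actions against accountor"       # (9): e.g. no certificate issued, sanctions (8)

outcome = account_process(
    give_account=lambda: {"breach_report": "..."},
    assess=lambda account: account,
    accountee_satisfied=lambda assessment: "breach_report" in assessment,
)
print(outcome)  # prints end of account
```

As the surrounding text stresses, "end of account" here is itself coarse-grained: a satisfactory account of a breach may still start a further process of remediation, and an unsatisfactory one may lead to the process being repeated with a refined account.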
The process of providing an account could be quite complex, and this is just a generic overview of that process. There could be multiple documents that in the form described here provide an account, but each of which may be viewed as an individual account, and perhaps even have a slightly different process flow. For example, MS:D-4.4 provides an example of how multiple accounts provided by different parties within an organisation are aggregated by a senior officer, who acts as a communication interface with the accountee (in this case, the regulator); this officer would interact further if needed with the various internal teams that produced the accounts if further information is required.
The element of responsiveness is not necessarily in the account itself, but rather in the interaction over what the account should be about (and how it should be refined if deemed inappropriate) and in the establishment of the objects of the account, i.e. the norms that need to be compared with actual behaviour (compliance). Part of the norms to which actual (system) behaviour is compared should be defined in a two-way communication (dialogue) between cloud providers and external stakeholders, including cloud users, regulators and the public at large.
The process of generating and verifying accounts for certification could be more specialised than the flow shown in Figure 14 (for example, it could involve assessment by multiple parties) and would need to be adapted, as the purpose of verifying the account and the possible outcomes would differ, i.e. resulting in a certain level of certification, or in no certification being given.
The flow shown in Figure 14 is a generic flow that could apply in a range of contexts and is not cloud-specific. With regard to cloud contexts, as with other service delivery contexts involving a chain of providers, provision of an account might involve chaining of accounts. For example, an account provided to a data protection authority by an organisation using the cloud in the capacity of a data controller might be constructed using accounts that had previously been provided to it by the cloud service providers that it was using.
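The chaining of accounts along a provider chain can be illustrated with a simple recursive structure: the controller's account to the DPA aggregates its own evidence with the accounts it previously received from providers further down the chain. The structure and all names below are illustrative assumptions, not a defined A4Cloud format.

```python
# Illustrative sketch of account chaining; structure and names hypothetical.
def build_chained_account(own_evidence, provider_accounts):
    """Compose an actor's account from its own evidence plus the accounts
    supplied by each provider in its service chain."""
    return {
        "own_evidence": own_evidence,
        "provider_accounts": provider_accounts,  # accounts received earlier
    }

# A SaaS provider's account incorporates its IaaS provider's account...
saas_account = build_chained_account(
    own_evidence={"policy": "SaaS privacy policy"},
    provider_accounts=[build_chained_account(
        own_evidence={"policy": "IaaS security report"},
        provider_accounts=[],
    )],
)

# ...and the data controller's DPA-facing account incorporates the SaaS one,
# so it transitively contains evidence from the whole chain.
controller_account = build_chained_account(
    own_evidence={"policy": "controller's privacy programme"},
    provider_accounts=[saas_account],
)
print(len(controller_account["provider_accounts"]))  # prints 1
```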
Figure 14: High level view of the provision and verification of an account.