Data responsibility

Does your organisation collect, process and use data responsibly?

Technological advances have increased the quantities of data that can be collected, processed and stored, and the speed at which this happens. Data is now an incredibly valuable asset for organisations, with a wide range of applications. However, with this come ethical considerations that go beyond simply protecting the privacy of those whose data is being collected.

Consumers are increasingly concerned about the extent to which companies can collect data on them without their knowledge, how this data is used, and the insights that can be gleaned from seemingly innocuous information. More and more voices are challenging the amount of control companies can wield over individuals’ lives as a result of the data they have gathered.

Businesses that process and store data face considerable fines for breaches or lapses in privacy protection, and the damage to reputation and loss of stakeholder trust and confidence may be greater still. The financial implications of getting things wrong are therefore significant.

At the same time, good data used well has enormous potential for positive impact on people and planet. More and more businesses are looking beyond mere risk and liability minimisation to explore how their data activities might contribute to the public good.

The Responsible Data Forum defines handling data ‘responsibly’ as: “The duty to ensure people’s right to consent, privacy, security and ownership around the information processes of collection, analysis, storage, presentation and reuse of data, while respecting the values of transparency and openness.” Responsible data handling aims to do no harm and to use the outcomes of data processing to create products and services that benefit individuals and society at large.

Using customers’ data can help businesses to target advertising, tailor services and develop new products, all of which benefit the bottom line. Consumers can also benefit through improved services, new products that meet particular needs, or personalised support, such as recommendation algorithms and financial planning assistance. However, collecting and processing data has an unavoidable impact beyond the organisation.

Increasing reliance on data can also exacerbate digital inequality. Digital illiteracy or lack of access to technology can mean that some groups cannot share data, cannot control what data is being shared, do not benefit from its results and do not understand their rights over the information collected about them. For instance, those on low incomes may not be able to afford smartphones or important security updates, leaving them more exposed to financial crime and less able to access online services. Companies have also been criticised for collecting data on consumers and using it to develop products and services without compensating them. Data is valuable, and the ‘if it's free you’re the product’ model raises concerns about who should benefit from an individual’s data.

Similarly, the insights gained from data analysis can entrench divisions. Criticism has been levelled at businesses that seek to make their products addictive in order to monetise users’ time, attention and networks, or that advertise products people do not need or that may harm them, such as debt products.

The gathering and storing of data on individuals and groups may also facilitate human rights abuses, ranging from aiding government surveillance and enabling internment to facilitating discrimination against ethnic groups or marginalised communities, or even directly contributing to genocide. While these effects may be unintended from a business’s point of view, businesses should be aware of the possible ramifications of collecting certain data, who they share it with and under what circumstances. Companies have already been criticised for illegally sharing data with governments.

Taking an ethical approach to data starts before any data has been collected, and progresses throughout the project or product design and implementation process. Issues to be considered include:

  • Data minimisation. This avoids the collection of data for collection’s sake. By carefully identifying what data is needed, an organisation can avoid gathering, and taking on responsibility for, unnecessary data. This also minimises the risks of data loss and data theft, and the environmental impact of data storage.
  • Privacy by design. Building privacy measures in at the beginning of a project is easier and more effective than tacking them on at the end. For this to succeed, privacy is something key individuals across the organisation need to understand fully, not just the IT department or other internal silos.
  • Cautious approach. An organisation taking a cautious approach will seek to understand fully the risks and possible repercussions of collecting data and, ultimately, if these cannot be fully grasped, this data may not be gathered. This may mean that some functions cannot be carried out, but ultimately protects data subjects from unforeseen negative effects. This can apply at any stage of the data process.
  • Informed consent. For consent to be truly informed, it must meet several criteria. Firstly, the consumer must be genuinely informed, so information must be available and clear. Consent applies not only to the data being gathered, but also to how it will be used. Pre-ticked boxes and opt-outs are not sufficient for gaining truly informed consent. Additionally, refusing consent cannot result in loss of services, so alternatives must be available, including non-digital options. The all-or-nothing options that currently exist, such as those consumers must ‘agree’ to in order to use social media or apps, do not meet these criteria.
  • Data security. An organisation must take adequate measures to ensure the privacy of data subjects. This may include protecting hardware as well as databases, and implementing measures such as strong passwords or clear-desk policies to prevent human error. Safeguards and measures such as encryption can help to avoid the worst if errors or breaches do happen. Securing data properly is not just about protecting individuals’ privacy, but may also be necessary to protect groups of people, especially if data on these groups could cause them harm or disadvantage, such as communities at risk of repression, violence or exploitation.
  • Sensitive data. It is especially important that companies protect sensitive data, which includes, but is not limited to, medical records, personal financial data and personal communications. Companies should be aware that what counts as ‘sensitive’ data varies from person to person and can change with circumstances.
  • Combining datasets. Linking or combining datasets can lead to greater insights and understanding, but the increase in the number of data points can threaten individuals’ anonymity. This is known as the Mosaic Effect (a minimal illustration follows this list). Additionally, initial consent may not extend to this new use, particularly if anonymity is compromised, so if data is going to be used in this way, this should be made clear to data subjects.
  • Processing. Any processing of data involves human decision-making, and therefore risks biases and errors, which can then be magnified and reproduced. Involving diverse groups of people at this stage, such as consulting the groups whose data is being used, can go some way towards mitigating this.
  • Value chain. Data security is something that can be compromised at any stage of the value chain, whether this be buying or selling data itself or the tools used to collect, store and analyse it.
  • The benefits and risks of sharing data. Data has the potential to provide valuable insights and solutions for any number of challenges, yet keeping these internal to the organisation that has analysed the data limits their impacts. Sharing information between institutions, such as tech companies and governments, can improve public services and increase efficiencies, or publicly owned data can assist organisations in tackling and understanding issues concerning public health, infrastructure or behaviour change. However, this sharing also comes with risks. Citizens are sometimes not aware that public bodies are sharing their data, and this data can be used simply to serve the profits of companies. Alternatively, there are civil and human rights concerns with companies sharing data with governments, repressive or otherwise.
  • Ownership. The current model assumes that data is owned by those who collect it, including the insights gained from it. An ethical approach to data recognises that individuals should have ownership over data about themselves, and therefore a say in what happens to it. Insights from aggregate data can also be shared more widely or made public to serve the common good, provided adequate protection is in place for the individuals involved.
  • Accountability. Organisations can use the data gathered on subjects to categorise and label them. This information is rarely available to the consumers themselves, which means they cannot challenge or correct the process. Responsible data instead promotes transparency and accountability in how data is used, and allows data subjects more control over this process.
  • Secure disposal/archiving. Once data is no longer in active use, or is to be deleted, care must be taken to ensure this happens effectively, and that any archive is protected against breach, error or future changes, such as the company being sold, or future government intervention.
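
Where datasets are linked, the re-identification risk described above can be checked mechanically before the combined data is used or shared. The Python sketch below is a minimal illustration rather than a complete anonymisation tool, and the quasi-identifiers (postcode, age band and gender) are assumptions for the example: it counts how many records share each combination of quasi-identifiers after a join, and flags the result if any combination is held by very few people.

```python
from collections import Counter

# Two hypothetical datasets; neither identifies anyone directly on its own.
health_records = [
    {"postcode": "EC1A", "age_band": "30-39", "gender": "F", "condition": "asthma"},
    {"postcode": "EC1A", "age_band": "30-39", "gender": "M", "condition": "diabetes"},
]
marketing_records = [
    {"postcode": "EC1A", "age_band": "30-39", "gender": "F", "purchases": 14},
    {"postcode": "EC1A", "age_band": "30-39", "gender": "M", "purchases": 3},
]

QUASI_IDENTIFIERS = ("postcode", "age_band", "gender")  # assumed quasi-identifiers

def key(record):
    """The combination of quasi-identifier values for one record."""
    return tuple(record[q] for q in QUASI_IDENTIFIERS)

def combine(left, right):
    """Join two datasets on their shared quasi-identifiers (illustrative only)."""
    lookup = {key(r): r for r in right}
    return [{**r, **lookup.get(key(r), {})} for r in left]

def k_anonymity(records):
    """Smallest number of records sharing any one quasi-identifier combination."""
    counts = Counter(key(r) for r in records)
    return min(counts.values())

combined = combine(health_records, marketing_records)
if k_anonymity(combined) < 5:  # the threshold is a policy choice, not a legal figure
    print("Combining these datasets risks re-identifying individuals (Mosaic Effect)")
```

In this toy example every combination of quasi-identifiers is unique, so the combined dataset would single out individuals even though neither source dataset names anyone.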

While regulations exist regarding data use, these often focus more on minimising corporate liability than on protecting the rights of data subjects, and have been criticised for being insufficient or unable to keep up with technological advances. Organisations going beyond regulatory minimums have clear policies and practices to guide their actions and clear guidance on what is and is not acceptable when it comes to data.

The EU’s GDPR (General Data Protection Regulation), set to come into force on 25 May 2018, is a major development in strengthening regulation, granting further rights to the individual and redressing the power imbalance between data subjects and the companies that hold data on them. The GDPR will affect any organisation that controls or processes personal data relating to individuals in the EU. It will affect how organisations run their day-to-day operations, placing a higher burden of proof of responsibility on those organisations.

Included in the changes are:

  • Mandatory breach reporting (with some exemptions) and fines of up to €20 million, or 4% of annual global turnover, whichever is higher
  • Broader and clearer definitions of personal data, e.g. to include genetic and biometric data
  • The right to be forgotten, which extends throughout an organisation’s supply chain
  • The right to data portability, enabling individuals to share their data with whom they like, and to specify how long it is shared for, e.g. open banking (a minimal export sketch follows this list)
  • A strengthened definition of informed consent.
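
The regulation does not prescribe a file format for data portability, but ‘structured, commonly used and machine-readable’ is generally taken to mean something like CSV or JSON. The sketch below is a minimal illustration of what such an export might look like; the field names and the export_personal_data helper are hypothetical, not drawn from any particular library or from the regulation itself.

```python
import json
from datetime import date

def export_personal_data(subject_record):
    """Return one data subject's records as structured, machine-readable JSON.

    A minimal sketch of GDPR-style data portability: the subject, or a provider
    they nominate (e.g. under open banking), receives the data in a format
    another organisation can import without manual rework.
    """
    export = {
        "exported_on": date.today().isoformat(),
        "subject_id": subject_record["subject_id"],
        "profile": subject_record["profile"],
        "transactions": subject_record["transactions"],
    }
    return json.dumps(export, indent=2)

# Hypothetical record held on one data subject
record = {
    "subject_id": "cust-0042",
    "profile": {"name": "A. Example", "email": "a@example.org"},
    "transactions": [{"date": "2018-03-01", "amount": 12.50}],
}
print(export_personal_data(record))
```
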
Big data

Extremely large data sets that may be analysed computationally to reveal patterns, trends, and associations, especially relating to human behaviour and interactions.

Data minimisation

'Data minimisation' is the principle that data should not be collected, held or further used unless this is essential for purposes that were clearly stated in advance, in order to support data privacy. In the General Data Protection Regulation (GDPR), this is defined as data that is:

  • Adequate
  • Relevant
  • Limited to what is necessary for the purposes for which they are processed.
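
One way to apply this principle at the point of collection is to whitelist the fields each declared purpose actually needs and discard everything else before anything is stored. The sketch below is a minimal illustration; the purposes and field names are assumptions rather than examples taken from the GDPR.

```python
# Fields each declared purpose genuinely needs (hypothetical examples)
PURPOSE_FIELDS = {
    "order_fulfilment": {"name", "delivery_address", "email"},
    "age_verification": {"date_of_birth"},
}

def minimise(submitted, purpose):
    """Keep only the fields necessary for the stated purpose; never store the rest."""
    allowed = PURPOSE_FIELDS.get(purpose, set())
    return {field: value for field, value in submitted.items() if field in allowed}

form_data = {
    "name": "A. Example",
    "delivery_address": "1 High Street",
    "email": "a@example.org",
    "date_of_birth": "1985-06-01",    # not needed to fulfil an order
    "browsing_history": ["..."],      # not needed for any stated purpose
}

stored = minimise(form_data, "order_fulfilment")
# stored now contains only name, delivery_address and email
```
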
Data subject

The 'Data Subject' is a living individual to whom personal data relates.

Digital divide

A 'digital divide' is an economic and social inequality with regard to access to, use of, or impact of information and communication technologies.

Informed consent

'Consent' must be freely given; this means giving people genuine ongoing choice and control over how their data will be used. Consent should be obvious and require a positive action to opt in, and requests must be prominent, unbundled from other terms and conditions, concise and easy to understand, and user-friendly. Data subjects must also understand what they are consenting to, so information should be freely available, easy to access and in plain language.
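
One way to make these requirements operational is to record consent separately for each purpose, default every purpose to 'not granted', and treat withdrawal as a first-class operation. The sketch below is a minimal illustration; the purposes and field names are hypothetical rather than drawn from any particular system.

```python
from datetime import datetime, timezone

class ConsentRecord:
    """Granular, opt-in consent: nothing is pre-ticked and each purpose is unbundled."""

    def __init__(self, subject_id, purposes):
        self.subject_id = subject_id
        # Every purpose starts as not consented; a positive action is required to opt in.
        self.purposes = {p: {"granted": False, "timestamp": None} for p in purposes}

    def grant(self, purpose):
        self.purposes[purpose] = {
            "granted": True,
            "timestamp": datetime.now(timezone.utc).isoformat(),
        }

    def withdraw(self, purpose):
        # Withdrawal should be as easy as granting, and is recorded rather than erased.
        self.purposes[purpose] = {
            "granted": False,
            "timestamp": datetime.now(timezone.utc).isoformat(),
        }

    def allows(self, purpose):
        return self.purposes.get(purpose, {}).get("granted", False)

# Hypothetical purposes, each presented separately rather than bundled into one checkbox
consent = ConsentRecord("cust-0042", ["order_updates", "marketing_email", "analytics"])
consent.grant("order_updates")                        # explicit positive action
assert consent.allows("marketing_email") is False     # inactivity never counts as consent
```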

GDPR

The 'General Data Protection Regulation' (GDPR) (Regulation (EU) 2016/679) is a regulation by which the European Parliament, the Council of the European Union and the European Commission intend to strengthen and unify data protection for all individuals within the European Union (EU). It also addresses the export of personal data outside the EU. The GDPR aims primarily to give control back to citizens and residents over their personal data and to simplify the regulatory environment for international business by unifying the regulation within the EU.

Mosaic effect

When the combination of different pieces of information on an individual can identify them, even if no piece specifically identifies them alone. This effect becomes stronger the more data points there are to combine.

Internet of things

The interconnection via the Internet of computing devices embedded in everyday objects, enabling them to send and receive data.

Privacy by design

'Privacy by Design' is an approach to systems engineering which takes privacy into account throughout the whole engineering process.

Responsible Data

Responsible Data Forum defines 'responsible data' as: “The duty to ensure people’s right to consent, privacy, security and ownership around the information processes of collection, analysis, storage, presentation and reuse of data, while respecting the values of transparency and openness.”

Answering YES

All Businesses MUST

Outline how data is used within their organisation

Describe the tools and processes that are in place to collect, store and process data

Outline their approach to responsible data use and how this is carried out in practice at all stages of data processing

Describe the risk analysis process regarding data, and what steps are taken to mitigate these risks

All Businesses MAY

State any philosophies and beliefs they hold relating to data collection and use

State how the rights of data subjects are protected and promoted

Describe any partnerships they have with other organisations to collect and use data

Outline how the data they process and its results can serve the public good

Describe who is responsible for any aspects of data responsibility within the organisation

Describe how data responsibility is considered in partnership with other organisations, including supply chain

Confirm that responsible data is considered at all levels of the organisation, including board and leadership

Confirm that staff throughout the organisation understand the importance of data responsibility to their role, and describe training if applicable

Outline how staff are engaged in data responsibility issues

Outline how GDPR and other regulations are being met

Describe if and how they influence others to practise data responsibility

Answering NO

All Businesses MUST

Describe if and how they influence others to practise data responsibility

All Businesses MAY

Describe any efforts regarding data responsibility that do exist, even though all the requirements to answer YES to this question are not met

Mention any future intentions regarding this issue

DON'T KNOW is not a permissible answer to this question

NOT APPLICABLE is not a permissible answer to this question

Version 1

To receive a score of 'Excellent'

Data responsibility is fundamental to the business and rigorous policies and practices are in place. Best practice on data responsibility is of strategic importance.

Examples of policies and practices which may support an EXCELLENT statement (not all must be observed, enough should be evidenced to give comfort that the statement is the best of the four for the business being scored):

  1. Is innovating and leading the industry response to GDPR and how data can be used for the good of the user and society
  2. Publicly commits to data responsibility
  3. The business ensures that the organisations it sources technologies and algorithms from are reputable and transparent
  4. Potential ethical issues are carefully considered at all stages of data handling
  5. Company has stringent security measures in place regarding use and access to sensitive data
  6. The company guarantees the protection of privacy of data subjects through strict rules
  7. Ethical data considerations are extended to employee data as well as customer data
  8. Data subjects have easy access to any data held on them, including employees, and data is in an accessible format
  9. Data subjects, including employees, can correct any errors and contest categorisations or how data has been used
  10. Efforts are made to collect only what data is deemed necessary
  11. Regular screening of data to securely delete any non-essential data collected
  12. Care is taken when combining or linking data sets to ensure anonymity of data subjects
  13. Risks and benefits are considered when collecting and processing data, and when risks are deemed too high, the action is not taken
  14. Environmental impacts are considered when making decisions about data storage and processing
  15. Transparency does not compromise privacy of data subjects
  16. Data is securely disposed of when no longer needed, including along supply chain, in archives or backups
  17. Archives of data are secured and protected
  18. Consent must be actively given, i.e. passive methods of getting consent such as pre-ticked boxes are not used
  19. Privacy agreements and Terms & Conditions are easily accessed, concise and written in clear language
  20. Users are informed about what data will be collected and how it will be used
  21. Efforts are made to renew consent if use of data changes significantly from what was originally agreed
  22. Viable alternatives are offered to users who are not willing to consent to data collection, e.g. analogue access of services
  23. Consideration is given to the ability of users to consent, especially vulnerable groups
  24. Collection and use of data does not contribute to digital divide, inequalities or exclusion
  25. Efforts made to increase digital inclusion and digital literacy through education or outreach programmes, etc
  26. Insights gained and solutions reached as a result of data processing are used to tackle social problems, such as public health, infrastructure, emergency response, etc.
  27. Data subjects maintain control over their own data and can choose what data is collected and how it is used
  28. Data subjects have access to insights gained from aggregate data, not just results from their own data
  29. Privacy considerations and measures are extended to groups as well as individuals
  30. Practices ‘Privacy by Design’
  31. Efforts are made to ensure and maintain accuracy of data throughout processing and collection
  32. Recognition that processing is subject to human input and efforts made to reduce biases and protect against human error
  33. Has clear 'right to be forgotten' policies in place, and requests can be easily carried out
  34. Ensures individuals can easily transport and share the data held on them with other organisations in a secure and accessible way
  35. Puts the requirements of the individual data subject at the centre of their data policies, and ensures the dignity, privacy and rights of individuals are protected
  36. Use of data aims to serve the needs of data subjects as well as the organisation
  37. Data subjects are engaged in data collection, including identifying which data is considered sensitive
  38. Effective measures are taken to protect and secure data, including encryption, safeguards against human error etc
  39. Engages stakeholders to best understand risks and repercussions of data collection and processing
  40. Does not sell or share data without explicit permission from data subjects
  41. Clearly identifies individuals and teams responsible for different aspects of data within the organisation
  42. Ethical data issues are clearly communicated to staff, and staff are engaged in the issues
  43. Training is provided to staff on ethical data issues
  44. Organisation’s policies are continuously reviewed and evolve to reflect new standards
  45. Commitment to data responsibility is integrated across the organisation’s operations
  46. The organisation lobbies and campaigns for others to follow suit, or for practice/legislation to be improved
  47. There is a clear chain of responsibility on data, with a named person on the board holding ultimate responsibility.
To receive a score of 'Good'

The business demonstrates a clear commitment to data responsibility and has numerous best practices in place

Examples of policies and practices which may support a GOOD statement (not all must be observed, enough should be evidenced to give comfort that the statement is the best of the four for the business being scored):

  1. The board understands data in the context of its business, and takes some responsibility
  2. The company has a large degree of oversight of the chains of ownership of its algorithms, but does not have full control
  3. The company makes efforts to ensure their online facilities (apps, websites) are secure and built by ethical developers
  4. Practices data minimisation
  5. Implements thorough risk analysis on data use policies and practices
  6. Engages stakeholders regarding use of data
  7. Aims to minimise inaccuracies and biases in data and data processing
  8. Employees are made aware of risks to the company and client in case of data breach by employees or third parties
  9. Company trains employees on responsible data issues
  10. Complies with GDPR and has trained staff about its implications on the company and service users
  11. Data subjects have an opt out option when it comes to sharing certain personal data
  12. Recognises that regulation is a minimum and goes beyond legal requirements
  13. Customers can access data held on them, but it is not necessarily easily accessible or manageable
  14. Informed consent practices are in place and data subjects have control over what data is collected
  15. Data subjects can withdraw consent
  16. Has alternative services and products for those who do not wish to consent to data collection
  17. Ethical issues that may arise due to use of data are considered at several stages of projects/product life cycles
  18. Data is protected to an industry standard level, and is eliminated if not needed
  19. The company protects data privacy well, and has failsafes in place for worst case scenarios
  20. The company is open about the data it collects and how it is used
  21. The organisation has strong policies in place regarding several aspects of data responsibility, e.g. collection, storage
  22. Performance is regularly monitored and measured against targets, and can be reported publicly
  23. The organisation supports efforts to improve standards and practices across industry
  24. Uses insights from data analysis to further the public good and to benefit data subjects
  25. Supports efforts to improve industry standard practice
To receive a score of 'Okay'

The business supports efforts regarding data responsibility on an ad hoc basis OR has some relevant policies and practices but the issue is not a priority for the business OR the issue is not relevant to the business

Examples of policies and practices which may support an OKAY statement (not all must be observed, enough should be evidenced to give comfort that the statement is the best of the four for the business being scored):

  1. Data responsibility efforts are ad hoc, with little systematic implementation
  2. Monitoring or review of policies and practices is ad hoc
  3. At senior levels there is an understanding of the need for data privacy
  4. Some efforts are made to have oversight of chains of ownership of data
  5. Company complies with the GDPR and other relevant legislation
  6. Company complies with industry standards on data security
  7. Data considerations are targeted at minimising risk
  8. The company is willing to inform data subjects about data held about them, but only if they actively seek out the information
  9. Complies with freedom of information (FOI) requests
  10. Guarantees that no data is passed on to third parties without the user’s express consent
  11. Implementation of insights from data fails to reduce inequalities, the digital divide or similar issues
  12. Responsible data issues do not take precedence over other business considerations
  13. Little effort is put in to promoting the public good
  14. Work on responsible data use is concentrated in certain teams or individuals, and is not integrated into the full process
To receive a score of 'Poor'

The business acknowledges performance below expectations, whether through conscious decisions, negligence, or inadequate implementation or execution of efforts to attain acceptable performance standards, OR the policies and practices the business has adopted are inadequate or otherwise not fit for purpose

Examples of policies and practices which may support a POOR statement (not all must be observed, enough should be evidenced to give comfort that the statement is the best of the four for the business being scored):

  1. Chains of ownership of data put the company’s stability, reputation and strategy at risk
  2. There is no clear position of responsibility for data within the company
  3. The company has no oversight or control of the data inputted to the algorithms used and their impacts and insights
  4. Company does not heed warnings on data security, or rectify problems
  5. Engages in risky data brokering activities, such as failing to fully protect anonymity, or selling individuals’ data without consent
  6. Data is collected and stored on customers without their consent
  7. Stakeholders have no access to their personal data
  8. The company prioritises their use of collected data over worker and user rights and interests
  9. Company does not train workers on what they can and cannot do with client/user data and there is little oversight
  10. The organisation fails to meet minimum standards or regulations
  11. The company actively engages in influencing to reduce regulation on data use
  12. Organisation does not engage with stakeholders nor is it responsive to their concerns
  13. Organisation collects data indiscriminately without regard for how it will be used or to its relevance
  14. No regard paid to Mosaic Effect or how combining data sets could expose data subjects and compromise anonymity
  15. Has faced criticism over data collection and processing practices from consumer groups, charities, governments etc
  16. No or inadequate policies and practices on data disposal or archiving
  17. No or inadequate consideration given to future risks regarding data held
  18. Privacy agreements or Terms & Conditions are overly long and/or technical, rendering them incomprehensible or making it unrealistic to expect consumers to read or understand them
  19. Constantly changes privacy agreements or terms & conditions without notifying consumers
  20. Uses passive consent tactics
  21. Fails to provide alternative services or products to those unwilling to consent to privacy agreements or Terms & Conditions
  22. Uses 'all-or-nothing' privacy agreements or Terms & Conditions; policies and practices on data contribute to the digital divide, e.g. services are only offered to the digitally literate, data on vulnerable individuals is collected while the better-off can opt out at a cost, or information gathered is used to target potentially harmful products such as debt at those on low incomes
  23. Information is only gathered on certain groups, with the result that the challenges faced by marginalised communities are not addressed, e.g. data gathered from smartphones excludes those who do not or cannot use them
  24. Consumers cannot control what data is collected or how it is used when using a product or service, and have no right of ownership
  25. Data subjects have no access to information held on them, or it is difficult/convoluted to access it
  26. Privacy policies are inadequate
  27. Little effort made to maintain accuracy of data
  28. Little or no consideration given to potential biases in processing of data
  29. Data processing results in biased or inaccurate findings
  30. No input from stakeholders on how data is stored, processed or gathered
  31. Vulnerability of data subjects is increased through collection of certain types of data
  32. ‘Right to be forgotten' is inadequately enforced
  33. Consumers’ data is in a format that is not easily transported or shared with other organisations
  34. The organisation’s data practices harm the public good, e.g. contributing to the digital divide or to inequalities
  35. The organisation’s data practices violate civil or human rights, e.g. data is shared with governments without due process, or publication of data puts individuals or groups at risk
  36. Little or ineffective protections in place to prevent data breach
  37. Fails to share data with public institutions that could improve services or products
  38. Uses data held by public institutions for commercial or exploitative purposes only
  39. Seeks to maximise the amount of data shared, using questionable methods, e.g. actively making products addictive in order to capitalise on consumers’ attention or time, with little regard for health, wellbeing or financial ramifications
  40. No clear ownership of the issue within the organisation