Responsible technology

Does your organisation take a responsible approach to the development and ownership of its technology-based products and services?

Question collaborator: Doteveryone

EXCELLENT Answers

No EXCELLENT answers have been published for this question.

GOOD Answers

No GOOD answers have been published for this question.

OKAY Answers

No OKAY answers have been published for this question.

POOR Answers

No POOR answers have been published for this question.

Every new technology has the potential for unintended consequences that impact our communities and institutions. Whilst technology has improved people’s lives in myriad ways, it is also responsible for negative impacts on individuals, communities, society and the environment. Companies ought to consider their role by determining the wider implications of their technology, and to earn consumers’ confidence that they have done so.

Responsible technology considers the social impact of technology and seeks to understand and minimise its potential unintended consequences. To create or support responsible technology, companies ought to:

  • Look beyond the individual user and take into account technology's potential impact and consequences on society as a whole
  • Share how value is created in a transparent and understandable way
  • Ensure best practice in technology that accounts for real, messy humans

Foundations and Infrastructure

Too many internet services are currently insecure and/or unreliable, as well as poorly architected, designed and maintained. As the digital world starts to affect more of our lives, poor internet services become increasingly problematic. Insecure systems and technologies don’t just affect their users — they affect others too. For example, hacked webcams could form a botnet, or children’s data could be leaked from Internet-connected toys. Firms need to ensure their technology is safe and secure with good practices surrounding its use.

Technology can be frustrating: it can change frequently, demand to be upgraded, or stop working entirely. This state of affairs leads to greater insecurity, increased obsolescence, and disproportionate effects on poorer communities and individuals. The rapid pace of development compounds this: investment in new things doesn’t last long. More importantly, though, we don’t have incentive models that resource digital infrastructure and the maintenance of useful tools and systems. Huge services rely on tiny open source projects, maintained by handfuls of volunteer developers; everyone wants the benefits of well-maintained desktop software, but few are willing to pay for it. It is therefore necessary to find ways to ensure our digital infrastructure is maintained and supported.

These concerns are especially pressing in (although not limited to) the use of artificial intelligence (the focus of a separate R100 scorecard) and algorithms. Algorithms, or automated processes, are broadly used to inform and to make decisions in all kinds of activity, including by the public sector, companies and educational institutions. This isn’t particularly new. However, the amount of data available today is much greater, as entire firms have been established to collect it. Further, algorithms are much more powerful, and machine learning systems are increasingly complex and opaque, which gives these technologies greater impact: for example, through predictive policing or autonomous vehicles. Responsible companies will seek to ensure algorithms and AI are accountable and impartial, whilst acknowledging the limitations of what they provide and making them easier for users to understand.

Consequences

Internet technologies operate within old systems that disenfranchise already vulnerable and marginalised people. A £1,000 smartphone will almost certainly have stronger encryption than a £50 smartphone; visions of autonomous car futures rarely include or account for people who can only afford older, second-hand cars. When things go wrong, well-off people are more likely to secure refunds and compensation. Responsible companies should seek to prevent safety, security and consumer protection from becoming luxury services.

Additionally, the Internet offers the potential for information to be freely available to all, and for everyone to create and share information, too. This creates incredible opportunities for learning, fulfilment, new ideas, innovation, and art. However, much information still isn’t available online; academic papers are locked behind paywalls, and government and public data isn’t available in many places. Other information is hard to find because search tools are weak or biased, because there’s much misinformation (accidental or deliberate), or because of censorship, manipulation, social bubbles and more. This makes it harder to access trustworthy, accurate information.

The ad- and data-fuelled Internet can be invasive, attention-seeking, manipulative, and pervasive. The way information is presented also often isn’t accessible to everyone; apps require new smartphones, websites aren’t very accessible, servers get turned off and their information lost. Responsible approaches to technology seek to make the most of the Internet’s potential while still making sure the right data gets to the right people at the right times.

Further, ever more interconnected systems are changing the value of data about us, and making that data harder to understand and control. The downsides of information sharing (when information reaches people whom you would rather not have it) are often hard to perceive, and vary depending on who you are, your situation, the data in question, and how it’s being used by others. The long-term impact of information being available, legally or otherwise, is very hard to assess. Responsible technology companies should seek to ensure appropriate levels and types of control, with protections and redress where needed, balanced against harnessing the reward and collective value of such data.

Overcoming these challenges requires firms to commit to develop and use responsible technology. This means innovation should consider people and planet. Responsible technologies recognise and respect everyone's dignity and rights, give people confidence and trust in their use and should never knowingly create or deepen existing inequalities.

Algorithms

A process or set of rules to be followed in calculations or other problem-solving operations, especially by a computer. A well-known example is Facebook, where an algorithm generates each user’s news feed.
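
The definition above can be illustrated with a minimal sketch in Python: a hypothetical feed-ranking algorithm that orders posts by a simple score. The scoring rule (likes plus a recency bonus) is invented purely for illustration; real feed-ranking systems are vastly more complex and opaque.

```python
# Toy illustration of an algorithm: a fixed set of rules that orders posts.
# The weights here are invented for illustration, not drawn from any real platform.

def rank_posts(posts, now):
    """Return posts ordered by score: likes plus a bonus for recency."""
    def score(post):
        hours_old = (now - post["posted_at"]) / 3600
        recency_bonus = max(0, 24 - hours_old)  # newer posts score higher
        return post["likes"] + recency_bonus
    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": 1, "likes": 5,  "posted_at": 0},       # a day old at now=86400
    {"id": 2, "likes": 50, "posted_at": 0},
    {"id": 3, "likes": 2,  "posted_at": 82800},   # one hour old at now=86400
]
ranked = rank_posts(posts, now=86400)
print([p["id"] for p in ranked])  # → [2, 3, 1]
```

Even this toy embodies editorial choices (how much recency matters versus popularity), which is precisely why the accountability concerns discussed earlier arise.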

Artificial Intelligence (AI)

'AI' is ‘the designing and building of intelligent agents that receive percepts from the environment and take actions that affect that environment’. The most critical difference between AI and general-purpose software is in the phrase “take action”. AI enables machines to respond on their own to signals from the world at large, signals that programmers do not directly control and therefore can’t anticipate.

Big Data

Extremely large data sets that may be analysed computationally to reveal patterns, trends, and associations, especially relating to human behaviour and interactions.

Dominant Network Platforms

A small number of platforms are hugely dominant in their areas, like Google, Facebook, Apple, and Amazon. They wield incredible power in the markets they choose to be active in (including in buying potential competing companies, in recruitment, and in R&D). Due to the Internet’s network effects, big platforms may be inevitable — but their governance and accountability are far from ideal today.

Encryption

'Encryption' is the process of taking an unencrypted message (plaintext), applying a mathematical function to it (encryption algorithm with a key) and producing an encrypted message (ciphertext).
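
The plaintext-to-ciphertext process in that definition can be sketched with a deliberately simple repeating-key XOR cipher. This is a teaching toy only: real systems must use vetted algorithms such as AES, never a hand-rolled cipher like this one.

```python
# Toy sketch of encryption: plaintext + key -> ciphertext, and back again.
# XOR is self-inverse, so the same function both encrypts and decrypts.
# NOT secure - for illustration of the concept only.
from itertools import cycle

def xor_cipher(data: bytes, key: bytes) -> bytes:
    """XOR each byte of data with the repeating bytes of key."""
    return bytes(b ^ k for b, k in zip(data, cycle(key)))

plaintext = b"attack at dawn"
key = b"secret"
ciphertext = xor_cipher(plaintext, key)          # encryption
assert ciphertext != plaintext
assert xor_cipher(ciphertext, key) == plaintext  # decryption round-trips
```

The essential idea carries over to real cryptography: without the key, the ciphertext should reveal nothing useful about the plaintext.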

Internet of Things (IoT)

The interconnection via the Internet of computing devices embedded in everyday objects, enabling them to send and receive data.

Machine Learning

A type of artificial intelligence (AI) that allows software applications to become more accurate in predicting outcomes without being explicitly programmed. 'Machine learning' algorithms are often categorised as being supervised or unsupervised.
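
A minimal sketch of the supervised case: the program is given labelled examples (inputs with known outcomes) and learns parameters that let it predict outcomes for unseen inputs, without those parameters being explicitly programmed. The data below is invented for illustration; the method is ordinary least-squares line fitting.

```python
# Minimal supervised learning: fit a straight line y = slope*x + intercept
# to labelled examples by ordinary least squares, then predict new outcomes.

def fit_line(xs, ys):
    """Return (slope, intercept) minimising squared prediction error."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

# Training data follows y = 2x + 1; the model recovers that relationship.
xs, ys = [0, 1, 2, 3], [1, 3, 5, 7]
slope, intercept = fit_line(xs, ys)
print(slope, intercept)        # → 2.0 1.0 (learned parameters)
print(slope * 10 + intercept)  # → 21.0 (prediction for unseen x = 10)
```

Real machine learning systems fit far more parameters to far messier data, which is where the opacity concerns raised earlier in this scorecard come from.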

Personal Data

Data is the lowest level of abstraction from which information and knowledge are derived. 'Personal data', then, is data specific to an individual or group. Data is collected and analysed to create information, while knowledge is derived from extensive experience of dealing with information on a subject.

Privacy by Design

'Privacy by Design' is an approach to systems engineering which takes privacy into account throughout the whole engineering process.

Answering YES

All Businesses MUST

Describe how you identify, anticipate, measure and account for the potential impacts your technology could have on society as a whole and the institutions, communities and relationships that are a part of it

Outline the critical interactions and value relationships within the technology system and how you ensure they are at a minimum transparent and understandable

Outline your approach to inclusivity and security in the design, maintenance, and support of your technology throughout its existence

All Businesses MAY

Explain how you engage with those who might offer criticism or an outside evaluation of the impact of your technology

Explain the trade-offs and decisions you make that may impact communities or users

Explain the ownership and control models you have in place in order to maintain a high level of public understandability of your business, its principal business model(s), and your products and services

Explain how people with different needs and abilities are accounted for in the design and support structures for your technology

Explain how you anticipate and adapt to new needs and threats as they emerge

Answering NO

All Businesses MUST

Explain why you do not or cannot answer YES to this question and list the business reasons, any mitigating circumstances or any other reasons that apply

Mention any future intentions regarding this issue

All Businesses MAY

List any practices that are relevant, but not sufficient to answer YES

DON'T KNOW is not a permissible answer to this question

NOT APPLICABLE is not a permissible answer to this question

Version 1

To receive a score of 'Excellent'

Developing and using technology responsibly is fundamental to the business OR is a critical strategic issue for the business, and the right support, funding and oversight goes into addressing the core concepts of responsible tech.

Examples of policies and practices which may support an EXCELLENT statement (not all must be observed, enough should be evidenced to give comfort that the statement is the best of the four for the business being scored):

Policy

  1. The company has an ambitious, rigorous policy in place in order to develop or use technology responsibly
  2. Responsible use and development performance is monitored, measured and reported publicly, clearly and effectively
  3. Performance matches claims, purported values, and public statements of the company
  4. The organisation's policies are continuously reviewed and evolve to reflect new thinking on responsible technology
  5. Awareness and compliance with using and developing technology responsibly is widespread among staff, and there is a commitment to the issue from leadership
  6. Commitment to the issue is integrated across the firm’s operations
  7. The firm goes above and beyond the standards of various recognised frameworks and legislation
  8. The firm champions their staff to be aware of and implement industry best practices in relation to technology
  9. The firm lobbies and campaigns for others to follow suit, or for practice/legislation to be improved and actively engages with governments and outside experts to determine standards and address gaps or concerns
  10. There is a formal and robust stakeholder engagement process around the risks and trade-offs of their technology; firm takes into account stakeholder feedback in decision-making
  11. Firm encourages open internal conversation on the social impact of their technology products or services and has clear codes of conduct in place that reflect the values of responsible technology

Open, Impartial information and responsible use of personal data

  1. Firm puts in place appropriate levels and types of control over personal data and its use, providing protections and redress where needed, appropriately balanced with utilising the collective value of such data.
  2. Information and knowledge (for example search results and news) is presented to users in a consistent, unbiased way (seeking to avoid divisiveness or manipulation for commercial or political purposes)
  3. The firm seeks to make sure information, knowledge and content is openly accessible and freely available online
  4. Firm openly publishes the various exchanges of value the technology system has in an understandable way; offers the user a fair exchange and agency in their decision to use the technology

Safety and Reliability

  1. Functional, reliable and safe service or technology is provided to all users, not only those who can pay a premium or afford new hardware (e.g. the latest smartphone)
  2. Firm seeks to promote, support or maintain digital infrastructure
  3. Architecture, design and maintenance ensures reliable, equitable provision of services
  4. Technology is secure from surveillance, fraud and manipulation
  5. The firm considers the social impact of new technology and seeks to prevent it entrenching inequality
  6. The firm offers education to all of its staff on safety and security issues and risks as appropriate to their roles and empowers its on-the-ground staff to be adaptable to threats and issues as they arise

Responsible AI

  1. Firm seeks to retain control of its algorithms and machine learning, and to maintain transparency about how they function
  2. New AI is deployed with caution

To receive a score of 'Good'

The business demonstrates a clear commitment to responsible technology and has many good practices in place.

Examples of policies and practices which may support a GOOD statement (not all must be observed, enough should be evidenced to give comfort that the statement is the best of the four for the business being scored):

  1. The firm has strong policies in place, and can usually be held accountable for them
  2. Performance generally lives up to policies, but occasionally falls short
  3. Performance is regularly monitored and measured against targets, and can be reported publicly
  4. The commitment of the firm is demonstrated across most of its activities, products/services and policies
  5. There is a commitment from the firm to improve upon its performance in this area and a clear roadmap to achieving this
  6. The firm does not necessarily prioritise Responsible Technology, and in some cases, it allows the problem to persist to further some other aim (e.g. prioritises gaining users over transparency)
  7. The firm is actively responsive to feedback and public-facing debate, supporting, if not contributing to, progressive dialogue
  8. Firm makes a demonstrable effort to disseminate and enforce these best practice policies across its workforce and stakeholders
  9. The firm generally carries out policies and best practices, but is not a leader in its sector
  10. Firm meets the standards of various recognised frameworks and legislation, and may exceed them
  11. Where negative impacts are demonstrably unavoidable, various practices and policies are in place to mitigate these effects
  12. The firm may be signed up to lobbying organisations active on issues around responsible technology, but it follows other firms, rather than influencing them
  13. Firm adopts good practices on issues that are highly material to its operations. It also supports issues of low materiality
  14. Firm often engages in dialogue with stakeholders. There is not yet a formal structure for engaging with stakeholders but there is a clear plan of action for this
  15. Firm is aware of how its tech is being used and has plans or policies in place to identify unintended uses or users
  16. Firm is transparent about its business model, but does not share it prominently or in an easily discoverable or understandable way
  17. Firm has strong support structures in place for its users, but they must have their concerns escalated several times to have them addressed

To receive a score of 'Okay'

The business supports efforts regarding responsible tech on an ad hoc basis OR has some responsible tech policies and practices although they are not a priority for the business OR responsible tech is not relevant to the business.

Examples of policies and practices which may support an OKAY statement (not all must be observed, enough should be evidenced to give comfort that the statement is the best of the four for the business being scored):

  1. The importance of the issue is recognised and there is some effort to comply on an ad hoc basis, but measurement methods and monitoring need developing and require a more consistent application throughout the entire organisation
  2. Firm has adopted policies which seek to address the problems and social harm which might arise from its technology, but those policies lack clear enforcement and/or monitoring
  3. The adopted policies are stated in general, indefinite terms that cannot be measured
  4. Firm cherry-picks which aspects of its responsibilities it should report
  5. Firm recognises where its reporting system is weak and can demonstrate future strategies to improve it
  6. Provides satisfactory explanation that issue is not relevant or applicable to the business
  7. The firm commits to disseminating and enforcing best practice policies across its workforce and stakeholders. However, follow-through may be inconsistent or ineffectual
  8. Firm complies with legal and regulatory requirements
  9. Engages in some dialogue on the issue and displays a willingness to learn from better / best practices
  10. The firm engages in lobbying or campaigning on an ad hoc basis, or devotes minimal effort, or undertakes this action in response to pressure from the public or stakeholders

To receive a score of 'Poor'

The business acknowledges performance below expectations, or there is no evidence of Responsible Tech policies or practices.

Examples of policies and practices which may support a POOR statement (not all must be observed, enough should be evidenced to give comfort that the statement is the best of the four for the business being scored):

  1. The firm has adopted some practices that seek to address responsible technology but these practices are very poor, ineffectual or fail to be monitored or measured
  2. There are inadequate or no channels for the firm to be held accountable, either because the issue is not measured or reporting is not made publicly available
  3. The firm fails to meet minimum standards or regulations
  4. The policies and practices the firm pursues perpetuate social or environmental harm
  5. The company actively undermines its critics, lobbies the government or is engaged in PR activity to deny there is a problem around irresponsible technology.
  6. Firm does not engage with stakeholders nor is it responsive to their concerns about responsible technology.