The NASUWT has set out 12 principles for the ethical development and application of artificial intelligence and digital technologies in education.
Introduction
NASUWT principles
- A public good and human right
- Legal and regulatory frameworks
- Legal and regulatory frameworks: data protection and privacy
- Legal and regulatory frameworks: equality, equity and inclusion
- Consultation, negotiation and agreement
- Teacher professionalism and agency
- Curriculum, assessment and pedagogy
- Good work - protecting jobs and decent working conditions
- Good work - training, professional development and learning
- Good work - managing performance
- Accountability for and governance of digital technologies
- Commercial and third-party providers - protecting education interests
Explainers and definitions
Further information
Introduction
Our principles and accompanying advice and guidance are intended to help NASUWT Representatives to judge whether digital technologies, including artificial intelligence (AI) technologies, are being designed, developed, procured and implemented in ways that secure and protect the lawful rights and interests of teachers, school leaders and learners.
This includes their educational and human rights, as well as their rights in relation to privacy and data protection, equality, employment and decent working conditions.
The UK General Data Protection Regulation (UK GDPR) provides an important means for challenging unlawful policies and practices relating to the use of digital technologies.
It is important to identify when personal data is being collected and processed through digital technologies and whether the data is being or could be used to monitor, predict or judge behaviours and performance.
There is also a need to establish:
- who makes decisions about the design, development, procurement and application of the system or technology;
- how decisions are made;
- who controls the use of the system or technology;
- who has access to the datasets generated by the technology or application; and
- what happens to those datasets.
Our principles and related advice and guidance will help NASUWT Representatives to play an active role in supporting and challenging the introduction and use of digital technologies in the school or setting.
NASUWT principles
A public good and human right
- Education is a public good and a human right; therefore, digital technologies must serve the broader goals and objectives of education. This includes ensuring high-quality, inclusive and equitable education for all; recruiting, developing and retaining a high-quality teaching workforce; and building a just, sustainable society. [1]
This means that:
- decisions about how digital technologies, including AI, are designed, developed, procured and applied in education are based on the lawful rights, including the human rights, and interests of learners, teachers and the wider school community;
- digital technologies provide clear and demonstrable benefits for both learners and teachers;
- digital technologies support the design and delivery of high-quality inclusive education and should be accessible; they must not create barriers to learning or disadvantage some groups of learners;
- there is a strategic plan that addresses the design, development, procurement and application of the digital infrastructure and the technologies that sit within it. The plan is subject to meaningful consultation with staff and is negotiated and agreed with recognised workforce unions. The plan should also address the financial implications of developing, procuring and maintaining the technologies;
- digital technologies are developed and applied in ways that are environmentally friendly and sustainable, support global citizenship and promote social justice;
- in the case of AI-based technologies, there is human involvement and oversight at all stages of the development and application of the technology or system;
- the NASUWT’s principles reflect the principles set out in UNICEF’s Policy guidance on AI for children;
- the NASUWT’s principles apply to third parties, including those who develop and supply digital technologies or who manage or support their use in schools.
For more support and further information, see:
- Council of Europe, Artificial Intelligence and Education: A critical view through the lens of human rights, democracy and the rule of law - https://www.coe.int/en/web/education/-/new-isbn-publication-artificial-intelligence-and-education
- European Commission, Ethical guidelines on the use of artificial intelligence (AI) and data in teaching and learning for educators - https://education.ec.europa.eu/news/ethical-guidelines-on-the-use-of-artificial-intelligence-and-data-in-teaching-and-learning-for-educators
- The Alan Turing Institute, AI, human rights, democracy and the rule of law: a primer - https://www.turing.ac.uk/news/publications/ai-human-rights-democracy-and-rule-law-primer-prepared-council-europe
- UNI Europa video, Building ethical AI in the world of work - https://www.facebook.com/UNIEuropa/videos/2885090238469353
- UNICEF, Policy guidance on AI for children - https://www.unicef.org/globalinsight/reports/policy-guidance-ai-children
The guidance draws from the United Nations Convention on the Rights of the Child (UNCRC) and sets out nine principles for upholding children’s rights:
- support children’s development and wellbeing;
- ensure inclusion of and for children;
- prioritise fairness and non-discrimination for children;
- protect children’s data and privacy;
- ensure safety for children;
- provide transparency, explainability and accountability for children;
- empower government and businesses with knowledge of AI and children’s rights;
- prepare children for present and future developments in AI;
- create an enabling environment.
Legal and regulatory frameworks
- The design, development, procurement and application of digital technologies comply with legislation, regulations and good practice standards
This means that:
- digital technologies are designed, developed, procured and implemented in ways that uphold children’s rights, including their rights under the UNCRC;
- digital technologies are designed, developed, procured and implemented in ways that protect teachers’ and leaders’ rights, including their rights in relation to data protection and privacy, health and safety, equality and human rights, and work and working conditions;
- schools consult teachers, learners and the wider school community about the rationale, design, development, procurement and application of the technology;
- schools actively engage recognised workforce unions in decisions about the design, development, procurement and application of digital technologies;
- impact assessments identify risks, including those relating to data protection and privacy, equality and equity, health and safety, and workers’ rights and protections, including workload, wellbeing, jobs and job roles. The results of assessments are shared with workforce unions and influence decisions about the introduction or use of the technology, including limitations and measures to mitigate adverse impacts;
- the mechanisms for protecting rights are clearly set out in policies and procedures that are communicated to all relevant parties. Policies and procedures explain how breaches will be addressed;
- if a technology or system is being used for new or different purposes, its use is suspended until it is clear that it does not have an adverse impact on the lawful rights and interests of learners or staff. Its use for a new or different purpose should also be agreed with recognised workforce unions;
- staff and learners have the right to ‘opt out’ if they have legitimate concerns about the use of a particular technology or system which cannot be addressed through mitigations.
For more support and further information, see:
- The Why Not Lab - https://www.thewhynotlab.com
- PSI Digitalisation - https://publicservices.international/resources/page/digitalisation?lang=en&id=11758
- UNICEF, Policy guidance on AI for children - https://www.unicef.org/globalinsight/reports/policy-guidance-ai-children
- TUC, People-powered technology: collective agreements and digital management systems (pdf) - https://www.tuc.org.uk/sites/default/files/2022-08/People-Powered_Technology_2022_Report_AW.pdf
- Information Commissioner’s Office (ICO), Guide to the UK General Data Protection Regulation (UK GDPR) - https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr
- Equality and Human Rights Commission (EHRC), advice and guidance on the Equality Act 2010 - https://www.equalityhumanrights.com/en/advice-and-guidance
- Equality Commission Northern Ireland, advice and guidance on equalities legislation - https://www.equalityni.org/Legislation
Legal and regulatory frameworks: data protection and privacy
- Digital technologies are designed, developed, procured and implemented in ways that uphold rights to privacy and data protection
This means that:
- schools comply with the UK GDPR;
- schools are transparent about the rationale, design, development, procurement and application of digital technologies;
- schools explain how the technology or system is to be used in ways that teachers, learners and the wider school community understand. The explanation includes the steps to be taken when there are concerns about its use or misuse;
- schools consult staff and negotiate with recognised trade unions about the rationale, design, development, procurement and application of the technology;
- schools disclose what personal data is being collected through digital technologies, how the data will be stored and how it will be processed. This includes disclosing when algorithmic management systems are being used. It also includes providing information about the use of data to profile or predict behaviour or to judge performance;
- schools ensure that the purposes for collecting and processing personal data are lawful and fair. This includes ensuring that:
  - the school only collects the personal data that is necessary for the intended purposes;
  - the intended purposes are stated explicitly;
  - the data is only processed for the intended purposes; and
  - the data is kept securely;
- schools ensure that teachers, including those working as supply teachers, other staff and learners have access to their data, with the right to object to its use or to have it corrected;
- data protection impact assessments (DPIAs) are undertaken in consultation with the workforce and others whose personal data is collected, stored and processed. A DPIA must:
  - describe the nature, scope, context and purposes of the processing;
  - assess necessity, proportionality and compliance measures;
  - identify risks to individuals; and
  - identify any additional measures to mitigate those risks;
- a technology is not used if the rights of staff and learners cannot be ensured and risks mitigated;
- digital technologies are not used to monitor the workforce, for capability or disciplinary purposes, or to monitor communications between workers and NASUWT Representatives, as this would breach data protection and privacy regulations;
- schools commit to not selling or giving away datasets that include personal information about learners, staff or the wider school community.
For more support and further information, see:
- Information Commissioner’s Office (ICO), Guide to the UK General Data Protection Regulation (UK GDPR) - https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr
- ICO, Data Protection Impact Assessments - https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/data-protection-impact-assessments-dpias
- PSI Digitalisation - https://publicservices.international/resources/page/digitalisation?lang=en&id=11758 (‘Digital Bargaining Hub and Negotiation Tools’)
- NASUWT - Checklist for the Use and Management of Live Streaming
Legal and regulatory frameworks: equality, equity and inclusion
- Digital technologies comply with equalities legislation and support and enhance equality, equity and inclusion
This means that:
- decisions about whether and how a digital technology/system is designed, developed, procured and applied are guided by detailed assessments of its equality impact, including but not limited to its impact on those with a protected characteristic under equalities legislation:
  - digital technologies are not designed, developed, procured or applied in ways that unlawfully discriminate against or exclude particular groups of staff or learners. AI algorithms do not introduce new biases or replicate existing biases or prejudice;
  - issues that may affect groups of teachers, learners and/or members of the school community, including those who share a protected characteristic under equalities legislation, are identified;
  - particular attention is paid to how a digital technology addresses diversity, including through the stages of the value chain, i.e. design, development, trialling, procurement, application, review and evaluation;
  - assessments consider how digital technologies will support action to eliminate unlawful discrimination, advance equality of opportunity and foster good relations between groups;
  - assessments of a digital technology to support learning consider whether the technology could help to include learners who have an additional learning need. This includes clarifying the pedagogies and practices that support the inclusive use of the technology;
  - the results of assessments are shared with workforce unions and influence decisions about the use of a technology;
- digital technologies are monitored to ensure that they are not being applied unfairly, in ways that result in unlawful discrimination or bias, or in ways that exclude;
- consideration is given to using algorithmic systems to monitor and flag up issues relating to the equality impact of policies, procedures and practices, including those relating to pay, promotion and career progression, and professional learning and development;
- periodic reviews evaluate the extent to which digital technologies are supporting and enhancing equality, equity and inclusion;
- equality impact assessments are undertaken when changes to digital technology policies and practices are proposed.
For more support and further information, see:
- Equality and Human Rights Commission (EHRC), Artificial Intelligence in Public Services - https://archive.equalityhumanrights.com/en/advice-and-guidance/artificial-intelligence-public-services
- EHRC, Brief note for decision makers: advice on equality impact assessments (doc) - https://archive.equalityhumanrights.com/en/advice-and-guidance/equality-impact-assessments
- EHRC, advice and guidance on the Equality Act 2010 - https://www.equalityhumanrights.com/en/advice-and-guidance
- Equality Commission Northern Ireland, advice and guidance on legislation - https://www.equalityni.org/Legislation
Consultation, negotiation and agreement
- The NASUWT and recognised workforce unions are engaged meaningfully at national and local levels in decisions about the design, development, procurement, implementation, review and continued use of digital technologies
This means that:
- digital technology agreements are established which clarify the arrangements for consultation, negotiation and decision-making;
- agreements make it clear that schools will consult staff and negotiate with recognised trade unions about the rationale, design, development, procurement and application of digital technologies. Agreements also clarify that decisions will only be taken forward with the agreement of workforce unions;
- the NASUWT and recognised workforce unions are kept informed of proposals, plans and decisions relating to the design, development, procurement, application and review of digital technologies;
- staff and learners who could be affected by a digital technology are consulted as part of the impact assessment process, and their views and needs influence decisions about the design, development, procurement, application, review and withdrawal of digital technologies;
- there is ongoing monitoring of digital technologies and periodic reviews and evaluations of their use and impact, and recognised workforce unions are actively engaged in the review and evaluation processes.
For more support and further information, see:
- The Why Not Lab - https://www.thewhynotlab.com (‘Tools and guides to support negotiations to secure workers’ rights’)
- PSI Digitalisation - https://publicservices.international/resources/page/digitalisation?lang=en&id=11758 (‘Digital Bargaining Hub and Negotiation Tools’)
- TUC, When AI is the boss: an introduction for union reps (pdf) - https://www.tuc.org.uk/sites/default/files/When_AI_Is_The_Boss_2021_Reps_Guide_AW_Accessible.pdf
- TUC, People-powered technology: collective agreements and digital management systems (pdf) - https://www.tuc.org.uk/sites/default/files/2022-08/People-Powered_Technology_2022_Report_AW.pdf
- TUC, interactive training, Managed by Artificial Intelligence - https://www.tuc.org.uk/resource/managed-artificial-intelligence
Teacher professionalism and agency
- Teachers have a voice and influence, and their views and needs inform decisions about whether and how digital technologies are designed, developed, procured and applied
This means that:
- decisions about how a digital technology is designed, developed, procured and/or applied are based on the views of teachers and the needs of teachers and learners;
- teachers are consulted about how a technology will support or hinder teaching and learning in their classrooms, and decisions about whether to use the technology are based on this feedback. A technology is not introduced if it will have a detrimental impact on teaching and learning;
- where needed, teachers are given appropriate information and technical support so that they can provide informed and meaningful feedback and opinions;
- measures are in place to enable a technology to be trialled and reviewed;
- teachers are kept informed of proposals and decisions about the use of digital technologies;
- digital technologies do not deskill teachers or de-professionalise teaching, for instance by removing tasks that involve the teacher’s professional judgement and expertise. Teachers must maintain their professional autonomy and agency;
- impact assessments consider whether the introduction of a technology could change the role and core professional responsibilities of the teacher, either directly or indirectly;
- digital technologies are monitored for their impact on teachers’ and leaders’ roles and responsibilities;
- periodic reviews are undertaken to ensure that digital technologies are not having an adverse impact on the professional responsibilities and agency of teachers.
For more support and further information, see:
- The Why Not Lab - https://www.thewhynotlab.com (‘Tools and guides to support negotiations to secure workers’ rights’)
- European Commission, Ethical guidelines on the use of artificial intelligence (AI) and data in teaching and learning for educators - https://education.ec.europa.eu/news/ethical-guidelines-on-the-use-of-artificial-intelligence-and-data-in-teaching-and-learning-for-educators
Curriculum, assessment and pedagogy
- Decisions about whether and how digital technologies are designed, developed, procured and applied are determined by curriculum goals, learning objectives and the purposes of assessment
This means that:
- digital technologies support and enhance teaching and learning, including in ways that support equality, equity, inclusion and global citizenship;
- pedagogies support effective learning:
  - new pedagogical approaches may be needed to make effective use of digital technologies;
- curricula and assessments may need to respond to new digital developments, including those outside of education, which may influence how learners engage with learning and assessments;
- the curriculum educates learners to function and thrive in a digital world. This includes supporting them to be digitally literate, helping them to distinguish between fact and fiction and enabling them to become active global citizens who make informed decisions about the safe, sustainable and ethical use of AI and digital technologies;
- teachers have sufficient time to assess the usefulness of digital technologies and to plan and prepare for their incorporation into lessons.
For more support and further information, see:
- UNICEF, Policy guidance on AI for children - https://www.unicef.org/globalinsight/reports/policy-guidance-ai-children
- European Commission, Ethical guidelines on the use of artificial intelligence (AI) and data in teaching and learning for educators - https://education.ec.europa.eu/news/ethical-guidelines-on-the-use-of-artificial-intelligence-and-data-in-teaching-and-learning-for-educators
Good work - protecting jobs and decent working conditions
- Digital technologies are designed, developed, procured and implemented in ways that protect jobs and workers’ rights and secure good working conditions
This means that:
- teachers and other staff who will be affected by a technology or system are consulted about the proposals, and plans are negotiated and agreed with recognised workforce unions;
- impact assessments assess the impact of a digital technology or system on workload, stress and wellbeing, jobs and job roles, and the results inform and influence decisions about the design, development, procurement, implementation, adjustment and/or withdrawal of digital technologies;
- digital technologies do not replace or displace teachers, including teachers who do not have a permanent contract or place of work, such as supply teachers;
- if digital technologies require teachers to undertake new tasks, those tasks replace existing tasks and are manageable and sustainable;
- digital technologies do not deskill or undermine the professional status of the teacher;
- particular attention is paid to how digital technologies impact on practice:
  - policies make it clear that teachers have the right to switch off and specify the timeframe within which this applies, e.g. 6pm to 8am;
  - steps are taken to prevent teachers feeling under pressure to work longer hours or undertake additional tasks, e.g. introducing digital systems that prevent communications being received outside working hours;
- action is taken to address any pre-existing generators of workload;
- digital technologies are monitored for their impact on workload, and this feeds into periodic reviews and evaluations of teacher and leader workload and wellbeing;
- the companies that develop or supply the digital technology respect workers’ rights, including their right to decent pay and working conditions and their right to organise and join a trade union;
- digital technologies should:
  - reduce bureaucracy and workload burdens;
  - support teachers and leaders to teach and lead teaching and learning;
  - be implemented in ways that ensure the right to a work/life balance, including the right to switch off.
For more support and further information, see:
- NASUWT, Tackling Excessive Teacher Workload
Good work - training, professional development and learning
- Teachers have sufficient and equal access to personalised training and continuing professional development and learning (CPDL) to enable them to make informed decisions about the use of digital technologies in their teaching
This means that:
- teachers have an entitlement to CPDL and dedicated time within the working day to undertake CPDL. This entitlement is fully funded and covers all teachers, including those who work part time and those who have no permanent contract or place of work;
- CPDL enables teachers to develop a critical understanding of digital technologies. This includes the technical aspects of their use and the benefits and limitations for teaching and learning, including an understanding of the implications for equality, inclusion and human rights;
- teachers have access to specialist support and to professional learning networks to share ideas and innovative practice relating to the use of digital technologies;
- teachers have access to experts who can explain how the technology functions and the legal, regulatory and good practice standards relating to its introduction and use. Time is provided to enable any associated training to take place;
- consideration is given to using digital management systems to map teacher competencies and skills and flag up professional learning and development needs. Such systems would need to be designed to ensure equality and equity of access and must not be applied punitively.
For more support and further information, see:
- European Commission, Ethical guidelines on the use of artificial intelligence (AI) and data in teaching and learning for educators - https://education.ec.europa.eu/news/ethical-guidelines-on-the-use-of-artificial-intelligence-and-data-in-teaching-and-learning-for-educators
- Education International, Academy for Labour Movement Activists (ALMA) training course on the digitalisation of education. NASUWT Representatives who are interested in taking the course should contact the NASUWT Education Team so that their request can be forwarded to Education International
- The Why Not Lab - https://www.thewhynotlab.com (‘Guidance on questions for union representatives to ask as part of co-governance process’)
- PSI Digitalisation - https://publicservices.international/resources/page/digitalisation?lang=en&id=11758 (‘Understanding Digitalisation’, ‘Digital Bargaining Hub and Negotiation Tools’ and ‘Assessing Your Union’s Digital Readiness’)
Good work - managing performance
- Where digital technologies enable monitoring of a teacher’s practice, this information is controlled by the teacher and, where used, is only used for self-reflection and personal development purposes
This means that:
- staff who will be subject to a performance management system are consulted about the design, development and implementation of the system;
- workforce unions are actively engaged at all stages of the process to design, develop, procure and implement performance management systems, and decisions about their introduction and use are negotiated and agreed with the unions;
- digital technology is not used for punitive, high-stakes performance management purposes:
  - using digital technologies for high-stakes performance management purposes, including purposes such as pay progression and promotion, may breach data protection regulations.
For more support and further information, see:
- TUC, When AI is the boss: an introduction for union reps (pdf) - https://www.tuc.org.uk/sites/default/files/When_AI_Is_The_Boss_2021_Reps_Guide_AW_Accessible.pdf
- NASUWT, Performance Management
Accountability for and governance of digital technologies
- Digital technologies are inclusively governed, but the employer is the ‘responsible body’, liable for any harms that arise from the deployment of the technologies
This means that:
- the employer is responsible for ensuring that digital technologies are implemented in ways that comply with legislation, regulations and good practice standards. This includes ensuring that data is kept secure and confidential;
- the employer is accountable for any errors or biases that arise from implementing the technology or system;
- a participatory data stewardship approach is adopted to involve teachers, learners and other members of the school community in the use of data. The approach is consistent with the Ada Lovelace Institute’s (ALI’s) framework for participatory data stewardship, which sets out five commitments:
  - inform: a commitment to keep people informed about how their data is being governed;
  - consult: a commitment to listen to, acknowledge, and provide feedback to people on concerns and aspirations for the governance of their data;
  - involve: a commitment to work with those people to ensure that their concerns and aspirations are directly reflected in data governance;
  - collaborate: a commitment to seek their advice and innovation in the design of data governance models and to incorporate their recommendations where possible;
  - empower: a commitment to advise and assist in line with their decisions about their own data governance models;
- digital technologies are monitored, reviewed and evaluated periodically to assess the ongoing educational, social and ethical benefits and risks associated with their use:
  - evaluations draw on the results of monitoring relating to privacy and data protection, equality and equity, and workload and wellbeing;
  - reviews and evaluations ensure that digital technologies are not being used beyond their original purpose.
For more support and further information, see:
- Information Commissioner’s Office, Guide to the UK General Data Protection Regulation (UK GDPR) (section on accountability and governance) - https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr/accountability-and-governance
- Ada Lovelace Institute, Participatory data stewardship: A framework for involving people in the use of data - https://www.adalovelaceinstitute.org/report/participatory-data-stewardship
- The Why Not Lab - https://www.thewhynotlab.com (guidance on questions for union representatives to ask as part of co-governance process)
- TUC, People-powered technology: collective agreements and digital management systems (pdf) - https://www.tuc.org.uk/sites/default/files/2022-08/People-Powered_Technology_2022_Report_AW.pdf (This provides a brief explanation of participatory data stewardship and the data lifecycle, as well as the steps for establishing collective agreements around the use of AI)
Commercial and third-party providers - protecting education interests
- Developers and suppliers of digital technology systems and products operate in ways that are consistent with the principles above
This means that:
- the developer and/or supplier of the technology or system operates fair and sustainable employment practices, including for workers who are based in other countries. They provide decent work and working conditions and recognise workers’ rights to organise and join a trade union;
- procurement/supplier contracts include explicit clauses to clarify that the school has joint data access and control. This means that contracts should:
  - include the right to demand amendments to or the withdrawal of digital systems if intended or unintended harms or faults are detected;
  - clarify that the school will not accept updates or amendments to digital technologies/systems that could have an adverse impact on the lawful rights, including human rights, and interests of learners and staff;
  - clarify that the school will have a say in what happens to data when a contract is terminated;
- developers/suppliers have conducted impact assessments and shared the results of these assessments with the school. Assessments should identify the risks and detail mitigations in relation to:
  - data protection and privacy of staff, learners and the wider school community;
  - equality, equity and inclusion of staff, learners and the wider school community, including those who share a protected characteristic under equalities legislation;
  - workload and wellbeing of staff and learners;
- the results of impact assessments are shared with recognised workforce unions and made available to staff, learners and the wider school community:
  - there should be clear evidence that the designer/developer/vendor has taken appropriate action to identify and mitigate potential adverse impacts;
- developers/suppliers monitor and periodically review and evaluate the digital technology/system;
- datasets that include the personal data of learners, staff and/or other members of the school community should not be sold, given away or transferred to third parties without explicit consent;
- there is evidence that developers/suppliers are committed to operating in ways that are sustainable and have a positive impact on society and the environment;
- if the school or setting has no direct influence over the design of a digital technology, e.g. a digital platform provided by a global company such as Microsoft or Google:
  - the technology is applied in ways that adhere to the principles above;
  - functions that pose risks for data protection and privacy are identified and disabled or rejected, or users are informed of the risks and their right to ‘opt out’;
  - there is ongoing monitoring of the technology to identify new and emerging risks and unacceptable practices, which are reported to the employer and to the NASUWT and recognised workforce unions:
    - e.g. evidence that user data (teachers, learners, parents) is being used to profile and target products and resources;
    - e.g. evidence that the user data is being repackaged and sold to third parties who then target those users.
For more support and further information, see:
- The Why Not Lab - https://www.thewhynotlab.com (guidance on data rights and governance of digital systems, including questions to ask in relation to procurement contracts)
- PSI Digitalisation - https://publicservices.international/resources/page/digitalisation?lang=en&id=11758 (‘Digital Bargaining Hub and Negotiation Tools’)
Explaining digital technologies, AI, algorithms, generative AI and algorithmic management
Digital technologies are electronic tools, systems, devices and resources that generate, store or process data. [2]
Digital learning refers to any type of learning that uses technology.
Digital technology in education can be categorised as technology that is used for education management and delivery, technologies for learning and assessment, and technologies to support and enhance teaching.
There are many definitions of AI and definitions are likely to change as AI evolves. UNICEF defines AI as:
‘…machine-based systems that can, given a set of human-defined objectives, make predictions, recommendations, or decisions that influence real or virtual environments. AI systems interact with us and act on our environment, either directly or indirectly. Often, they appear to operate autonomously, and can adapt their behaviour by learning about the context.’ (UNICEF 2021)
An algorithm is a mathematical rule or process which is followed to perform a calculation or solve a problem.
Algorithms set out the logical steps that digital technologies follow to process data to make predictions, recommendations or decisions.
Machine learning allows algorithms to extract correlations from data, build models and refine the models as the machine learns. These models can be extremely complex, making it impossible for a human to work out how a particular decision, recommendation or prediction was reached.
Machine learning also means that the way in which the technology operates may change over time.
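To make the distinction concrete, the sketch below is an illustration added to this explainer, not part of the NASUWT guidance: all data, names and thresholds in it are hypothetical. It contrasts a hand-written algorithm, whose rule is explicit and therefore easy to explain, with a toy ‘learned’ rule whose cut-off is extracted from example data. Real machine-learning systems build far more complex models from far more data, which is why their individual decisions can be hard or impossible to explain.

```python
# Illustrative sketch only - hypothetical data and function names, not from the NASUWT guidance.

# 1) A hand-written algorithm: the rule is explicit, so any decision is easy to explain.
def flag_for_review(hours_of_homework: float) -> bool:
    """Explicit rule: flag any learner reported as doing more than 10 hours of homework a week."""
    return hours_of_homework > 10

# 2) A minimal 'machine-learned' rule: the cut-off is extracted from example data
#    rather than written down by a person. Real systems learn far more complex models.
examples = [(2, False), (4, False), (7, False), (9, True), (12, True), (15, True)]  # (hours, was_flagged)

def learn_threshold(data):
    """Pick the cut-off that best reproduces the example decisions (a toy form of 'training')."""
    candidates = sorted(h for h, _ in data)
    best, best_errors = candidates[0], len(data)
    for cut in candidates:
        errors = sum((h > cut) != flagged for h, flagged in data)
        if errors < best_errors:
            best, best_errors = cut, errors
    return best

learned_cut = learn_threshold(examples)

def learned_flag(hours_of_homework: float) -> bool:
    # The decision now depends on whatever pattern was in the training data,
    # not on a rule a person chose and can point to.
    return hours_of_homework > learned_cut

print(flag_for_review(11), learned_flag(11), learned_cut)
```

In this toy case the learned cut-off can still be inspected; in practice, models built from large datasets cannot be summarised as a single readable rule, and their behaviour may shift as they are retrained on new data.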
Generative AI is a type of AI system where the algorithms have been trained to respond to prompts and create new outputs such as images, text and audio. Generative AI deals with huge amounts of data at speed. ChatGPT is one example of a generative AI system.
Algorithmic management is a system which either partially or fully automates a management task. It includes technology-driven surveillance or monitoring, for instance to track and allocate tasks or to set performance targets or measures.
Further information
NASUWT advice and guidance
- Data protection and privacy
- Remote and hybrid education
TUC
- TUC (2021) When AI is the boss: an introduction for union reps (pdf) - https://www.tuc.org.uk/sites/default/files/When_AI_Is_The_Boss_2021_Reps_Guide_AW_Accessible.pdf (Accessed 1 February 2023)
- TUC (2022) People Powered Technology: Collective agreements and digital management systems (pdf) - https://www.tuc.org.uk/sites/default/files/2022-08/People-Powered_Technology_2022_Report_AW.pdf (Accessed 1 February 2023)
- TUC (2021) Dignity at work and the AI revolution: A TUC Manifesto - https://www.tuc.org.uk/research-analysis/reports/dignity-work-and-ai-revolution (Accessed 1 February 2023)
- TUC (2021) Technology managing people: the worker experience (pdf) - https://www.tuc.org.uk/sites/default/files/2020-11/Technology_Managing_People_Report_2020_AW_Optimised.pdf (Accessed 1 February 2023)
- TUC, interactive training, Managed by Artificial Intelligence - https://www.tuc.org.uk/resource/managed-artificial-intelligence
- Allen, Robin QC and Masters, Dee (2021) Technology managing people: the legal implications, London: TUC (pdf) - https://www.tuc.org.uk/sites/default/files/Technology_Managing_People_2021_Report_AW_0.pdf (Accessed 1 February 2023)
Information Commissioner’s Office
- Information Commissioner’s Office (ICO), Guide to the UK General Data Protection Regulation (UK GDPR) - https://ico.org.uk/for-organisations/guide-to-data-protection/guide-to-the-general-data-protection-regulation-gdpr (Accessed 24 January 2023)
- ICO, Video Surveillance, including guidance for organisations using CCTV - https://ico.org.uk/for-organisations/guide-to-data-protection/key-dp-themes/guidance-on-video-surveillance-including-cctv (Accessed 24 January 2023)
Hubs providing resources and tools for unions on digitalisation and AI
- The Why Not Lab - https://www.thewhynotlab.com
  The hub provides tools and guides, activities, blogs, research, reports and policies that workers can use to negotiate and secure workers’ rights. This includes a guide to support workers in relation to the co-governance of algorithmic systems.
- PSI Digitalisation - https://publicservices.international/resources/page/digitalisation?lang=en&id=11758
  The website includes a link to the Digital Bargaining Hub and Negotiation Tools, as well as areas covering Understanding Digitalisation and Assessing Your Union’s Digital Readiness.
- Education International, Academy for Labour Movement Activists (ALMA) training course on the digitalisation of education
  The course is split into three units. Each unit comprises a series of short three- to ten-minute videos. The course is open to trade union activists, and NASUWT Representatives who are interested in taking it should contact the NASUWT Education Team so that their request can be forwarded to Education International.
The Council of Europe
- Leslie, D.; Burr, C.; Aitken, M.; Cowls, J.; Katell, M.; and Briggs, M. (2021) Artificial intelligence, human rights, democracy and the rule of law: A primer. Strasbourg: Council of Europe - https://www.turing.ac.uk/research/publications/ai-human-rights-democracy-and-rule-law-primer-prepared-council-europe (Accessed 24 January 2023)
- Holmes, Wayne; Persson, Jen; Chounta, Irene-Angelica; Wasson, Barbara; and Dimitrova, Vania (November 2022) Artificial Intelligence and Education: A critical view through the lens of human rights, democracy and the rule of law. Strasbourg: Council of Europe - https://www.coe.int/en/web/education/-/new-isbn-publication-artificial-intelligence-and-education (Accessed 24 January 2023)
Other sources of advice, support and information
- Ada Lovelace Institute (September 2021) Participatory data stewardship: A framework for involving people in the use of data - https://www.adalovelaceinstitute.org/report/participatory-data-stewardship (Accessed 28 April 2023)
- The Edtech podcast provides useful information and discussion about the use of technology and AI in education. The podcast is aimed at teachers, parents and others who have an interest in education - https://theedtechpodcast.com
- Equality and Human Rights Commission (EHRC), Artificial Intelligence in Public Services - https://archive.equalityhumanrights.com/en/advice-and-guidance/artificial-intelligence-public-services
- EHRC, Guidance on the Equality Act - https://archive.equalityhumanrights.com/en/advice-and-guidance/equality-act-guidance
- EHRC (2017) Brief note for decision makers - https://archive.equalityhumanrights.com/en/advice-and-guidance/equality-impact-assessments (Accessed 28 April 2023)
- Equality Commission Northern Ireland, advice and guidance on legislation - https://www.equalityni.org/Legislation
- European Commission (2022) Ethical guidelines on the use of artificial intelligence (AI) and data in teaching and learning for educators. Luxembourg: European Union - https://education.ec.europa.eu/news/ethical-guidelines-on-the-use-of-artificial-intelligence-and-data-in-teaching-and-learning-for-educators (Accessed 28 April 2023)
- Fengchun, M.; Holmes, W.; Huang, R.; and Zhang, H. (2021) Artificial intelligence in education: guidance for policy-makers. Paris: UNESCO - https://unesdoc.unesco.org/ark:/48223/pf0000376709 (Accessed 26 January 2023)
- Global Partnership on Artificial Intelligence (GPAI) (November 2022) AI for Fair Work (pdf) - https://www.gpai.ai/projects/responsible-ai/gpai-responsible-ai-wg-report-2022.pdf (Accessed 17 January 2023)
- Health and Safety Executive - https://www.hse.gov.uk
- Southgate, E.; Blackmore, K.; Pieschl, S.; Grimes, S.; McGuire, J.; and Smithers, K. (2018) Artificial intelligence and emerging technologies (virtual, augmented and mixed reality) in schools: A research report. Newcastle: University of Newcastle, Australia - https://www.education.gov.au/supporting-family-school-community-partnerships-learning/resources/ai-schools-report (Accessed 24 January 2023)
- UNICEF (November 2021) Policy guidance on AI for children v.2. New York: UNICEF - https://www.unicef.org/globalinsight/reports/policy-guidance-ai-children
- UNI Europa (2021) Building ethical AI in the world of work (video) - https://www.facebook.com/UNIEuropa/videos/2885090238469353
Footnotes
[1] This reflects NASUWT policy, e.g. see World Class Schools, but also international agreements such as the Sustainable Development Goal for Education (SDG4) and the purposes of education set out in the United Nations Convention on the Rights of the Child (UNCRC).
[2] Department of Education, Victoria State, Australia - https://www.education.vic.gov.au/school/teachers/teachingresources/digital/Pages/teach.aspx