With this it will be convenient to discuss the following:
Government new clause 17—Report on the use of copyright works in the development of AI systems.
New clause 1—Age of consent for social media data processing—
“(1) The UK GDPR is amended as follows.
(2) In Article 8 of the UK GDPR (Conditions applicable to child's consent in relation to information society services), after paragraph 1 insert—
‘(1A) References to 13 years old in paragraph 1 shall be read as 16 years old in the case of social networking services processing personal data for the purpose of delivering personalised content, including targeted advertising and algorithmically curated recommendations.
(1B) For the purposes of paragraph 1A “social networking services” means any online service that—
(a) allows users to create profiles and interact publicly or privately with other users, and
(b) facilitates the sharing of user-generated content, including text, images, or videos, with a wider audience.
(1C) Paragraph 1B does not apply to—
(a) educational platforms and learning management systems provided in recognised educational settings, where personal data processing is solely for educational purposes.
(b) health and well-being services, including NHS digital services, mental health support applications, and crisis helplines, where personal data processing is necessary for the provision of care and support’”.
This new clause would raise the age for processing personal data in the case of social networking services from 13 to 16.
New clause 2—Compliance with UK copyright law by operators of web crawlers and general-purpose AI models—
“(1) The Secretary of State must by regulations make provision (including any such provision as might be made by Act of Parliament), requiring the operators of web crawlers and general-purpose artificial intelligence (AI) models whose services have links with the United Kingdom within the meaning of section 4(5) of the Online Safety Act 2023 to comply with United Kingdom copyright law, including the Copyright, Designs and Patents Act 1988, regardless of the jurisdiction in which the copyright-relevant acts relating to the pre-training, development and operation of those web crawlers and general-purpose AI models take place.
(2) Provision made under subsection (1) must apply to the entire lifecycle of a general-purpose AI model, including but not limited to—
(a) pre-training and training,
(b) fine tuning,
(c) grounding and retrieval-augmented generation, and
(d) the collection of data for the said purposes.
(3) The Secretary of State must lay before Parliament a draft of the statutory instrument containing regulations under subsection (1) within six months of the day on which this Act is passed and the regulations are subject to the affirmative procedure.”
This new clause requires web crawlers and general-purpose AI models with UK links to comply with UK copyright law across all stages of AI development.
New clause 3—Transparency of crawler identity, purpose and segmentation—
“(1) The Secretary of State must by regulations make provision requiring operators of web crawlers and general-purpose artificial intelligence (AI) models whose services have links with the United Kingdom within the meaning of section 4(5) of the Online Safety Act 2023 to disclose information regarding the identity of crawlers used by them or by third parties on their behalf, including but not limited to—
(a) the name of the crawler,
(b) the legal entity responsible for the crawler,
(c) the specific purposes for which each crawler is used,
(d) the legal entities to which operators provide data scraped by the crawlers they operate, and
(e) a single point of contact to enable copyright owners to communicate with them and to lodge complaints about the use of their copyrighted works.
(2) The information disclosed under subsection (1) must be available on an easily accessible platform and updated at the same time as any change.
(3) The Secretary of State must by regulations make provision requiring operators of web crawlers and general-purpose AI models to deploy distinct crawlers for different purposes, including but not limited to—
(a) web indexing for search engine results pages,
(b) general-purpose AI model pre-training, and
(c) retrieval-augmented generation.
(4) The Secretary of State must by regulations make provision requiring operators of web crawlers and general-purpose AI models to ensure that the exclusion of a crawler by a copyright owner does not negatively impact the findability of the copyright owner’s content in a search engine.
(5) The Secretary of State must lay before Parliament a draft of the statutory instrument containing regulations under this section within six months of the day on which this Act is passed and the regulations are subject to the affirmative procedure.”
This new clause requires operators of web crawlers and AI models to disclose their identity, purpose, data-sharing practices, and use separate crawlers for different functions.
New clause 4—Transparency of copyrighted works scraped—
“(1) The Secretary of State must by regulations make provision requiring operators of web crawlers and general-purpose artificial intelligence (AI) models whose services have links with the United Kingdom within the meaning of section 4(5) of the Online Safety Act 2023 to disclose information regarding text and data used in the pre-training, training and fine-tuning of general purpose AI models, including but not limited to—
(a) the URLs accessed by crawlers deployed by them or by third parties on their behalf or from whom they have obtained text or data,
(b) the text and data used for the pre-training, training and fine-tuning, including the type and provenance of the text and data and the means by which it was obtained, and
(c) information that can be used to identify individual works, and
(d) the timeframe of data collection.
(2) The disclosure of information under subsection (1) must be updated on a monthly basis in such form as the regulations may prescribe and be published in such manner as the regulations may prescribe so as to ensure that it is accessible to copyright owners upon request.
(3) The Secretary of State must lay before Parliament a draft of the statutory instrument containing regulations under subsection (1) within six months of the day on which this Act is passed and the regulations are subject to the affirmative procedure.”
This new clause mandates transparency about the sources and types of data used in AI training, requiring monthly updates accessible to copyright owners.
New clause 5—Enforcement—
“(1) The Secretary of State must by regulations make provision requiring the Information Commission (under section 114 of the Data Protection Act 2018) (‘the Commissioner’) to monitor and secure compliance with the duties by an operator of a web crawler or general-purpose artificial intelligence (AI) model whose service has links with the United Kingdom within the meaning of section 4(5) of the Online Safety Act 2023 (‘a relevant operator’), including but not limited to the following—
(a) the regulations must provide for the Commissioner to have the power by written notice (an ‘information notice’) to require a relevant operator to provide the Commissioner with information that the Commissioner reasonably requires for the purposes of investigating a suspected failure to comply with the duties;
(b) the regulations must provide for the Commissioner to have the power by written notice (an ‘assessment notice’) to require and to permit the Commissioner to carry out an assessment of whether a relevant operator has complied or is complying with the duties and to require a relevant operator to do any of the acts set out in section 146(2) of the Data Protection Act 2018;
(c) the regulations must provide that where the Commissioner is satisfied that a relevant operator has failed, or is failing to comply with the duties, the Commissioner may give the relevant operator a written notice (an ‘enforcement notice’) which requires it—
(i) to take steps specified in the notice, or
(ii) to refrain from taking steps specified in the notice;
(d) the regulations must provide that where the Commissioner is satisfied that a relevant operator has failed or is failing to comply with the duties or has failed to comply with an information notice, an assessment notice or an enforcement notice, the Commissioner may, by written notice (a ‘penalty notice’), require the person to pay to the Commissioner an amount in sterling specified in the notice, the maximum amount of the penalty that may be imposed by a penalty notice being the ‘higher maximum amount’ as defined in section 157 of the Data Protection Act 2018; and
(e) the regulations may provide for the procedure and rights of appeal in relation to the giving of an information notice, an assessment notice, an enforcement notice or a penalty notice.
(2) The regulations must provide that any failure to comply with the duties by a relevant operator shall be directly actionable by any copyright owner who is adversely affected by such failure, and that such copyright owner will be entitled to recover damages for any loss suffered and to injunctive relief.
(3) The regulations must provide that the powers of the Commissioner and the rights of a copyright owner will apply in relation to a relevant operator providing a service from outside the United Kingdom (as well as such a service provided from within the United Kingdom).
(4) The Secretary of State must lay before Parliament a draft of the statutory instrument containing the regulations under this section within six months of the day on which this Act is passed and the regulations are subject to the affirmative procedure.”
This new clause grants the Information Commissioner enforcement powers to ensure compliance with AI and web crawler transparency rules, including penalties for breaches.
New clause 6—Technical solutions—
“(1) The Secretary of State must conduct a review of the technical solutions that may be adopted by copyright owners and by the operators of web crawlers and general-purpose artificial intelligence (AI) models whose services have links with the United Kingdom within the meaning of section 4(5) of the Online Safety Act 2023 to prevent and to identify the unauthorised scraping or other unauthorised use of copyright owners’ text and data.
(2) Within 18 months of the day on which this Act is passed, the Secretary of State must report on such technical solutions and must issue guidance as to the technical solutions to be adopted and other recommendations for the protection of the interests of copyright owners.”
This new clause requires the Secretary of State to review and report on technical measures to prevent unauthorised data scraping by web crawlers and AI models.
New clause 7—Right to use non-digital verification services—
“(1) This section applies when an organisation—
(a) requires an individual to use a verification service; and
(b) uses a digital verification service for that purpose.
(2) Where it is reasonably practicable for an organisation to offer a non-digital method of verification, the organisation must—
(a) make a non-digital alternative method of verification available to any individual required to use a verification service; and
(b) provide information about digital and non-digital methods of verification to those individuals before verification is required.”
This new clause would create a duty upon organisations to support digital inclusion by offering non-digital verification services where practicable.
New clause 8—Data Vision and Strategy—
“Within six months of Royal Assent of this Act, the Secretary of State must publish a ‘Data Vision and Strategy’ which outlines—
(a) the Government’s data transformation priorities for the next five years; and
(b) steps the Government will take to ensure the digitisation of Government services.”
New clause 9—Departmental Board Appointments—
“(1) Within six months of the day on which this Act is passed—
(a) Government departments;
(b) NHS England; and
(c) NHS trusts
shall appoint to their departmental board or equivalent body at least one of the following—
(i) Chief Information Officer;
(ii) Chief Technology Officer;
(iii) Chief Digital Information Officer;
(iv) Service Transformation Leader; or
(v) equivalent postholder.
(2) The person or persons appointed as under subsection (1) shall provide an annual report on the progress of the department or body towards the Government’s Data Vision and Strategy.”
This new clause would require digital leaders to be represented at executive level within Government departments and other bodies.
New clause 10—Data use in Public Service Delivery Review—
“(1) The Secretary of State must, every 12 months, lay before Parliament a ‘Data use in Public Service Delivery Review’.
(2) The Data use in Public Service Delivery Review shall include, but is not limited to assessment of the steps being taken to—
(a) improve the Government’s use of data in public service delivery over the previous 12 months;
(b) expand the use of data to support increased and improved digital services in public service delivery;
(c) improve expertise and digital talent within Government departments to help expand the use of data for public service delivery; and
(d) facilitate and regulate for better use of data in the delivery of public services.”
This new clause would require an annual assessment by the Secretary of State to examine the steps being taken to facilitate and regulate the use of data in the delivery of public services using digital and online technologies.
New clause 11—Access to a deceased child’s social media data—
“(1) Where a person under 18 years of age is deceased, a parent or legal guardian (the ‘requestor’) may request from any internet service provider (ISP) the child’s user data from up to 12 months prior to the date of death.
(2) The ISP must provide a copy of the requested data, or direct account access, upon verification of the requestor’s identity and relationship to the deceased person, and no court order shall be required for such disclosure.
(3) ‘User data’ includes all content, communications, or metadata generated by or associated with the deceased person’s online activity, including stored messages and posts, except where the deceased person had explicitly directed otherwise prior to death.
(4) The ISP may refuse or redact specific data only where—
(a) disclosure would unduly infringe the privacy rights of another individual,
(b) the deceased person had explicitly opted out before death,
(c) there is a conflicting court order, or
(d) a serious risk to public safety or national security would result.
(5) In providing data under this section, the ISP must comply with data protection legislation.
(6) This section constitutes a lawful basis for disclosure under Article 6 of the UK GDPR.
(7) The Secretary of State may, by regulations subject to the affirmative resolution procedure—
(a) provide guidance on verifying parent or guardian status,
(b) clarify any additional grounds for refusal, and
(c) prescribe safeguards to protect third-party confidentiality.
(8) For the purposes of this section—
‘internet service provider (ISP)’ includes any provider of social media, messaging, or other online platforms; and
‘data protection legislation’ has the meaning given in section 51 of this Act.”
This new clause would allow parents of a deceased minor to obtain that child’s social media data without a court order, subject to privacy safeguards for third parties.
New clause 12—Raising the minimum age at which users can consent to processing of personal data—
“(1) The UK GDPR is amended in accordance with subsection (2) of this section.
(2) After paragraph 1 of Article 8 of the UK GDPR (Conditions applicable to child’s consent in relation to information society services) insert—
‘(1A) References to “13 years old” and “age of 13 years” in paragraph 1 shall be read as “16 years old” and “age of 16 years” in the case of processing of personal data.
(1B) Paragraph (1A) does not apply to—
(a) platform systems and services operated where the primary purpose of processing of personal data is for the advancement of a charitable purpose as defined in the Charities Act 2011;
(b) publicly owned platform systems and services operated for the primary purpose of law enforcement, child protection, education, or healthcare;
(c) cases in which the Secretary of State determines it is in the best interests of the child for an operator to accept the child’s own consent.’”
This new clause would raise the age for processing personal data from 13 to 16 years old with certain exceptions for charitable purposes and child safety.
New clause 13—Code of practice for the use of children’s educational data—
“(1) Within 6 months of the passage of this Act, the Information Commissioner must prepare a code of practice which contains such guidance as the Information Commissioner considers appropriate on the processing of children’s data in connection with the provision of education.
(2) Guidance under subsection (1) must consider—
(a) all aspects of the provision of education including learning, school management, and safeguarding;
(b) all types of schools and learning settings in the development of guidance;
(c) the use of AI systems in the provision of education;
(d) the impact of profiling and automated decision-making on children’s access to education opportunities;
(e) children’s consent to the way their personal data is generated, collected, processed, stored and shared;
(f) parental consent to the way their children’s personal data is being generated, collected, processed, stored and shared;
(g) the security of children’s data;
(h) the exchange of information for safeguarding purposes.”
This new clause requires the Information Commissioner to produce a code of practice for accessing children’s educational data.
New clause 14—Transparency of business and customer data used in training Artificial Intelligence models—
“(1) The Secretary of State must by regulations make provision requiring operators of general-purpose AI models to disclose upon request information about business data and customer data processed for the purposes of pre-training, training, fine-tuning, and retrieval-augmented generation in an AI model, or any other data input to an AI model.
(2) Business data and customer data must include, but is not limited to, the whole or any substantial part of a literary, dramatic, musical or artistic work, sound recording, film or broadcast included in any text, images and data used for the purposes set out in subsection (1).
(3) Information disclosable under subsection (1) must include but is not limited to:
(i) Digital Object Identifiers and file names;
(ii) Details of how the work was identified, including metadata;
(iii) The source from which it was scraped or otherwise obtained; and
(iv) The URLs accessed by crawlers deployed by operators, or by third parties, to obtain the data.
(4) The owner of rights in any individual work identifiable in information disclosed under subsection (1) must be provided upon request to the relevant operator with information as to whether and how they have complied with the laws of the United Kingdom in respect to that work.
(5) The Secretary of State must lay before Parliament a draft of the statutory instrument containing regulations under subsection (1) within six months of the day on which this Act is passed and the regulations are subject to the affirmative procedure.”
This new clause would require the Secretary of State to set out transparency provisions requiring generative AI developers to provide information to enable individuals and creative businesses to determine whether their data, works and other subject matter have been used in training datasets.
New clause 15—Complaints procedure for vulnerable individuals—
“(1) The Data Protection Act 2018 is amended in accordance with subsections (2) to (4).
(2) After section 165(3) insert—
‘(3A) For complaints under subsection (2), the Information Commissioner must provide appropriate complaints-handling procedures for—
(a) victims of modern slavery,
(b) victims of domestic abuse,
(c) victims of gender-based violence, or
(d) data subjects otherwise in a position of vulnerability.
(3B) Procedures under subsection (3A) must include—
(a) appropriate support for vulnerable individuals;
(b) provision of specialised officers for sensitive cases;
(c) signposting to support services;
(d) provision of a helpline;
(e) de-escalation protocols.’
(3) After section 166(1)(c) insert—
‘(d) fails to investigate a complaint appropriately or take adequate action to remedy findings of inadequacy.’
(4) After section 166(2)(b), insert—
‘(c) to use formal powers as appropriate to investigate a complaint and to remedy any findings of inadequacy, unless the request from the data subject is manifestly unfounded or excessive.’”
This new clause would require the Information Commission to introduce a statutory complaints procedure for individuals in a position of vulnerability and new grounds of appeal to an Information Tribunal.
New clause 18—Report on the introduction of a public interest test for allowing access to NHS data by third-parties and companies—
“(1) The Secretary of State must within six months of the passing of this Act—
(a) prepare and publish a report examining the need for a specific statutory public interest test to determine and safeguard access to NHS data by third-parties and companies.
(b) within 28 days of a report being laid under subsection (1) the Government must schedule a debate and votable motion on the findings of the report in each House.
(2) The report must consider—
(a) whether and in what situations it would be necessary, proportionate and lawful to share NHS data with third-parties and companies when the interests and risks to both the individual and/or public is considered.
(b) when it would be in the public interest and in the best interests of patients and the NHS to allow access by third-parties and companies to NHS data in relation to the provision of health care services and for promotion of health.”
This new clause would require the Secretary of State to produce a report on the introduction of a public interest test for allowing access to NHS data by third-parties and companies and then to schedule a debate on it in each House.
New clause 19—Secretary of State’s duty to review the age of consent for data processing under the UK GDPR—
“(1) The Secretary of State must, within 12 months of Royal Assent of this Act, have conducted a review and published a report into the operation of Article 8 (Conditions applicable to child's consent in relation to information society services) of the UK GDPR in relation to the data processed by social media platforms of children under the age of 16.
(2) As part of this review, the Secretary of State must consider—
(a) the desirability of increasing the digital age of consent under the UK GDPR from 13 to 16, taking into account the available evidence in relation to the impact of social media platforms on the educational, social and emotional development of children; and
(b) the viability of increasing the digital age of consent under Article 8 of the UK GDPR in relation to specific social media platforms which are shown by the evidence to be unsuitable for use by children under the age of 16.
(3) Within six months of the publication of the report under subsection (1), the Secretary of State must lay a plan before Parliament for raising the digital age of consent to 16 through amendments to Article 8 GDPR, unless the review concludes that such changes are unnecessary.”
New clause 20—Duties of the Secretary of State in relation to the use by web-crawlers and artificial intelligence models of creative content—
“The Secretary of State must—
(a) by 16 September 2025, issue a statement, by way of a copyright notice issued by the Intellectual Property Office or otherwise, in relation to the application of the Copyright, Designs and Patents Act 1988 to activities conducted by web-crawlers or artificial intelligence models which may infringe the copyright attaching to creative works;
(b) by 16 September 2025, lay before Parliament a report which includes a plan to help ensure proportionate and effective measures for transparency in the use of copyright materials in training, refining, tuning and generative activities in AI;
(c) by 16 September 2025, lay before Parliament a report which includes a plan to reduce barriers to market entry for start-ups and smaller AI enterprises on use of and access to data;
(d) by 1 July 2026, publish a technological standard for a machine-readable digital watermark for the purposes of identifying licensed content and relevant information associated with the licence.”
New clause 21—Directions to public authorities on recording of sex data—
“(1) The Secretary of State must, within three months of the passage of this Act, issue regulations relating to the code of practice set out in section 49 of this Act which require public authorities to—
(a) collect, process and retain sex data only where it is lawful to do so in accordance with data protection legislation;
(b) request and record sex data accurately, in every circumstance where sex data is collected, in accordance with the following category terms and definitions—
(i) ‘Sex’ meaning male or female only based on ‘sex at birth’, ‘natal sex’ or ‘biological sex’ (these terms carrying the same meaning and capable of being used interchangeably); and,
(ii) in addition, where it is lawful to do so in accordance with data protection legislation and the Gender Recognition Act 2004, ‘Acquired Gender’ meaning male or female only, as recorded on a gender recognition certificate issued in accordance with the Gender Recognition Act 2004;
(c) have updated relevant organisation guidance to stipulate that, where sex data is collected, this must be done in accordance with the definitions set out by subsection (1)(b) within three months of these regulations coming into force;
(d) have conducted a review of the accuracy of data held in relation to the sex of data subjects to ensure that the data is accurate in recording sex at birth and, where relevant and collected lawfully, acquired gender as recorded on a gender recognition certificate within 12 months of these regulations coming into force;
(e) have taken every reasonable step to ensure that any data held in relation to the sex and, where relevant and collected lawfully, acquired gender as recorded on a gender recognition certificate of a data subject that is found to be inaccurate is rectified or erased within 18 months of these regulations coming into force; and
(f) have produced and submitted to the Secretary of State a report setting out the findings of its review in relation to the matters set out by subsection (1)(d) and, where relevant, a description of the steps taken to ensure that the data held by the relevant public authority is accurate within the definitions set out in subsection (1)(b) within 18 months of these regulations coming into force.
(2) The Secretary of State may, on receipt of a report in accordance with subsection (1)(f), instruct a public authority to take any further remedial steps within a specified timeframe reasonably necessary to ensure the accuracy of the sex and acquired gender data held by the relevant public authority.
(3) The Secretary of State must, within one month of the passage of this Act, establish and maintain a register of public authorities approved to act as sources of data relating to the attribute of sex for persons providing digital verification services.
(4) The register in subsection (3) must be published on the website of the Office for Digital Identities & Attributes or any successor body.
(5) Until such time as a public authority is added to the register under subsection (3), persons providing digital verification services may only obtain data on the sex of an individual requesting the provision of digital verification services from the record of births held by the General Register Office in accordance with subsection (6).
(6) Information supplied by the General Register Office pursuant to subsection (5) must specify sex as recorded at birth, as well as any subsequent corrections to the register in the field marked ‘Sex’.
(7) The Secretary of State may, from time to time, add public authorities to the register as under subsection (3) only upon being satisfied on the basis of a report issued under subsection (1)(f), or satisfaction of such further steps required by the Secretary of State under subsection (2) that the data held by the relevant public authority in relation to sex and, where relevant, acquired gender as recorded on a gender recognition certificate, as defined in subsection (1)(b), is accurate.”
This new clause requires the Secretary of State to issue regulations relating to the code of practice in section 49 requiring public authorities to record sex data in line with these regulations when data are collected. This clause is linked to amendments 39 and 40.
New clause 22—Recording of ethnicity data for the purposes of public service delivery—
“(1) The Secretary of State must make regulations which make provision for the collection of individual ethnicity data in the process of public service delivery and associated data collection.
(2) The regulations set out by subsection (1) must make provision for ethnic classifications to include Jewish and Sikh categories.
(3) The Secretary of State must lay before both Houses of Parliament a draft of the statutory instrument containing regulations under this section within six months of the day on which this Act is passed which will be subject to the affirmative procedure.”
This new clause requires the Secretary of State to make statutory provision for individual ethnicity data to be collected in the process of public service delivery.
New clause 23—Recording of ethnicity data on the Register of Births and Deaths—
“(1) The Secretary of State must make regulations which make provision for the collection of individual ethnicity data during birth and death registration.
(2) The regulations set out by subsection (1) must make provision for ethnic classifications to include Jewish and Sikh categories.
(3) The Secretary of State must lay before both Houses of Parliament a draft of the statutory instrument containing regulations under this section within six months of the day on which this Act is passed which will be subject to the affirmative procedure.”
This new clause requires the Secretary of State to make statutory provision for individual ethnicity data to be able to be collected during birth and death registration.
Government amendments 11 to 32.
Amendment 39, in clause 45, page 42, line 30, at the beginning insert—
“Save in respect of data relating to sex,”.
This amendment is consequential on NC21.
Amendment 40, page 43, line 15, at end insert—
“‘gender recognition certificate’ means a gender recognition certificate issued in accordance with the Gender Recognition Act 2004.”
This amendment is consequential on NC21.
Government amendments 1 to 8.
Amendment 37, in clause 67, page 75, line 24, at end insert—
“(2A) For the purposes of paragraph 2, ‘scientific research’ means creative and systematic work undertaken in order to increase the stock of knowledge, including knowledge of humankind, culture and society, and to devise new applications of available knowledge.
(2B) To meet the reasonableness test in paragraph 2, the activity being described as scientific research must be conducted according to appropriate ethical, legal and professional frameworks, obligations and standards.”
This amendment incorporates clarifications to help reduce potential misuse of the scientific research exception. The first is a definition of scientific research based on the Frascati Manual. The second is a requirement that research be conducted in line with frameworks and standards in the UKRI Code of Practice for Research.
Amendment 41, in clause 80, page 95, line 19, at end insert—
“3. For the purposes of paragraph 1(a), a human’s involvement is only meaningful if they are a natural person with the necessary competence, authority and capacity to understand, challenge and alter the decision.”
See explanatory statement for Amendment 44.
Amendment 45, page 96, line 2, at end insert—
“5. Consent in accordance with paragraph 2 cannot be given by persons under the age of 18 where—
(a) the automated decision-making is likely to produce legal or similarly significant effects on the child, or
(b) the processing involves the profiling of a child to determine access to essential services, education, or other significant opportunities.
6. The controller shall not be obliged to maintain, acquire or process additional information in order to identify the age of a data subject for the sole purpose of complying with this Regulation.
7. A significant decision may not be taken based solely on automated processing, if the data subject is a child or may be a child unless the provider is satisfied that the decision is in, and compatible with, the best interests of a child, taking into account their rights and development stage, authorised by law to which the controller is subject, and after suitable measures to safeguard the data subject’s rights and freedoms and legitimate interests are made publicly available.
8. Profiling or solely automated processing of children’s data may not occur for the purposes of targeted advertising or behavioural analysis.”
This amendment ensures that automated decision-making cannot take place in circumstances where it would affect a child’s access to significant opportunities or would not be in their best interests, as well as protections against practices such as behavioural analysis.
Amendment 46, page 96, leave out lines 13 to 19 and insert—
“(a) communicate to the data subject before and after the decision is taken the fact that automated decision-making is involved in the decision, the extent of any human involvement, and the availability of safeguards under this Article;
(b) provide the data subject with information about decisions described in paragraph 1 taken in relation to the data subject including meaningful information about the logic involved, the significance and the envisaged consequences of such processing for the data subject, and a personalised explanation for the decision;
(c) enable the data subject to make representations about such decisions;
(d) enable the data subject to obtain human intervention on the part of the controller in relation to such decisions;
(e) enable the data subject to contest such decisions.
3. For the purposes of paragraph 2(b), a personalised explanation must—
(a) be clear, concise and in plain language of the data subject’s choice in a readily available format;
(b) be understandable, and assume limited technical knowledge of algorithmic systems;
(c) address the reasons for the decision and how the decision affects the individual personally, which must include—
(i) the inputs, including any personal data;
(ii) parameters that were likely to have influenced or were decisive to the decision, or a counterfactual of what change would have resulted in a more favourable outcome;
(iii) the sources of parameters and inputs;
(d) be available free of charge and conveniently accessible to the data subject, free of deceptive design patterns.
4. Where the safeguards apply after a decision is made, the controller must give effect to data subject requests as soon as reasonably practicable and within one month of the request.
5. The controller must ensure the safeguards are fully in place and complete a data protection impact assessment under Article 35 before a decision under Article 22A is taken, documenting their implementation of the safeguards in addition to the requirements of that Article.
6. The controller must publish details of their implementation of the safeguards and how data subjects can make use of them.”
This amendment would ensure that data subjects are informed in a timely way of automated decisions made about them, and that the explanation is personalised to enable them to understand why the decision was made. It also ensures processors are incentivised to put the safeguards in place before commencing automated decision-making.
Amendment 42, page 96, line 23, after “Article 22A(1)(a),” insert
“and subject to Article 22A(3)”.
See explanatory statement for Amendment 44.
Amendment 43, page 97, line 19, at end insert—
“(3) To qualify as meaningful human involvement, the review must be performed by a person with the necessary competence, training, authority to alter the decision and analytical understanding of the data.”
See explanatory statement for Amendment 44.
Amendment 44, page 98, line 31, after “and 50C(3)(c),” insert “and subject to 50A(3)”.
This amendment and Amendments 41, 42 and 43 would make clear that in the context of new Article 22A of the UK GDPR, for human involvement to be considered as meaningful, the review must be carried out by a competent person who is empowered to change the decision in practice.
Amendment 9, in clause 81, page 100, line 7, at end insert—
“Age assurance
1C. Information society services which are likely to be accessed by children must use highly effective age verification or age estimation measures for the purpose of delivering on children’s higher protection matters.”
This amendment requires services which are likely to be accessed by children to use highly effective age verification measures.
Amendment 38, in clause 86, page 103, line 22, at end insert—
“(2A) Where personal data is processed for the purposes of scientific research under section 87(4) of the 2018 Act (‘reuse’), the processor or controller must publish details of the data sources used.
(2B) These details must as a minimum include a description of the scientific research, the provenance and method of acquisition of the personal data being reused, the original lawful basis for processing, the number of data subjects affected, and whether the data subjects have been notified of the reuse.
(2C) The processor or controller must notify the Information Commission when processing data for the purposes of scientific research under section 87(4) of the 2018 Act with the same details.”
This amendment ensures transparency for the use of scientific research exemptions by requiring those reusing personal data to publish details of that reuse and notify the Information Commission of that reuse.
Government amendments 33 and 34.
Amendment 10, in schedule 7, page 201, line 5, at end insert—
“(1B) A third country cannot be considered adequate or capable of providing appropriate safeguards by any authority where there exists no credible means to enforce data subject rights or obtain legal remedy.
(1C) For the purposes of paragraph 1A, the Secretary of State must make a determination as to whether credible means are present in a third country.
(1D) In making a determination regarding credible means, the Secretary of State must have due regard to the view of the Information Commissioner.
(1E) Credible means do not exist where the Secretary of State considers that any of the following are true:
(a) judicial protection of persons whose personal data is transferred to that third country is insufficient;
(b) effective administrative and judicial redress are not present;
(c) effective judicial review mechanisms do not exist; and
(d) there is no statutory right to effective legal remedy for data subjects.”
The amendment would prohibit personal data transfer to countries where data subject rights cannot be adequately upheld and prohibit private entities from using contracts to give the impression that data security exists.
Government amendments 35 and 36.