UK Courts Introduce Microsoft Copilot for Judges and Revise Generative AI Regulations

Judges in the UK Embrace Microsoft’s Copilot Chat for Legal Work
Judicial office holders in the UK are being encouraged to use Microsoft’s ‘Copilot Chat,’ a generative AI tool, through their internal eJudiciary platform. The recommendation follows updated guidance for judges, which stresses that public AI chatbots may not provide reliable answers drawn from authoritative databases.
Understanding Microsoft’s Copilot Chat
The recently announced AI tool, Copilot Chat, allows judges to access generative AI features via the Edge browser or Microsoft 365 applications. This platform offers enhanced data protection and aligns with the privacy and security protocols inherent to Microsoft 365. When judges log into their eJudiciary accounts, the data shared with Copilot Chat remains secure and is not made public.
Key Features of Copilot Chat
- Accessible through Edge browser or Microsoft 365 applications.
- Ensures enterprise-level data protection.
- Secure data handling when signed into eJudiciary accounts.
Guidelines for Judges Using AI Technology
While the initiative encourages judges to explore the advantages of generative AI, the UK’s Courts and Tribunals Judiciary has also highlighted several important considerations. The aim is to promote the responsible use of AI technology within the legal framework.
Warnings and Best Practices
Judges should be aware of the following points:
Limited Accuracy of Public AI Tools:
- Public AI chatbots do not always draw from authoritative databases, which can lead to inaccurate responses.
Verification Challenges:
- AI tools may help locate information a judge already knows to be correct, but they should not be relied on to find new information that cannot be independently verified.
Engagement Matters:
- The quality of an AI response depends on how the user engages with the tool, including how well prompts are framed and the quality of the underlying data. The information provided can be prone to inaccuracy, misinformation, or bias.
Understanding the Source:
- Many available large language models (LLMs) are trained on extensive internet sources, which tend to reflect US law more heavily than UK law, even when the models attempt to distinguish between the two.
Confidentiality and Data Protection Considerations
Judges must also pay attention to confidentiality issues when using AI tools. Protecting sensitive information is crucial.
Disabling Chat History:
- It’s advisable to turn off chat history in public AI chatbots if available. This helps prevent personal data from being utilized for training purposes. For instance, options like this are present in tools such as ChatGPT and Google Bard, but are not universally available.
Device Permissions:
- Be cautious when using AI applications on smartphones, as many request permissions that could access sensitive data on the judge’s device. It is recommended to decline such permissions.
Incident Reporting:
- If confidential information is accidentally exposed, it is important to report this to the designated leadership judge and the Judicial Office. Any disclosure containing personal data should be treated as a formal data incident.
The Importance of AI in Legal Practice
While there are concerns related to the use of public LLMs, the Ministry of Justice encourages the use of Copilot within the judiciary’s secure framework. This approach aims for a balanced integration of AI tools that can enhance legal operations without compromising data security.
Legal experts suggest that it may be beneficial to utilize a suite of legal technology tools that are designed with built-in security features. These specialized instruments often provide access to a wealth of verified legal data, ensuring that judges have reliable resources at their fingertips.
By adopting a prudent approach to technology, the UK judiciary aims to leverage the benefits of AI while maintaining the integrity and confidentiality required in legal proceedings.