The organisation Responsible AI describes responsibility in the context of AI as a key concept for anchoring AI innovation to human rights, ethics and a flourishing human race.
This neatly summarises the importance of deploying AI responsibly in support of diversity, equity, inclusion and wellbeing, underpinned by frameworks and regulations that govern AI and related technology.
Building public trust through Responsible AI deployment
The responsible deployment of AI is also essential for building public trust, especially in public-facing applications such as financial systems, healthcare and transport. Trust will undoubtedly need to be built over time, even where AI is deployed responsibly, but a focus on responsible deployment will help to establish that trust, with the eventual aim of aligning AI with human rights, ethics and a flourishing human race, as mentioned above.
The EU AI Act: A regulatory framework for ethical AI
The EU AI Act was one of the first pieces of regulation governing the deployment of AI, and it identifies certain AI practices as prohibited. In our earlier discussion of the Act, we suggested that some technologies could be excluded from patentability because of their links to those prohibited practices. Following on from that discussion, this note considers how technologies which implement responsible AI may well be patentable, and why the opportunity to patent them should not be ignored.
Examples of patentable Responsible AI technologies
Technologies which implement responsible AI are wide-ranging. They span, for instance, techniques for reducing bias in AI systems (bias which can be introduced through the data used to train the underlying model) and the use of AI to optimise grid computing and manufacturing processes. The range of opportunities to introduce AI into wellbeing and medical applications is also extensive.
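To make the bias-reduction example more concrete, the sketch below shows one well-known technique: reweighting an imbalanced training set so that an under-represented class is not systematically ignored by the model. This is a minimal, hypothetical illustration only; the data and choice of library are assumptions and do not describe any particular invention, but they show the kind of specific, technical mechanism that a patent claim would focus on.

```python
# Hypothetical sketch: mitigating dataset bias via class reweighting.
# The data, model choice and parameters are illustrative assumptions only.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.utils.class_weight import compute_class_weight

# Toy, imbalanced training data: 90 samples of class 0, only 10 of class 1.
rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0.0, 1.0, (90, 3)), rng.normal(1.5, 1.0, (10, 3))])
y = np.array([0] * 90 + [1] * 10)

# Compute weights that counteract the imbalance in the training data.
weights = compute_class_weight(class_weight="balanced", classes=np.array([0, 1]), y=y)

# Train a simple classifier with the balancing weights applied, so the
# minority class contributes proportionally more to the training loss.
model = LogisticRegression(class_weight={0: weights[0], 1: weights[1]})
model.fit(X, y)
```

Framed this way, the contribution is a concrete technical measure (rebalancing the training signal) rather than a general aspiration to "reduce bias", which is the kind of distinction that matters for patentability, as discussed below.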
Innovating in this area requires considerable investment of time and resources, but can the resulting technology be protected with patents, as technology in other areas can?
Understanding the patentability of Responsible AI solutions
Technology in this field must, like technology in other fields, satisfy the requirements of novelty, inventive step and industrial applicability, and must not fall within the categories of excluded subject matter (methods of treatment, etc.). The key obstacle to the patentability of these technologies will likely be the requirement that they solve a technical problem, especially for technologies which purport to provide more general effects, such as improvements in wellbeing; such an effect would almost certainly need to be defined more specifically to have any chance of satisfying the requirements of patentability.
Framing your Responsible AI innovation for patent success
Describing your invention as providing more responsible deployment of AI will almost certainly not be enough to satisfy the requirements of patentability; it will likely be seen as too vague or non-technical. It is therefore important to frame the technology in the right way and to focus on the specific problem solved by the invention rather than a vague, high-level statement about the wider societal impact of AI. For example, if your invention enables more responsible deployment of AI because it uses less energy, it is better to focus on the energy-reduction aspects rather than the wider societal concern of excessive energy consumption.
Given the potential for exclusions from patentability introduced by the EU AI Act, it is also wise to review your invention critically to see whether it could be interpreted as prohibited. To mitigate this risk, it is advisable to file patent applications protecting your innovations with wording that leaves plenty of room for variation in the claims, should they need to be amended to avoid the exclusions.
The content of this article is intended to provide a general guide to the subject matter. Specialist advice should be sought about your specific circumstances.