12 August 2024 · 17 minute read

Patentability of AI – A New Perspective from the UK on Emotional Perception

The rise of Artificial Intelligence raises the question of the extent to which an AI invention can be protected by a patent. One of the main issues faced when trying to patent AI inventions in the UK is the law relating to exclusions from patentability. UK legislation provides that certain categories of subject matter, such as “a program for a computer … as such” or a mathematical method, are not inventions and therefore cannot be protected by a patent. AI by its nature is based on computational models and mathematical algorithms, so these statutory exclusions raise tricky questions as to whether it is possible to patent AI-related technology.

Late last year, the UK High Court handed down its judgment in the Emotional Perception case (Emotional Perception AI Ltd v Comptroller-General of Patents [2023] EWHC 2948 (Ch)), which gave some much-needed guidance on whether an aspect of AI, namely an Artificial Neural Network (ANN), could be patented, or whether it engaged the statutory exclusion from patentability of a program for a computer “as such”.

The Emotional Perception case was an appeal to the High Court challenging a decision by the UK Intellectual Property Office (UKIPO) to refuse to grant Emotional Perception’s patent. The patent application claimed an improved system for providing media file recommendations to an end user, including sending a file and message in accordance with the recommendation. This might be used, for example, by a music website where a user may be interested in receiving music similar to another track. In contrast to existing systems where similar tracks are suggested according to a category derived from human classification (rock, heavy metal, folk, classical etc) or human-compiled playlists, the claimed advantage of the invention is that the AI system can offer suggestions of similar music in terms of human perception and emotion, irrespective of the genre of music and the apparently similar tastes of other humans. The invention arrives at these suggestions by passing music through a trained ANN.

The UKIPO found that the patent application claimed a “computer program as such”, which was therefore excluded from patentability, despite acknowledging that the invention was a significant improvement over the prior art. The High Court held that an ANN was not a “computer program as such” and that, even if it was wrong on that, the ANN in Emotional Perception's system would still be patentable because it amounted to more than a computer program due to the presence of a “technical effect”. My analysis of that decision can be found here.

At the time, it was hoped that the Emotional Perception decision would pave the way to make it easier to patent AI inventions in the UK. However, any comfort was short-lived as the decision was appealed. Last month, the Court of Appeal handed down judgment in the appeal (Comptroller-General of Patents v Emotional Perception AI Ltd [2024] EWCA Civ 825), reversing the judgment of the High Court and providing further clarification on the patentability of AI inventions.


The technology

ANNs are the backbone of the machine learning systems on which modern AI systems are based. They are essentially machines which process information, and ANNs as such are not new – early examples were built in the 1950s. An ANN is a machine built as a network of things called artificial neurons, which are akin to the neurons in the brain. In an ANN the artificial neurons are arranged in layers, with each neuron connected to other neurons. Each neuron is capable of processing inputs and producing an output which is passed on to other neurons in other layers. The first layer receives inputs from outside the ANN system and the last layer produces an output from the system.

At a basic level, the artificial neuron takes in a number of inputs, applies a specific weight to each input, and then adds the weighted values together. A further value, called a bias, is added to this weighted sum, and an activation function then converts the biased sum into the neuron’s overall output. The weights and biases are parameters which are adjustable by training the AI system – in the training process, the weights and biases are adjusted iteratively so that the ANN produces a given output in given circumstances.
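As a rough illustration of that calculation (not taken from the judgment or the patent application), a single artificial neuron can be sketched in a few lines of Python. The input values, weights, bias and the choice of a sigmoid activation function below are purely illustrative.

```python
import math

def neuron(inputs, weights, bias):
    """A single artificial neuron: each input is multiplied by its weight,
    the weighted values are summed, a bias is added, and an activation
    function (here a sigmoid) converts the result into the neuron's output."""
    weighted_sum = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-weighted_sum))

# Illustrative values only: three inputs, three weights and a bias.
output = neuron(inputs=[0.5, 0.2, 0.9], weights=[0.8, -0.4, 0.1], bias=0.3)
print(output)  # a value between 0 and 1, passed on to neurons in the next layer
```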

This training process is known as “back-propagation” and uses a training dataset and a loss function. The training dataset consists of sets of potential input data and an indication of the desired output (the target). Data from the training dataset is presented to the ANN and the output is examined. The difference between the actual output and the target is called the error. The loss function determines this error and the training process then applies small changes to the network parameters, so the error is corrected. The training dataset is applied again and the output examined again and this is done repeatedly in order to reduce the error. Every now and again, a validation dataset can be used to see how well the ANN is doing at correctly classifying data it has never encountered before. Once the ANN has been trained, the network topology and parameters are frozen.
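By way of a minimal sketch only (not the training scheme described in the patent application), the iterative correction described above might look something like this for a toy one-neuron “network” with a single weight and bias, adjusted by gradient descent to reduce the error over a small, made-up training dataset:

```python
# Toy training dataset: (input, target) pairs that happen to follow y = 2x + 1.
training_data = [(0.0, 1.0), (1.0, 3.0), (2.0, 5.0)]

weight, bias = 0.0, 0.0      # parameters to be learned
learning_rate = 0.1

for epoch in range(1000):    # repeated passes over the training data
    for x, target in training_data:
        output = weight * x + bias        # forward pass through the "network"
        error = output - target           # difference between output and target
        # back-propagate: nudge each parameter against its error gradient
        weight -= learning_rate * error * x
        bias -= learning_rate * error
    # (a separate validation dataset could periodically be used here to check
    # how well the network handles data it has never encountered before)

print(weight, bias)          # approaches weight ≈ 2, bias ≈ 1
```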

Conceptually there are two ways in which ANNs can be built in practice – as hardware ANNs and software ANNs. For a software ANN, there is a conventional computer system in which all the components of the ANN (the neurons, links, layers, weights, biases etc) exist only as software. For a hardware ANN there is a physical box with electronics in it – the neurons are components such as resistors and transistors, the links are wires, and the layers exist because of the way the link wires are arranged. As ANNs, the two are identical (although the hardware ANN can perform tasks and be trained more quickly).

According to Emotional Perception's claimed invention, one ANN is trained to characterise certain music tracks based on a description of how they are perceived by a human (e.g. happy, sad, relaxing, though the descriptions would be more complicated and wordy than that) and to produce co-ordinates in a notional “semantic space” for each track in a pair of music files. Two tracks of music which are semantically similar will have co-ordinates closer together.

A second ANN analyses the physical and measurable properties of the same two tracks - tone, timbre, speed, loudness etc - and produces co-ordinates in a notional “property space”. Again, differences or similarities are reflected in the proximity of the co-ordinates. The second ANN is then trained to make the distances between pairs of the property co-ordinates converge or diverge in alignment with the distances between them in the semantic space. So, if the property space co-ordinates are farther apart (or closer together) than those in the semantic space, they are moved closer together (or further apart). This training is achieved by back-propagation, with the ANN making corrections by adjusting its own internal workings, such as weights and biases. The ANN learns from the experience without being told how to do it by a human being.

The training process is repeated many times with many pairs of tracks, so the ANN learns, by repetitive correction, how to produce property vectors whose relative distances reflect semantic similarity or dissimilarity between two tracks. In other words, the ANN learns how to discern semantic similarity (or dissimilarity) from the physical characteristics of a music track. It can then take a track, analyse its physical properties, and then recommend semantically similar music files (i.e. music which will, for example, generate a similar emotional response in humans) from a database. The system sends a recommendation message and a music file.
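The judgment describes the invention only at this conceptual level, but the core idea – training the second ANN so that distances between tracks in “property space” mirror their distances in “semantic space”, and then recommending the nearest neighbour in property space – can be sketched as follows. The co-ordinates, file names and squared-difference loss are illustrative assumptions, not details taken from the application.

```python
import math

def distance(a, b):
    """Euclidean distance between two co-ordinate vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def alignment_loss(prop_a, prop_b, sem_a, sem_b):
    """Penalises the second ANN when the distance between two tracks in
    property space differs from their distance in semantic space; training
    by back-propagation would adjust the ANN's weights and biases to reduce
    this value over many pairs of tracks."""
    return (distance(prop_a, prop_b) - distance(sem_a, sem_b)) ** 2

def recommend(query_vector, database):
    """Recommends the track whose property-space vector lies closest to the
    query track's vector, i.e. the most semantically similar track."""
    return min(database, key=lambda item: distance(query_vector, item["vector"]))

# Hypothetical property-space co-ordinates for a small database of tracks.
database = [
    {"track": "track_a.mp3", "vector": [0.9, 0.1]},
    {"track": "track_b.mp3", "vector": [0.2, 0.8]},
]
print(recommend([0.85, 0.15], database)["track"])  # -> track_a.mp3
```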


The law in the UK

To be patentable, a computer-related invention needs to amount to more than a computer program (or a mathematical method) – it must provide a “technical effect”. Guidance in the UKIPO Manual of Patent Practice provides that a computer-related invention will not be excluded from patentability if it is directed to a specific technical process outside of a computer and contributes to a solution of a technical problem lying outside of the computer. Conversely, a task or process is unlikely to make the required technical contribution if it relates solely to excluded subject matter or to processing or manipulating information or data, or if it has the effect of just being a better or well-written program for a conventional computer. In short, the computer program needs to have an effect on a real-world external process or apparatus (which could include a better functioning computer) – the invention must make a contribution which is technical in nature.

The “technical contribution” is assessed in accordance with a 4-step framework developed through case law (the Aerotel test[1]) in which the claim is construed (Step 1), the actual contribution is identified (Step 2) and the question is then asked as to whether the contribution falls within the excluded subject matter (Step 3). Finally, there is consideration as to whether the contribution is actually technical in nature (Step 4). In practice, the test involves consideration of the problem to be solved by the alleged invention, how the invention works, what the advantages are and what the inventor has added to human knowledge.

The requirement for a technical effect means that core AI may be difficult to patent – improving an abstract AI algorithm is unlikely to make the necessary technical contribution. Instead, the focus should be on the technical implementation of the AI.


The UK Court of Appeal's decision

The Court of Appeal identified that the questions to be decided were:

  • Whether the exclusion from patentability of a program for a computer “as such” has any application to an ANN. This involves asking what a computer program is and whether there is a computer program in an ANN; and
  • If so, how does that exclusion apply to Emotional Perception's patent application? This involves asking whether the invention makes a technical contribution.

Question 1: Is there a computer program?

The meaning of “program for a computer” is a question of law. The Court of Appeal recognised that an ANN is unlike a conventional computer. The UKIPO submitted that, in order to customise an ANN for a particular task, the set of weights and biases has to be configured appropriately, and it is that set of weights and biases which forms the program for this kind of computer. In support, the UKIPO relied on various dictionary definitions of a “computer program”.

Emotional Perception submitted that the weights and biases of an ANN are not a computer program because a computer program takes the form of serial logical “if-then” type statements defined by a human programmer and which define exactly what it is that the programmed computer does. The core utility of ANNs lies in their ability to address problems which would be intractable to computer programming. To write a computer program requires the programmer to understand the problem at hand and the manner of its solution, from which to formulate a series of logical commands for the computer to follow. Where the problem is itself intractable, then a computer programmer (and computer program) cannot help because a programmer cannot write a program when the solution to the problem is not understood. By contrast, through iterative training on a (usually very extensive) dataset, an ANN is able to create for itself an internal structure which solves the otherwise intractable problem. Even once the ANN is trained, it is normally impossible to understand how it is approaching the problem to produce the answers it gives.

Having considered the parties' submissions, the Court of Appeal found that a computer is a machine which processes information. Although noting that dictionary definitions of a “computer program” are not determinative, the Court of Appeal found them to be helpful in this case – a computer program is a set of instructions for a computer to do something. In other words, a computer is a machine which does something, and that thing it does is to process information in a particular way. The program is the set of instructions which cause the machine to process the information in that particular way, rather than in another way.

The Court of Appeal held that however the ANN is implemented (hardware or software), it is clearly a computer – it is a machine for processing information. The weights and biases of an ANN are a computer program in that they are a set of instructions for a computer to do something. For a given machine, a different set of weights and biases will cause the machine to process information in a different way. The fact the set does not take the form of a logical series of “if-then” type statements is irrelevant. Consequently, the exclusion from patentability of a program for a computer as such is engaged in this case.

Question 2: Is there technical contribution?

The fact that the exclusion is engaged does not automatically mean that Emotional Perception's system is unpatentable – it may still be patentable if it passes the “technical contribution” test. The Court of Appeal gave examples of computer implemented inventions which are patentable, including a computer implemented method controlling an X-ray machine (Koch v Sterzel T26/86), a computer system for designing drill bits (Re Halliburton Energy Services [2011] EWHC 2508), and a system presenting a new interface to application programmers writing software for multi-touch devices (HTC v Apple [2013] EWCA Civ 451), and commented that each of these would have been just as patentable if the computer involved had been or used an ANN.

Likewise, the Court highlighted that conclusions that a computer implemented financial trading system was excluded (Merrill Lynch O/045/21) or a computer set up to produce the documents needed to form a company was excluded (Aerotel) would also be the same if an ANN was involved.

The Court of Appeal turned to look at the Aerotel steps. No issues of claim construction arose (Aerotel Step 1) as the claim clearly covered both hardware and software ANNs, so the analysis turned to identifying the contribution (Aerotel Step 2) and to the questions of whether the contribution falls within the excluded subject matter and whether the contribution is actually technical in nature (Aerotel Steps 3 and 4, which were taken together).

What was the contribution?

Apart from the step of sending the recommended file to a user, the Court of Appeal held that the remainder of the Emotional Perception system's contribution consisted of a program for a computer. The relevant contribution was therefore the provision of improved file recommendations, and the correct characterisation of that function was the final piece of the jigsaw needed to answer the question of patentability.

Does the contribution go beyond a computer program as such and is it actually technical in nature?

The determinative question was therefore whether there was a technical contribution sufficient to avoid the computer program exclusion.

The issue boiled down to whether the UKIPO was right to find that the exclusion applied because the beneficial effect was of a subjective and cognitive nature, or whether the High Court was right to hold that the exclusion did not apply because, even though what made the file recommendation better was not technical criteria (semantic similarity being a subjective matter), the ANN had reached that result by going about its analysis and selection in a technical way.

The Court of Appeal agreed with the High Court that the Emotional Perception system goes about its analysis and selection in a technical way, but held that this is because it is an ANN, i.e. a computer. The Court of Appeal considered the approach of the High Court to be flawed because it imported the undoubtedly technical nature of computer systems (including ANNs) into the analysis. If that was appropriate, then the same could be said of the other cases of excluded matter such as the computer implemented financial trading system of Merrill Lynch.

The Court of Appeal therefore ultimately agreed with the UKIPO – what makes the recommended file worth recommending are its semantic qualities. This is a matter of aesthetics, which are subjective and cognitive in nature. They are not technical and do not turn this into a system which produces a technical effect outside the excluded subject matter. The fact that there is an external transfer of data (the file recommendation) does not help for the same reason. What matters is the correct characterisation of the data being transferred and that brings the issue back to the aesthetic and therefore non-technical quality of this aspect of the contribution.

The Court of Appeal therefore upheld the decision of the UKIPO that Emotional Perception's patent application is excluded from patentability.


Discussion

The appeal decision in Emotional Perception marks another exciting development and provides further much-needed clarification for those looking to protect AI inventions.

The starting point going forwards should be that ANNs are considered to be “computer programs as such” and therefore engage the statutory exclusion from patentability. But that is not the end of the story – they may be rescued from exclusion if there is sufficient technical contribution. Accordingly, ANN implemented inventions are in no better and no worse position than other computer implemented inventions.

In light of the case, the UKIPO has updated its guidelines for examining patent applications relating to AI and, with immediate effect, has changed its practice for the examination of ANNs for excluded subject matter.

Patent examiners in the UK will treat ANN-implemented inventions like any other computer implemented invention for the purposes of section 1(2) of the Patents Act 1977. This means that they will apply the Aerotel approach to assess whether an ANN-implemented invention makes a contribution which is technical in nature.

As foreshadowed in the judgment, the computer program exclusion is not the only statutory exclusion that might be levelled at an AI invention. The reliance of AI on mathematical algorithms might invoke the exclusion for mathematical methods. There was no need for the Court of Appeal to consider the mathematical method exclusion (given that the invention was found to be unpatentable in any event), but the Court of Appeal nevertheless commented that this objection might well have had traction if the ANN had not been considered to be a computer program. It commented that it is hard to see why, even if the ANN is not to be regarded as a computer program for some reason, it is not in any case a mathematical method and so the very same analysis based on the Aerotel approach would apply with the same result. Similarly, the Court of Appeal commented that the provision of a recommendation message by the Emotional Perception system is the presentation of information, which is also unpatentable subject matter unless it involves a technical contribution. One would hope that the UKIPO will consider all possibly relevant exclusions from patentability in re-assessing its guidelines for examination of AI-related inventions.

Many will welcome the clarification from the Court of Appeal and may feel that this is a sound application of the existing law – ensuring that ANN implemented inventions are on a level-footing with other computer implemented inventions. But the bigger question may be whether, in the modern age, exclusions relating to computer programs (which were originally introduced, in part, to deal with the fact that patent offices were not equipped to search the prior art concerning computer programs and the like) are necessary and appropriate. As AI is increasingly prevalent in inventions, perhaps we are approaching the dawn of a new era in which there needs to be a reassessment of fundamental patent law concepts and their application to new technology.


[1] After Aerotel Ltd v Telco Holdings Ltd and Macrossan’s Application [2006] EWCA Civ 1371
