Out of rithm
The idea that today’s cartelists are as likely to be found hiding behind computer screens as in smoke-filled rooms is a hot topic, as businesses use increasingly sophisticated algorithms and artificial intelligence (“AI”) to align their online prices with those of their competitors: a form of tacit collusion.
Tacit collusion may not in itself be illegal. It can be seen as a rational reaction to market characteristics, not amounting to an agreement or concerted practice, and therefore escaping the competition law prohibition in Article 101 of the Treaty on the Functioning of the European Union.
In May 2017 US Federal Trade Commissioner Terrell McSweeny said that “We shouldn’t outlaw pricing algorithms. Algorithms are right up there with the printing press in terms of their contribution to our modern economy.” However, competition authorities still need to mitigate any risks to consumers, as well as any privacy issues that may arise.
Competition authorities have recently considered how the use of algorithms by companies such as Google and Amazon can affect markets. Whilst such algorithms bring benefits, enabling computers to process far more data far more quickly, they also potentially facilitate collusion and harm to consumers. Competition law expert Ariel Ezrachi recognized this at the Law Society’s Competition conference, saying “transparency is now backfiring and reducing competition”.
Ezrachi, together with fellow expert Maurice Stucke, has identified several ways in which businesses may breach competition law by using algorithms to price-match, ranging from simple software to more complex AI that reads and reacts to competitors’ pricing. The debate continues as to whether current legislation is sufficient to prosecute such conduct, and who is legally responsible for price fixing generated by machines. In the meantime, what should you be aware of?
Shooting the messenger
At the most primitive level, algorithms may form part of a wider collusive agreement, for example acting as go-betweens that execute the explicit agreements or concerted practices of businesses. In 2015, David Topkins and his co-conspirators, who sold posters in the United States through Amazon Marketplace, adopted “specific pricing algorithms for the sale of certain posters with the goal of coordinating changes to their respective prices” and “wrote computer code that instructed algorithm-based software to set prices in conformity with this agreement.” Bear in mind:
- You can’t avoid liability by hiding behind your screen. By facilitating an existing price-fixing agreement, your algorithm is simply carrying out an illegal activity devised by its human operators.
- Sharing information about algorithms with your competitors – even unintentionally – is dangerous. Competition authorities may require you to revise or remove any offending algorithms, to prevent your competitors from drawing conclusions about how you and/or your algorithm determine prices.
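To make the point concrete, here is a minimal, purely hypothetical sketch of what “algorithm as go-between” can look like. Nothing in it is taken from any case file; the floor value and function names are invented for illustration. The point is that the agreed floor, fixed by humans offline, is the infringement – the software merely executes it.

```python
# Hypothetical illustration only: a repricer that undercuts rivals as
# normal, but clamps its output to a price floor the conspirators
# agreed offline. Prices are in integer cents to avoid float rounding.

AGREED_FLOOR_CENTS = 1299  # floor fixed by a (hypothetical) human agreement

def reprice(competitor_prices_cents: list[int]) -> int:
    """Undercut the cheapest rival by one cent, but never drop below
    the colluded floor, so 'competition' only happens above that line."""
    cheapest = min(competitor_prices_cents)
    return max(cheapest - 1, AGREED_FLOOR_CENTS)

print(reprice([1500, 1450]))  # 1449: looks like ordinary undercutting
print(reprice([1300, 1250]))  # 1299: the agreed floor quietly kicks in
```

Viewed from outside, the first result is indistinguishable from healthy price competition; only the second reveals the coordinated floor.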
Data “hub and spoke” communications
Tacit collusion through the use of algorithms is no longer restricted to the “hub and spoke” framework (communication via third parties) traditionally familiar to competition authorities. In an online environment, a potentially problematic de facto hub-and-spoke structure may emerge if the shared use of an algorithm by competitors to monitor prices leads to collusive behaviour, whether intentional or not.
In 2016, the Court of Justice of the European Union confirmed that whilst actual knowledge of an electronic notice sent to travel agents (announcing a restriction that capped discounted rates) was required for an infringement to exist, such knowledge could be inferred. In that instance, agents independently subscribed to use the system’s algorithm, appreciating that others were doing the same and that the algorithm fixed prices at a certain level. So, whilst using third-party algorithms may save you the cost and time of acquiring data unilaterally to teach your own:
- Ensure you effectively monitor online communication to avoid allegations or inferences of collusive behaviour.
- Be aware of effects on market prices, which may be viewed as evidence of collusion.
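The mechanics of such a de facto hub can be sketched in a few lines. This is a hypothetical illustration, not the actual system from the case: the cap value, names, and prices are all invented. The shared platform is the “hub”; each agent that subscribes becomes a “spoke”, whether or not it intended to coordinate.

```python
# Hypothetical illustration only: a third-party booking platform applies
# one central discount cap to every subscribing agent. Agents choose
# their own discounts, but the shared algorithm silently truncates
# anything above the platform-wide cap. Prices are in integer cents.

PLATFORM_DISCOUNT_CAP = 0.03  # cap imposed centrally by the platform

def platform_price(list_price_cents: int, requested_discount: float) -> int:
    """Apply the agent's requested discount, truncated at the cap."""
    applied = min(requested_discount, PLATFORM_DISCOUNT_CAP)
    return round(list_price_cents * (1 - applied))

# Two 'independent' agents requesting very different discounts
# end up quoting the same price:
print(platform_price(10000, 0.10))  # 9700
print(platform_price(10000, 0.05))  # 9700
```

No agent ever speaks to another, yet their prices align – which is why monitoring what the shared system actually does to your prices matters.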
Surrendering authority to the machines
As pricing software becomes more autonomous, you may not even need to speak to your competitors to collude: computers will do this for you, by using identical algorithms or by learning from their interactions with other machines. An algorithm need not be coded to collude; if it is programmed with game theory, it will constantly remodel its strategy until it identifies a rational and predictable outcome, most likely a dominant strategy that maximizes profits. For example, the agents in Google DeepMind’s two-agent neural network game learned from experimenting and then collaborated to improve their joint position.
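A toy sketch shows how coordination can emerge with no communication at all. This is a deliberately simplistic, hypothetical example (real repricers are far more complex): two sellers each run a trivial “match the rival’s last price” rule, and because neither rule ever undercuts, prices simply lock at the high opening level.

```python
# Hypothetical illustration only: two sellers independently run the same
# "match the rival's last observed price" rule. No message ever passes
# between them, yet neither ever undercuts, so prices stay at the high
# opening level - a toy picture of tacit algorithmic coordination.

COST_FLOOR = 1.0  # neither algorithm will price below cost

def match_rival(rival_last_price: float) -> float:
    """Reprice to whatever the rival charged last round."""
    return max(rival_last_price, COST_FLOOR)

def simulate(rounds: int = 10, start_a: float = 10.0, start_b: float = 10.0):
    """Run both repricers simultaneously for a number of rounds."""
    a, b = start_a, start_b
    for _ in range(rounds):
        a, b = match_rival(b), match_rival(a)
    return a, b

print(simulate())  # both sellers remain at the high opening price
```

Neither seller was “coded to collude”; the stable high-price outcome falls out of two independently rational rules interacting, which is precisely what makes attribution of responsibility so difficult.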
At what stage, if any, would you or your company relinquish responsibility for the acts of a computer? There is certainly a perceived incentive to shift pricing decisions from businesses to algorithms, so that human operators need not concern themselves with traditional deterrents such as fines and humiliation. But beware: EU Commissioner Margrethe Vestager recently warned that because algorithms could help cartels become more effective, enforcers may have to reflect that in the fines imposed on businesses. That being the case, if you’re using an algorithm:
- Ensure you understand the scope of its activities and consider if any possible safeguards exist, because, as Vestager cautions, “what businesses need to know is that when they decide to use an automated system, they will be held responsible for what it does. So they had better know how that system works.”
- Consider if any illegal activity could have been anticipated through the use of the algorithm, and whether your company has done enough to avoid such an outcome.
Still, whilst we may appear to retain broad control over self-learning machines – we can design an algorithm, launch it, or equally shut a computer down – what if that system is integrated with other functions, or cannot be overridden by a manually set price?
As Stephen Hawking stated: “Whereas the short-term impact of AI depends on who controls it, the long-term impact depends on whether it can be controlled at all.”
Case C-74/14, Eturas and others