Conflicting Codes and Codings: How Algorithmic Trading Is Reshaping Financial Regulation

Contemporary financial markets have recently witnessed a sea change with the 'algorithmic revolution', as trading automats are used to smooth execution sequences and reduce market impact. Constantly monitored, they take an active part in the shaping of markets, and sometimes generate crises when 'they mess up' or when 'they do not want to work', as traders say. Made of scripts (IT code), algorithms are designed to replicate trading patterns: to be accepted, they have to 'behave' according to different sets of texts (the regulatory texts framing the marketplace). In this article, we draw on ethnographic fieldwork to describe algorithms and discuss the different facets of these widespread objects. We recount a controversy centring on the detachment of trading practices from traders, and their reattachment to a specific financial object. We then raise a few questions relating to the framing of practices: what does this shift in the organisation of trading activities do to the daily routines of trading rooms? Is it possible to make codes (algorithms) comply with codings (rulebooks and codes of conduct)? How does such 'equipment' matter to marketplaces, and what exactly does it imply for the development of future regulation in the market space?


Struggling for codes: the new Cold War in trading
Contemporary financial markets are witnessing a sea change in the way they operate. Since the mid-2000s, modifications related both to technological innovation and to regulatory homogenisation have been reconfiguring the market landscape. The importance of these changes was recently underlined by the Aleynikov case, which triggered in the summer of 2009 a new 'Cold War' and a related 'global arms race' (Alloway, 2009).
Aleynikov, a former Goldman Sachs VP, was charged with the alleged theft of thirty-two megabytes of proprietary trading source code just before leaving the company. The code, designed to be used within a computer platform allowing 'sophisticated, high-speed, and high-volume trades on various stock and commodities markets', was part of the high frequency trading programmes that 'generate many millions of dollars of profits per year' for Goldman Sachs (Southern District of New York, 2009: 3). The case was later followed by similar thefts at UBS, and by revelations about security measures intended to protect 'coded secrets' at Citadel Investment Group, LLC (Berenson, 2009). These stories gave algorithmic and automated trading major exposure in the industry, as firms fight for supremacy in the ability to design and protect their proprietary trading codes. Another sign that 'the hot area now is high frequency trading' (Bookstaber, 2009), the cases mark a new step in the automation of order flows in financial markets. After the initial shift from open outcry to electronic markets in the mid-1980s, a second revolution is well under way in marketplaces, where algorithms play a much bigger role in the trading process than ever before. Not only are they used to 'find' prices and match buy and sell orders (Domowitz and Wang, 1994; Lee, 1998; Muniesa, 2003), thereby materializing exchanges, but they are now also able to 'decide' when and how to send orders without human intervention. The growing development of algorithms therefore overflows the market frames.
Indeed, moving massively from exchanges to market participants, they reconfigure the nature of the agencies making markets, allowing institutions that have both the ability to develop machines and the financial resources to deploy such systems to make handsome profits; hence the Cold War espionage tone used by journalists and bloggers to describe these cases. This shift has clearly been recognized as a major one by regulators, as revealed by the publication in the US, by the Federal Reserve Board of Washington, of a report entitled 'Rise of the Machines' (September 2009), or by the French Autorité des Marchés Financiers' need to issue a press backgrounder on 'Key issues arising from the emergence of dark pools and crossing networks' (October 2009), with reference to venues where algorithms are heavily used. Tackling the liquidity issues that often serve as a backdrop for debates on algorithmic trading (Does algorithmic trading significantly change the quality of the available liquidity? How does it weigh on the order book microstructure ii?), the Fed report also mentions the most recent improvements in technology, allowing some algorithms to 'automatically read and interpret economic data releases, generating trading orders before economists have begun to read the first line' (Chaboud et al., 2009: 1; see also Leinweber, 2009: 56 and 84).

Providing 'political' descriptions of market devices
If a cohort of social scientists, ranging from Abolafia (1996) to Lee (1998) and Fligstein (2001), has already unfolded markets and their underlying mechanisms, interest in the description of devices has developed more recently (MacKenzie, 2006; Callon et al., 2007); the financial objects populating markets have now been recognized as valid objects of inquiry.
Devices at stake in activities as diverse as arbitrage (Beunza and Stark, 2004; Beunza et al., 2006), mergers and acquisitions (Beunza and Muniesa, 2005), or the hedge fund industry (MacKenzie, 2005) have recently been examined, together with the technologies once used to send orders (Preda, 2006) and receive them (Muniesa, 2008). However, paying attention to devices while describing objects and the contexts in which they occur has generated a number of criticisms among economic sociologists and political economists alike iii. The main area of dispute is the alleged lack of attention to institutions and politics in the fine-grained ethnographies proposed by proponents of the Social Studies of Finance research agenda. We do not abide by this critique: in this article we will try to show why, while suggesting that the institutions and politics of the market are in fact best made visible through the description of the devices that support their expression.
Despite all their merits, however, the aforementioned studies have not yet addressed an issue we deem important: namely the regulation of practices involving those devices and the representations thereof. When mentioned in these works, regulation appears only in passing, and it has not yet been the subject of many studies iv. The notion itself is polysemic and refers to a wide range of realities, from the realm of economics to philosophy to sociology. In this article, it will be understood as a normative activity, the purpose of which is to frame the practices developing both within and around the space of the market. Control functions, such as compliance officers, permanent controllers, market supervisors and regulators, all contribute in their own way to the framing of market practices, whether acting a posteriori (e.g. through careful checking that procedures have been followed, either internally or externally) or a priori (e.g. by issuing advice to market operators just before they engage in a trade). But how do these employees get a grip on market practices when these are coded and encapsulated in a specific device, namely the algorithm? Indeed, if trading algorithms are so widely disseminated and used by market participants in their daily duties, if these tools are really changing the face of the financial world as the examples suggest, then it is high time to question their existence from a regulatory stance.
Nowadays traders are heavily equipped with tools that allow them to delegate a portion of the practice of trading to automats ('robots'), framed with computational logics and complex binary languages, thereby leaving a space for the development of new kinds of market actors. By the end of August 2009, large banks acknowledged that approximately 80% of their total equity trading flows in some markets were being processed by algorithms (Jeffs, 2009). In fact, the dissemination of trading algorithms appears to be an eminently political subject, for it raises questions about the meaning of practices that affect the collective once they have been delegated to machines. Our point is not to judge whether the technology is good or bad, and we do not wish to put the blame on the 'machinist' type of finance currently developing in markets. Rather, the question lies in the uses that are made of the technology: it is through the resituating of practices that the politics of the market (the power relations and institutions framing the culture of the field, together with the voice mechanisms allowing for the expression of different views) can emerge. Our goal is to show, through the unfolding of the different views aggregated in the object, the kind of outcomes they produce on their regulatory environment.
Like the vast majority of financial devices, trading algorithms are the material expression of converging and diverging points of view. Built as a result of conversations defining needs between users (clients and traders) and designers (engineers and regulators), algorithms differentiate themselves from the other processing devices in use in trading rooms (such as screens, keyboards, phones and fibre optic connections) in that they receive a compendious form of practice. They do not just come as partial prostheses intended to cater to specific needs, as microphones (making sure that everybody gets the information) or keyboards and screens (which provide a way to ask for prices) do. Algorithms are entities in their own right, places where extensive financial practices are encapsulated: as codings, they amount to a specific kind of text describing the market's materiality. As a text, the algorithm is a definitional device that makes the financial world different each time it 'decides' to fire an order into the market. In describing the trading patterns they follow and making them fit into the market, algorithms get involved in the shaping of markets: not only because they belong to and co-constitute the marketplace, but also because, in so doing, they open and close possibilities to render the market adequate (or inadequate) to the patterns of action they embody.
In this article, we wish to question the reconfiguration of regulatory spaces entailed by the growing use of algorithms. Thus, we intend to shed light on the problem of misalignment between different codings: the coding of practices necessary for the algorithm to replicate a trading pattern on the one hand, and the coding of practices necessary for regulators to secure market functioning on the other. Indeed, the fact that the increasing use of algorithms is modifying the market ecology may well call for a complete renewal of the ways regulation is made when the automation of market practices is at stake. How can we understand the way in which algorithms (software codes coding practices in an IT significant 'textual' device) affect the performance of regulation (codes of conduct coding the accepted practices in markets)? Algorithms embody a controversial space that we have to describe if we are to understand something of what they are and, more importantly, what they produce v.

Regulation, algorithmic innovation and calculation
While regulation is often said to weigh on innovation, recent regulatory changes in Europe have in fact favoured the development of algorithmic trading. Shared calculative spaces between different stakeholders began to emerge, articulated around the tool.

Adopting MiFID, widening algorithmic horizons
Technological innovation in European financial markets has recently been enhanced as a result of the implementation of the Markets in Financial Instruments Directive (MiFID) in November 2007. This text, designed to remedy the shortcomings of the former 1993 Investment Services Directive (ISD), has often been described as a 'sea change' in regulation (Casey and Lanoo, 2009: 26). Now structuring the landscape of European finance, the text hinges on two ideas. First, that competition between execution venues needs to be enhanced, thereby removing the old 'concentration rule' effective in most European countries, which previously made it mandatory to transact financial instruments on centralized markets. Second, that customers need to be adequately protected from the 'natural' dangers they may encounter in markets. Within the context of 21st-century finance, competition materializes in the design and development of sophisticated tools enabling investment firms to compete in markets while fulfilling their clients' instructions. 'Best execution', once a mere principle in the regulatory codex, is now centre stage: investment firms are bound to produce a document (the best execution policy) describing how they will manage to execute their clients' orders to the best of their abilities, according to a mix of criteria such as speed, volume, price, etc. (European Commission, 2004, art. 21.2 sq. and 2006, art. 44 and 46).
On the technological level, this new obligation borne by intermediaries required them to be in a position to route orders to the different venues competing to offer the best liquidity.
Being able to build instant comparisons between venues soon became a major concern at brokerage firms, as these needed to know where they should execute their clients' instructions. 'Smart Order Routers' have therefore been designed and developed to make those decisions within milliseconds, a timescale at which no human being can compete. Algorithmic innovation, resulting from the adoption of the European regulation, thereby began to modify the roles assigned to market participants.
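By way of illustration, the routing logic just described can be sketched in a few lines of Python. The sketch is ours, not any firm's code: the venue names, quotes and single best-price criterion are invented for the example, whereas actual Smart Order Routers weigh many more factors (latency, fees, fill probability) within milliseconds.

```python
"""Minimal sketch of a Smart Order Router: split a buy order across
competing venues, cheapest displayed offers first. All figures invented."""

from dataclasses import dataclass


@dataclass
class Quote:
    venue: str        # hypothetical venue identifier
    ask_price: float  # best offer displayed on that venue
    ask_size: int     # quantity available at that price


def route_buy_order(quantity: int, quotes: list[Quote]) -> list[tuple[str, int]]:
    """Return a routing plan as (venue, quantity) pairs."""
    plan = []
    remaining = quantity
    for q in sorted(quotes, key=lambda q: q.ask_price):  # cheapest first
        if remaining <= 0:
            break
        take = min(remaining, q.ask_size)
        plan.append((q.venue, take))
        remaining -= take
    return plan


# Three venues competing on the same instrument.
quotes = [
    Quote("Venue_A", 10.02, 300),
    Quote("Venue_B", 10.01, 200),
    Quote("Venue_C", 10.03, 500),
]
print(route_buy_order(400, quotes))
# routes 200 shares to Venue_B, then 200 to Venue_A
```

Even this toy version makes the point of the section tangible: the comparison and the decision are mechanical, and their speed is what redefines the broker's role.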
Brokers, once valued for their ability to provide insightful recommendations on financial instruments to their clients, now chiefly need to prove their ability to process orders swiftly in the markets; and this requires the development of algo trading.

'Can you beat the VWAP?' Looking for famous algorithms, and other variants
Before we try to delineate the controversy centring on algorithms, and if we want to open the black boxes they embody, we first need to give a picture of what they are. For non-users, algorithms are visible (but not fully available) on the websites of data vendors, IT consulting firms and brokerage houses. Besides these, professional journals such as Automated Trader or The Trade, written for the buy side, provide detailed information about the different classes of algorithms, what they are intended to do, how to use them, etc. The basic strategies are usually displayed in grids and lists reproduced in marketing brochures, making it easier for the salesperson to sell the product. These brochures contain a series of 'fact sheets' detailing strategies, the set of parameters that are either mandatory or optional for the algorithm to work, and the underlying quantitative model, for a taste of scientific knowledge. Among the classic suite of algorithms that have been in use for several years now, one finds the 'Volume Weighted Average Price', or 'VWAP', algorithm. Here, the IT code attempts to meet the VWAP, which is calculated as the total value of transactions in a given instrument divided by the number of instruments effectively traded over a predefined period of time. It gives an average price that is linked both to time and to the volumes transacted in the market; it therefore serves as a benchmark often used by traders when they want to assess the quality of an execution (has the trade been executed at a better price than the price available on average over the dedicated period?).
The parameters usually available for this specific algorithm are the time period (when will the algorithm begin to 'work' the order, and when will it stop?) and additional price constraints, such as a maximum participation rate (in order to limit market impact) or price limits (levels above or below which the algorithm will stop working). Once the parameters are implemented in the machine, the algorithm 'slices' the initial order into small parts, with reference to historical data specific to each financial instrument, with a view to sending a series of instructions into the order book. The information perused by the VWAP code is the curve describing the intraday volumes available; the trader using the algorithm thus has a reasonable assurance of matching or beating the VWAP by the end of his trade. While resting on a rather old notion (the average price), the VWAP has become a de facto performance metric and is often used to compare traders' abilities to beat the market: to achieve this, the trader will try to anticipate the volumes available on the market, and set the tool so that it participates accordingly.
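To fix ideas, the benchmark and the slicing logic just described can be sketched in Python. This is a deliberately simplified illustration under invented assumptions: the trades and the volume profile are made up, and real VWAP engines rely on instrument-specific historical profiles and adjust their participation dynamically during the day.

```python
"""Minimal sketch of the VWAP benchmark and a volume-curve order slicer.
Figures are invented for illustration only."""


def vwap(trades):
    """Total traded value divided by total quantity traded."""
    total_value = sum(price * qty for price, qty in trades)
    total_qty = sum(qty for _, qty in trades)
    return total_value / total_qty


def slice_order(total_qty, volume_profile):
    """Split an order in proportion to an expected intraday volume curve."""
    profile_sum = sum(volume_profile)
    slices = [round(total_qty * v / profile_sum) for v in volume_profile]
    slices[-1] += total_qty - sum(slices)  # absorb rounding in the last slice
    return slices


# A day's trades in one instrument, as (price, quantity) pairs.
trades = [(10.0, 100), (10.2, 300), (10.1, 100)]
print(vwap(trades))  # ≈ 10.14

# Hypothetical U-shaped intraday volume curve over five periods.
print(slice_order(1000, [30, 15, 10, 15, 30]))  # [300, 150, 100, 150, 300]
```

The benchmark itself is trivial arithmetic; what the trader buys from the broker is the anticipation of the volume curve against which the slicing is set.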
In a way, algorithms are both material and immaterial, and they sometimes possess the strange ability to escape the trader's perception, despite acting in front of him. Although some algorithms act in the open, others, on the contrary, act in the dark; hence the rather scary names they are given: 'Dagger', 'Guerrilla', 'Stealth', 'Shadow' or 'Sniper' are all metaphors describing a specific trading pattern. Because it is not easy, even for specialists, to get a good representation of what algorithms are and what they do, marketing and sales people use such metaphors, which encapsulate the meaning inherent in an algorithmic strategy. With such names in mind, it is far easier to sell the idea that the automat will wait until an opportunity arises before 'firing' an order.
We said that these evasive objects intervene in the market in a fashion that makes their appearance more of a non-event, as they are not always easily noticed. In this respect, some exchanges provide participants with a stealth capacity, and the best example is probably the widespread use of the 'Iceberg' order, the purpose of which is to allow the trader to submit a large-volume order while publicly disclosing only a small portion of it. The algorithm slices the order into several bits and shows only the 'tip' of the iceberg (a small quantity), while the remaining mass is kept secret, waiting under the displayed liquidity. Iceberg orders, when used carefully, are said both to reduce price movement and to smooth the execution phases. Indeed, once the first slice is executed, the algorithm automatically places a second slice, until the bottom of the iceberg has completely melted.
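The Iceberg mechanism can likewise be sketched in a few lines. Again this is our own simplified illustration, with invented quantities: exchange implementations differ in how the hidden remainder is replenished and prioritized in the order book.

```python
"""Minimal sketch of Iceberg order slicing: only the displayed 'tip' is
public; once it is filled, the next slice is shown, until the hidden
mass is exhausted. Quantities are invented for illustration."""


def iceberg_slices(total_qty: int, display_qty: int) -> list[int]:
    """Return the successive visible slices of an iceberg order."""
    slices = []
    remaining = total_qty
    while remaining > 0:
        tip = min(display_qty, remaining)
        slices.append(tip)  # only this quantity is publicly disclosed
        remaining -= tip    # the rest waits 'under' the displayed liquidity
    return slices


# A 10,000-share order showing only 1,500 shares at a time.
print(iceberg_slices(10_000, 1_500))
# [1500, 1500, 1500, 1500, 1500, 1500, 1000]
```

The stealth the exchanges sell is thus nothing more mysterious than a loop over a hidden remainder; its market effect comes from what the other participants cannot see.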
These examples provide a first insight into the algorithmic world. The brochures describing trading patterns, grounded in quantitative models made explicit, account for the representation of algorithmic behaviour, which remains difficult to access.

Building a controversy, opening a space for calculation
With algorithms, we face a genuine financial object: not only because they have a materiality that can be described, and which occupies the minds of traders, quantitative researchers and IT developers, but also because they make those participants act and react. Each of these participants relates to the algorithm according to their situation and their specific function; the scripted code, however, represents a common interest among them. It is in this sense that we think the algorithm stands as a political object, aggregating and fixing the views that contribute to its enactment as a genuine financial device. This manifold concern lies at the heart of the black box we would like to open here, by trying to express the rationales they all account for.

What is an algorithm filled with? Describing the encapsulation of points of views
As a technical object, the algorithm opens a space allowing for the expression of conflicting views. Designers (IT developers) and users (sales, traders and marketing experts) generally try to disseminate the product, either by extolling its qualities and the related value it can add for customers, or by effectively using the tool in the marketplace. On the other hand, control functions (compliance officers, market surveillance and regulators) try to corset this dissemination: not because they do not want to see algorithms in markets, but because they recognize the need for strict monitoring of the algorithm, entailing full compliance with market regulations. From time to time, conflicts relating to the nature of algorithms may arise in markets as a result of script errors or misused instructions, which generate 'funny trades' in the market. These impair the 'natural' formation of prices, thereby crystallizing a state of crisis that gives rise to a confrontation between the views expressed by all the actors attached to and through the algorithm.
The difficulty we encounter here is deeply rooted in the fact that trading algorithms are not easy to represent. They have no unique display and, although they must be located 'somewhere' (usually on a server), they do not seem to have a single physical location, but rather a multi-local expression duplicating the market. If we want to trace the algorithmic character, should we assume it resides in the coding describing a trading pattern and the related actions to take depending on the market's context? Or should we say the algorithm is displayed on the trader's screen, in the window allowing him to set the parameters that shape the algorithm? Or else, is it the trendy image sold to clients, of a technology necessary for them if they are to beat the market? By reflecting on the algorithm's existence, we understand that it is subject to the developing views expressed by those in close association with it; it is these views that we detail hereafter, our focus alternating between people and contents.

A technical product that requires a specific argument (the marketing and sales view)
Among the most visible representations of the algorithm, we find the formulations produced by marketers, who generally take a 'design' approach to it. Marketers need to produce brochures explaining what the algo does, and to frame these 'actions' in groundbreaking slogans. They need to find the right tone and the vocabulary that will speak to market participants (sales, traders and clients). Most of the textual representations they engender are grounded in assumptions drawn from economics and finance theories. Ideas such as liquidity, efficiency and the perfection of markets are often used to extol the product's abilities in advertisements directed at potential customers (institutional firms):

Low latency, high throughput, efficient liquidity management for personalized best execution handling (Millenium advertisement, 2009)

ITG DARK ALGORITHM®. Unique, aggregated access to POSIT®, ATSs & ECNs, total transparency of executions, anti-gaming logic (ITG advertisement, 2008)

Sophisticated algorithms that dynamically adjust to minimize slippage and market impact (JPMorgan advertisement, 2008)

Our sophisticated EMS, SpreadHawk™, allows clients to trade through shrink-wrapped algorithms for single stocks, pairs and portfolios (AlgoTrader advertisement, 2009)

All these discourses, selected from a wide range of similar ones, make it abundantly clear that the machine enhances its users' capacity to trade better: that is to say, to make money in milliseconds, with the ability to avoid market pitfalls, while disclosing neither intentions nor the trading strategies followed. The underlying idea, which serves as a basis for algorithmic design, is that of the clinical precision of a fast machine sophisticated enough to solve problems (Beunza and Stark, 2004).
These representations, initially intended to sell the product, duplicate the mathematical and economic assumptions on which they are built: the informational efficiency hypothesis, transparency, and the necessary smoothing of executions so as not to make 'noise' or waves in the market's liquidity, for instance.
These arguments are generally supported and used by salespeople, who take advantage of the formalization offered by marketing brochures to place the product with their customers. Once the Alternative Execution Services (AES) sales representative has solved connectivity and settlement issues for a client, he may decide to go further and ask the client whether he would like to be offered access to the in-house suite of algorithms. He will subsequently send a short email detailing which strategies could be made available to the prospect:

I would like you to use my algos on Europe as well... VWAP, % of volume, TWAP, In line, Arrival price, Iceberg,... You can place larger orders into them to be worked over a period of time = better execution, less risk, increased productivity. What do you think?
Being convinced that 'the client will get better opportunities than the others' helps in the selling process. In the end, the AES salesperson resorts to similar arguments to sell the code, emphasizing the results it may produce in the market. Technology contributes to the industrialization of profits because it allows both faster trades ('low latency' making it possible to capitalize in milliseconds) and more accurate ones ('less risk, increased productivity').

An object framed in a dedicated language (the IT expert view)
For the IT developer, working hand-in-hand with the quant analyst, algorithms consist in the translation of a mathematical model into a series of scripts, framed in a dedicated language: C++, Dotnet, FIXATDL, Java, etc. Yet they are not just another piece of software useful to their internal clients (the traders); at times, IT developers describe their algorithms as living entities, which develop and reconfigure themselves according to market-shaping events.
Me: 'So, how would you describe the algorithms you write?'

Kyle: 'They are tools, mmmh… something in which we put some intelligence… there is some intelligence in the automat.'

Me: 'What kind of intelligence? What do you mean?'

Kyle: 'Some of them have the ability, when they are living, to use other algorithms, they decide how they should act according to a set of events… we work on this with the quants team… we call them meta algos.'

During the conversation, Kyle explicitly states that his job consists in finding the correct language, the right code allowing the best set-up between different sources of information vi.
He describes his work with metaphors mixing technological and human existence, somehow completing the formulation suggested by Lash (2001: 107): 'making sense of the world through technological systems'. This idea that the code is a being may well be shared by users, as shown by the following email, entitled 'End of life of an algo', sent by a trader to the IT department in charge of the development of algorithms:

When client puts an end-time including the fixing (say, 5:35:00), it's obvious he wants to take part to the fixing. Today, POV [the '% of Volume' algorithm] till 5:35:00, auction @ 5:35:16, the algo was cancelled… we are long 3.5 million. I remember asking the algo being kept alive till the real fixing time.
The living item, behaving erroneously because of an improper input in its set of parameters, left the broker finishing the day with a risk position in his book. The terms employed here depict the proximity between humans and machines, which matches the observations previously made by Beunza and Stark (2004: 396): '[…] The robots, as the traders say, are partly "alive" - they evolve. That is, they mutate as they are maintained, re-tooled, and refitted to changes in the market'. The codes populating the marketplaces therefore do play an active role in the interaction mediating transactions between traders and other market participants. Achieving a kind of autonomy, they need to be 'kept separated to reduce the possibility that their evolution will converge (thereby resulting in a loss of diversity in the room)'.

A (not so?) helpful tool for daily duties (the trader's view)
When he opens his workstation, the trader usually gets access to an 'algo box', which resembles a toolbox (Rosen, 2009): rather than facing a black box, he is presented with different strategies displayed in the form of a list, requiring at least a mental (re)construction before the algorithms are launched into the marketplace. According to his client's instructions and the market's context, the trader will choose, among different algorithms and different parameters, the tool with which he will have the trade executed. This is not always as easy a task as it seems. Not only because, despite being fine-tuned to their needs, algorithms entail a certain path dependency (once the parameters are set in the machine, it is not always possible to change one's mind and revert to the initial situation, as trade slices might already have been sent to the market for execution), but also because traders sometimes recognize that 'it's no fun having the machine play for you'. What is at stake here is the materialization of a shift in the making of markets, towards a greater rigidifying of practices: older traders often criticize this shift, as they have the feeling of entering an age where financial markets resemble an electronic game they can no longer access. And this may not be the reaction of an aging staff unable to adapt to a better way of processing trades: the algorithmic revolution is a real one, according to experts in the negotiation of financial instruments:

There is a lot of anecdotal evidence regarding new behaviours as a result of particular algos. An example is the effect of volume participation algos chasing a volume spike. The piling in of several algos simultaneously to chase a spike can cause temporary supply disruption and cause the price to sharply spike before reverting to the mean. (Rosen, 2009: 104)

Therefore, we see that algorithms do weigh on the practices of traders, either because of the changes they bring to the market or because of what traders think they may produce on the ecology of the order book.
But automating traders' decisions also brings new challenges as regards the framing and control of practices, for at least two reasons. First, because it is far easier for those in charge of regulation to interact with a human being than with a machine (and besides, they do not always know exactly where the machine is). Second, and perhaps more importantly, because the kind of mediation generated by the algorithm, which takes place between the trader and the market, does not reinforce the trader's ability to feel responsible for what is done on his behalf in the market. It is these two points that we will now probe.

Adding the regulatory view, furthering the controversy
To the different views already expressed (marketing, sales, IT and trading), others should indeed be added if we are to understand the controversy generated by algorithms in contemporary financial markets. Once algorithms -which we have defined as software codes coding practices in an IT significant 'textual' device -are sent into the cables leading to market servers, they may come up against a series of resistances resulting from the framing of marketplaces with regulatory texts: the codes of conduct which code accepted practices.
While the front office staff's duty is to develop, promote, sell or use the algorithm (all actions that have previously been classified under the idea of a dissemination of the code performing markets), the view from a regulatory stance seems quite distinct, if not entirely different.
Control functions do indeed work on algorithms, doing their best to monitor the suggested dissemination: if there is a 'rise of the machines', then it has to be framed and monitored.

An existence to be described and disclosed (the market's point of view)
One of the main concerns expressed by market structures is the proliferation of systems plugged into their platforms and weighing on levels of natural liquidity. It is however very interesting to note that some markets are very strict on the use of algorithms, while others do not seem to bother much with them. A broker willing to use algorithms to trade on the Irish Stock Exchange, on Deutsche Börse or on the Swiss market will need to go through a validation process requiring forms to be filled in, detailing with precision what the algorithm is intended for, how it works, the different levels of controls or monitoring systems allowing it to be stopped at will ('panic buttons'), etc. The name of the person responsible for using the algorithm must also be disclosed, so that market surveillance is in a position to identify the trader and the related compliance desk to contact, if the need arises. Other markets, such as Euronext or the London Stock Exchange, do not ask for details about the systems through which transactions are submitted to the market. Rather, their rules and regulations emphasize that investment firms and traders using algorithms are solely accountable for the orders they input or withdraw from the market.
Among the markets which detail a disclosing obligation, some propose their own definition of what the algorithm is, thereby trying to set a formalized representation in accepted categories:

Automatic order entry systems, in particular Quote Machines, Electronic Eyes and Algorithmic Trading Engines as well as combinations thereof, are computer programs of a company for automatic generation of orders and are part of the Participant Trading system. Such orders are generated and transferred into the electronic trading system on basis of order book information and additional parameters determined by the company. (Deutsche Börse, 2009: 31)

Here, it is through the alignment of mandatory conditions that the algo gets access to a recognized status: a definite physical location within the company, together with its registration under the name of a trader, who will be responsible for managing the coded tool.
Thus, the exchange seems to admit that algorithms are producing a part of their own reality when shaping the flows and moves that can be observed in order books, and which influence participants' activities. The definition provided here is intended to allow the market to sanction misbehaviour that would result from an inappropriate use of algorithms -or other hybrid systems.

An object framed with a set of predetermined rules (the compliance officer's point of view)
Between markets and traders, interface functions such as compliance officers make the link between texts (codes of conduct) and market contexts (in which we now know that IT codes play a role). They try to corset the practices where these are made, and to find solutions when the rulebooks displaying the principles of accepted market behaviour do not address the issues faced by operators. Compounding texts and contexts, compliance officers are also in charge of relationships with market surveillance: they process the paperwork accompanying the deployment of algorithms, if any, and manage issues that may arise as a result of the use of the algorithm.
If the compliance officer is in charge of processing the paperwork, he therefore needs an insight into what the algorithm does precisely, and how it duplicates practices and trading patterns, before he can translate and document these into written explanations 'which both the IT Developer and the trader usually do not like at all' (a compliance officer). Providing an account of what exactly the coding's purpose is, or how it will behave in the market, will require gathering explanations from the departments that participated in the development of the algorithm (the Quant who modelled the behaviour, the IT Developer who translated it into a script, and the Head trader who initiated the request). Based on these investigations, the compliance officer will construct an argument and compare it with regulatory requirements (the codes of conduct), which can be either overdeveloped or fairly scarce in their expression of what the exchange allows, or not, to its participants. This part of the description can prove a rather difficult task, as the rules to be followed may not always be as self-explanatory as they should be: 'You see the Swiss… they used to maintain two markets, SWX and Virt-X, for their big caps…' Besides these hermeneutic issues, the compliance officer needs to keep updated on the changes implemented by market structures: for example, the restructuring of the German markets in 2009 required 'heavy paperwork, in order to keep [the formal administrative existence of codes] up-to-date'. This is all part of the compliance officer's work: enacting acceptable practices through dedicated devices (rulebooks, procedures and codes of conduct). However, we see that access to algorithmic behaviour is already a complex construction, requiring discussions, interpretations and translations that blend different points of view, in order to try to make these fit within a dedicated frame, whereas at the same time some markets do not ask for such a formalisation.
Moreover, it is the compliance officer's duty to make sure that the algorithmic device will indeed be fully framed with limits defining a set of impossibilities, so that, when launched into the market, the device does not impair price formation processes or have too much of an impact on the quality of liquidity. This point is a rather difficult one to document, as the relationship between IT specialists and compliance experts often revolves around the different languages - sometimes exclusive of each other - that they resort to. The level of technicity, implied both by the very coded nature of the algorithm and by the complex codification of market behaviours in regulatory texts, does not help reconcile the expression of a common reality.
Discrepancies may therefore arise between those two sets of expressions holding themselves together within the algorithm; our descriptions have shown how fragile the existence of a shared apprehension of market reality is. Gaps between what the algorithm performs and the description that is provided to market supervisors on the one hand, or conflicts between underlying trading intentions and effective trading results in the markets on the other hand, always mix those different views. The algorithmic scene is set for the description of the controversy generated by the meeting of potentially conflicting codings with, on one side, the IT code replicating trading practices, and, at the other end of the spectrum, the codes of conduct intended to frame said practices. Misalignments between these two orders result in trading issues, revealing how modes of regulation get challenged in contemporary markets where algorithms are proliferating. A case from our fieldwork illustrates this: a letter signed by one of the London Stock Exchange market surveillance team members, requiring written explanations about a 'Large Erroneous Order' electronically sent to the market in late October of the same year. The letter, though very courteous, firmly expresses the need for GES to better monitor their order management system and the related filters that should prevent erroneous orders from running into the public order book:

I am writing with regard to an incident that occurred on October 29th, when your firm submitted an extremely large order onto the buy side of the AXT order book during the closing auction. This order was for a size of 1,184,966 shares at a price of 40.4p, with the previous automated trade in the stock being 37.5p. The order remained on the book for a period of 13 seconds and lifted the price significantly, then being deleted moments before the scheduled uncrossing.
After contact with our Market Supervision team it was established that this order was in fact erroneously submitted by an algorithm.

Caught in the controversy: codes and codings in crisis
The compliance officer in charge is then asked to provide details about the reasons that led to these orders being placed in the book, together with an explanation of the underlying strategy pursued by the client. Indeed, the orders had been submitted 'very close to the running of the uncrossing algorithm [the market's algorithm calculating the closing price] and due to the price and size, had a strong impact upon the indicative uncrossing price disseminated by the Exchange'. Finally, an explanation of the controls enforced is also required, along with the measures to be introduced in order to prevent such behaviour in the future.
The issue, as exposed by the LSE market surveillance team, relates to the allegedly uncontrolled activity of an algorithmic trader, who sent instructions to the market before deleting them after a rather 'long' period of time (13 seconds), thereby modifying the natural ecology of the order book and displaying false and misleading information to other market participants during a critical phase of the day (the 'fixing', which serves as a reference, for instance to price managed funds). Whether this had been done purposefully or not was the reason the stock exchange requested an explanation. For the compliance officer, this meant looking once more into those 'bloody algo issues', for the third time that month.

Tackling the issue: the compliance officer's answer to the market
The compliance officer gets up and goes to the Algorithmic Trading Team sitting at a nearby desk. Once the trader responsible for the supervision of the algo has been identified ('ah… yes, it must have messed up… sorry we didn't see this before… it's been a nightmare today with so many trades to monitor that I'm surprised it actually happened so late'), explanations are soon formalized in a letter, allowing the compliance officer to revert to the market:

Further to your letter mentioned above, we are pleased to provide you with the following information. You will find hereafter a chronology of what happened, based on the information gathered from our Algo team:

At 16:34:08: An algorithmic trader enters a buy order for 500,000 AXT shares.
At 16:34:50: Order sent to the algo by the trader, who entered 5,000,000 shares in the automat instead of 500,000 by mistake.
At 16:34:51: The algorithm automatically split the order into 4 orders (3 orders of 1,271,678 shares and 1 order of 1,184,966 shares) with a limit of 41.2p. As the last traded price was 37.5p, the maximum price for the order, calculated by the automat, was 41.2p (10% max deviation filter).
At 16:34:59: The trader realised his mistake and immediately cancelled the order (9 seconds after the sending of the order). As the market was closing, 3 of the 4 orders were cancelled (respectively at 16:34:59:893, 16:34:59:933 and 16:34:59:936), but the last one had been executed for 1,184,966 shares at 40.4p on the close.
After having provided additional details on the controls in place, the compliance officer ends his letter by reiterating GES's commitment to better monitor algorithms in the future.
Internally, he decides to send a strong reminder to the traders and to ask the Head of Execution to review the filters in place in order to prevent other 'fat fingers', knowing that a new algo issue could lead to a reprimand or even a fine (and, in the worst case, to GES's access to the market being withdrawn).
In this case, even though the mistake originates with the trader's 'fat finger', the algorithm plays a decisive role, in that it does not leave the trader any time to cancel or modify the instruction: within a second, the order is managed by the automat, thereby closing the space for the interpretation and correction of the course of action. Such a situation would probably not have occurred in a different setting where the trader would have been working the order himself. Mediating the relation between the trader and the market, the algorithm exemplifies the fixation of (mis)behaviours. The IT coding has been part of a conflict with the rules and regulations of the LSE, which immediately questioned the underlying intention indirectly expressed by such an instruction (in the letter, the market surveillance noted the possibility of an abusive market practice, impacting the prices of the closing auction, one of the strategic moments of the trading day).
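The mechanics at work in this chronology can be sketched in a few lines of code. The sketch below is a hypothetical reconstruction for illustration only: the even slicing rule and the function names are our own assumptions (the actual automat produced unequal slices, following a proprietary logic), but the price filter mirrors the 10% maximum-deviation cap reported in the letter (a last trade of 37.5p capped at 41.2p).

```python
# Hypothetical sketch of an automat's order handling, for illustration only.
# The slicing rule is an assumption (the real automat produced unequal slices);
# the price filter mirrors the 10% cap mentioned in the GES letter.

def split_order(quantity, n_slices=4):
    """Split a parent order into n roughly equal child orders."""
    base = quantity // n_slices
    slices = [base] * n_slices
    slices[-1] += quantity - base * n_slices  # remainder goes on the last slice
    return slices

def max_limit_price(last_trade, max_dev_pct=10):
    """Cap the limit price at +10% of the last trade, truncated to the tick.
    Prices are handled in tenths of a penny (integers) to avoid float noise."""
    return last_trade * (100 + max_dev_pct) // 100

# Replaying the erroneous order of 29 October:
child_orders = split_order(5_000_000)  # the 'fat-fingered' 5,000,000 shares
cap = max_limit_price(375)             # last automated trade: 37.5p = 375 tenths

assert sum(child_orders) == 5_000_000  # nothing lost in the slicing
assert cap == 412                      # i.e. 41.2p, the limit in the letter
```

Note that a filter of this kind constrains price, not size: precisely the gap the compliance officer asks the Head of Execution to close with additional 'fat finger' checks.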

Some consequences on the market's ecology
The case reveals a series of interesting elements as regards the framing of machine-based practices and the kind of consequences they bear on the regulation thereof. The algorithm used by the trader lies at the heart of a controversy involving a market structure (the London Stock Exchange), with its own algorithms calculating the closing auction price (referred to as the 'uncrossing') to be displayed in the order book, together with its market surveillance team and systems, and an intermediary (GES), with its own devices - including servers, wires allowing the processing of orders, embedded filters and screens used by the support teams to monitor the trading flows. The link between the two institutions (the intermediary and the market) and their resulting roles within the market space is embodied by the algorithmic behaviour. At least half a dozen people interact directly in this issue, concatenated with the IT code acting in the marketplace and the codes of conduct ruling over market behaviours.
The empirical material provided here stresses the difficulty arising from the growing use of algorithmic instructions in markets. Framing the practice once it is delegated to the code is not a simple task: it implies constant attention (the Algorithmic Trading Team spends whole days monitoring executions on their screens) and cautiousness (when setting the parameters and 'giving life' to the algo). Despite these efforts, errors do occur and, almost inevitably, they are not easy to reverse: either because the algorithmic machines are too fast (acting within milliseconds), or because they somehow 'fix' the actions according to a frame which is not so flexible.
This may well provide us with an indication as regards our question on the impact borne by algorithmic equipment on the regulation of financial markets. If algorithms are thought of as tools intended to help traders when they want to engage in markets, then the question concerning the ability of such systems to be acted upon becomes a crucial one. It is not only a simple question of what can be done in terms of IT coding; it is rather a question of how compliance officers, market supervisors and market regulators can make users manage their algorithms more efficiently, as these contribute to the shaping of market materiality. The issues faced in day-to-day practices, which we can read through the conversations between clients, intermediaries, and markets, show that there is a need for traders to remain adaptable in ever-changing market contexts, especially in times of crises. Sometimes, this may not be possible once the actions are framed within the algorithm. Traders who were accustomed to searching for a good price for their clients, therefore working the order throughout the day, would now most likely be using a VWAP algo in order to better achieve such a task. In so doing, they would be in a position to provide their client with an average price quoted to four decimal places. But they would also, as a consequence, lose the adaptability that is necessary to manage tricky situations in times of market distress.
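The VWAP benchmark such an algo targets can be made concrete with a short sketch. The fills below are invented for the example; the function simply computes the volume-weighted average price of a day's executions, quoted to four decimal places as mentioned above.

```python
# Minimal illustration of the VWAP (Volume-Weighted Average Price) benchmark.
# The fills are invented for the example.

def vwap(fills):
    """fills: list of (price, quantity) pairs executed over the day."""
    total_value = sum(price * qty for price, qty in fills)
    total_qty = sum(qty for _, qty in fills)
    return round(total_value / total_qty, 4)  # an average price to 4 decimals

fills = [(37.5, 20_000), (37.6, 15_000), (37.4, 25_000)]
assert vwap(fills) == 37.4833
```

The precision of the benchmark is exactly what the passage above points to: a four-decimal average is easy to report to a client, but it says nothing about the algorithm's ability to adapt when the market dislocates mid-day.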

Concluding remarks: making algorithmic codings act 'by the code'?
Crises resulting from the confrontation of codings (algorithms) and codes of conduct (market rules and regulations) allow us to question the recent developments witnessed by marketplaces. Our article has suggested that contemporary financial markets are experiencing an important change with respect to the nature of their regulation. Our purpose here has been to contribute to the identification of such a change, best expressed in the misalignment between codings and codes of conduct. What is at stake with such a misalignment? At least three points, which we can summarize as follows: (1) The rapid dissemination of algorithms questions the ability of regulatory functions to keep a grip on the new spaces enacted by those devices. Ethnographic fieldwork has allowed us to provide an insight into the possible representations of trading algorithms, and to delineate one of the multiple controversies they generate. The change at stake here is the rigidifying of practices through their encapsulation in a coded script intended to increase the efficiency of transactions in financial markets. Fixing states of market reality in pre-determined patterns raises questions as regards the monitoring of market behaviours. The case displayed shows that even simple errors such as 'fat fingers' cannot easily be identified, as the algorithm somehow 'forces' the hand of the trader and contributes to the development of the 'fat finger', while denying his ability to exercise his judgment. At the same time, the machine contributes to the blurring of representations in the marketplace. When time and space are reduced to dimensions in which it is materially impossible to (inter)act, then the regulation of actions may become problematic.
(2) This point should be linked to a second one: if trading algorithms add a strong mediation between traders and markets, they also express a mix of different views that further impedes the attribution of accountability when issues arise. Market regulations generally request that intermediaries appoint someone to supervise the uses of trading algorithms, but such an appointment does not, by itself, settle who is accountable when the code misbehaves. (3) Finally, the conflict between codings (algorithms) and codes of conduct (rules and regulations), once expressed as a construction involving different points of view and keeping these together, emerges as the closing of a space rather than the opening of hermeneutic possibilities. What we mean here is that, even before the action takes place in the market, the space of possibilities for interpretation is closed within the tool. This point only emerges as a derivative from the material presented here, but it certainly raises challenging issues for regulators, especially in situations where the hermeneutics of regulatory principles are of major importance when assessing the nature of a trading practice. In a sense, it is the distinction between means and ends that is at stake here, with an emphasis on the role of technologies causing dislocations in order to readjust (Latour, 2002: 258).
In a nutshell, we could define trading algorithms as devices in which the misalignment between different codings emerges as a serious problem for future regulation. Because algorithms aggregate so many devices and persons, they tend to 'dilute' the practice of trading, while at the same time introducing black-boxed areas within the courses of action, thereby contributing to the blurring of representations and opening the way for potentially abusive behaviours.
When professionals recognize that the 'shift from phone to low-touch trading will not be an effortless transition' (Simon and Morgan, 2007: 10), they mean that practices are grounded in a kind of mediation that still generates defiance. The question is not that of the possibly increased dangerousness of using algorithms, but rather that of the interests to be deployed by institutions and market participants, if they are to describe the detours and reconfigurations they are facing with the advent of the algo revolution. Therefore, regulators should seriously ponder the idea that 'equipment matters: it changes the nature of the economic agent, of economic action, and of markets' (MacKenzie, 2009: 13). To us, it is in this respect that algorithms raise a political issue, best expressed through the blurring of means and ends for the participants, thereby acknowledging the calibration problem identified by Beunza and Stark (2004). If the rise of the machines has obviously begun, the homo algorithmicus is not a grown adult yet.

ii. An order book can be described as a device representing the market for a dedicated financial instrument. Although markets have their own specificities, it usually displays the following information for each side (bid/ask) of the market: the quantity of instruments available, the price, and the number of orders available at each price.
iii. Of the numerous controversies already generated as a reaction to the development of social studies of finance, one can refer to paradigmatic cases such as Mirowski vs Callon (and MacKenzie) on performativity in 2004 or, more recently, Williams vs Beunza on the goals of such studies of financial markets, in 2010 (cf. Beunza, 2010).

iv. One exception to this statement is to be found in Millo (2007).

v. Following Smith's claim that 'to access and grasp a market as a definitional practice […], it is necessary to become immersed within the market as a true participant observer' (Smith, 2007: 34), I have been serving as an Equities Compliance Officer for three years, during which I have been looking at sales, analysts, traders and support functions every day, chatting with them and recording conversations, taking part in the formation of practices in the field. I also introduced artefacts into the research design, such as brochures, emails, and conversation transcripts originating from structured or semi-structured interviews or informal discussions.

vi. See for example Simon and Morgan (2007: 12 sq.): 'Certain algorithms have also been attracting recent attention due to their ability to consider publicly disseminated information that has a tendency to move a stock. Traders like the idea of algorithms that take into consideration breaking news stories or press releases before making a decision. […] Algorithms will continue to play a role similar to the evil villain that comes back to haunt the hero in any theatrical movie. Algorithms are breaking apart orders and making markets more efficient, but at the same time creating an environment where traders are having a significant amount of difficulty in finding their required liquidity.
[…] With the use of smart order routing technology, the industry is on the heels of artificial intelligence-like abilities. More and more, bulge-bracket brokers are offering their strategies to mimic the actions conducted by traders and replace intuition with a machine. The next generation of algorithms will automatically interpret what is moving the market, when to trade and make the most optimal decision'.