SEC Commissioner Stein On Wall Street's Data Explosion

April 15, 2015

On April 14, 2015, Securities and Exchange Commission ("SEC") Commissioner Kara Stein delivered a speech at the SIFMA Operations Conference.  As Commissioner Stein so aptly warns:


For hundreds of years, physical paper documents and human beings dominated our securities operations. Today, data dominates. Digital data is part of every aspect of our markets.  And this new reality is challenging all of us. The proliferation and reliance on data has disrupted our markets - oversight and regulation need to evolve to keep pace.  In this new world, we need new tools. 

Stein's remarks are by no means comprehensive or granular; rather, they present a snapshot of the emerging role of data and rightly suggest that human beings are barely keeping pace (if at all) with the explosion.  As she perfectly captures in her remarks:

This "Flash Crash" should have been a wake-up call to all of us. It demonstrated that our markets had, in some ways, outpaced their keepers.  This was the largest, but not the last flash crash. Other mini flash crashes continue to occur in our markets.

A recent example that demonstrates some of the potential pitfalls of overreliance on technical and algorithmic trading occurred on April Fools' Day this year.  A Tesla press release jokingly announced a new "W" model for a watch.  It was clearly intended as a joke.  However, it was taken all too seriously by computers dutifully executing their algorithms in response to the press release.  The algorithms didn't quite get the joke, trading hundreds of thousands of shares and spiking the stock price within one minute of the issuance of the release.

The BrokeAndBroker.com Blog reprints Commissioner Stein's SIFMA comments in full text below:


Commissioner Kara M. Stein

April 14, 2015

Good morning and thank you, Lisa Dolly, for that kind introduction.  It is a privilege to be with you today. 

Whether it's processing trades fairly, ensuring systems are protected from cyber threats, or making operational risk management a firm priority, you, the operations professionals, have a vitally important role and enormous responsibility.  And it's not just your firm that is counting on you. Other market participants are counting on you, and investors are counting on you.  Each of you contributes to a network that both enables and protects our markets.  

And you are doing this against a backdrop of dramatic changes in our securities markets.  Our markets have always been influenced by change in technology.  But the rate of today's changes is unprecedented. For hundreds of years, physical paper documents and human beings dominated our securities operations. Today, data dominates. Digital data is part of every aspect of our markets.  And this new reality is challenging all of us. The proliferation and reliance on data has disrupted our markets - oversight and regulation need to evolve to keep pace.  In this new world, we need new tools. 

Today, I want to speak about some of the new tools that this explosion in data now demands. Before diving in, I will first provide a bit of historical context to underscore how dramatically our markets have changed. Second, I will outline some of the new opportunities and risks presented by the data revolution.  Finally, I will discuss some new tools that the Commission is employing - or should be employing - to keep pace with the new data-dominated world.

Before I go any further, I need to remind you that the views I'm expressing today are my own, and do not necessarily reflect the views of my fellow Commissioners or the staff of the Commission. 

From the Buttonwood Agreement to High Speed, Electronic Trading

For hundreds of years, securities transactions were dominated by human hands.  Traders stood face to face and looked each other in the eye.  Some of the New Yorkers in the audience may know that the Buttonwood Agreement established the New York Stock Exchange in 1792.[1]  Twenty-four brokers signed the agreement to become the NYSE's first members.  The agreement created a closed club whose members agreed to trade only with each other.  The location of the exchange was originally under a tree - a buttonwood tree - at 68 Wall Street. Face-to-face trading in securities and in-person communication were everything.  It was a clubby exchange where everyone knew each other and deals were done in person and with a handshake.

Over 200 years later, today's securities markets would be unrecognizable to those who signed the Buttonwood Agreement.  Forget about conducting business under a tree - the central nexus of securities trading activity for decades, the trading floor, has also rapidly become a relic of the past. The close-knit atmosphere of the Buttonwood days - which persisted to some extent on trading floors - is gone. Human dealers and specialists have largely been phased out.  Today, nearly all trades occur on electronic venues, with more than one third of orders executed off-exchange.

The early 1970s brought fundamental change to the actual process of transacting in securities.  A confluence of regulatory changes, technological advancements, and changes in communication drove the market to automate securities transactions.[2]  As a result, our securities market is, for better or for worse, less human.  Gone are the trader's gestures and shouts. Artificial intelligence is replacing human intelligence.  Human considerations are being replaced with mathematical models and algorithms.

And, the need for speed is unquenchable.  Wires are being replaced with fiber optics, microwaves, and laser beams.  The human blink of an eye is too slow for today's market.  Trading volume is scattered among venues, with no one exchange having an overall market share of twenty percent.  Increasingly, more and more volume is executed off-exchange.  Computer "matching engines" match electronic limit orders with electronic market orders.  High-speed trading dominates, representing over 55% of U.S. equity market volume and 40% of European market volume.[3]  Liquidity provision has largely shifted from traditional market-makers to computerized systems that trade at light speed and across different exchanges and securities.[4]
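A deliberately simplified sketch of the price-time priority matching such engines perform appears below: an incoming market order is filled against the best-priced resting limit orders, with earlier orders at the same price filled first. The Python sketch is illustrative only and is not any exchange's actual implementation.

import heapq
from dataclasses import dataclass, field
from itertools import count

_seq = count()  # arrival sequence, used as the time-priority tiebreaker

@dataclass
class LimitOrder:
    side: str      # "buy" or "sell"
    price: float
    qty: int
    seq: int = field(default_factory=lambda: next(_seq))

class MatchingEngine:
    def __init__(self):
        self.bids = []  # max-heap by price (price negated), then arrival order
        self.asks = []  # min-heap by price, then arrival order

    def add_limit(self, order):
        book = self.bids if order.side == "buy" else self.asks
        key = -order.price if order.side == "buy" else order.price
        heapq.heappush(book, (key, order.seq, order))

    def market_order(self, side, qty):
        # A market buy lifts the best asks; a market sell hits the best bids.
        book = self.asks if side == "buy" else self.bids
        fills = []
        while qty > 0 and book:
            _, _, resting = book[0]
            traded = min(qty, resting.qty)
            fills.append((resting.price, traded))
            resting.qty -= traded
            qty -= traded
            if resting.qty == 0:
                heapq.heappop(book)
        return fills  # any unfilled remainder is simply dropped in this sketch

engine = MatchingEngine()
engine.add_limit(LimitOrder("sell", 10.01, 300))
engine.add_limit(LimitOrder("sell", 10.00, 200))
print(engine.market_order("buy", 400))  # [(10.0, 200), (10.01, 200)]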

In addition, computers, not research analysts, cull through vast quantities of data to pick stocks. Market data, news reports, Twitter feeds, weather reports, and other data sets are scooped up by computers and used to devise trading models and predict prices.[5]

The New Market Structure Brings Advantages and New Concerns

Not only would the new market be unrecognizable to the signers of the Buttonwood Agreement - it barely resembles the market from just a few decades ago.  Buyers and sellers still come together, but in new ways. The volume and complexity of today's securities markets are unprecedented. Today, orders bounce from one place to another in a seemingly endless search for potential counterparties.

For example, between 2005 and 2015, the average number of daily Order Audit Trail System (OATS) reports for Nasdaq-listed and OTC-quoted securities increased over 700 percent, from approximately 107 million in 2005 to 868 million in 2015.  In addition, the average daily number of all OATS reports (all NMS stocks and OTC securities) more than doubled over the last four years, increasing from 1.487 billion in the fourth quarter of 2011 to 3.151 billion in the first quarter of 2015.[6]
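Those growth figures are easy to verify; the short Python calculation below simply restates the arithmetic behind "over 700 percent" and "more than doubled" using the report counts cited in the speech.

# Quick check of the OATS growth figures cited above (average daily
# report counts, stated in millions).
nasdaq_otc_2005, nasdaq_otc_2015 = 107, 868    # millions
all_2011q4, all_2015q1 = 1487, 3151            # millions (1.487B and 3.151B)

pct_increase = (nasdaq_otc_2015 - nasdaq_otc_2005) / nasdaq_otc_2005 * 100
growth_multiple = all_2015q1 / all_2011q4

print(f"2005 to 2015 increase: {pct_increase:.0f}%")           # ~711%, i.e. "over 700 percent"
print(f"Q4 2011 to Q1 2015 multiple: {growth_multiple:.2f}x")  # ~2.12x, i.e. "more than doubled"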

As many of you know, the OATS figures are a pretty good proxy for the increasing complexity of order routing behavior over time.  Today, so much more routing happens before an order ever gets to an exchange.  This rapid transformation has outpaced nearly all of us.

The infrastructure of our market is now mainly behind the scenes - computer terminals, software, routers, and server farms. Traders provide the same service as before - but in a new way.  Our markets are faster, more efficient, less expensive, and provide more choice to investors. Execution, bid-ask spreads, and the cost of transactions have improved.[7]    

However, new concerns and issues have arisen as technology and communications have converged to form a new market structure driven by data.    In 2010, with spectacular effect, the markets demonstrated to the world how interconnected, complex, fragile, and fast they could be.   On May 6, 2010, people were already jittery about the debt crisis affecting Europe. Then, an order from one market participant automatically pumped sell orders of E-mini futures into the market. This seemingly simple and isolated action set off a cascade of events that shook the markets.[8] Computers dutifully executed their code.   

This "Flash Crash" should have been a wake-up call to all of us. It demonstrated that our markets had, in some ways, outpaced their keepers.  This was the largest, but not the last flash crash. Other mini flash crashes continue to occur in our markets.

A recent example that demonstrates some of the potential pitfalls of overreliance on technical and algorithmic trading occurred on April Fools' Day this year.  A Tesla press release jokingly announced a new "W" model for a watch.  It was clearly intended as a joke.  However, it was taken all too seriously by computers dutifully executing their algorithms in response to the press release.  The algorithms didn't quite get the joke, trading hundreds of thousands of shares and spiking the stock price within one minute of the issuance of the release.[9]

The New Dominance, Importance, and Impact of Data

Flash crashes, disruptions, outages, and artificial intelligence failures continue to underscore the complex and interconnected nature of our new marketplace.  Data and technology present tremendous opportunities and benefits - but they have also opened the door to new and exceedingly complicated risks.   In today's market, how do we ensure the market is fair, efficient, and promotes capital formation?  How do we promote innovation and the use of technology and data analytics while understanding that there are limitations and risks?

These are some of the most important questions that the Commission - and the securities markets - face in the coming years.  We need new tools to make sense of this new environment that is so tied to computers and digital data.  I would like to spend the remainder of my talk describing how the Commission is reacting to this new environment and making a few suggestions on how to proceed.

Legal Entity Identifiers

First, I would like to talk about Legal Entity Identifiers or LEIs.  The 2008 financial crisis demonstrated the opacity and lack of understanding about the linkages between market participants.  Parties to financial transactions were unable to appropriately assess risks, particularly as events unfolded.  That opacity led to nearly catastrophic results that cascaded throughout our financial system.  

We saw with the collapse of Lehman Brothers in 2008 that both regulators and private sector managers were unable to quickly assess exposure to Lehman.  They were also unable to determine the degree of interconnectedness between global financial market players.  If participants are going to be interconnected through risky, complex transactions and products that cross jurisdictions, regulators and firms need tools to monitor and understand what is going on.  Both need a window into the highly complex linkages that tie firms together.

As many of you know, I am a strong supporter of the global system of Legal Entity Identifiers - or LEIs.  This system has been developed to assist both regulators and market participants in obtaining reliable information in an increasingly connected financial ecosystem. I want to thank SIFMA for its leadership on developing and advocating for the Global LEI System.  SIFMA and others have been working for some time to identify and develop a public-private solution.

The LEI is a unique, 20-character alphanumeric code, much like a grocery product code, that will make it possible to identify all the legal entities involved in financial transactions.  I'm pleased with the progress that we are making.  We are on the way to a system that will provide great benefits to both regulators and market participants at relatively minimal cost.
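A minimal Python sketch of how such a code can be checked follows, assuming the ISO 17442 layout: 20 alphanumeric characters whose final two are check digits computed under the ISO 7064 MOD 97-10 scheme (the same scheme used for IBANs). The 18-character prefix used in the example is hypothetical, not an issued identifier.

# Minimal sketch of LEI check-digit generation and validation, assuming
# the ISO 17442 layout and the ISO 7064 MOD 97-10 scheme. The sample
# prefix below is hypothetical, not an issued LEI.

def _to_int(s):
    # Map 0-9 to themselves and A-Z to 10-35, then read the result as one integer.
    return int("".join(str(int(c, 36)) for c in s.upper()))

def lei_check_digits(base18):
    # Append "00", take the remainder mod 97, subtract from 98, zero-pad to two digits.
    return f"{98 - _to_int(base18 + '00') % 97:02d}"

def is_valid_lei(lei):
    # A well-formed LEI is 20 alphanumeric characters whose converted value mod 97 equals 1.
    return len(lei) == 20 and lei.isalnum() and _to_int(lei) % 97 == 1

base = "999900HYPOTHETIC01"        # hypothetical 18-character base
lei = base + lei_check_digits(base)
print(lei, is_valid_lei(lei))      # prints the full 20-character code and True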

Over 350,000 LEIs have been issued so far, and many, many more are on the way.  This new system is a basic building block in understanding the risks and interconnections both within and between financial firms. A global LEI standard means that financial market data will be more consistent and usable.  Relationships and connections will become more transparent.  Data about entity relationships will ultimately show networks of control, ownership, liability, and risks. The LEI will help counterparties to financial transactions use the data for better risk management.  It also assists companies with their internal management of operational risks.

The LEI may reduce costs in collecting, cleaning, and aggregating data, and in reporting data to regulators.  The LEI also will aid regulators seeking to better monitor and analyze threats to financial stability.

Recently, the Commission released rules that mandate the use of the LEI in connection with security-based swap transactions.[10]  As some companies may have hundreds or thousands of subsidiaries or affiliates operating around the world, more benefits lie ahead as the LEI becomes more widely used.  Greater usage will allow more transparency regarding hierarchies and relationship mapping.  This will support better analysis of risks as they aggregate and potentially become systemic.

I hope the Commission will pursue additional areas for deploying the LEI.  For example, I think that including the LEI in the table of subsidiaries filed as an exhibit to a company's annual report may provide additional transparency about hierarchies and relationships to the marketplace.

Data Needs To Be Protected

While computers have helped markets become faster and more efficient, they also have allowed for a myriad of new interconnections and potential vulnerabilities. In the fall, we learned that the failure to upgrade one server at a large U.S. financial firm resulted in the exposure of personal information from 76 million households and seven million small business accounts.[11]  Luckily, there was no theft of money or highly confidential information. We all should learn from this and other breaches.  Attackers don't follow a playbook, and that means a security checklist won't work either.

Data is an asset, and the linkages in our information systems architecture provide avenues for people to access it.  More importantly, breaches can affect investor confidence. It is vital that everyone works together to develop best practices in detecting and dealing with data breaches.  Boards, senior management, and line employees should constantly be asking questions about how to prevent and detect such breaches.  I also believe that knowledge sharing is critical to creating resiliency.  And this is an area where everyone needs to participate and contribute.

This past November, the Commission adopted Regulation Systems Compliance and Integrity (Reg SCI).  Reg SCI requires some market participants to establish written policies and procedures to ensure that their systems have the capacity, integrity, resiliency, availability, and security to maintain their operational capability. However, the rule only addressed certain market centers. There is no question that I would have gone further in this rule.  

Shouldn't everyone with direct access to the trading centers have to implement basic policies and procedures to ensure that their computer systems are stable, secure, and contribute to resiliency in our market? Just as we license drivers and register and inspect vehicles for safety and soundness on our nation's roadways, shouldn't there be minimum requirements or standards for anyone with direct electronic access to the equity markets?

Market integrity is everyone's responsibility - not just the responsibility of the largest and most sophisticated.  Everyone has a role in ensuring that our US securities market is resilient.  We are all responsible.  All participants should have some basic controls in place.  Despite our complex market structure, we cannot have a fragmented regulatory approach to our diverse and complex marketplace.

A Deeper Understanding of Market Data and the Consolidated Audit Trail (CAT)

As I mentioned earlier, the speed and interconnectedness in our markets provide a number of benefits.  However, we also know that when software or computers fail, they can cause significant disruptions.  The Flash Crash gave us many lessons. Heavily traded securities are not immune to crashes driven by algorithms and computers.  The interconnections between the equities market and the futures market are such that safety features need to be coordinated.  In my mind, though, perhaps the most important lesson learned is the need for a consolidated audit trail, or CAT.

I was pleased to see that you will have a panel discussing the CAT tomorrow.  With a CAT, market oversight will improve as transparency of transactions across their entire lifecycle improves, including linkages to beneficial owners.  Market data will increasingly serve as the source of rulemaking.  It can also be used to better understand potential trends or abuses.

CAT is one of the largest data projects that has ever been undertaken, and it represents a paradigm shift in how we oversee the U.S. securities markets.  CAT will be the world's largest repository of data from securities transactions.  It is estimated that CAT will receive over 58 billion records each day, covering all market participants across numerous asset classes.

I have consistently advocated for the CAT to be implemented as soon as possible.  It is hard to think of an initiative more important to the Commission and our markets.  Unfortunately, development of the CAT has been bogged down by administrative hurdles.  Development has yet to begin and implementation is still years away.  We need the CAT as soon as possible.

The Flash Crash and other events in our markets demonstrate the need for CAT. Only through a consolidated audit trail can we truly know what is happening in our marketplace, with trading activity cascading across multiple trading venues and asset classes.  The linkages, complexity, and fragmentation of our markets outstrip the current ability to monitor, analyze, and interpret market events.  Only through CAT can we develop regulations that are truly driven by facts.  Only through CAT can regulators appropriately survey our high-speed and high-volume marketplace.

The importance of CAT to our nation's securities marketplace cannot be overemphasized.  This vital initiative is our main tool to be more proactive and informed in our approach to regulation.

Office of Data Strategy

As you can imagine, in addition to the CAT, the Commission is taking on more and more data projects, such as the Swap Data Repositories.  The Commission also now has access to new, rich data sets through filings on Form PF and Form N-MFP that need to be analyzed. 

Over the last five to ten years, data management and analysis have become more complex and require a strategic approach.  I believe that the Commission should form an Office of Data Strategy overseen by a Chief Data Officer.[12]  This new office would ensure a comprehensive approach to data collection, business analysis, data governance, and data standards.

Other regulators have proactively moved forward on forming similar offices.  I believe that the Commission needs to act now to develop a group solely focused on data, including building an infrastructure to facilitate the use of data throughout the agency. 

One of the most important focuses of this new office would be promoting data standards and taxonomies.  Data standards and taxonomies play a vital role in both the quality and utility of data. It is critical that we approach data standards as a community and not in isolation - an Office of Data Strategy should lead this effort. 

A key role of this office should be identifying data gaps and refining existing data collections.  This should be an evergreen process whereby the Commission - through the Office of Data Strategy - is constantly seeking to improve its data quality and fill gaps.  We should be testing our forms and data sets continuously and searching for better ways to obtain clear, usable data.  As we analyze data and receive feedback from market participants, we can tweak and refine how we collect and ask for data to produce better, more reliable results.

It is also important for the Commission to think globally about standards and data.  For example, the International Organization of Securities Commissions (IOSCO), in conjunction with the Committee on Payment and Settlement Systems (CPSS), has been working to develop a framework for derivatives data reporting and aggregation requirements.

Again, as I've stressed throughout my remarks, it is critical that we work together as a community, rather than independently. The Commission has limited resources and needs to optimize its use of data.  This includes collaborating with other regulators and market participants whenever possible.

Everyone benefits from regulators making informed policy choices driven by the best data possible. 

Conclusion

So, to wrap up my remarks today, our securities market has obviously evolved and changed from the face-to-face trading of the Buttonwood era to lightning-fast automated systems and algorithms. Given this new reality, we all must work together as a community to improve risk management, regulation, and market resiliency.

We need new tools, and we need to think about whether our regulatory framework is constructed in a way that can handle the new and evolving structure of our markets.  But, the new tools cannot replace the responsibility of all market participants to contribute to a resilient market structure. We all need to do more to ensure that our markets continue to be fair, orderly, and promote capital formation while protecting investors. 

As part of this collaboration, I hope that you will all consider commenting and providing feedback on Commission rulemaking initiatives, especially in areas relating to market structure and data. Your input is much appreciated and can help inform our policy decisions.  

Thank you for the opportunity to be with you today.  I hope that you enjoy the remainder of your conference.


[1] On May 17, 1792, twenty-four brokers subscribed to the original brokers' agreement, the Buttonwood Agreement, named for their meetings under the buttonwood tree.  This was the first organized stock market in New York.  In 1817, the name "New York Stock & Exchange Board" was adopted, as was a constitution.  The New York Stock and Exchange Board was shortened to the New York Stock Exchange in 1863.  See New York Stock Exchange: Market Data, Facts and Figures, available at http://www.nyxdata.com/Data-Products/Facts-and-Figures.

[2] In the late 1960s, the sheer volume of paper was overwhelming brokerages, and a "paperwork crisis" actually forced the closure of trading to allow back-offices to catch up.  During this period of time, a brokerage firm used approximately 33 different documents to execute and record a single securities transaction.  Clerical personnel at firms were working day and night to manually process transactions.  By 1971, the microchip was being adopted into general-purpose computers. The new computers and the paperwork crisis led to computers beginning to reshape the entire securities market structure.  In 1975, Congress mandated the creation of a national market system (NMS) for securities transactions.  Specifically, Congress found that "[n]ew data processing and communications techniques create the opportunity for more efficient and effective market operations." Congress emphasized that the processing systems for collecting and distributing consolidated market data would be central features of the national market system. See Larry E Bergmann, The U.S. view of the role of regulation in market efficiency, Remarks at the International Securities Settlement Conference (Feb. 10, 2004), available at  https://www.sec.gov/news/speech/spch021004leb.htm; Paul Ceruzzi, A History of Modern Computing, MIT Press, 1998,  at 222;  Section 11A(a)(1)(C)(i) of the Exchange Act, 15 U.S.C. 78k-1(a)(1)(C)(i); and  H.R. Rep. No. 94-229, 94th Cong., 1st Sess. 93 (1975). 

[3] Austin Gerig, High-Frequency Trading Synchronizes Prices in Financial Markets, Division of Economic and Risk Analysis, U.S. Sec. & Exch. Comm'n, Working Paper (Jan. 2015), available at http://www.sec.gov/dera/staff-papers/working-papers/dera-wp-hft-synchronizes.pdf.

[4] Austin Gerig & David Michayluk, Automated Liquidity Provision, Division of Economic and Risk Analysis, U.S. Sec. & Exch. Comm'n, Working Paper (Dec. 2014), available at http://www.sec.gov/dera/staff-papers/working-papers/dera-wp-automated-liquidity-provision.pdf.

[5] Bradley Hope, How Computers Trawl a Sea of Data for Stock Picks, Wall St. J. (Apr. 2, 2015), available at http://www.wsj.com/articles/how-computers-trawl-a-sea-of-data-for-stock-picks-1427941801.

[6] The Financial Industry Regulatory Authority (FINRA) has established the Order Audit Trail System (OATS), as an integrated audit trail of order, quote, and trade information for all NMS stocks and OTC equity securities.  Pursuant to rules approved by the Commission in March 1998, FINRA member firms are required to develop a means for electronically capturing and reporting to OATS specific data elements related to the handling or execution of orders, including recording all times of these events in hours, minutes, and seconds, and to synchronize their business clocks.

[7] See, e.g., Angel, James J., Lawrence E. Harris & Chester S. Spatt, "Equity Trading in the 21st Century: An Update," at 23-24 (June 21, 2013).

[8] See Findings Regarding the Market Events of May 6, 2010, Report of the Staffs of the CFTC and the SEC to the Joint Advisory Committee on Emerging Regulatory Issues (Sep. 30, 2010).

[9] Matt Levine, Tesla Stockholders Can't Take a Joke, Bloomberg View (Apr. 2, 2015), available at http://www.bloombergview.com/articles/2015-04-02/tesla-stockholders-can-t-take-a-joke.

[10] See Regulation SBSR-Reporting and Dissemination of Security-Based Swap Information, SEC Release No. 34-74244, available at https://www.sec.gov/rules/final/2015/34-74244.pdf.

[11] Matthew Goldstein, Nicole Perlroth & Michael Corkery, Neglected Server Provided Entry for JPMorgan Hackers, New York Times DealBook (Dec. 22, 2014), available at http://dealbook.nytimes.com/2014/12/22/entry-point-of-jpmorgan-data-breach-is-identified/; see also JPMorgan Chase & Co., Current Report on Form 8-K (Oct. 2, 2014), available at http://www.sec.gov/Archives/edgar/data/19617/000119312514362173/d799478d8k.htm.

[12] The Report on the Implementation of SEC Organizational Reform Recommendations as required by Section 967 of the Dodd-Frank Wall Street Reform and Consumer Protection Act stated that the SEC was instituting "the role and function of the Chief Data Officer (CDO) at the SEC."  See U.S. Sec. & Exch. Comm'n, Report On the Implementation of SEC Organizational Reform Recommendations, at 15 (Sep. 9, 2011), available at http://www.sec.gov/news/studies/2011/secorgreformreport-df967.pdf.  However, the initial start-up faltered.