Ted Kaufman - United States Senator for Delaware

Kaufman Asks Commission to Act Now on High Frequency Trading

In letter to SEC Chairman Schapiro, Senator outlines dangers of manipulative algorithms and systemic risk

November 20, 2009

WASHINGTON, DC – Senator Ted Kaufman (D-DE) today urged Securities and Exchange Commission (SEC) Chairman Mary Schapiro to take immediate steps to combat manipulative high frequency trading algorithms and end so-called “sponsored access,” which Kaufman and many market experts believe exposes the U.S. capital markets to systemic risk.  Sponsored access permits traders – many of which are unregulated hedge funds – to trade directly on an exchange, circumventing risk checks that apply to broker-dealer trades.
“While it is clear that high frequency trading brings certain benefits to our markets, it is also clear that manipulative high frequency algorithms are almost certainly operating today and that sponsored access creates a systemic risk today,” Kaufman wrote in a Nov. 20 letter delivered to Chairman Schapiro and the SEC’s four other commissioners.

Kaufman has been urging the SEC to look more closely at market structure issues since he wrote the Commission on Aug. 21 calling for a comprehensive “ground up” review before piecemeal changes to the existing structure further increase concerns about unfairness or systemic risk.  Last month he was the lead witness in a Senate Banking subcommittee hearing on a host of market structure issues, including high frequency trading, which uses ultra-fast computers, co-located servers and proprietary data feeds direct from market centers to make thousands of trades a second.
“Given that the Commission under current procedures is now blind to high frequency operations, the need for immediate action should not wait until the Commission has completed its comprehensive review,” Kaufman wrote.

An attachment to Kaufman’s letter to Schapiro spells out in detail his thoughts on three areas where further Commission action is needed for market fairness: consolidated surveillance authority, dissemination of market data and modernizing best execution reporting standards. “Fairness,” Kaufman writes in his attachment, “may be an elusive and evolving concept in the securities market, but it must be defined and then vigorously defended by regulators.”

“[G]iven the current lack of effective market surveillance,” Kaufman writes, “we simply cannot permit high frequency practices to continue unchecked without the ability of regulators to observe and stop manipulation or to avert systemic risks.”
Please see Kaufman’s Nov. 20 letter to Schapiro and its accompanying attachment below:

November 20, 2009

The Honorable Mary L. Schapiro


U.S. Securities and Exchange Commission

100 F Street, N.E.

Washington, DC 20549-1090

             Re: Uncovering Possible High Frequency Market Manipulation and Systemic Risk

 Dear Chairman Schapiro:

I am writing to ask you to provide the Commission’s proposed timetable for addressing concerns about market structure issues.  I am particularly interested in the Commission’s plans to address the serious issues raised by manipulative algorithms that may be employed by certain high frequency traders, as well as the systemic risk posed by high frequency traders using “sponsored access.”  These risks are particularly important given the current lack of effective market surveillance, a topic I have expanded upon in an attachment with some comments for the Commission’s consideration.  In short, we simply cannot permit high frequency practices to continue unchecked without the ability of regulators to observe and stop manipulation or to avert systemic risks.

I am pleased that the Commission has begun a comprehensive review of high frequency trading, dark pools, co-location of servers at the exchanges and other market structure issues.  I appreciate that this review should be a deliberative process and may well take some time.  But transparency, disclosure and risk-compliance requirements on the trading activities of high frequency traders are needed urgently.  And while I was encouraged to hear that the Commission may move sooner with its existing authority to require “tagging” and reporting by “large traders” now using high-frequency algorithms, I am concerned that the Commission does not intend to issue a concept release on high frequency trading until early next year, and that rule proposals should not be expected before the summer of 2010.  Given that the Commission under current procedures is now blind to high frequency operations, the need for immediate action should not wait until the Commission has completed its comprehensive review.

As you know, the Senate Banking Subcommittee for Securities, Insurance, and Investment recently held a hearing on a wide range of important market structure issues.  At the hearing, we learned from Mr. James Brigagliano, Co-Acting Director of the Division of Trading and Markets, that the Commission intends to take a “deeper dive” into high frequency trading issues due to concerns that some high frequency programs may enable front-running and manipulation.   Mr. Brigagliano’s testimony was troubling:

. . . if there are traders taking positions and then generating momentum through high frequency trading that would benefit those positions, that could be manipulation, which would concern us.  If there was momentum trading designed – or that actually exacerbated intra-day volatility – that might concern us because it could cause investors to get a worse price.  And the other item I mentioned was if there were liquidity detection strategies that enabled high frequency traders to front-run pension funds and mutual funds that would also concern us.

Rick Ketchum, Chairman and CEO of the Financial Industry Regulatory Authority (FINRA), in a speech the day before the hearing, and Senator Chuck Schumer, at the hearing, also established that current regulatory market surveillance is inadequate.   None of the seven industry representatives at the hearing disagreed.  Again, I have elaborated on these concerns in the attachment.

Reinforcing the case for quick action, several panelists acknowledged that dark pools often exclude market participants in order to screen out possible high frequency manipulators.  For example, Robert Gasser, President and CEO of the Investment Technology Group, asserted that surveillance is a “big challenge” and improving market surveillance must be a regulatory priority: “I can tell you that there are some frictional trades going on out there that clearly look as if they are testing the boundaries of liquidity provision versus market manipulation.”   When asked, however, none of the panelists felt a responsibility to report any of their suspicions of manipulative activity to the Commission.  That is for the regulators to monitor and to stop, they apparently believe.

There are additional reasons to believe high frequency manipulation is presently occurring, including presentations to fund managers by broker-dealers (copies of which I have sent to the Commission previously) on how to employ their algorithms in executing orders to avoid the manipulative algorithms of others.  In addition, at the end of the recent hearing, Chairman Jack Reed asked about the arrest of a former Goldman Sachs employee who had allegedly stolen code from Goldman used for their high frequency trading programs.  According to Chairman Reed, the federal prosecutor argued that the judge should set a high bail because he had been told that with this software, a danger exists that someone who knows how to use it could manipulate the markets in unfair ways. 

Finally, a number of witnesses testified that so-called “sponsored access” – when certain broker-dealers permit unregulated entities to engage in high frequency trading directly at varying market centers without any associated risk checks – is indeed a systemic risk.  As Mr. Brigagliano noted, sponsored access is a “front-burner” issue at the Commission. 

News reports confirm that high-frequency trading is expanding rapidly, with new players entering this arena almost on a daily basis.  I am very concerned that, given the amount of money pouring into high frequency trading and the intense competition for profits, traders are increasingly tempted to leverage their positions higher and higher.  Moreover, because they are currently able to circumvent any meaningful regulatory checks, due to sponsored access and the lack of effective surveillance, the current situation is a prescription for disaster.

While it is clear that high frequency trading brings certain benefits to our markets, it is also clear that manipulative high frequency algorithms are almost certainly operating today and that sponsored access creates a systemic risk today.  That is why the Commission must not let months go by without taking meaningful action. 

Accordingly, please let me know what specific actions the Commission can and should be taking right away to discover and stop market manipulation and the spread of systemic risk.


                                                            Edward E. Kaufman

                                                            United States Senator

cc:            The Honorable Kathleen L. Casey

            The Honorable Elisse B. Walter

            The Honorable Luis A. Aguilar

            The Honorable Troy A. Paredes


November 20, 2009

To:            SEC Chairman Mary L. Schapiro

From:            U.S. Senator Edward E. Kaufman

Re:              Consolidating Surveillance Oversight, Dissemination of Market Data

and Modernizing Best Execution Reporting Standards


As I have written to you in the past, “fairness” may be an elusive and evolving concept in the securities market, but it must be defined and then vigorously defended by the regulators.  Accordingly, I am writing to you about the need for (1) a consolidated surveillance authority; (2) a review of the current rules governing the dissemination of market data; and (3) modernization of execution benchmarks and enforcement.  All three are essential to ensure execution fairness. 

In my August 21 letter to you, I raised concerns that the current fragmentation in the market, coupled with high frequency trading that thrives on small price differentials between the trading venues, has outpaced regulatory oversight.  Specifically, I questioned whether Rule 605 reporting – which regulators, market participants and investors use to measure execution quality – needs to be improved.   I also questioned if the national best bid and offer (NBBO) truly reflects “the quotes consolidated from the various venues at current execution speeds.”  Later in the letter, I suggested “Order Audit Trail System” (OATS) reporting should be expanded to NYSE-listed stocks and all market centers, including dark pools, so that regulators can better track the order execution process. 

Consolidated Surveillance Authority

As you know, U.S. Senator Chuck Schumer and Rick Ketchum, Chairman and CEO of the Financial Industry Regulatory Authority (FINRA), among others, have made clear the need to consolidate surveillance oversight in the U.S. equity markets.

Chairman Ketchum recently proposed that, with respect to equity trading, “all the data needs to be consolidated, with a single set of eyes looking at the market holistically ….  A single regulator that can bring the best technology, the best people, and a unified set of rules needs to be empowered.”   The next day, at a hearing before the Senate Banking Subcommittee on Securities, Insurance, and Investment, Senator Schumer stated: “I propose to the SEC that market surveillance should be consolidated across all trading venues to eliminate the information gaps and coordination problems that make surveillance across all the markets virtually impossible today.”  Several of the panelists endorsed Senator Schumer’s proposal.    

This is necessary, as Chairman Ketchum asserted, because “there are impediments to regulatory effectiveness that are not terribly well understood and potentially damaging to the integrity of the markets ….  The decline of the primary market concept, where there was a single price discovery market whose on-site regulator saw 90-plus percent of the trading activity, has obviously become a reality.  In its place are now two or three or maybe four regulators, all looking at an incomplete picture of the market and knowing full well that this fractured approach does not work.”  (emphasis added)

            I too hope the Commission will move to establish a consolidated surveillance authority, whether at FINRA or elsewhere.  I applaud Chairman Ketchum for his initiative and am pleased he understands that regulators must keep pace with market developments in order to preserve the integrity of the markets.  I also applaud Senator Schumer’s leadership on this issue. 

            Just as importantly, I believe the Commission should empower the market by making more data available.  As you know, the NASDAQ scandal of the mid-1990s was broken by academics who painstakingly built a database from the industry’s real-time data.  Even making subsets of order and OATS data publicly available, on a delayed basis and with identities masked, would be a useful supplement to a surveillance authority as well as an additional deterrent.

Market Data Dissemination

In the months following my August 21 letter, regulators and members of the industry have expressed similar concerns about the staleness of the NBBO as compared to proprietary data feeds.  For example, while testifying at the recent hearing, Direct Edge CEO William O’Brien asserted: “[T]he consolidated quote, the national best bid, best offer, is very slow and totally non-comprehensive related to the proprietary data feeds that some exchanges are selling to high frequency traders.”

The notion that some high frequency trading firms have an information advantage over individual investors because they can co-locate servers at the exchanges and have sophisticated telecommunications equipment to access proprietary data feeds is itself an important regulatory issue.   Before speed even becomes an issue, some traders have access to data feeds about prices that – when coupled with speed and high frequency algorithms – permit them to see price movements before others.  This is an important area for your review.

What is worse, I have received numerous reports of “memory sharing.”  This is co-location on steroids.  With memory sharing, a market participant actually shares a market center’s server.  As reported to me, a market center cannot offer this opportunity to more than a handful of market participants without adversely affecting server performance.  Unlike co-location facilities that can be built out and provide “fair access” under Commission review, memory sharing can be offered only to a few players.  This violates any notion of market fairness and demands the Commission’s immediate attention.

The dissemination of market data, which exchanges sell to trading firms and third-party vendors like Bloomberg, should also be subject to a thorough review.  Two questions in particular should guide such an examination:  Is market data reasonably priced for all market participants and is it equally available to trading firms and individual investors alike?  If high-frequency trading firms are able to capitalize on information advantages and disadvantage investors, in part because publicly available quotes are slow, stale and non-comprehensive, then the credibility of our markets is threatened.


Modernizing Execution Benchmarks, Measurement and Enforcement

I also want to propose specific steps to modernize best execution benchmarks and reporting standards.  Because current best execution standards are antiquated and incomplete, our present inability to measure and enforce execution fairness is unacceptable. 

As I understand current regulatory practices, the benchmark for enforcing execution fairness consists of consolidating all the best bids and offers over some interval of seconds due to the time it takes to consolidate the NBBO and disseminate it to the various market centers.  The Rule 605 forms, which purport to measure execution quality, are woefully outdated. The first column for time for execution reads “0-9 seconds.” In a gap of even a few seconds, prices can change significantly. In a world of 50 market venues, with structural latency issues being targeted by an entire industry of high frequency traders, millions of trades reaping millions of dollars can take place before retail investors and the regulators who protect their interests can comprehend what happened.

Moreover, due to payment for order flow, we must question whether certain broker-dealers are acting in the best interests of their customers, under cover of flawed regulation and antiquated enforcement techniques. Average investors must now wonder if their orders are being routed to a venue because it offers the best execution quality for them, or because it leads to the most revenue or lowest transaction fees for their brokers.  At the same time, substantial numbers of trades are taking place in dark pools, which are insufficiently monitored by regulators and which undermine public price discovery.

               As a consequence, under the current framework, I am concerned that regulators no longer have the tools necessary to carry out this critically important responsibility.  Below, I have outlined several areas that merit careful examination and review in order to ensure that our markets are fair, transparent and effectively regulated.

First, we need to address two aspects of the consolidated NBBO mechanism as currently implemented and disseminated.   To begin, we must create a better, more precise benchmark, perhaps by requiring all quote data contributing to the industry consolidated “tape” to carry a microsecond time stamp representing the precise time of the creation of the data at the originating venue. Some such change is needed to create a precise, better-synchronized benchmark for post-trade comparison to a best bid and offer – in a high-frequency-trading environment that now operates in microseconds and even talks in terms of nanoseconds.  Next, the communications infrastructure surrounding the dissemination of market data must be re-engineered for maximum throughput and distribution efficiency.

Second, we must mandate that all venues synchronize and report all order and trading transactions to the microsecond, not rounded to the nearest second as is currently acceptable.  This is especially important given that all venues must be better able to measure execution against high frequency traders who can place many thousands of orders in a single second, and who have the highest speed trading systems and the shortest communication paths with the most bandwidth.
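To make the benchmark these two points describe concrete, the following is a minimal, hypothetical sketch of how a post-trade NBBO could be reconstructed from microsecond-timestamped venue quotes. The venue names, epoch, and quote values are all illustrative assumptions, not any actual feed format.

```python
# Hypothetical sketch: reconstructing an NBBO benchmark at a precise
# microsecond from timestamped quotes, as proposed above.
# Venue names and prices are illustrative, not real market data.
from dataclasses import dataclass

@dataclass
class Quote:
    venue: str
    ts_us: int      # microseconds since a shared, synchronized epoch
    bid: float
    ask: float

def nbbo_at(quotes, t_us):
    """Best bid/offer across venues using each venue's latest quote as of t_us."""
    latest = {}
    for q in sorted(quotes, key=lambda q: q.ts_us):
        if q.ts_us <= t_us:
            latest[q.venue] = q      # later quotes overwrite earlier ones
    if not latest:
        return None
    best_bid = max(q.bid for q in latest.values())
    best_ask = min(q.ask for q in latest.values())
    return best_bid, best_ask

quotes = [
    Quote("VENUE_A", 1_000_000, 10.00, 10.02),
    Quote("VENUE_B", 1_000_150, 10.01, 10.03),
    Quote("VENUE_A", 1_000_300, 10.02, 10.04),  # arrives after the benchmark time
]
print(nbbo_at(quotes, 1_000_200))  # (10.01, 10.02)
```

With second-level rounding, all three quotes would appear simultaneous and the benchmark would be ambiguous; microsecond stamps make the comparison well defined.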

Third, we need a new generation of execution quality reports and a new process to ensure that they stay relevant.  At a minimum Rule 605 reports need to be rethought and modernized in order to better reflect the realities of current market structure. 

  • The biggest problem with the Rule 605 report is that it does not follow the life of an investor’s order once it is in the market, as was the original intent.  Rather, it simply follows the investor’s order while it is in the hands of the reporting market center.  Rule 605 (in conjunction with Rule 606) was intended to cover execution and routing practices but has never been updated to consider current smart-routing and black-box market-making practices.
  • Moreover, retail broker-dealers should be required to file a Rule 605 report, as they are closest to the customer and thus should bear the burden of demonstrating to the customer that his or her orders have been filled fairly.
  • Rule 605 needs to be adjusted to allow for reporting speeds at the sub-second level.

Fourth, regulators should consider implementing a new generation of execution quality statistics and should reconsider certain existing statistics for the benefit of investors.  It may be that gaining a more complete history of orders, even using the current Rule 605 statistical measures, would help regulators to understand what new statistical measures might be more appropriate.  Here are several possible examples:

  • The existing methodology used to calculate realized spreads (five minutes after the execution) should be revised to incorporate a variable post-trade reference time to match the trading activity of each individual security more closely.
  • The order categories specified under the Rule should be replaced by dollar bands based upon order value (i.e. shares multiplied by market price). This also forces more orders to be covered under the Rule since covered orders are now limited to 100 to 9,999 shares.
  • A “covered trades” tally should be added to provide greater transparency as to the behavior of a market center.  Presently, only a count of orders must be reported, not a count of trades.
  • The current definition of “near-the-quote limit orders” (within 10 cents of the NBBO) should be changed to a range more closely associated with the price of each security.
  • Statistics like average amount of price improvement and amount of price dis-improvement per order could help brokers and investors identify the savings or costs associated with any given transaction.
  • A metric for “size improvement” that relates order and trade size to quoted size, measuring the positive impact of market centers that consistently execute at more size than the displayed size at the inside market.
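The first bullet above, a realized spread computed against a variable post-trade reference time rather than the fixed five-minute convention, can be sketched as follows. This is a hypothetical illustration under assumed inputs; the function name, sign convention, and prices are my own, not Rule 605's.

```python
# Hypothetical sketch of a realized-spread calculation with a variable
# post-trade reference time. Rule 605's fixed 5-minute delay is replaced
# here by a per-security delay chosen to match each security's trading
# activity. All numbers are illustrative.
def realized_spread(side, exec_price, mid_at_ref):
    """Signed realized spread: positive means the liquidity provider profited."""
    sign = 1 if side == "buy" else -1      # customer buy = provider sell
    return 2 * sign * (exec_price - mid_at_ref)

# Customer buys at 10.03; at the per-security reference time (say, five
# seconds later for an active stock) the midpoint has fallen to 10.01.
print(realized_spread("buy", 10.03, 10.01))   # ~0.04
print(realized_spread("sell", 10.00, 10.01))  # ~0.02
```

The point of the variable reference time is that five minutes is an eternity for an actively traded stock but may be too short for an illiquid one; tying the delay to each security's activity makes the statistic comparable across names.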

Fifth, the scope of Rule 605 should be expanded to include a full set of execution quality measurements for filled limit orders from the time they become marketable. Presently, nine of the key Rule 605 statistics defined for market and marketable limit orders are not required to be calculated or reported for limit orders, even though limit orders now represent up to 95% of all orders placed.  

Sixth, the definition of covered securities should be broadened to include all securities where reliable quotes are disseminated and an NBBO can be computed. An example of this would be OTC Bulletin Board stocks which are currently excluded under Rule 605.

Seventh, execution data generally should be made readily accessible to customers in a more readable format.  This can be accomplished, for example, through web-based viewers and data mining tools. In this way, the current standardized format can be retained as it facilitates the electronic handling and monitoring of each venue’s Rule 605 results for regulatory and analysis purposes.

Eighth, and most importantly, regulators must agree on written guidelines for statistical deviation measures applicable to all important Rule 605 statistics, especially those that represent true “red-flags” on which regulators will focus.
