“The True Sign of Intelligence”
Remarks by CFTC Commissioner Scott D. O’Malia, Stevens Institute of Technology, Hanlon Financial Systems Lab
June 19, 2012
Albert Einstein said, “The true sign of intelligence is not knowledge but imagination.” I like the quote very much, and it reminds me of how the Commission must approach its oversight mission going forward. I believe technology must play a larger and more foundational role in the Commission's execution of its market oversight responsibilities, its risk management, and its protection of customer funds.
Clearly, the evolution to an almost exclusively electronic futures market over the past two decades has benefited from a massive amount of imagination in terms of the creation of new algorithmic trading strategies through the application of cutting-edge technology. While the Commission doesn't need to be on the cutting edge of technological application, we must be more imaginative in developing our own technology strategy and must not accept the status quo.
It is a pleasure to be invited to speak at Stevens Institute, which not only imagined, but developed, a first-class financial engineering research program and facility. I have to believe this is what Einstein was referring to when he described the true sign of intelligence: the application of knowledge and imagination.
Prior to my nomination to the Commission, I served as the Clerk of the Senate Committee on Appropriations, Subcommittee on Energy and Water, where I had responsibility for funding the Department of Energy (“DoE”). What few people realize about DoE is that it also funds research on everything from nuclear physics to the nuclear weapons program. Simulating, solving and understanding the most challenging physics questions have pushed, and will continue to push, the bounds of computing. And it was my job to fund this cutting-edge technology.
After the ban on underground nuclear testing, the weapons program relied heavily on computer simulation that required bigger and faster computers than what existed. So, we invested hundreds of millions of research dollars into advanced computing and simulation. It took real imagination to conceive of the yet-to-be developed massively parallel computing platforms that are used today. I am especially proud of the Roadrunner supercomputer, a joint effort with Los Alamos National Laboratory and IBM to create what was then the world’s fastest computer. On May 25, 2008, Roadrunner became the first computer to break and sustain the petaFLOP barrier by processing more than 1.026 quadrillion calculations per second.
My Appropriations experience was invaluable: it provided me with an understanding of the potential of advanced computing, and I am determined to make technology the foundation of the Commission’s oversight, risk management and customer protection responsibilities.
Today I would like to share with you my recommendations regarding a technology strategy for the Commission. First, I will provide you with the current technology state of play and explain why it is important for the Commission to develop a comprehensive technology strategy that builds on appropriate concepts of knowledge management. Second, I will talk about how academia, the private sector, and other federal agencies must play a critical role in expanding the Commission’s technological capacity in the shortest time frame, while using the most cost-effective means possible. Finally, I would like to talk a little bit about the current issues I am working on as Chairman of the Commission’s Technology Advisory Committee.
Current State of Play: the Commission Lacks a Technology Investment Strategy.
The current state of the Commission’s overall investment and use of technology is unacceptable and unsustainable. Investing in technology has, to my knowledge, never been a priority. To say that we have failed to keep pace with the markets is an understatement. Our lack of focus and imagination in this critical area is a direct result of not having an investment strategy that identifies specific technology objectives tied to our surveillance and oversight mission responsibilities.
Much has been said about the awesome responsibility of overseeing the swaps market in terms of manpower and computing power. Without a doubt, the Dodd-Frank Act expands our oversight responsibility to include the swaps market, which notionally represents a $650 trillion global market of which roughly $300 trillion are U.S. jurisdictional trades.1 This figure eclipses the notional size of the futures market, which stands at roughly $35 trillion.
On a gross value basis, however, the total global swaps market is just $20 trillion.2 On a per trade basis, which is specifically relevant to our technology discussion because this is the level at which our software will have to engage, the two markets bear no resemblance. CME's Globex system executes approximately 13 million trades per day,3 which pales in comparison to the 250 million order messages that pass across that same system on a daily basis. In contrast, only tens of thousands of swaps trade globally on a daily basis.
The Commission has expressed an interest in collecting and analyzing futures order data, an idea that I support. But it remains unclear what the system requirements of such an endeavor will be. And this problem is further exacerbated by our failure to clearly articulate the ultimate purpose for which we will utilize the data. Before we invest in new technology, we need to specifically identify our goals and determine the system requirements that flow from them. On several occasions, I have expressed my concern that the Commission applies a "Ready, fire, aim" approach in developing its Dodd-Frank rulemakings; we must not make the same mistakes when developing a technology strategy.
There is no doubt the Commission needs to focus intently on deploying new technology that is properly scaled to the markets it is tasked with regulating. Nevertheless, we need to develop our capabilities to collect, aggregate, store, and analyze the data necessary to oversee and monitor the swaps, futures and options markets, because it is our statutory duty to do so. The main questions are: what are our informational needs, and how should we prioritize technology investment to meet them?
I believe there are three basic components to the Commission’s development and execution of a successful technology strategy. The first element is to develop a needs-based strategy. The second element is to organize the Commission in a manner that effectively utilizes new technology. The third and final element is determining the financial requirements, which includes finding creative solutions to bring it all together.
Knowledge Management/Needs-based Strategy
With regard to a needs-based system, I believe that we have failed to employ concepts of knowledge management. We have a lot of data being manipulated and organized into information. But we are missing the last step: turning that information into comprehensive knowledge about the markets, trading strategies and behaviors, risks, weaknesses, and so on.
We need to be asking each office and division within the Commission: What information do you need to accomplish your duties and mission critical goals? What are your concerns? What keeps you up at night? Flash crash? Manipulation? Missing customer money? Risk Management? We need to understand their priorities and develop a clear strategy based on that information.
Organization: Office of Data and Technology is on the Right Path
With regard to the Commission's organizational structure, I believe we are making good progress. In 2011, I urged the Commissioners to reorganize the Commission and create an Office of Data and Technology with responsibility for the intake, organization and management of data from both the futures and swaps markets. This office joined our technology group with market oversight staff. The combined team is now able to focus on developing automated data collection, aggregation and surveillance. Thus far, I have been impressed with this new initiative. But I still believe we have to do more to remove the silos among the other divisions to ensure their level of technology adoption is maximized and our investment priorities are well-defined. We don't have the financial luxury to develop redundant systems.
Big Data Challenges
The Commission is facing a big data crisis, but I use the term “big data” loosely. While our data challenges are big for the Commission, they pale in comparison to those of many companies in our industry. When I asked our staff about our storage challenges, they indicated that five terabytes of data is pushing the limits of our storage capacity. However, when I spoke with a hedge fund manager about their data demands, I was told that this one fund collects over seven terabytes per day.
The Commission is pushing the limits of both our hardware and software capabilities. The lack of data storage and processing capacity is undermining our oversight and surveillance capabilities. Earlier this month, our new surveillance chief, Matthew Hunter, gave me a demonstration of new visualization surveillance tools, and on several occasions the system crashed, unable to handle the computing demands.
Budgets
Since arriving at the Commission, I have argued for a greater share of our budgets to be dedicated to technology. The Commission has consistently requested additional funding for technology. However, when technology competes with new hires, technology always loses. Congress directed the Commission to provide a minimum floor for technology. In Fiscal Year 2011, Congress directed the Commission to spend not less than $37.2 million; we spent the minimum.
In Fiscal Year 2012, the Commission received $55 million for technology and then immediately requested that $10 million of the funds be reprogrammed to support new hires, leaving the Commission with just $45 million.
Congress has been very sympathetic to the Commission’s technology needs, pushing us to invest more in technology, only to be told, “No, we have other, higher priorities.” I hope Congress continues to push the Commission to do more with technology and fences the money off to prevent it from being used for other purposes.
Our fate for fiscal year 2013 remains unresolved, but we have requested $96 million for technology. We based this request on a two-page budget narrative, which constitutes our entire technology strategy. You cannot tell where funding for Blackberries ends and investment in new surveillance hardware begins. And that may not be the worst of our problems.
Let me make one final point about applying imagination so that we have an intelligent financial regulatory system. The new Office of Financial Research (OFR) is being established to be the aggregator of all financial data and to perform cross-market analysis and research. It will use the data collected by the various prudential regulators and undertake research across markets to spot trends or concerns that aren't apparent to the front-line regulators. This only works if we show a little imagination and develop a system that leverages the work of the individual agencies, rather than duplicates it. Duplication and redundancy are the default position of every other federal agency; they waste resources we don't have and cause us to miss opportunities to solve our big data problems together.

The OFR should consider adopting a role similar to that of the Office of Science and Technology Policy (“OSTP”). Recently, OSTP coordinated a cross-agency research effort on “Big Data” in which six Federal departments and agencies announced more than $200 million in new commitments toward solving big data questions.4 A similar approach could be used by the OFR.

Oh, and remember I told you about Roadrunner? Well, on Monday, DoE announced that it awarded contracts to various technology vendors to develop systems that operate at one quintillion calculations per second, or a thousand times faster than Roadrunner.5 And they are going to do that without escalating energy consumption. They refer to this type of research and development as “extreme-scale.” Here at the Commission, we need to start by just getting on a scale.
Technology Investment Strategy
I believe our lack of commitment to technology stems from the fact that we don’t have a specific technology strategy. There is a recognition of the importance of technology, but it is accompanied by an incomplete understanding of how technology will be specifically applied.
Leveraging Academia
Now let me turn to my second point and a new opportunity the Commission should aggressively pursue: institutionalizing the Commission’s interaction and cooperation with academia. The bottom line is that our limited interaction should be expanded. This could provide the critical, cost-effective expertise that the agency so desperately needs to address difficult and challenging problems in technology, market microstructure and critical data analysis.
Since arriving at the Commission, I have been seeking economic analysis and market structure studies on everything from the impact of changing the price of wheat storage to the role high frequency trading plays in our markets. I also re-established the Commission’s Technology Advisory Committee (the “TAC”), along with two subcommittees composed of people who are known in their fields as visionaries. I have been impressed with the breadth and depth of the insight and analysis presented, and I am convinced we aren’t doing enough to tap these academic resources.
As I noted in my opening statement, my previous experience working with the Department of Energy and the national laboratories exposed me to the opportunities to establish collaborative research with universities and other research institutions in centers of excellence that are already tackling difficult technology challenges. Whether it was on fusion energy, high performance computing, or nanotechnology, collaborative research was an essential solution to leverage investment in scientific manpower, facilities and technical expertise. I am convinced the government was able to achieve more than if it tried to solve these problems all on its own. This cooperation wasn’t limited to academia, and the private sector contributed as well.
To date, the Commission has applied such a model in a more limited fashion by contracting with economists in the Office of the Chief Economist. This symbiotic relationship benefits researchers who are provided access to market data, and the Commission, which is able to leverage this research for its own market oversight objectives. Unfortunately, this model doesn’t scale well and is not the ultimate solution for the CFTC’s collaboration problems.
New Model to Institutionalize Academic Interaction with the CFTC
It is time we created a new model for the CFTC to draw on the vast expertise of universities, like the Stevens Institute. I believe there are two ways in which the Commission can formalize this relationship to support the creation of new analytical and automated surveillance tools, risk modeling and cross market analytical capabilities, just to name a few.
The first option would be to develop contracting tools with universities that would enable the Commission to utilize an academic institution to research a specific question and allow the university to assemble the best and brightest to tackle the question at hand. It could draw on its own academic expertise to vet and review the research, saving the Commission from creating its own redundant academic review teams. I believe the Stevens Institute performs a similar role for the Department of Defense.
The second option might be to establish a CFTC Academic Advisory Panel. The role of the panel would be to recommend research topics, select and review competitive research awards, and help solve specific market structure questions for the Commission. This would be different from the Joint CFTC-SEC Advisory Committee and would take a more hands-on approach to solving specific questions presented by the various divisions within the Commission.
I am interested in exploring these opportunities for the Commission and how this type of collaboration might work within the confines of our bureaucracy and governing statutes. I believe this offers a scalable approach to solving specific research questions. If we collaborate with researchers by providing access to market data on a confidential basis, in the same manner as provided today to the in-house economists, this approach can be extraordinarily cost effective and of great academic interest to universities.
I would like to work with Stevens Institute and others to better define and institutionalize this academic cooperation.
The Technology Advisory Committee
Now let me turn to my final topic: the agenda of the Technology Advisory Committee. We recently completed the second year of the reorganized TAC. In reflecting upon my first two years as Chairman of the TAC, I am incredibly proud of what our crew of 24 was able to accomplish. We have covered such issues as pre-trade functionality and pre-trade credit checks, data collection standards, technological surveillance and compliance, the deployment of technology solutions in the swaps market, and most recently, algorithmic and high frequency trading. To put a finer point on what we have accomplished since bringing back the TAC in June 2010 with its 24 charter members, we have:
- Held seven public meetings;
- Established the 19-member Subcommittee on Data Standardization charged with providing recommendations based on public/private solutions for creating well-accepted standards for describing, communicating, and storing data on complex financial products;
- Established the 23-member Subcommittee on Automated and High Frequency Trading charged with advising the Commission as to a working definition of high frequency trading (“HFT”) in the context of automated trading strategies;
- Issued Recommendations on Pre-Trade Practices for Trading Firms, Clearing Firms and Exchanges involved in Direct Market Access; and
- Issued recommendations on data standardization through the use of legal entity and product identifiers.
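The identifier recommendation is a good example of a standard that lends itself directly to automation. Below is a minimal sketch, purely illustrative and not any system the Commission has built, of validating the check digits of an ISO 17442 legal entity identifier (LEI) using the ISO 7064 MOD 97-10 scheme; the 18-character prefix in the example is a hypothetical placeholder.

```python
# Illustrative only: check-digit validation for a 20-character LEI (ISO 17442).
# Letters map to two-digit numbers (A=10 ... Z=35); a valid LEI, read as one
# large integer after that mapping, satisfies: value mod 97 == 1.

def lei_check_digits(prefix18: str) -> str:
    """Compute the two check digits for an 18-character LEI prefix."""
    numeric = "".join(str(int(c, 36)) for c in (prefix18 + "00").upper())
    return f"{98 - int(numeric) % 97:02d}"

def is_valid_lei(lei: str) -> bool:
    """Return True if a 20-character LEI passes the MOD 97-10 check."""
    if len(lei) != 20 or not lei.isalnum():
        return False
    numeric = "".join(str(int(c, 36)) for c in lei.upper())
    return int(numeric) % 97 == 1

prefix = "INVALIDTESTPREFIX1"          # hypothetical 18-character prefix
lei = prefix + lei_check_digits(prefix)
print(lei, is_valid_lei(lei))          # the constructed identifier validates
```

The value of such a standard is exactly that every regulator, exchange and data repository can run the same few lines of arithmetic and agree on whether an identifier is well-formed.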
The Commission recently rechartered the Committee, and as Chairman, I intend to pursue an aggressive new agenda focused on defining high frequency trading and understanding its impact on our markets. The Subcommittee on Automated and High Frequency Trading members have worked very hard to develop a draft definition of high frequency trading with four parts: (1) the use of algorithms for decision making, order initiation, execution, and other functions, without human direction; (2) the employment of low-latency technology, including co-location services; (3) the use of high-speed connections to markets; and (4) high message rates. We discussed this proposal at length during our June meeting, and discussions are ongoing.
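To make the four-part structure concrete, here is a minimal sketch of how such a screening test might be expressed in code. It is purely illustrative and not Commission policy: the TradingProfile fields and the message-rate threshold are hypothetical placeholders, since the draft definition itself sets no numeric thresholds.

```python
# Illustrative only: a screening check mirroring the four-part draft definition.
from dataclasses import dataclass

@dataclass
class TradingProfile:
    algorithms_without_human_direction: bool  # part (1): algorithmic decision making
    low_latency_or_colocation: bool           # part (2): low-latency technology
    high_speed_market_connections: bool       # part (3): high-speed connectivity
    messages_per_second: float                # input to part (4): message rates

HIGH_MESSAGE_RATE = 100.0  # hypothetical threshold; the draft names no number

def meets_draft_hft_definition(p: TradingProfile) -> bool:
    """True only if all four parts of the draft definition are satisfied."""
    return (p.algorithms_without_human_direction
            and p.low_latency_or_colocation
            and p.high_speed_market_connections
            and p.messages_per_second >= HIGH_MESSAGE_RATE)

print(meets_draft_hft_definition(TradingProfile(True, True, True, 250.0)))  # True
```

Whatever final form the definition takes, expressing it this precisely is what will allow exchanges and the Commission to apply it consistently.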
In addition to defining what HFT is, additional Subcommittee working groups are also focused on: (1) identifying specific HFT strategies; (2) quantifying and qualifying the economic impact of HFT strategies on the markets; and (3) identifying the impact HFT is having on market microstructure and liquidity.
The Subcommittee's work will continue this fall, culminating in final recommendations.
Concept Release on High Frequency Trading
There has been a considerable amount of discussion about a draft concept release recommending various testing and supervision requirements. I am not satisfied with the current draft. First, it fails to appropriately identify the specific exchange-level supervisory tools already in place. Second, it is in direct conflict with itself regarding recent rulemakings under the Dodd-Frank Act. It appropriately states that we have new rules in place, like the external and internal business conduct rules, directed at many of the issues identified as problematic. At the same time, however, it dismisses these rules as insufficient in order to justify the necessity of the concept release.
At a minimum, the proposal needs to be rewritten to explain where the existing market controls and Commission rules fall short and make specific recommendations as to how these gaps can be filled. Development of a gap analysis could be of use, but until this proposal clearly outlines possible problems related to specific regulatory gaps it is tough to rationalize spending already limited Commission and industry time on this effort when implementation of Dodd-Frank is so demanding.
Applying Technology to Protect Customers Funds
Unfortunately, as a result of the failure of a second futures commission merchant (FCM), Peregrine Financial Group, Inc., with a reported shortfall in customer funds in excess of $200 million,6 I have had to call an emergency meeting of the TAC on July 26th to discuss technology solutions to better protect customer funds. I have asked the TAC to explore an industry-led technology solution, overseen by the CFTC, to help prevent scandals like those seen at MF Global and now Peregrine before they occur again.
In an age when I can check my bank balance on my smartphone, there is no reason that futures customers cannot go to bed every night knowing their investments are safe and where they should be. Faxing or mailing customer fund documentation on a monthly basis is not an intelligent oversight program. We must show much more imagination to develop a fully automated solution that confirms balances daily with independent verification and sends automated alerts to the Commission, banks and FCMs when balances don't reconcile.
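A minimal sketch of that kind of daily reconciliation follows, assuming hypothetical data feeds: the account names, balance figures and tolerance are invented for illustration, and a real system would pull the FCM-reported and bank-confirmed balances over secure, independent channels.

```python
# Illustrative only: daily reconciliation of FCM-reported vs. bank-confirmed
# balances for segregated customer accounts, with alerts when they disagree.
from decimal import Decimal

TOLERANCE = Decimal("0.01")  # hypothetical tolerance for rounding differences

def reconcile(fcm_reported: dict[str, Decimal],
              bank_confirmed: dict[str, Decimal]) -> list[str]:
    """Return an alert message for every account that fails to reconcile."""
    alerts = []
    for account in sorted(set(fcm_reported) | set(bank_confirmed)):
        reported = fcm_reported.get(account)
        confirmed = bank_confirmed.get(account)
        if reported is None or confirmed is None:
            alerts.append(f"{account}: missing a daily confirmation")
        elif abs(reported - confirmed) > TOLERANCE:
            alerts.append(f"{account}: FCM reports {reported}, bank confirms {confirmed}")
    return alerts

# Hypothetical example: one account reconciles, one is short, one is unconfirmed.
fcm = {"SEG-001": Decimal("1000000.00"), "SEG-002": Decimal("500000.00"),
       "SEG-003": Decimal("250000.00")}
bank = {"SEG-001": Decimal("1000000.00"), "SEG-002": Decimal("300000.00")}
for alert in reconcile(fcm, bank):
    print("ALERT:", alert)  # in practice: notify the Commission, the bank and the FCM
```

The point is not the few lines of code; it is that the comparison runs every day, against independently sourced numbers, without waiting for a fax.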
The TAC, along with academia and the private sector can help provide that imaginative spark we need to develop a more intelligent technology program at the Commission.
I have covered the three areas I outlined in the beginning: the need for a strategic and specific technology plan; the importance of leveraging and institutionalizing academic interaction with the Commission; and an update on the important work of the Technology Advisory Committee.
Before I close, I want to reiterate the importance of the Commission focusing its attention on developing better rules to protect customer money. While I believe strongly that a technology solution is essential to preventing unscrupulous money managers from getting away with fraud or theft, there are other operational and rule changes that are also appropriate. While I will not go into the specifics, I gave a speech in January at New York Law School7 that outlined several specific short-, medium- and long-term changes that would improve customer protection and protect the system against operational risk going forward. I stand by those proposals, and I hope that the Commission and Congress will address each and every one of them going forward.
Thank you very much.
1 BANK FOR INTERNATIONAL SETTLEMENTS, STATISTICAL RELEASE: OTC DERIVATIVES STATISTICS AT END-DECEMBER 2011, available at http://www.bis.org/publ/otc_hy1205.pdf.
2 Id.
3 CME Group Volume Averaged 13.1 Million Contracts per Day in June 2012, PR Newswire, available at http://www.prnewswire.com/news-releases/cme-group-volume-averaged-131-million-contracts-per-day-in-june-2012-161191235.html.
4 Press Release, Office of Science and Technology Policy, Executive Office of the President, Obama Administration Unveils “Big Data” Initiative: Announces $200 Million in new R&D Investments (Mar. 29, 2012), available at http://www.whitehouse.gov/sites/default/files/microsites/ostp/big_data_press_release_final_2.pdf.
5 Patience Wait, Energy Department Pursues Exascale Computing, InformationWeek, July 16, 2012, available at http://www.informationweek.com/news/government/enterprise-architecture/240003788.
6 Press Release, Commodity Futures Trading Commission, CFTC Files Complaint Against Peregrine Financial Group, Inc. and Russell R. Wasendorf, Sr. Alleging Fraud, Misappropriation of Customer Funds, Violation of Customer Fund Segregation Laws, and Making False Statements (July 10, 2012), available at http://www.cftc.gov/PressRoom/PressReleases/pr6300-12.
7 Scott D. O’Malia, Commissioner, CFTC, Address at the Center on Financial Services Law, New York Law School: Where are We? And Where Should We Be? Thoughts on MF Global and High Frequency Trading (Jan. 31, 2012), available at http://www.cftc.gov/PressRoom/SpeechesTestimony/opaomalia-11.