
Tuesday, November 18, 2008

Bill Janeway interview

Via The Big Picture, this wonderful interview with Bill Janeway. Janeway was trained as an academic economist (PhD Cambridge), but spent his career on Wall Street, most recently in private equity. I first met Bill at O'Reilly's foo camp; we've had several long conversations about finance and the markets. The interview is long, but read the whole thing! Topics covered include: physicists and quants in finance, mark to market, risk, regulatory and accounting regimes, market efficiency.

The IRA: How did we get into this mess?

Janeway: It took two generations of the best and the brightest who were mathematically quick and decided to address themselves to the issues of capital markets. They made it possible to create the greatest mountain of leverage that the world has ever seen.

In my own way, I do track it back to the construction of the architecture of modern finance theory, all the way back to Harry Markowitz writing a thesis at the University of Chicago which Milton Friedman didn’t think was economics. He was later convinced to allow Markowitz to get his doctorate at the University of Chicago in 1950. Then we go on through the evolution of modern finance and the work that led to the Nobel prizes, Miller, Modigliani, Scholes and Merton.

The core of this grand project was to reconstruct financial economics as a branch of physics. If we could treat the agents, the atoms of the markets, people buying and selling, as if they were molecules, we could apply the same differential equations to finance that describe the behavior of molecules. What that entails is to take as the raw material, time series data, prices and returns, and look at them as the observables generated by processes which are stationary. By this I mean that the distribution of observables, the distribution of prices, is stable over time. So you can look at the statistical attributes like volatility and correlation amongst them, above all liquidity, as stable and mathematically describable. So consequently, you could construct ways to hedge any position by means of a “replicating portfolio” whose statistics would offset the securities you started with.

There is a really important book written by a professor at the University of Edinburgh named Donald MacKenzie. He is a sociologist of economics and he went into the field, onto the floor in Chicago and the trading rooms, to do his research. He interviewed everybody and wrote a great book called An Engine, Not a Camera. It is an analytical history of the evolution of modern finance theory. Where the title comes from is that modern finance theory was not a camera to capture how the markets worked, but rather an engine to transform them.

...

Janeway: Yes, but here the agents were principals! I think something else was going on. It was my son, who worked for Bear, Stearns in the equity department in 2007, who pointed out to me that Bear, Stearns and Lehman Brothers had the highest proportion of employee stock ownership on Wall Street. Many people believed, by no means only the folks at Bear and Lehman, that the emergence of Basel II and the transfer to the banks themselves of responsibility for determining the amount of required regulatory capital based upon internal ratings actually reduced risk and allowed higher leverage. The move by the SEC in 2004 to give regulatory discretion to the dealers regarding leverage was the same thing again.

The IRA: And both regimes falsely assume that banks and dealers can actually construct a viable ratings methodology, even relying heavily on vendors and ratings firms. There are still some people at the BIS and the other central banks who believe that Basel II is viable and effective, but none of the risk practitioners with whom we work has anything but contempt for the whole framework. It reminds us of other utopian initiatives such as fair value accounting or affordable housing, everyone sells the vision but misses the pesky details that make it real! And the same religious fervor behind the application of physics to finance was behind the Basel II framework and complex structured assets.

Janeway: That’s my point. It was a kind of religious movement, a willed suspension of disbelief. If we say that the assumptions necessary to produce the mathematical models hold in the real world, namely that markets are efficient and complete, that agents are rational, that agents have access to all of the available data, and that they all share the same model for transforming that data into actionable information, and finally that this entire model is true, then at the end of the day, leverage should be infinite. Market efficiency should rise to the point where there isn’t any spread left to be captured. The fact that a half a percent unhedged swing in your balance sheet can render you insolvent, well it doesn’t fit with this entire constructed intellectual universe that goes back 50 years.

...

Janeway: There are a couple of steps along the way here that got us to the present circumstance, such as the issue of regulatory capture. When you talk about regulatory capture and risk, the capture here of the regulators by the financial industry was not the usual situation of corrupt capture. The critical moment came in the early 1980s, which is very well documented in MacKenzie’s book, when the Chicago Board appealed to academia because it was then the case that in numerous states, cash settlement futures were considered gambling and were banned by law.

...

Janeway: The point here is that the regulators were captured intellectually, not monetarily. And the last to be converted, to have the religious conversion experience, were the accountants, leading to fair value accounting rules. I happen to be the beneficiary of a friendship with a wonderful man, Geoff Whittington, who is a professor emeritus of accounting at Cambridge, who was chief accountant of the British Accounting Standards Board and was a founder of the International Accounting Standards Board. He is from the inside an appropriately knowledgeable, balanced skeptic, who has done a wonderful job of parsing out what is involved in this discussion in a paper called “Two World Views.” Basically, he says that if you really do believe that we live in a world of complete and efficient markets, then you have no choice but to be an advocate of fair value, mark-to-market accounting. If, on the other hand, you see us living in a world of incomplete, but reasonably efficient markets, in which the utility of the numbers you are trying to generate have to do with stewardship of a business through real, historical time rather than a snapshot of “truth,” then you are in a different world. And that is a world where the concept of fair value is necessarily contingent.

Previous posts on Donald MacKenzie's work. MacKenzie is perhaps the most insightful of academics working on the history and development of modern finance.

Tuesday, May 24, 2011

MacKenzie on high frequency trading

Donald MacKenzie writes on high frequency trading in the London Review of Books.

Earlier posts: MacKenzie on the credit crisis, two book reviews

LRB: ... No one in the markets contests the legitimacy of electronic market making or statistical arbitrage. Far more controversial are algorithms that effectively prey on other algorithms. Some algorithms, for example, can detect the electronic signature of a big VWAP, a process called ‘algo-sniffing’. This can earn its owner substantial sums: if the VWAP is programmed to buy a particular corporation’s shares, the algo-sniffing program will buy those shares faster than the VWAP, then sell them to it at a profit. Algo-sniffing often makes users of VWAPs and other execution algorithms furious: they condemn it as unfair, and there is a growing business in adding ‘anti-gaming’ features to execution algorithms to make it harder to detect and exploit them. However, a New York broker I spoke to last October defended algo-sniffing:

"I don’t look at it as in any way evil … I don’t think the guy who’s trying to hide the supply-demand imbalance [by using an execution algorithm] is any better a human being than the person trying to discover the true supply-demand. I don’t know why … someone who runs an algo-sniffing strategy is bad … he’s trying to discover the guy who has a million shares [to sell] and the price then should readjust to the fact that there’s a million shares to buy."

Whatever view one takes on its ethics, algo-sniffing is indisputably legal. More dubious in that respect is a set of strategies that seek deliberately to fool other algorithms. An example is ‘layering’ or ‘spoofing’. A spoofer might, for instance, buy a block of shares and then issue a large number of buy orders for the same shares at prices just fractions below the current market price. Other algorithms and human traders would then see far more orders to buy the shares in question than orders to sell them, and be likely to conclude that their price was going to rise. They might then buy the shares themselves, causing the price to rise. When it did so, the spoofer would cancel its buy orders and sell the shares it held at a profit. It’s very hard to determine just how much of this kind of thing goes on, but it certainly happens. In October 2008, for example, the London Stock Exchange imposed a £35,000 penalty on a firm (its name has not been disclosed) for spoofing.

... As Steve Wunsch, one of the pioneers of electronic exchanges, put it in another TABB forum discussion, US share trading ‘is now so complex as a system that no one can predict what will happen when something new is added to it, no matter how much vetting is done.’ If Wunsch is correct, there is a risk that attempts to make the system safer – by trying to find mechanisms that would prevent a repetition of last May’s events, for example – may have unforeseen and unintended consequences.

Systems that are both tightly coupled and highly complex, Perrow argues in Normal Accidents (1984), are inherently dangerous. Crudely put, high complexity in a system means that if something goes wrong it takes time to work out what has happened and to act appropriately. Tight coupling means that one doesn’t have that time. Moreover, he suggests, a tightly coupled system needs centralised management, but a highly complex system can’t be managed effectively in a centralised way because we simply don’t understand it well enough ...


Friday, October 05, 2007

Two book reviews

I've had both of these on my shelf for some time, but haven't found time to write detailed reviews.


Google's PageRank and Beyond, Langville and Meyer (Princeton University Press).

Written by two math professors, this is the best technical account I could find of search algorithms. The math (mainly linear algebra and a little graph theory) is accessible and introduced in a self-contained way in a separate chapter. The coverage isn't limited to beautiful algorithmic ideas (the primary one being to find the dominant eigenvector of the matrix representing the graph of hyperlinks) -- the discussion includes nitty gritty details like how to treat dangling nodes, how to accelerate computations, etc. There's also a running historical summary of Google's development up to and including the IPO.

If you're wondering why I have this book, it's not just academic curiosity -- the PageRank algorithm in its basic form can be understood pretty quickly from overviews available online. I'm interested in understanding the current state of the art and the possibility of improvements ;-)
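For the curious, the core idea the book builds on fits in a few lines. This is a minimal sketch of basic PageRank via power iteration (nothing like Google's production system, obviously), including the dangling-node fix the book spends time on: pages with no out-links have their rank mass spread uniformly over the whole web.

```python
import numpy as np

def pagerank(links, alpha=0.85, tol=1e-10):
    """Power iteration on the Google matrix.

    `links[i]` lists the pages that page i links to. Dangling nodes
    (no out-links) donate their mass uniformly to every page; the
    (1 - alpha) term is the usual random-surfer teleportation.
    """
    n = len(links)
    H = np.zeros((n, n))                     # raw hyperlink matrix
    for i, outs in enumerate(links):
        if outs:
            H[i, list(outs)] = 1.0 / len(outs)
    dangling = H.sum(axis=1) == 0
    v = np.full(n, 1.0 / n)                  # start from the uniform vector
    while True:
        v_new = alpha * (v @ H + v[dangling].sum() / n) + (1 - alpha) / n
        if np.abs(v_new - v).sum() < tol:
            return v_new
        v = v_new

# tiny 4-page web: 0 -> 1,2; 1 -> 2; 2 -> 0; page 3 is dangling
print(pagerank([[1, 2], [2], [0], []]))
```

Page 2, which collects links from both 0 and 1, ends up ranked highest; the dangling page 3 ends up lowest.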


An Engine, Not a Camera, D. MacKenzie (MIT Press)

This is the best history of modern finance and options pricing theory I have yet read. MacKenzie has a sufficient understanding of the theory and of the subtle sociological issues involved (strangely, he is not an economist but a sociologist). Figures like Mandelbrot (the mathematician), Thorp (perhaps the real inventor of Black-Scholes) and Osborne (a physicist) appear along with better known economists like Samuelson, Fama, Miller, Sharpe, Black, Scholes, Merton, etc. The section on Mandelbrot and Levy distributions is especially good, as is the account of LTCM. The title is from Milton Friedman, who (controversially) characterized economic theory as an "engine to analyze the world, not a photographic reproduction of it".

Sunday, June 08, 2008

MacKenzie on the credit crisis

Edinburgh sociology professor Donald MacKenzie wrote what I feel is the best history (so far) of modern finance and derivatives. In this article in the London Review of Books, he tackles the current credit crisis. Highly recommended.

On Gaussian copula (cognitive limitations restrict attention to an obviously oversimplified model; big brains were worried from the start):

Correlation is by far the trickiest issue in valuing a CDO. Indeed, it is difficult to be precise about what correlation actually means: in practice, its determination is a task of mathematical modelling. Over the past ten years, a model known as the ‘single-factor Gaussian copula’ has become standard. ‘Single-factor’ means that the degree of correlation is assumed to reflect the varying extent to which fortunes of each debt-issuer depend on a single underlying variable, which one can interpret as the health of the economy. ‘Copula’ indicates that the mathematical issue being addressed is the connectedness of default risks, and ‘Gaussian’ refers to the use of a multi-dimensional variant of the statistician’s standard bell-shaped curve to model this connectedness.

The single-factor Gaussian copula is far from perfect: even before the crisis hit, I wasn’t able to get a single insider to express complete confidence in it. Nevertheless, it became a market Esperanto, allowing people in different institutions to discuss CDO valuation in a mutually intelligible way. But having a standard model is only part of the task of understanding correlation. Historical data are much less useful here. Defaults are rare events, and producing a plausible statistical estimate of the extent of the correlation between, say, the risk of default by Ford and by General Motors is difficult or impossible. So as CDOs gained popularity in the late 1990s and early years of this decade, often the best one could do was simply to employ a uniform, standard figure such as 30 per cent correlation, or use the correlation between two corporations’ stock prices as a proxy for their default correlations.
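MacKenzie's description maps almost line for line onto a simulation. Here is a hedged sketch of the single-factor Gaussian copula: each name's latent variable mixes one common "health of the economy" factor with idiosyncratic noise, and a name defaults when the latent variable falls below a threshold set by its marginal default probability. The 2 per cent default probability is my illustrative assumption; the flat 30 per cent correlation is the standard figure mentioned above.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
n_names, n_sims = 125, 50_000
p = 0.02    # assumed flat one-year default probability per name (illustrative)
rho = 0.30  # the uniform 30 per cent correlation cited in the text

# One-factor Gaussian copula: X_i = sqrt(rho)*M + sqrt(1-rho)*Z_i,
# where M is the single common factor shared by all names.
M = rng.standard_normal((n_sims, 1))
Z = rng.standard_normal((n_sims, n_names))
X = np.sqrt(rho) * M + np.sqrt(1.0 - rho) * Z

defaults = X < norm.ppf(p)       # default when the latent variable crosses the threshold
losses = defaults.sum(axis=1)    # number of defaults per scenario

# Correlation leaves the average default rate at p but fattens the tail:
print(losses.mean() / n_names)   # close to 0.02
print((losses >= 25).mean())     # joint-default scenarios that independence would rule out
```

With independent names, 25 or more defaults out of 125 would be astronomically unlikely; the common factor is what makes such scenarios, the ones that kill senior tranches, show up at all.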

Ratings, indices and implied correlation:

However imperfect the modelling of CDOs was, the results were regarded by the rating agencies as facts solid enough to allow them to grade CDO tranches. Indeed, the agencies made the models they used public knowledge in the credit markets: Standard & Poor’s, for example, was prepared to supply participants with copies of its ‘CDO Evaluator’ software package. A bank or hedge fund setting up a standard CDO could therefore be confident of the ratings it would achieve. Creators of CDOs liked that it was then possible to offer attractive returns to investors – which are normally banks, hedge funds, insurance companies, pension funds and the like, not private individuals – while retaining enough of the cash-flow from the asset pool to make the effort worthwhile. As markets recovered from the bursting of the dotcom and telecom bubble in 2000-2, the returns from traditional assets – including the premium for holding risky assets – fell sharply. (The effectiveness of CDOs and other credit derivatives in allowing banks to shed credit risk meant that they generally survived the end of the bubble without significant financial distress.) By early 2007, market conditions had been benign for nearly five years, and central bankers were beginning to talk of the ‘Great Stability’. In it, CDOs flourished.

Ratings aside, however, the world of CDOs remained primarily one of private facts. Each CDO is normally different from every other, and the prices at which tranches are sold to investors are not usually publicly known. So credible market prices did not exist. The problem was compounded by one of the repercussions of the Enron scandal. A trader who has done a derivatives deal wants to be able to ‘book’ the profits immediately, in other words have them recognised straightaway in his employer’s accounts and thus in the bonus that he is awarded that year. Enron and its traders had been doing this on the basis of questionable assumptions, and accounting regulators and auditors – the latter mindful of the way in which the giant auditing firm Arthur Andersen collapsed having been prosecuted for its role in the Enron episode – began to clamp down, insisting on the use of facts (observable market values) rather than mere assumptions in ‘booking’ derivatives. That credit correlation was not observable thus became much more of a problem.

From 2003 to 2004, however, the leading dealers in the credit-derivatives market set up fact-generating mechanisms that alleviated these difficulties: credit indices. These resemble CDOs, but do not involve the purchase of assets and, crucially, are standard in their construction. For example, the European and the North American investment-grade indices (the iTraxx and CDX IG) cover set lists of 125 investment-grade corporations. In the terminology of the market, you can ‘buy protection’ or ‘sell protection’ on either an index as a whole or on standard tranches of it. A protection seller receives fees from the buyer, but has to pay out if one or more defaults hit the index or tranche in question.

The fluctuating price of protection on an index as a whole, which is publicly known, provides a snapshot of market perceptions of credit conditions, while the trading of index tranches made correlation into something apparently observable and even tradeable. The Gaussian copula or a similar model can be applied ‘backwards’ to work out the level of correlation implied by the cost of protection on a tranche, which again is publicly known. That helped to satisfy auditors and to facilitate the booking of profits. A new breed of ‘correlation traders’ emerged, who trade index tranches as a way of taking a position on shifts in credit correlation.
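The "backwards" application of the model can be made concrete. The sketch below prices a hypothetical 0-3% equity tranche under the large-homogeneous-portfolio version of the one-factor Gaussian copula, then root-finds the correlation that reproduces an observed tranche expected loss. Every input (default probability, recovery, attachment points) is an illustrative assumption, not market data; real correlation traders invert quoted protection costs, not a made-up loss number.

```python
import numpy as np
from scipy.stats import norm
from scipy.optimize import brentq

def equity_tranche_el(rho, p=0.02, detach=0.03, recovery=0.4):
    """Expected loss on a 0-3% equity tranche: one-factor Gaussian copula,
    large-homogeneous-portfolio approximation. Integrates the conditional
    pool loss over the common factor M (all parameters hypothetical)."""
    m = np.linspace(-8.0, 8.0, 4001)
    cond_p = norm.cdf((norm.ppf(p) - np.sqrt(rho) * m) / np.sqrt(1.0 - rho))
    pool_loss = (1.0 - recovery) * cond_p            # conditional loss fraction
    tranche_loss = np.minimum(pool_loss, detach) / detach
    return float(np.sum(tranche_loss * norm.pdf(m)) * (m[1] - m[0]))

# "Backwards": take an observed expected loss, solve for the correlation
# that reproduces it. Here the observation is manufactured at rho = 0.30.
observed = equity_tranche_el(0.30)
implied = brentq(lambda r: equity_tranche_el(r) - observed, 0.01, 0.99)
print(round(implied, 4))
```

Because equity-tranche expected loss is monotone in correlation, the root is unique, which is what makes "implied correlation" a well-defined quoted number in the first place.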

Indices and other tranches quickly became a huge-volume, liquid market. They facilitated the creation not just of standard CDOs but of bespoke products such as CDO-like structures that consist only of mezzanine tranches (which offer combinations of returns and ratings that many investors found especially attractive). Products of this kind leave their creators heavily exposed to changes in credit-market conditions, but the index market permitted them to hedge (that is, offset) this exposure.

Quants and massive computational power (one wonders whether the mathematics and computers did nothing more than lend a spurious air of technicality to untrustworthy basic assumptions):

With problems such as the non-observability of correlation apparently adequately solved by the development of indices, the credit-derivatives market, which emerged little more than a decade ago, had grown by June 2007 to an aggregate total of outstanding contracts of $51 trillion, the equivalent of $7,700 for every person on the planet. It is perhaps the most sophisticated sector of the global financial markets, and a fertile source of employment for mathematicians, whose skills are needed to develop models better than the single-factor Gaussian copula.

The credit market is also one of the most computationally intensive activities in the modern world. An investment bank with a big presence in the market will have thousands of positions in credit default swaps, CDOs, indices and similar products. The calculations needed to understand and hedge the exposure of this portfolio to market movements are run, often overnight, on grids of several hundred interconnected computers. The banks’ modellers would love to add as many extra computers as possible to the grids, but often they can’t do so because of the limits imposed by the capacity of air-conditioning systems to remove heat from computer rooms. In the City, the strain put on electricity-supply networks can also be a problem. Those who sell computer hardware to investment banks are now sharply aware that ‘performance per watt’ is part of what they have to deliver.

Collapse of rating agency credibility:

The rating agencies are businesses, and the issuers of debt instruments pay the agencies to rate them. The potential conflict of interest has always been there, even in the days when the agencies mainly graded bonds, which generally they did quite sensibly. However, the way in which the crisis has thrust the conflict into the public eye has further threatened the credibility of ratings. ‘In today’s market, you really can’t trust any ratings,’ one money-market fund manager told Bloomberg Markets in October 2007. She was far from alone in that verdict, and the result was cognitive contagion. Most investors’ ‘knowledge’ of the properties of CDOs and other structured products had been based chiefly on ratings, and the loss of confidence in them affected all such products, not just those based on sub-prime mortgages. Since last summer, it has been just about impossible to set up a new CDO.

Illiquid assets, difficulty of mark to market:

Over recent months, banks have frequently been accused of hiding their credit losses. The truth is scarier: such losses are extremely hard to measure credibly. Marking-to-market requires that there be plausible market prices to use in valuing a portfolio. But the issuing of CDOs has effectively stopped, liquidity has dried up in large sectors of the credit default swap market, and the credibility of the cost of protection in the index market has been damaged by processes of the kind I’ve been discussing.

How, for example, can one value a portfolio of mortgage-backed securities when trading in those securities has ceased? It has become common to use a set of credit indices, the ABX-HE (Asset Backed, Home Equity), as a proxy for the underlying mortgage market, which is now too illiquid for prices in it to be credible. However, the ABX-HE is itself affected by the processes that have undermined the robustness of the apparent facts produced by other sectors of the index market; in particular, the large demand for protection and reduced supply of it may mean the indices have often painted too uniformly dire a picture of the prospects for mortgage-backed securities. One trader told the Financial Times in April that the liquidity of the indices had become very poor: ‘Trading is mostly happening on interdealer screens between eight or ten guys, and this means that prices can move wildly on very light volume.’ Yet because the level of the ABX-HE indices is used by banks’ accountants and auditors to value their multi-billion dollar portfolios of mortgage-backed securities, this esoteric market has considerable effects, since low valuations weaken banks’ balance sheets, curtailing their capacity to lend and thus damaging the wider economy.

Josef Ackermann, the head of Deutsche Bank, has caused a stir by admitting ‘I no longer believe in the market’s self-healing power.’ ...

Saturday, September 27, 2008

CDOs, auctions and price discovery

How is Treasury going to buy up CDOs and other mortgage backed securities? What is the price discovery mechanism? I've heard discussion of a reverse auction process, in which the government offers a price and owners of the assets decide whether to accept the bid.

But this makes the problem sound much easier than it is. There are no simple or uniform categories for these securities -- no two are exactly alike. I imagine Treasury is going to have to do a lot of homework before each auction, perhaps aided by some sophisticated professionals (Bill Gross of PIMCO recently offered his team's services). Data on each security is available from ratings agencies like S&P and Moody's but presumably one would supplement this with additional information. After some initial analysis Treasury could set a conservative bound (i.e., using pessimistic estimates of future default rates and home prices) on the value of each security in units of the original face value (this one is worth at least 25 cents on the dollar, this one, 45 cents, etc.). Then, they can publish a list of securities in a particular value category (without, of course, giving out the actual value estimate) and conduct a reverse auction covering all the assets on the list.

If they can get the assets below the value estimate, great for taxpayers like you and me. If banks (hedge funds? pension funds? foreign banks? who is really holding all this stuff?) won't sell at prices below the bound, and the auction heads above that price, Treasury should start demanding warrants or equity stakes on some sliding scale. In other words, the bid keeps getting higher, but at some point Treasury starts asking for not only the particular CDO but some additional warrants or stock. (This could also be done on a sliding scale from the beginning of the auction -- Treasury gets an additional x percent of the bid in warrants, where x increases with price.) The equity stake is compensation for the government for having to overpay for the security. At this price there is an (expected) flow of funds from taxpayers to recapitalize the seller, but at least we are getting equity in return. It is claimed that there is a range of values (roughly 20 percent of current market prices) over which the seller would be getting more at auction than the market is currently offering, but the government is still getting a good deal on the asset (expects to make money even under conservative assumptions).
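To make the sliding scale concrete, here is one hypothetical way to parameterize it: no warrants at or below Treasury's conservative value estimate, then a warrant fraction rising linearly up to a cap. The specific numbers and function are mine, purely for illustration, not from any actual Treasury proposal.

```python
def warrant_fraction(bid, floor, cap, x_max=0.5):
    """Hypothetical sliding scale for a reverse auction.

    No warrants at or below the conservative value estimate (`floor`);
    the warrant fraction of the bid rises linearly, topping out at
    `x_max` once the bid reaches `cap`. All parameters illustrative.
    """
    if bid <= floor:
        return 0.0
    return x_max * min((bid - floor) / (cap - floor), 1.0)

# e.g. conservative bound 25 cents on the dollar, auction clears at 40,
# warrants max out at 60:
frac = warrant_fraction(0.40, floor=0.25, cap=0.60)
print(frac)  # about 0.214 of the bid paid back to Treasury in warrants
```

The point of the shape is incentive alignment: sellers can push the price up, but the further above the conservative bound it goes, the more of the upside taxpayers claw back in equity.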

Will it work? Who knows, but at least it may restore some confidence to credit markets.

Here are some old posts that really get into the nitty gritty of what is inside a typical CDO. You'll see that I've been covering credit securities since 2005 :-)

anatomy of a cdo

deep inside the subprime crisis

mackenzie on the credit crisis

gaussian copula and credit derivatives

Here's a recent NYTimes article that gives a peek into the complexity of structured finance.

NYTimes: ...Consider the Bear Stearns Alt-A Trust 2006-7, a $1.3 billion drop in the sea of risky loans. Here’s how it worked:

As the credit bubble grew in 2006, Bear Stearns, then one of the leading mortgage traders on Wall Street, bought 2,871 mortgages from lenders like the Countrywide Financial Corporation.

The mortgages, with an average size of about $450,000, were Alt-A loans — the kind often referred to as liar loans, because lenders made them without the usual documentation to verify borrowers’ incomes or savings. Nearly 60 percent of the loans were made in California, Florida and Arizona, where home prices rose — and subsequently fell — faster than almost anywhere else in the country.

Bear Stearns bundled the loans into 37 different kinds of bonds, ranked by varying levels of risk, for sale to investment banks, hedge funds and insurance companies.

If any of the mortgages went bad — and, it turned out, many did — the bonds at the bottom of the pecking order would suffer losses first, followed by the next lowest, and so on up the chain. By one measure, the Bear Stearns Alt-A Trust 2006-7 has performed well: It has suffered losses of about 1.6 percent. Of those loans, 778 have been paid off or moved through the foreclosure process.

But by many other measures, it’s a toxic portfolio. Of the 2,093 loans that remain, 23 percent are delinquent or in foreclosure, according to Bloomberg News data. Initially rated triple-A, the most senior of the securities were downgraded to near junk bond status last week. Valuing mortgage bonds, even the safest variety, requires guesstimates: How many homeowners will fall behind on their mortgages? If the bank forecloses, what will the homes sell for? Investments like the Bear Stearns securities are almost certain to lose value as long as home prices keep falling.

“Under the current circumstances it’s likely that you are going to take a loss on these loans,” said Chandrajit Bhattacharya, a mortgage strategist at Credit Suisse, the investment bank.

The Bear Stearns bonds are just one example of the kind of assets the government could buy, and they are by no means the most complicated of the lot. Wall Street took bonds like those of Bear Stearns and bundled and rebundled them into even trickier investments known as collateralized debt obligations, or C.D.O.’s.

“No two pieces of paper are the same,” said Mr. Feltus of Pioneer Investments.

On Wall Street, many of these C.D.O.’s have been selling for pennies on the dollar, if they are selling at all. In July, Merrill Lynch, struggling to bolster its finances, sold $31 billion of tricky mortgage-linked investments for 22 cents on the dollar. Last November, Citadel, a large hedge fund in Chicago, bought $3 billion of mortgage securities and other investments for 27 cents on the dollar.

But Citigroup, the financial giant, values similar investments on its books at 61 cents on the dollar. Citigroup says its C.D.O.’s are relatively high quality because they were created before lending standards weakened in 2006.

A big challenge for Treasury officials will be deciding whether to buy the troubled investments near the values at which the banks hold them on their books. That would help minimize losses for financial institutions. Driving a hard bargain, however, would protect taxpayers.
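The "pecking order" the NYT piece describes is a loss waterfall, and the mechanics are simple to sketch: losses are assigned to the most junior tranche first and only move up once it is wiped out. The tranche sizes below are hypothetical round numbers, not the actual 37-bond Bear Stearns deal structure.

```python
def allocate_losses(tranches, loss):
    """Toy loss waterfall. `tranches` is a senior-first list of
    (name, size) pairs; losses hit the most junior tranche first,
    then move up the capital structure."""
    remaining = loss
    hit = {}
    for name, size in reversed(tranches):   # start at the bottom of the stack
        taken = min(size, remaining)
        hit[name] = taken
        remaining -= taken
    return hit

# hypothetical $1.3bn deal: 80% senior, 15% mezzanine, 5% equity (in $mm)
deal = [("senior", 1040.0), ("mezz", 195.0), ("equity", 65.0)]
print(allocate_losses(deal, 100.0))  # equity wiped out, mezzanine dented, senior untouched
```

This is also why "only 1.6 percent losses so far" and "toxic" can both be true: a small aggregate loss can already have destroyed the junior tranches, and rising delinquencies threaten the layers above.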

Saturday, March 03, 2012

Derivatives history

I'm writing a review (for Physics World) of a recent book on the history of options pricing, and I'm collecting a few links here so I don't lose them. Please ignore this post unless you are interested in arcana ... the actual review will appear here eventually.

AFAIK, high energy physicist M.F.M. Osborne was the first to note log-normal behavior of stock prices. (Bachelier, who amazingly gets so much credit, proposed arithmetic Brownian motion, which neither fits the data nor makes logical sense.) Osborne's book is quite interesting as he explores market microstructure, market making, supply-demand (bid-ask) in detail, going far beyond the usual idealizations made by economists. I had a library copy out years ago but perhaps I should actually buy my own someday. Of course modern HFT types have gone far beyond Osborne's work in the 1950s.
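Osborne's point, and the logical problem with Bachelier's model, can be checked with a quick simulation: put the random walk on the logarithm of the price and prices stay positive (and come out log-normal); put it on the price itself and paths can go negative. Parameters here are arbitrary, with the volatility deliberately large so the flaw shows up within the sample.

```python
import numpy as np

rng = np.random.default_rng(1)
n_paths, n_steps = 2_000, 2_520               # 10 years of daily steps
mu, sigma, dt, s0 = 0.05, 0.5, 1 / 252, 100.0  # illustrative, high-vol parameters

z = rng.standard_normal((n_paths, n_steps))

# Osborne / geometric Brownian motion: the *log* of the price does the
# random walk, so the price itself is log-normal and always positive.
log_s = np.log(s0) + np.cumsum((mu - 0.5 * sigma**2) * dt
                               + sigma * np.sqrt(dt) * z, axis=1)
gbm = np.exp(log_s)

# Bachelier: arithmetic Brownian motion on the price itself -- over a long
# enough horizon some paths go below zero, which makes no sense for a stock.
abm = s0 + np.cumsum(mu * s0 * dt + sigma * s0 * np.sqrt(dt) * z, axis=1)

print(gbm.min() > 0.0, bool((abm < 0).any()))
```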

Mathematician Ed Thorp (of Beat the Dealer fame) obtained the Black-Scholes equation years before Black and Scholes, but kept it a secret in order to trade on it for his fund. He also first obtained the correct pricing for American options. That he was way beyond Black and Scholes intellectually seems pretty obvious to me. Thorp's web site.
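For reference, the formula in question, the Black-Scholes price of a European call, which Thorp reportedly had in working form before the famous 1973 paper:

```python
from math import log, sqrt, exp, erf

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def bs_call(s, k, t, r, sigma):
    """Black-Scholes European call: spot s, strike k, time to expiry t
    (years), risk-free rate r, volatility sigma."""
    d1 = (log(s / k) + (r + 0.5 * sigma**2) * t) / (sigma * sqrt(t))
    d2 = d1 - sigma * sqrt(t)
    return s * norm_cdf(d1) - k * exp(-r * t) * norm_cdf(d2)

# textbook check: at-the-money, 1 year, 5% rate, 20% vol
print(round(bs_call(100, 100, 1.0, 0.05, 0.2), 4))  # ≈ 10.4506
```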

I wish I could remember whether MacKenzie got all this right.

First regulated futures market involved trading of rice in 17th century Japan.
