Over the years, numerous researchers, including several Nobel laureates in economics, have devoted themselves to the study of quantitative finance and trading. I believe that, without them, finance today would not be what it is, so in this article I will describe some of the many papers that have had a profound impact on financial markets.

Whenever possible, I provide a link to the paper itself rather than to the journal where it was published. For obvious reasons, I cannot do so when the paper sits behind a paywall. Having said that, let’s get started!

Optimal Execution of Portfolio Transactions (Almgren & Chriss, 2000)

Link to paper

Robert Almgren was one of the pioneers of algorithmic execution and had a direct impact on the first implementations of automated order execution. This paper is a must-read piece of academic literature for every quantitative developer in charge of refining trade execution algorithms. The authors formalize a method of executing trades, and of measuring execution performance, by minimizing a combination of transaction costs and volatility risk.

An order execution algorithm splits a large order into smaller ones with the objective of reducing market impact. Doing so creates a trade-off between market impact and exposure to price risk, and this paper is one of the first to formalize and solve that problem.

In a nutshell, the authors indicate that the share price varies for two reasons:

  • Exogenous: The volatility of the market itself. These variations occur randomly and independently of the order that we want to carry out in the market.
  • Endogenous: The impact on the market caused by our own orders. The magnitude of this impact is inversely related to liquidity. The change can be either temporary (after our order causes an imbalance in the order book, the price reverts to normal) or permanent.

To minimize endogenous price changes, which are the only ones under our control, the authors suggest the use of execution algorithms and present a rigorous yet practical approach to implementing them and measuring their performance.

It is especially relevant for anyone optimizing the execution of portfolio rebalancing routines, among other strategies that do not require immediate execution.
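
To make the framework concrete, here is a minimal sketch of the well-known closed-form liquidation trajectory under the paper's standard assumptions (arithmetic random walk, linear temporary impact, mean-variance objective, continuous-time approximation of the urgency parameter). All parameter values are illustrative placeholders, not calibrated figures.

```python
import numpy as np

def almgren_chriss_trajectory(X, T, n_slices, sigma, eta, lam):
    """Optimal remaining-holdings schedule for liquidating X shares.

    X        : total shares to liquidate
    T        : liquidation horizon (e.g. in days)
    n_slices : number of equally spaced trading intervals
    sigma    : volatility of the price (in $ per share, per sqrt(day))
    eta      : temporary impact coefficient
    lam      : risk aversion; lam -> 0 approaches an even, TWAP-like schedule
    """
    kappa = np.sqrt(lam * sigma**2 / eta)        # "urgency" of the liquidation
    t = np.linspace(0.0, T, n_slices + 1)        # trading times t_0 .. t_N
    # Remaining position decays like sinh(kappa * (T - t)) / sinh(kappa * T)
    return X * np.sinh(kappa * (T - t)) / np.sinh(kappa * T)

holdings = almgren_chriss_trajectory(X=1_000_000, T=5, n_slices=10,
                                     sigma=0.95, eta=2.5e-6, lam=2e-6)
trades = -np.diff(holdings)                      # shares sold in each interval
print(np.round(trades))
```

A larger lam front-loads the schedule (selling faster to cut risk), while a smaller lam spreads the trades out to reduce impact, which is exactly the trade-off described above.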

The Long-Run Evolution of Energy Prices (Pindyck, 1999)

Link to paper

Written by Robert Pindyck, the paper shows that the prices of raw materials fluctuate around a long-run trend that takes the form of a convex quadratic function of time.

Gas, coal, and oil prices were analyzed for the period spanning 1916 to 1996, and the author concludes that the trend in oil prices shifts in line with the marginal cost of extracting the raw material and the estimated available reserves of the resource. For natural gas and coal, the tests carried out were inconclusive.
It should be noted that these findings only apply to non-renewable energy sources, since renewable ones cannot accumulate significant reserves.
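
As a toy illustration of what a quadratic time trend means in practice, the sketch below fits one to synthetic log prices with plain OLS; this is not Pindyck's actual estimation procedure, which also models mean reversion of prices around that trend.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.arange(80, dtype=float)                     # 80 synthetic "years" of data
# Synthetic log prices scattered around a convex quadratic trend
log_p = 1.0 - 0.02 * t + 2.0e-4 * t**2 + rng.normal(0, 0.05, t.size)

# OLS regression of log price on a constant, t and t^2
design = np.column_stack([np.ones_like(t), t, t**2])
beta, *_ = np.linalg.lstsq(design, log_p, rcond=None)
print("fitted trend coefficients:", np.round(beta, 5))
```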

A closed-form GARCH option valuation model (Heston & Nandi, 1997)

Link to paper

The paper presents a closed-form formula for valuing options on a spot asset whose variance follows a Generalized AutoRegressive Conditional Heteroskedasticity (GARCH) process.
Due to their sophistication and usefulness, GARCH models gained popularity for estimating volatilities during the 1990s, and the financial industry adopted them aggressively.
The reasons behind incorporating GARCH models as a means of estimating volatilities in the stock market can be summarized as follows:

  • Usual methods of calculating volatility produce a single fixed value, which is arguably sub-optimal. Since stock returns form a time series, it is reasonable to assume that their volatility changes over time.
  • Traditional methods also assume a fixed mean, another strong and limiting assumption when modeling stock returns. GARCH models allow researchers to relax these assumptions.

This paper proved incredibly popular in the financial industry since it allowed practitioners to value options with higher accuracy.
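
To make the time-varying volatility idea concrete, here is a minimal sketch of the textbook GARCH(1,1) variance recursion applied to synthetic returns. Note that Heston & Nandi work with their own GARCH specification, chosen so that the option formula stays in closed form; this sketch only illustrates the underlying idea, and its parameters are made up.

```python
import numpy as np

def garch_11_variance(returns, omega=1e-6, alpha=0.08, beta=0.90):
    """Conditional variance recursion:
    sigma2[t] = omega + alpha * returns[t-1]**2 + beta * sigma2[t-1]."""
    sigma2 = np.empty(len(returns))
    sigma2[0] = returns.var()                       # start at the sample variance
    for t in range(1, len(returns)):
        sigma2[t] = omega + alpha * returns[t - 1]**2 + beta * sigma2[t - 1]
    return sigma2

rng = np.random.default_rng(0)
r = rng.normal(0.0, 0.01, 500)                      # synthetic daily returns
ann_vol = np.sqrt(garch_11_variance(r) * 252)       # annualized conditional volatility
print(np.round(ann_vol[:5], 4))
```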

Portfolio Selection (Markowitz, 1952)

Link to paper

Published by Harry Markowitz in the Journal of Finance, this paper led to him winning the 1990 Nobel Prize in economics. The paper revolutionized the way professionals invest by formally introducing the concept of diversification. Markowitz showed that, for a given expected return, there exists an optimal portfolio that minimizes risk. Conversely, for a given level of risk, there exists a portfolio that maximizes the expected return.

In other words, the paper extended the familiar risk-return trade-off by incorporating the correlations between assets into the calculations.

The intuition behind his formal framework is very simple. For example, since most cryptocurrencies have an extremely high correlation with respect to Bitcoin, holding lots of different cryptocurrencies does not greatly reduce the overall risk of the portfolio. On the other hand, investing in a handful of assets and commodities from different industries does reduce the total risk to a greater extent.
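
As a small illustration of the mean-variance machinery, the sketch below computes the portfolio of minimum variance for a target expected return using the standard closed-form solution with short sales allowed; the expected returns and the covariance matrix are made up for the example.

```python
import numpy as np

def min_variance_weights(mu, Sigma, target_return):
    """Minimize w' Sigma w subject to w' mu = target_return and sum(w) = 1."""
    ones = np.ones(len(mu))
    Sinv = np.linalg.inv(Sigma)
    A = ones @ Sinv @ ones
    B = ones @ Sinv @ mu
    C = mu @ Sinv @ mu
    D = A * C - B**2
    lam = (A * target_return - B) / D     # multiplier on the return constraint
    gam = (C - B * target_return) / D     # multiplier on the budget constraint
    return Sinv @ (lam * mu + gam * ones)

mu = np.array([0.08, 0.12, 0.05])                   # illustrative expected annual returns
Sigma = np.array([[0.040, 0.006, 0.002],            # illustrative covariance matrix
                  [0.006, 0.090, 0.003],
                  [0.002, 0.003, 0.010]])
w = min_variance_weights(mu, Sigma, target_return=0.09)
print("weights:", np.round(w, 3), "portfolio variance:", round(float(w @ Sigma @ w), 4))
```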

A New Interpretation of Information Rate (Kelly, 1956)

Link to paper

This paper is the formal presentation of the now-famous Kelly Criterion, a model used extensively in casino games in particular and in risk management in general. The author derives a formula that determines the optimal size of an allocation in order to maximize wealth growth over time.

The formula is widely used in quantitative finance because it maximizes the expected logarithm of wealth, or equivalently the expected geometric growth rate.
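
For the simplest case of a repeated binary bet, the criterion reduces to the well-known fraction f* = p - (1 - p) / b, as in this small sketch:

```python
def kelly_fraction(p, b):
    """Kelly fraction for a binary bet with win probability p and net odds b
    (win b per unit staked, lose the stake otherwise).
    A negative result means there is no edge and the bet should be skipped."""
    return p - (1 - p) / b

# A 55% chance of doubling the stake at even odds -> bet 10% of the bankroll
print(round(kelly_fraction(p=0.55, b=1.0), 2))
```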

Capital Asset Prices: A Theory of Market Equilibrium under Conditions of Risk (Sharpe, 1964)

Link to Journal

The famous Capital Asset Pricing Model (CAPM) was developed by William Sharpe in this 1964 paper (and, independently, by John Lintner). Building on the aforementioned work of Harry Markowitz (1952), it is used to calculate the return that can be expected when investing in a financial asset, given the risk that it entails.

By assuming that all investors possess the same information, receive it at the same time, and process it in the same way, the CAPM demonstrates that there is only one efficient portfolio of risky assets, called the “market portfolio”. In contrast to Markowitz (1952), the CAPM concludes that the only way to choose the risk-return profile of a portfolio is by combining the market portfolio with a risk-free asset (e.g., T-bills).
The paper also introduces the famous notion of Beta, now the most popular measure of the sensitivity of an asset’s returns to those of the overall market.
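
As a quick illustration, Beta can be estimated as the covariance between an asset's returns and the market's returns divided by the market variance, and then plugged into the CAPM relation E[R_i] = R_f + Beta_i * (E[R_m] - R_f). The data and the risk-free/premium figures below are synthetic.

```python
import numpy as np

rng = np.random.default_rng(1)
market = rng.normal(0.0004, 0.01, 250)                      # synthetic daily market returns
asset = 0.0001 + 1.3 * market + rng.normal(0, 0.008, 250)   # asset built with a "true" beta of ~1.3

# Beta = Cov(asset, market) / Var(market)
beta = np.cov(asset, market)[0, 1] / np.var(market, ddof=1)

# CAPM expected return: E[R_i] = R_f + beta * (E[R_m] - R_f)
risk_free, market_premium = 0.03, 0.06                      # illustrative annual figures
expected_return = risk_free + beta * market_premium
print(round(beta, 2), round(expected_return, 3))
```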

The CAPM is based on the following assumptions:

  • Investors are risk averse: the higher the risk, the higher the return they will require.
  • Increasing risk by suboptimally diversifying is not rewarded with higher returns.
  • All investors have access to the same information at the same time and process it in the same way.

Incorporating Signals into Optimal Trading (Lehalle & Neuman, 2017)

Link to paper

Much like the work of Almgren & Chriss (2000), this paper deals with optimal trade execution. The authors further refine the field by incorporating a Markovian signal into the optimal trading framework and derive an optimal trading strategy for the special case in which the signal follows an Ornstein-Uhlenbeck (mean-reverting) process.

They also use historical tick data to show that order book imbalance predicts future price movements and itself displays mean-reverting properties.
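
For readers unfamiliar with the Ornstein-Uhlenbeck process, the sketch below simulates such a mean-reverting signal with a simple Euler-Maruyama discretization; the parameters are illustrative and are not taken from the paper.

```python
import numpy as np

rng = np.random.default_rng(42)
kappa, mu, sigma = 5.0, 0.0, 0.3      # mean-reversion speed, long-run mean, volatility
dt, n = 1 / 252, 1000                 # daily steps, number of observations

x = np.empty(n)
x[0] = 0.5                            # start away from the long-run mean
for t in range(1, n):
    # Euler-Maruyama step: dx = kappa * (mu - x) * dt + sigma * dW
    x[t] = x[t - 1] + kappa * (mu - x[t - 1]) * dt + sigma * np.sqrt(dt) * rng.normal()

print("signal mean:", round(x.mean(), 3))   # hovers near the long-run mean mu
```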

Efficient Capital Markets: A Review of Theory and Empirical Work (Fama, 1970)

Link to paper

This paper is the seminal work that formalizes the now widely known “Efficient Market Hypothesis”. The author lays out a theory of information and market efficiency and argues that asset prices already accurately reflect all the relevant information available.

Assuming this is true, the author concludes that prices follow a stochastic process known as a “random walk”, making it impossible to predict the future price of assets.

Fama presents three definitions of efficiency, each assuming a different set of information being reflected in prices: weak, semi-strong, and strong.

  • Weak Form: prices only reflect the information contained in historical prices. Under this hypothesis, technical analysis is of no use. This definition does not rule out the validity of fundamental analysis and allows for the possibility that not all relevant information is public.
  • Semi-Strong Form: prices reflect, in addition to historical prices, all other public information, for example quarterly balance sheets, reverse split announcements, dividends, etc. In this case, fundamental analysis ceases to be useful. Under this definition, above-average risk-adjusted returns can only be obtained through insider information, which is illegal in most countries.
  • Strong Form: prices reflect all past information, both public and private. In such a market, no one can consistently earn abnormal returns. The author indicates that this version is merely a theoretical construct, useful for modeling but practically impossible to find in real life.

Most academics and quantitative traders agree that real markets tend to follow the semi-strong definition.

The Pricing of Options and Corporate Liabilities (Black & Scholes, 1973)

Link to paper

You might recognize the authors’ names, and for good reason: this paper presents the Black-Scholes model for pricing options. Published in 1973, it used the heat equation from physics as a starting point to derive the price of an option. It is also the model most commonly used to back out implied volatility from observed option prices.

Further refinements and extensions have been developed since its publication, mostly by relaxing one of the assumptions required for the closed-form solution. Because the model is formulated as a partial differential equation, numerical methods can be applied whenever a closed-form solution is not achievable.
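
For reference, here is a minimal sketch of the closed-form Black-Scholes price of a European call on a non-dividend-paying asset (it assumes SciPy is available for the normal CDF):

```python
import numpy as np
from scipy.stats import norm

def black_scholes_call(S, K, T, r, sigma):
    """Black-Scholes price of a European call on a non-dividend-paying asset.
    S: spot, K: strike, T: time to expiry in years,
    r: risk-free rate, sigma: annualized volatility."""
    d1 = (np.log(S / K) + (r + 0.5 * sigma**2) * T) / (sigma * np.sqrt(T))
    d2 = d1 - sigma * np.sqrt(T)
    return S * norm.cdf(d1) - K * np.exp(-r * T) * norm.cdf(d2)

print(round(black_scholes_call(S=100, K=105, T=0.5, r=0.02, sigma=0.25), 2))
```

Implied volatility is obtained by inverting this formula numerically, for example with a root-finder, given an observed option price.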

Does the Stock Market Overreact? (De Bondt & Thaler, 1985)

Link to Journal

Contrary to the efficient market hypothesis widely assumed in the academic literature, De Bondt & Thaler found statistically significant evidence of the opposite. By showing that, in violation of Bayes’ rule, investors tend to overreact to unexpected news events, they uncovered substantial weak-form inefficiencies: historical price information is not correctly incorporated into market prices.

If you’re interested in diving into the fascinating field of behavioral finance, this paper is an excellent resource to get you started.

Conclusion

In this article, I tried to provide an extensive, yet by no means exhaustive, list of papers that any quantitatively minded person interested in the stock market should be familiar with. Some of them require no prior formal training, whereas others demand some mathematical maturity due to their very theoretical approach.

Also, as you might have noticed, these papers cover a broad range of topics, and each is only one of many important resources in its field of study. It goes without saying that specializing in any of these topics requires further research.
