NATIONAL BUREAU OF ECONOMIC RESEARCH

Market Design

October 20-21, 2017
Michael Ostrovsky of Stanford University and Parag A. Pathak of MIT, Organizers

Haluk Ergin, the University of California at Berkeley, and Tayfun Sönmez and Utku Ünver, Boston College

Efficient and Incentive Compatible Liver Exchange

The liver is the second most commonly transplanted organ from living donors. A living donor can usually donate either the smaller left lobe or the larger right lobe, and left-lobe donation is substantially less risky for the donor. Because liver transplantation requires size compatibility in addition to blood-type compatibility, doctors often resort to right-lobe donation due to the organ shortage. To remedy the shortage, living-donor liver exchange has already been utilized in some countries. Ergin, Sönmez, and Ünver model liver exchange as a matching market design problem. They first introduce an algorithm that finds a two-way efficient matching when only left-lobe donation is feasible and there are only two sizes of donors and patients. The researchers then introduce a Pareto-efficient, individually rational, and incentive-compatible mechanism to elicit the willingness of donors in the liver exchange pool to donate their right lobes, and extend this approach to any number of patient and donor sizes. The approach is quite general and introduces a new class of mechanisms for bilateral exchange problems with weak preferences induced by a multi-dimensional vector partial order. Using simulations, Ergin, Sönmez, and Ünver show that the decrease in the number of transplants caused by the incentive-compatibility requirement is very small, while the number of transplants can increase substantially as liver exchange is utilized.


Nikhil Agarwal, MIT and NBER; Itai Ashlagi, Stanford University; Paulo J. Somaini, Stanford University and NBER; Michael A. Rees, the University of Toledo Medical Center; and Daniel C. Waldinger, MIT

An Empirical Framework for Sequential Assignments: The Allocation of Deceased Donor Kidneys

A transplant can improve a patient's life while saving several hundred thousand dollars in healthcare expenditures. Organs from deceased donors, like many other common-pool resources (e.g., public housing, child-care slots, publicly funded long-term care), are rationed via a waitlist. The efficiency and equity properties of design choices such as penalties for refusing offers or object-type-specific lists are not well understood and depend on agent preferences. Agarwal, Ashlagi, Rees, Somaini, and Waldinger establish an empirical framework for analyzing the trade-offs involved in waitlist design and apply it to study the allocation of deceased-donor kidneys. They model the decision to accept an offer from a waiting list as an optimal stopping problem and use it to estimate the value of accepting various kidneys. The researchers' estimated values for various kidneys are highly correlated with predicted patient outcomes as measured by life-years from transplantation (LYFT). While some types of donors are preferable for all patients (e.g., young donors), there is substantial heterogeneity in willingness to wait for good donors and also substantial match-specific heterogeneity in values (due to biological similarity). The researchers find that the high willingness to wait for good donors, without regard to the effects of these decisions on others, results in agents being too selective relative to the social optimum. This suggests that mild penalties for refusal (e.g., loss in priority) may improve efficiency. Similarly, the heterogeneity in willingness to wait for young, healthy donors suggests that separate queues by donor quality may increase efficiency by inducing sorting without significantly hurting assignments based on match-specific payoffs.


Eric Budish, the University of Chicago and NBER; Robin S. Lee, Harvard University and NBER; and John Shim, the University of Chicago

Will the Market Fix the Market? A Theory of Stock Market Competition and Innovation

As of early 2017 there are 12 stock exchanges in the US, across which 1.5 trillion shares ($60 trillion) are traded annually. All 12 exchanges use the continuous limit order book market design, a design that causes latency arbitrage and the associated high-frequency trading arms race (Budish, Cramton and Shim 2015). Will the market adopt new market designs, such as frequent batch auctions (FBA) that address the negative aspects of high-frequency trading? Budish, Lee, and Shim build a simple new model of stock exchange competition to address this question. The model, which is guided by institutional details of the US equities market, shows that under the status quo market design: (i) the 12 distinct exchanges aggregate up into a "single virtual platform"; (ii) competition among exchanges is fierce on the dimension of traditional trading fees; but (iii) exchanges have market power in the sale of exchange-specific speed technology — arms for the arms race — from which they earn economic rents. The researchers use a variety of data to empirically validate these three sets of results. They then use the model to study the private and social incentives for market design innovation. If a new exchange enters with a market design that eliminates latency arbitrage (e.g., FBA), it would win share and tip other exchanges into also adopting the new design; perhaps surprisingly, the usual coordination problems associated with getting a new market design off the ground would not be an issue. However, the researchers find that the private returns to introducing the new design are zero for a de novo entrant and negative for an incumbent, in contrast with social returns that are large. There are two sources of this tension. First is a version of the classic problem of non-excludability, leading to competitive trading fees and no economic profits for the innovator. Second is incumbents' rents from speed technology. Budish, Lee, and Shim conclude with policy implications. 
Despite the pessimistic results, their analysis does not imply that a market-wide market design mandate is necessary. Rather, the model points to a more circumscribed policy response that would tip the balance of incentives and encourage the "market to fix the market."


Albert "Pete" Kyle, the University of Maryland, and Jeongmin Lee, Washington University in St. Louis

Toward a Fully Continuous Exchange

Kyle and Lee propose continuous scaled limit orders to implement Fischer Black's vision of financial markets. By making trading continuous in price, quantity, and time, continuous scaled limit orders eliminate the rents high-frequency traders earn by exploiting artifacts of the current market design. By avoiding time priority, this new order type protects slow traders from being picked off by high-frequency traders and makes high-frequency traders compete among themselves. All traders, regardless of their technological capacity, can optimally spread trades out over time to minimize adverse price impact. Organized exchanges should move not toward more discreteness but toward full continuity.


Paul Milgrom and Ilya Segal, Stanford University

Deferred-Acceptance Clock Auctions and Radio Spectrum Reallocation

Milgrom and Segal demonstrate how a deferred-acceptance (DA) clock auction for procurement chooses winning bids by reducing prices in each round to the least attractive current bids. In contrast to Vickrey auctions, DA clock auctions for single-minded bidders are obviously strategy-proof and group strategy-proof, preserve winners' privacy, avoid intractable optimizations, can incorporate the auctioneer's budget constraint, and set prices to be no higher than either competitive equilibrium or Nash equilibrium in the related first-price auction. In simulations based on the US Incentive Auction, the DA clock auction used by the FCC leads to nearly efficient outcomes at a lower cost than a Vickrey auction while using a fraction of the computational effort.
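The descending-clock logic can be illustrated with a minimal sketch. The setting here is stylized, not the FCC design: the buyer must procure from exactly k of n single-minded sellers, the "least attractive" current bid is simply the highest offered price, and all names, costs, and parameters are hypothetical.

```python
# Hedged sketch of a descending-clock deferred-acceptance (DA) procurement
# auction. Stylized assumptions (not from the paper): the buyer must retain
# exactly k sellers, and "least attractive bid" means the highest current
# clock price. Costs and prices below are illustrative only.

def da_clock_auction(costs, k, decrement=1.0, start_price=100.0):
    """Repeatedly cut the least attractive (highest) active offer; a seller
    irrevocably exits once her offered price drops below her cost. Stops
    when exactly k sellers remain active; they win at their clock prices."""
    offers = {i: start_price for i in range(len(costs))}  # current clock prices
    active = set(offers)
    while len(active) > k:
        i = max(active, key=lambda j: offers[j])  # least attractive current bid
        offers[i] -= decrement
        if offers[i] < costs[i]:                  # seller exits the auction
            active.remove(i)
    return {i: offers[i] for i in sorted(active)}

winners = da_clock_auction(costs=[10, 30, 55, 80], k=2)
```

Note that each winner's payment is pinned down by when her rivals exit, not by her own cost, which is the source of the (obvious) strategy-proofness: in the example, the two lowest-cost sellers win at a price near the third-lowest cost.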


Lawrence Ausubel, the University of Maryland; Christina Aperjis, Power Auctions LLC; and Oleg V. Baranov, the University of Colorado Boulder

Market Design and the FCC Incentive Auction

Market-based mechanisms for the allocation of spectrum licenses have been a prominent feature of the telecommunications landscape for more than two decades. Generally, spectrum auctions have taken the spectrum's use as given and have sought to allocate licenses efficiently. In March 2017, the FCC completed the Incentive Auction, the first-ever auction that incorporated voluntary clearing directly into the allocation mechanism. As such, it is the most prominent auction to date that has sought to let the market determine not only who uses the spectrum but how it is used. In this article, Ausubel, Aperjis, and Baranov review the design and assess the outcome of the FCC Incentive Auction.


Ulrich Doraszelski and Katja Seim, the University of Pennsylvania and NBER; Michael Sinkinson, Yale University and NBER; and Peichun Wang, the University of Pennsylvania

Ownership Concentration and Strategic Supply Reduction (NBER Working Paper No. 23034)

Doraszelski, Seim, Sinkinson, and Wang explore the sensitivity of the U.S. government's ongoing incentive auction to multi-license ownership by broadcasters. The researchers document significant broadcast TV license purchases by private equity firms prior to the auction and perform a prospective analysis of the effect of ownership concentration on auction outcomes. They find that multi-license holders are able to raise spectrum acquisition costs by 22% by strategically withholding some of their licenses to increase the price for their remaining licenses. A proposed remedy reduces the distortion in payouts to license holders by up to 80%, but lower participation could greatly increase payouts and exacerbate strategic effects.


New Directions: Transportation and Market Design

Michael Ostrovsky, Stanford University, and Michael Schwarz, Google Research

Carpooling and the Economics of Self-Driving Cars


Peter Cramton, the University of Maryland; Richard Geddes, Cornell University; and Axel Ockenfels, the University of Cologne

Markets for Road Use: Eliminating Congestion through Scheduling, Routing, and Real-Time Road Pricing

Traffic congestion is a global problem with annual costs approaching $1 trillion. The cost of traffic congestion across the combined British, French, German and American economies was estimated at $200 billion, or about 0.8 percent of GDP, in 2013. In Los Angeles alone, traffic jams cost $23 billion each year. The health and environmental costs are severe in urban centers worldwide. With the right policies those high social costs can be avoided. Advances in mobile communications and computer technology now make it possible to efficiently schedule, route, and price the use of roads. Efficient real-time pricing of road use can eliminate traffic congestion, enhance safety, improve the environment, and increase vehicle throughput. It also raises reliable, much-needed revenue to modernize decaying infrastructure while improving the allocation of transportation investment. Cramton, Geddes, and Ockenfels describe the design of a market for road use and transportation that is based on efficient scheduling, routing, and pricing. Under their design, road use is priced dynamically by marginal demand during constrained times and locations. In unconstrained times and locations, a nominal fee is paid for road use to recover costs, as in other utilities. Transport is scheduled based on forward prices and then routed in real time based on real-time road-use prices. Efficient pricing of network capacity is not new. Indeed, wholesale electricity markets have been dynamically priced for over a decade. Communications markets are adopting dynamic pricing today. Efficient pricing of road use, however, has only recently become feasible. Advances in mobile communications make it possible to identify and communicate the location of a vehicle to within one cubic meter — allowing precise measurement of road use. User preferences can be communicated both in advance to determine scheduled transport and in real time to optimize routes based on the latest information. 
Computer advances also facilitate efficient scheduling and pricing of road use. Consumer apps help road users translate detailed price information into preferred transport plans. Computers also allow an independent system operator to better model demand and adjust prices to eliminate congestion and maximize the total value of road infrastructure. An independent market monitor, distinct from the operator, observes the market, identifies problems, and suggests solutions. A board governs the market subject to regulatory oversight. The market objective is to maximize the value of road infrastructure via scheduling, routing, and real-time pricing of its use. The optimization of road use eliminates congestion, making roads safer, faster, cleaner and more enjoyable to use. The road-use market thus maximizes the value of existing transport infrastructure while simultaneously providing essential funding for the roads network as well as valuable price information to evaluate road enhancements. The market is highly complementary with and indeed promotes rapid innovation in the transport sector.


Juan Camilo Castillo, Stanford University; Dan Knoepfle, Uber; and Glen Weyl, Microsoft Research

Surge Pricing Solves the Wild Goose Chase

Ride-hailing apps introduced a more efficient matching technology than traditional taxis (Cramer and Krueger, 2016), with potentially large welfare gains under the appropriate market design. However, Castillo, Knoepfle, and Weyl show that when the price is too low, these platforms fall into a failure mode, first pointed out by Arnott (1996), that leads to market collapse. An over-burdened platform is depleted of idle drivers on the streets and is forced to send cars on a wild goose chase to pick up distant customers. These chases occupy cars, reducing the number of customers served and drivers' earnings, effectively removing drivers from the road and exacerbating the problem. The researchers use data from Uber to show that wild goose chases are indeed a problem in the Manhattan market. The effects of wild goose chases dominate more traditional price-theoretic considerations and imply that welfare and profits fall dramatically as the price falls below a certain threshold, while they change only gradually with price above this point. A platform forced to charge uniform prices over time will therefore have to set very high prices to avoid catastrophic chases. Dynamic "surge pricing" can avoid these high prices while keeping the system functioning when demand is high.


Parag A. Pathak and Peng Shi, MIT

How Well Do Structural Demand Models Work? Counterfactual Predictions in School Choice

Discrete choice models of demand are widely used for counterfactual policy simulations, yet their out-of-sample performance is rarely assessed. Pathak and Shi use a large-scale policy change in Boston to investigate the performance of discrete choice models of school demand. In 2013, Boston Public Schools considered several new choice plans that differ in where applicants can apply. At the request of the mayor and district, the researchers estimated discrete choice demand models to forecast the effects of these alternatives. This work led to the adoption of a plan that significantly altered choice sets for thousands of applicants. Pathak and Shi (2014) update forecasts prior to the policy change and describe prediction targets involving access, travel, and unassigned students. Here, the researchers assess how well these ex ante counterfactual predictions compare to the actual choices made under the new choice sets. For equilibrium outcomes, a simple ad hoc model performs as well as the more complicated structural choice models for one of the two grades they examine. However, the inconsistent performance of the structural models is largely due to prediction errors in the characteristics of applicants, which are auxiliary inputs. Once the researchers condition on the characteristics of the actual applicants, the structural choice models outperform the ad hoc alternative in predicting both equilibrium outcomes and choice patterns. Moreover, refitting the models using the new choice data does not significantly improve their prediction accuracy, suggesting that the choice models are indeed "structural" and are robust across the reform. Pathak and Shi's findings show that structural choice models can be effective in predicting counterfactual outcomes, as long as there are accurate forecasts about auxiliary input variables.


Georgy Artemov, the University of Melbourne; Yeon-Koo Che, Columbia University; and Yinghua He, Rice University

Strategic 'Mistakes': Implications for Market Design Research

Using a rich data set on Australian college admissions, Artemov, Che, and He show that a non-negligible fraction of applicants adopt strategies that are unambiguously dominated; however, the majority of these 'mistakes' are payoff irrelevant. In a model where colleges rank applicants strictly, the researchers demonstrate that such strategic mistakes jeopardize the empirical analysis based on the truth-telling hypothesis but not the one based on a weaker stable-matching assumption. Artemov, Che, and He's Monte Carlo simulations further illustrate this point and quantify the differences among the methods in the estimation of preferences and in a hypothetical counterfactual analysis.


Jacob D. Leshno and Irene Y. Lo, Columbia University

The Cutoff Structure of Top Trading Cycles in School Choice

The prominent Top Trading Cycles (TTC) mechanism has attractive properties for school choice, as it is strategy-proof, Pareto efficient, and allows school boards to guide the assignment by specifying priorities. However, the common combinatorial description of TTC does little to explain the relationship between student priorities and their eventual assignment. This creates difficulties in transparently communicating TTC to parents and in guiding the policy choices of school boards. Leshno and Lo show that the TTC assignment can be described by n² admission thresholds, where n is the number of schools. These thresholds can be observed after the mechanism is run, and can serve as non-personalized prices that allow parents to verify their assignment. In a continuum model these thresholds can be computed directly from the distribution of preferences and priorities, providing a framework that can be used to evaluate policy choices. The researchers provide closed-form solutions for the assignment under a family of distributions and derive comparative statics. As an application of the model, they solve for the welfare-maximizing investment in school quality, and find that a more egalitarian investment can be more efficient because it promotes more efficient sorting by students.
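For readers unfamiliar with the combinatorial procedure that the admission thresholds summarize, here is a minimal sketch of TTC for school choice. The assumptions are simplifying and not from the paper (strict preferences and priorities, every student ranks every school), and all names are hypothetical.

```python
# Minimal sketch of the Top Trading Cycles (TTC) procedure for school
# choice. Assumptions (illustrative, not from the paper): strict
# preferences and priorities, and every student ranks every school.

def ttc(prefs, priorities, capacity):
    """prefs: student -> ranked schools; priorities: school -> ranked
    students; capacity: school -> seats. Returns student -> school."""
    assignment, cap = {}, dict(capacity)
    students, schools = set(prefs), set(capacity)
    while students and schools:
        # each student points to her favorite remaining school; each
        # school points to its highest-priority remaining student
        s_pt = {s: next(c for c in prefs[s] if c in schools) for s in students}
        c_pt = {c: next(s for s in priorities[c] if s in students) for c in schools}
        # follow pointers until a node repeats: a trading cycle always exists
        seen, node = [], next(iter(students))
        while node not in seen:
            seen.append(node)
            node = c_pt[s_pt[node]]
        cycle = seen[seen.index(node):]
        for s in cycle:          # students in the cycle trade priorities
            c = s_pt[s]          # and each receives the school she points to
            assignment[s] = c
            cap[c] -= 1
            if cap[c] == 0:
                schools.remove(c)
        students -= set(cycle)
    return assignment

prefs = {"s1": ["B", "A"], "s2": ["A", "B"], "s3": ["A", "B"]}
priorities = {"A": ["s1", "s2", "s3"], "B": ["s2", "s1", "s3"]}
match = ttc(prefs, priorities, {"A": 1, "B": 2})
```

In the example, s1 and s2 trade: s1 has top priority at A but prefers B, while s2 has top priority at B but prefers A; the paper's contribution is to describe the resulting assignment with cutoffs rather than by tracing such cycles.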


Esen Onur, David Reiffen, and Lynn Riggs, Commodity Futures Trading Commission; and Haoxiang Zhu, MIT and NBER

Mechanism Selection and Trade Formation on Swap Execution Facilities: Evidence from Index CDS

The Dodd-Frank Act mandates that certain standard OTC derivatives be traded on swap execution facilities (SEFs). Onur, Reiffen, Riggs, and Zhu provide a granular analysis of SEF trading mechanisms, using message-level data for May 2016 from the two largest customer-to-dealer SEFs in index CDS markets. Both SEFs offer customers various execution mechanisms that differ in how widely customers' trading interests are exposed to dealers. A theoretical model shows that although exposing an order to more dealers increases competition, it also causes a more severe winner's curse. Consistent with this trade-off, the data show that customers contact fewer dealers if the trade size is larger or nonstandard. Dealers are more likely to respond to customers' inquiries if fewer dealers are involved in the competition, if the notional size is larger, or if more dealers are making markets. Finally, dealers' quoted spreads and customers' transaction costs increase in notional quantity and in the number of dealers involved. Onur, Reiffen, Riggs, and Zhu's results contribute to the understanding of swaps markets by providing insights into investors' and dealers' revealed preferences and strategic behaviors.


Constantinos Daskalakis, MIT; Christos H. Papadimitriou, the University of California at Berkeley; and Christos Tzamos, Microsoft Research

Does Information Revelation Improve Revenue?

Daskalakis, Papadimitriou, and Tzamos study the problem of optimal auction design in a valuation model, explicitly motivated by online ad auctions, in which there is two-way informational asymmetry: private information is available to both the seller (the item's type) and the bidders (their own types), and each bidder's value for the item depends on both his own type and the item's type. Importantly, the researchers allow arbitrary auction formats involving, potentially, several rounds of signaling from the seller and decisions by the bidders, and seek the optimal co-design of signaling and auction (they call this optimum the "optimum augmented auction"). Daskalakis, Papadimitriou, and Tzamos characterize the optimum augmented auction for their valuation model exactly, by establishing its equivalence with a multi-item Bayesian auction with additive bidders. Surprisingly, in the optimum augmented auction there is no signaling whatsoever; in fact, the seller need not access the available information about the item's type until after the bidders choose their bids. Sub-optimal solutions to this problem, which have appeared in the recent literature, are shown to correspond to well-studied ways of approximating multi-item auctions by simpler formats: grand bundling (corresponding to Myerson's auction without any information revelation), selling items separately (Myerson's auction preceded by full information revelation as in [FJM+12]), and fractional partitioning (Myerson's auction preceded by optimal signaling). Consequently, all these solutions are separated by large approximation gaps from the optimum revenue.


Dirk Bergemann, Yale University, and Tibor Heumann and Stephen Morris, Princeton University

Information and Market Power

Bergemann, Heumann, and Morris analyze demand function competition with a finite number of agents and private information. The researchers show that the nature of the private information determines the market power of the agents and thus the price and volume of equilibrium trade. The researchers establish their results by providing a characterization of the set of all joint distributions over demands and payoff states that can arise in equilibrium under any information structure. In demand function competition, the agents condition their demand on the endogenous information contained in the price. Bergemann, Heumann, and Morris compare the set of feasible outcomes under demand function competition to the feasible outcomes under Cournot competition. They find that the first and second moments of the equilibrium distribution respond very differently to the private information of the agents under these two market structures. The first moment of the equilibrium demand, the average demand, is more sensitive to the nature of the private information in demand function competition, reflecting the strategic impact of private information. By contrast, the second moments are less sensitive to the private information, reflecting the common conditioning on the price among the agents.


New Directions: Development Economics and Market Design

Jean-François Houde, Cornell University and NBER; Terence R. Johnson, the University of Notre Dame; Molly Lipscomb, the University of Virginia; and Laura A. Schechter, the University of Wisconsin at Madison

Using Market Mechanisms to Increase the Take-up of Improved Sanitation in Senegal

Many markets in developing countries, particularly those for public services, are marked by inefficiencies stemming from mismatches between demand and supply and from market power. Houde, Johnson, Lipscomb, and Schechter instituted just-in-time procurement auctions for desludging services in Dakar, Senegal over three years and test the effect of increased competition on prices and take-up of the services. The auctions ran from June 2013 through July 2016 and comprised 4,674 procurement auctions with 104 desludging operators. The researchers supplement the auctions data with survey data from 5,991 households around Dakar and compare prices between the auctions and the general market: prices in the auctions are 7% lower than in the general market. They show that auctions in which more desludgers are invited are more competitive and that much of the cost of distance is priced into the bids. They estimate that the expected profit of participating firms is approximately 895 CFA, or 17% of the realized profit margin of the winning firm. Assuming that the current winners in the auctions are retained, the researchers find that perfect competition would lower prices by an additional 20%, increasing the percentage of prices accepted by consumers by 70%. Houde, Johnson, Lipscomb, and Schechter simulate the effect of inviting more active, lower-cost bidders to the auctions, and find that prices could decrease by up to an additional 13% and acceptances by households could increase by 29% if more bidders are chosen from among those who bid actively on the platform.


Reshmaan N. Hussam, Yale University, and Natalia Rigol and Benjamin N. Roth, MIT

Targeting High Ability Entrepreneurs Using Community Information: Mechanism Design in the Field

One of the most difficult problems in development is that the cost of assessing credit risk for small businesses often makes lending unprofitable. In a field experiment in Maharashtra, India, Hussam, Rigol, and Roth asked 1,345 entrepreneurs to rank their peers on various metrics of business profitability, growth, and entrepreneur characteristics. To assess the validity of these predictions, the researchers then randomly distributed cash grants of about USD 100 to a third of these entrepreneurs. They find that information provided by community members is predictive of the marginal return to capital. The researchers horse-race the community rankings against a machine-learning prediction and find that, while the machine-learning exercise is able to predict high-return entrepreneurs, community information continues to correctly predict returns of 23% per month for the top entrepreneurs even after controlling for the machine-learning prediction. The researchers use experimental variation in the design of surveys to demonstrate agency problems in obtaining community information when community members know the purpose of the information-gathering exercise. Hussam, Rigol, and Roth conclude by demonstrating how mechanism design can be used to address these agency problems: monetary incentives for accuracy, eliciting reports in public, and cross-reporting techniques motivated by implementation theory all significantly improve the accuracy of reports.


Yusuke Narita, Yale University

Experimental Design as Market Design: Billions of Dollars Worth of Treatment Assignments

Randomized controlled trials (RCTs) enroll hundreds of millions of people and involve many human lives. In this paper, Narita proposes an RCT design for high-stakes treatments. Unlike a conventional RCT, this design respects subject welfare: it optimally randomly assigns each treatment to subjects predicted to experience better treatment effects, or to subjects with stronger preferences for the treatment. For preference elicitation, Narita's design is also almost incentive compatible. Finally, the design produces unbiased estimates of any causal effect estimable with a standard RCT. To quantify these properties, Narita applies the proposal to a water-cleaning experiment in Kenya (Kremer et al., 2011). Compared to a standard RCT, the design substantially improves subjects' well-being while reaching similar treatment-effect estimates with similar precision.


 
National Bureau of Economic Research, 1050 Massachusetts Ave., Cambridge, MA 02138; 617-868-3900; email: info@nber.org
