Economics of Digitization

February 21, 2014
Shane Greenstein of Northwestern University, Josh Lerner of Harvard Business School, and Scott Stern of MIT, Organizers

Jin-Hyuk Kim, University of Colorado, and Tin Cheuk Leung, Chinese University of Hong Kong

Quantifying the Impacts of Digital Rights Management and E-Book Pricing on the E-Book Reader Market

The introduction of e-book readers such as Amazon's Kindle has been an important driving force behind the success of the e-book market. Kim and Leung quantify the impacts of digital rights management (DRM) and discounted e-book pricing, two of the most controversial issues in this market, on the demand for e-book readers. Using conjoint survey data, the authors estimate a random coefficient model using a hierarchical Bayesian method. Their counterfactual experiments suggest two things. First, if Kindle or Nook were to drop DRM unilaterally, its market share would increase moderately, with only a slight effect on other readers; consumer welfare would increase by 7 percent if all readers dropped DRM. Second, an increase in e-book prices would increase the iPad's market share moderately at the expense of Kindle and Nook; consumer welfare would decrease by 6 to 10 percent if Kindle's and Nook's e-book prices went up by 50 percent.

Yongdong Liu, Denis Nekipelov, and Minjung Park, University of California, Berkeley

Timely versus Quality Innovation: The Case of Mobile Applications on iTunes and Google Play

iTunes and Google Play are dominant platforms where users of portable electronic devices with Apple and Android operating systems can purchase and download applications for those devices. The applications ("apps") are developed and brought to the platforms by a large number of independently operating developers. It is a highly competitive, dynamic marketplace in which developers innovate both by upgrading their existing apps and by introducing new apps in order to generate revenues. Liu, Nekipelov, and Park use a unique and comprehensive dataset containing information on apps on the iTunes and Android platforms. Using a combination of techniques from the computer science literature, the authors identify and validate the complete set of developers that operate on both platforms as well as the apps that were introduced on both platforms. Using this matched dataset, they study how the threat of competitors' entry influences the timing and quality of app entry. In particular, they find that the threat of competitors' entry can have a sizable negative impact on the quality of an app under development by forcing the developer to introduce the app before it has been properly tested and debugged. The authors' reduced-form analysis demonstrates varying effects of this phenomenon depending both on the size of the developer and its competitors and on the sparsity of the product space on a given platform. The authors then develop and estimate a structural strategic model of the timing and quality decisions involved in cross-platform app introduction. They use novel techniques from the machine learning literature to model the beliefs of developers in their semi-parametric two-step estimator. The estimated structural model is then used to analyze the effects on resulting app quality of various counterfactual changes, such as an increase in demand for certain app categories, transfers from the platform to the developers, and "A+B" type contracts between the platform and the developers.

Erik Brynjolfsson, MIT and NBER; Tomer Geva, Tel Aviv University; and Shachar Reichman, MIT

Crowd-Squared: Amplifying the Predictive Power of Large-Scale Crowd-Based Data

The analysis of large-scale data generated by the crowd has recently attracted extensive interest from marketing scholars and practitioners. Combined with recent advances in computer science and statistics, these data provide a myriad of opportunities for monitoring and modeling customers' intentions, preferences, and opinions. Nevertheless, a crucial step in any "Big Data" analysis is identifying the relevant data items that need to be processed or modeled. Interestingly, this important step has received limited attention in previous research and has typically been addressed by ad hoc approaches. Brynjolfsson, Geva, and Reichman offer a novel crowd-based method to address this data selection problem. They label the method "Crowd-Squared," as it leverages crowds to identify the most relevant elements in crowd-generated data. To implement this method they develop an online word association game that taps into people's "thought collection" process when thinking about a focal term. The authors empirically test the approach by comparing its performance to previous studies in three domains that have been used as test beds for prediction: flu epidemics, the housing market, and unemployment. Their findings demonstrate the effectiveness of this method in providing accurate results that are equivalent or superior to previously used term selection methods.

Glenn Ellison, MIT and NBER, and Sara Fisher Ellison, MIT

Match Quality, Search, and the Internet Market for Used Books

Ellison and Ellison examine Internet-related changes in the used book market. A model in which sellers wait for high-value consumers brings out two expected effects: improvements in match quality between buyers and sellers raise welfare (and may lead to higher prices), while increased competition brings down prices, especially at the lower end of the distribution. The authors examine differences between offline and online prices in 2009 and between online prices in 2009 and 2013, and find several features consistent with the model predictions. Most notably, online prices are higher than offline prices, suggesting a substantial match-quality effect. The authors develop a framework for structural estimation using the available price and quantity data. Structural estimates suggest that the shift to Internet sales substantially increased both seller profits and consumer surplus.

Aleksi Aaltonen, London School of Economics, and Stephan Seiler, Stanford University

Cumulative Knowledge and Open Source Content Growth: The Case of Wikipedia

Aaltonen and Seiler analyze content growth on one of the largest open source platforms: Wikipedia. Using edit-level data over eight years across a large number of Wikipedia pages, they find that content is still growing substantially even in later years. Fewer new pages are created over time, but at the page level they see very little slowdown in activity. One key driver of growth is a positive spillover effect of past edits on current activity: while controlling for a host of confounding factors, such as the popularity of the topic and platform-level growth trends, the authors find that longer pages experience significantly more editing activity. The magnitude of the externality is economically important: growth in editing activity on the average page would have been at least 50 percent lower in its absence.

Timothy Bresnahan, Joseph Orsini, and Pai-Ling Yin, Stanford University

Platform Choice by Mobile App Developers

For the past two years, Apple's iOS and Google's Android operating systems have split the market share of smartphone devices and the mobile applications (apps) for those devices. Bresnahan, Orsini, and Yin model and estimate the platform choice by mobile app developers, including the decision to multihome. The model flexibly illustrates the potential gap between the decision to multihome and the realized demand from that decision. The authors find far less difference in preferences across platforms than across types of developers and apps. They identify strong incentives for developers of the most popular apps to multihome, making tipping unlikely.

Luis Aguiar, Institute for Prospective Technological Studies, and Joel Waldfogel, University of Minnesota and NBER

Digitization, Copyright, and the Welfare Effects of Music Trade

Since the launch of the iTunes Music Store in the United States in 2003 and in much of Europe in the following years, music trade has shifted rapidly from physical to digital products, increasing the availability of products in different countries. Despite substantial growth in availability, choice sets have not converged across countries: observers point to copyright-related transaction costs as an obstacle to greater availability. Policymakers are now contemplating various copyright reforms that could reduce these trade costs. The possibility of these changes raises the question of how much benefit they would create for consumers and producers around the world. Aguiar and Waldfogel address this question with a structural model of supply and demand for music in 17 countries, which they employ to counterfactually simulate the effect of a European digital single market (the equivalent of a pan-European copyright regime) on the welfare of consumers and producers. They also simulate autarky and worldwide frictionless trade, in which all products are available in all countries, allowing them to quantify both the conventional gains from status quo trade and the maximum possible gains available from free trade. Existing and additional trade have different patterns of benefit to consumers and producers. Status quo trade benefits consumers everywhere, but European consumers have benefited more than North Americans. Existing trade has had large benefits for American producers but on balance small benefits for European producers. Additional trade would continue the pattern of consumer benefits, with larger gains to European consumers, but would reverse the pattern for producers: greater availability of products resulting from an easing of copyright restrictions would raise per capita gains to producers in Europe more than in North America. Finally, the authors find that a European single market would bring most of the benefits of worldwide frictionless trade to consumers and producers alike.
