## Highlights

**WP 12/15**

In “Learning Entrepreneurship From Other Entrepreneurs?”, Luigi Guiso and Fabiano Schivardi, with Luigi Pistaferri, empirically investigate the extent to which growing up in a highly entrepreneurial area increases both the likelihood that an individual becomes an entrepreneur and that individual’s entrepreneurial ability or success. Using a variety of data sets, they find evidence that this is indeed the case, as would be implied by models in which entrepreneurial ability is at least partly learnable through social contacts. Regarding variation in performance among entrepreneurs, their results show that those who lived in a higher firm density area at learning age earn a higher income from their business. The authors also find evidence that individuals growing up in a high firm density area acquire managerial skills, but that individual traits reflecting risk aversion, aversion to ambiguity, self-confidence or optimism, and propensity to innovate (which are traditionally associated with entrepreneurship) are independent of location. This suggests that the “personal traits factor” of entrepreneurship has a larger innate component.

**WP 11/15**

In “Comparing Distribution and Quantile Regression”, Franco Peracchi, together with Samantha Leorato, compares the sampling properties of two alternative approaches to estimating the conditional distribution of a continuous outcome Y given a vector X of regressors. The first, the distribution regression approach, is based on direct estimation of the conditional distribution function (CDF) of Y given X; the second, the quantile regression approach, is instead based on direct estimation of the conditional quantile function (CQF) of Y given X. Indirect estimates of the CQF and of the CDF may then be obtained by inverting the direct estimates from either approach or, to guarantee monotonicity, their rearranged versions. The authors provide a systematic comparison of the asymptotic and finite-sample performance of the monotonic estimators obtained under the two approaches, both when the linear-in-parameters models on which they are based are correctly specified and when they are not.
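The rearrangement step mentioned above can be illustrated in a few lines: sorting a non-monotone vector of fitted quantiles restores monotonicity without changing the multiset of values. A minimal sketch (the fitted values below are hypothetical, not taken from the paper):

```python
import numpy as np

def rearrange(values):
    """Monotone rearrangement: sort the estimated values so that the
    resulting quantile (or distribution) function is non-decreasing."""
    return np.sort(values)

# Hypothetical fitted conditional quantiles at tau = 0.1, ..., 0.9
# that exhibit "quantile crossing" (they are not monotone in tau).
q_hat = np.array([1.0, 1.4, 1.3, 1.8, 2.1, 2.0, 2.6, 3.0, 3.3])

q_mono = rearrange(q_hat)
assert (np.diff(q_mono) >= 0).all()  # monotone after rearrangement
```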

**WP 10/15**

In “The Effect of Discretion on Procurement Performance”, Giancarlo Spagnolo, with Decio Coviello and Andrea Guglielmo, empirically investigates the extent to which greater buyer discretion harms or improves procurement outcomes. To this end the authors exploit a threshold in the Italian procurement law that quasi-experimentally increases the ability of contracting authorities to use restricted auctions, a mechanism whereby buyers have discretion over whom (not) to invite to bid. Running a regression discontinuity design on a large database of public works in Italy, the authors investigate the causal effects of increased discretion on procurement outcomes. Their identification strategy relies on the assumption that, within a small interval around the threshold, contracts are otherwise identical in terms of observable and unobservable characteristics. The main result of the paper is that discretion leads to a significant increase in the probability that the same firm is repeatedly awarded projects by the same buyer, but that this does not worsen (and may actually improve) the observed procurement outcomes. This suggests that the productive effects of buyer discretion, in terms of efficient relationships, (more than) compensate for the negative ones, in terms of favoritism or corruption. The effects persist when the analysis is repeated controlling for geographical location, corruption, social capital and judicial efficiency in the region of the public buyers running the auctions.
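A regression discontinuity design of this kind can be sketched on synthetic data: fit separate local linear regressions on each side of the cutoff and read the effect off the jump in intercepts. This is only an illustration of the generic method, with made-up numbers; the paper’s actual threshold, specification and bandwidth choices differ:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: running variable x (e.g. contract value, centered
# at the legal threshold) and a procurement outcome y with a true
# jump of 0.3 at the cutoff x = 0.
n = 4000
x = rng.uniform(-1, 1, n)
treated = (x >= 0).astype(float)
y = 1.0 + 0.5 * x + 0.3 * treated + rng.normal(0, 0.2, n)

# Local linear RD estimate: fit a line within bandwidth h on each
# side of the cutoff and take the difference of the intercepts.
h = 0.25
left = (x < 0) & (x > -h)
right = (x >= 0) & (x < h)
bl = np.polyfit(x[left], y[left], 1)    # [slope, intercept]
br = np.polyfit(x[right], y[right], 1)
rd_effect = br[1] - bl[1]               # estimated jump at the cutoff
```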

**WP 09/15**

In “Parametric and Semiparametric IV Estimation of Network Models with Selectivity”, Eleonora Patacchini, with Tiziano Arduini and Edoardo Rainone, proposes two new methods for estimating spatial autoregressive models with network data when the network structure is endogenous. The first is a simple two-stage IV estimator with a parametric selection procedure; the second is a two-stage semiparametric IV estimator that uses a power series to approximate the selectivity bias term. The authors show that both estimators are consistent and asymptotically normal. They also conduct Monte Carlo simulations to investigate the finite-sample properties of the proposed estimators and compare their performance with that of the most commonly used estimator, two-stage least squares. The results show that the latter is upward biased when the network formation process is endogenous, while the proposed estimators converge quickly to the true parameters as the sample size increases.
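The benchmark the authors compare against, two-stage least squares, is easy to illustrate on simulated data. This is a generic one-instrument sketch with hypothetical parameters, not the network setting of the paper:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: x is endogenous (correlated with the error u),
# z is a valid instrument (relevant, and independent of u).
n = 5000
z = rng.normal(size=n)
u = rng.normal(size=n)
x = 0.8 * z + 0.5 * u + rng.normal(size=n)   # endogeneity via u
y = 2.0 * x + u                              # true coefficient = 2.0

# OLS is upward biased here because cov(x, u) > 0.
beta_ols = (x @ y) / (x @ x)

# 2SLS: first stage projects x on z; second stage regresses y on
# the fitted values. This is algebraically (z @ y) / (z @ x).
x_hat = z * (z @ x) / (z @ z)
beta_2sls = (x_hat @ y) / (x_hat @ x_hat)
```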

**WP 08/15**

In “Phillips curves with observation and menu costs”, Francesco Lippi and Luigi Paciello, with Fernando Alvarez, present a general equilibrium model with two types of frictions, a menu cost and an observation cost, which give rise to sticky prices and imperfect information about the state of the economy. The authors show that economies in which firms display identical average frequencies and sizes of price adjustments can nonetheless respond very differently to an aggregate nominal demand shock, depending on the size of the menu cost relative to the observation cost. Calibrating the model to reproduce the frequency and size of price adjustments in the US requires both frictions to be present and the observation cost to be about three times larger than the menu cost. The output response to an unexpected (small) monetary expansion is more persistent than in the corresponding menu cost model, but smaller than in the observation cost model. In particular, the presence of the observation cost injects a time-dependent component into the firms’ decision rule which makes the impulse response function quasi-linear in the size of the shock.

**WP 07/15**

In “The Supply Side of Household Finance”, Luigi Guiso, with Gabriele Foà, Leonardo Gambacorta and Paolo Emilio Mistrulli, uses a novel methodology to test for the presence of biased financial advice from banks to households choosing a mortgage. The authors show that in a simple model of mortgage choice, where the lender can both set prices and give advice to customers, the relative price of fixed-rate and adjustable-rate mortgages is generally not a sufficient statistic for the choice. Banks facing a mixed pool of sophisticated and unsophisticated borrowers react to changes in the cost and availability of funding not only by adjusting prices but also by providing advice to steer borrowers toward the choices most advantageous for them. Hence, supply shocks affect borrowers’ mortgage choices not only through price changes but also directly, insofar as they proxy for unobservable advice. Testing this hypothesis on a sample of 1.6 million mortgages originated in Italy between 2004 and 2010, the authors find evidence consistent with this prediction and thus with the hypothesis that intermediaries offer biased advice to customers. As the model predicts, non-price supply-side effects on borrowers’ choices are stronger for less sophisticated borrowers, who should theoretically be more responsive to the bank’s advice.

**WP 06/15**

In “Shape Regressions”, Franco Peracchi, with Samantha Leorato, compares two approaches to the shape regression problem, namely the problem of estimating the shape of the conditional distribution of a continuous random variable Y given a random vector X. One approach, distribution regression, is based on direct estimation of the conditional distribution function (CDF); the other, quantile regression, is instead based on direct estimation of the conditional quantile function (CQF). Since the CDF and the CQF are generalized inverses of each other, indirect estimates of the CQF and the CDF may be obtained by taking the generalized inverse of the direct estimates from either approach, possibly after rearrangement to ensure monotonicity. The equivalence between the two approaches holds for standard nonparametric estimators in the unconditional case. In the conditional case, when modeling assumptions are introduced to avoid curse-of-dimensionality problems, this equivalence is generally lost, because a convenient parametric model for the CDF need not imply a convenient parametric model for the CQF, and vice versa. Despite the vast literature on the quantile regression approach, and the recent attention to the distribution regression approach, no systematic comparison of the two has been carried out yet. This paper fills this gap by providing a better understanding of the relative performance of the two approaches, both when the assumed parametric models on which they are based are correctly specified and when they are not.
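The unconditional equivalence noted above is easy to verify numerically: the empirical tau-quantile is exactly the generalized inverse of the empirical CDF, i.e. the smallest observed value t with ECDF(t) >= tau. A minimal sketch on simulated data:

```python
import numpy as np

rng = np.random.default_rng(2)
y = rng.normal(size=100)       # hypothetical unconditional sample
y_sorted = np.sort(y)
n = len(y)

def ecdf(t):
    """Empirical CDF: fraction of observations <= t."""
    return np.mean(y <= t)

def ecdf_inverse(tau):
    """Generalized inverse of the ECDF: the smallest observed value t
    with ECDF(t) >= tau -- the empirical tau-quantile."""
    k = int(np.ceil(tau * n)) - 1
    return y_sorted[k]

# Check the defining property of the generalized inverse at a few
# quantile levels: the value reaches level tau, and no smaller order
# statistic does.
for tau in (0.1, 0.25, 0.5, 0.9):
    q = ecdf_inverse(tau)
    k = int(np.ceil(tau * n)) - 1
    assert ecdf(q) >= tau
    assert ecdf(y_sorted[k - 1]) < tau
```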

**WP 05/15**

In “On the Ambiguous Consequences of Omitting Variables”, Franco Peracchi, with Giuseppe De Luca and Jan R. Magnus, addresses the question of whether adding variables to a linear regression model reduces the bias of the parameters of interest. The authors show that this is true when a short model is compared to a long model that coincides with the data-generation process, but not necessarily when both the short and the long models are underspecified, as is usually the case. In this situation the strategy of adding variables may increase both the bias and the variance of the OLS estimators, so the consequences of adding or omitting variables are ambiguous. The authors study the implications of this ambiguity by providing exact expressions for the bias and mean squared error comparisons and by analyzing how the differences depend on the misspecification parameter.
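The ambiguity can be seen in a small simulation: with a suitable correlation structure, the omitted-variable biases cancel in the short model, while adding a regressor leaves the long model more biased. The data-generating process below is a hypothetical illustration, not one from the paper:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20000

# Hypothetical DGP: y depends on x1, x2, x3, but x3 is never observed,
# so both the "short" model (y ~ x1) and the "long" model (y ~ x1, x2)
# are underspecified. The parameter of interest is the coefficient on
# x1, whose true value is 1.0.
x1 = rng.normal(size=n)
x2 = 0.5 * x1 + rng.normal(size=n)
x3 = -0.5 * x1 + rng.normal(size=n)
y = 1.0 * x1 + 1.0 * x2 + 1.0 * x3 + rng.normal(size=n)

def ols(X, target):
    """OLS coefficients via least squares."""
    return np.linalg.lstsq(X, target, rcond=None)[0]

b_short = ols(x1[:, None], y)[0]               # omits x2 and x3
b_long = ols(np.column_stack([x1, x2]), y)[0]  # adds x2, still omits x3

# Here the biases from omitting x2 and x3 cancel in the short model
# (b_short is near 1.0), while adding x2 concentrates the bias from
# x3 on x1 (b_long is near 0.5): the longer model is the more biased.
```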

**WP 04/15**

In “Monnet’s Error?”, Luigi Guiso, with Paola Sapienza and Luigi Zingales, addresses the crucial question of whether the needs of a currency union will force political integration in Europe (as anticipated by Monnet) or national political tensions will eventually lead to a backlash. To this end they analyze the cross-sectional and time-series variation in pro-European sentiment in 15 EU countries from 1973 onwards. In particular, they consider the reactions to three watershed moments: the entry into force of the Maastricht Treaty in 1992, the enlargement of the EU in 2004 and the 2010 Eurozone crisis. They find that both the Maastricht Treaty and the recent economic crisis reduced pro-Europe sentiment. Yet, in spite of the worst recession in recent history, Europeans still support the common currency. These results suggest that Europeans do not want to go backwards but, at the same time, show no interest in going forward with political integration, a situation which may become economically unsustainable.

**WP 03/15**

In “The Flattening of the Phillips Curve and the Learning Problem of the Central Bank”, Jean-Paul L’Huillier and William Zame illustrate an intuitive channel through which price stickiness limits the ability of a central bank to improve welfare through stabilization policy. They consider a Central Bank with a dual objective: stabilizing economic activity against nominal disturbances in the short run, and achieving an inflation target in the long run. In their microfounded, information-based model, price stickiness is derived endogenously as a function of the parameters of the economic environment, including monetary policy itself. Their analysis shows that, once one takes into account how agents react to the adoption of inflation targeting, the two objectives are not always compatible. In particular, inflation targeting limits the ability of the Central Bank to extract information about nominal disturbances, and this makes it unable to stabilize the economy.

**WP 02/15**

In “Cash burns: An inventory model with a cash-credit choice”, Francesco Lippi and Fernando Alvarez present a model that characterizes the relationship between optimal dynamic cash management and the choice of the means of payment. The novel feature of this framework is the sequential nature of the payment choice: at each instant the agent can choose to pay with either cash or credit. The model predicts that the current level of the stock of cash determines whether the agent uses cash or credit. Cash is used whenever the agent has enough of it, while credit is used when cash holdings are low, a pattern recently documented in household data from several countries. The average level of cash holdings and the average share of expenditures paid in cash depend on the opportunity cost of cash relative to the cost of credit. The model produces a rich set of over-identifying restrictions for consumers’ cash-management and payment choices, which can be tested using recent household surveys and diary data.
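The qualitative decision rule described above can be caricatured in a few lines: pay cash when holdings suffice, fall back on credit and replenish when they do not. This is a deliberately simplified sketch with hypothetical parameters, not the authors’ optimal policy:

```python
import numpy as np

rng = np.random.default_rng(4)

# Hypothetical illustration of the cash-credit choice: each purchase
# is paid in cash if current holdings cover it, otherwise with
# (costlier) credit, after which cash is replenished to a target
# level M_star by a withdrawal.
M_star = 10.0
cash = M_star
cash_paid = credit_paid = 0.0

for _ in range(10000):
    purchase = rng.exponential(0.5)
    if cash >= purchase:        # enough cash: "cash burns", use it
        cash -= purchase
        cash_paid += purchase
    else:                       # cash holdings low: use credit ...
        credit_paid += purchase
        cash = M_star           # ... and replenish cash

share_cash = cash_paid / (cash_paid + credit_paid)
```

With these numbers most spending is in cash, and credit appears only when holdings run low, mirroring the pattern in the household data the paper refers to.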

**WP 01/15**

In “Corporate Culture, Societal Culture, and Institutions”, Luigi Guiso, with Paola Sapienza and Luigi Zingales, shows that while both cultural norms and legal norms (institutions) help foster cooperation, culture is the more primitive of the two and itself sustains formal institutions. Although over the last twenty years economists have appealed to the role of formal institutions to explain the causes of national prosperity, the authors claim that informal institutions, that is, culture, are at least as important. While disentangling the effects of the two is difficult in large societies, it can be done inside corporations. Corporate culture is thus not only interesting per se, but also serves as a laboratory to study the role of societal culture and the way it can be changed.