Kernel Estimation for Dyadic Data by Jim Powell
In this forthcoming working paper we consider nonparametric estimation of density and conditional expectation functions for dyadic random variables, i.e., random variables defined for every pair of individuals/nodes in a network of size N. These random variables are assumed to satisfy a “local dependence” property: any two random variables in the network that share one or two indices may be dependent, while random variables with no index in common are assumed to be independent. Density functions for continuously distributed random variables and regression functions with continuously distributed regressors are estimated by a straightforward application of the kernel methods of Rosenblatt and Parzen (for densities) and of Nadaraya and Watson (for regression functions). Estimation of their asymptotic variances is also straightforward using existing proposals for dyadic data. More unusual are the rates of convergence and asymptotic (normal) distributions of the estimators, which are shown to converge at the same rate as the (unconditional) sample mean, i.e., at the square root of the number of nodes N, under standard assumptions on the kernel method. This contrasts with nonparametric estimation of densities and regression functions for monadic data, where the estimators generally converge more slowly than the sample mean.
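As an informal illustration of the estimators described in the abstract, the Python sketch below pools all N(N-1)/2 dyads to form a Rosenblatt–Parzen density estimate and a Nadaraya–Watson regression estimate with a Gaussian kernel, and computes a rough standard error from node-level averages, reflecting the square-root-of-N rate noted above. The function names (dyadic_kde, dyadic_nw, dyadic_kde_se), the Gaussian kernel, the symmetric N-by-N storage of the dyadic outcomes, and the bandwidth choice are all assumptions made for illustration; the standard-error calculation follows the node-level projection logic implied by the abstract, not necessarily the variance estimator used in the paper.

```python
import numpy as np

def gaussian_kernel(u):
    """Standard normal (Gaussian) kernel."""
    return np.exp(-0.5 * u ** 2) / np.sqrt(2.0 * np.pi)

def dyadic_kde(W, w_grid, h):
    """Rosenblatt-Parzen density estimate pooled over all dyads i < j.

    W      : (N, N) symmetric array of pairwise outcomes W_ij (diagonal ignored)
    w_grid : evaluation points
    h      : bandwidth
    """
    N = W.shape[0]
    i, j = np.triu_indices(N, k=1)                     # the N(N-1)/2 dyads
    u = (W[i, j][:, None] - w_grid[None, :]) / h
    return gaussian_kernel(u).mean(axis=0) / h

def dyadic_nw(Y, X, x_grid, h):
    """Nadaraya-Watson estimate of E[Y_ij | X_ij = x] pooled over dyads."""
    N = X.shape[0]
    i, j = np.triu_indices(N, k=1)
    K = gaussian_kernel((X[i, j][:, None] - x_grid[None, :]) / h)
    return (K * Y[i, j][:, None]).sum(axis=0) / K.sum(axis=0)

def dyadic_kde_se(W, w_grid, h):
    """Illustrative standard error for dyadic_kde based on node-level averages.

    S_i averages the kernel contributions of node i over its N-1 partners.
    Because dyads sharing a node may be dependent, the density estimate behaves
    like an average of the N (approximately independent) S_i's, which is what
    delivers the sample-mean rate described in the abstract.  This is a sketch,
    not the paper's formula.
    """
    N = W.shape[0]
    K = gaussian_kernel((W[:, :, None] - w_grid[None, None, :]) / h) / h
    off_diag = ~np.eye(N, dtype=bool)
    S = np.stack([K[i, off_diag[i], :].mean(axis=0) for i in range(N)])
    return 2.0 * S.std(axis=0, ddof=1) / np.sqrt(N)

# Small synthetic example: W_ij driven by node effects a_i, a_j plus noise.
rng = np.random.default_rng(0)
N = 200
a = rng.normal(size=N)
e = np.triu(rng.normal(size=(N, N)), 1)
W = a[:, None] + a[None, :] + e + e.T                  # symmetric dyadic outcomes
grid = np.linspace(-4.0, 4.0, 9)
h = 0.5                                                # illustrative bandwidth
print(dyadic_kde(W, grid, h))
print(dyadic_kde_se(W, grid, h))
```

Treating the estimator as an average of the N node-level terms S_i, rather than of the N(N-1)/2 dyads, is the simple way to see why the standard errors shrink at the square root of N rather than at the usual nonparametric rate.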
Speaker: James L. Powell
Date & Time: 28 May 2019
Type: Seminar
Venue: The Institute for Fiscal Studies, 7 Ridgmount Street, Fitzrovia, London, WC1E 7AE