We suggest two nonparametric approaches, based on kernel methods and orthogonal series respectively, to estimating regression functions in the presence of instrumental variables. For the first time in this class of problems we derive optimal convergence rates, and show that they are attained by particular estimators. In the presence of instrumental variables, the relation that identifies the regression function also defines an ill-posed inverse problem, the “difficulty” of which depends on the eigenvalues of a certain integral operator determined by the joint density of the endogenous and instrumental variables. We delineate the role played by problem difficulty in determining both the optimal convergence rate and the appropriate choice of smoothing parameter.
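To make this structure concrete, here is a brief sketch in notation not fixed by the abstract itself: write $Y$ for the response, $X$ for the endogenous regressor, $W$ for the instrument, $g$ for the regression function to be estimated, and $f_{XW}$ for the joint density of $(X,W)$. Under the model $Y=g(X)+U$ with $E(U\mid W)=0$, the identifying relation can be rewritten as a Fredholm integral equation of the first kind,
\[
\int g(x)\, f_{XW}(x,w)\, dx \;=\; E(Y \mid W=w)\, f_W(w),
\]
and, in this notation, one natural form of the integral operator whose eigenvalues measure the problem’s difficulty has kernel
\[
t(x_1,x_2) \;=\; \int f_{XW}(x_1,w)\, f_{XW}(x_2,w)\, dw,
\qquad
(T\psi)(x_1) \;=\; \int \psi(x_2)\, t(x_1,x_2)\, dx_2 .
\]
The faster the eigenvalues of $T$ decay to zero, the more severely ill-posed the inverse problem, the slower the attainable convergence rate, and the more carefully the smoothing parameter must be chosen.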