Modern technology gathers vast amounts of data about the world we live in. Extracting information from these data leads to machine learning and statistics tasks such as clustering, classification, regression, and dimensionality reduction, among others. Many of these tasks seek to minimize a functional, defined on the available random sample, which specifies the desired properties of the object sought. It is desirable that the algorithms for these tasks remain stable as the amount of available data increases.
I will present a mathematical framework for establishing the asymptotic properties of such variational problems posed on random samples and related random geometries (e.g., proximity graphs). In particular, we will discuss the passage from discrete variational problems on random samples to their continuum limits. We will show how tools of applied analysis and probability combine to establish, in some sense optimal, conditions for the convergence to hold. Implications of the results for machine learning algorithms will also be discussed.
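To make the discrete-to-continuum setup concrete, the following is a minimal numerical sketch (not part of the talk itself): a graph Dirichlet energy on an epsilon-proximity graph built from i.i.d. uniform samples, with the n^{-2} eps^{-(d+2)} rescaling commonly used in this literature so that the functional has a nontrivial continuum limit as n grows and eps shrinks. The function names and parameter choices here are illustrative assumptions, not notation from the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)

def graph_dirichlet_energy(points, f, eps):
    """Rescaled Dirichlet-type energy on an eps-proximity graph.

    Connects pairs of sample points at distance less than eps and
    sums squared differences of f across edges, normalized by
    n^2 * eps^(d+2) -- a standard scaling under which such discrete
    energies converge to a weighted continuum Dirichlet energy.
    """
    n, d = points.shape
    diff = points[:, None, :] - points[None, :, :]
    dist = np.linalg.norm(diff, axis=2)
    adj = (dist < eps) & (dist > 0)          # proximity-graph edges
    energy = np.sum(adj * (f[:, None] - f[None, :]) ** 2)
    return energy / (n ** 2 * eps ** (d + 2))

# n i.i.d. uniform samples in the unit square; test function f(x) = x_1
n = 1000
pts = rng.random((n, 2))
f = pts[:, 0]
eps = 0.1                                    # connectivity radius
print(graph_dirichlet_energy(pts, f, eps))
```

For uniform samples on the unit square and this linear test function, one heuristically expects the rescaled energy to stabilize near a constant multiple of the continuum Dirichlet energy of f (up to boundary effects), illustrating the kind of convergence the framework makes rigorous, including the admissible joint scaling of eps and n.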