In previous work with Spyros Alexakis, we considered the renormalized area of complete, properly embedded minimal surfaces in $\mathbb{H}^3$ and proved several structure theorems about it. I will report on that earlier work as well as on our new results showing how control of this renormalized area yields a certain amount of regularity of the asymptotic boundary at infinity.
Random forests are a scheme proposed by Leo Breiman in the early 2000s for building a predictor ensemble from a set of decision trees grown in randomly selected subspaces of the data. Despite growing interest and practical use, the statistical properties of random forests have received little attention, and little is known about the mathematical forces driving the algorithm. In this talk, we offer an in-depth analysis of a random forests model suggested by Breiman in 2004, which is very close to the original algorithm. We show in particular that the procedure is consistent and adapts to sparsity, in the sense that its rate of convergence depends only on the number of strong features and not on how many noise variables are present.
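The construction described above — an ensemble of trees, each grown on a bootstrap sample using a randomly chosen feature subspace, with predictions aggregated by vote — can be sketched in a few lines. This is an illustrative toy (decision stumps rather than full trees, and all names below are my own), not Breiman's exact procedure or the 2004 model analyzed in the talk:

```python
import random

def train_stump(data, features):
    # Exhaustively pick the (feature, threshold, orientation) that minimizes
    # misclassification on `data`, a list of (x, y) pairs with y in {0, 1}.
    best = None
    for f in features:
        for x, _ in data:
            t = x[f]
            for sign in (0, 1):
                errs = sum(1 for xi, yi in data
                           if ((xi[f] > t) ^ sign) != yi)
                if best is None or errs < best[0]:
                    best = (errs, f, t, sign)
    _, f, t, sign = best
    return lambda x: (x[f] > t) ^ sign

def random_forest(data, n_trees=25, subspace_size=2, seed=0):
    # Each tree: a bootstrap resample of the data plus a random feature
    # subspace -- the two sources of randomization in the scheme above.
    rng = random.Random(seed)
    d = len(data[0][0])
    trees = []
    for _ in range(n_trees):
        boot = [rng.choice(data) for _ in data]       # bootstrap sample
        feats = rng.sample(range(d), subspace_size)   # random subspace
        trees.append(train_stump(boot, feats))
    # Aggregate by majority vote over the ensemble.
    return lambda x: sum(t(x) for t in trees) > n_trees / 2

# Toy data with one strong feature (index 0) and two noise features,
# mimicking the sparsity setting: the label ignores the noise variables.
gen = random.Random(1)
data = [([gen.random() for _ in range(3)],) for _ in range(200)]
data = [(x[0], int(x[0][0] > 0.5)) for x in data]
clf = random_forest(data, n_trees=25, subspace_size=2)
acc = sum(clf(x) == y for x, y in data) / len(data)
```

Trees whose random subspace misses the strong feature contribute near-random votes, while those that include it vote correctly, so the majority vote still recovers the signal despite the noise variables.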