Can one recover a matrix from only matrix-vector products? If so, how many are needed? We consider the matrix recovery problem for the class of hierarchical rank-structured matrices. This problem arises in scientific machine learning, where one wishes to recover the solution operator of a PDE from only input-output pairs of forcing terms and solutions. Peeling algorithms are the canonical method for recovering a hierarchical matrix from matrix-vector products; however, their recursive nature poses a potential stability issue that may deteriorate the overall quality of the approximation. Our work resolves the open question of the stability of peeling. We introduce a robust version of peeling and prove that it achieves low error with respect to the best possible hierarchical approximation to any matrix, allowing us to analyze the performance of the algorithm on general matrices, as opposed to exactly hierarchical ones. This analysis relies on theory for low-rank approximation, as well as the surprising result that the generalized Nyström method is more accurate than the randomized SVD algorithm in this setting.
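
The two low-rank approximation methods named above can both be run with only matrix-vector product access. The following minimal sketch (not the paper's peeling algorithm; the test matrix, sketch sizes, and variable names are illustrative assumptions) shows the randomized SVD and the generalized Nyström method recovering a low-rank matrix using only products with `A` and `A.T`:

```python
import numpy as np

rng = np.random.default_rng(0)
n, true_rank = 200, 10

# An exactly low-rank test matrix (an assumption for illustration).
A = rng.standard_normal((n, true_rank)) @ rng.standard_normal((true_rank, n))

r, s = 15, 30  # sketch sizes; oversampling s > r stabilizes the pseudoinverse

# Randomized SVD from matvec access: sketch the range, orthogonalize, project.
Omega = rng.standard_normal((n, r))
Q, _ = np.linalg.qr(A @ Omega)          # r matvecs with A
A_rsvd = Q @ (A.T @ Q).T                # Q @ Q.T @ A, via r matvecs with A.T

# Generalized Nystrom: two independent sketches, A_gn = A@Omega (Psi.T@A@Omega)^+ Psi.T@A.
Psi = rng.standard_normal((n, s))
Y = A @ Omega                           # r matvecs with A
Z = A.T @ Psi                           # s matvecs with A.T
A_gn = Y @ np.linalg.pinv(Psi.T @ Y) @ Z.T

for name, Ah in [("rSVD", A_rsvd), ("GN", A_gn)]:
    print(name, np.linalg.norm(A - Ah) / np.linalg.norm(A))
```

Since the test matrix has exact rank 10 and the sketch sizes exceed it, both methods recover the matrix to machine precision here; the accuracy comparison the abstract refers to concerns their behavior inside the peeling recursion, not this idealized setting.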