Many timely optimization problems are high-dimensional, which calls for dimensionality reduction techniques to solve them efficiently. The random embedding strategy, which optimizes the objective along a low-dimensional random subspace of the search space, is arguably the simplest possible dimensionality reduction method. Recent works quantify the probability that this strategy successfully solves the original problem by lower bounding the probability that a random subspace intersects the set of approximate global minimizers. These works show that, when the objective has low effective dimension (i.e., varies only along a low-dimensional subspace of the search space), random embeddings of sufficiently large dimension solve the original high-dimensional problem with probability one. In this work, we relax the low effective dimension assumption by considering objectives with anisotropic variability, namely, Lipschitz continuous functions whose Lipschitz constant is small (though nonzero) when the function is restricted to a high-dimensional subspace. Exploiting tools from stochastic geometry, we lower bound the probability that a random subspace intersects the set of approximate global minimizers of such objectives, and hence the probability that random embeddings succeed in (approximately) solving the original global optimization problem. Our findings offer deeper insight into the role played by the problem dimension in this probability of success.
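For concreteness, the strategy can be stated as follows; the notation below (ambient dimension $D$, embedding dimension $d$, random matrix $A$) is standard in the random embedding literature and is introduced here for illustration only:
\[
  \min_{x \in \mathbb{R}^{D}} f(x)
  \quad \text{is replaced by} \quad
  \min_{y \in \mathbb{R}^{d}} f(Ay),
  \qquad A \in \mathbb{R}^{D \times d}, \quad d \ll D,
\]
where $A$ is a random matrix, e.g., with independent standard Gaussian entries, so that the reduced problem optimizes $f$ over the random subspace $\mathrm{range}(A)$. Up to a prescribed accuracy, the strategy succeeds exactly when this random subspace intersects the set of approximate global minimizers of $f$; this is the event whose probability is lower bounded.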