The dynamics of spatially structured networks of N interacting stochastic neurons can be described by deterministic population equations in the mean-field limit. While this is well known, a general question has remained open: does synaptic weight scaling suffice, by itself, to guarantee the convergence of the network dynamics to a deterministic population equation, even when networks are not assumed to be homogeneous or spatially structured? In this work, we consider networks of stochastic integrate-and-fire neurons with arbitrary synaptic weights satisfying an O(1/N) scaling condition. Borrowing results from the theory of dense graph limits, or graphons, we prove that, as N tends to infinity and up to the extraction of a subsequence, the empirical measure of the neurons' membrane potentials converges to the solution of a spatially extended mean-field partial differential equation (PDE). Our proof requires analytical techniques that go beyond standard propagation-of-chaos methods. In particular, we introduce a weak metric that depends on the dense graph limit kernel, and we show how the weak convergence of the initial data can be obtained by propagating the regularity of the limit kernel along the dual backward equation associated with the spatially extended mean-field PDE. Overall, this result invites us to reinterpret spatially extended population equations as universal mean-field limits of networks of neurons with O(1/N) synaptic weight scaling. This work was done in collaboration with Pierre-Emmanuel Jabin (Penn State) and Datong Zhou (Sorbonne Université).
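The O(1/N) scaling condition can be illustrated with a toy simulation. This is a minimal sketch only: the escape-rate nonlinearity, leak term, reset rule, and i.i.d. kernel entries below are illustrative assumptions, not the model analyzed in the talk. The point is that each synaptic weight is J_ij/N with J_ij of order one, so the total recurrent input to a neuron remains O(1) as N grows:

```python
import numpy as np

def simulate_network(N=200, T=1.0, dt=1e-3, seed=0):
    """Toy network of N stochastic (escape-rate) integrate-and-fire
    neurons with O(1/N) synaptic weights.

    Illustrative assumptions (not from the talk): escape rate
    f(v) = max(v, 0), linear leak, reset to 0, Gaussian i.i.d.
    kernel entries J_ij of order one.
    """
    rng = np.random.default_rng(seed)
    J = rng.normal(1.0, 0.5, size=(N, N))  # O(1) kernel values J_ij
    W = J / N                              # O(1/N) synaptic weights
    v = rng.uniform(0.0, 1.0, size=N)      # initial membrane potentials
    for _ in range(int(T / dt)):
        rate = np.maximum(v, 0.0)          # stochastic firing intensity
        spikes = rng.random(N) < rate * dt # Bernoulli spike approximation
        v += dt * (-v) + W @ spikes        # leak + O(1) total recurrent input
        v[spikes] = 0.0                    # reset neurons that fired
    return v

v = simulate_network()
```

As N increases, the empirical distribution of the membrane potentials in such a toy network stabilizes; this stabilization is the phenomenon that the spatially extended mean-field PDE captures in the limit.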