In models of $N$ interacting particles in $\mathbb{R}^d$, the repulsive cost is usually described by a two-point function $c_\varepsilon(x,y) = \ell\Big(\frac{|x-y|}{\varepsilon}\Big)$, where $\ell: \mathbb{R}_+ \to [0,\infty]$ decreases to zero at infinity and the parameter $\varepsilon>0$ scales the interaction distance. In this talk we explain how to derive an asymptotic model in the short-range regime, that is, $\varepsilon \ll 1$ together with the assumption that there exists $r_0>0$ such that $\int_{r_0}^\infty \ell(r)\, r^{d-1}\, dr < +\infty$. This extends recent results obtained in the homogeneous case $\ell(r) = r^{-s}$ with $s>d$.
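
As a brief illustration (not part of the statement above), one can check that the homogeneous cost indeed falls within this short-range framework: for $\ell(r) = r^{-s}$ and any $r_0 > 0$,
\[
\int_{r_0}^{\infty} r^{-s}\, r^{d-1}\, dr \;=\; \int_{r_0}^{\infty} r^{d-1-s}\, dr \;=\; \frac{r_0^{\,d-s}}{s-d} \;<\; +\infty \quad \text{precisely when } s > d,
\]
so the integrability condition $\int_{r_0}^\infty \ell(r)\, r^{d-1}\, dr < +\infty$ generalizes the assumption $s > d$ made in the homogeneous case.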