Local measurements of the expansion rate of the Universe differ, at a statistically significant level, from the predictions of simple models fitted to large-scale cosmological observations. Sample variance (often called cosmic variance) is a key component of the error budget of any measurement made from a small data set. For the Hubble constant H0, which parametrises the expansion rate, the patch of the Universe covered by recent supernova observations has a radius of roughly 300 Mpc. The smaller the patch, the larger the patch-to-patch fluctuations, and hence the larger the sample-variance error on the measured value of H0. Using the H0 measurement from supernovae as an example, I consider a number of different ways to estimate sample variance, using techniques developed for a variety of purposes, and show that they all approximately agree. The sample-variance error on H0 from the recent Pantheon supernova sample is +/-1 km s^-1 Mpc^-1, insufficient to explain the Hubble tension in a standard Lambda-CDM universe. Along the way, this demonstrates methods for comparing variations in the expansion rate across the Universe, and clarifies what we mean when we say that the Universe is expanding (on average), or that galaxies move apart with particular velocities.
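The patch-to-patch fluctuation described above can be illustrated with a minimal Monte Carlo toy, which is not the analysis of this paper: all numbers below (the fiducial H0, the peculiar-velocity dispersion, the rms density contrast of a 300 Mpc patch, the growth rate f) are assumed illustrative values. Each mock patch receives a coherent over- or underdensity delta, which in linear theory shifts the locally measured expansion rate by Delta H / H = -(f/3) delta, plus uncorrelated small-scale peculiar velocities; the scatter of the fitted H0 across patches is then a toy estimate of the sample variance.

```python
import numpy as np

rng = np.random.default_rng(1)

# All values below are illustrative assumptions, not the paper's inputs.
H0_TRUE = 70.0      # km/s/Mpc, fiducial expansion rate
N_SN = 1000         # supernovae per mock patch
R_MAX = 300.0       # Mpc, patch radius quoted in the text
SIGMA_PEC = 300.0   # km/s, uncorrelated small-scale peculiar velocities
SIGMA_DELTA = 0.04  # rms linear density contrast of a 300 Mpc patch
F_GROWTH = 0.5      # approximate linear growth rate f at low redshift
N_PATCH = 5000      # number of independent mock patches

h0_fits = np.empty(N_PATCH)
for i in range(N_PATCH):
    # Distances for uniform number density in a sphere: p(d) ∝ d^2.
    d = R_MAX * rng.random(N_SN) ** (1.0 / 3.0)
    # Coherent mode: a patch over/underdensity shifts the local
    # expansion rate by Delta H / H ≈ -(f/3) delta in linear theory.
    delta = rng.normal(0.0, SIGMA_DELTA)
    h_local = H0_TRUE * (1.0 - F_GROWTH * delta / 3.0)
    v = h_local * d + rng.normal(0.0, SIGMA_PEC, N_SN)
    # Least-squares slope of v = H0 d (straight line through the origin).
    h0_fits[i] = np.sum(d * v) / np.sum(d * d)

print(f"patch-to-patch scatter in H0: {np.std(h0_fits):.2f} km/s/Mpc")
```

With these assumed inputs the scatter comes out at a few tenths of a km s^-1 Mpc^-1, the same order as the +/-1 km s^-1 Mpc^-1 quoted above; note that the uncorrelated peculiar velocities average down as 1/sqrt(N_SN), so the coherent density mode dominates the sample variance, which is why the full correlated-flow methods of the paper are needed.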