The UK used to be nominally 240V and a lot of EU countries used to be 220V, until we harmonised on a nominal 230V with tolerance bands wide enough to cover both. Nothing really changed in practice - UK installations still deliver about 240V and most EU installations deliver about 220V - but it was convenient to have a common standard.
The UK defines this as 230V -6% +10% (i.e. 216.2V – 253.0V), with nominal supply at 240V.
The continent defines this as 230V -10% +6% (i.e. 207.0V – 243.8V), with nominal supply at 220V or 230V by region.
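For anyone who wants to check the arithmetic, here's a quick Python sketch - nothing in it beyond the percentages quoted above:

```python
# Sanity-check the harmonised tolerance bands quoted above.
NOMINAL = 230.0  # harmonised nominal voltage (V)

def band(nominal, low_pct, high_pct):
    """Return the (min, max) supply voltage for a tolerance band."""
    return nominal * (1 + low_pct / 100), nominal * (1 + high_pct / 100)

uk_lo, uk_hi = band(NOMINAL, -6, +10)   # UK: 230V -6% +10%
eu_lo, eu_hi = band(NOMINAL, -10, +6)   # continent: 230V -10% +6%

print(f"UK band: {uk_lo:.1f}V - {uk_hi:.1f}V")   # 216.2V - 253.0V
print(f"EU band: {eu_lo:.1f}V - {eu_hi:.1f}V")   # 207.0V - 243.8V

# Both legacy supplies sit inside their own region's band:
assert uk_lo <= 240 <= uk_hi
assert eu_lo <= 220 <= eu_hi
```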
We called this harmonisation - in reality it's more like agreeing to disagree. We just fudged the tolerances until everyone was happy.
(To put some context around this - most sites don't receive the nominal supply. The actual voltage delivered depends on your distance to the local substation: voltage is lost along the cable, so distributors typically over-deliver, leaving the closest sites a little over nominal and the furthest a little under. If they delivered exactly nominal at the substation, sites at the edge of the network would sag even further below it. So end-user equipment should be built to expect variation.)
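To make that concrete, here's a toy sketch of the effect - every figure in it (feeder resistance, load current) is an invented round number for illustration, not real network data:

```python
# Toy model of voltage drop along a feeder (V = I * R). All figures
# below are made up purely to show why substations over-deliver.
SUBSTATION_V = 240.0   # voltage sent out at the substation
R_PER_KM = 0.5         # assumed feeder loop resistance, ohms per km
LOAD_AMPS = 20.0       # assumed load current drawn by the site

for km in (0.1, 0.5, 1.0):
    delivered = SUBSTATION_V - LOAD_AMPS * R_PER_KM * km
    print(f"{km:4.1f} km out: {delivered:.1f}V")
# 0.1 km: 239.0V   0.5 km: 235.0V   1.0 km: 230.0V
# Sending out above nominal keeps the far end of the network from
# sagging out of the bottom of the tolerance band.
```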
That'd give an upper bound of 243.8V, which is far too close to 240 to be practical. As I tried to add, there is some necessary and natural deviation from nominal to account for path loss etc. A +6% cap on 230V only leaves a 240V supply about 1.6% of upward headroom, which isn't much scope at all.
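Worked out explicitly, using only the figures already quoted:

```python
# How much upward headroom a ~240V supply has under each cap.
NOMINAL, UK_SUPPLY = 230.0, 240.0

eu_cap = NOMINAL * 1.06   # 243.8V (continental +6% upper bound)
uk_cap = NOMINAL * 1.10   # 253.0V (UK +10% upper bound)

print(f"+6% cap:  {(eu_cap / UK_SUPPLY - 1) * 100:.1f}% headroom")  # 1.6%
print(f"+10% cap: {(uk_cap / UK_SUPPLY - 1) * 100:.1f}% headroom")  # 5.4%
```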
I believe the traditional UK definition was 240V ±6% (i.e. 225.6V – 254.4V), so harmonisation just trimmed a little off the top end (253.0V vs 254.4V) so it strayed less from Europe's definition - but not enough to actually affect nominal distribution.
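If that's right, the numbers line up neatly - a quick comparison, assuming the traditional band really was 240V ±6%:

```python
# Old UK band (240V ±6%) vs harmonised UK band (230V -6% +10%).
old_lo, old_hi = 240 * 0.94, 240 * 1.06   # traditional definition
new_lo, new_hi = 230 * 0.94, 230 * 1.10   # harmonised definition

print(f"old band: {old_lo:.1f}V - {old_hi:.1f}V")      # 225.6V - 254.4V
print(f"new band: {new_lo:.1f}V - {new_hi:.1f}V")      # 216.2V - 253.0V
print(f"trimmed off the top: {old_hi - new_hi:.1f}V")  # 1.4V
# A ~240V supply sits comfortably inside both bands, so nothing about
# actual distribution had to change.
```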