We are talking about an outdoor outlet for charging a car in winter. It is much more common to have a standard 120V outlet outside than a 240V one. Even if there were a 240V outlet, it would be in a NEMA format that electric vehicles don't accept directly.
Electric kettles in the U.S. have the same problem: they boil much faster in the U.K. because a standard outlet there delivers almost double the power.
If merely having 240V at an outlet were sufficient to charge a car, we could install a transformer and inverter in the car and charge it from a 1.5V AAA battery by stepping the voltage up to 240V. Plugging a car into a 240V outlet on a 13A breaker gives you only about 2650 watts of charging power (assuming the same code rule that limits continuous draw to 85% of the breaker rating), which is just about 1 kW more than plugging into a 120V 15A outlet. Not a huge difference, and not one that makes charging bearable when the car uses more than 1.5 kW just to keep the battery warm.
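The arithmetic above can be sketched in a few lines. The 85% continuous-draw derating is the figure assumed in this thread; actual electrical codes may use a different factor (the U.S. NEC, for instance, derates continuous loads to 80%).

```python
# Usable continuous charging power from an outlet, in watts.
# The 85% derating factor is the assumption made in the comment above,
# not a universal code requirement.
def continuous_power_w(volts: float, breaker_amps: float, derate: float = 0.85) -> float:
    return volts * breaker_amps * derate

p_240 = continuous_power_w(240, 13)  # ~2652 W on a 240V/13A circuit
p_120 = continuous_power_w(120, 15)  # ~1530 W on a 120V/15A circuit

print(round(p_240), round(p_120), round(p_240 - p_120))
# the difference works out to roughly 1.1 kW
```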
An extra 1 kW would actually charge the car given time, not just keep the battery warm. 1 kWh is supposedly about 4 miles of range. Charge for 12 hours overnight and you have enough to get to the Supercharger.
In the OP's story the car was losing charge while plugged into a 110V outlet, which means that ~1.5 kW was not enough even to keep the battery warm. An extra 1 kW might or might not get it to actually charge, and even if it were charging at 1 kW it would not gain 4 miles of range per hour: a car in these conditions still spends energy keeping the battery pack at its working temperature.
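The back-and-forth above reduces to a simple energy budget: outlet power minus heating overhead, times hours, times miles per kWh. A minimal sketch, using the figures thrown around in this thread (not measured data):

```python
# Rough overnight-charging budget. All numbers are the assumptions from
# the discussion above: ~1.5 kW to keep the pack warm, ~4 mi per kWh.
def range_gained_miles(outlet_kw: float, heating_kw: float,
                       hours: float, miles_per_kwh: float = 4.0) -> float:
    # Net power left over for charging after the battery heater takes its cut;
    # a negative result means the pack drains while plugged in.
    net_kw = outlet_kw - heating_kw
    return net_kw * hours * miles_per_kwh

# 110/120V outlet barely below the heater's draw: the car loses charge.
print(range_gained_miles(1.4, 1.5, 12))   # negative

# 240V/13A outlet (~2.65 kW continuous): some real charging happens.
print(range_gained_miles(2.65, 1.5, 12))  # ~55 miles
```

This also shows why the marginal 1 kW matters so much here: near the heating threshold, a small bump in outlet power is the difference between draining and charging.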
On a side note, I find it amusing when Brits brag about their superior electricity thanks to the progressive 240V. People living in houses wired in a ring [1], which set themselves ablaze when they try to run an air conditioner, would, in my imagination, be a bit more reserved ;)