Every computer power supply I've seen in the last 10 years or so is rated for something like '100-250 V, 50-60 Hz' input. That way they work on virtually any power grid anywhere in the world; it's just a matter of having the right female IEC to male <local whatever> power cable.
A white paper I read 20-25 years ago described how those tend to work. Larger power supplies are required to be power factor corrected: they need to draw a sinusoidal current in phase with the voltage. To do that they need a power factor correction (PFC) front end. In the design the paper described, 120 VAC gets rectified and stepped up to 240 VDC, which then feeds the main DC/DC step-down converter.
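A quick numeric sketch of why the boost stage has to sit above the rectified mains peak (the 240 VDC bus figure is the commenter's recollection of that paper; many modern designs boost higher, to around 380-400 VDC):

```python
import math

v_rms = 120.0                  # US mains voltage, RMS
v_peak = v_rms * math.sqrt(2)  # peak of the rectified sine, ~170 V
v_boost = 240.0                # boost-PFC output bus, per the paper

# A boost converter can only step up, so the bus must sit above the
# rectified peak for the PFC stage to regulate across the whole cycle.
assert v_boost > v_peak
print(f"rectified peak: {v_peak:.0f} V, boost bus: {v_boost:.0f} V")
```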
I think the advantage of stepping up is that the filter capacitors are cheaper at the higher voltage, since for the same power they only handle half the current.
Cheaper and older power supplies typically have a switch to select the input voltage. DO NOT plug a power supply set to 110 V into a 240 V socket and turn it on.
Most power supplies accept 100-240 V so they can be used in multiple countries. You might be able to find a cable to connect one to a 240 V outlet, or you could wire one up, but there would be no advantage to running it at 240 V.
The only advantage of 240 V is higher power. Desktop PSUs top out at 1800 W because of the 120 V / 15 A limit. There are server PSUs with higher power ratings that need 240 V input to reach their full output.
American circuits are only designed to support a continuous load of about 80% of the breaker rating, meaning a 15 amp circuit gets you 1500-ish watts. One overclocked 4090 and one overclocked high-end Intel CPU, plus the cooling for all that, plus other parts, gets you uncomfortably close to that.
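A back-of-envelope version of that budget. The 80% continuous-load derating is standard US (NEC) practice; the component wattages and the 90% PSU efficiency figure are illustrative assumptions, not measurements:

```python
volts, amps = 120, 15
budget = volts * amps * 0.8   # 1440 W continuous on a 15 A circuit

# Illustrative worst-case DC draw for a high-end build.
parts = {
    "GPU (overclocked 4090)": 600,
    "CPU (overclocked)": 350,
    "cooling + everything else": 250,
}
dc_draw = sum(parts.values())     # 1200 W at the components
wall_draw = dc_draw / 0.9         # ~1333 W at the wall, assuming 90% PSU efficiency

print(f"{wall_draw:.0f} W of a {budget:.0f} W budget "
      f"({wall_draw / budget:.0%})")
```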
My hope is that this limit forces the computer industry to put at least one cycle into actual efficiency instead of just promising a node shrink will magically fix things, since we only have a few of those left really.
I shouldn't need to run a damn space heater to play a stupid video game. At this rate, you have to think about how much video game you plan to play in your electricity budget!
Contrast this to my house in NZ where each circuit is rated to 20 amps continuous and supplies a double socket rated to 10 amps continuous per socket at 230V (nominal). I can easily draw 2.3kW from a standard socket for 24 hours and have no issues.
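Putting the two countries' socket ratings side by side, using the continuous figures from the comments above:

```python
# NZ: a double socket rated 10 A continuous per socket at 230 V nominal.
nz = 230 * 10            # 2300 W continuous from one socket

# US: 15 A breaker at 120 V, derated to 80% for continuous loads.
us = 120 * 15 * 0.8      # 1440 W continuous

print(f"NZ: {nz} W, US: {us:.0f} W ({nz / us:.2f}x)")
```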
This is really handy in a country that doesn’t believe in central heating…
Actually, electric heat and hot water in the US frequently run off 240 volt circuits too, as do electric ranges, electric clothes dryers, and pretty much anything that draws a lot of power. These items use various standards of plugs and receptacles, and the circuits are usually rated to 30 amps (peak).
The US 120 volt, 15 amp plugs are pretty much just for ordinary plug-in appliances and other light-duty equipment. For anything serious, you can wire up a 120 V / 20 amp outlet or even a 240 V / 30 amp circuit.
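The rough capacity of the circuit types mentioned above, applying the same 80% continuous-load derating throughout:

```python
# (label, volts, breaker amps) for common US circuits
circuits = [
    ("120 V / 15 A", 120, 15),
    ("120 V / 20 A", 120, 20),
    ("240 V / 30 A", 240, 30),
]

for name, v, a in circuits:
    peak = v * a
    continuous = peak * 0.8  # 80% continuous-load derating
    print(f"{name}: {peak} W peak, {continuous:.0f} W continuous")
```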