The point is that when you run it on your own hardware you can feed the model your health data, bank statements, and private journals and be 5000% sure they’re not going anywhere.
I've been playing around with my own home-built AI server for a couple of months now. It is so much better than using a cloud provider. It is the difference between drag racing in your own car and renting one from a dealership. You are going to learn far more doing things yourself, your tools will be much more consistent, and you will walk away with a far greater understanding of every process.
A basic last-generation PC with something like a 3060 (12GB) is more than enough to get started. My current rig pulls less than 500W with two cards (3060 + 5060). And, given the current temperature outside, the rig helps heat my home, so I am not contributing to global warming, water consumption, or any other datacenter-related environmental evil.
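For anyone wondering what "getting started" looks like in practice, here's a minimal sketch using llama-cpp-python (one option among several; Ollama or LM Studio would do the same job). The model path is just a placeholder; as a rough rule of thumb, an ~8B model quantized to Q4 is around 5GB on disk, which leaves plenty of headroom on a 12GB card.

    # Minimal local-inference sketch (pip install llama-cpp-python).
    # The GGUF path is a placeholder -- any ~8B model at Q4 quantization fits in 12GB.
    from llama_cpp import Llama

    llm = Llama(
        model_path="models/llama-3.1-8b-instruct-Q4_K_M.gguf",  # placeholder path
        n_gpu_layers=-1,  # offload every layer to the GPU
        n_ctx=8192,       # context window; raise it if VRAM allows
    )

    out = llm("Summarize why local inference is good for private data:", max_tokens=200)
    print(out["choices"][0]["text"])

Nothing here leaves the machine: the weights sit on disk and inference runs entirely on the local GPU.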
Unless you normally use electric resistance heating (or some kind of fossil fuel with a higher gCO2/kWh), you don't necessarily get a free pass on the global warming thing!
Our whole home is heated with under 500W on average: at this moment, with it close to freezing outside, the heat pump is drawing 501W (H4 boundary), and its demand is intermittent.
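To put rough numbers on the "free pass" question: the 500W rig figure comes from the comment above, but the heat pump COP of ~3 and the grid/gas carbon intensities here are assumptions for illustration, so treat this as a back-of-the-envelope sketch rather than an audit.

    # Back-of-the-envelope: how much of a ~500W rig's draw is "free" because it heats the house?
    RIG_POWER_KW = 0.5         # rig draw, per the comment above
    HEAT_PUMP_COP = 3.0        # assumed coefficient of performance
    GRID_INTENSITY = 0.25      # assumed kgCO2 per kWh of electricity
    GAS_HEAT_INTENSITY = 0.20  # assumed kgCO2 per kWh of heat from a gas boiler

    # The rig turns essentially all of its draw into heat (resistance-equivalent).
    # A heat pump would have delivered the same heat using 1/COP as much electricity,
    # so only that fraction of the rig's draw is actually displaced.
    displaced_hp_draw = RIG_POWER_KW / HEAT_PUMP_COP
    net_extra_draw = RIG_POWER_KW - displaced_hp_draw
    print(f"Net extra demand vs heat-pump heating: {net_extra_draw:.2f} kW")  # ~0.33 kW

    # Against electric resistance heating the displacement is 1:1, so the net extra is ~0.
    # Against a gas boiler, compare carbon per kWh of heat delivered:
    print(f"Rig heat: {GRID_INTENSITY:.2f} kgCO2/kWh vs gas heat: {GAS_HEAT_INTENSITY:.2f} kgCO2/kWh")

Under those assumptions the takeaway matches the point above: the free pass only really holds if the heat being displaced is resistance heating or something dirtier per kWh than the local grid; with a heat pump doing the heating, roughly two thirds of the rig's draw is genuinely new demand.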