Water policy isn't as simple as you might think. Dams aren't a magical fix; they cause a lot of issues (like crashing salmon populations). They're expensive to build and maintain, and the water you store in a big reservoir doesn't magically stay in place: you lose a lot to evaporation and a lot more to seepage into the groundwater system. A much bigger part of the problem is western water law, where water rights are assigned based on prior appropriation and are lost if they aren't exercised. That leads to a lot of bullshit, like people growing very water-hungry crops (alfalfa, rice) in the middle of the desert.
The reason we don't build like the people who first came to California did isn't because we're stupid; it's because we've learned a lot of lessons the hard way. If you're interested in the history, I'd recommend Cadillac Desert, which is about western water in general but focuses a lot on California (including the machinations the movie Chinatown was based on).
Thanks for contributing these insights. Having worked with hydrologists for about 15 years, I can say water is complicated, and people who claim there are simple solutions generally don't know the domain.
A moment's reflection should make this clear. It's such a fundamental resource, touching everything we do. We just tend to take it for granted.
Can I ask why you see this as a clear-cut issue? Dams have environmental costs, upfront monetary costs, and maintenance costs, and they can't prevent drought if dry conditions persist for multiple years. Why are dams the best way to address drought?
In 2020, a federal memo and regulatory changes under Trump's first administration directed more water from Northern California to Central Valley agriculture via federal projects. The governor of California ignored them, and instead of allowing the water to flow into Southern California, his office sued over those Trump-era water rules, arguing they violated environmental protections for endangered fish. Had he done then what the current administration forced him to do, there would have been no drought in 2020 and no empty reservoirs in 2020. Given those facts, I would argue that yes, the current governor is 100% responsible for what happened.
Take a look at SB 79, a 2025 California state law (Senate Bill 79, authored by Sen. Scott Wiener) that overrides local zoning limits to allow higher-density multifamily housing near major public transit stops. It was signed into law by Governor Newsom on October 10, 2025, despite local resistance from residents.
Gavin Newsom ran on building housing, and SB 79 is him fulfilling his mandate from voters. "Local resistance by residents" is why California has some of the most expensive housing in the world.
Gavin Newsom also vetoed AB 2903, the bipartisan bill to audit the $24 billion California spent and squandered on fixing the homeless problem, which only got worse. SB 79 is another example of Newsom's intent to change zoning laws to allow developers to build high-density housing, which is what the parent comment was about. If you want to be a shill for the governor, that's your business. It looks like willful graft to me.
There would be no drought if the 2020 federal regulations had been followed. The only reason there's no drought today is that the federal government stepped in and finally opened up the water lines flowing from the North to the South.
Keep in mind there used to be a big freshwater lake (Tulare Lake) in the middle of California for at least ten thousand years.
> In 2020 federal memo and regulatory changes under Trump's first administration to send more water from Northern California to Central Valley agriculture via federal projects were ignored by the governor of california, and instead of allowing the water to flow into southern california ... had he done what the current administration forced him to do, there would be no drought in 2020, there would be no empty reservoirs in 2020.
How would diverting water from Northern California, where drought was the worst in 2020, to the Central Valley possibly end the drought?
Filling up reservoirs that are upstream by moving water downstream sounds like quite the magic trick.
1. Trump’s order in 2020 had nothing to do with fire, so it doesn’t support your position that this has anything to do with fires.
2. The water management plan has nothing to do with where water flows to fight fires.
3. A legal fight in 2020 is not caused by a bill that was passed in 2025.
> there would be no drought in 2020
That’s not how droughts work. A drought is a prolonged lack of precipitation. Moving water can reduce the problems caused by a drought, but it cannot prevent one.
All the best sites were built on long ago. Dams require favorable geography. More can be built to squeeze out a bit more storage, but there are diminishing returns.
Dams have trade-offs: they stop sediment outflows, which can cause faster erosion downstream. This is a big reason many California beaches have gone from mostly sandy to mostly rocky.
Yeah, and with California's typical topography (relatively young mountains), there's a lot of sediment at the ready that can fill dams and render them worse than useless, i.e., they cost money, lose capacity fast, and alter the river and coast.
> Almost immediately after construction, the dam began silting up. The dam traps about 30% of the total sediment in the Ventura River system, depriving ocean beaches of replenishing sediment. Initially, engineers had estimated it would take 39 years for the reservoir to fill with silt, but within a few years it was clear that the siltation rate was much faster than anticipated.
There are similar sites all over the state. If you happen to live in the LA area, the Devil's Gate Dam above Pasadena is another such site (though originally built for flood control, not for storage).
If you look into the actual design capacity of our municipal water systems, many of them were designed for far larger populations. The EBMUD, for example, intentionally secured 325 million gallons per day in upstream capacity because that was 10x the needs of the service area in 1929. Implicitly they assumed that the service area would grow to 4 million people, but it never did, primarily because of zoning. Today EBMUD delivers only about 120 MGD. We could more than double the service area population without water issues.
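A quick back-of-the-envelope check of those numbers (a sketch using only the figures quoted in the comment, not official EBMUD data):

```python
# Figures as quoted above; all values in millions of gallons per day (MGD).
secured_capacity = 325    # upstream capacity EBMUD secured in 1929
demand_1929 = secured_capacity / 10  # capacity was 10x the 1929 demand
current_demand = 120      # approximate deliveries today

headroom = secured_capacity / current_demand
print(f"1929 demand: ~{demand_1929:.1f} MGD")
print(f"Capacity is {headroom:.1f}x current demand")
```

With 325 MGD secured against ~120 MGD delivered, capacity runs about 2.7x current demand, which is where the "more than double the population" claim comes from (assuming roughly proportional per-capita use).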
The new Sites Reservoir and the capacity increase of the existing San Luis Reservoir are both expected to start construction this year. Several other recent proposals, like the Pacheco Reservoir, have been cancelled due to cost, but it is not the case that California is doing nothing regarding new water infrastructure.
Sites Reservoir isn't going to do a damned thing for municipal water systems in most of the state. You have to remember that there is no such thing as a statewide municipal water policy. Every city or region has its own thing going on. The Sites capacity is dedicated to its investors, so depending on where you live it could be a helpful resource, or it could be irrelevant.
I'd argue the question was wrong. It's not that big companies can copy you more easily now; they could always have invaded your space and destroyed your business. As others pointed out, it was always picking up the pennies they didn't want until those pennies became dollars.
The concern now is that other small teams or solo developers can rebuild what you have very quickly. This happened in the mobile space with all the decompiled and repacked (with minimal changes) apps that showed up in the stores.
The moat for SaaS startups was that the code is hidden. Now that matters less, because people use AI to try to reverse-engineer your backend from the API or even UI screenshots.
You have to pick up the pace to keep ahead of them and make sure you don't cause problems for customers while doing it.
There is some ability for it to make novel connections, but it's pretty small. You can see this yourself by having it build novel systems.
It largely cannot imagine anything beyond the usual, but there is a small degree to which it can. This is similar to in-context learning: it's weak, but it is there.
It would be incredible if meta-learning/continual learning found a way to train exactly for novel learning paths. But that's literally AGI, so maybe 20 years from now? Or never.
You can see this on CL benchmarks. There is SOME signal, but it's crazy low. When I was training CL models, I found the signal was in the single percentage points. Some could easily argue it was zero, but I really do believe there is a very small amount in there.
This is also why any novel work or findings are achieved via MASSIVE compute budgets. They find RL environments that can extract that small amount. Is it random chance? Maybe, hard to say.
Is this so different from what we see in humans? Most people do not think very creatively. They apply what they know in situations they are familiar with. In unfamiliar situations they don't know what to do and often fail to come up with novel solutions. Or maybe in areas where they are very experienced they will come up with something incrementally better than before. But occasionally a very exceptional person makes a profound connection or leap to a new understanding.
The difference is AI tooling lies to you. On day 0 you think it's perfect, but the more you use AI tools, the more you realize that using them wrong can give you gnarly bugs.
It took me a couple of days to find the right level of detail to prompt it. Too high level, and the codebase gets away from me/the tooling goes off the rails. Too low level, and I may as well do it myself. Maybe also learn the sorts of things Claude Code isn't good at yet. But once I got in the groove it was very easy from there. I think the whole process took 2-3 days.
Claude Code feels like the first commodity agent. In theory it's simple, but in practice you'll have to maintain a ton of random crap you get no value from maintaining.
My guess is eventually all "agents" will be wiped out by Claude Code or something equivalent.
Maybe the companies won't die, but all those startups will just be hooking up a generic agent wrapper and letting it do its thing directly. My bet is that the company that wins this is the one with the most training data to tune their agent to use their harness correctly.
No, GPT 5.x is very unlike GPT 4.5. GPT 5.x is much more censored and prone to second-guessing what you "really meant".
When it comes to conversation, Gemini 3 Pro right now is the closest.
When I asked it to make a nightmare Sauron would show me in the Palantir, ChatGPT 5.2 Thinking tried to make it "playful" (directly against my instructions) and went with a shallow but safe option. Gemini 3 Pro prepared something much deeper and more profound.
I don't know nearly as much about talking with Opus 4.5 - while I use it for coding daily, I don't use it as a go-to chat. As a side note, Opus 3 has a similar vibe to GPT 4.5.
That grumpy guy is using an LLM and debugging with it. He solves the problem. The AI provider fine-tunes their model with this. You now have his input baked into its responses.
How do you think these things work? It's either direct human input it's remembering, or an RL environment made by a human to solve the problem you are working on.
Nothing in it is "made up"; it's just a resolution problem, which will only get better over time.
If we simply built like the people who first came to California did, we would never have water shortages again.
Any water shortage is a 1:1 failure of the state to do the clear and obvious task needed.