The US was largely founded on white supremacy, and it has been part of its state institutions for over two hundred years. It didn't disappear at the end of the Civil War, or in the 1960s, or in 2008.
I'm afraid I don't know which definition of "state" you're operating under, so I don't think I can evaluate claims such as "white supremacy is endorsed by the state in the US."