>If you remove the need to host, you remove a point of failure from publishing
I guess it's marginally better since there's no central "place" to take it down. However, the deplatforming concern is still there because you're still probably sharing those links on the major platforms. So they can still take those down. You can argue that having it distributed among random sites is better than having it centralized, but you can achieve the same thing by uploading to multiple sites.
The proper way to do this (one that doesn't involve just shifting the problem) is to use IPFS or Tor. When you're sharing it, you can use a gateway server, but if that gets taken down or blocked, users can still access the content over the underlying network if they really want to.
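To make the gateway-fallback idea concrete, here's a rough Python sketch: try one or more public HTTP gateways, and fall back to a locally running node's gateway if they're blocked or down. The CID is a placeholder, and the gateway list is just an illustration of the general pattern, not a recommendation.

    import urllib.request

    CID = "bafy...placeholder"  # hypothetical CID of the shared content

    GATEWAYS = [
        "https://ipfs.io/ipfs/",
        "https://dweb.link/ipfs/",
        "http://127.0.0.1:8080/ipfs/",  # local node's gateway, no third party involved
    ]

    def fetch(cid: str) -> bytes:
        last_err = None
        for gw in GATEWAYS:
            try:
                with urllib.request.urlopen(gw + cid, timeout=10) as resp:
                    return resp.read()
            except OSError as err:
                last_err = err  # gateway blocked or unreachable; try the next one
        raise last_err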
How is IPFS better? To share a piece of content, you need some place to post a new link, just like with this mechanism. If anything, IPFS is more failure-prone: besides removing the link from wherever it's posted, attackers can also try to identify and DDoS the nodes hosting the content, making the links useless. With this mechanism, anyone with the link could decode the data using some offline tool/script, even if the whole Internet goes down.
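To illustrate the offline-decoding point, here's a minimal Python sketch assuming the payload is simply base64-encoded into the URL fragment; the actual mechanism under discussion may compress or structure it differently, but the point stands that no server has to be reachable to recover the content.

    import base64
    from urllib.parse import urlparse, unquote

    def decode_link(url: str) -> bytes:
        # Assumes the payload travels base64url-encoded in the fragment;
        # a real scheme might also compress or encrypt it first.
        frag = unquote(urlparse(url).fragment)
        frag += "=" * (-len(frag) % 4)  # restore stripped base64 padding
        return base64.urlsafe_b64decode(frag)

    if __name__ == "__main__":
        # Hypothetical link; decoding needs no network access at all.
        payload = base64.urlsafe_b64encode(b"hello, offline world").decode().rstrip("=")
        print(decode_link("https://example.com/view#" + payload))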
This and IPFS/Tor are not mutually exclusive. The one sticking point is that this requires JavaScript to work, which is disabled by default in most Tor setups.