Think of it more like this: If I were selling you a car and said it would last for years, would you expect it to fall apart after two years? I certainly wouldn't. When talking about small quantities we tend to specify an exact number (two, three); as the range becomes larger and less exact, we use generalities (years). Because of this, "years" would typically refer to a span of at least 3-5 years, and I would argue even longer.
Even given that convention, a car is a far steeper investment than this ring. The more you invest in something, the more you expect to get out of it, and this ring is designed for a very low investment point while still being highly durable (there are other similar rings out there at even lower price points, but they probably won't survive anything beyond a sprinkle).
> A hundred years from now, thanks to the workings of the Inhuman Centipede, I’m known as a deservedly obscure dadaist prose stylist who thought it was cool to stop his books mid-sentence.
96% AI-generated, according to GPTZero.
Which I wouldn't mind, honestly, if it had something useful, insightful, or original to say.
In a way I'm glad it doesn't seem to be written by a human:
> What used to be a disagreement becomes “emotional labor.”
> A bad mood gets labeled “toxic energy.”
This sounds like someone who dismisses his partner's feelings as fragmented memes and sees her as almost brainwashed by the algorithm.
It contrasts this against a time when a relationship was something entirely different, when he could know everyone she was interacting with.
> And it doesn’t stop there.
> She has friends.
God forbid...
If this were a person and not an AI, they would sound incredibly controlling. Maybe the "toxicity" and "red flag" ideas didn't form in a vacuum?