There’s an interesting hypothetical situation here:
Let’s say that use of AI-generated SEO to game search and recommendation algorithms becomes very widespread. This drives adoption of summarizers, because reading these articles is a chore.
The result is a whole big chunk of “shadow-text” that goes unread by users BUT is still being used to drive ranking and discoverability.
There’s essentially a divorce between the content used to rank and the content delivered to the user, which could result in a couple of different outcomes:
- search is forced to adapt in a way that brings these into alignment, so ranking is driven by the content people want to see and isn’t easily gamed
- SEO is allowed to get really, really weird because you can throw whatever text you want in there knowing that users will never see it