Hacker News

I suspect that web traffic will come to encompass both. Many websites (government ones in particular) aren't interested in offering API-based access patterns.

This kind of pattern lets you serve both human users and agents with a single interface.



This would be ideal. The only issue here is trust. If my website relies on advertising, then of course I would prefer to serve the full content to a human visitor.

So what do I do? Bot-protect my site and redirect AI agents to a minimal section, one that presumably expects something of value in return?

People will just breach this trust, like OP, and abuse tools like Selenium (as they always have) to imitate a human.


I think this is pretty interesting -- I wonder if websites could allow agents to self-identify, and not count them toward advertising CPM, to prevent dilution of the advertising metrics.

Perhaps something similar to robots.txt is in order (agents.txt?)
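A minimal sketch of how that might work, assuming a hypothetical agents.txt simply reuses the robots.txt syntax: Python's stdlib robots.txt parser can be fed the file's lines directly. The file contents, the agent name "ExampleAgent", and the example.com URLs are all made up for illustration.

```python
# Hypothetical: parse an "agents.txt" that reuses the robots.txt
# format, using Python's standard-library robots.txt parser.
from urllib.robotparser import RobotFileParser

# Made-up policy: a self-identifying agent may read content but is
# barred from ad-serving paths; unidentified agents get nothing.
agents_txt = """\
User-agent: ExampleAgent
Disallow: /ads/
Allow: /

User-agent: *
Disallow: /
"""

parser = RobotFileParser()
parser.parse(agents_txt.splitlines())

# Rules within an entry are applied in order, first match wins.
print(parser.can_fetch("ExampleAgent", "https://example.com/ads/banner"))  # False
print(parser.can_fetch("ExampleAgent", "https://example.com/data"))        # True
print(parser.can_fetch("OtherBot", "https://example.com/data"))            # False
```

A site could then serve such agents a lean, ad-free response and leave them out of its CPM counts, though as noted above this still depends on agents identifying themselves honestly.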



