User-Agent is a common HTTP header that typically indicates the name, version, and a URL for the application making the request. It is also a robots_txt directive (User-agent:) that addresses specific robots by name and tells them to obey the rules that follow.
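For illustration, a minimal robots.txt sketch (the paths and the bot name are hypothetical) showing how the User-agent: directive scopes the rules that follow it:

```
# Rules for all robots
User-agent: *
Disallow: /drafts/

# Rules only for a robot that identifies itself as "ExampleBot"
User-agent: ExampleBot
Disallow: /
```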
As a simple anti-spam measure, many sites (reportedly WordPress installs in particular) block requests from default User-Agents like curl and python-requests, so it can be useful to specify your own to avoid these restrictions.
If your application goes haywire and starts spamming a site with too many requests, a descriptive User-Agent also gives the receiver a clue where the traffic is coming from so they can let you know!
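A minimal sketch of sending such a descriptive User-Agent with Python's standard library (the application name and URL are hypothetical placeholders for your own):

```python
import urllib.request

# Descriptive User-Agent: app name/version plus a URL where the
# operator of the receiving site can learn more and reach you.
# (Hypothetical values; substitute your own project's.)
USER_AGENT = "MyWebmentionSender/1.0 (https://example.com/about)"

def fetch(url: str) -> bytes:
    """Fetch url with a custom User-Agent instead of the default
    'Python-urllib/3.x', which some sites block as a spam measure."""
    req = urllib.request.Request(url, headers={"User-Agent": USER_AGENT})
    with urllib.request.urlopen(req) as resp:
        return resp.read()

# Build (but do not send) a request to show the header that would go out.
req = urllib.request.Request("https://example.com/",
                             headers={"User-Agent": USER_AGENT})
print(req.get_header("User-agent"))
```

The same effect with curl would be `curl -A "MyWebmentionSender/1.0 (https://example.com/about)" https://example.com/`, or with python-requests by passing `headers={"User-Agent": ...}` to `requests.get`.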
- Pelle Wessman's webmention endpoint sends "A-WebMention-Endpoint/0.8.1 (https://github.com/voxpelli/webpage-webmentions)"
Various silos use specific User-Agents when they crawl a page, e.g. to create URL previews. You can use this to give them special treatment, e.g. serve silo-specific tags only to them, or work around HTTPS issues, …
- Twitter's contains Twitterbot