Computer robots are simply programs that automate repetitive tasks at speeds impossible for humans to reproduce. On the internet, the term "bot" usually describes any program that interfaces with users or collects data.
Search engines use "spiders," software programs that crawl (or "spider") the web for information, requesting pages much like regular browsers do. In addition to reading the contents of pages for indexing, spiders also record links.
- Link citations can be used as a proxy for editorial trust.
- Link anchor text may help describe what a page is about.
- Link co-citation data may be used to help determine what topical communities a page or website exists in.
- Additionally, links are stored to help search engines discover new documents to crawl later.
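As a rough sketch of the link recording described above, the snippet below uses Python's standard-library `html.parser` to pull each link's target URL and anchor text out of a page. The `LinkRecorder` class and the sample HTML are illustrative assumptions, not any search engine's actual code, and a real spider would also fetch pages over the network and queue the discovered URLs for later crawling.

```python
from html.parser import HTMLParser

class LinkRecorder(HTMLParser):
    """Records each link's target URL and its anchor text,
    much as a spider does while indexing a page."""
    def __init__(self):
        super().__init__()
        self.links = []            # list of (href, anchor_text) pairs
        self._current_href = None  # href of the <a> tag we are inside, if any
        self._text_parts = []      # anchor text accumulated so far

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._current_href = dict(attrs).get("href")
            self._text_parts = []

    def handle_data(self, data):
        if self._current_href is not None:
            self._text_parts.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._current_href is not None:
            self.links.append(
                (self._current_href, "".join(self._text_parts).strip())
            )
            self._current_href = None

# Hypothetical page fragment for illustration.
page = '<p>See <a href="https://example.com/widgets">blue widgets</a> here.</p>'
recorder = LinkRecorder()
recorder.feed(page)
print(recorder.links)  # [('https://example.com/widgets', 'blue widgets')]
```

The recorded pairs correspond to the list above: the URL feeds the crawl queue, while the anchor text helps describe what the linked page is about.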
Another example is the chatterbot, a bot focused heavily on a specific topic. These bots attempt to act like a human and converse with humans on that topic.