
Bots, also known as web robots, are software applications that run automated tasks over the Internet. Typically they perform tasks that are simple and structurally repetitive, at a much higher rate than a human could manage alone. Notable examples are spiders, crawlers, scrapers, spam bots, and game bots.

Bots are built to automate human browsing activity, usually for commercial advantage. There are currently three main fields of application for them:

  • Searching: crawling the web, mapping hyperlink structure, and parsing and/or storing the content found (see the sketch after this list)
  • Advertisement: posting advertisements on resources that technically allow it, such as blogs, forums, and social networks
  • Games / Gambling / Trading: "gold farming", "eBay snipers", "auto-bidding", etc.
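
To make the first category concrete, here is a minimal breadth-first crawler sketched in Python using only the standard library. The seed URL, page limit, and parsing details are illustrative assumptions rather than a reference implementation; a production crawler would also honor robots.txt and throttle its requests.

    # Minimal breadth-first crawler sketch (standard library only).
    # The seed URL below is a placeholder, not from the original text.
    from collections import deque
    from html.parser import HTMLParser
    from urllib.parse import urljoin
    from urllib.request import urlopen

    class LinkExtractor(HTMLParser):
        """Collect the href attribute of every <a> tag on a page."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def crawl(seed, max_pages=10):
        """Fetch pages breadth-first, recording the hyperlink structure."""
        queue, seen, graph = deque([seed]), {seed}, {}
        while queue and len(graph) < max_pages:
            url = queue.popleft()
            try:
                html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
            except OSError:
                continue  # skip unreachable or malformed pages
            parser = LinkExtractor()
            parser.feed(html)
            # Resolve relative links against the current page's URL.
            graph[url] = [urljoin(url, href) for href in parser.links]
            for link in graph[url]:
                if link.startswith("http") and link not in seen:
                    seen.add(link)
                    queue.append(link)
        return graph

    if __name__ == "__main__":
        for page, links in crawl("https://example.com").items():
            print(page, "->", len(links), "links")

The breadth-first queue is what lets such a bot map hyperlink structure far faster than any human clicking through pages.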

Even though the first category is practically mainstream, bots are perceived negatively by society at large. Site owners actively refuse to let unknown bots use their sites, by all technical and legal means; CAPTCHA is the most famous artifact of this "war".
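
One of the most basic of those technical means is the robots.txt exclusion protocol, which a well-behaved bot consults before fetching anything. Below is a minimal sketch using Python's standard urllib.robotparser; the site URL and user-agent string are placeholder assumptions:

    # Check robots.txt before crawling -- a sketch of one "technical means"
    # site owners use to keep unknown bots out. URL and agent are placeholders.
    from urllib.robotparser import RobotFileParser

    rp = RobotFileParser("https://example.com/robots.txt")
    rp.read()  # fetch and parse the site's robots.txt

    # A polite bot only proceeds when the rules allow its user-agent.
    if rp.can_fetch("MyBot/1.0", "https://example.com/some/page"):
        print("allowed to fetch")
    else:
        print("disallowed; a well-behaved bot backs off")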

Bots can be written in any language that supports network interaction, but they are most often written in PHP, Perl, or Python. Modern bots are often built on web-testing frameworks such as Selenium or Watir, which let them closely simulate human behavior and work with AJAX sites.
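
As an illustration of that framework-based approach, here is a minimal Selenium sketch in Python (Selenium 4+ with a local Chrome install is assumed; the target URL and the element waited for are placeholders). Unlike a plain HTTP client, the bot drives a real browser, so AJAX-rendered content becomes available exactly as a human visitor would see it.

    # Minimal Selenium sketch: drive a real browser so JavaScript/AJAX
    # content renders before scraping. Assumes Selenium 4+ and Chrome.
    from selenium import webdriver
    from selenium.webdriver.common.by import By
    from selenium.webdriver.support.ui import WebDriverWait
    from selenium.webdriver.support import expected_conditions as EC

    driver = webdriver.Chrome()  # Selenium 4 locates the driver binary itself
    try:
        driver.get("https://example.com")  # placeholder URL
        # Wait for a dynamically loaded element instead of parsing raw HTML --
        # this is what plain HTTP clients cannot do.
        heading = WebDriverWait(driver, 10).until(
            EC.presence_of_element_located((By.TAG_NAME, "h1"))
        )
        print(heading.text)
    finally:
        driver.quit()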

Notice: bots should not be confused with "zombie PCs", the individual compromised machines that make up a botnet.