BotSeer was a Web-based information system and search tool used for research on Web robots and on trends in Robot Exclusion Protocol deployment and adherence. It was created and designed by Yang Sun,[1] Isaac G. Councill,[2] Ziming Zhuang[3] and C. Lee Giles. BotSeer was in operation from approximately 2007 to 2010.
BotSeer served as a resource for studying the regulation and behavior of Web robots, as well as a guide to writing effective robots.txt files and crawler implementations. It was publicly available on the World Wide Web, hosted by the College of Information Sciences and Technology at the Pennsylvania State University.
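A robots.txt file of the kind BotSeer studied is a plain-text file served at a site's root that tells crawlers which paths they may fetch. A minimal illustrative example (the user-agent name and paths here are hypothetical, not drawn from BotSeer's corpus):

    User-agent: Googlebot
    Disallow: /private/

    User-agent: *
    Disallow: /

Under these rules, Googlebot may fetch everything except paths under /private/, while all other robots are excluded from the entire site.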
BotSeer provided services including robots.txt searching, robot bias analysis,[4][5] and robot-generated log analysis. The prototype of BotSeer also allowed users to search 6,000 documentation files and the source code of 18 open source crawler projects.
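Robot bias in this sense refers to robots.txt rules that favor some crawlers over others, and it can be checked programmatically. A minimal sketch using Python's standard urllib.robotparser module (the site URL and robot names are hypothetical, and this is an illustration of the idea rather than BotSeer's implementation):

    # Report which named robots a site's robots.txt permits to fetch its front page.
    from urllib.robotparser import RobotFileParser

    parser = RobotFileParser()
    parser.set_url("https://example.org/robots.txt")  # hypothetical site
    parser.read()  # download and parse the rules

    for agent in ["Googlebot", "msnbot", "MyResearchBot"]:
        allowed = parser.can_fetch(agent, "https://example.org/")
        print(f"{agent}: {'allowed' if allowed else 'disallowed'}")

A site is biased in this sense when can_fetch returns different answers for different user agents on the same URL.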
BotSeer indexed and analyzed 2.2 million robots.txt files obtained from 13.2 million websites, along with a large Web server log of real-world robot behavior and related analysis. BotSeer's goal was to assist researchers, webmasters, web crawler developers and others with research and information needs related to Web robots. However, BotSeer was received negatively by some, who argued that it contradicted the purpose of the robots.txt convention.[6]
BotSeer also set up a honeypot[7] to test the ethics, performance and behavior of web crawlers.
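A common form of such a honeypot works by publishing URLs that robots.txt disallows and then scanning the server's access log for robots that fetched them anyway. The following minimal sketch illustrates the general technique rather than BotSeer's actual setup, assuming a Combined Log Format access log named access.log and a hypothetical disallowed path /honeypot/:

    # Flag user agents that fetched a path disallowed by robots.txt,
    # i.e. crawlers that ignore the Robots Exclusion Protocol.
    import re

    DISALLOWED = "/honeypot/"  # path listed under Disallow: in robots.txt
    # Combined Log Format ends with: "method path protocol" status size "referer" "user-agent"
    LOG_LINE = re.compile(r'"(?:GET|HEAD) (\S+) [^"]*" \d+ \S+ "[^"]*" "([^"]*)"')

    violators = set()
    with open("access.log") as log:
        for line in log:
            match = LOG_LINE.search(line)
            if match and match.group(1).startswith(DISALLOWED):
                violators.add(match.group(2))  # record the offending user agent

    for agent in sorted(violators):
        print(agent)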