Robots.txt 2.0
-
Version 4.39
Robots.txt is a visual editor for Robot Exclusion
Files and a log analyzer. It allows a user to quickly
create the robots.txt files that instruct search engine
spiders which parts of a Web site are not to be indexed
and made searchable by the general Web public, and then
to identify spiders that do not comply with those
instructions. The program lets the user log on to an FTP
or local network server and select the documents and
directories that are not to be made searchable.
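
For illustration, a minimal robots.txt file of the kind
this editor generates might look like the following; the
directory names are placeholders, not part of the product:

    # Keep all spiders out of these directories
    User-agent: *
    Disallow: /private/
    Disallow: /drafts/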
With this program you will be able to:
- visually generate industry-standard robots.txt files;
- identify malicious and unwanted spiders and ban them
  from your site (see the example below);
- direct search engine crawlers to the appropriate pages
  for multilingual sites;
- use robots.txt files for doorway page management;
- keep spiders out of sensitive and private areas of your
  Web site;
- upload correctly formatted robots.txt files directly to
  your FTP server without switching from Robots.txt Editor;
- track spider visits;
- create spider visit reports in HTML.
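
For example, banning an unwanted crawler from the whole
site takes an entry of this form; the user-agent name here
is only a placeholder for the spider being blocked:

    # Deny one specific crawler access to everything
    User-agent: BadBot
    Disallow: /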