Publishers punt new web crawler blocking standards

A long-awaited new standard designed to give webmasters more control over how search engines and newsreaders access their content will be unveiled in New York today. After a year-long pilot, the Automated Content Access Protocol (ACAP) will be launched at the headquarters of the Associated Press. It aims to improve on the …

COMMENTS

  1. Anonymous Coward
    Pirate

    Or...

    Just make a really horrible puke-coloured website so nobody will bother reading the content - like this: http://www.the-acap.org/

  2. Mark

    Regarding Copiepresse

    They also sued Google when Google stopped indexing their site.

    Missed that bit out. Maybe Orlowski has to agree your copy before it can be printed...

  3. Chris Williams (Written by Reg staff)

    Re: Regarding Copiepresse

    Erm, it's not in the copy because it's not relevant.

    I mention the Copiepresse-Google News case as one of a few examples of publishers worried about how their content is used, which is why the ACAP project was started - relevant.

    The same publisher suing because the dominant search engine no longer indexes them, in a tit-for-tat move - interesting, but irrelevant to this story.

    See you at the tinfoil hat shop, anyway.

  4. Mark

    tinfoil not wanted

    I *like* to see how people are trying to control me!

  5. Gareth
    Thumb Down

    Solution looking for a problem?

    Doesn't HTTP already support an "Expires" header? Can't a page be unpublished from a site or moved into an authenticated area of the site when it should no longer be visible to search engines?
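
    For anyone who hasn't met it: "Expires" is the standard HTTP caching header, and a minimal sketch of a server sending one looks like the snippet below - this assumes a stock Python 3 standard library server; the handler name, port and 24-hour lifetime are just placeholders, nothing ACAP-specific.

    from email.utils import formatdate
    from http.server import BaseHTTPRequestHandler, HTTPServer
    import time

    # 'ExpiringPage' and port 8000 are placeholder names for this sketch.
    class ExpiringPage(BaseHTTPRequestHandler):
        def do_GET(self):
            # Serve the page, but tell caches it goes stale 24 hours from now.
            self.send_response(200)
            self.send_header("Content-Type", "text/html; charset=utf-8")
            self.send_header("Expires", formatdate(time.time() + 86400, usegmt=True))
            self.end_headers()
            self.wfile.write(b"<html><body>Time-limited article</body></html>")

    if __name__ == "__main__":
        HTTPServer(("localhost", 8000), ExpiringPage).serve_forever()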

    robots.txt is a small, elegant, simple solution for keeping crawlers away from non-content areas which still need to be publicly accessible (e.g. JavaScript files, site templates, etc.).
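
    As a sketch of how that check plays out in practice, a polite crawler fetches robots.txt and tests each URL against it before crawling - Python's standard library even ships a parser for exactly this; the domain, bot name and paths below are made-up placeholders.

    from urllib import robotparser

    # Placeholder site and user-agent; a real crawler would use its own.
    rp = robotparser.RobotFileParser()
    rp.set_url("http://example.com/robots.txt")
    rp.read()  # fetch and parse the site's robots.txt

    # With a rule like "Disallow: /templates/", the template files stay publicly
    # fetchable for browsers while well-behaved crawlers skip them.
    print(rp.can_fetch("ExampleBot", "http://example.com/templates/site.js"))
    print(rp.can_fetch("ExampleBot", "http://example.com/articles/today.html"))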

    This seems like rather a lot of committee-designed, overly complex, redundant cruft that tries to force the "old media" way of doing things onto the web.

  6. Mark

    Re: Solution looking for a problem

    The problem was that the content producers didn't want to go to the effort of understanding and obeying the current standards of the internet (such as: if your webserver is asked for a page and hands it over, you're allowing the recipient to copy it).

    They want all the web indexers to change THEIR stuff just to make the publishers' lives easier.

  7. DV Henkel-Wallace
    Boffin

    Yeah, yeah, been there before

    Err, ACAP was defined in RFC 2244 exactly 10 years ago, in November 1997?

    I predict today's ACAP will be just as popular and effective as ACAP was 10 years ago.

This topic is closed for new posts.