Automated Content Access Protocol
Automated Content Access Protocol ("ACAP") was proposed in 2006 as a method of providing machine-readable permissions information for content, in the hope that it would allow automated processes (such as search-engine web crawling) to comply with publishers' policies without the need for human interpretation of legal terms. ACAP was developed by organisations that claimed to represent sections of the publishing industry (the World Association of Newspapers, the European Publishers Council and the International Publishers Association). It was intended to provide support for more sophisticated online publishing business models, but was criticised for being biased towards the fears of publishers who see search and aggregation as a threat rather than as a source of traffic and new readers.

Status

In November 2007 ACAP announced that the first version of the standard was ready. No non-ACAP members, whether publishers or search engines, have adopted it so far, and a Google spokesman appeared to rule out adoption: in March 2008, Google's CEO Eric Schmidt stated that "At present it does not fit with the way our systems operate". No progress has been announced since those remarks, and Google, along with Yahoo and MSN, has since reaffirmed its commitment to the use of robots.txt and Sitemaps.

In 2011 management of ACAP was turned over to the International Press Telecommunications Council, and it was announced that ACAP 2.0 would be based on the Open Digital Rights Language (ODRL) 2.0.

Previous milestones

In April 2007 ACAP commenced a pilot project in which the participants and technical partners undertook to specify and agree various use cases for ACAP to address. A technical workshop, attended by the participants and invited experts, was held in London to discuss the use cases and agree next steps.

By February 2007 the pilot project had been launched and its participants announced.

By October 2006, ACAP had completed a feasibility stage and was formally announced at the Frankfurt Book Fair on 6 October 2006. A pilot program commenced in January 2007, involving a group of major publishers and media groups working alongside search engines and other technical partners.

ACAP and search engines

ACAP rules can be considered as an extension to the Robots Exclusion Standard (or "robots.txt") for communicating website access information to automated web crawlers.
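The robots.txt baseline that ACAP extends can be exercised with Python's standard-library parser; the domain and paths below are illustrative only:

```python
import urllib.robotparser

# robots.txt can express only a binary allow/disallow decision per
# path -- the limitation that ACAP set out to address.
ROBOTS_TXT = """\
User-agent: *
Disallow: /archive/
Allow: /news/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

print(rp.can_fetch("*", "https://example.com/news/today.html"))     # allowed
print(rp.can_fetch("*", "https://example.com/archive/2007.html"))   # disallowed
```

Every query reduces to a yes/no answer per crawler and path; nothing in the format can express conditions on indexing, snippet display or retention.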

It has been suggested that ACAP is unnecessary, since the robots.txt protocol already exists for the purpose of managing search engine access to websites. However, others support ACAP’s view that robots.txt is no longer sufficient. ACAP argues that robots.txt was devised at a time when both search engines and online publishing were in their infancy and as a result is insufficiently nuanced to support today’s much more sophisticated business models of search and online publishing. ACAP aims to make it possible to express more complex permissions than the simple binary choice of “inclusion” or “exclusion”.
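To illustrate the extra expressiveness ACAP aimed for, the fragment below sketches how ACAP 1.0 layered "ACAP-" prefixed fields on top of robots.txt, distinguishing crawling from indexing rather than offering a single binary choice. The specific field names and paths here are illustrative assumptions modelled on the published ACAP 1.0 approach, not a normative example from the specification:

```
# Conventional robots.txt: one binary choice per path
User-agent: *
Disallow: /archive/

# ACAP-style extension fields (illustrative)
ACAP-crawler: *
ACAP-allow-crawl: /news/
ACAP-disallow-index: /news/premium/
```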

As an early priority, ACAP is intended to provide a practical and consensual solution to some of the rights-related issues which in some cases have led to litigation between publishers and search engines.

No public search engines recognise ACAP. Only one, Exalead, ever confirmed that it would adopt the standard, and it has since ceased functioning as a search portal to focus on the software side of its business.

Comment and debate

The project has generated considerable online debate in the search, content and intellectual property communities. The common themes of the commentary are that keeping the specification simple will be critical to its successful implementation, and that the aims of the project are focussed on the needs of publishers rather than readers; many have seen the latter as a flaw.

The source of this article is wikipedia, the free encyclopedia.  The text of this article is licensed under the GFDL.
 