
February 10, 2014

Text Retrieval Conference (TREC) 2014

Filed under: Conferences, TREC — Patrick Durusau @ 11:33 am

Text Retrieval Conference (TREC) 2014

Schedule:

As soon as possible
Submit your application to participate in TREC 2014 as described below. Submitting an application will add you to the active participants’ mailing list. On Feb 26, NIST will announce a new password for the “active participants” portion of the TREC web site.

Beginning March 1
Document disks used in some existing TREC collections distributed to participants who have returned the required forms. Please note that no disks will be shipped before March 1.

July–August
Results submission deadline for most tracks. Specific deadlines for each track will be included in the track guidelines, which will be finalized in the spring.

September 30 (estimated)
Relevance judgments and individual evaluation scores due back to participants.

November 18–21
TREC 2014 conference at NIST in Gaithersburg, Md., USA

From the webpage:

The Text Retrieval Conference (TREC) workshop series encourages research in information retrieval and related applications by providing a large test collection, uniform scoring procedures, and a forum for organizations interested in comparing their results. Now in its 23rd year, the conference has become the major experimental effort in the field. Participants in the previous TREC conferences have examined a wide variety of retrieval techniques and retrieval environments, including cross-language retrieval, retrieval of web documents, multimedia retrieval, and question answering. Details about TREC can be found at the TREC web site, http://trec.nist.gov.

You are invited to participate in TREC 2014. TREC 2014 will consist of a set of tasks known as “tracks”. Each track focuses on a particular subproblem or variant of the retrieval task as described below. Organizations may choose to participate in any or all of the tracks. Training and test materials are available from NIST for some tracks; other tracks will use special collections that are available from other organizations for a fee.

Dissemination of TREC work and results other than in the (publicly available) conference proceedings is welcomed, but the conditions of participation specifically preclude any advertising claims based on TREC results. All retrieval results submitted to NIST are published in the Proceedings and are archived on the TREC web site. The workshop in November is open only to participating groups that submit retrieval results for at least one track and to selected government invitees.

The eight (8) tracks:

Clinical Decision Support Track: The clinical decision support track investigates techniques for linking medical cases to information relevant for patient care.

Contextual Suggestion Track: The Contextual Suggestion track investigates search techniques for complex information needs that are highly dependent on context and user interests.

Federated Web Search Track: The Federated Web Search track investigates techniques for the selection and combination of search results from a large number of real on-line web search services.

Knowledge Base Acceleration Track: The Knowledge Base Acceleration track aims to develop techniques that dramatically improve the efficiency of (human) knowledge base curators by having the system suggest modifications/extensions to the KB based on its monitoring of data streams.

Microblog Track: The Microblog track examines the nature of real-time information needs and their satisfaction in the context of microblogging environments such as Twitter.

Session Track: The Session track aims to provide the necessary resources in the form of test collections to simulate user interaction and help evaluate the utility of an IR system over a sequence of queries and user interactions, rather than for a single “one-shot” query.

Temporal Summarization Track: The goal of the Temporal Summarization track is to develop systems that allow users to efficiently monitor the information associated with an event over time.

Web Track: The goal of the Web track is to explore and evaluate Web retrieval technologies that are both effective and reliable.

As of the date of this post, only the Clinical Decision Support Track webpage has been updated for the 2014 conference. The others will follow in due time.

Apologies for the late notice, but since the Legal Track doesn’t appear this year, TREC dropped off my radar.

Application Details

Organizations wishing to participate in TREC 2014 should respond to this call for participation by submitting an application. Participants in previous TRECs who wish to participate in TREC 2014 must submit a new application. To apply, submit the online application at: http://ir.nist.gov/trecsubmit.open/application.html
