Evaluation campaigns and TRECVid

Smeaton, Alan F. and Over, Paul and Kraaij, Wessel (2006) Evaluation campaigns and TRECVid. In: MIR 2006 - 8th ACM SIGMM International Workshop on Multimedia Information Retrieval, 26-27 October 2006, Santa Barbara, CA, USA.

Full text available as: PDF (441 KB)

Abstract

The TREC Video Retrieval Evaluation (TRECVid) is an international benchmarking activity to encourage research in video information retrieval by providing a large test collection, uniform scoring procedures, and a forum for organizations interested in comparing their results. TRECVid completed its fifth annual cycle at the end of 2005 and in 2006 TRECVid will involve almost 70 research organizations, universities and other consortia. Throughout its existence, TRECVid has benchmarked both interactive and automatic/manual searching for shots from within a video corpus, automatic detection of a variety of semantic and low-level video features, shot boundary detection and the detection of story boundaries in broadcast TV news. This paper will give an introduction to information retrieval (IR) evaluation from both a user and a system perspective, highlighting that system evaluation is by far the most prevalent type of evaluation carried out. We also include a summary of TRECVid as an example of a system evaluation benchmarking campaign and this allows us to discuss whether such campaigns are a good thing or a bad thing. There are arguments for and against these campaigns and we present some of them in the paper concluding that on balance they have had a very positive impact on research progress.

Item Type: Conference or Workshop Item (Paper)
Event Type: Workshop
Refereed: Yes
Additional Information: Workshop held in conjunction with ACM Multimedia 2006
Uncontrolled Keywords: Evaluation; Benchmarking; Video Retrieval
Subjects: Computer Science > Digital video; Computer Science > Information retrieval
DCU Faculties and Centres: Research Initiatives and Centres > Centre for Digital Video Processing (CDVP); Research Initiatives and Centres > Adaptive Information Cluster (AIC)
Publisher: Association for Computing Machinery
Official URL: http://dx.doi.org/10.1145/1178677.1178722
Copyright Information: © ACM, 2006. This is the author's version of the work. It is posted here by permission of ACM for your personal use. Not for redistribution.
Funders: Science Foundation Ireland, SFI 03/IN.3/I361
ID Code: 415
Deposited On: 03 Apr 2008 by DORAS Administrator. Last Modified: 05 May 2010 13:06