Project

RETRIEVAL - An Online Performance Evaluation Tool for Information Retrieval Methods (http://retrieval.ceti.gr)

Goal: A Web-based integrated information retrieval performance evaluation platform. It offers a number of metrics that are popular within the scientific community, composing an efficient framework for executing different evaluation scenarios.


Project log

Anestis Koutsoudis
added an update
RETRIEVAL provides a platform where you can evaluate your retrieval competition results. Contact us to find out how :-)
 
Well done and congratulations
 
George Alexis Ioannakis
added a research item
Despite numerous recent efforts, 3D object retrieval based on partial shape queries remains a challenging problem, far from being solved. The problem can be defined as: given a partial view of a shape as a query, retrieve all partially similar 3D models from a repository. The objective of this track is to evaluate the performance of partial 3D object retrieval methods for partial shape queries of various scan qualities and degrees of partiality. This retrieval problem is often encountered in cultural heritage applications, where partial scans of objects are used to query a dataset of geometrically distinct classes.
George Alexis Ioannakis
added a research item
Performance evaluation is one of the main research topics in information retrieval. Evaluation metrics are used to quantify various performance aspects of a retrieval method. These metrics assist in identifying the optimum method for a specific retrieval challenge and also allow fine-tuning of its parameters in order to achieve robust operation for a given set of requirements. In this work, we present RETRIEVAL, a Web-based integrated information retrieval performance evaluation platform. It offers a number of metrics that are popular within the scientific community, composing an efficient framework for implementing performance evaluation. We discuss the functionality of RETRIEVAL by citing important aspects such as the data input approaches, the user-level parameterisation of performance metrics, the evaluation scenarios, the interactive plots and the performance reports repository that offers both archiving and download functionalities.
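As a concrete illustration of the kind of metric such a platform reports, the following sketch computes precision-recall points and average precision for a single query from a binary ranked list (1 = relevant, 0 = non-relevant). The function name and the list-based input format are assumptions made for this example, not RETRIEVAL's actual API.

```python
# Minimal sketch: precision-recall points and average precision for one
# query, from a binary ranked list (1 = relevant, 0 = non-relevant).
# The input format is an illustrative assumption, not RETRIEVAL's API.

def precision_recall_ap(ranked_relevance, total_relevant):
    hits = 0
    precisions_at_hits = []     # precision measured at each relevant rank
    curve = []                  # (recall, precision) points down the list
    for rank, rel in enumerate(ranked_relevance, start=1):
        if rel:
            hits += 1
            precisions_at_hits.append(hits / rank)
        curve.append((hits / total_relevant, hits / rank))
    ap = sum(precisions_at_hits) / total_relevant if total_relevant else 0.0
    return curve, ap

# Example: 10 retrieved items, 4 relevant items exist in the collection.
curve, ap = precision_recall_ap([1, 0, 1, 1, 0, 0, 1, 0, 0, 0], total_relevant=4)
print(f"Average precision: {ap:.3f}")
```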
George Alexis Ioannakis
added an update
The Help Index has been updated in order to enhance RETRIEVAL's interoperability:
  1. The GUI Appendix section has been reorganized and enriched. In particular, apart from the video tutorials coupled with each data structure, an appendix with instructions and screenshots related to GUI components, file uploading procedures, evaluation setup and performance report visualisation has been added.
  2. A "Using Retrieval - Video Tutorials" section has been added.
  3. The File Organisation section has been enriched with figures and textual information.
RETRIEVAL's website: http://retrieval.ceti.gr
 
Anestis Koutsoudis
added 2 research items
The continuous evolution of 3D computer graphics and the progress of 3D digitization systems have resulted in a continuous increase in the available 3D content. The widespread use of 3D objects in diverse domains has contributed to forming 3D object retrieval as an active research field. In order to objectively evaluate the performance of retrieval methodologies, objective benchmarking schemes are needed. In this work, we provide a comprehensive overview of the state-of-the-art evaluation methodologies, including not only the performance measures but also the corresponding benchmark datasets. Meaningful benchmark datasets are discussed, while a detailed list of publicly available 3D model repositories is given, organized in terms of application domains, content magnitude and data types.
Anestis Koutsoudis
added a research item
Performance benchmarking is an absolute necessity when attempting to objectively quantify the performance of content-based retrieval methods. For many years, a number of plot-based and scalar-based measures, in combination with benchmark datasets, have been used to provide objective results. In this work, we present the first version of an integrated on-line content-based retrieval evaluation tool, named RETRIEVAL 3D, which can be used to quantify the performance of a retrieval method. The current version of the system offers a set of popular performance measures that can be accessed through a dynamic visualisation environment. The user is able to upload retrieval results using different input data structures (e.g. binary ranked lists, floating point ranked lists, dissimilarity matrices and ground-truth data) that are already encountered in the literature, including the SHREC competition series. Moreover, the system provides evaluation mechanisms for benchmark datasets that are well known within the retrieval research community. It offers performance measure parameterisation that enables the user to examine specific aspects of the evaluated retrieval method. Performance report archiving and downloading are some of the system's user-oriented functionalities.
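To make the supported input data structures more concrete, here is a small sketch that converts a dissimilarity matrix plus class labels (acting as ground truth) into per-query binary ranked lists. The variable names and label format are illustrative assumptions; the abstract does not specify the system's internal representation.

```python
# Minimal sketch (illustrative only): turn a dissimilarity matrix and class
# labels into per-query binary ranked lists, as one possible input pipeline.

import numpy as np

def binary_ranked_lists(dissimilarity, labels):
    """dissimilarity: (n, n) array, smaller value = more similar.
    labels: length-n class labels used as ground truth."""
    n = dissimilarity.shape[0]
    lists = []
    for q in range(n):
        order = np.argsort(dissimilarity[q])   # best matches first
        order = order[order != q]              # exclude the query itself
        lists.append([int(labels[r] == labels[q]) for r in order])
    return lists

# Toy example: four objects from two classes.
D = np.array([[0.0, 0.2, 0.9, 0.8],
              [0.2, 0.0, 0.7, 0.9],
              [0.9, 0.7, 0.0, 0.1],
              [0.8, 0.9, 0.1, 0.0]])
print(binary_ranked_lists(D, labels=["cup", "cup", "vase", "vase"]))
```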
George Alexis Ioannakis
added a project reference
George Alexis Ioannakis
added an update
The metrics arsenal has been updated; three new metrics have been added:
  1. Rank-Biased Precision (RBP) (see the sketch after this update)
  2. Q-measure
  3. AUC-PR (Area Under the Precision-Recall curve)
Moreover, the upload handler has been updated to inform the user of potential errors in the data structure of their files. The help documentation has received a major update and video tutorials have been included.
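For context, Rank-Biased Precision models a user who moves from one result to the next with persistence probability p, so a relevant result at rank i contributes (1 - p)·p^(i-1). The sketch below follows that standard formulation; the binary-list input and the value p = 0.8 are chosen purely for illustration and are not tied to RETRIEVAL's implementation.

```python
# Minimal sketch of Rank-Biased Precision (RBP) for a binary ranked list.
# p is the user persistence parameter; p = 0.8 is only an example value.

def rbp(ranked_relevance, p=0.8):
    return (1 - p) * sum(rel * p ** (rank - 1)
                         for rank, rel in enumerate(ranked_relevance, start=1))

print(f"RBP (p = 0.8): {rbp([1, 0, 1, 1, 0, 0, 1, 0, 0, 0]):.3f}")
```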
 
George Alexis Ioannakis
added an update
"How-to" video tutorials have been added in the on-line help documentation session.
The videos can be found under the beneath link:
 
Anestis Koutsoudis
added an update
RETRIEVAL now offers extended metric and scalar parameterisation.
 
George Alexis Ioannakis
added an update
RETRIEVAL has incorporated two benchmark datasets for the performance evaluation of IR methods:
1. Princeton Shape Benchmark Dataset: a database of 3D polygonal models collected from the World Wide Web (Philip Shilane, Patrick Min, Michael Kazhdan, and Thomas Funkhouser, "The Princeton Shape Benchmark", Shape Modeling International, Genova, Italy, June 2004).
2. SIMPLIcity: Semantics-sensitive Integrated Matching for Picture LIbraries (James Z. Wang, Jia Li, and Gio Wiederhold, "SIMPLIcity: Semantics-Sensitive Integrated Matching for Picture LIbraries", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 23, no. 9, pp. 947-963, 2001).
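With class-labelled benchmarks such as these, the class labels act as ground truth. One scalar measure commonly used alongside such datasets in shape retrieval evaluation is First Tier: the fraction of a query's class members found within the top |C| - 1 results, where |C| is the class size. The sketch below is only an illustration under that assumption; the function and argument names are not part of RETRIEVAL.

```python
# Minimal sketch of the First Tier measure for one query, assuming the
# benchmark's class labels act as ground truth (illustrative only).

def first_tier(retrieved_labels, query_label, class_size):
    """retrieved_labels: ranked labels of retrieved objects (query excluded).
    class_size: size of the query's class, including the query itself."""
    cutoff = class_size - 1                 # inspect the top |C| - 1 results
    relevant_found = sum(1 for lbl in retrieved_labels[:cutoff]
                         if lbl == query_label)
    return relevant_found / cutoff

# Toy example: the query belongs to a class with 4 members.
print(first_tier(["vase", "cup", "vase", "cup", "vase"], "vase", class_size=4))
```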
 
Anestis Koutsoudis
added a project goal
A Web-based integrated information retrieval performance evaluation platform. It offers a number of metrics that are popular within the scientific community, composing an efficient framework for executing different evaluation scenarios.