Evaluating Information Retrieval and Access Tasks

Author: Tetsuya Sakai, Douglas W. Oard, Noriko Kando
Publisher: Springer Nature
Total Pages: 225
Release: 2021
Genre: Electronic books
ISBN: 9789811555541

This open access book summarizes the first two decades of the NII Testbeds and Community for Information access Research (NTCIR). NTCIR is a series of evaluation forums run by a global team of researchers and hosted by the National Institute of Informatics (NII), Japan. The book is unique in that it discusses not just what was done at NTCIR, but also how it was done and the impact it has achieved. For example, in some chapters the reader sees the early seeds of what eventually grew to be the search engines that provide access to content on the World Wide Web, today's smartphones that can tailor what they show to the needs of their owners, and the smart speakers that enrich our lives at home and on the move. We also get glimpses into how new search engines can be built for mathematical formulae, or for the digital record of a lived human life. Key to the success of the NTCIR endeavor was early recognition that information access research is an empirical discipline and that evaluation therefore lay at the core of the enterprise. Evaluation is thus at the heart of each chapter in this book. The chapters show, for example, how the recognition that some documents are more important than others has shaped thinking about evaluation design. The thirty-three contributors to this volume speak for the many hundreds of researchers from dozens of countries around the world who together shaped NTCIR as organizers and participants. This book is suitable for researchers, practitioners, and students--anyone who wants to learn about past and present evaluation efforts in information retrieval, information access, and natural language processing, as well as those who want to participate in an evaluation task or even to design and organize one.

Information Retrieval Evaluation

Author: Donna K. Harman
Publisher: Morgan & Claypool Publishers
Total Pages: 122
Release: 2011
Genre: Computers
ISBN: 9781598299717

Evaluation has always played a major role in information retrieval, with early pioneers such as Cyril Cleverdon and Gerard Salton laying the foundations for most of the evaluation methodologies in use today. The retrieval community has been extremely fortunate to have such a well-grounded evaluation paradigm during a period when most of the human language technologies were just developing. This lecture has the goal of explaining where these evaluation methodologies came from and how they have continued to adapt to the vastly changed environment of the search engine world today. The lecture starts with a discussion of the early evaluation of information retrieval systems, beginning with the Cranfield testing in the early 1960s, continuing with the Lancaster "user" study for MEDLARS, and presenting the various test collection investigations by the SMART project and by groups in Britain. The emphasis in this chapter is on the how and the why of the various methodologies developed. The second chapter covers the more recent "batch" evaluations, examining the methodologies used in the various open evaluation campaigns such as TREC, NTCIR (emphasis on Asian languages), CLEF (emphasis on European languages), INEX (emphasis on semi-structured data), etc. Here again the focus is on the how and why, and in particular on the evolution of the older evaluation methodologies to handle new information access techniques. This includes how the test collection techniques were modified and how the metrics were changed to better reflect operational environments. The final chapters look at evaluation issues in user studies -- the interactive part of information retrieval, including a look at the search log studies done mainly by the commercial search engines. Here the goal is to show, via case studies, how the high-level issues of experimental design affect the final evaluations.
Table of Contents: Introduction and Early History / "Batch" Evaluation Since 1992 / Interactive Evaluation / Conclusion

Test Collection Based Evaluation of Information Retrieval Systems

Author: Mark Sanderson
Publisher: Now Publishers Inc
Total Pages: 143
Release: 2010-06-03
Genre: Computers
ISBN: 9781601983602

The use of test collections and evaluation measures to assess the effectiveness of information retrieval systems has its origins in work dating back to the early 1950s. In the nearly 60 years since that work started, the test collection has become the de facto standard of evaluation. This monograph surveys the research conducted and explains the methods and measures devised for the evaluation of retrieval systems, including a detailed look at the use of statistical significance testing in retrieval experimentation. It then reviews more recent examinations of the validity of the test collection approach and of evaluation measures, and outlines trends in current research exploiting query logs and live labs. At its core, the modern-day test collection is little different from the structures that the pioneering researchers of the 1950s and 1960s conceived. This tutorial and review shows that, despite its age, this long-standing evaluation method is still a highly valued tool for retrieval research.

Information Retrieval Evaluation in a Changing World

Author: Nicola Ferro, Carol Peters
Publisher: Springer
Total Pages: 595
Release: 2019-08-13
Genre: Computers
ISBN: 9783030229481

This volume celebrates the twentieth anniversary of CLEF (the Cross-Language Evaluation Forum for its first ten years, and the Conference and Labs of the Evaluation Forum since) and traces its evolution over these first two decades. CLEF’s main mission is to promote research, innovation and development of information retrieval (IR) systems by anticipating trends in information management in order to stimulate advances in the field of IR system experimentation and evaluation. The book is divided into six parts. Parts I and II provide background and context, with the first part explaining what is meant by experimental evaluation and the underlying theory, and describing how this has been interpreted in CLEF and in other internationally recognized evaluation initiatives. Part II presents research architectures and infrastructures that have been developed to manage experimental data and to provide evaluation services in CLEF and elsewhere. Parts III, IV and V represent the core of the book, presenting some of the most significant evaluation activities in CLEF, ranging from the early multilingual text processing exercises to the later, more sophisticated experiments on multimodal collections in diverse genres and media. In all cases, the focus is not only on describing “what has been achieved”, but above all on “what has been learnt”. The final part examines the impact CLEF has had on the research world and discusses current and future challenges, both academic and industrial, including the relevance of IR benchmarking in industrial settings. Mainly intended for researchers in academia and industry, it also offers useful insights and tips for practitioners in industry working on the evaluation and performance issues of IR tools, and for graduate students specializing in information retrieval.

Advances in Information Retrieval

Author: Matthias Hagen, Suzan Verberne, Craig Macdonald, Christin Seifert, Krisztian Balog, Kjetil Nørvåg, Vinay Setty
Publisher: Springer Nature
Total Pages: 734
Release: 2022-04-05
Genre: Computers
ISBN: 9783030997366

This two-volume set, LNCS 13185 and 13186, constitutes the refereed proceedings of the 44th European Conference on IR Research, ECIR 2022, held in April 2022 in a hybrid format due to the COVID-19 pandemic. The 35 full papers presented together with 11 reproducibility papers, 13 CLEF lab description papers, 12 doctoral consortium papers, 5 workshop abstracts, and 4 tutorial abstracts were carefully reviewed and selected from 395 submissions.

Evaluation of Multilingual and Multi-modal Information Retrieval

Author: Cross-Language Evaluation Forum Workshop, Carol Peters
Publisher: Springer Science & Business Media
Total Pages: 1018
Release: 2007-09-06
Genre: Computers
ISBN: 9783540749981

This book constitutes the thoroughly refereed postproceedings of the 7th Workshop of the Cross-Language Evaluation Forum, CLEF 2006, held in Alicante, Spain, in September 2006. The revised papers, presented together with an introduction, were carefully reviewed and selected for inclusion in the book. The papers are organized in topical sections on Multilingual Textual Document Retrieval, Domain-Specific Information Retrieval, i-CLEF, QA@CLEF, ImageCLEF, CLSR, WebCLEF and GeoCLEF.

Current Challenges in Patent Information Retrieval

Author: Mihai Lupu, Katja Mayer, Noriko Kando, Anthony J. Trippe
Publisher: Springer
Total Pages: 455
Release: 2017-03-24
Genre: Computers
ISBN: 9783662538173

This second edition provides a systematic introduction to the work and views of the emerging patent-search research and innovation communities as well as an overview of what has been achieved and, perhaps even more importantly, of what remains to be achieved. It revises many of the contributions of the first edition and adds a significant number of new ones. The first part, “Introduction to Patent Searching”, includes two overview chapters on the peculiarities of patent searching and on contemporary search technology respectively, and thus sets the scene for the subsequent parts. The second part, “Evaluating Patent Retrieval”, begins with two chapters dedicated to patent evaluation campaigns, followed by two chapters discussing complementary issues from the perspective of patent searchers and from the perspective of related domains, notably legal search. The third part, “High Recall Search”, includes four completely new chapters dealing with the issue of finding all the relevant documents in a reasonable time span. The last (and, with six papers, the largest) part on “Special Topics in Patent Information Retrieval” covers a large spectrum of research in the patent field, from classification and image processing to translation. Lastly, the book is completed by an outlook on open issues and future research. Several of the chapters have been jointly written by intellectual property and information retrieval experts. However, members of both communities with a background different from that of the primary author have reviewed the chapters, making the book accessible both to the patent search community and to the information retrieval research community. It not only offers the latest findings for academic researchers, but is also a valuable resource for IP professionals wanting to learn about current IR approaches in the patent domain.