2021_Book_EvaluatingInformationRetrieval.pdf

This open access book summarizes the first two decades of the NII Testbeds and Community for Information access Research (NTCIR). NTCIR is a series of evaluation forums run by a global team of researchers and hosted by the National Institute of Informatics (NII), Japan. The book is unique in that it...

Full description

Bibliographic Details
Language: English
Published: Springer Nature 2020
Available Online: https://www.springer.com/9789811555541
id oapen-20.500.12657-41715
record_format dspace
spelling oapen-20.500.12657-41715 2020-09-22T00:47:36Z
  Evaluating Information Retrieval and Access Tasks
  Sakai, Tetsuya; Oard, Douglas W.; Kando, Noriko
  Information Storage and Retrieval Evaluation Information Retrieval Multilingual Information Access NTCIR Test Collections Information Search Information Storage Artificial Intelligence Open Access Information retrieval Data warehousing
  bic Book Industry Communication::U Computing & information technology::UN Databases::UNH Information retrieval
  This open access book summarizes the first two decades of the NII Testbeds and Community for Information access Research (NTCIR). NTCIR is a series of evaluation forums run by a global team of researchers and hosted by the National Institute of Informatics (NII), Japan. The book is unique in that it discusses not just what was done at NTCIR, but also how it was done and the impact it has achieved. For example, in some chapters the reader sees the early seeds of what eventually grew to be the search engines that provide access to content on the World Wide Web, today’s smartphones that can tailor what they show to the needs of their owners, and the smart speakers that enrich our lives at home and on the move. We also get glimpses into how new search engines can be built for mathematical formulae, or for the digital record of a lived human life. Key to the success of the NTCIR endeavor was early recognition that information access research is an empirical discipline and that evaluation therefore lay at the core of the enterprise. Evaluation is thus at the heart of each chapter in this book. They show, for example, how the recognition that some documents are more important than others has shaped thinking about evaluation design. The thirty-three contributors to this volume speak for the many hundreds of researchers from dozens of countries around the world who together shaped NTCIR as organizers and participants. This book is suitable for researchers, practitioners, and students—anyone who wants to learn about past and present evaluation efforts in information retrieval, information access, and natural language processing, as well as those who want to participate in an evaluation task or even to design and organize one.
  2020-09-21T13:40:32Z 2020-09-21T13:40:32Z 2021 book ONIX_20200921_9789811555541_73
  https://library.oapen.org/handle/20.500.12657/41715
  eng
  The Information Retrieval Series
  application/pdf n/a 2021_Book_EvaluatingInformationRetrieval.pdf
  https://www.springer.com/9789811555541
  Springer Nature; Springer
  10.1007/978-981-15-5554-1
  6c6992af-b843-4f46-859c-f6e9998e40d5 Springer 43 219 open access
institution OAPEN
collection DSpace
language English
description This open access book summarizes the first two decades of the NII Testbeds and Community for Information access Research (NTCIR). NTCIR is a series of evaluation forums run by a global team of researchers and hosted by the National Institute of Informatics (NII), Japan. The book is unique in that it discusses not just what was done at NTCIR, but also how it was done and the impact it has achieved. For example, in some chapters the reader sees the early seeds of what eventually grew to be the search engines that provide access to content on the World Wide Web, today’s smartphones that can tailor what they show to the needs of their owners, and the smart speakers that enrich our lives at home and on the move. We also get glimpses into how new search engines can be built for mathematical formulae, or for the digital record of a lived human life. Key to the success of the NTCIR endeavor was early recognition that information access research is an empirical discipline and that evaluation therefore lay at the core of the enterprise. Evaluation is thus at the heart of each chapter in this book. They show, for example, how the recognition that some documents are more important than others has shaped thinking about evaluation design. The thirty-three contributors to this volume speak for the many hundreds of researchers from dozens of countries around the world who together shaped NTCIR as organizers and participants. This book is suitable for researchers, practitioners, and students—anyone who wants to learn about past and present evaluation efforts in information retrieval, information access, and natural language processing, as well as those who want to participate in an evaluation task or even to design and organize one.
title 2021_Book_EvaluatingInformationRetrieval.pdf
spellingShingle 2021_Book_EvaluatingInformationRetrieval.pdf
title_short 2021_Book_EvaluatingInformationRetrieval.pdf
title_full 2021_Book_EvaluatingInformationRetrieval.pdf
title_fullStr 2021_Book_EvaluatingInformationRetrieval.pdf
title_full_unstemmed 2021_Book_EvaluatingInformationRetrieval.pdf
title_sort 2021_book_evaluatinginformationretrieval.pdf
publisher Springer Nature
publishDate 2020
url https://www.springer.com/9789811555541
_version_ 1771297536981598208