    Filter by Category

    Subjects: Information Retrieval (2); Test Collections (2); Key Phrases (1); Relevance Judgements (1)
    Journal: Eleventh International Conference on Digital Information Management (ICDIM) (1); Proceedings of the Conference "Lernen, Wissen, Daten, Analysen" (1)
    Authors: Makary, Mireille (2); Oakes, Michael (2); Yamout, Fadi (2)
    Year (Issue Date): 2016 (2)
    Types: Conference contribution (2)


    Now showing items 1-2 of 2


    Export (2 items): CSV | RefMan | EndNote | BibTex | Selective Export

    Towards automatic generation of relevance judgments for a test collection

    Makary, Mireille; Oakes, Michael; Yamout, Fadi (IEEE, 2016-09-20)
    This paper presents a new technique for building a relevance judgment list for information retrieval test collections without any human intervention. It is based on the number of occurrences of the documents in runs retrieved from several information retrieval systems and a distance-based measure between the documents. The effectiveness of the technique is evaluated by computing the correlation between the ranking of the TREC systems using the original relevance judgment list (qrels) built by human assessors and the ranking obtained by using the newly generated qrels.
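
    The abstract describes two computable steps: pooling documents by how many system runs retrieve them, and comparing TREC system rankings under the original and the generated qrels. The following is a minimal Python sketch of those two steps only, under assumptions the abstract does not spell out: a fixed occurrence threshold stands in for the paper's pooling criterion, the distance-based measure between documents is omitted, and Kendall's tau is assumed as the rank-correlation statistic. Names such as generate_qrels are illustrative, not taken from the paper.

        # Sketch (not the paper's implementation): occurrence-based pseudo-qrels
        # plus a rank-correlation check between two system orderings.
        from collections import Counter
        from itertools import combinations

        def generate_qrels(runs, min_occurrences=3):
            """runs: one ranked list of doc ids per IR system, for a single topic.
            Assumption: a document retrieved by at least `min_occurrences`
            systems is treated as relevant."""
            counts = Counter(doc for run in runs for doc in set(run))
            return {doc for doc, n in counts.items() if n >= min_occurrences}

        def kendall_tau(ranking_a, ranking_b):
            """Kendall's tau between two orderings of the same systems (no ties)."""
            pos_a = {s: i for i, s in enumerate(ranking_a)}
            pos_b = {s: i for i, s in enumerate(ranking_b)}
            concordant = discordant = 0
            for x, y in combinations(ranking_a, 2):
                if (pos_a[x] - pos_a[y]) * (pos_b[x] - pos_b[y]) > 0:
                    concordant += 1
                else:
                    discordant += 1
            n = len(ranking_a)
            return (concordant - discordant) / (n * (n - 1) / 2)

        # Toy usage: three systems' runs for one topic, then compare two rankings.
        runs = [["d1", "d2", "d3"], ["d2", "d1", "d4"], ["d2", "d3", "d1"]]
        print(generate_qrels(runs, min_occurrences=2))               # {'d1', 'd2', 'd3'}
        print(kendall_tau(["sysA", "sysB", "sysC"], ["sysA", "sysC", "sysB"]))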

    Using key phrases as new queries in building relevance judgements automatically

    Makary, Mireille; Oakes, Michael; Yamout, Fadi (CEUR - workshop proceedings, 2016-09-30)
    We describe a new technique for building a relevance judgment list (qrels) for TREC test collections with no human intervention. For each TREC topic, a set of new queries is automatically generated from key phrases extracted from the top k documents retrieved from 12 different Terrier weighting models when the initial TREC topic is submitted. We assign a score to each key phrase based on its similarity to the original TREC topic. The key phrases with the highest scores become the new queries for a second search, this time using the Terrier BM25 weighting model. The union of the documents retrieved forms the automatically-built set of qrels.
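
    The pipeline in this abstract (pool key phrases from top-k documents across weighting models, score them against the topic, re-retrieve with BM25, take the union) can be outlined in code. The sketch below is illustrative only: the retrieval and key-phrase extraction are toy stand-ins rather than Terrier calls, and the token-overlap similarity used to score phrases against the topic is an assumption, since the abstract does not name the measure.

        # Sketch under stated assumptions; all helper names are illustrative.
        TOY_DOCS = {
            "d1": "relevance judgments for test collections",
            "d2": "automatic query generation from key phrases",
            "d3": "weighting models in the Terrier platform",
        }

        def retrieve(query, weighting_model, k=2):
            """Toy stand-in for a search engine run: rank documents by the number
            of terms shared with the query. `weighting_model` is kept only to
            mirror the described pipeline; it does not change this toy ranking."""
            q = set(query.lower().split())
            ranked = sorted(TOY_DOCS, key=lambda d: -len(q & set(TOY_DOCS[d].lower().split())))
            return ranked[:k]

        def extract_key_phrases(doc_id):
            """Toy stand-in for key-phrase extraction: adjacent word pairs."""
            words = TOY_DOCS[doc_id].split()
            return {" ".join(words[i:i + 2]) for i in range(len(words) - 1)}

        def phrase_score(phrase, topic):
            """Assumed similarity: fraction of phrase tokens also found in the topic."""
            p, t = set(phrase.lower().split()), set(topic.lower().split())
            return len(p & t) / len(p) if p else 0.0

        def build_qrels(topic, weighting_models, k=2, n_queries=3):
            # 1) Pool key phrases from the top-k documents of every weighting model.
            phrases = set()
            for model in weighting_models:
                for doc_id in retrieve(topic, model, k):
                    phrases.update(extract_key_phrases(doc_id))
            # 2) Keep the phrases most similar to the original topic as new queries.
            new_queries = sorted(phrases, key=lambda p: phrase_score(p, topic), reverse=True)[:n_queries]
            # 3) Second-pass retrieval (BM25 in the paper); union of results = qrels.
            qrels = set()
            for q in new_queries:
                qrels.update(retrieve(q, "BM25", k))
            return qrels

        print(build_qrels("test collections relevance", ["TF_IDF", "BM25", "PL2"]))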

    Export search results

    The export option allows you to export the current search results of the entered query to a file. Different formats are available for download. To export the items, click the button corresponding to the preferred download format.

    By default, clicking an export button downloads up to the maximum allowed number of items.

    To select a subset of the search results, click the "Selective Export" button and select the items you want to export. The number of items that can be exported at once is subject to the same limit as the full export.

    After making a selection, click one of the export format buttons. The number of items that will be exported is indicated in the bubble next to each export format.