
Guided Research Dennis Schneider

Last modified May 31, 2023

Injecting Knowledge into Sentence Embedding Models for Information Retrieval using Adapters

Abstract

This work investigates approaches to injecting structured knowledge from knowledge graphs into transformer language models and evaluates their effect on state-of-the-art (SOTA) sentence embedding models for information retrieval. To this end, it focuses on adapter networks: small neural modules with few parameters that are tuned while the base model remains frozen.

As a result, the patterns learned by the original model are preserved, alleviating the problem of "forgetting" previously learned knowledge. At the same time, adapters have been shown to yield competitive results while fine-tuning only a small fraction of the base model's parameters.
This motivates investigating the effect of adapter networks on sentence embedding models.
The results are evaluated on both general and scholarly datasets.
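The bottleneck adapter mechanism described above can be sketched as follows. This is an illustrative NumPy example, not the code used in this work; the dimensions (hidden size 768, bottleneck size 16) are assumptions chosen to mirror a BERT-base-sized model.

```python
import numpy as np

rng = np.random.default_rng(0)

def adapter_forward(x, W_down, b_down, W_up, b_up):
    """Bottleneck adapter: down-project, nonlinearity, up-project, residual add."""
    h = np.maximum(x @ W_down + b_down, 0.0)   # down-projection + ReLU
    return x + h @ W_up + b_up                 # residual keeps the base representation

hidden, bottleneck = 768, 16                   # assumed sizes (BERT-base-like)
W_down = rng.normal(0.0, 0.02, (hidden, bottleneck))
b_down = np.zeros(bottleneck)
W_up = rng.normal(0.0, 0.02, (bottleneck, hidden))
b_up = np.zeros(hidden)

x = rng.normal(size=(4, hidden))               # a batch of 4 token representations
y = adapter_forward(x, W_down, b_down, W_up, b_up)

# Only the adapter's parameters would be trained; the base model stays frozen.
adapter_params = W_down.size + b_down.size + W_up.size + b_up.size
base_params_per_layer = 12 * hidden * hidden   # rough transformer-layer estimate
ratio = adapter_params / base_params_per_layer
print(y.shape, adapter_params, f"{ratio:.2%}")
```

The residual connection lets the adapter start close to an identity mapping, which is why the base model's learned patterns are preserved; the trainable parameters amount to well under one percent of a transformer layer's.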

Research Questions

- How to enhance similarity search with sentence embedding models using knowledge adapters?


1. How to inject structured knowledge into sentence embedding models with adapters?
2. Do knowledge adapters improve information retrieval tasks of sentence embedding models?
3. How to combine domain-specific knowledge adapters for the scholarly domain?
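One plausible way to approach the third question (a hedged sketch, not necessarily the approach taken in this work) is an AdapterFusion-style combination: the outputs of several domain-specific adapters are mixed with learned softmax weights. The function, adapter names, and weights below are illustrative.

```python
import numpy as np

def fuse_adapters(adapter_outputs, fusion_logits):
    """Softmax-weighted mixture of several adapter outputs
    (in the spirit of AdapterFusion; weights are illustrative)."""
    logits = np.asarray(fusion_logits, dtype=float)
    weights = np.exp(logits - logits.max())    # numerically stable softmax
    weights /= weights.sum()
    return sum(w * out for w, out in zip(weights, adapter_outputs))

# Two hypothetical domain adapters (e.g. general vs. scholarly knowledge)
rng = np.random.default_rng(1)
out_general = rng.normal(size=(4, 768))
out_scholar = rng.normal(size=(4, 768))

# Equal logits give an equal-weight average of the two adapter outputs.
fused = fuse_adapters([out_general, out_scholar], [0.0, 0.0])
```

In the full AdapterFusion formulation the mixture weights are produced per token by a learned attention over adapter outputs rather than being global constants; the fixed logits here only illustrate the combination step.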

Files and Subpages

Name Type Size Last Modification
GR Dennis Schneider final.pdf PDF 2.11 MB 31.05.2023
GR Dennis Schneider kickoff.pdf PDF 307 KB 31.05.2023
GR Dennis Schneider.pdf PDF 402 KB 31.05.2023