Enhancing query relevance: leveraging SBERT and cosine similarity for optimal information retrieval
Published in: International Journal of Speech Technology, 2024, Vol. 27(3), pp. 753-763
Main Authors:
Format: Article
Language: English
Subjects:
Summary: Whether the information concerns a concept or a product, queries are frequently employed to find it, and different people phrase the same objective with different sentences. In our research, we generated sentence vectors using an SBERT-based model, all-MiniLM-L6-v2, a MiniLM model tuned on a sizable dataset of more than 1 billion training pairs. We then compute a similarity score from these vectors to determine how similar a given query is to the queries previously stored in the database. Cosine similarity is the metric used to assess similarity in our work. From the pool of potential answers, we choose the one whose query has the highest similarity score with the given query.
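The summary describes an embed-and-match pipeline: encode stored queries and the incoming query with all-MiniLM-L6-v2, score pairs with cosine similarity, and return the answer attached to the best-matching stored query. A minimal sketch of that flow using the sentence-transformers library is shown below; the stored query/answer lists and variable names are illustrative assumptions, not data from the paper.

```python
# Sketch of the retrieval step described in the summary.
# Assumes a small in-memory list of stored (query, answer) pairs;
# in the paper these would come from a database.
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("all-MiniLM-L6-v2")

# Hypothetical stored query/answer pairs standing in for the database.
stored_queries = [
    "How do I reset my account password?",
    "What is the warranty period for this product?",
]
stored_answers = [
    "Use the 'Forgot password' link on the login page.",
    "The product carries a two-year limited warranty.",
]

new_query = "I forgot my password, how can I change it?"

# Encode stored queries and the incoming query into sentence vectors.
stored_vecs = model.encode(stored_queries, convert_to_tensor=True)
query_vec = model.encode(new_query, convert_to_tensor=True)

# Cosine similarity between the new query and every stored query.
scores = util.cos_sim(query_vec, stored_vecs)[0]

# Return the answer whose stored query scores highest.
best_idx = int(scores.argmax())
print(stored_queries[best_idx], "->", stored_answers[best_idx])
```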
ISSN: 1381-2416, 1572-8110
DOI: 10.1007/s10772-024-10133-5