2 Comments
Bhaskar:

LLMs can work with a custom knowledge base. Bedrock has the concept of a knowledge base that works with a range of generic LLMs. You can build custom knowledge vectors that modify how ChatGPT responds to queries.

Do these knowledge bases provide enough functionality to customize the LLM's responses for the use cases needed, obviating the need to train an SLM? This is a question, not a statement.

Chinmay Shah:

A custom knowledge base on top of an LLM usually involves a technique known as RAG (retrieval-augmented generation), which is different from training an LLM/SLM on custom knowledge.

You can think of RAG as providing reference context (e.g., putting a bunch of relevant text into the context window) that the LLM can use to answer.

Conversely, training/fine-tuning an LLM on your custom knowledge base means the model becomes familiar with, or has "memorized", that data in its weights.

For most use cases, these RAG-based solutions are enough, but the tradeoff is paying for a longer context on every query.
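The RAG flow described above can be sketched in a few lines. This is a toy illustration, not a real implementation: the knowledge base, query, and word-overlap scoring are all made-up placeholders, and a production system would use embedding similarity and a vector store instead.

```python
# Minimal RAG sketch: retrieve relevant snippets, then stuff them
# into the prompt as reference context for the LLM.
# All data and names here are illustrative.

KNOWLEDGE_BASE = [
    "Our refund policy allows returns within 30 days of purchase.",
    "Support hours are 9am-5pm EST, Monday through Friday.",
    "Premium plans include priority support and a 99.9% uptime SLA.",
]

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Rank docs by naive word overlap with the query (a stand-in
    for embedding similarity) and return the top k."""
    q_words = set(query.lower().split())
    scored = sorted(docs, key=lambda d: -len(q_words & set(d.lower().split())))
    return scored[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Prepend retrieved snippets as context, so the LLM answers
    from the context window instead of from memorized weights."""
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Use only this context to answer:\n{context}\n\nQuestion: {query}"

prompt = build_prompt("What is the refund policy?", KNOWLEDGE_BASE)
print(prompt)  # this prompt would then be sent to the LLM
```

Note the tradeoff mentioned above: the retrieved snippets are re-sent with every query, so you pay for a longer context each time, whereas a fine-tuned model carries the knowledge in its weights.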
