Ollama embeddings enhance Langflow's support for Ollama, letting users run LLMs such as Mistral and Llama locally. The LangChain documentation can be found via [this link](https://python.langchain.com/docs/integrations/text_embedding/ollama).
Changes:
- New `OllamaEmbeddingsComponent` class (sketched below)
- Associated documentation in the `Embeddings` section
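For context, a minimal sketch of what such a component might look like, assuming LangChain's `OllamaEmbeddings` wrapper and Langflow's `CustomComponent` conventions (the `field_typing` import, the default model name, and the base URL are illustrative assumptions, not the merged code):

```python
from langchain.embeddings import OllamaEmbeddings  # import path varies by LangChain version
from langflow import CustomComponent
from langflow.field_typing import Embeddings


class OllamaEmbeddingsComponent(CustomComponent):
    display_name = "Ollama Embeddings"
    description = "Generate embeddings with a locally running Ollama model."
    documentation = "https://python.langchain.com/docs/integrations/text_embedding/ollama"

    def build(
        self,
        model: str = "llama2",
        base_url: str = "http://localhost:11434",  # Ollama's default endpoint
    ) -> Embeddings:
        # Wrap LangChain's Ollama integration; the Ollama server must
        # already be running and have pulled the requested model.
        return OllamaEmbeddings(model=model, base_url=base_url)
```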
In this pull request, I have:
- added an Elasticsearch component to the Vector Stores section
- kept the functionality basic so users can get started quickly (see the sketch after this list)
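Under the hood, the component wraps LangChain's Elasticsearch vector store. Roughly, the equivalent LangChain calls look like the following sketch (import paths and parameter names are assumptions based on LangChain's documented API, not the component's exact code):

```python
from langchain.docstore.document import Document
from langchain.embeddings import OpenAIEmbeddings
from langchain.vectorstores import ElasticsearchStore

docs = [Document(page_content="Example text to index.")]

# Index documents into the local single-node cluster set up below.
store = ElasticsearchStore.from_documents(
    documents=docs,
    embedding=OpenAIEmbeddings(),
    es_url="http://localhost:9200",
    index_name="test-index",
)

# Retrieve the chunks most similar to a question.
results = store.similarity_search("What is the document about?")
```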
**Set up a single-node Elasticsearch cluster on localhost with Docker**

1. `sudo docker network create elastic`
2. `sudo docker run --net elastic -p 9200:9200 -e "discovery.type=single-node" -e "xpack.security.enabled=false" -e "xpack.security.http.ssl.enabled=false" -d docker.elastic.co/elasticsearch/elasticsearch:8.11.3`
3. `curl http://localhost:9200`
   ```json
   {
     "name" : "994a10c1dab5",
     "cluster_name" : "docker-cluster",
     "cluster_uuid" : "p_iQ88F-T2agFpIdzJN7Ow",
     "version" : {
       "number" : "8.11.3",
       "build_flavor" : "default",
       "build_type" : "docker",
       "build_hash" : "64cf052f3b56b1fd4449f5454cb88aca7e739d9a",
       "build_date" : "2023-12-08T11:33:53.634979452Z",
       "build_snapshot" : false,
       "lucene_version" : "9.8.0",
       "minimum_wire_compatibility_version" : "7.17.0",
       "minimum_index_compatibility_version" : "7.0.0"
     },
     "tagline" : "You Know, for Search"
   }
   ```
4. `curl -X GET http://localhost:9200/_cat/indices` (no indices are listed yet)
**Set up Langflow:**
1. `make backend`
2. `make frontend`
3. Open `localhost:3000`
4. New Project -> Import from JSON: [elasticsearch-langflow.json](https://github.com/logspace-ai/langflow/files/13823444/elasticsearch-langflow.json)
5. `poetry add elasticsearch`, `pipenv install elasticsearch`, or `pip install elasticsearch`
6. Select a text file to load
7. Provide OpenAI API keys for OpenAIEmbeddings and ChatOpenAI
8. Build
9. Verify that the document is indexed: `curl -X GET http://localhost:9200/_cat/indices`
   ```
   health status index      uuid                   pri rep docs.count docs.deleted store.size pri.store.size dataset.size
   yellow open   test-index pt9_ZOACR8mWCNx7GO3scA   1   1          1            0     39.3kb         39.3kb       39.3kb
   ```
10. Open the Chat and ask, for example: "What is the document about?"
The ChatDefinition allows users to turn any flow into a chat flow by defining what has to run and what the inputs and outputs are.
A ChatDefinition requires a function, plus optional `inputs` and an optional `output_key`.
The function receives a dictionary as input and can return a string or a dict. If the output is a dict, an `output_key` must be provided.
Anything can run inside the function. You can also pass methods of pre-built classes, such as a Chain (see the second sketch below).
Here's an example of how to use it in a CustomComponent:

```python
from langflow import CustomComponent
from langflow.field_typing import Data  # Data return type for custom components
from langflow.utils.chat import ChatDefinition


class Component(CustomComponent):
    documentation: str = "http://docs.langflow.org/components/custom"

    def build(self) -> Data:
        # The function receives the flow inputs as a dict, plus callbacks.
        def func(inputs, callbacks):
            return {"text": "This is a simple example."}

        # The function returns a dict, so output_key selects which key
        # becomes the chat response.
        return ChatDefinition(func=func, inputs=[], output_key="text")
```
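As mentioned above, you can also hand a pre-built Chain's call method to ChatDefinition. A hedged sketch of that pattern, assuming the same `func(inputs, callbacks)` signature as the example (`ChainChatComponent` is a hypothetical name):

```python
from langchain.chains import LLMChain
from langflow import CustomComponent
from langflow.field_typing import Data
from langflow.utils.chat import ChatDefinition


class ChainChatComponent(CustomComponent):
    def build(self, chain: LLMChain) -> Data:
        # Forward the flow inputs (and callbacks) to the chain; a Chain's
        # __call__ takes an inputs dict and returns a dict of outputs.
        def func(inputs, callbacks):
            return chain(inputs, callbacks=callbacks)

        # LLMChain's default output key is "text".
        return ChatDefinition(func=func, inputs=chain.input_keys, output_key=chain.output_key)
```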
### Pull Request for Issue #1246

**Description**

This pull request addresses issue #1246, which proposes adding a self-query retriever based on the LangChain Vectara integration. The self-query retriever lets users run queries directly against the Vectara component (vector store).
**Changes Made**

I have added one new file under `src/backend/langflow/components/retrievers`, containing the new `VectaraSelfQueryRetriverComponent` class.

**Files Added:** `VectaraSelfQueryRetriever.py`

**LangChain documentation for this component:**
https://python.langchain.com/docs/integrations/retrievers/self_query/vectara_self_query
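For reference, the LangChain docs linked above show the pattern the component presumably wires together. A condensed sketch (the metadata fields and query are illustrative; `Vectara()` reads `VECTARA_CUSTOMER_ID`, `VECTARA_CORPUS_ID`, and `VECTARA_API_KEY` from the environment):

```python
from langchain.chains.query_constructor.base import AttributeInfo
from langchain.llms import OpenAI
from langchain.retrievers.self_query.base import SelfQueryRetriever
from langchain.vectorstores import Vectara

vectara = Vectara()  # credentials come from environment variables

# Describe the documents and the metadata fields the LLM may filter on.
metadata_field_info = [
    AttributeInfo(name="year", description="the year the document was written", type="integer"),
]
document_content_description = "Brief summary of a document"

retriever = SelfQueryRetriever.from_llm(
    OpenAI(temperature=0),
    vectara,
    document_content_description,
    metadata_field_info,
    verbose=True,
)
docs = retriever.get_relevant_documents("documents written after 2020")
```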
This pull request adds a Dockerfile for building and pushing the application image. The Dockerfile includes the necessary steps to set up the environment, install dependencies, and run the application.
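A minimal sketch of what such a Dockerfile might look like (the base image, file layout, port, and run command are assumptions, not the exact file added in this PR):

```dockerfile
FROM python:3.11-slim

WORKDIR /app

# Install dependencies first so Docker caches this layer across code changes.
COPY pyproject.toml poetry.lock ./
RUN pip install --no-cache-dir poetry && poetry install --no-root

# Copy the application source and expose the service port.
COPY . .
EXPOSE 7860

CMD ["poetry", "run", "langflow", "run", "--host", "0.0.0.0", "--port", "7860"]
```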