Fix: Required input llm for module LLMChain not found

There is a recurring, intermittent bug where Chains fail to validate with the error "Required module ____ for module ____ not found". It happens frequently for LLMs that inherit from the LLM class (for example, Cohere's wrapper).

This is caused by the chunk of code below, which also explains why the bug is intermittent rather than constant: when the right-hand operand of `in` is a string, Python performs a substring check, so an output type like "LLM" from source_types incorrectly matches "BaseLLM" in target_reqs. The nested loop is also unnecessary; a single loop with a membership test against self.target_reqs is enough.
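A minimal sketch of the pitfall (the type names are illustrative): `in` on a string is a substring check, while `in` on a list is an exact-membership test.

```python
# Substring check: "LLM" is a substring of "BaseLLM", so this is True,
# which is exactly the false positive behind the intermittent bug.
assert "LLM" in "BaseLLM"

# Membership test against a list: "LLM" must equal an element exactly,
# so it no longer matches "BaseLLM".
assert "LLM" not in ["BaseLLM"]
assert "BaseLLM" in ["BaseLLM"]
```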

I'm a user of LangFlow, and a first time contributor. Thanks!
This commit is contained in:
Sean Javiya 2023-06-28 17:09:58 -07:00 committed by GitHub
commit e1872be728


@@ -30,8 +30,7 @@ class Edge:
         (
             output
             for output in self.source_types
-            for target_req in self.target_reqs
-            if output in target_req
+            if output in self.target_reqs
         ),
         None,
     )
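The corrected matching logic can be sketched as a standalone function (the function name and arguments are hypothetical; in Langflow this lives inside the Edge class):

```python
def first_valid_output(source_types, target_reqs):
    # Membership test against the list requires an exact type-name
    # match, instead of the old substring match against each element.
    return next(
        (output for output in source_types if output in target_reqs),
        None,
    )

# "LLM" is not an exact element of the requirements, so no match:
print(first_valid_output(["LLM"], ["BaseLLM"]))      # None
# An exact match is still found:
print(first_valid_output(["BaseLLM"], ["BaseLLM"]))  # BaseLLM
```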