LLM Wrapper to use
Key to use for output, defaults to text
Prompt object to use
Optional llmKwargs to pass to LLM
Optional memory to use
Optional outputParser: OutputParser to use
Call the chain on all inputs in the list.
Optional config: any[]
Deprecated: use .batch() instead; will be removed in 0.2.0. This feature is not recommended for use.
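The deprecated apply() behaves like mapping the chain over a list of inputs, which is what the newer .batch() API does. A minimal self-contained sketch of that mapping, using a stub chain interface rather than the actual LangChain.js types (applyAll and its signature are hypothetical, for illustration only):

```typescript
// Hypothetical helper: run a chain on every input in a list, in parallel.
// The { invoke } shape is a stand-in for a real chain, not the library API.
async function applyAll<I, O>(
  chain: { invoke(input: I): Promise<O> },
  inputs: I[],
): Promise<O[]> {
  // Equivalent in spirit to chain.batch(inputs) in newer versions.
  return Promise.all(inputs.map((input) => chain.invoke(input)));
}
```

In real code, prefer calling the chain's own .batch() method, which additionally handles concurrency limits and per-call configuration.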
Static deserialize: Load a chain from a JSON-like object describing it.
Static fromLLM: A static factory method that creates an instance of TaskExecutionChain. It constructs a prompt template that instructs an AI to perform a task based on a given objective, taking previously completed tasks into account, and uses that template to create the chain.
An object of type LLMChainInput, excluding the "prompt" field.
An instance of LLMChain.
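The factory pattern described above can be sketched in a self-contained way. The stub LLM and PromptTemplate types below are simplified stand-ins for the real LangChain.js classes, and the prompt wording is an assumption, not the library's actual template:

```typescript
// Stub LLM interface standing in for the real LangChain LLM wrapper.
interface LLM {
  call(prompt: string): string;
}

// Minimal prompt template: substitutes {name} placeholders with values.
class PromptTemplate {
  constructor(public template: string, public inputVariables: string[]) {}
  format(values: Record<string, string>): string {
    return this.inputVariables.reduce(
      (acc, v) => acc.replace(`{${v}}`, values[v]),
      this.template,
    );
  }
}

class TaskExecutionChain {
  constructor(public llm: LLM, public prompt: PromptTemplate) {}

  // Factory method: builds the task-execution prompt, then the chain.
  static fromLLM(llm: LLM): TaskExecutionChain {
    const template = new PromptTemplate(
      "You are an AI who performs one task based on the following " +
        "objective: {objective}. Take into account these previously " +
        "completed tasks: {context}. Your task: {task}. Response:",
      ["objective", "context", "task"],
    );
    return new TaskExecutionChain(llm, template);
  }

  run(values: Record<string, string>): string {
    return this.llm.call(this.prompt.format(values));
  }
}
```

In the real library, fromLLM accepts the remaining LLMChainInput fields (everything except "prompt") and supplies the prompt itself, so callers only need to provide the model.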
Chain to execute tasks.