Ollama #3
by SebaGPDev - opened
I'm using the model with Ollama and LangChain, following the example code at "https://github.com/nexusflowai/NexusRaven/blob/main/scripts/langchain_example.py", but in the console I see this error: ValueError: An output parsing error occurred. To return this error to the agent and have it try again, pass `handle_parsing_errors=True` to the AgentExecutor. This is the error: Could not parse the output of LLM: `calculator(11.22, 33.333), ('add', 'add', 'add'))`
What could I do, or what is the mistake? Thank you so much.
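The error itself suggests passing `handle_parsing_errors=True` to the AgentExecutor. Below is a minimal, generic sketch of how I understand that option would be set (this is not the NexusRaven example script itself; the model name "nexusraven" and the toy calculator tool are just placeholders for my actual setup):

```python
# Minimal sketch: enabling handle_parsing_errors as the error message suggests.
# "nexusraven" and the toy calculator tool are placeholders, not the real example code.
from langchain_community.llms import Ollama  # on older versions: from langchain.llms import Ollama
from langchain.agents import AgentType, Tool, initialize_agent

llm = Ollama(model="nexusraven", temperature=0.0)  # local model served by Ollama

def calculator(query: str) -> str:
    """Toy tool: adds two comma-separated numbers, e.g. '11.22, 33.333'."""
    a, b = (float(x) for x in query.split(",")[:2])
    return str(a + b)

tools = [
    Tool(
        name="calculator",
        func=calculator,
        description="Adds two comma-separated numbers.",
    )
]

agent = initialize_agent(
    tools,
    llm,
    agent=AgentType.ZERO_SHOT_REACT_DESCRIPTION,
    verbose=True,
    handle_parsing_errors=True,  # return parsing errors to the agent instead of raising
)

print(agent.run("What is 11.22 + 33.333?"))
```

I'm not sure whether that only hides the real problem, i.e. that the raw function-call string NexusRaven returns (`calculator(11.22, 33.333)`) doesn't match what the agent's output parser expects.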
Code here: