In the latest version, if you don't prefix the model name with the model provider followed by a slash, this returns an error:
File ".venv/lib/python3.11/site-packages/scrapegraphai/graphs/abstract_graph.py", line 180, in _create_llm
f"""Provider {llm_params['model_provider']} is not supported.
~~~~~~~~~~^^^^^^^^^^^^^^^^^^
KeyError: 'model_provider'
I tested adding "ollama/" in front of the model name and it works. I assume this should be fixed for the other providers too, but I haven't tested that personally.
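For reference, here is a minimal sketch of the workaround, assuming the `SmartScraperGraph` config format from the scrapegraphai docs; the model name, `base_url`, prompt, and source URL are just example values:

```python
# Minimal sketch of the workaround (example values, not a definitive config).
from scrapegraphai.graphs import SmartScraperGraph

graph_config = {
    "llm": {
        # Prefixing the provider ("ollama/") avoids the KeyError;
        # passing "llama3" alone raises KeyError: 'model_provider'.
        "model": "ollama/llama3",
        "base_url": "http://localhost:11434",  # assumed local Ollama endpoint
    },
}

smart_scraper = SmartScraperGraph(
    prompt="List all the article titles on the page",
    source="https://example.com",
    config=graph_config,
)

print(smart_scraper.run())
```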