Commit 8932d5f

Update llm.rst
In the latest version, omitting the model provider (separated by a slash) in front of the model name causes an error:

    File ".venv/lib/python3.11/site-packages/scrapegraphai/graphs/abstract_graph.py", line 180, in _create_llm
        f"""Provider {llm_params['model_provider']} is not supported.
                      ~~~~~~~~~~^^^^^^^^^^^^^^^^^^
    KeyError: 'model_provider'

I tested adding "ollama/" in front of the model name and it works. I assume the same prefix is needed for other providers too, but I haven't tested those personally.
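A minimal sketch of the fix described above. The config dict is taken from the diff; the provider-splitting check at the end is an assumption added for illustration (it mirrors the `model_provider` key named in the traceback, not the library's exact parsing code):

```python
# Corrected graph configuration: the "model" value must be
# "<provider>/<model_name>", e.g. "ollama/llama3". Without the
# provider prefix, _create_llm raises KeyError: 'model_provider'.
graph_config = {
    "llm": {
        "model": "ollama/llama3",  # provider prefix is required
        "temperature": 0.0,
        "format": "json",
    },
}

# Illustrative (assumed) splitting logic: the provider is the part
# before the first slash, the model name is the rest.
provider, _, model_name = graph_config["llm"]["model"].partition("/")
print(provider, model_name)  # ollama llama3
```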
1 parent db3afad commit 8932d5f

File tree

1 file changed: +1 −1 lines


docs/source/scrapers/llm.rst

@@ -30,7 +30,7 @@ Then we can use them in the graph configuration as follows:

     graph_config = {
         "llm": {
-            "model": "llama3",
+            "model": "ollama/llama3",
             "temperature": 0.0,
             "format": "json",
         },
