4 Comments
Gopinathan K. Munappy:

Is it possible to use LLM other than Gemini, specifically open source LLMs?

Zachary Huang:

Yes! Replace call_llm.py with your own implementation:

https://the-pocket.github.io/PocketFlow/utility_function/llm.html
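For instance, a minimal sketch of a drop-in `call_llm` that talks to a locally hosted open-source model. This assumes a local Ollama server on its default port (`http://localhost:11434`) with a model such as `qwen2.5` already pulled; the function name follows the PocketFlow utility-function pattern, but the endpoint and payload shape here are Ollama's `/api/chat` API, not anything PocketFlow-specific.

```python
# Hypothetical call_llm.py replacement targeting a local Ollama server.
# Assumes Ollama is running at http://localhost:11434 and the model
# named below has been pulled (e.g. `ollama pull qwen2.5`).
import json
import urllib.request

MODEL = "qwen2.5"  # swap in any locally available model

def build_payload(prompt: str, model: str = MODEL) -> dict:
    """Build the JSON body for Ollama's /api/chat endpoint."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "stream": False,  # ask for one complete response, not a stream
    }

def call_llm(prompt: str) -> str:
    """Send the prompt to the local model and return its reply text."""
    req = urllib.request.Request(
        "http://localhost:11434/api/chat",
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["message"]["content"]
```

Any OpenAI-compatible endpoint (vLLM, llama.cpp server, etc.) would work the same way; only the URL and payload format change.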

Gopinathan K. Munappy:

Thanks for your reply.

Gopinathan K. Munappy:

Before I received your response, I had tried open-source models like DeepSeek-R1 7B and Qwen 2.5. I started the uvicorn server properly, gave it your website's GitHub URL, and asked "what is PocketFlow?". It took a very long time on the first of five iterations, so I manually stopped the server. This may be because my NVIDIA GeForce RTX 4060 with 8 GB of VRAM is not enough for this kind of task.
