You can interact with a locally deployed LLM from Python using the LM Studio Python SDK.
The LM Studio Python SDK documentation is here - https://lmstudio.ai/docs/python
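As a minimal sketch, here is what a chat call through the LM Studio Python SDK can look like. This assumes you have installed the `lmstudio` package (`pip install lmstudio`), the LM Studio local server is running, and a model is already loaded; the prompt text is just an example.

```python
def ask(prompt: str) -> str:
    """Send a prompt to the model currently loaded in LM Studio and return its reply."""
    # Lazy import so this module can be loaded even without the SDK installed;
    # requires `pip install lmstudio` and a running LM Studio local server.
    import lmstudio as lms

    model = lms.llm()  # handle to the currently loaded model
    result = model.respond(prompt)
    return str(result)  # the prediction result renders as the reply text

if __name__ == "__main__":
    print(ask("Explain what a local LLM is in one sentence."))
```

Keeping the network-dependent call behind `if __name__ == "__main__":` lets you reuse `ask()` from other scripts without triggering a request at import time.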
Ollama is another option for running LLMs (including Meta's Llama models) locally on your laptop/PC/Mac.
Visit https://ollama.com/ for more details.
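For comparison, a similar sketch using the official `ollama` Python package (`pip install ollama`), assuming the Ollama daemon is running and the model has been pulled locally. The model name "llama3.2" and the prompt are placeholders; substitute whichever model you have pulled.

```python
def chat(prompt: str, model: str = "llama3.2") -> str:
    """Send a single-turn chat message to a locally running Ollama model."""
    # Lazy import: requires `pip install ollama` and the Ollama daemon
    # (start it with `ollama serve`, pull a model with `ollama pull llama3.2`).
    import ollama

    response = ollama.chat(
        model=model,
        messages=[{"role": "user", "content": prompt}],
    )
    return response["message"]["content"]

if __name__ == "__main__":
    print(chat("Say hello in one sentence."))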
If you have any questions, post them in the comments and I shall try to answer.
Follow me on my social media accounts here - https://linktr.ee/krmadhukar