Replies: 2 comments 2 replies
Thanks! I've run into a question with `from pandasai.llm.local_llm import LocalLLM`.
2 replies
Local LLMs for privacy are increasingly viable! At RevolutionAI (https://revolutionai.io) we deploy private PandasAI setups for enterprise clients. Recommended local models:

Setup pattern:

```python
from pandasai import SmartDataframe
from pandasai.llm import Ollama

llm = Ollama(model="mistral:7b-instruct")
df = SmartDataframe(data, config={"llm": llm})
```

Privacy benefits:

Tradeoffs:

For most tabular analysis tasks, Mistral 7B is sufficient and runs on consumer hardware. What is your deployment environment?
0 replies
Can I connect PandasAI to a local LLM such as Llama or Falcon, following the steps below?

1 - Deploy and serve a local LLM on your personal PC using llama.cpp or Transformers.
2 - Connect PandasAI to the local LLM API created above.
3 - Import a dataframe and chat with it.
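Step 2 of the list above can be sketched without PandasAI at all, since llama.cpp's server exposes an OpenAI-compatible endpoint. The snippet below is a minimal illustration only: it assumes a llama.cpp server listening on `localhost:8080`, and the helper names (`build_chat_request`, `ask_local_llm`), the port, and the model name are all illustrative, not part of any library API.

```python
import json
import urllib.request


def build_chat_request(model: str, prompt: str) -> dict:
    """Build an OpenAI-compatible chat-completion payload, the format
    llama.cpp's server accepts at /v1/chat/completions."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,
    }


def ask_local_llm(prompt: str, base_url: str = "http://localhost:8080") -> str:
    """POST the payload to the local server and return the reply text.
    Requires a llama.cpp (or compatible) server running at base_url."""
    payload = json.dumps(build_chat_request("llama-2-7b", prompt)).encode()
    req = urllib.request.Request(
        f"{base_url}/v1/chat/completions",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Once a request like this works against your local server, pointing PandasAI's local-LLM wrapper at the same base URL covers steps 2 and 3.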