nix
2d ago
If the LLM provides an answer much better than ollama or gpt4all models, and what I need to ask is sensitive, then yes.
In practice that means no, sorry.