Issues: Mintplex-Labs/anything-llm

Issues list

Labels: enhancement / feature request = new feature or request; possible bug = reported but not yet confirmed or replicated; investigating = core team or maintainer is looking into the issue; needs info / can't replicate = requires additional information and/or cannot currently be replicated; Integration Request = request for support of a new LLM, Embedder, or Vector database.

[FEAT]: Support Claude 3.7 Sonnet (enhancement, feature request)
#3342 opened Feb 25, 2025 by pors
[FEAT]: Search Bar for Workspaces (enhancement, feature request)
#3341 opened Feb 25, 2025 by Yaronbaroz18
[BUG]: docker run of AnythingLLM fails on CentOS 7.9 (possible bug)
#3332 opened Feb 24, 2025 by sanpatricky
[BUG]: I upload a file containing some information, but AnythingLLM cannot give me the right answer when I ask about it (possible bug)
#3328 opened Feb 24, 2025 by DavidXuanLuo
[BUG]: Failed to save LLM Settings: Failed to Fetch (possible bug)
#3326 opened Feb 23, 2025 by xzddakfdmiug
[BUG]: I set different models for Ollama per workspace, but the same one is used even after switching (possible bug)
#3323 opened Feb 22, 2025 by rabinnh
Change default storage location
#3322 opened Feb 22, 2025 by jmarz217
[BUG]: /v1/openai/embeddings is not compatible with the OpenAI spec/SDK (possible bug)
#3312 opened Feb 21, 2025 by miraculixx
[BUG]: When deploying AnythingLLM with docker-compose.yml and accessing port 3001 directly without an Nginx proxy, the error 'Could not respond to message. An error occurred while streaming response. network error' appears (investigating, needs info / can't replicate, possible bug; a compose sketch of this setup follows the list)
#3310 opened Feb 21, 2025 by showcup
[BUG]: AnythingLLM Desktop on Linux fails with a fetch error when saving LLM settings on first run after installation (needs info / can't replicate)
#3300 opened Feb 20, 2025 by rurhrlaub
[BUG]: (needs info / can't replicate)
#3296 opened Feb 20, 2025 by yhw0311
[FEAT]: Support Memory Layer Integration for Persistent User Context (enhancement, feature request, Integration Request)
#3289 opened Feb 20, 2025 by therealtimex
[FEAT]: Support for Inflection AI API (enhancement, feature request, Integration Request)
#3281 opened Feb 19, 2025 by jimbob-w
[BUG]: Pinned documents with ~200k tokens (~0.5 MB .txt file) break the AnythingLLM Docker version (possible bug)
#3280 opened Feb 19, 2025 by severfire
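
For context on #3310 above, here is a minimal docker-compose.yml sketch of the deployment shape that report describes: the container's port 3001 exposed directly, with no Nginx reverse proxy in front. This is an illustrative assumption, not the reporter's actual file; the image name and STORAGE_DIR value follow the project's public Docker instructions, while the service name and host volume path are placeholders.

    # Sketch only: reproduces the setup described in #3310 (direct access on
    # port 3001, no reverse proxy). Image and STORAGE_DIR follow the public
    # docs; service name and host volume path are illustrative.
    services:
      anythingllm:
        image: mintplexlabs/anythingllm:latest
        ports:
          - "3001:3001"   # browser reaches the container directly
        environment:
          - STORAGE_DIR=/app/server/storage
        volumes:
          - ./anythingllm-storage:/app/server/storage   # persist settings and workspaces

With a file like this, docker compose up -d followed by opening http://localhost:3001 reproduces the access pattern in the report, so streaming behavior depends on the browser-to-container connection rather than on any proxy buffering.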