
[BUG]: pinned documents that have ~200k tokens (~0.5MB .txt file) breaking AnythingLLM Docker version #3280

Open
severfire opened this issue Feb 19, 2025 · 2 comments · May be fixed by #3286
Labels
possible bug Bug was reported but is not confirmed or is unable to be replicated.

Comments

@severfire
How are you running AnythingLLM?

Docker (local)

What happened?

  1. Pin a ~0.5 MB .txt file to a workspace.
  2. Set Google Gemini (1M-token context window) as the provider.
  3. CPU usage climbs to 100%.
  4. After some time, AnythingLLM breaks.
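For context on the numbers in this report, the token count can be sanity-checked from the file size using the common ~4 characters-per-token heuristic (a rough rule of thumb, not the exact tokenizer Gemini uses):

```python
# Rough token estimate for a plain-text document, using the widely cited
# ~4 characters-per-token heuristic. Actual counts depend on the tokenizer.
def estimate_tokens(num_bytes: int, chars_per_token: float = 4.0) -> int:
    """Approximate token count for an ASCII text file of num_bytes bytes."""
    return int(num_bytes / chars_per_token)

# A ~0.5 MB .txt file lands in the low hundreds of thousands of tokens,
# consistent with the ~150k-200k figure discussed in this issue.
file_size_bytes = 500_000  # ~0.5 MB
print(estimate_tokens(file_size_bytes))  # 125000
```

Either way the document is well under Gemini's 1M-token context limit, so the crash is not simply a context-window overflow.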

Are there known steps to reproduce?

No response

@severfire added the possible bug label Feb 19, 2025
@timothycarambat
Member

Are you running Docker on a single CPU core with minimal RAM?
How many words is this document, and are you sure about the token count, or are you estimating?

@severfire
Author

severfire commented Feb 19, 2025

@timothycarambat
I tried it on my 32 GB, 24-core machine on Docker Desktop, and on a real server with 2 cores, 4 GB of RAM, and 6 GB of swap memory.

Both failed.
The token count was measured by Google AI Studio: ~150k tokens, confirmed to be under the 1M limit.
