
API keys to secure the endpoint #202

Open
phulstaert opened this issue Feb 5, 2025 · 6 comments
Labels
enhancement New feature or request

Comments

@phulstaert

I am running LM Studio on my home server and this works great. The only issue is that I cannot expose it to the outside, because anyone could use it and potentially degrade my performance. It would be nice to be able to generate API keys that I can send along with each request (like OpenAI does) to authenticate.

@mkmatzat
Copy link

mkmatzat commented Feb 21, 2025

Good point, I would also like to expose the server to the net to share my powerful machine with friends.

I see a lot of scanners on my mapped public port after just one day:

2025-02-20 01:23:01 [DEBUG] 
Received request: GET to /favicon.ico
2025-02-20 01:23:01 [ERROR] 
Unexpected endpoint or method. (GET /favicon.ico). Returning 200 anyway
2025-02-20 01:23:04 [DEBUG] 
Received request: GET to /
2025-02-20 01:23:04 [ERROR] 
Unexpected endpoint or method. (GET http://api.ipify.org/?format=json). Returning 200 anyway
2025-02-20 11:53:39 [DEBUG] 
Received request: GET to /helpdesk/WebObjects/Helpdesk.woa
2025-02-20 11:53:39 [ERROR] 
Unexpected endpoint or method. (GET /helpdesk/WebObjects/Helpdesk.woa). Returning 200 anyway
2025-02-21 21:11:14 [ERROR] 
Unexpected endpoint or method. (GET /). Returning 200 anyway
2025-02-21 22:49:12 [DEBUG] 
Received request: GET to /
2025-02-21 22:49:12 [ERROR] 
Unexpected endpoint or method. (GET /). Returning 200 anyway
2025-02-21 22:58:07 [DEBUG] 
Received request: GET to /
2025-02-21 22:58:07 [ERROR] 
Unexpected endpoint or method. (GET /). Returning 200 anyway
2025-02-22 00:06:48  [INFO] 
Server stopped.

It's only a matter of time until someone sends a

GET /v1/models

Something like an API key or at least BASIC AUTH would be nice.
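Until something ships natively, here is a minimal sketch of the idea: a tiny stdlib-only reverse proxy that sits in front of LM Studio's local server and rejects any request that doesn't carry an OpenAI-style `Authorization: Bearer <key>` header. The key value, proxy port, and upstream address below are placeholders I chose for illustration (LM Studio's server defaults to port 1234 locally), not anything LM Studio itself provides.

```python
import secrets
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import Request, urlopen

API_KEY = "change-me"               # placeholder; generate a real key, e.g. secrets.token_urlsafe(32)
UPSTREAM = "http://127.0.0.1:1234"  # assumed LM Studio local server address

def is_authorized(auth_header, api_key=API_KEY):
    """True only for an OpenAI-style 'Authorization: Bearer <key>' header."""
    if auth_header is None:
        return False
    expected = "Bearer " + api_key
    # Constant-time comparison so the key can't be guessed byte by byte.
    return secrets.compare_digest(auth_header.encode(), expected.encode())

class AuthProxy(BaseHTTPRequestHandler):
    def do_GET(self):
        self._handle()

    def do_POST(self):
        self._handle()

    def _handle(self):
        if not is_authorized(self.headers.get("Authorization")):
            self.send_response(401)
            self.end_headers()
            self.wfile.write(b"Unauthorized\n")
            return
        # Forward the authorized request to LM Studio and relay the response.
        length = int(self.headers.get("Content-Length", 0))
        body = self.rfile.read(length) if length else None
        req = Request(UPSTREAM + self.path, data=body, method=self.command)
        if self.headers.get("Content-Type"):
            req.add_header("Content-Type", self.headers["Content-Type"])
        with urlopen(req) as resp:
            self.send_response(resp.status)
            self.send_header("Content-Type",
                             resp.headers.get("Content-Type", "application/json"))
            self.end_headers()
            self.wfile.write(resp.read())

def run(port=8000):
    """Expose only this port publicly; keep LM Studio bound to localhost."""
    HTTPServer(("0.0.0.0", port), AuthProxy).serve_forever()
```

With this in place you would map only the proxy's port to the internet, leave LM Studio listening on localhost, and point OpenAI-compatible clients at the proxy with the key as their API key.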

@filwu8

filwu8 commented Feb 24, 2025

You can put another web UI tool in front of it and set an API key there.

@phulstaert
Author

So you mean we should use another web UI tool as a sort of specialized AI proxy? Do you have any suggestions?

It would be nice if it were possible "out of the box" in LM Studio...

@yagil yagil added the enhancement New feature or request label Feb 24, 2025
@yagil
Member

yagil commented Feb 24, 2025

We’ll add this

@filwu8

filwu8 commented Feb 25, 2025

> We’ll add this

That's good news, thanks!

Don't forget to add SSL as well; otherwise the API key can still be captured with a network packet sniffer.
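For the SSL point, a sketch of one option: wrap the listening socket of a stdlib HTTP server in TLS so the key never crosses the wire in cleartext. The certificate and key paths below are placeholders; you would need a real certificate (e.g. from Let's Encrypt), or you could instead terminate TLS in an off-the-shelf reverse proxy such as nginx or Caddy.

```python
import ssl
from http.server import BaseHTTPRequestHandler, HTTPServer

def serve_tls(handler_cls, certfile, keyfile, port=8443):
    """Return an HTTPServer whose socket is wrapped in TLS.
    certfile/keyfile are placeholder paths to a PEM certificate
    and its private key; call .serve_forever() on the result."""
    ctx = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
    ctx.load_cert_chain(certfile=certfile, keyfile=keyfile)
    httpd = HTTPServer(("0.0.0.0", port), handler_cls)
    httpd.socket = ctx.wrap_socket(httpd.socket, server_side=True)
    return httpd

# Usage sketch (handler and cert paths are placeholders):
# serve_tls(SomeHandler, "fullchain.pem", "privkey.pem").serve_forever()
```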

@filwu8

filwu8 commented Feb 25, 2025

> So you mean we should use another webui tool as a sort of a specialized AI proxy? Do you have any suggestions?
>
> It would be nice if it would be possible "out of the box" in lm studio...

Yes, I used a web UI + Swagger! https://swagger.io/
