[QUESTION] Request to add proxy (请求添加代理) #1
Hi @hai178912522,

Thank you for your question about adding proxy support to mLLMCelltype. We understand that users in mainland China may face challenges accessing certain LLM APIs. We're considering implementing proxy support in a future version of the package to address this directly. In the meantime, there are several workarounds you can use, detailed in the follow-up comment below.

We'll prioritize adding built-in proxy support in a future release. If you have any specific requirements or suggestions for this feature, please let us know. Thank you for your interest in mLLMCelltype!

Chen
Thank you for raising the question about adding proxy support! We are already considering this feature and plan to implement it in a future version. I'm also pleased to announce that we've just created a Discord community where you can get real-time updates and support: https://discord.gg/pb2aZdG4

Guide for Mainland China Users

For users in mainland China facing API access issues, we suggest the following solutions:

1. Using a System Proxy

Setting a proxy in R:

# Set HTTP/HTTPS proxy
Sys.setenv(http_proxy = "http://your_proxy_address:port")
Sys.setenv(https_proxy = "http://your_proxy_address:port")
# If authentication is needed
Sys.setenv(http_proxy = "http://username:password@your_proxy_address:port")
Sys.setenv(https_proxy = "http://username:password@your_proxy_address:port")

Setting a proxy in Python:

# Set proxy environment variables
import os
os.environ['HTTP_PROXY'] = 'http://your_proxy_address:port'
os.environ['HTTPS_PROXY'] = 'http://your_proxy_address:port'
# If authentication is needed
os.environ['HTTP_PROXY'] = 'http://username:password@your_proxy_address:port'
os.environ['HTTPS_PROXY'] = 'http://username:password@your_proxy_address:port'

2. Using Models Accessible in Mainland China

We already support several models that are directly accessible in mainland China.
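Because most R and Python HTTP clients read these variables from the process environment, it is worth confirming they are actually set before calling any LLM API. A minimal sketch (the `active_proxies` helper name and the local proxy address are illustrative, not part of mLLMCelltype):

```python
import os

def active_proxies():
    """Return the HTTP/HTTPS proxy URLs currently set in the environment, if any."""
    def lookup(name):
        # Clients vary in whether they read the upper- or lowercase variable.
        return os.environ.get(name.upper()) or os.environ.get(name.lower())
    return {"http": lookup("http_proxy"), "https": lookup("https_proxy")}

# Example: route HTTPS traffic through a local proxy (address/port are placeholders)
os.environ["HTTPS_PROXY"] = "http://127.0.0.1:7890"
print(active_proxies()["https"])
```

If the printed value is `None` for the scheme your API uses, the proxy was not picked up and requests will go out directly.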
3. Using OpenRouter

We recently added support for OpenRouter, which can serve as a proxy for various LLM APIs, letting you access multiple models with a single API key.

Future Plans

We plan to add built-in proxy support in the next version, allowing users to specify proxy settings directly in the configuration without requiring additional environment variables.
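To illustrate the "single API key, many models" point: OpenRouter exposes an OpenAI-compatible chat-completions endpoint, so every model is reached through the same URL and the model is selected by name in the payload. The sketch below only assembles the request without sending it; the model name and prompt are examples, and the `OPENROUTER_API_KEY` variable name is an assumption:

```python
import json
import os
import urllib.request

# One endpoint for all models routed through OpenRouter
OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_request(prompt, model="openai/gpt-4o", api_key=None):
    """Assemble (but do not send) an OpenRouter chat-completion request."""
    api_key = api_key or os.environ.get("OPENROUTER_API_KEY", "")
    payload = {
        "model": model,  # switching models is just a different string here
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        OPENROUTER_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )

req = build_request("Annotate cluster markers: CD3D, CD3E, CD2")
print(req.get_full_url())
```

Because `urllib.request` honors the `HTTP_PROXY`/`HTTPS_PROXY` environment variables by default, this also composes with the proxy setup from section 1.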