
[QUESTION] 请求添加代理/Request to add proxy #1

Open
hai178912522 opened this issue Apr 20, 2025 · 2 comments
Labels
feature: performance (Performance optimization), priority: high (High priority issues)

Comments

@hai178912522

Question

Environment Information

  • mLLMCelltype Version:
  • Language:
  • Language Version:
  • Operating System:
  • Used LLM Models:

What I've Tried

Code Example

# If applicable, provide a short code example

Additional Context

@cafferychen777
Owner

Hi @hai178912522,

Thank you for your question about adding proxy support to mLLMCelltype.

We understand that users in mainland China may face challenges accessing certain LLM APIs. We're considering implementing proxy support in a future version of the package to address this issue directly.

In the meantime, there are several workarounds you can use:

  1. System-wide VPN: Using a reliable VPN service that routes all your system traffic through servers outside mainland China is often the most straightforward solution. When your VPN is active, mLLMCelltype should be able to connect to all LLM APIs without additional configuration.

  2. Configuring proxy settings in your R/Python environment:

    For R users:

    # Set proxy in your R session
    Sys.setenv(http_proxy = "http://your_proxy_address:port")
    Sys.setenv(https_proxy = "http://your_proxy_address:port")
    
    # If your proxy requires authentication
    Sys.setenv(http_proxy = "http://username:password@your_proxy_address:port")
    Sys.setenv(https_proxy = "http://username:password@your_proxy_address:port")

    For Python users:

    # Set proxy environment variables
    import os
    os.environ['HTTP_PROXY'] = 'http://your_proxy_address:port'
    os.environ['HTTPS_PROXY'] = 'http://your_proxy_address:port'
    
    # With authentication if needed
    os.environ['HTTP_PROXY'] = 'http://username:password@your_proxy_address:port'
    os.environ['HTTPS_PROXY'] = 'http://username:password@your_proxy_address:port'
  3. Using local API alternatives: For some models, you might consider using local API alternatives that are accessible in mainland China, such as Qwen models through Alibaba Cloud.
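Before running mLLMCelltype through a proxy, it can help to confirm that the variables from option 2 are actually visible to your session. A minimal self-contained sanity check (Python; the proxy address below is a placeholder, not a real endpoint):

```python
import os

def proxies_from_env():
    """Collect HTTP(S) proxy settings from the environment into the
    dict shape most Python HTTP clients (e.g. requests) accept."""
    proxies = {}
    for scheme, var in (("http", "HTTP_PROXY"), ("https", "HTTPS_PROXY")):
        # Check both upper- and lower-case variants, since tools differ.
        value = os.environ.get(var) or os.environ.get(var.lower())
        if value:
            proxies[scheme] = value
    return proxies

# Placeholder address; replace with your actual proxy.
os.environ["HTTPS_PROXY"] = "http://your_proxy_address:8080"
print(proxies_from_env())
```

If the printed dict is empty, the environment variables were not set in the process that runs your analysis (a common pitfall when they are exported in one shell but the notebook or IDE runs in another).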

We'll prioritize adding built-in proxy support in a future release. If you have any specific requirements or suggestions for this feature, please let us know.

Thank you for your interest in mLLMCelltype!

Chen

cafferychen777 added the feature: performance and priority: high labels and removed the question label on Apr 28, 2025
cafferychen777 changed the title from "[QUESTION] 请求添加代理" to "[QUESTION] 请求添加代理/Request to add proxy" on Apr 28, 2025
@cafferychen777
Owner

@hai178912522

Thank you for raising the question about adding proxy support! We are already considering this feature and plan to implement it in a future version.

I'm pleased to inform you that we've just created a Discord community where you can get real-time updates and support: https://discord.gg/pb2aZdG4

Guide for Mainland China Users

For users in mainland China facing API access issues, we provide the following solutions:

1. Using System Proxy

Setting proxy in R:

# Set HTTP/HTTPS proxy
Sys.setenv(http_proxy = "http://your_proxy_address:port")
Sys.setenv(https_proxy = "http://your_proxy_address:port")

# If authentication is needed
Sys.setenv(http_proxy = "http://username:password@your_proxy_address:port")
Sys.setenv(https_proxy = "http://username:password@your_proxy_address:port")

Setting proxy in Python:

# Set proxy environment variables
import os
os.environ['HTTP_PROXY'] = 'http://your_proxy_address:port'
os.environ['HTTPS_PROXY'] = 'http://your_proxy_address:port'

# If authentication is needed
os.environ['HTTP_PROXY'] = 'http://username:password@your_proxy_address:port'
os.environ['HTTPS_PROXY'] = 'http://username:password@your_proxy_address:port'

2. Using Models Accessible in Mainland China

We already support several models that are directly accessible in mainland China:

  • Alibaba Cloud Qwen2.5
  • Zhipu GLM-4
  • MiniMax
  • Stepfun

3. Using OpenRouter

We recently added support for OpenRouter, which can serve as a proxy for various LLM APIs. You can access multiple models with a single API key.
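As a rough sketch of what talking to OpenRouter involves: it exposes an OpenAI-compatible chat-completions endpoint, so a request is just a bearer token plus a JSON body. The endpoint URL, model id, and key format below are assumptions to be checked against the OpenRouter docs, and the request is only constructed here, never sent:

```python
import json

# OpenRouter exposes an OpenAI-compatible chat-completions endpoint;
# the URL and model id here are assumptions, check the OpenRouter docs.
OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_openrouter_request(api_key: str, model: str, prompt: str):
    """Construct (but do not send) an OpenAI-style chat request."""
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return headers, body

headers, body = build_openrouter_request(
    "sk-or-your-key",  # placeholder key
    "qwen/qwen-2.5-72b-instruct",  # assumed model id
    "Annotate this cluster's marker genes: CD3D, CD3E, IL7R",
)
```

The practical benefit is that swapping models means changing one string, while authentication and network routing stay the same.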

Future Plans

We plan to add built-in proxy support in the next version, allowing users to specify proxy settings directly in the configuration without requiring additional environment variables.
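One possible shape for such a configuration object, purely as a hypothetical sketch (mLLMCelltype does not currently expose anything like this), mapping a user-facing setting back onto the environment-variable workaround above:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ProxyConfig:
    """Hypothetical sketch only: mLLMCelltype does not currently
    expose a class like this."""
    http: Optional[str] = None
    https: Optional[str] = None

    def as_env(self) -> dict:
        """Map the config onto the environment-variable form used by
        the current workaround."""
        env = {}
        if self.http:
            env["HTTP_PROXY"] = self.http
        if self.https:
            env["HTTPS_PROXY"] = self.https
        return env

cfg = ProxyConfig(https="http://your_proxy_address:8080")
```

A design like this would let the package set (and later restore) the proxy variables only for the duration of its own API calls, instead of mutating the user's whole session.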
