Conversation
- Add the socksio dependency in pyproject.toml to support SOCKS proxies
- Detect SOCKS proxies in create_proxy_client and give clear installation guidance when socksio is missing
- Unify openai_embedding_source and openai_tts_api_source on create_proxy_client

Fixes AstrBotDevs#5757

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Summary of Changes
Hello, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed! This pull request strengthens the system's network connectivity by centralizing proxy-client creation logic, most notably adding SOCKS proxy support. It improves the user experience by providing clear installation guidance when the dependency required for SOCKS proxies is missing. It also refactors the existing OpenAI integrations to use this unified proxy-handling mechanism, improving robustness and consistency.
Code Review
This PR unifies proxy client creation logic using the create_proxy_client helper function, adding SOCKS proxy support and improving error handling for proxy-related issues. A security concern was identified regarding the potential leakage of proxy credentials in logs and error messages; the create_proxy_client function should be updated to sanitize proxy URLs by masking any embedded passwords. The refactoring of openai_embedding_source and openai_tts_api_source is well-done, and the clear error messages for missing socksio dependency enhance user experience. A small optimization is suggested for the _is_socks_proxy function to improve conciseness.
- Add a _sanitize_proxy_url function to mask proxy passwords in logs and error messages
- Simplify the _is_socks_proxy function by using a tuple argument
- Fix the security issue flagged by Gemini Code Assist

Co-Authored-By: Claude Opus 4.6 <noreply@anthropic.com>
Hey - I've found 3 issues
Prompt for AI Agents
Please address the comments from this code review:
## Individual Comments
### Comment 1
<location path="astrbot/core/utils/network_utils.py" line_range="108-117" />
<code_context>
+ Returns:
+ Sanitized proxy URL with credentials masked (e.g., "http://****@host:port")
+ """
+ try:
+ from urllib.parse import urlparse, urlunparse
+
+ parsed = urlparse(proxy)
+ # Any userinfo in netloc should be masked to avoid leaking tokens/passwords.
+ if "@" in parsed.netloc and parsed.hostname:
+ host = parsed.hostname
+ if ":" in host and not host.startswith("["):
+ host = f"[{host}]"
+ netloc = f"****@{host}"
+ if parsed.port:
+ netloc += f":{parsed.port}"
+ return urlunparse(
+ (
+ parsed.scheme,
+ netloc,
+ parsed.path,
+ parsed.params,
+ parsed.query,
+ parsed.fragment,
+ )
+ )
+ except Exception:
+ pass
+ return proxy
</code_context>
<issue_to_address>
**🚨 issue (security):** Falling back to the original proxy on parsing errors can still leak credentials in logs.
Because any exception in `_sanitize_proxy_url` causes it to return the original `proxy`, credentials can still be exposed if parsing fails (e.g., malformed URL, odd encodings). Since this helper is intended for safe logging, it would be safer to return a fully redacted placeholder (e.g., `"****"` or a minimally informative value) on exceptions instead of the raw input.
</issue_to_address>
### Comment 2
<location path="tests/unit/test_network_utils.py" line_range="6-8" />
<code_context>
+from astrbot.core.utils import network_utils
+
+
+def test_sanitize_proxy_url_masks_password_credentials():
+ proxy = "http://user:secret@127.0.0.1:1080"
+ assert network_utils._sanitize_proxy_url(proxy) == "http://****@127.0.0.1:1080"
+
+
</code_context>
<issue_to_address>
**suggestion (testing):** Add tests for sanitize_proxy_url on URLs without credentials, malformed URLs, and IPv6 hosts.
The current tests only cover the basic credential redaction case. Please also add coverage for:
1. URLs without userinfo (e.g. `http://127.0.0.1:1080`, `https://example.com`) to confirm they are returned unchanged.
2. Clearly invalid or malformed values (e.g. `"not a url"`, empty string) to confirm they are returned as-is under the broad `try/except`.
3. IPv6 hosts with credentials (e.g. `http://user:secret@[::1]:1080`) to ensure the bracket/colon handling doesn’t break the `netloc`.
These cases will help guard behavior in future refactors.
</issue_to_address>
### Comment 3
<location path="tests/unit/test_network_utils.py" line_range="28-37" />
<code_context>
+ assert not network_utils._is_socks_proxy("http://127.0.0.1:1080")
+
+
+def test_log_connection_failure_redacts_proxy_in_error_text(monkeypatch):
+ proxy = "http://token@127.0.0.1:1080"
+ captured = {}
+
+ def fake_error(message: str):
+ captured["message"] = message
+
+ monkeypatch.setattr(network_utils.logger, "error", fake_error)
+
+ network_utils.log_connection_failure(
+ provider_label="OpenAI",
+ error=RuntimeError(f"proxy connect failed: {proxy}"),
+ proxy=proxy,
+ )
+
+ assert "http://token@127.0.0.1:1080" not in captured["message"]
+ assert "http://****@127.0.0.1:1080" in captured["message"]
</code_context>
<issue_to_address>
**suggestion (testing):** Add tests for log_connection_failure when no proxy is provided and when the error message doesn’t contain the proxy string.
This test covers the case where both the `proxy` argument and the exception message share the same URL. To more fully exercise `log_connection_failure`, please also add tests for:
1. `proxy=None` (or empty): confirm it logs without proxy info and doesn’t crash while sanitizing.
2. An error whose message does not contain the proxy string: confirm the log still uses the sanitized proxy in the "代理地址" field while leaving the error text unchanged.
</issue_to_address>
Pull request overview
This PR improves AstrBot’s proxy handling by adding SOCKS proxy detection, providing a clearer missing-dependency error for SOCKS support, and strengthening proxy credential redaction in logs across provider clients.
Changes:
- Add SOCKS proxy detection and proxy URL sanitization/redaction to shared network utilities.
- Unify OpenAI embedding/TTS providers to use the shared `create_proxy_client` helper.
- Add unit tests for proxy URL redaction and SOCKS scheme detection; add the `socksio` dependency.
Reviewed changes
Copilot reviewed 5 out of 5 changed files in this pull request and generated 1 comment.
| File | Description |
|---|---|
| astrbot/core/utils/network_utils.py | Adds SOCKS detection, proxy URL sanitization, and redacted logging for connection failures / proxy usage. |
| astrbot/core/provider/sources/openai_embedding_source.py | Switches to create_proxy_client for consistent proxy behavior and logging. |
| astrbot/core/provider/sources/openai_tts_api_source.py | Switches to create_proxy_client for consistent proxy behavior and logging. |
| tests/unit/test_network_utils.py | Adds regression tests for proxy credential masking and SOCKS scheme detection. |
| pyproject.toml | Adds socksio dependency to support SOCKS proxies with httpx. |
Fixes #5757
Modifications

SOCKS proxy support:
- Add the socksio dependency to support SOCKS proxies
- Detect SOCKS proxies in create_proxy_client and provide clear installation guidance when the dependency is missing
- Unify the OpenAI embedding/TTS sources on create_proxy_client

Proxy credential redaction:
- Add a _sanitize_proxy_url function to redact proxy URLs

Modified files:
- astrbot/core/utils/network.py
- astrbot/core/provider/sources/openai_embedding_source.py
- astrbot/core/provider/sources/openai_tts_source.py
- requirements.txt
- tests/unit/test_network_utils.py

This is NOT a breaking change.
Screenshots or Test Results

Unit tests pass:
Checklist

- I have ensured that no new dependencies are introduced, OR if new dependencies are introduced, they have been added to the appropriate locations in requirements.txt and pyproject.toml.