
Conversation

@jenny-miromind commented Oct 2, 2025

This PR refactors how API keys are managed across the model clients. Previously, openai_client.py and anthropic_client.py read directly from environment variables (e.g., OPENAI_API_KEY, ANTHROPIC_API_KEY). This caused key conflicts when using OpenAI's official models as the LLM-as-Judge while simultaneously switching to a non-OpenAI but OpenAI-API-compatible model provider.

Key changes include:

  • Removed environment variable lookups from clients: clients now rely solely on the api_key passed via configuration.
  • Added api_key to the default configuration: a placeholder ("") ensures the field exists in every config.
  • Updated base client initialization: api_key is now read from the configuration via cfg.llm.get("api_key") (see the first sketch below).
  • Adjusted benchmarking and tracing scripts: these scripts now explicitly pass API keys from environment variables into the configuration when needed, preserving support for LLM-as-Judge workflows (see the second sketch below).
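
A minimal sketch of the client-side change, assuming an OpenAI-compatible client; the BaseClient class name and the exact shape of cfg.llm are assumptions based on this description, not the repository code:

```python
from openai import OpenAI


class BaseClient:
    """Sketch of a client that takes its API key only from configuration."""

    def __init__(self, cfg):
        # Before: api_key = os.environ.get("OPENAI_API_KEY")
        # After: the key comes exclusively from the configuration object.
        api_key = cfg.llm.get("api_key")  # defaults to "" in the base config
        if not api_key:
            raise ValueError("api_key must be provided in the configuration")
        self.client = OpenAI(api_key=api_key, base_url=cfg.llm.get("base_url"))
```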

This ensures a clean separation of concerns:

  • Clients only consume the keys explicitly provided by configuration.
  • Environment variables are injected at the orchestration/script level, avoiding unintended coupling between different client usages.
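
As an illustration of the second point, a hypothetical orchestration-level helper for the LLM-as-Judge scripts; the function and config field names are illustrative, not taken from the repository:

```python
import os


def build_judge_config(cfg: dict) -> dict:
    """Inject the judge's API key from the environment at the script level."""
    # The script, not the client, decides which environment variable applies,
    # so a non-OpenAI provider's key configured elsewhere is never picked up
    # by the official OpenAI judge client.
    judge_cfg = dict(cfg)
    judge_cfg["api_key"] = os.environ.get("OPENAI_API_KEY", "")
    return judge_cfg
```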

@jenny-miromind changed the title from "Fix bug" to "Refactor model clients to use config-provided API key" on Oct 2, 2025
@jenny-miromind merged commit f658ef3 into main on Oct 2, 2025
1 check passed
@jenny-miromind deleted the fix_client branch on October 2, 2025 at 14:49