ModelConfig
ModelProviderConfig
Bases: ConfigBaseModel
config for model provider
Source code in utu/config/model_config.py
type
class-attribute
instance-attribute
type: Literal[
"chat.completions", "responses", "litellm"
] = "chat.completions"
model type, supported types: chat.completions, responses, litellm
model
class-attribute
instance-attribute
model: str = get_env('UTU_LLM_MODEL')
model name
base_url
class-attribute
instance-attribute
base_url: str | None = None
model provider base url
api_key
class-attribute
instance-attribute
api_key: str | None = None
model provider api key
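The provider fields above can be exercised with a minimal self-contained sketch. Since this page does not include the real ConfigBaseModel, the mimic below uses a plain dataclass, and get_env is an assumed stand-in that reads an environment variable; the model name and base URL are purely illustrative:

```python
import os
from dataclasses import dataclass, field
from typing import Literal, Optional


def get_env(name: str, default: str = "") -> str:
    # Stand-in for utu's get_env helper (assumed behavior: read an env var).
    return os.environ.get(name, default)


@dataclass
class ModelProviderConfig:
    # Mirrors the fields documented above (real class is a pydantic ConfigBaseModel).
    type: Literal["chat.completions", "responses", "litellm"] = "chat.completions"
    model: str = field(default_factory=lambda: get_env("UTU_LLM_MODEL"))
    base_url: Optional[str] = None
    api_key: Optional[str] = None


# Illustrative values only; any OpenAI-compatible endpoint would fit this shape.
provider = ModelProviderConfig(
    model="gpt-4o-mini",
    base_url="https://api.example.com/v1",
)
print(provider.type)  # chat.completions
```

Note that model is resolved lazily per instance via a factory, so the UTU_LLM_MODEL environment variable is read at construction time rather than at import time.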
ModelSettingsConfig
Bases: ConfigBaseModel, ModelSettings
ModelSettings from openai-agents
Source code in utu/config/model_config.py
ModelParamsConfig
Bases: ConfigBaseModel
Basic params shared between the chat.completions and responses APIs
Source code in utu/config/model_config.py
ModelConfigs
Bases: ConfigBaseModel
Overall model config
Source code in utu/config/model_config.py
model_provider
class-attribute
instance-attribute
model_provider: ModelProviderConfig = Field(
default_factory=ModelProviderConfig
)
config for model provider
model_settings
class-attribute
instance-attribute
model_settings: ModelSettingsConfig = Field(
default_factory=ModelSettingsConfig
)
config for agent's model settings
model_params
class-attribute
instance-attribute
model_params: ModelParamsConfig = Field(
default_factory=ModelParamsConfig
)
config for basic model usage, e.g. query_one in tools or the judger
termination_max_tokens
class-attribute
instance-attribute
termination_max_tokens: int | None = None
max tokens for the model, used in truncation logic
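How the pieces compose: a minimal dataclass sketch of ModelConfigs. ModelSettingsConfig and ModelParamsConfig appear as empty placeholders because their fields are not listed on this page, and the default_factory pattern mirrors the Field(default_factory=...) declarations above:

```python
import os
from dataclasses import dataclass, field
from typing import Literal, Optional


@dataclass
class ModelProviderConfig:
    type: Literal["chat.completions", "responses", "litellm"] = "chat.completions"
    model: str = field(default_factory=lambda: os.environ.get("UTU_LLM_MODEL", ""))
    base_url: Optional[str] = None
    api_key: Optional[str] = None


@dataclass
class ModelSettingsConfig:
    # Placeholder; the real class also mixes in openai-agents ModelSettings.
    pass


@dataclass
class ModelParamsConfig:
    # Placeholder; fields not shown on this page.
    pass


@dataclass
class ModelConfigs:
    # default_factory gives each ModelConfigs instance its own sub-config
    # objects instead of a shared mutable default.
    model_provider: ModelProviderConfig = field(default_factory=ModelProviderConfig)
    model_settings: ModelSettingsConfig = field(default_factory=ModelSettingsConfig)
    model_params: ModelParamsConfig = field(default_factory=ModelParamsConfig)
    termination_max_tokens: Optional[int] = None


cfg = ModelConfigs()
print(cfg.model_provider.type)  # chat.completions
```

Using a factory rather than a shared default instance matters: two ModelConfigs objects get independent model_provider sub-configs, so mutating one never leaks into the other.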