Litellm config.yaml

Where is the config.yaml that litellm uses in crewai?

There is no config.yaml; CrewAI calls LiteLLM programmatically in code rather than through the LiteLLM proxy.

@matt so there is no way, as with the LiteLLM proxy's .yaml file, to set parameters for an LLM? I guess it's back to the same old question: how do you set up an LLM with parameters in CrewAI now? I know you said you were looking into it.

The LLM class is being refactored to allow for this - crewAI/src/crewai/llm.py at main · crewAIInc/crewAI · GitHub

@matt how will we initialize a model with parameters with this change?
Never mind, I found it in the new docs after upgrading to 0.63.1.
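For anyone landing here later, a minimal sketch of what parameter-based setup looks like with the refactored `LLM` class; the exact model string and the specific parameter names (`temperature`, `max_tokens`) are assumptions based on the linked llm.py, so check the current docs for your version:

```python
# Parameters you would previously have put in the LiteLLM proxy's
# config.yaml, expressed as plain keyword arguments instead.
# Model string and parameter names here are illustrative assumptions.
llm_params = {
    "model": "gpt-4o",       # any LiteLLM-supported model identifier
    "temperature": 0.2,      # sampling temperature
    "max_tokens": 1024,      # response length cap
}

# With crewai >= 0.63.1 installed, these would be passed directly
# to the LLM class and then attached to an Agent:
#
# from crewai import LLM, Agent
# llm = LLM(**llm_params)
# agent = Agent(role="...", goal="...", backstory="...", llm=llm)

print(llm_params["model"])
```

The point of the refactor is that per-model settings live on the `LLM` object in code, not in an external proxy config file.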