I am trying to use a Mistral model for my agent and I am getting an error. I am making a mistake somewhere, but I don't know where.

I am getting the following error:

BadRequestError: litellm.BadRequestError: LLM Provider NOT provided. Pass in the LLM provider you are trying to call. You passed model=mistralai/mistral-small-3.2-24b-instruct:free
Pass model as E.g. For 'Huggingface' inference endpoints pass in completion(model='huggingface/starcoder',..) Learn more: Providers | liteLLM
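For context on what the error is asking for: litellm infers the provider from a "provider/" prefix on the model name, and a bare model id like the one above gives it nothing to route on. The ":free" suffix looks like an OpenRouter model id, so a minimal sketch of the likely fix (assuming the model is served via OpenRouter; adjust the prefix for whatever provider you actually use) would be:

```python
# litellm could not tell which provider to use from the bare model id that
# was passed. Prepending a "provider/" prefix (here "openrouter/", an
# assumption based on the ":free" suffix) is how litellm routes the call.
bad_model = "mistralai/mistral-small-3.2-24b-instruct:free"
good_model = "openrouter/" + bad_model

# litellm splits on the first "/" to pick the provider:
provider, model_id = good_model.split("/", 1)
print(provider)  # openrouter
print(model_id)  # mistralai/mistral-small-3.2-24b-instruct:free

# The actual call would then look like (requires litellm installed and the
# provider's API key, e.g. OPENROUTER_API_KEY, set in the environment):
# from litellm import completion
# response = completion(
#     model=good_model,
#     messages=[{"role": "user", "content": "Hello"}],
# )
```

If the model is hosted elsewhere (e.g. directly on Mistral's API), the prefix would be that provider's name instead.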