NineteenAI doesn't work with the usual LLM setup code

NineteenAI gives free access to Llama 70B, which is what I'm trying to use. The API documentation can be found here:

NineteenAI | Decentralised Inference.

(The specific error is given below, but essentially I cannot set up the LLM.)

This is what I was trying to do:

@CrewBase
class AiNews():
	"""AiNews crew"""
	agents_config = 'config/agents.yaml'
	tasks_config = 'config/tasks.yaml'
	llm = LLM(
		api_key="rayon_******643iy5YauprMqfjJM9tbV",
		base_url="https://api.nineteen.ai/v1",
		model="hugging-quants/Meta-Llama-3.1-70B-Instruct-AWQ-INT4",
	)

	@agent
	def retrieve_news(self) -> Agent:
		return Agent(
			config=self.agents_config['retrieve_news'],
			tools=[SerperDevTool()],
			verbose=True,
			llm=self.llm
		)

	@agent
	def website_scraper(self) -> Agent:
		return Agent(
			config=self.agents_config['website_scraper'],
			tools=[ScrapeWebsiteTool()],
			verbose=True,
			llm=self.llm
		)

The rest of the code follows the usual default poem crew from CrewAI.

The error being received is essentially this:

litellm.exceptions.BadRequestError: litellm.BadRequestError: LLM Provider NOT provided. Pass in the LLM provider you are trying to call. You passed model=hugging-quants/Meta-Llama-3.1-70B-Instruct-AWQ-INT4
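From what I can tell, litellm raises this when the model string has no provider prefix it recognizes. Since NineteenAI advertises an OpenAI-compatible endpoint, one workaround I've seen suggested (an assumption on my part, not confirmed) is to route through litellm's `openai/` provider prefix while keeping the NineteenAI `base_url`:

```python
# Possible workaround (untested assumption): litellm infers the provider from a
# "<provider>/" prefix on the model name. NineteenAI serves an OpenAI-compatible
# API, so prefixing the model with "openai/" should tell litellm how to route
# it, while base_url still points the request at NineteenAI.
llm_kwargs = dict(
    api_key="rayon_...",  # NineteenAI key, redacted
    base_url="https://api.nineteen.ai/v1",
    model="openai/hugging-quants/Meta-Llama-3.1-70B-Instruct-AWQ-INT4",
)
# llm = LLM(**llm_kwargs)  # then pass llm to each Agent as before
```

Is that the right way to do it, or is there a proper provider name for NineteenAI?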