Crewai result.token_usage not matching with LLMs token Usage count

Hello,

My crew consists of two agents. I have written a function to calculate the cost of each crew execution: for every run I capture prompt_tokens and completion_tokens and pass them to the function, so I can report that cost to the end user who initiated the crew. The problem is that the token counts from crewai's usage_metrics API don't match the counts on the LLM side. I am using the "claude-3-5-sonnet-20241022" model, and when I check Anthropic's API console I see more tokens for the same request, so I cannot pass the exact cost to the user. Can you please suggest the best way to get the exact cost per crew execution?

    mycrew = mycrewclass().crew()
    result = mycrew.kickoff(inputs=inputs)
    costs = calculate_cost(mycrew.usage_metrics.prompt_tokens, mycrew.usage_metrics.completion_tokens)

The code of calculate_cost looks like this:

    import os

    def calculate_cost(input_tokens: int, output_tokens: int) -> float:
        """Calculate total cost based on current Claude 3.5 Sonnet pricing."""
        # Per-1K-token prices come from the environment; fail fast if they are
        # unset (a non-numeric fallback string would crash float() anyway).
        input_price_per_1k = float(os.environ['INPUT_PRICE_PER_1K'])
        output_price_per_1k = float(os.environ['OUTPUT_PRICE_PER_1K'])
        input_cost = (input_tokens / 1000) * input_price_per_1k
        output_cost = (output_tokens / 1000) * output_price_per_1k
        return input_cost + output_cost
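For reference, here is a self-contained sanity check of the calculation above. The prices are assumptions based on Anthropic's published list price for Claude 3.5 Sonnet at the time of writing ($3 per million input tokens, $15 per million output tokens, i.e. $0.003 / $0.015 per 1K) — verify them against Anthropic's current pricing page before billing anyone:

```python
import os

# Assumed per-1K-token prices; check Anthropic's pricing page for current values.
os.environ['INPUT_PRICE_PER_1K'] = '0.003'
os.environ['OUTPUT_PRICE_PER_1K'] = '0.015'

def calculate_cost(input_tokens: int, output_tokens: int) -> float:
    """Same logic as above: cost = tokens / 1000 * price_per_1k."""
    input_price_per_1k = float(os.environ['INPUT_PRICE_PER_1K'])
    output_price_per_1k = float(os.environ['OUTPUT_PRICE_PER_1K'])
    return (input_tokens / 1000) * input_price_per_1k \
         + (output_tokens / 1000) * output_price_per_1k

# 10,000 prompt tokens and 2,000 completion tokens:
print(round(calculate_cost(10_000, 2_000), 4))  # 0.06
```

Note this only makes the arithmetic reproducible; it does not resolve the mismatch between crewai's reported token counts and Anthropic's console, which is the actual question.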