Calculate the prompt tokens for each task in an agent

How can I get the prompt token count after each task? I need this so I can raise a user-defined exception when the token limit is exceeded.

TaskOutput doesn't have a usage-metrics attribute. Any help with this would be appreciated.
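While I don't know of a supported way to read token usage from TaskOutput itself, the user-defined-exception part can be sketched independently. Below is a minimal, hypothetical sketch: `check_token_budget` and `TokenLimitExceededError` are names I made up, and the prompt-token count is assumed to come from wherever you can obtain it (e.g. an LLM callback), since TaskOutput does not expose it.

```python
class TokenLimitExceededError(Exception):
    """User-defined exception raised when the prompt-token budget is exceeded."""


def check_token_budget(prompt_tokens: int, limit: int = 4000) -> None:
    """Raise TokenLimitExceededError if prompt_tokens exceeds limit.

    `prompt_tokens` is assumed to be obtained elsewhere (e.g. from an
    LLM usage callback); this function only enforces the budget.
    """
    if prompt_tokens > limit:
        raise TokenLimitExceededError(
            f"Prompt tokens ({prompt_tokens}) exceeded the limit ({limit})."
        )


# Example: call this after each task once you have a token count.
check_token_budget(1200)          # under budget: no exception
# check_token_budget(5000)        # would raise TokenLimitExceededError
```

If you find a per-task token source, you could call `check_token_budget` from a task callback so the run aborts as soon as the budget is blown.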


I was wondering the same issue and it was an dead end. I was trying to use from langchain.callbacks import get_openai_callback or
from langchain.callbacks.openai_info import OpenAICallbackHandler but it yields no result. If you have something in your mind, please share. Thank you!