When this method runs crew tasks, it should raise a user-defined exception if the input exceeds the token limit.
Please help me with this logic.
I want a user-defined exception based on the token limit.
I need to validate the task input's context length and add a user-defined message to the output report.
Example: the task input is 40k tokens and max_token=32k, so it normally raises an "exceeding max_token limit" error. At that point I need to add the user-defined error message instead.
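A minimal sketch of that validation step, assuming a rough chars-per-token heuristic (`estimate_tokens` is a placeholder I introduced; a real tokenizer such as the model's own would be more accurate). The idea is to check the input size *before* kicking off the task and raise a custom exception carrying the user-defined message:

```python
MAX_TOKENS = 32_768  # context limit reported for mistralai/mistral-large


class TokenLimitExceededError(Exception):
    """User-defined error carrying the message to put in the output report."""

    def __init__(self, input_tokens: int, max_tokens: int):
        self.input_tokens = input_tokens
        self.max_tokens = max_tokens
        super().__init__(
            f"Input tokens {input_tokens} exceed the model limit "
            f"{max_tokens}; skipping this task."
        )


def estimate_tokens(text: str) -> int:
    # Rough placeholder heuristic: ~4 characters per token.
    return len(text) // 4


def validate_task_input(text: str, max_tokens: int = MAX_TOKENS) -> None:
    # Raise the user-defined exception instead of letting the
    # provider return its own 400 error mid-run.
    tokens = estimate_tokens(text)
    if tokens > max_tokens:
        raise TokenLimitExceededError(tokens, max_tokens)
```

With this pre-check in place, the caller can catch `TokenLimitExceededError` and write its message into the report rather than crashing on the provider-side 400.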
File "/Users/paarttipaa/ProjectTask/GithubProj/slc_code_explanation_project/SLC_Step02_Crewai/work/crewai/javadesigndocgen/.venv/lib/python3.12/site-packages/litellm/llms/watsonx/completion/handler.py", line 420, in handle_text_request
    with self.request_manager.request(
         ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/Library/Frameworks/Python.framework/Versions/3.12/lib/python3.12/contextlib.py", line 137, in __enter__
    return next(self.gen)
           ^^^^^^^^^^^^^^
File "/Users/paarttipaa/ProjectTask/GithubProj/slc_code_explanation_project/SLC_Step02_Crewai/work/crewai/javadesigndocgen/.venv/lib/python3.12/site-packages/litellm/llms/watsonx/completion/handler.py", line 698, in request
    raise WatsonXAIError(status_code=500, message=str(e))
litellm.llms.watsonx.common_utils.WatsonXAIError: Error 400 (Bad Request): {"errors":[{"code":"invalid_input_argument","message":"Invalid input argument for Model 'mistralai/mistral-large': the number of input tokens 39055 cannot exceed the total tokens limit 32768 for this model","more_info":"https://cloud.ibm.com/apidocs/watsonx-ai"}],"trace":"f7766797cddbd9e1e32c3595ede7e127","status_code":400}
This error arises from the input token limitation.
I am trying to handle this error with a user-defined exception message and continue the remaining processing during kickoff_for_each execution. Help me with this…
@tonykipkemboi I'm facing difficulties implementing the above exception class.
In kickoff_for_each(), if the exception occurs after the nth iteration, it should print an exception message and continue with the remaining iterations. Please help with this.
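One way to get that behavior is to loop over the inputs yourself instead of calling `kickoff_for_each` directly, so each iteration gets its own try/except. This is a sketch under assumptions: `fake_kickoff` stands in for `crew.kickoff(inputs=...)`, and `TokenLimitExceededError` is the hypothetical user-defined exception described above:

```python
class TokenLimitExceededError(Exception):
    """User-defined exception for inputs that exceed the model's token limit."""


def fake_kickoff(inputs: dict) -> str:
    # Stand-in for crew.kickoff(inputs=...). Raises when the input is
    # too long, mimicking the watsonx 400 "input tokens exceed limit" error.
    if len(inputs["code"]) > 100:
        raise TokenLimitExceededError(
            "Input token count exceeds the 32768-token model limit."
        )
    return f"doc for {inputs['name']}"


def kickoff_for_each_safe(batches: list[dict]) -> list[str]:
    report = []
    for item in batches:
        try:
            report.append(fake_kickoff(item))
        except TokenLimitExceededError as exc:
            # The user-defined message goes into the output report,
            # and the loop continues with the next input.
            print(f"Skipping {item['name']}: {exc}")
            report.append(f"SKIPPED {item['name']}: {exc}")
    return report


results = kickoff_for_each_safe([
    {"name": "A", "code": "x" * 10},
    {"name": "B", "code": "x" * 500},  # too long, gets skipped
    {"name": "C", "code": "x" * 20},   # still runs after the failure
])
```

Because the exception is caught per iteration, a failure on the nth input does not stop inputs n+1 onward; the report keeps a record of both successes and skipped items.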