Client session memory

Hi, I would like to build a kind of personal shopper exposed over REST using the CrewAI framework… I’m thinking about the best approach to agent memory, keeping in mind the following:

  • The agent should have a session memory for each client conversation
  • The agent should have a long-term memory for products, FAQs, offers, …
  • Perhaps there could also be a long-term memory per client, across different shops…
    Looking at crew memory, it seems it cannot support these use cases.
    Any suggested approach? Perhaps setting memory to false and building specific components for that?

Thanks in advance!!!

Hi @rolmovel,
What you describe is functionality that is normally provided by the backend storage of an e-commerce site, usually via some type of DB (MySQL, Mongo, etc.), where the entities you describe are indexed by a session_id (_sid) collected from the HTTP requests that the BE receives. TBH, it would not make sense to ask a crew to store such things.
The way I would go is to create a custom tool that fetches what you want from an external datasource.
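As a minimal sketch of what such a session_id-indexed datasource could look like: the `SessionStore` class below (the name and the SQLite schema are my own assumptions, not CrewAI API) keeps per-conversation history that a custom tool's `_run()` method could query instead of relying on crew memory.

```python
import sqlite3


class SessionStore:
    """Conversation memory keyed by session_id, backed by SQLite.

    A CrewAI custom tool could wrap history() to give the agent
    access to the current client's conversation on each request.
    """

    def __init__(self, path=":memory:"):
        self.db = sqlite3.connect(path)
        self.db.execute(
            "CREATE TABLE IF NOT EXISTS messages ("
            "session_id TEXT, role TEXT, content TEXT)"
        )

    def add(self, session_id, role, content):
        # Append one turn of the conversation for this session.
        self.db.execute(
            "INSERT INTO messages (session_id, role, content) VALUES (?, ?, ?)",
            (session_id, role, content),
        )
        self.db.commit()

    def history(self, session_id):
        # Return (role, content) pairs in insertion order.
        return self.db.execute(
            "SELECT role, content FROM messages WHERE session_id = ? "
            "ORDER BY rowid",
            (session_id,),
        ).fetchall()


store = SessionStore()
store.add("sid-123", "user", "Do you have this shoe in size 42?")
store.add("sid-123", "assistant", "Yes, in black and brown.")
print(store.history("sid-123"))
```

The same pattern extends to the long-term product/FAQ memory: another table (or a vector store) queried by a second tool, with the crew itself kept stateless.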

Others may have other opinions.


Hi @Dabnis, what a nice debate!!!

Not sure; I think there will come a time when both technologies have to live together… In this particular use case, have you ever seen any online store with search capabilities like a crew’s? Any backend process with the kind of reasoning capabilities a crew has?

In my opinion, having more isolated, and perhaps more scalable, crews could open opportunities for better web applications.

I really think that assistants should change the corseted way we navigate the web today…
but I’m just philosophizing!!!

Hi @rolmovel,
In my opinion, one of the main issues with modern AI is that solutions at our level are Python-based, but Python-based apps are in the minority when it comes to cloud/server-based applications.
Python was originally just a scripting language, not really a language meant for such large-scale deployments.
A practical example: a simple cloud-based application written in a language such as Golang or Rust will typically have a Docker container size of 50-100 MB. The same application in Python would produce a Docker container of hundreds of MB.
Do a Google search for ‘Why is my Python container so large?’. I’ve seen Python apps that result in Docker containers of 1 GB+. While it is possible to deploy such systems, doing so is far more expensive in terms of resources and cost.
There is a saying: ‘Python is holding AI hostage!’ Having worked in the industry for a long time, I am aware of major investment, in terms of man-hours, in porting AI-related Python functionality to languages such as Go and Rust, Rust being the favoured option.
I could write a book about why Python-deployed containers in a Kubernetes environment are considered bad practice, in part for the same reasons I describe above.
I applaud your thoughts, but such things will not happen at scale while people are using Python for AI applications.

To answer your original question: ‘have you ever seen any online store with search capabilities like a crew’s?’ No, but I have seen other server-side AI applications do this with e-commerce systems; those server-side AI applications are not written in Python.

Think of it like this: we have all seen how much time it takes to run a crew, and yes, that varies depending on tasks, hardware resources, etc. In the ‘real’ world, backend systems have to deal with many thousands of requests a second on an API endpoint. How would your crew deal with that scenario?
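One common way to reconcile a slow crew with a fast endpoint is to take the crew run off the request path: the handler enqueues a job and returns immediately, and a worker processes jobs in the background. A minimal sketch, where `run_crew` is a stand-in of my own for a slow `crew.kickoff()` call (not actual CrewAI API):

```python
import queue
import threading
import time
import uuid

jobs = {}              # job_id -> result (None while still pending)
work = queue.Queue()   # pending (job_id, payload) pairs


def run_crew(payload):
    """Stand-in for a slow crew run (assumption for illustration)."""
    time.sleep(0.1)
    return f"recommendations for {payload}"


def worker():
    # Background worker: drains the queue one job at a time.
    while True:
        job_id, payload = work.get()
        jobs[job_id] = run_crew(payload)
        work.task_done()


threading.Thread(target=worker, daemon=True).start()


def submit(payload):
    """Fast endpoint handler: enqueue and return a job id immediately."""
    job_id = str(uuid.uuid4())
    jobs[job_id] = None
    work.put((job_id, payload))
    return job_id


job = submit("sid-123 wants running shoes")
work.join()            # in a real API the client would poll for the result
print(jobs[job])
```

The endpoint stays responsive regardless of how long the crew takes; scaling then means adding workers, not blocking request threads.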

Having written all of the above: CrewAI does have a cloud-based solution, but I’m assuming that’s more of a SaaS-type thing. Deploying to your own server-side environment may not be the best commercial choice!

REM: Just my opinion, but you can easily do your own research on this. REM: Technology continues to improve.

Regards.
JP
