Hi,
I have a Streamlit app that dumps the output of `crew.kickoff`,
as simple as the following:
```python
if user_input := st.chat_input("Type your question here..."):
    # Add user message to chat history
    st.session_state.conversation_history.append({"role": "user", "content": user_input})

    # Display user message in chat message container
    with st.chat_message("user"):
        st.markdown(user_input)

    # Generate AI response
    with st.spinner("Thinking..."):
        result = crew.kickoff(inputs={
            'conversation_history': st.session_state.conversation_history,
            'user_input': user_input,
            'system_prompt': sys_prompt,
        })

    # Display AI response in chat message container
    with st.chat_message("assistant"):
        st.markdown(result)

    # Add assistant response to chat history
    st.session_state.conversation_history.append({"role": "assistant", "content": result})
```
I have already spent hours trying to figure out how to progressively stream the output.
Streamlit offers a component called `st.write_stream`, but it expects a generator; is there a snippet that can help me get it to work?
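To be clear, `st.write_stream` itself works fine for me with a plain generator, so the missing piece is specifically bridging `crew.kickoff`'s output into such a generator. A trivial sanity check (the `fake_stream` name is just illustrative):

```python
import time

# A plain generator like this streams fine when passed to
#     st.write_stream(fake_stream())
# so the question is only how to wrap crew.kickoff's output this way.
def fake_stream():
    for word in "This streams word by word".split():
        yield word + " "
        time.sleep(0.05)  # simulate token-by-token arrival
```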
I already have

```python
llm = LLM(
    model="openai/gpt-4o",
    stream=True  # Enable streaming
)
```

and my agent is powered by that LLM.
I can see in the console that the output is printed as a stream, but that's not the case in Streamlit itself.
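The shape I imagine the answer taking is something like the sketch below: run the kickoff in a background thread, push chunks into a queue, and hand `st.write_stream` a generator that drains the queue. The `on_chunk` callback here is made up — I don't know which CrewAI hook actually fires per token, which is exactly what I'm asking:

```python
import queue
import threading

def stream_kickoff(kickoff_fn):
    """Wrap a chunk-producing kickoff call as a generator.

    `kickoff_fn` is assumed to accept a hypothetical `on_chunk` callback
    that is invoked once per streamed chunk -- the real CrewAI hook for
    this is the part I'm missing.
    """
    q = queue.Queue()
    DONE = object()  # sentinel marking the end of the stream

    def run():
        try:
            kickoff_fn(on_chunk=q.put)  # hypothetical per-chunk callback
        finally:
            q.put(DONE)

    threading.Thread(target=run, daemon=True).start()

    def gen():
        # Block on the queue until the sentinel arrives,
        # yielding each chunk to st.write_stream as it lands.
        while (chunk := q.get()) is not DONE:
            yield chunk

    return gen()
```

With a hook like that, the assistant block would become `st.write_stream(stream_kickoff(...))` instead of `st.markdown(result)`.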