I want an app with a chat interface that lets me interact with an LLM model. Don't define any code for the model; focus only on the interface. I want it to look as similar as possible to ChatGPT's.
import streamlit as st

# Set page config for a cleaner look
st.set_page_config(
    page_title="Chat Interface",
    page_icon="💬",
    layout="wide"
)

# Custom CSS for styling
st.markdown("""
<style>
.stChatMessage {
    background-color: #f7f7f8;
    padding: 1rem;
    border-radius: 0.5rem;
    margin-bottom: 1rem;
}
.stChatMessage[data-testid="assistant"] {
    background-color: white;
}
.stTextInput {
    position: fixed;
    bottom: 3rem;
    width: calc(100% - 4rem) !important;
    padding: 1rem;
    background-color: white;
    border-radius: 0.5rem;
    box-shadow: 0 0 10px rgba(0, 0, 0, 0.1);
}
</style>
""", unsafe_allow_html=True)

# Initialize chat history
if "messages" not in st.session_state:
    st.session_state.messages = []

# Display chat messages from history on app rerun
for message in st.session_state.messages:
    with st.chat_message(message["role"]):
        st.markdown(message["content"])

# Chat input
if prompt := st.chat_input("Escribe un mensaje aquí..."):
    # Add user message to chat history
    st.session_state.messages.append({"role": "user", "content": prompt})
    # Display user message in chat message container
    with st.chat_message("user"):
        st.markdown(prompt)

    # Here you would normally process the message with your LLM
    # For now, we'll just echo back a placeholder response
    with st.chat_message("assistant"):
        response = "Este es un mensaje de placeholder. Aquí es donde integrarías tu modelo LLM para generar respuestas reales."
        st.markdown(response)
    # Add assistant response to chat history
    st.session_state.messages.append({"role": "assistant", "content": response})
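To get closer to ChatGPT's typing effect, the assistant reply can be streamed word by word instead of rendered all at once. A minimal sketch, assuming a Streamlit version that provides `st.write_stream` (1.31+); the `stream_response` helper and its `delay` parameter are illustrative additions, not part of the code above:

```python
import time


def stream_response(text: str, delay: float = 0.02):
    """Yield the response word by word to simulate token streaming."""
    for word in text.split(" "):
        yield word + " "
        time.sleep(delay)


# Inside the assistant block you could then write:
#     with st.chat_message("assistant"):
#         response = st.write_stream(stream_response(placeholder_text))
# st.write_stream renders the chunks as they arrive and returns the full
# concatenated string, so `response` can still be appended to
# st.session_state.messages exactly as before.
```

Once a real LLM is plugged in, the same generator pattern works for model outputs that arrive in chunks: pass the model's streaming iterator to `st.write_stream` directly.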