Runtime error

user_proxy (to assistant): Tell me about this project, and the library, then also tell me what I can use it for: https://www.gradio.app/guides/quickstart

--------------------------------------------------------------------------------
Traceback (most recent call last):
  File "/home/user/app/app.py", line 37, in <module>
    user_proxy.initiate_chat(
  File "/home/user/.local/lib/python3.10/site-packages/flaml/autogen/agentchat/conversable_agent.py", line 521, in initiate_chat
    self.send(self.generate_init_message(**context), recipient, silent=silent)
  File "/home/user/.local/lib/python3.10/site-packages/flaml/autogen/agentchat/conversable_agent.py", line 324, in send
    recipient.receive(message, self, request_reply, silent)
  File "/home/user/.local/lib/python3.10/site-packages/flaml/autogen/agentchat/conversable_agent.py", line 452, in receive
    reply = self.generate_reply(messages=self.chat_messages[sender], sender=sender)
  File "/home/user/.local/lib/python3.10/site-packages/flaml/autogen/agentchat/conversable_agent.py", line 764, in generate_reply
    final, reply = reply_func(self, messages=messages, sender=sender, config=reply_func_tuple["config"])
  File "/home/user/.local/lib/python3.10/site-packages/flaml/autogen/agentchat/conversable_agent.py", line 596, in generate_oai_reply
    response = oai.ChatCompletion.create(
  File "/home/user/.local/lib/python3.10/site-packages/flaml/autogen/oai/completion.py", line 758, in create
    raise ERROR
ImportError: please install flaml[openai] option to use the flaml.autogen.oai subpackage.
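The chat fails before the assistant can reply: generate_oai_reply calls oai.ChatCompletion.create, which raises an ImportError because FLAML's autogen.oai subpackage depends on the optional openai package. The error message itself names the fix: install the flaml[openai] extra, e.g. pip install "flaml[openai]". If the app runs in a managed container (the /home/user/app/app.py path suggests a Hugging Face Space or similar), add flaml[openai] to requirements.txt instead and let the container rebuild.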
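For reference, a minimal sketch of the kind of app.py that produces this call chain (the agent names, model, and configuration are assumptions; the actual file is not shown in the log). Once flaml[openai] is installed and an OpenAI API key is available, the same initiate_chat call should reach the model instead of raising:

from flaml import autogen  # flaml 2.x bundles the autogen agentchat API

# llm_config values are placeholders; the real app's settings are unknown.
assistant = autogen.AssistantAgent(
    name="assistant",
    llm_config={"model": "gpt-3.5-turbo"},  # needs the openai package and OPENAI_API_KEY
)
user_proxy = autogen.UserProxyAgent(
    name="user_proxy",
    human_input_mode="NEVER",      # run unattended, as a deployed app would
    code_execution_config=False,   # no local code execution in this sketch
)
# Corresponds to the initiate_chat frame at app.py line 37 in the traceback.
user_proxy.initiate_chat(
    assistant,
    message="Tell me about this project, and the library, then also tell me "
            "what I can use it for: https://www.gradio.app/guides/quickstart",
)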
