runtime error

Exit code: 1. Reason: successful

Traceback (most recent call last):
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/utils/_errors.py", line 304, in hf_raise_for_status
    response.raise_for_status()
  File "/usr/local/lib/python3.10/site-packages/requests/models.py", line 1024, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 500 Server Error: Internal Server Error for url: https://api-inference.huggingface.co/pipeline/feature-extraction/sentence-transformers/all-mpnet-base-v2

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/user/app/app.py", line 36, in <module>
    vectorstore = Chroma.from_documents(documents = data, embedding = embeddings)
  File "/usr/local/lib/python3.10/site-packages/langchain_chroma/vectorstores.py", line 921, in from_documents
    return cls.from_texts(
  File "/usr/local/lib/python3.10/site-packages/langchain_chroma/vectorstores.py", line 882, in from_texts
    chroma_collection.add_texts(texts=texts, metadatas=metadatas, ids=ids)
  File "/usr/local/lib/python3.10/site-packages/langchain_chroma/vectorstores.py", line 389, in add_texts
    embeddings = self._embedding_function.embed_documents(texts)
  File "/usr/local/lib/python3.10/site-packages/langchain_huggingface/embeddings/huggingface_endpoint.py", line 107, in embed_documents
    responses = self.client.post(
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/inference/_client.py", line 273, in post
    hf_raise_for_status(response)
  File "/usr/local/lib/python3.10/site-packages/huggingface_hub/utils/_errors.py", line 371, in hf_raise_for_status
    raise HfHubHTTPError(str(e), response=response) from e
huggingface_hub.utils._errors.HfHubHTTPError: 500 Server Error: Internal Server Error for url: https://api-inference.huggingface.co/pipeline/feature-extraction/sentence-transformers/all-mpnet-base-v2 (Request ID: 3ubq9mL43OY-jNfLx6Bap)

Model too busy, unable to get response in less than 60 second(s)
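The traceback shows the failure originates in `embed_documents`: the hosted inference endpoint for `sentence-transformers/all-mpnet-base-v2` returned a 500 ("Model too busy") while `Chroma.from_documents` was embedding the texts, and the app crashed on the first failed request. Since the error is transient server-side load, one common workaround is to retry the embedding step with exponential backoff. Below is a minimal sketch of a generic retry helper; `with_retries` and the commented usage are hypothetical, not part of the original `app.py`.

```python
import time


def with_retries(fn, attempts=3, base_delay=1.0, retry_on=(Exception,)):
    """Call fn(); on a listed exception, sleep with exponential backoff
    (base_delay, 2*base_delay, 4*base_delay, ...) and try again.
    Re-raises the last exception once attempts are exhausted."""
    for attempt in range(attempts):
        try:
            return fn()
        except retry_on:
            if attempt == attempts - 1:
                raise
            time.sleep(base_delay * (2 ** attempt))


# Hypothetical usage around the failing call from the traceback:
# vectorstore = with_retries(
#     lambda: Chroma.from_documents(documents=data, embedding=embeddings),
#     attempts=5,
#     base_delay=5.0,
# )
```

If the endpoint stays busy even with retries, computing embeddings locally (e.g. `HuggingFaceEmbeddings` instead of the endpoint-backed class) avoids the hosted API entirely, at the cost of loading the model in the Space.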
