Writing graph with 0 nodes, 0 edges #109

Open
ericskh2 opened this issue Dec 14, 2024 · 0 comments
Hi, every time I run insert() in using_llm_api_as_llm+ollama_embedding.py, after a few hours I get the log line INFO:nano-graphrag:Writing graph with 0 nodes, 0 edges and then the error openai.APIConnectionError: Connection error. Could you share suggestions for a possible fix? Thank you very much.

⠴ Processed 1015 chunks, 41892 entities(duplicated), 15247 relations(duplicated)
INFO:httpx:HTTP Request: POST https://api.deepseek.com/chat/completions "HTTP/1.1 200 OK"
INFO:httpx:HTTP Request: POST https://api.deepseek.com/chat/completions "HTTP/1.1 200 OK"
⠦ Processed 1016 chunks, 41923 entities(duplicated), 15256 relations(duplicated)
INFO:httpx:HTTP Request: POST https://api.deepseek.com/chat/completions "HTTP/1.1 200 OK"
INFO:httpx:HTTP Request: POST https://api.deepseek.com/chat/completions "HTTP/1.1 200 OK"
INFO:httpx:HTTP Request: POST https://api.deepseek.com/chat/completions "HTTP/1.1 200 OK"
INFO:httpx:HTTP Request: POST https://api.deepseek.com/chat/completions "HTTP/1.1 200 OK"
INFO:httpx:HTTP Request: POST https://api.deepseek.com/chat/completions "HTTP/1.1 200 OK"
INFO:httpx:HTTP Request: POST https://api.deepseek.com/chat/completions "HTTP/1.1 200 OK"
INFO:httpx:HTTP Request: POST https://api.deepseek.com/chat/completions "HTTP/1.1 200 OK"
INFO:httpx:HTTP Request: POST https://api.deepseek.com/chat/completions "HTTP/1.1 200 OK"
⠧ Processed 1017 chunks, 41943 entities(duplicated), 15265 relations(duplicated)
INFO:nano-graphrag:Writing graph with 0 nodes, 0 edges
Traceback (most recent call last):
  File "/research/d2/msc/khsew24/conda_envs/nano-graphrag/lib/python3.10/site-packages/httpx/_transports/default.py", line 72, in map_httpcore_exceptions
    yield
  File "/research/d2/msc/khsew24/conda_envs/nano-graphrag/lib/python3.10/site-packages/httpx/_transports/default.py", line 257, in __aiter__
    async for part in self._httpcore_stream:
  File "/research/d2/msc/khsew24/conda_envs/nano-graphrag/lib/python3.10/site-packages/httpcore/_async/connection_pool.py", line 367, in __aiter__
    raise exc from None
  File "/research/d2/msc/khsew24/conda_envs/nano-graphrag/lib/python3.10/site-packages/httpcore/_async/connection_pool.py", line 363, in __aiter__
    async for part in self._stream:
  File "/research/d2/msc/khsew24/conda_envs/nano-graphrag/lib/python3.10/site-packages/httpcore/_async/http11.py", line 349, in __aiter__
    raise exc
  File "/research/d2/msc/khsew24/conda_envs/nano-graphrag/lib/python3.10/site-packages/httpcore/_async/http11.py", line 341, in __aiter__
    async for chunk in self._connection._receive_response_body(**kwargs):
  File "/research/d2/msc/khsew24/conda_envs/nano-graphrag/lib/python3.10/site-packages/httpcore/_async/http11.py", line 210, in _receive_response_body
    event = await self._receive_event(timeout=timeout)
  File "/research/d2/msc/khsew24/conda_envs/nano-graphrag/lib/python3.10/site-packages/httpcore/_async/http11.py", line 220, in _receive_event
    with map_exceptions({h11.RemoteProtocolError: RemoteProtocolError}):
  File "/research/d2/msc/khsew24/conda_envs/nano-graphrag/lib/python3.10/contextlib.py", line 153, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/research/d2/msc/khsew24/conda_envs/nano-graphrag/lib/python3.10/site-packages/httpcore/_exceptions.py", line 14, in map_exceptions
    raise to_exc(exc) from exc
httpcore.RemoteProtocolError: peer closed connection without sending complete message body (incomplete chunked read)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/research/d2/msc/khsew24/conda_envs/nano-graphrag/lib/python3.10/site-packages/openai/_base_client.py", line 1565, in _request
    response = await self._client.send(
  File "/research/d2/msc/khsew24/conda_envs/nano-graphrag/lib/python3.10/site-packages/httpx/_client.py", line 1688, in send
    raise exc
  File "/research/d2/msc/khsew24/conda_envs/nano-graphrag/lib/python3.10/site-packages/httpx/_client.py", line 1682, in send
    await response.aread()
  File "/research/d2/msc/khsew24/conda_envs/nano-graphrag/lib/python3.10/site-packages/httpx/_models.py", line 913, in aread
    self._content = b"".join([part async for part in self.aiter_bytes()])
  File "/research/d2/msc/khsew24/conda_envs/nano-graphrag/lib/python3.10/site-packages/httpx/_models.py", line 913, in <listcomp>
    self._content = b"".join([part async for part in self.aiter_bytes()])
  File "/research/d2/msc/khsew24/conda_envs/nano-graphrag/lib/python3.10/site-packages/httpx/_models.py", line 931, in aiter_bytes
    async for raw_bytes in self.aiter_raw():
  File "/research/d2/msc/khsew24/conda_envs/nano-graphrag/lib/python3.10/site-packages/httpx/_models.py", line 989, in aiter_raw
    async for raw_stream_bytes in self.stream:
  File "/research/d2/msc/khsew24/conda_envs/nano-graphrag/lib/python3.10/site-packages/httpx/_client.py", line 150, in __aiter__
    async for chunk in self._stream:
  File "/research/d2/msc/khsew24/conda_envs/nano-graphrag/lib/python3.10/site-packages/httpx/_transports/default.py", line 256, in __aiter__
    with map_httpcore_exceptions():
  File "/research/d2/msc/khsew24/conda_envs/nano-graphrag/lib/python3.10/contextlib.py", line 153, in __exit__
    self.gen.throw(typ, value, traceback)
  File "/research/d2/msc/khsew24/conda_envs/nano-graphrag/lib/python3.10/site-packages/httpx/_transports/default.py", line 89, in map_httpcore_exceptions
    raise mapped_exc(message) from exc
httpx.RemoteProtocolError: peer closed connection without sending complete message body (incomplete chunked read)

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/research/d2/msc/khsew24/nano-graphrag/examples/using_llm_api_as_llm+ollama_embedding.py", line 168, in <module>
    insert(args.documents_path)
  File "/research/d2/msc/khsew24/nano-graphrag/examples/using_llm_api_as_llm+ollama_embedding.py", line 127, in insert
    rag.insert(file_contents_list)
  File "/research/d2/msc/khsew24/nano-graphrag/nano_graphrag/graphrag.py", line 207, in insert
    return loop.run_until_complete(self.ainsert(string_or_strings))
  File "/research/d2/msc/khsew24/conda_envs/nano-graphrag/lib/python3.10/asyncio/base_events.py", line 649, in run_until_complete
    return future.result()
  File "/research/d2/msc/khsew24/nano-graphrag/nano_graphrag/graphrag.py", line 296, in ainsert
    maybe_new_kg = await self.entity_extraction_func(
  File "/research/d2/msc/khsew24/nano-graphrag/nano_graphrag/_op.py", line 383, in extract_entities
    results = await asyncio.gather(
  File "/research/d2/msc/khsew24/nano-graphrag/nano_graphrag/_op.py", line 326, in _process_single_content
    glean_result = await use_llm_func(continue_prompt, history_messages=history)
  File "/research/d2/msc/khsew24/nano-graphrag/nano_graphrag/_utils.py", line 177, in wait_func
    result = await func(*args, **kwargs)
  File "/research/d2/msc/khsew24/nano-graphrag/examples/using_llm_api_as_llm+ollama_embedding.py", line 60, in llm_model_if_cache
    response = await openai_async_client.chat.completions.create(
  File "/research/d2/msc/khsew24/conda_envs/nano-graphrag/lib/python3.10/site-packages/openai/resources/chat/completions.py", line 1490, in create
    return await self._post(
  File "/research/d2/msc/khsew24/conda_envs/nano-graphrag/lib/python3.10/site-packages/openai/_base_client.py", line 1832, in post
    return await self.request(cast_to, opts, stream=stream, stream_cls=stream_cls)
  File "/research/d2/msc/khsew24/conda_envs/nano-graphrag/lib/python3.10/site-packages/openai/_base_client.py", line 1526, in request
    return await self._request(
  File "/research/d2/msc/khsew24/conda_envs/nano-graphrag/lib/python3.10/site-packages/openai/_base_client.py", line 1599, in _request
    raise APIConnectionError(request=request) from err
openai.APIConnectionError: Connection error.
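
The failure looks like a transient network drop (httpcore.RemoteProtocolError: peer closed connection without sending complete message body) that the OpenAI client surfaces as openai.APIConnectionError. One possible, untested workaround is to retry the chat call on connection-level errors. The sketch below assumes the openai_async_client created in the example script; the retry policy itself is not part of nano-graphrag:

```python
# Untested sketch: retry the LLM call when the connection drops mid-response.
# "openai_async_client" refers to the AsyncOpenAI client created in the example
# script; the retry policy itself is an assumption, not part of nano-graphrag.
import openai
from tenacity import (
    retry,
    retry_if_exception_type,
    stop_after_attempt,
    wait_exponential,
)

@retry(
    retry=retry_if_exception_type(openai.APIConnectionError),
    wait=wait_exponential(multiplier=1, min=4, max=60),
    stop=stop_after_attempt(5),
)
async def chat_completion_with_retry(openai_async_client, **kwargs):
    # Only connection-level failures trigger a retry; other API errors are raised immediately.
    return await openai_async_client.chat.completions.create(**kwargs)
```

Alternatively, raising max_retries when constructing the AsyncOpenAI client (the default only retries connection errors a small number of times) might be enough to ride out brief outages on long insert runs.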