
Using the AI feature (local and online models) leads to errors #1106

@amru39

Description


Describe the bug:

Using the AI feature leads to errors with both local and online models.

To Reproduce:

Running the AI chat leads to the errors shown below.

Local Ollama with LLAMA3:8b

```
  File "/home/rahul/Downloads/QualCoder/src/qualcoder/ai_async_worker.py", line 108, in run
    result = self.fn(*self.args, **self.kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/rahul/Downloads/QualCoder/src/qualcoder/ai_llm.py", line 728, in _retrieve_similar_data
    descriptions = self.generate_code_descriptions(code_name, code_memo)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/rahul/Downloads/QualCoder/src/qualcoder/ai_llm.py", line 703, in generate_code_descriptions
    res = self.large_llm.invoke(code_descriptions_prompt, response_format={"type": "json_object"}, config=config)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/rahul/Downloads/QualCoder/env/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 393, in invoke
    self.generate_prompt(
  File "/home/rahul/Downloads/QualCoder/env/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 1019, in generate_prompt
    return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/rahul/Downloads/QualCoder/env/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 837, in generate
    self._generate_with_cache(
  File "/home/rahul/Downloads/QualCoder/env/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 1085, in _generate_with_cache
    result = self._generate(
             ^^^^^^^^^^^^^^^
  File "/home/rahul/Downloads/QualCoder/env/lib/python3.12/site-packages/langchain_openai/chat_models/base.py", line 1142, in _generate
    return generate_from_stream(stream_iter)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/rahul/Downloads/QualCoder/env/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 168, in generate_from_stream
    generation = next(stream, None)
                 ^^^^^^^^^^^^^^^^^^
  File "/home/rahul/Downloads/QualCoder/env/lib/python3.12/site-packages/langchain_openai/chat_models/base.py", line 1096, in _stream
    with context_manager as response:
  File "/home/rahul/Downloads/QualCoder/env/lib/python3.12/site-packages/openai/lib/streaming/chat/_completions.py", line 150, in __enter__
    raw_stream = self.__api_request()
                 ^^^^^^^^^^^^^^^^^^^^
  File "/home/rahul/Downloads/QualCoder/env/lib/python3.12/site-packages/openai/_utils/_utils.py", line 286, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/home/rahul/Downloads/QualCoder/env/lib/python3.12/site-packages/openai/resources/chat/completions/completions.py", line 1147, in create
    return self._post(
           ^^^^^^^^^^^
  File "/home/rahul/Downloads/QualCoder/env/lib/python3.12/site-packages/openai/_base_client.py", line 1259, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/rahul/Downloads/QualCoder/env/lib/python3.12/site-packages/openai/_base_client.py", line 1047, in request
    raise self._make_status_error_from_response(err.response) from None
```

AI Error: `NotFoundError`
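A `NotFoundError` (HTTP 404) from a local Ollama backend usually means the configured model name does not match any model installed on the server. Ollama model tags are conventionally lowercase (`llama3:8b`), so a name entered as `LLAMA3:8b` may be one thing to check. A minimal sketch of such a check — the helper name and normalization rules are my own illustration, not QualCoder's actual code:

```python
def model_available(configured: str, installed_tags: list[str]) -> bool:
    """Check a configured model name against the tags a server reports.

    Compares case-insensitively, and lets a bare name ("llama3") match
    its ":latest" tag, since users often omit the explicit tag.
    """
    want = configured.strip().lower()
    tags = {t.strip().lower() for t in installed_tags}
    if want in tags:
        return True
    # A bare name should match "name:latest" when no tag was given.
    return ":" not in want and f"{want}:latest" in tags
```

In practice the tag list would come from the Ollama server's model listing endpoint; it is passed in here so the check works offline.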

OpenAI API:

```
  File "/home/rahul/Downloads/QualCoder/src/qualcoder/ai_async_worker.py", line 108, in run
    result = self.fn(*self.args, **self.kwargs)
             ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/rahul/Downloads/QualCoder/src/qualcoder/ai_llm.py", line 728, in _retrieve_similar_data
    descriptions = self.generate_code_descriptions(code_name, code_memo)
                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/rahul/Downloads/QualCoder/src/qualcoder/ai_llm.py", line 703, in generate_code_descriptions
    res = self.large_llm.invoke(code_descriptions_prompt, response_format={"type": "json_object"}, config=config)
          ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/rahul/Downloads/QualCoder/env/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 393, in invoke
    self.generate_prompt(
  File "/home/rahul/Downloads/QualCoder/env/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 1019, in generate_prompt
    return self.generate(prompt_messages, stop=stop, callbacks=callbacks, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/rahul/Downloads/QualCoder/env/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 837, in generate
    self._generate_with_cache(
  File "/home/rahul/Downloads/QualCoder/env/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 1085, in _generate_with_cache
    result = self._generate(
             ^^^^^^^^^^^^^^^
  File "/home/rahul/Downloads/QualCoder/env/lib/python3.12/site-packages/langchain_openai/chat_models/base.py", line 1142, in _generate
    return generate_from_stream(stream_iter)
           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/rahul/Downloads/QualCoder/env/lib/python3.12/site-packages/langchain_core/language_models/chat_models.py", line 168, in generate_from_stream
    generation = next(stream, None)
                 ^^^^^^^^^^^^^^^^^^
  File "/home/rahul/Downloads/QualCoder/env/lib/python3.12/site-packages/langchain_openai/chat_models/base.py", line 1096, in _stream
    with context_manager as response:
  File "/home/rahul/Downloads/QualCoder/env/lib/python3.12/site-packages/openai/lib/streaming/chat/_completions.py", line 150, in __enter__
    raw_stream = self.__api_request()
                 ^^^^^^^^^^^^^^^^^^^^
  File "/home/rahul/Downloads/QualCoder/env/lib/python3.12/site-packages/openai/_utils/_utils.py", line 286, in wrapper
    return func(*args, **kwargs)
           ^^^^^^^^^^^^^^^^^^^^^
  File "/home/rahul/Downloads/QualCoder/env/lib/python3.12/site-packages/openai/resources/chat/completions/completions.py", line 1147, in create
    return self._post(
           ^^^^^^^^^^^
  File "/home/rahul/Downloads/QualCoder/env/lib/python3.12/site-packages/openai/_base_client.py", line 1259, in post
    return cast(ResponseT, self.request(cast_to, opts, stream=stream, stream_cls=stream_cls))
                           ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
  File "/home/rahul/Downloads/QualCoder/env/lib/python3.12/site-packages/openai/_base_client.py", line 1047, in request
    raise self._make_status_error_from_response(err.response) from None
```

AI Error: `RateLimitError`
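A `RateLimitError` (HTTP 429) from the OpenAI API often indicates an exhausted quota or a missing billing method rather than a bug in the calling code; genuinely transient 429s can be absorbed with exponential backoff. A generic retry sketch to illustrate the pattern — this is not QualCoder's actual handling, and the function and parameter names are my own:

```python
import time

def retry_with_backoff(fn, retries=3, base_delay=1.0,
                       retriable=(Exception,), sleep=time.sleep):
    """Call fn(); on a retriable error, wait base_delay * 2**attempt
    and try again, re-raising after the final attempt."""
    for attempt in range(retries):
        try:
            return fn()
        except retriable:
            if attempt == retries - 1:
                raise
            sleep(base_delay * (2 ** attempt))
```

The `sleep` parameter is injectable so the backoff schedule can be tested without actually waiting; in real use it would wrap the LLM call with `retriable` set to the client's rate-limit exception type.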

Expected behavior:

The errors should not occur.

Screenshots:

Desktop (please complete the following information):

  • OS: Linux Mint
  • Version: 22.2 (Cinnamon)

Additional context:

Using QualCoder 3.7.
