re-init #503

Open

Sir1002 wants to merge 1 commit into AsyncFuncAI:main from Sir1002:main

Conversation


@Sir1002 Sir1002 commented Apr 2, 2026

No description provided.


@gemini-code-assist gemini-code-assist bot left a comment


Code Review

This pull request shifts DeepWiki towards a local-first architecture by defaulting to Ollama for LLM and embedding services and introducing a local repository type for data processing. It integrates promptfoo for regression testing, accompanied by new VS Code tasks and documentation. Review feedback identifies several hardcoded absolute paths that limit portability across environments. Additionally, there is a potential TypeError in the local file handling logic when repo_url is missing, redundant string replacements in the streaming chat implementation, and a regression in the frontend where language preferences are no longer persisted in local storage.

@@ -0,0 +1,214 @@

```js
const DEFAULT_API_BASE_URL = 'http://127.0.0.1:8001';
const DEFAULT_REPO_URL = '/Users/samroku/dev/deepwiki-open';
```

Severity: high

The DEFAULT_REPO_URL is hardcoded to a specific local path (/Users/samroku/dev/deepwiki-open). This will cause the provider to fail for any other user. Consider using an environment variable or a relative path.
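One portable option is a sketch along these lines (the environment variable name `DEEPWIKI_REPO_PATH` and the `process.cwd()` fallback are illustrative assumptions, not part of this PR):

```javascript
// Resolve the repo path from the environment instead of a machine-specific
// absolute path. DEEPWIKI_REPO_PATH is a hypothetical variable name; the
// fallback to the current working directory is also an assumption.
function resolveRepoUrl(env = process.env) {
  return env.DEEPWIKI_REPO_PATH || process.cwd();
}

const DEFAULT_API_BASE_URL = 'http://127.0.0.1:8001';
const DEFAULT_REPO_URL = resolveRepoUrl();
```

With this shape, each contributor can point the provider at their own checkout without editing the source.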

```yaml
label: deepwiki-local-ollama
config:
  apiBaseUrl: http://127.0.0.1:8001
  repoUrl: /Users/samroku/dev/deepwiki-open
```

Severity: high

Hardcoded absolute path detected. This configuration is not portable and will fail on other environments.

3. Run `npm run promptfoo:ci` for the full local regression suite
4. Optional: inspect the latest run with `npm run promptfoo:view`

The default eval config lives in `promptfooconfig.yaml` and targets the local repository path `/Users/samroku/dev/deepwiki-open`, so it stays inside your local-first workflow by default.

Severity: medium

The path /Users/samroku/dev/deepwiki-open is hardcoded. This makes the documentation and the default configuration specific to your local machine. Please use a placeholder or a relative path to ensure it works for other contributors.

Comment on lines +703 to +707

```python
if repo_type == "local" or (
    repo_url
    and not repo_url.startswith("http://")
    and not repo_url.startswith("https://")
):
```

Severity: medium

If `repo_type` is set to `"local"` but `repo_url` is missing or null, `os.path.abspath(repo_url)` will raise a `TypeError`. Add a check to ensure `repo_url` is provided when `repo_type` is `"local"`.

Suggested change

```diff
-if repo_type == "local" or (
-    repo_url
-    and not repo_url.startswith("http://")
-    and not repo_url.startswith("https://")
-):
+if (repo_type == "local" or (
+    repo_url
+    and not repo_url.startswith("http://")
+    and not repo_url.startswith("https://")
+)) and repo_url:
```

Comment on lines +516 to 520

```python
text = extract_ollama_stream_text(chunk).replace('<think>', '').replace('</think>', '')
if text:
    logger.debug("Streaming Ollama chunk with %s chars", len(text))
    text = text.replace('<think>', '').replace('</think>', '')
    yield text
```

Severity: medium

The `<think>` and `</think>` tags are replaced twice: they are already stripped when `text` is assigned on line 516, making the second replacement on line 519 redundant.

Suggested change

```diff
 text = extract_ollama_stream_text(chunk).replace('<think>', '').replace('</think>', '')
 if text:
     logger.debug("Streaming Ollama chunk with %s chars", len(text))
-    text = text.replace('<think>', '').replace('</think>', '')
     yield text
```

Comment on lines +641 to 644

```python
text = extract_ollama_stream_text(chunk).replace('<think>', '').replace('</think>', '')
if text:
    text = text.replace('<think>', '').replace('</think>', '')
    yield text
```

Severity: medium

Redundant replacement of `<think>` tags: they are already stripped on line 641.

Suggested change

```diff
 text = extract_ollama_stream_text(chunk).replace('<think>', '').replace('</think>', '')
 if text:
-    text = text.replace('<think>', '').replace('</think>', '')
     yield text
```

Comment on lines +106 to +107

```tsx
const storedLanguage =
  typeof window !== 'undefined' ? detectBrowserLanguage() : 'en';
```

Severity: medium

The removal of the `localStorage` logic for language persistence is a regression: users now lose their language preference whenever the page is refreshed or the session ends.
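A minimal sketch of restoring persistence (the storage object is injected so it also runs outside a browser; the `'language'` key and helper names are assumptions, not the project's actual code):

```javascript
const LANGUAGE_KEY = 'language'; // assumed storage key, not confirmed by the PR

// Read the saved language; fall back to a detector, then to 'en'.
function loadLanguage(storage, detect) {
  const saved = storage ? storage.getItem(LANGUAGE_KEY) : null;
  return saved || (detect ? detect() : 'en');
}

// Persist the choice so it survives refreshes and new sessions.
function saveLanguage(storage, lang) {
  if (storage) storage.setItem(LANGUAGE_KEY, lang);
}
```

In the component, `window.localStorage` would be passed as `storage` when `typeof window !== 'undefined'`, keeping the existing SSR guard intact.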
