Skip to content

Conversation

@him6794
Copy link

@him6794 him6794 commented Jan 16, 2026

Description

This PR fixes a Cross-Site Scripting (XSS) vulnerability in the llm-chat-app-template where user-generated content from the AI chat responses was not properly sanitized before being inserted into the DOM.

What was fixed:

  • XSS Vulnerability: Chat messages were being inserted into the DOM using .innerHTML without sanitization, allowing potential execution of malicious scripts
  • Solution: Replaced .innerHTML with .textContent for user-generated content and implemented proper HTML escaping
  • Impact: Prevents injection attacks where malicious code could be executed in users' browsers through crafted AI responses

Changes:

  • Updated public/chat.js to use safe DOM manipulation methods
  • Added HTML escaping utilities in src/index.ts
  • Updated dependencies in package.json and package-lock.json
  • Improved TypeScript configurations for better type safety

Checklist

  • Template Metadata
    • template directory ends with -template
    • "cloudflare" section of package.json is populated
    • template preview image uploaded to Images
    • README is populated and uses <!-- dash-content-start --> and <!-- dash-content-end --> to designate the Dash readme preview
    • .gitignore file exists
    • package.json contains a deploy command
    • package.json contains private: true and no version field

Note: This PR is a security fix for an existing template. All template metadata requirements were already satisfied. The changes focus solely on remediating the XSS vulnerability.

Copilot AI review requested due to automatic review settings January 16, 2026 05:09
Copy link

Copilot AI left a comment

Choose a reason for hiding this comment

The reason will be displayed to describe this comment to others. Learn more.

Pull request overview

This PR successfully addresses a Cross-Site Scripting (XSS) vulnerability in the llm-chat-app-template by replacing unsafe .innerHTML usage with safer alternatives like .textContent and DOM methods. The changes also include dependency updates, configuration formatting improvements, and backend modifications to switch from streaming to non-streaming responses.

Changes:

  • Fixed XSS vulnerability in frontend by replacing .innerHTML with .textContent for user-generated content
  • Updated backend to use non-streaming API responses with returnRawResponse: true
  • Updated dependencies and improved TypeScript/JSON formatting
  • Added a new setup script for workers-for-platforms-template

Reviewed changes

Copilot reviewed 6 out of 11 changed files in this pull request and generated no comments.

Show a summary per file
File Description
llm-chat-app-template/public/chat.js Replaced .innerHTML with .textContent to prevent XSS attacks
llm-chat-app-template/src/index.ts Modified to return raw streaming response from Workers AI
llm-chat-app-template/package.json Updated dependency versions and improved formatting
llm-chat-app-template/wrangler.jsonc Updated compatibility date and improved formatting
llm-chat-app-template/worker-configuration.d.ts Updated TypeScript type definitions
llm-chat-app-template/tsconfig.json Improved JSON formatting consistency
llm-chat-app-template/src/types.ts Improved TypeScript formatting consistency
llm-chat-app-template/public/index.html Minor CSS formatting improvements
workers-for-platforms-template/scripts/setup.js Added new setup script (not related to XSS fix)

💡 Add Copilot custom instructions for smarter, more guided reviews. Learn how to get started.

Sign up for free to join this conversation on GitHub. Already have an account? Sign in to comment

Labels

None yet

Projects

None yet

Development

Successfully merging this pull request may close these issues.

1 participant