
feat: Add configurable data point limit for Traffic Over Time #7

Merged
NotYuSheng merged 4 commits into main from feature/timeline-data-point-limit on Feb 2, 2026

Conversation

@NotYuSheng
Owner

Implement automatic interval adjustment to limit the number of data points returned by timeline endpoints, improving performance and rendering for large PCAP files.

Changes:

  • Add AnalysisProperties configuration class with timeline limits
  • Add max-timeline-data-points (default: 1000) to application.yml
  • Add calculateOptimalInterval() method in TimelineService
  • Update timeline endpoints to accept optional maxDataPoints parameter
  • Add validation for maxDataPoints (10-10,000 range)
  • Update frontend timelineService to support maxDataPoints parameter

The system automatically adjusts the time interval when expected bins exceed the limit, preserving full time range coverage and data accuracy through aggregation rather than sampling.
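For illustration, here is a minimal Java sketch of that interval-adjustment idea, assuming millisecond timestamps and ceiling division; the method name `adjustInterval` is hypothetical, and the actual `calculateOptimalInterval()` in TimelineService may be structured differently:

```java
// Sketch only: widen the interval so the full range fits within maxDataPoints bins.
// Assumes timestamps and intervals are in milliseconds; not the actual TimelineService code.
static long adjustInterval(long startMillis, long endMillis, long requestedIntervalMillis, int maxDataPoints) {
  long range = endMillis - startMillis;
  // Ceiling division: number of bins the requested interval would produce.
  long expectedBins = (range + requestedIntervalMillis - 1) / requestedIntervalMillis;
  if (expectedBins <= maxDataPoints) {
    return requestedIntervalMillis; // already within the limit
  }
  // Smallest interval that still covers the whole range with at most maxDataPoints bins.
  return (range + maxDataPoints - 1) / maxDataPoints;
}
```

For example, a 24-hour capture (86,400,000 ms) with a requested 1-second interval and a 1,000-point limit would come back with an 86,400 ms (86.4-second) interval, so the chart still spans the full day in at most 1,000 bins.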

@gemini-code-assist

Summary of Changes

Hello @NotYuSheng, I'm Gemini Code Assist! I'm currently reviewing this pull request and will post my feedback shortly. In the meantime, here's a summary to help you and other reviewers quickly get up to speed!

This pull request introduces a significant enhancement to the traffic over time analysis feature by allowing users to configure a maximum limit on the data points returned by timeline endpoints. This is crucial for performance and rendering efficiency, especially with large PCAP files that could otherwise generate an excessive number of data points. The system now adjusts the binning interval so that the full time range is still covered while the data point count stays within the configured limit; because bins are aggregated rather than sampled, data accuracy is preserved and the user experience stays smooth.

Highlights

  • Configurable Data Point Limit: Introduced a configurable maximum data point limit for timeline endpoints, allowing control over the number of data points returned for traffic over time analysis.
  • Automatic Interval Adjustment: Implemented an automatic interval adjustment mechanism in the backend service to dynamically increase the binning interval when the number of expected data points exceeds the configured limit, ensuring optimal performance and rendering.
  • New Configuration Properties: Added a new AnalysisProperties configuration class to manage analysis-related settings, including maxTimelineDataPoints, minTimelineInterval, and autoAdjustInterval (see the sketch after this list).
  • API Endpoint Enhancements: Updated existing timeline API endpoints (/api/timeline/{fileId} and /api/timeline/{fileId}/range) to accept an optional maxDataPoints parameter, with validation for its acceptable range (10-10,000).
  • Frontend Integration: Modified the frontend timelineService to support passing the optional maxDataPoints parameter to the backend timeline APIs.
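
As a hypothetical sketch of what such a properties class could look like in Spring Boot, using the field names from the highlights above and the 1000-point default from the description (the interval unit and the other defaults are assumptions; the actual AnalysisProperties in this PR may differ):

```java
import org.springframework.boot.context.properties.ConfigurationProperties;

// Hypothetical sketch; field names follow the highlights above, defaults beyond 1000 are assumptions.
@ConfigurationProperties(prefix = "analysis")
public class AnalysisProperties {

  /** Upper bound on data points returned by timeline endpoints (max-timeline-data-points). */
  private int maxTimelineDataPoints = 1000;

  /** Smallest interval, in milliseconds, that auto-adjustment may choose (assumed unit). */
  private long minTimelineInterval = 1000;

  /** Whether the interval is widened automatically when the expected bin count exceeds the limit. */
  private boolean autoAdjustInterval = true;

  public int getMaxTimelineDataPoints() { return maxTimelineDataPoints; }
  public void setMaxTimelineDataPoints(int maxTimelineDataPoints) { this.maxTimelineDataPoints = maxTimelineDataPoints; }

  public long getMinTimelineInterval() { return minTimelineInterval; }
  public void setMinTimelineInterval(long minTimelineInterval) { this.minTimelineInterval = minTimelineInterval; }

  public boolean isAutoAdjustInterval() { return autoAdjustInterval; }
  public void setAutoAdjustInterval(boolean autoAdjustInterval) { this.autoAdjustInterval = autoAdjustInterval; }
}
```

Such a class would be registered via @EnableConfigurationProperties(AnalysisProperties.class) or @ConfigurationPropertiesScan and bound from the analysis.* keys in application.yml.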



@gemini-code-assist bot left a comment


Code Review

This pull request introduces a valuable feature for performance improvement by limiting the number of data points in timeline queries and automatically adjusting the interval. However, a high-severity Denial of Service (DoS) vulnerability was identified due to missing validation of the interval query parameter, which could lead to application crashes or infinite loops if a zero or negative value is provided. To address this and improve overall robustness, I recommend using Spring's declarative validation framework for input validation in the controller, which will simplify the code, eliminate duplication, and provide more appropriate HTTP error responses. A minor simplification on the frontend is also suggested. These improvements will make the new feature more resilient, secure, and easier to maintain.
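
As a rough illustration of the declarative approach the review suggests (Spring Boot 3 with jakarta.validation is assumed; the class name TimelineControllerSketch is hypothetical, and the real TimelineController's paths, parameter names, and response types may differ):

```java
import jakarta.validation.constraints.Max;
import jakarta.validation.constraints.Min;
import java.util.Map;
import org.springframework.validation.annotation.Validated;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.PathVariable;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

// Sketch only: declarative validation on request parameters instead of manual if-checks.
@Validated
@RestController
public class TimelineControllerSketch {

  @GetMapping("/api/timeline/{fileId}")
  public Map<String, Object> getTimeline(
      @PathVariable String fileId,
      @RequestParam(defaultValue = "1000") @Min(1) long interval, // rejects zero/negative intervals
      @RequestParam(required = false) @Min(10) @Max(10_000) Integer maxDataPoints) {
    // Delegate to the service; constraint violations are reported by the framework.
    return Map.of("fileId", fileId, "interval", interval);
  }
}
```

In recent Spring Framework versions, handler-parameter constraint violations are mapped to 400 responses out of the box; on older versions a small @ExceptionHandler for ConstraintViolationException is needed to avoid a 500.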

NotYuSheng and others added 2 commits February 2, 2026 11:17
Resolved merge conflicts by keeping our new maxDataPoints functionality
while applying the linting and formatting changes from main:
- TimelineController.java: Applied Google Java Style formatting
- TimelineService.java: Applied Google Java Style formatting, added ChronoUnit import
- timelineService.ts: Applied consistent semicolon usage

The feature implementation remains intact with automatic interval
adjustment to respect the configurable data point limit.
Security fixes:
- Add interval parameter validation in both timeline endpoints (must be >= 1)
- Add defensive check in TimelineService to prevent division by zero
- Add start/end timestamp validation (start must be before end)

Code quality improvements:
- Refactor URL construction in frontend for better readability

These changes address high-severity security concerns raised in code review
regarding potential DoS vulnerabilities from invalid interval values.

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
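
A short sketch of the kind of defensive checks this commit describes (illustrative method and parameter names, not the actual TimelineService code):

```java
// Sketch only: guard against invalid inputs before the binning math runs.
static void validateTimelineRequest(long startMillis, long endMillis, long intervalMillis) {
  if (intervalMillis < 1) {
    // Prevents division by zero (and unbounded bin generation) in the interval calculation.
    throw new IllegalArgumentException("interval must be >= 1");
  }
  if (startMillis >= endMillis) {
    throw new IllegalArgumentException("start must be before end");
  }
}
```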
@NotYuSheng
Owner Author

/gemini review


@gemini-code-assist bot left a comment


Code Review

This pull request introduces a valuable feature for performance optimization by adding a configurable data point limit for timeline queries and automatically adjusting the time interval. However, a critical security vulnerability was identified due to a lack of access control on the timeline endpoints, which could lead to Insecure Direct Object Reference (IDOR) attacks. Additionally, potential Denial of Service (DoS) risks are present from inefficient algorithmic complexity and possible integer overflow in the binning logic, and a log injection vulnerability was found in the range-based timeline endpoint. From a code quality perspective, a high-severity bug in TimelineService miscalculates expected bins, potentially exceeding data point limits. There are also opportunities to refactor duplicated validation logic in TimelineController and simplify frontend code by leveraging apiClient capabilities.

…ation

- Replace manual validation with Spring @Validated constraint annotations for proper 400 error responses
- Fix integer division bug using ceiling division to accurately estimate bin count
- Fix integer overflow risk in interval calculation for large time ranges
- Optimize generateTimelineBins from O(M*N) to O(M) by calculating bin indices directly
- Fix log injection by parsing timestamps before logging
- Simplify TypeScript query parameter construction using axios params object

These changes improve security, correctness, performance, and code quality
based on automated code review feedback.

Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>
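
To make the O(M*N) to O(M) change above concrete, here is a sketch of direct bin-index calculation (illustrative names and primitive types; the real generateTimelineBins works on richer packet records):

```java
// Sketch only: each packet is assigned to its bin arithmetically instead of scanning every bin,
// so the work is proportional to the number of packets (M), not packets x bins (M*N).
static long[] countPacketsPerBin(long[] packetTimestampsMillis, long startMillis, long endMillis, long intervalMillis) {
  int binCount = (int) ((endMillis - startMillis + intervalMillis - 1) / intervalMillis); // ceiling division
  long[] counts = new long[binCount];
  for (long ts : packetTimestampsMillis) {
    if (ts < startMillis || ts >= endMillis) {
      continue; // outside the requested range
    }
    counts[(int) ((ts - startMillis) / intervalMillis)]++; // direct index, no inner loop over bins
  }
  return counts;
}
```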
@NotYuSheng merged commit d420e2c into main on Feb 2, 2026
2 checks passed
@NotYuSheng deleted the feature/timeline-data-point-limit branch on February 2, 2026 at 08:22