feat: pydantic models for type checking S3 uploads #261
Conversation
📝 Walkthrough

Adds a new Pydantic-based models package, wires those models into S3Client's JSON validation and upload flow, updates the project metadata to include pydantic and to ship the models package in the wheel, and makes a few small refactors: unconditional runtime imports in utility modules and a relocated API client.
Sequence Diagram(s)

```mermaid
sequenceDiagram
    participant Client
    participant S3Client
    participant Validator as Pydantic Validator
    participant FileSystem
    participant S3 as AWS S3
    Client->>S3Client: upload_json_file(local_path, s3_name)
    S3Client->>FileSystem: read JSON file
    FileSystem-->>S3Client: json_data
    S3Client->>S3Client: _get_validator(s3_name)
    alt validator found
        S3Client->>Validator: validate(json_data)
        alt validation passes
            Validator-->>S3Client: valid
            S3Client->>S3: upload object
            S3-->>S3Client: upload success
            S3Client-->>Client: True
        else validation fails
            Validator-->>S3Client: ValidationError
            S3Client->>S3Client: log error
            S3Client-->>Client: False
        end
    else no validator
        S3Client->>S3: upload object (no validation)
        S3-->>S3Client: upload success
        S3Client-->>Client: True
    end
```
Estimated code review effort: 🎯 3 (Moderate) | ⏱️ ~22 minutes
🚥 Pre-merge checks: ✅ 2 passed | ❌ 1 failed (1 warning)
Actionable comments posted: 2
🤖 Fix all issues with AI agents
In `@apps/dataflow/src/clients/_s3_client.py`:
- Line 46: In the S3 client constructor (`S3Client.__init__`), remove the redundant second assignment `self.logger = get_app_logger()`; `self.logger` is already set earlier in the method. Keep the first initialization and delete the later one so the logger is assigned only once.
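A minimal sketch of the fix, assuming `get_app_logger` is the project's logger factory (the stand-in below is hypothetical):

```python
import logging


def get_app_logger() -> logging.Logger:
    # Stand-in for the project's actual logger factory.
    return logging.getLogger("dataflow")


class S3Client:
    def __init__(self) -> None:
        self.logger = get_app_logger()  # keep this first assignment
        # ... other setup (session, bucket config, etc.) ...
        # self.logger = get_app_logger()  # redundant duplicate -- deleted
```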
In `@apps/dataflow/src/models/buildings.py`:
- Line 22: The `hitbox` field on the building model is declared as `hitbox: list[GeoCoordinate] | None` but has no default, so JSON that omits the key will fail validation. Give the field a default (e.g., `None`, or a `Field`/`default_factory` matching the model's style) so an omitted `hitbox` is treated as `None`, consistent with how `code` and `entrances` are handled, ensuring optional input is accepted.
🧹 Nitpick comments (1)
apps/dataflow/src/models/graph.py (1)
17-18: Consider defining a typed model for `to_floor_info`. Using `dict[str, Any]` reduces type safety. If the structure of floor transition info is well-defined, consider creating a dedicated Pydantic model (e.g., `FloorTransitionInfo`) to enable validation of its contents.
Summary by CodeRabbit
New Features
Chores