Please leave one option from the following and delete the rest:
Bug fix (non-breaking change which fixes an issue)
New feature (non-breaking change which adds functionality)
New Integration (non-breaking change which adds a new integration)
Breaking change (fix or feature that would cause existing functionality to not work as expected)
Non-breaking change (fix of existing functionality that will not change current behavior)
Documentation (added/updated documentation)
All tests should be run against the port production environment (using a testing org).
Core testing checklist
Integration able to create all default resources from scratch
Resync finishes successfully
Resync able to create entities
Resync able to update entities
Resync able to detect and delete entities
Scheduled resync able to abort existing resync and start a new one
Tested with at least 2 integrations from scratch
Tested with Kafka and Polling event listeners
Tested deletion of entities that don't pass the selector
Integration testing checklist
Integration able to create all default resources from scratch
Completed a full resync from a freshly installed integration and it completed successfully
Resync able to create entities
Resync able to update entities
Resync able to detect and delete entities
Resync finishes successfully
If new resource kind is added or updated in the integration, add example raw data, mapping and expected result to the examples folder in the integration directory.
If resource kind is updated, run the integration with the example data and check if the expected result is achieved
If new resource kind is added or updated, validate that live-events for that resource are working as expected
Send event_type and resync_start_time in lakehouse API requests ✨ Enhancement
Walkthroughs
Description
• Replace data_type parameter with event_type enum in lakehouse API
• Add resync_start_time parameter to track event creation timestamps
• Implement timestamp validation to prevent future timestamps
• Add comprehensive timestamp tracking to webhook events and raw results
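The pieces above can be sketched as a minimal request-body builder. This is an illustrative sketch, not the actual implementation: the enum values are assumed from the diagram below, the serialized keys (`eventType`, `resyncStartTime`) are taken from the review findings further down, and `build_raw_data_body` is a hypothetical helper name.

```python
from datetime import datetime, timezone
from enum import Enum


class LakehouseEventType(Enum):
    # Values assumed from the "RESYNC or LIVE_EVENT" edge in the diagram
    RESYNC = "RESYNC"
    LIVE_EVENT = "LIVE_EVENT"


def build_raw_data_body(
    raw_data: list[dict],
    event_type: LakehouseEventType,
    resync_start_time: datetime,  # assumed tz-aware UTC here
) -> dict:
    # Hypothetical helper mirroring the serialized keys the review cites.
    if resync_start_time > datetime.now(timezone.utc):
        raise ValueError(
            f"resync_start_time cannot be in the future: {resync_start_time}"
        )
    return {
        "data": raw_data,
        "eventType": event_type.value,
        "resyncStartTime": resync_start_time.isoformat(),
    }
```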
Diagram
```mermaid
flowchart LR
    A["Webhook/Resync Event"] -->|"created_at timestamp"| B["WebhookEvent"]
    B -->|"propagate timestamp"| C["WebhookEventRawResults"]
    C -->|"resync_start_time + event_type"| D["post_integration_raw_data"]
    D -->|"validate timestamp"| E["Validation Layer"]
    E -->|"add to request body"| F["Lakehouse API"]
    G["LakehouseEventType Enum"] -->|"RESYNC or LIVE_EVENT"| D
```
1. New tests call non-existent `isotime()` 🐞 Bug ✓ Correctness
Description
test_resync_webhook_timestamp_tracking asserts request_body["type"] and calls datetime.isotime(), both of which will fail: the request body has no "type" key (the implementation writes event_type.value under "eventType", so the assertion raises KeyError), and datetime has no isotime() method (the implementation serializes timestamps via resync_start_time.isoformat(), so the call raises AttributeError).
The issue below was found during a code review. Follow the provided context and guidance below and implement a solution.
### Issue description
`test_resync_webhook_timestamp_tracking.py` contains incorrect assertions and a non-existent datetime method call:
- Asserts `request_body["type"]` even though the implementation does not set it.
- Uses `resync_time.isotime()` / `webhook_time.isotime()` which will raise `AttributeError`.
### Issue Context
The production code emits:
- `resyncStartTime` as `datetime.isoformat()`
- `eventType` as `LakehouseEventType.<...>.value`
### Fix Focus Areas
- `port_ocean/tests/core/test_resync_webhook_timestamp_tracking.py` (lines 185-276)
- `port_ocean/clients/port/mixins/integrations.py` (lines 317-327)
### Expected changes
- Replace `request_body["type"]` assertions with `request_body["eventType"]`.
- Replace `.isotime()` with `.isoformat()`.
- Ensure assertions match the actual serialized output (`resyncStartTime` should equal the exact isoformat string sent).
post_integration_raw_data normalizes resync_start_time to UTC for the future-time check, but then
serializes the original resync_start_time into the request body, which can emit naive timestamps or
non-UTC offsets despite the UTC intent.
```diff
+    if resync_start_time is not None:
+        # Normalize both timestamps to UTC for comparison
+        # If resync_start_time is naive, treat it as UTC
+        if resync_start_time.tzinfo is None:
+            resync_time_utc = resync_start_time.replace(tzinfo=timezone.utc)
+        else:
+            resync_time_utc = resync_start_time
+
+        now_utc = datetime.now(timezone.utc)
+        if resync_time_utc > now_utc:
+            raise ValueError(
+                f"resync_start_time cannot be in the future: {resync_start_time}"
+            )
+
     logger.debug(
         "starting POST raw data request", raw_data=raw_data, operation=operation
     )
```
Evidence
The function computes resync_time_utc (including adding tzinfo for naive datetimes) but does not
use it for serialization; it uses resync_start_time.isoformat() instead, so a naive datetime will
be sent without timezone information.
The issue below was found during a code review. Follow the provided context and guidance below and implement a solution.
### Issue description
`post_integration_raw_data` normalizes `resync_start_time` into `resync_time_utc` for validation, but serializes `resync_start_time` (not the normalized value) into `body['resyncStartTime']`. This can produce inconsistent output when callers pass naive datetimes or non-UTC tz-aware datetimes.
### Issue Context
The code already computes a normalized value for comparison; it should reuse that for serialization to keep validation/serialization consistent.
### Fix Focus Areas
- `port_ocean/clients/port/mixins/integrations.py` (lines 298-327)
### Expected changes
- Normalize with:
- `resync_time_utc = resync_start_time.replace(tzinfo=timezone.utc)` when naive
- otherwise `resync_time_utc = resync_start_time.astimezone(timezone.utc)`
- Serialize `body['resyncStartTime'] = resync_time_utc.isoformat()`.
- (Optional consistency) compute `extractionTimestamp` using `datetime.now(timezone.utc)`.
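The expected change can be sketched as a small normalization helper reused for both validation and serialization. A sketch only: `normalize_resync_start_time` is a hypothetical function name, and the surrounding request-building code is elided.

```python
from datetime import datetime, timedelta, timezone


def normalize_resync_start_time(resync_start_time: datetime) -> datetime:
    """Treat naive datetimes as UTC; convert aware ones to UTC."""
    if resync_start_time.tzinfo is None:
        return resync_start_time.replace(tzinfo=timezone.utc)
    return resync_start_time.astimezone(timezone.utc)


# Validation and serialization now use the same normalized value,
# so the emitted timestamp always carries a UTC offset.
resync_time_utc = normalize_resync_start_time(datetime(2024, 1, 1, 12, 0))
if resync_time_utc > datetime.now(timezone.utc):
    raise ValueError(f"resync_start_time cannot be in the future: {resync_time_utc}")
body = {"resyncStartTime": resync_time_utc.isoformat()}
```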
Preflight checklist
Screenshots
Include screenshots from your environment showing how the resources of the integration will look.
API Documentation
Provide links to the API documentation used for this integration.