Temporal Integration in Drone-Based Urban Imaging
A drone-based experiment that reconstructs an urban scene by mean-stacking sequential frames over 45 seconds.
- Capture 15 frames at 3-second intervals (1/200 shutter).
- Align + blend frames using mean stacking.
- Motion becomes noise; stable structures converge.
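The align-and-blend step above can be sketched as a per-pixel average over pre-aligned frames. This is a minimal illustration, not the production pipeline: frames are plain 2D brightness lists, and the alignment stage is assumed to have already happened.

```python
# Mean-stack a sequence of pre-aligned grayscale frames.
# Frames are 2D lists of pixel brightness (0-255); in the experiment
# these would be the 15 drone frames captured at 3-second intervals.

def mean_stack(frames):
    """Average co-located pixels across all frames."""
    n = len(frames)
    rows, cols = len(frames[0]), len(frames[0][0])
    return [
        [sum(f[r][c] for f in frames) / n for c in range(cols)]
        for r in range(rows)
    ]

# Toy scene: a stable structure (value 200) and one transient
# object (value 255) that appears in only a single frame.
stable = [[200, 200], [200, 200]]
with_transient = [[200, 255], [200, 200]]
frames = [with_transient] + [stable] * 14  # 15 frames total

stacked = mean_stack(frames)
print(stacked[0][0])  # stable pixel: stays at 200.0
print(stacked[0][1])  # transient pixel: pulled back toward 200
```

The transient pixel ends up only a few brightness levels above the stable background, which is why moving objects fade while fixed structure survives.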

Temporal Sampling (Instant)
A single exposure isolates a fraction of time, preserving motion as fragmentation.



Temporal Variability (Duration)
Multiple exposures capture dynamic elements across duration.

Spatial Integration (Reconstruction)
Stacking as signal accumulation across time.
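Signal accumulation has a measurable payoff: averaging N noisy readings of the same static scene shrinks random noise by roughly a factor of sqrt(N) while the signal stays put. A small stdlib-only sketch, with assumed noise parameters chosen for illustration:

```python
# Stacking as signal accumulation: compare the residual error of
# single frames against 15-frame means for a static structure.
import random
import statistics

random.seed(0)
TRUE_VALUE = 100.0   # brightness of a static structure
NOISE_SD = 10.0      # per-frame sensor noise (illustrative)
N_FRAMES = 15        # matches the 15-frame capture

def noisy_frame():
    return TRUE_VALUE + random.gauss(0, NOISE_SD)

single = [abs(noisy_frame() - TRUE_VALUE) for _ in range(2000)]
stacked = [
    abs(statistics.mean(noisy_frame() for _ in range(N_FRAMES)) - TRUE_VALUE)
    for _ in range(2000)
]
print(statistics.mean(single))   # raw per-frame error, roughly NOISE_SD
print(statistics.mean(stacked))  # shrinks by roughly sqrt(15), ~3.9x
```

With 15 frames, that is close to a fourfold reduction in noise, which is what lets faint but persistent structure emerge from the stack.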

Technical Method
Captured with a drone-mounted camera at a 1/200 s shutter speed. Sequential frames were aligned and combined using post-production mean blending. The process demonstrates temporal integration comparable to HDR stacking and computational exposure fusion.
- Drone altitude: 275 meters
- Shutter speed: 1/200 s
- Capture interval: 3 seconds
- Frame count: 15 frames
- Simulated duration: 45 seconds
- Stacking method: mean stacking
Mean stacking suppresses transient motion while preserving consistent structure—producing a probabilistic image of what persisted.
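The "probabilistic image" claim can be made concrete: each frame carries 1/N of the weight, so an object visible in k of N frames retains only k/N of its contrast against the background. A small sketch with illustrative brightness values:

```python
# Each pixel of a mean stack reflects how long each brightness
# persisted: an object visible in k of n frames keeps only k/n of
# its contrast against the background.

N = 15  # frames in the stack

def residual_contrast(k, object_val, background_val, n=N):
    """Contrast left after a k-of-n-frame transient is mean-stacked."""
    mean = (k * object_val + (n - k) * background_val) / n
    return mean - background_val

# A bright transient (255) crossing a 100-brightness pavement:
print(residual_contrast(1, 255, 100))   # one frame:  ~10.3 of 155
print(residual_contrast(15, 255, 100))  # all frames: full 155.0
```

A pedestrian caught in one frame out of fifteen keeps under 7% of its contrast, while anything present in every frame keeps 100%, which is exactly the persistence weighting the text describes.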
Applied Interpretation
- Distinguishes stable circulation zones from transient activity
- Reveals behavioral patterns of space usage
- Removes incidental clutter (cars, pedestrians)
- Produces architecture-centric documentation
The image does not show what was present at any given instant.
It shows what was most likely to remain.
