This project explores how aerial imagery can be transformed from documentation into spatial experience. Working from drone captures of urban and coastal environments, the work investigates how symmetry, recursion, and repetition alter our perception of place.
Instead of representing a location, the images construct a new one.
Each composition begins as a conventional aerial photograph. The image is then reorganized through controlled mirroring and recursive spatial folding, producing visual structures that resemble architectural interiors, tunnels, voids, and infinite volumes. The viewer no longer reads the image as landscape, but as a navigable space.
The process treats photography not as capture, but as raw spatial data.
By collapsing horizon lines, duplicating vanishing points, and generating central gravitational fields within the frame, the work produces perceptual ambiguity: sky becomes surface, water becomes wall, and cities behave like geometric containers rather than geographic sites.
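The core operations described above can be sketched in code. The following is a minimal illustration, not the project's actual pipeline: it assumes the photograph is loaded as a NumPy array, and the mirroring axis, downsampling factor, and recursion depth are illustrative parameters chosen for brevity.

```python
import numpy as np

def mirror_fold(image: np.ndarray, depth: int = 3) -> np.ndarray:
    """Symmetrize an image about its vertical axis, then recursively
    embed a half-scale symmetrized copy at its center, approximating
    the 'controlled mirroring and recursive spatial folding' effect."""
    # Controlled mirroring: reflect the left half onto the right,
    # collapsing the original composition into a symmetric field.
    half = image[:, : image.shape[1] // 2]
    folded = np.concatenate([half, half[:, ::-1]], axis=1)
    if depth == 0:
        return folded
    # Recursive folding: downsample by 2 and paste the result into
    # the center, creating the tunnel-like nested volumes.
    inner = mirror_fold(folded[::2, ::2], depth - 1)
    h, w = inner.shape[:2]
    top = (folded.shape[0] - h) // 2
    left = (folded.shape[1] - w) // 2
    folded[top : top + h, left : left + w] = inner
    return folded
```

Each recursion level adds a smaller mirrored copy at the frame's center, which is one simple way to produce the central "gravitational field" the eye is pulled toward.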
The resulting images exist between documentation and simulation. They are not renders, yet they behave like 3D environments. They are photographs, yet they resist geographic interpretation.
This investigation asks a simple question: Can an image become a place?
The project situates drone imaging within a broader computational imaging practice — where the camera records reality, but the structure of the image constructs a new spatial logic.
Medium: Drone Photography, Computational Image Transformation, Spatial Composition
Keywords: Perceptual Space, Recursive Geometry, Computational Imaging, Spatial Illusion, Aerial Imaging