This is my first large-scale photogrammetry test, done in collaboration with a super awesome friend who was kind enough to lend his valuable time and drone for this experiment. The goal was to learn how to use a drone to scan a fairly large environment (a church that occupies an entire city block) and then take the captured images through the pipeline previously tested in the sculpture experiment, bringing the result into Augmented Reality as a 3D model.
Location of the Building - Jesus Sacred Heart Syriac Catholic Church, North Hollywood, CA.
Size of Scan - approx. 65 meters long by 38 meters wide.
Type of Drone - DJI Mavic Air with a 1/2.3″ 12-megapixel onboard camera.
Time of day - around noon on a fairly overcast day, which prevented hard shadows.
Number of Pictures taken - 299.
Software used to generate the model - Reality Capture
Software for AR - Unity3D with the Vuforia AR engine
The first step in the reconstruction is to align the images. The software sometimes can't find overlaps between images taken during different flight paths of the drone, so to overcome that I had to manually assign control points between the two datasets in order to merge them into one model. If you look closely at the above picture, you will see some blue dots on the side of the building - those are the manual control points.
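Reality Capture handles this alignment internally, but conceptually, merging two separately-aligned components via shared control points amounts to estimating the rigid transform that maps one set of corresponding 3D points onto the other. Here is a minimal sketch of that idea using the Kabsch algorithm; the function name is my own and NumPy is assumed:

```python
import numpy as np

def rigid_align(src, dst):
    """Estimate rotation R and translation t such that dst ~= R @ src + t,
    given corresponding (N, 3) point sets (Kabsch algorithm)."""
    src_c = src.mean(axis=0)            # centroid of source control points
    dst_c = dst.mean(axis=0)            # centroid of destination control points
    H = (src - src_c).T @ (dst - dst_c) # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))  # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst_c - R @ src_c
    return R, t
```

With three or more well-spread control points, the recovered transform can be applied to every camera and point in one component to bring it into the other's coordinate frame.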
I then generated a textured mesh from the point cloud (above) and cropped the geometry so that the resulting model would be usable in terms of memory. The entire landscape, including surrounding buildings, was around 40 million triangles; cropping it down to just the church area brought it to around 28 million, and decimation then reduced it to under 1 million - at which point it was viewable in AR.
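To get a feel for why decimation matters, here is a back-of-the-envelope memory estimate for an uncompressed triangle mesh. The layout assumptions are mine (shared vertices at roughly 0.5 vertices per triangle, 12 bytes of position plus 8 bytes of UVs per vertex, and three 4-byte indices per triangle) and ignore textures, but the orders of magnitude are what make a mobile AR budget work or not:

```python
def mesh_size_mb(tris, verts_per_tri=0.5):
    """Rough in-memory size (MB) of an uncompressed triangle mesh.
    Assumes shared vertices (~0.5 verts/tri), 20 bytes per vertex
    (3 floats position + 2 floats UV) and 12 bytes per triangle
    (3 x uint32 indices). Illustrative only."""
    verts = tris * verts_per_tri
    return (verts * 20 + tris * 12) / 1e6

# Under these assumptions, 40M tris is hundreds of MB of geometry,
# while the decimated 1M-tri model is a couple of tens of MB.
```

By this rough measure the full 40-million-triangle landscape is simply not something a 2016-era phone can hold alongside the AR tracking workload, while the decimated model is.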
3D scan in Augmented Reality, scaled down to fit on a piece of paper and viewed on a Samsung Galaxy S7 - powered by Unity3D and the Vuforia AR engine.