Sunday, October 25, 2015

Navigation Map Construction

Introduction:

The purpose of this exercise was to create maps for use in upcoming navigation activities. We created two maps: one using the World Geodetic System 1984 (WGS84) coordinate system, and one using the Universal Transverse Mercator (UTM) coordinate system.

Methods:

We needed to create a map using each coordinate system, with grid lines appropriate for the respective systems. I generated elevation rasters and contour lines from LiDAR data the city acquired in 2013. I also added the trails that I had previously collected with a Trimble Juno 3b while conducting research. I think the trails will be a good reference for judging how far before or past a trail the markers are. I used 10ft contours, as they show the terrain better than 5-meter contours but are less distracting than the 2ft contours my professor offered. I refrained from adding an aerial image, as I have a fair understanding of the Priory's landscape and I didn't think the map needed any additional clutter. After designing the maps, I added both my pace count and one of my partners' pace counts, so we would have them recorded for reference in the field.
Figure 1: The UTM zones for the lower 48 U.S. States.
I used UTM zone 15N to create my reference map.
The Universal Transverse Mercator system divides the globe into 60 zones, each 6 degrees of longitude wide, which reduces the overall distortion within each zone. Locations are measured as "northings" and "eastings", or distances from the equator and from a false origin. The central meridian of each zone is given an easting of 500,000 meters, so all locations within the zone have a positive easting value.
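
To make the northing/easting idea concrete, here is a minimal sketch of converting a geographic coordinate into UTM zone 15N with the pyproj library (the sample point is an approximate location near Eau Claire, used purely for illustration):

# Minimal sketch: converting a WGS84 latitude/longitude to UTM zone 15N with pyproj.
# The sample coordinate below is an approximate placeholder, not a surveyed point.
from pyproj import Transformer

# EPSG:4326 = WGS84 geographic, EPSG:32615 = WGS84 / UTM zone 15N
to_utm = Transformer.from_crs("EPSG:4326", "EPSG:32615", always_xy=True)

lon, lat = -91.50, 44.80
easting, northing = to_utm.transform(lon, lat)
print(f"Easting: {easting:.1f} m, Northing: {northing:.1f} m")
# The easting comes out above 500,000 m because this point lies east of
# zone 15N's central meridian (93 degrees W), which is assigned an easting of 500,000 m.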
Figure 2: My UTM based reference map

The World Geodetic System 1984 (WGS84) provides coordinates as locations on the Earth's ellipsoidal surface. It is used by GPS for locating positions anywhere on the Earth's surface and for creating maps of very large areas.
Figure 3: My WGS84 based reference map

Discussion:

The UTM map will be extremely helpful when navigating with the compass and pace counts, as all distances on the map will be calculable in meters.

The WGS84 map won't be very helpful for navigation with the compass and pace counts, as the grid has 0.00056-degree intervals, which translates to roughly 44.0776 meters at our latitude. This awkward interval will make any sort of distance calculation difficult in the field. The map will, however, be extremely useful for the GPS navigation activity, as all of the coordinate locations will be easily identifiable on the map.
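
As a rough check on that conversion (a sketch assuming the map area sits near 44.8 degrees N; the exact latitude would shift the numbers slightly):

# Rough check: converting the 0.00056-degree grid interval into meters.
# The latitude below is an assumed, approximate value for the study area.
import math

interval_deg = 0.00056
lat_deg = 44.8

meters_per_deg_lat = 111_320.0                                   # approximate
meters_per_deg_lon = 111_320.0 * math.cos(math.radians(lat_deg))

print(interval_deg * meters_per_deg_lat)   # ~62 m of north-south distance per interval
print(interval_deg * meters_per_deg_lon)   # ~44 m of east-west distance, close to the figure above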

Conclusion: 

I am intrigued to see how well or poorly my navigation maps actually perform when in the field. I am expecting the trails data to be a helpful reference, though they may prove to be an unnecessary distraction.


Sources:

http://pubs.usgs.gov/fs/2001/0077/report.pdf

I used this page to convert the distances from decimal degrees to meters: https://en.wikipedia.org/wiki/Decimal_degrees

Sunday, October 18, 2015

Unmanned Aerial Systems

Introduction:

What is an Unmanned Aerial System (UAS)? According to the FAA, "A UAS is the unmanned aircraft (UA) and all of the associated support equipment, control station, data links, telemetry, communications and navigation equipment, etc., necessary to operate the unmanned aircraft." A UAS differs from a remote control (RC) vehicle in that it operates along a preplanned course without being directly controlled by the operator.

Methods:

Part 1: Demonstration Flight

Figure 1: The DJI Phantom.

In order to introduce our class to the major concepts and capabilities of UAS, our professor manually flew a DJI Phantom and captured aerial images under the university's footbridge. He selected this flight location because it offered a lot of open vertical space and wouldn't draw too much attention. Although it was technically being flown manually, the Phantom was fairly autonomous, hovering in position even when the operator completely removed their hands from the remote.
Figure 2: The Phantom in flight.
The Phantom had a camera connected via a 2-axis gimbal. The gimbal is a pivoting mount that allows the camera to be rotated and stabilized both vertically and horizontally. It also allows the camera to be adjusted to capture images on and off nadir (looking straight down).

Part 2: Software

After the demonstration flight was completed, we returned to the lab to learn how to process imagery, generate flight plans for automated missions, and gain a better understanding of how different types of aerial vehicles operate.

Mission Planner:
Using the software "Mission Planner", I designed a flight plan based on the following scenario: "The City of Eau Claire would like to know how many tombstones Lakeview Cemetery contains, and wants to use UAS to find out. What camera should you use, and at what altitude should you fly?".
Figure 3: Flight plan using the Canon SX230 HS, flying at 98m.
Figure 4: Flight plan using the Canon 5D Mark II, flying at 98m.
I generated two separate flight plans for the same area, using two different Canon cameras. The flight using the SX230 HS required 11 passes at 98m altitude to cover the whole area at 3.02 cm/pixel spatial resolution, for a total of 88 images. The flight using the 5D Mark II required just over 8 passes at 28m altitude to cover the whole area at 3.08 cm/pixel spatial resolution, for a total of 56 images. The SX230 HS would require a 19-minute flight time, while the 5D Mark II would only require a 15-minute flight time. The SX230 HS has a 12.1-megapixel sensor, and the 5D Mark II has a 21.1-megapixel sensor.

The 5D Mark II would seem like the natural choice, as it provides better quality in less time; however, the devil is in the details. The Canon SX230 HS weighs 223g and has an MSRP of $350. The camera body alone of the Canon 5D Mark II weighs 810g, and a comparable lens (Canon EF 24mm f/2.8 IS) weighs 280g, for a combined weight of 1,090g and a combined cost of $1,800. This higher weight would require a UAV with a higher payload capacity, further increasing the cost.
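
As a sanity check on those spatial resolutions, the standard ground sample distance (GSD) formula can be applied. A minimal sketch, using nominal SX230 HS specifications (6.17mm sensor width, 4000-pixel image width, 5mm focal length) that are assumed here rather than taken from Mission Planner:

# Sanity check of ground sample distance (GSD) for the Canon SX230 HS at 98 m.
# Camera specifications are nominal/assumed values, for illustration only.
def gsd_cm_per_pixel(sensor_width_mm, focal_length_mm, altitude_m, image_width_px):
    # GSD = (sensor width * flying height) / (focal length * image width)
    return (sensor_width_mm * altitude_m * 100.0) / (focal_length_mm * image_width_px)

print(round(gsd_cm_per_pixel(6.17, 5.0, 98.0, 4000), 2))   # ~3.02 cm/pixel, matching the plan above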

Real Flight Flight Simulator:

Using the software "RealFlight 7.5" I gained experience flying both fixed-wing and multi rotor aircraft. I flew two different multi rotors: the Octocopter 1000 and the Tricopter 9000. The tricopter was extremely unstable, but very fast. The octocopter was very stable, and had an attached gimbal which allowed the adjustment of an attached DSLR camera mounted on the bottom of the craft. The tricopter did not perform well in situations where micro adjustments were needed, and it crashed often when attempting to navigate through tight locations like the obstacle course (Figure 5). The octocopter, while slower, responded extremely well to micro adjustments, and handled the obstacle course and the construction site with ease (Figure 6). The octocopter also performed extremely well in high-wind conditions (Figure 7).

Figure 5: The tricopter in flight.

Figure 6: Navigating the octocopter through the tight spaces of a construction site.
Figure 7: The octocopter was stable even with a 15mph crosswind.
Next, I flew two fixed-wing aircraft: a flying wing called the 'Slinger', and a vertical take-off and landing (VTOL) jet called the 'Harrier'. The slinger could operate at substantially higher speeds than either of the multirotors tested previously, but was much more difficult to fly (Figure 8). It required higher speeds to maintain flight and, as a result, had difficulty navigating the tight spaces in which the octocopter was comfortable (Figure 9). The slinger also required long horizontal distances to take off, unlike the multirotors. It performed best when tight spaces weren't an issue and when it could take advantage of its greater speed.

Figure 8: The slinger in flight.
Figure 9: The result of trying to navigate the construction site in Figure 8 with the slinger.
The Harrier was a strange hybrid between the slinger and the multirotors as a result of its VTOL capabilities. With its turbines directed toward the ground it could hover, yet it could sustain high-speed flight with the turbines pointed straight back (Figure 10). It was the fastest of all the aircraft flown and required the longest turning distances when in full flight mode. It had the most difficulty with tight spaces because of its incredibly high speed.

Figure 10: The harrier's turbines are visible to the left
of the bomb-shaped capsule under its wing.
Pix4D: 

Pix4D was used to process the imagery collected during the flight described in Part 1. The software analyzed the photographs, generated a point cloud, and created a digital surface model (DSM) and an orthomosaic.
Figure 11: The DJI Phantom's images contained GPS data.
Pix4D used this data to stitch the images together and generate
a point cloud, DSM, and orthomosaic.

The point cloud could be used to add points to locations where existing LiDAR data has gaps, or to generate extremely high-resolution TINs (Figure 12). The point cloud had an extremely fine average point spacing of 0.007m (0.276in) (Figure 13). For comparison, the LiDAR point cloud the city obtained in 2013 for the same area has a point spacing of 0.45m (17.724in) (Figure 14).
Figure 12: The generated point cloud.
Figure 13: Properties for the LasDataset created from the Phantom data.

Figure 14: Properties for the Las Dataset from the Citywide LiDAR survey.


The DSM recorded subtle changes in topography extremely well, and had extremely small pixel sizes, allowing identification of individual rocks (Figure 16).
Figure 15: A hillshade created from the DSM, to show the topography.
Figure 16: The DSM and ortho have the same incredibly small pixel size of 0.00204m.
This equates to 2.04mm, or 0.08in.
In 2013, the City obtained aerial imagery with pixel sizes of 3in.

The orthomosaic allows 3D models of the landscape to be created. Because it was generated along with the DSM, it matches the DSM perfectly when viewed in 3D viewers like ArcScene. The orthomosaic has 37.3 times higher resolution than the existing aerial images the city obtained in 2013 (Figure 16).
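
That figure falls straight out of the two pixel sizes (a quick check):

# Quick check of the resolution comparison: 3-inch city pixels vs. 0.00204 m UAS pixels.
city_pixel_m = 3 * 0.0254        # 3 inches expressed in meters (0.0762 m)
uas_pixel_m = 0.00204
print(city_pixel_m / uas_pixel_m)   # ~37.35, i.e., roughly 37.3 times finer linear resolution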

Discussion:

UAS Scenario:

"A mining company wants to get a better idea of the volume they remove each week. They don’t have the money for LiDAR, but want to engage in 3D analysis"

LiDAR may be the go-to buzzword for many in the industry these days, but UAS is the way of the future. UAV photogrammetry can be used to calculate volumetrics with only 0.1% error compared to LiDAR, and at a fraction of the price. I have prepared two example scenarios to help you determine which type of vehicle will best suit your needs.

For performing volumetrics before and after blasting, we would suggest using a 3DR X8-M. The X8-M is a multirotor aircraft with vertical takeoff and landing capabilities, an imaging speed of 8 m/sec, and great navigability in tight spaces. The only negative of this platform is its maximum flight time of 14 minutes, necessitating additional batteries if larger survey areas are desired.

Figure 17: Flying a small wall prior to blasting with the 3DR X8-M could be done in 1 minute,
with 1.5cm/pixel ground resolution.
For flying larger open-pit mines or strip mines, we would suggest using the 3DR Aero-M. The Aero-M is a fixed-wing platform with an imaging speed of 23 m/sec and a maximum flight time of 40 minutes. The negatives of this platform are that it requires a longer turn radius at the end of each flight line (lengthening the overall flight), it requires clear runways for takeoff and landing, and it does not perform well in tight spaces.

Figure 18: Flying a 0.8 km2 area with the 3DR Aero-M can be done in 16 minutes,
with 3.00cm/pixel ground resolution. 
If you are looking to map extremely long strip mines, we would suggest using the 3DR Aero-M. In every other situation, the 3DR X8-M will be the best platform.

After recording imagery, we will process it using Pix4D to create point clouds that allow for the calculation of volumetric information. Your company will be able to know exactly how much material was moved by each blast, know the volume of your tailings piles, and know exactly how much those piles are growing each month. If you would like more information about how accurate UAV photogrammetry is compared to LiDAR, see the links below.

More information can be found at the following links:
https://pix4d.com/mines-quarries/
http://blog.pix4d.com/post/127642506041/supporting-blasting-operations-with-uas
http://blog.pix4d.com/post/115946312406/accurate-volume-estimation-with-non-ideal-flight


Sources:
https://www.faa.gov/uas/faq/#qn1


Saturday, October 3, 2015

Distance Azimuth Survey

Introduction:

Technology can fail, and often does right when it is needed most. This can have devastating effects on field data collection if one is not prepared for it. This lab was intended to teach us a method for collecting data when left with only the most rudimentary equipment. This method, distance-azimuth, requires only a compass and some means of measuring distance, such as a measuring tape, which allows it to serve as a backup when other equipment is rendered inoperable or unavailable. When standing at a known point, one can determine the location of a second point by measuring the compass heading and the distance between the two points. This allows for trigonometric calculation of the second point's location on the earth's surface.
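
For reference, the calculation itself is simple once the known point is expressed in a projected coordinate system. A minimal sketch with made-up values (the origin, azimuth, and distance below are placeholders, not survey data):

# Minimal distance-azimuth sketch: locating a second point from a known point,
# a compass azimuth (degrees clockwise from north), and a measured distance.
import math

def offset_point(x, y, azimuth_deg, distance_m):
    az = math.radians(azimuth_deg)
    return x + distance_m * math.sin(az), y + distance_m * math.cos(az)

# Example: a point 20 m away at an azimuth of 135 degrees from a placeholder origin.
x2, y2 = offset_point(0.0, 0.0, 135.0, 20.0)
print(round(x2, 2), round(y2, 2))   # 14.14 -14.14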

My group chose to survey the campus commons, as the area had a large number of possible data collection points (Figure 1). Because it is a relatively open area, points are easily identified on aerial images, which will allow my group to easily compare the accuracy of our collected data to existing aerial imagery.


Figure 1: One view of the survey area.

Methods:

The class was divided into groups of two. We used a TruPulse laser rangefinder to determine the distance between our location and each desired point. The TruPulse had a built-in compass, so we used the azimuth information it determined. We recorded the locations of rocks and trees by entering the Object ID, Distance, Azimuth, and Type in an Excel spreadsheet. We used a Trimble Juno 3b to record the coordinates of our survey site.

Figure 2: Myself 'firing' the laser at rocks and trees
 to determine distance and azimuth.

Figure 3: Ally entering points into the spreadsheet in the field.

We conducted our survey between 2:52 PM and 3:38 PM on September 30, 2015. The weather was clear and sunny, and the campus commons was very busy. We recorded points from a fixed location, with me leaning against a small tree to stabilize the laser and Ally recording the measurements and types into a laptop (Figures 2, 3). After we finished collecting the points, we obtained the geographic coordinates from the Juno and added them to columns labelled X and Y in the Excel spreadsheet (Figure 4).

Figure 4: A portion of our Excel spreadsheet.
Note the adjusted labels of the X and Y columns.
Next, we imported our adjusted spreadsheet into a file geodatabase using the Table to Table (single) tool in Esri ArcCatalog. After the table was imported, we used the Bearing Distance to Line tool in Esri ArcMap to convert our feature table to lines (Figure 5). Our first attempts to convert the table to lines resulted in extremely skewed data, at which point we realized our X and Y coordinates had accidentally been swapped (Figure 4). After correcting this user error, the results displayed properly (Figure 6).

Figure 5: The 'Bearing Distance to Line' tool.
If your X-Y data comes from a GPS, make sure your spatial reference is WGS84!


Figure 6: The output of 'Bearing Distance to Line'

Next, the 'Vertices to Points' tool was used to convert the endpoints of the generated lines into an independent feature class (Figure 7). With the 'Point Type' set to 'END', the output feature class contains only the endpoints of the bearing lines (Figure 8).

Figure 7: The 'Vertices to Points' tool.
Make sure you set the 'Point Type' to 'END' to only get the endpoints.
Before comparing the points to the aerial imagery, I projected the points to the Eau Claire County Coordinate System (WKID: 103417) so the aerial images and points would be in the same coordinate system (Figure 9).
Figure 8: Output of the 'Vertices to Points' tool.
Figure 9: The 'Project' tool with Eau Claire's County coordinate system selected.

Figure 10: The complete data flow model in ArcGIS Model Builder.
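
For anyone who prefers scripting over Model Builder, the same chain of tools can be run with arcpy. A rough sketch, assuming the placeholder geodatabase path, table name, and field names shown here (they are not our actual file names):

# Rough arcpy sketch of the workflow shown in Figure 10 (placeholder paths and names).
import arcpy

arcpy.env.workspace = r"C:\data\survey.gdb"          # placeholder geodatabase
wgs84 = arcpy.SpatialReference(4326)                 # GPS coordinates are WGS84
county = arcpy.SpatialReference(103417)              # Eau Claire County coordinate system

# 'survey_table' is assumed to be the output of the Table to Table import.
arcpy.BearingDistanceToLine_management(
    "survey_table", "bearing_lines",
    x_field="X", y_field="Y",
    distance_field="Distance", distance_units="METERS",
    bearing_field="Azimuth", bearing_units="DEGREES",
    spatial_reference=wgs84)

# Keep only the far endpoint of each bearing line.
arcpy.FeatureVerticesToPoints_management("bearing_lines", "survey_points", "END")

# Project the points to match the aerial imagery before comparison.
arcpy.Project_management("survey_points", "survey_points_county", county)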

After the point locations were generated by the 'Vertices to Points' tool, we were able to compare the results of the survey to the aerial image. The initial results were quite inaccurate across the entire surveyed area (Figure 11). We believed this was due to the inaccuracy of the GPS we used. To compensate, we compared the aerial image in ArcMap to a more recent image in Google Maps that showed the newly planted trees on campus, and added a new point where we thought the origin should be (Figure 12). After bringing the new spreadsheet into my geodatabase, I created a new map with the new origin point. The new origin brought the points closer to their true locations, but I thought the accuracy could be further improved by compensating for magnetic declination. The declination for our location is around 1 degree 6 minutes West, according to the NOAA Geomagnetic Calculator. I added a column that calculated new azimuth values by adding the geomagnetic declination of -1.1 degrees to the existing values. After adding the new spreadsheet to my geodatabase and mapping the points, it delivered higher accuracy than any of the previous attempts (Figure 13).
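
The correction itself is just a column calculation; a small sketch of the arithmetic (the magnetic azimuth below is an example value, not one of our readings):

# Converting a magnetic azimuth to a true azimuth for a declination of 1 degree 6 minutes West.
declination_deg = -1.1            # west declination entered as a negative value
magnetic_azimuth = 270.0          # example reading only
true_azimuth = (magnetic_azimuth + declination_deg) % 360
print(true_azimuth)               # 268.9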

Figure 11: Points from the original GPS point.
Figure 12: Original azimuth values from the new origin point.
Figure 13: Adjusted azimuth values from the new origin point.
Discussion:

'Firing' the laser from a sitting position allowed the point collector to maintain the same position for the entire length of the survey without the use of a tripod. It did, however, have one major drawback: some rocks became obscured by the slight drop in elevation between the data collection site and Little Niagara Creek (Figure 14). The accuracy of the point locations also seemed to decline as the distance between the observer and the surveyed point increased. This inaccuracy might have been caused by the equipment, or possibly by the slight tremor in my hands causing the laser to be fired off-target.

Figure 14: The red polygon shows the area viewed in the two images.
The top image was taken at the same location and perspective from which the laser was 'fired'.
The bottom image was taken at the same X-Y location as the top image, but was captured while standing.
If I were to do this lab again, I would set the laser on a tripod to eliminate any possibility of shaking hands affecting the recorded data, and to eliminate inaccuracy caused by the low surveying perspective. I would also move locations several times to shorten the distance between the survey location and the points surveyed, compensating for any mechanical inaccuracy. The tripod's positions would be measured using distance-azimuth from the corners of buildings, so a GPS wouldn't be needed. I would also make sure that the points I survey have actually been captured on an aerial image or another form of control.

Conclusion: 

This technique allowed my team to collect points over a large area substantially faster than other survey techniques would have allowed. It could be used when collecting points at a distance when survey-grade accuracy isn't necessary, such as mapping all of the "Wall Drug" billboards along I-90 in South Dakota. For surveying applications, this technique has been replaced by the total station, which uses the same principles but with substantially higher accuracy. The survey points with the new origin and azimuths adjusted for geomagnetic declination had sub-meter accuracy in some locations, but showed irregular patterns of inaccuracy. I believe this may have been a result of electromagnetic interference (EMI), but as my partner and I didn't have a second compass to verify the strength of EMI at our survey location, we can't be sure.


Sources:
http://www.ngdc.noaa.gov/geomag-web/