Monday, May 18, 2015

Field Methods: Unmanned Aerial Vehicle Flight and Mission Planning Processes

UAS Mission Planning and Flying

Conditions
Our study was conducted at the University of Wisconsin-Eau Claire Priory on Monday, May 11, 2015. Weather conditions were recorded with a Kestrel 3000 unit. The temperature was around 11.6° Celsius and conditions were overcast, with winds out of the east at three miles per hour, gusting up to eight miles per hour. The study was conducted between 1500 and 1800. My group consisted of the entire GEOG 336 Field Methods course.

Introduction and Study Area
Many applications exist for unmanned aerial systems (UAS's). However, unless you are operating the UAS, or watching someone operate it, it is hard to comprehend just how much planning and safety checking is actually occurring. A multitude of sensors, wires, cables, computer programs, camera triggers, and flight planning comes into play before the system even gets off the ground. In this exercise, the class was involved with the mission planning, preflight safety check, and flight phases of UAS operation.

As stated before, this exercise took place at the Priory, a 120 acre property owned by the University of Wisconsin-Eau Claire. The Priory is within Eau Claire County, in the town of Washington, approximately three miles south of the University of Wisconsin-Eau Claire campus. The north edge of the Priory is bounded by the eastbound lane of Interstate 94, while the south edge is bounded by Priory Road (Figure 1). The area lies in bedrock uplands, so relief at the Priory is considerable. Fluvial processes and anthropogenic activity have carved a number of very steep, deep, and long gullies that transect the property. A resistant bedrock ridge, interpreted as Late Cambrian Mt. Simon Formation, runs through the property, and the main building of the Priory has been built upon it. The two flights were conducted approximately 0.15 km southwest of the main building. This location was chosen because it was nearly free of restricting factors: it was easy to steer away from, or over, the trees; it was away from pavement, so a falling unit would have come down on grass; and it was away from the public, so no one (other than one of us, if we were not paying attention) could have been injured during the exercise.

Figure 1. Map showing the location and layout of the Priory.

Methods
The following figures (Figure 2 to Figure 10) highlight the various steps involved in the preflight and flight stages of UAS mission planning:

Figure 2. The first things on the checklist for preflight setup deal with weather and making sure that everything is connected. If these steps are not carried out and you are halfway through a job when a large storm rolls in, you have wasted a significant amount of flight time that will have to be reflown. If a propeller flies off as the UAS is coming in for a landing, the unit is going to flip and potentially break on impact. Proper preflight checking is a necessary precaution before any flight. The second portion of this first image shows the electronic checks, such as battery life, current voltage, and how many GPS satellites are visible overhead.

Figure 3. The next few steps listed at the top of this screen ensure the UAS is ready for takeoff. After everything has been checked on the unit and with the electronics, spectators are cleared for launch and the UAS takes flight. After the flight has concluded, the batteries are disconnected, the sensors are checked, and the UAS is confirmed to be entirely shut down before it is packed away again.


Figure 4. This photograph shows a mission planning program on a tablet, with a flight route devised on it. The tablet program is great for reducing the amount of equipment that needs to be taken into the field; however, it lacks much of the functionality of the desktop mission planning program. The desktop program shows specifications during flight and provides constant monitoring, while the tablet program is only suited to less precise, easier missions.

Figure 5. The IRIS UAS. This system was a lower-powered, smaller unit than the other UAS flown during this exercise. Despite strong winds, this unit performed quite well and remained stable during flight, aside from a few small rolls with the gusts. The unit is flown with the controller seen behind it, which also displays a number of specifications relating to the current flight.

Figure 6. Dr. Joe Hupy bringing the IRIS UAS in for a landing after the initial flight. The flight followed a basic zig-zagging route and, as the unit was equipped with a GoPro, it acquired a number of pictures that could later be mosaicked together into a single larger image of the area flown.

Figure 7. Classmate Michael Bomber setting up the Matrix UAS unit. This was a higher-powered, larger UAS than the IRIS. After the preflight check was completed and everything was in order, Dr. Hupy took off with the unit. A video at the bottom of this blog documents the flight of the Matrix. The strong winds and the size of the Matrix did not combine for a smooth flight, showing us that it does not take severe winds to throw off these systems.

Figure 8. Classmate Aaron Schroeder and myself setting up the survey station so that we could later collect ground control points to properly georeference the aerial imagery once we processed the data captured by the mounted GoPro.

Figure 9. Classmates Joel Weber and Michael Bomber using intersecting tape lines to collect discernible ground control points on the landscape around us to georeference the aerial imagery.

Results
Figure 10. An aerial image taken from the IRIS UAS as it traveled along its designated flight plan. The image was taken with a GoPro camera mounted on the IRIS; the camera provides very nice resolution and a wide-angle lens that captures a wide viewing window.

Figure 11. A mosaicked image from the flight of the Matrix UAS unit. The unit was not in the air for very long; however, it flew long enough to provide this mosaicked image. A few pixels are missing under the silver box on the right side of the image, but otherwise it is a good image.

Discussion
After conducting the flights of the IRIS and Matrix units, it is apparent that a number of things need to be considered and checked before a flight can truly be greenlit. Right away, as the IRIS was being set up, its battery pack was fried. Luckily we were very close to campus; otherwise this issue may have spelled the end of our UAS experience that day. Dr. Hupy was always slightly wary of the wind during this exercise, and with good reason. After the Matrix unit cleared the treetops, flight issues began almost immediately. The unit experienced one large roll that immediately cancelled the flight and brought it back to the ground. In a professional setting, conditions such as those would have cancelled the launch before we even got into the field.

Conclusion
This was a fantastic entry into the world of UAS flying and a promising look at what is to come for the university as it launches a course on Unmanned Aerial Systems in the coming fall semester. All that goes into mission planning, safety checking, and flying a UAS is more than enough to fill an entire semester of work. The potential applications of UAS's continue to boom. I was recently told of a new way to keep elephants from leaving reservation lands by using UAS's: to the elephants they sound like bees, so when a UAS approaches, the elephants flee back to the reserve and back to safety. This is just one of the many applications that drones have, and I am excited to see just how the field takes off in the years to come.

Thursday, May 14, 2015

Field Methods: Navigation with GPS

Navigation with a GPS device using a UTM Coordinate System

Introduction and Study Area
This exercise builds off of the previous exercise, navigating with a map and compass. I did not take part in navigating through the study area, the Priory, because I actually taught the class the orienteering methods necessary to complete the short navigation course. The orienteering exercise was a brief introduction to the methods necessary to navigate with more primitive tools, such as a map and compass, versus the now-normal method of using GPS devices such as a Garmin eTrex, a Trimble Juno 3 series handheld unit, or a variety of other handheld GPS units available on the market today.

This more recent exercise builds off of the orienteering exercise, but allows us to utilize the handheld GPS systems generously made available to us by the University of Wisconsin-Eau Claire Department of Geography and Anthropology. My group, consisting of Nicholas Bartelt, Les Warren, and myself, used a Trimble Juno 3 series unit to complete the exercise. The objective was to develop a course for the next section of Geography 336: Field Methods in the fall semester of 2015. As our initial orienteering exercise took place on the UW-Eau Claire property known as the Priory, this exercise did too. We were to lay out a five-point course for the incoming GEOG 336 students to navigate.

As stated before, this exercise took place at the Priory, a 120 acre property owned by the University of Wisconsin-Eau Claire. The Priory is within Eau Claire County, in the town of Washington, approximately three miles south of the University of Wisconsin-Eau Claire campus. The north edge of the Priory is bounded by the eastbound lane of Interstate 94, while the south edge is bounded by Priory Road (Figure 1). The area lies in bedrock uplands, so relief at the Priory is considerable. Fluvial processes and anthropogenic activity have carved a number of very steep, deep, and long gullies that transect the property. A resistant bedrock ridge, interpreted as Late Cambrian Mt. Simon Formation, runs through the property, and the main building of the Priory has been built upon it.

Figure 1. Map showing the location and layout of the Priory.

Methods
The completion of this exercise was carried out without issue. We started by plotting points on a previously developed navigation map of the Priory that utilized a Universal Transverse Mercator (UTM) coordinate grid (Figure 2).

Figure 2. Map developed for the orienteering exercise that was utilized for the GPS navigation exercise. The map was developed by Les Warren. 
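Part of what makes a UTM grid convenient for a navigation map is that coordinates are planar eastings and northings in meters, so the length of a leg between two plotted points reduces to simple Euclidean geometry. A minimal sketch, using hypothetical easting/northing values for two course points (not our actual coordinates):

```python
import math

def utm_distance(e1, n1, e2, n2):
    """Planar straight-line distance in meters between two points in the same UTM zone."""
    return math.hypot(e2 - e1, n2 - n1)

# Hypothetical zone 15N easting/northing pairs (meters) for two course points
p1 = (622500.0, 4958200.0)
p2 = (622650.0, 4958400.0)
print(round(utm_distance(*p1, *p2), 1))  # straight-line leg length: 250.0 m
```

This only holds within a single UTM zone; across a zone boundary the planar shortcut no longer applies.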

As stated before, the GPS unit used for this exercise was the Trimble Juno 3 series handheld unit. The Juno 3 has been used for numerous exercises prior to this and we have become very familiar with how to properly record field data with it (Figure 3).

Figure 3. This image shows the Trimble Juno 3 Series. This unit was used to record the locations of the points we picked to serve as one of the new courses for the next Field Methods class.
In the field, large trees were selected at the locations where we had determined we wanted points. The trees were marked with pink survey tape and numbered based on the group number and the point's position in the track (e.g., 2-1, 2-2, 2-3, 2-4, 2-5).

Results
Five trees were selected and the course was spread out over sizable terrain. Figure 4 shows a table with the XYZ coordinates of the points. Figures 5-9 show the flagged trees that were chosen in the field. Two maps were produced to show the elevation difference and the slope of the course we laid out (Figures 10-11).

Figure 4. Table that shows each point, what the point was placed on, and the XYZ coordinates for each point. XY are in reference to UTM coordinates and Z is listed in meters above mean sea level.


Figure 5. Point 2-1.

Figure 6. Point 2-2.

Figure 7. Point 2-3.

Figure 8. Point 2-4.

Figure 9. Point 2-5.

Figure 10. Data showing elevation at the Priory. The course covers an elevation difference of 42.1 meters.

Figure 11. Slope steepness by percent at the Priory. The course we chose traverses some steep slopes and passes through one large gully.

Discussion and Conclusion
There are a few potential sources of trouble for the students who will run this course in the fall. One issue, which I observed while watching two years' worth of students conduct the orienteering exercise, is that there seems to be an aversion to going down into gullies. To conduct an accurate exercise using proper orienteering methods, one must try to maintain straight lines and count paces. If students skirt around the edges of the gullies, their results will hardly be accurate. Another issue I noticed was groups not using their numbers to the best of their abilities; most groups never seemed to get this right. If a group has two or three people, proper resource allocation needs to be practiced: one person should act as a runner, one as an anchor, and one as the navigator. This ensures that if mistakes are made, the group can return and reevaluate. Misuse of resources and an aversion to gullies seem to be the largest downfalls groups encounter. If they are able to overcome these issues, they complete this exercise with relative ease.
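The pace counting that underpins the orienteering method can be sketched numerically. The pace length below is a hypothetical calibration value; in practice each navigator measures their own by walking a known distance:

```python
def paces_for_leg(leg_m: float, pace_length_m: float = 1.5) -> int:
    """Paces needed to walk a leg of the course.

    pace_length_m is a hypothetical double-pace calibration (distance covered
    each time the same foot strikes the ground), measured per navigator.
    """
    return round(leg_m / pace_length_m)

print(paces_for_leg(120.0))        # 80 paces for a 120 m leg at 1.5 m per pace
print(paces_for_leg(100.0, 0.75))  # 133 single paces at 0.75 m per pace
```

This is also why skirting a gully ruins the result: the counted paces no longer correspond to straight-line distance on the map.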

This was an easier exercise that allowed us to use our experiences and devise a course for future GEOG 336: Field Methods students. This exercise did not require an immense amount of critical thinking, though it did help to reinforce some of the skills we had learned before, like how to read maps and the land around us and how to properly utilize and extract data from the Juno 3 units.

Sunday, April 5, 2015

Field Methods: Conducting a Distance/Azimuth Survey

Conducting a Distance/Azimuth Survey

Introduction
Technology can fail. While it may be hard to believe, the technology we lean on so heavily has the potential to stop working, be inaccessible, or be too infuriating to understand. As geographers, it is easy to forget that many of the complex surveying and navigation tasks we now automate were once done with a measuring device and a compass. Presently, we have a plethora of technology at our fingertips that can quickly solve our problems. However, we must be ready to conduct field work without the use of a Juno or eTrex. The goal of this exercise was to learn how to manage if technology is unavailable to us.

In order to properly conduct a distance/azimuth survey, one needs two pieces of equipment: a compass and a measuring device (tape measure, measuring wheel, etc.). The compass allows us to find the azimuth, which is the horizontal direction of an object expressed on a 360° compass (NOAA). This tells us the direction (0-360°) of an object in relation to our position. The measuring device then tells us how far away the object of interest is. If we know our own latitude and longitude, we can determine where something lies on a map in relation to our position. For this exercise (ironically), we utilized a TruPulse laser system to determine distance and azimuth; however, the procedure would have been the same with a compass and measuring device. Using the TruPulse, we were able to acquire the azimuth and horizontal distance from our position to each object. My partner, Michael Bomber, and I chose to record car color and make. The survey was conducted in the parking lot behind Phillips Hall and the Davies Student Center on the University of Wisconsin-Eau Claire campus on April 2 at 12:30 (Figure 1). After acquiring the data, we brought it into ArcMap for processing.

Figure 1. Map showing the study area in which the exercise was conducted.

Method
Before heading into the field, we created a geodatabase for our data to be put into when we returned. In the field we positioned ourselves at two different locations. The first was off the southeast corner of Phillips Hall (Figure 2). We took sixty-nine data points at this first location, recording the color and company of cars parked in the lot. We then moved to a vent off the southeast corner of the Davies Student Center (Figure 3). The survey was conducted by mounting the TruPulse laser rangefinder on a tripod so that all of our data was collected from a consistent location.

Figure 2. Panoramic photo of the first location data was collected from off of the southeast corner of Phillips Hall. Sixty-nine different data points were collected from this location.

Figure 3. Panoramic photo of the second location data was collected from off of the southeast corner of the Davies Student Center. Thirty-one different data points were collected from this location.

Once we returned from the field, the data was imported from Excel into ArcMap with an ObjectID field and the latitude/longitude of the observation points. Two different locations were used, so there are two different latitude/longitude coordinate pairs. In ArcMap, the Bearing Distance To Line tool was used to create a series of lines based on the starting XY coordinates, the distance field, and the azimuth (Figure 4 and Figure 5). The points were recorded as geographic coordinates rather than in a projected coordinate system, so the WGS 1984 coordinate system should be used when running the tool.
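What the tool computes for each record can be approximated with the standard spherical destination-point formula. This is a sketch using a mean Earth radius (the ArcGIS tool itself may use a more precise geodesic calculation), and the observer coordinates below are hypothetical:

```python
import math

R = 6371000.0  # mean Earth radius in meters (spherical approximation)

def destination(lat, lon, azimuth_deg, distance_m):
    """Great-circle destination point from a start point, azimuth, and distance."""
    phi1, lam1 = math.radians(lat), math.radians(lon)
    theta = math.radians(azimuth_deg)
    delta = distance_m / R  # angular distance traveled
    phi2 = math.asin(math.sin(phi1) * math.cos(delta) +
                     math.cos(phi1) * math.sin(delta) * math.cos(theta))
    lam2 = lam1 + math.atan2(math.sin(theta) * math.sin(delta) * math.cos(phi1),
                             math.cos(delta) - math.sin(phi1) * math.sin(phi2))
    return math.degrees(phi2), math.degrees(lam2)

# Hypothetical observer position; a car sighted at azimuth 90° (due east), 50 m away
lat2, lon2 = destination(44.799, -91.500, 90.0, 50.0)
```

At parking-lot distances the spherical and geodesic answers agree to well under a centimeter, which is far below the rangefinder's precision anyway.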

Figure 4. The Bearing Distance To Line tool. The Excel table was the input table and the appropriate fields were designated from it. Bearing units were in degrees, distance was in meters, and the spatial reference was GCS_WGS_1984.

Figure 5. The study area and the result of the Bearing Distance To Line tool. The result was a series of lines originating from a common point and extending to a point determined by the recorded azimuth and distance. The ends of these lines match up with the vehicles we collected data for.

Next, the Feature Vertices to Point tool was used to create points at the ends of the lines (Figure 6 and Figure 7). One thing to remember is to select "End" under Point Type, or the correct vertex may not be created.

Figure 6. The Feature Vertices to Point tool. Data from the Bearing Distance to Line tool was input and the end vertices were requested to plot. 

Figure 7. Result of the Feature Vertices to Point conversion. The red points are the resulting points from the end vertices.

After the Feature Vertices to Point tool was run, the points were joined to the Excel table by ObjectID so that color and company could be displayed.
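The attribute join itself can be sketched in plain Python with hypothetical records (the field names and values here are illustrative, not the actual table schema):

```python
# Minimal sketch of a one-to-one attribute join keyed on ObjectID.
points = [  # end-vertex points, as produced by Feature Vertices to Point
    {"ObjectID": 1, "x": -91.4998, "y": 44.7992},
    {"ObjectID": 2, "x": -91.4995, "y": 44.7990},
]
attributes = {  # survey attributes from the Excel table, keyed by ObjectID
    1: {"color": "Red", "company": "Toyota"},
    2: {"color": "Silver", "company": "Ford"},
}
# Merge each point with its matching attribute record
joined = [{**p, **attributes[p["ObjectID"]]} for p in points]
print(joined[0]["color"])  # Red
```

ArcMap's join does the same lookup, with the ObjectID acting as the shared key between the spatial and tabular data.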

Results and Discussion
The resulting data can be expressed in a series of maps and graphs (Figures 8-12):
Figure 8. Graph showing the number of cars of various corresponding colors. Red, Silver, and Blue were the most common colors.

Figure 9. Spatial distribution of the color of cars from where we collected them. 

Figure 10. Zoomed in view of the distribution of points we collected for color.

Figure 11. Graph showing the number of cars of various corresponding companies. Toyota, Chevy, Ford, and Honda were by far the most common companies.

Figure 12. Map showing the location of various company makes of vehicles. 

There were a few things we wanted to pay attention to when collecting our data, based on problems past courses ran into. We made sure that the object we were pointing the laser rangefinder at was not too small, too far away, or moving; previous classes had difficulty collecting points under those conditions. We also had to make sure that the locations we used were findable on a current map of the area. The basemap we used was from 2013, so the new Davies Student Center was in place and the parking lot was mostly similar to its current layout. We were able to find the points we stood at with ease.

Conclusion
This exercise taught us how to record points in the field by conducting a distance/azimuth survey instead of using a GPS device. This is an extremely helpful technique that can let us conduct surveys if our technology were to fail or give out. Even though we used a laser rangefinder for this survey, we understand the underlying method, and we even used a compass a little to compare what it read against what the laser rangefinder was telling us. The skills learned in this exercise help develop a base from which to broaden our geospatial navigation skills in the remainder of this course.

References Cited
NOAA. (n.d.). Glossary - NOAA's National Weather Service. Retrieved April 5, 2015, from http://w1.weather.gov/glossary/index.php?letter=a

Sunday, March 8, 2015

Field Methods: ArcPad Data Collection Part 1

ArcPad Data Collection Part 1

Introduction
In a previous exercise entitled "Geodatabases, Attributes, and Domains," from March 1, 2015, I walked through the proper steps for creating and preparing a geodatabase for field research. In this instructional blog, I highlight just how important proper geodatabase construction is to easy, concise, and accurate field measurements, and instruct as to the proper method of deploying the newly created geodatabase into the field. In this case we are using a Trimble Juno 3 Series handheld global positioning system (GPS) device (Figure 1).

Figure 1. This image shows the Trimble Juno 3 Series. This unit was used in the field for data collection.

This device is equipped with two different types of software for data collection. One of these programs is called TerraSync, the native GIS platform for Trimble units (Trimble, 2015). The second is called ArcPad, one of the many programs in the ArcGIS suite from Esri (Esri, 2015). As ArcPad is part of the ArcGIS suite, it offers seamless integration with ArcGIS for Desktop, ArcGIS for Server, and ArcGIS Online (Esri, 2015). We will be using ArcPad to record our data in the field, as it lets us utilize the power of our already created geodatabase, built in ArcGIS with all its inherent parameters, to quickly and accurately collect data.

The procedure to properly prepare our geodatabase for deployment onto the Trimble Juno units is very brief, though if done incorrectly it can lead to issues in the field. For this exercise we collected data on the University of Wisconsin-Eau Claire campus (Figure 2).

Figure 2. This image shows the projected study area for the ArcPad Data Collection exercise.

Methods
The first step in preparing a geodatabase for deployment is to start a new ArcMap session. Open ArcMap and navigate to the folder in which the geodatabase is stored. The first thing to do is add the previously created point feature class that data points will be collected into (Figure 3). NOTE: Failure to add this layer first will result in the basemap image being drawn first, which will slow down the map in the field.

Figure 3. This figure shows the table of contents of a newly opened ArcMap file after the point file for the data has been added first (Trimble, 2015).

In this case we will only be using one feature class for data collection; however, if more were being used, this would be the time to add them. For example, one project I worked on looked at the distribution and amount of an invasive strain of reed canary grass on the banks of the Lower Chippewa River. Data collection for that project came in the form of points, lines, and polygons: points for very small patches of reed canary grass, lines for thin strips along the banks, and polygons for larger patches that required us to dock our canoes and walk the extent of the polygon. In that case, it was crucial that the proper features be easily laid out within the ArcPad menu for quick data collection.

The next step is to add the backdrop image to the map. For this step, it is necessary to select an image that will not be a large blur when zoomed to the proper level. Personally, I believe it is also necessary to select an image that allows for some field checking. If in an urban setting, choose an image that lets you see the study area and pick a reference feature that is discernible on the map. This will allow you to check whether your data points are being placed properly in the field by comparing where a point lands on the ArcPad interface with your actual location. Finally, ensure that the image is zoomed to an area that encompasses the study area, as this level of zoom will be cached and preserved for when the file is opened in ArcPad.

Following this, save the ArcMap file containing the feature class and properly zoomed image. Now we need to check out the data for use in ArcPad. The first step is to go to the Customize menu in ArcMap and select "Extensions" (Figure 4).

Figure 4. This image shows where the Extensions option can be found within the ArcMap toolbar. 
Select ArcPad Data Manager in order to enable the extension in ArcMap. Then go to Customize again, hover over the Toolbars dropdown menu, and select ArcPad Data Manager Toolbar. This toolbar allows the data to be checked out and made ready for ArcPad. Select the first icon to the right of the "ArcPad Data Manager" text. This is the "Get Data for ArcPad" icon (Figure 5).

Figure 5. This shows the ArcPad Data Manager Toolbar and all of its options (Esri, 2015).
Click "Next" on the initial page, then open the Action menu and choose "Checkout all Geodatabase Layers" (Figures 6-8). Unfortunately, when creating this blog, I did not have access to a version of ArcMap with the ArcPad Data Manager, as I was using a remotely sourced desktop version of ArcMap. I will therefore be using images from a previous student, Lee Fox, who took this course in spring 2014.


Figure 6. This image shows the second window, where the data is initially checked out (Fox, 2014). Select the Action menu and choose "Checkout all Geodatabase Layers."

The next step is to specify a name for the folder that will be created for the data. Under "Specify a name for the folder that will be created to store the data," enter the folder name, taking care not to include spaces (Figure 7). Make sure that under "Where do you want this folder to be stored?" the location of the saved ArcMap file is the one selected.

Figure 7. This image shows the Select Output Options menu within the Get Data From ArcPad menu (Fox, 2014).
After clicking Next to reach the final window, ensure that the "Create the ArcPad data on this computer now" option is selected, then select "Finish" (Figure 8). If the process succeeds, a screen will appear stating that the operation was successful.

Figure 8. This figure shows the final screen of the Get Data For ArcPad wizard (Fox, 2014). Selecting Finish will start the process of making the data ready for deployment.
Once the folder containing the deployable data has been created, make a copy of it in case the data is compromised in any way. This ensures the original data remains unaltered and immediately redeployable if need be. Then make sure the Trimble Juno unit is connected to the computer, copy the folder again, and paste it onto the SD card of the Trimble Juno.
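The copy-backup-deploy step can be sketched in Python. The paths in the commented call are hypothetical placeholders; substitute wherever the checkout folder was actually created and wherever the Juno's SD card mounts:

```python
import shutil
from pathlib import Path

def deploy_checkout(checkout: Path, backup: Path, sd_card: Path) -> None:
    """Copy the ArcPad checkout folder to a local backup and onto the Juno's SD card.

    The backup copy stays untouched on the computer so the original data remains
    redeployable if the field copy is compromised.
    """
    shutil.copytree(checkout, backup)   # pristine copy kept on the computer
    shutil.copytree(checkout, sd_card)  # working copy that travels to the field

# Hypothetical paths, for illustration only:
# deploy_checkout(Path(r"C:\FieldMethods\checkout"),
#                 Path(r"C:\FieldMethods\checkout_backup"),
#                 Path(r"E:\checkout"))
```

Keeping the backup on the computer rather than the SD card matters: if the unit is lost or the card corrupts, the checkout is still redeployable.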

Once you return from the field, the data simply needs to be copied from its location on the SD card back into the checkout folder created in the previous process. Make sure the ArcPad Data Manager extension is turned on, as before, and use the "Get Data From ArcPad" tool (Figure 5).

Results
Using the Trimble Juno 3 Series in the field, we gathered results for surface temperature, 2-meter temperature, wind chill, wind speed, humidity, dew point, and ground cover. This data was collected at 16 points. The results were used to create two sample maps (Figure 9 and Figure 10).

Figure 9. This image shows the map created from the temperature taken at the surface. Darker blue represents colder areas, while darker green represents warmer areas. The warmest areas are located over blacktop.

Figure 10. This image shows the wind speed over areas of campus. Dark green represents lower wind speeds, while dark red represents higher wind speeds.
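Of the variables we collected, wind chill is the one derived quantity; a handheld weather meter computes it internally, but the values can be sanity-checked against the standard NWS wind chill index, which works in Fahrenheit and miles per hour:

```python
def wind_chill_f(temp_f: float, wind_mph: float) -> float:
    """NWS wind chill index (2001 formula).

    Defined for air temperatures at or below 50 F and wind speeds above 3 mph;
    inputs in Fahrenheit and mph, output in Fahrenheit.
    """
    v = wind_mph ** 0.16
    return 35.74 + 0.6215 * temp_f - 35.75 * v + 0.4275 * temp_f * v

print(round(wind_chill_f(30.0, 10.0)))  # about 21 F, matching the NWS chart
```

Since our field readings were in Celsius, they would need converting to Fahrenheit before comparing against this index.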

Conclusion
Proper transfer of a geodatabase to ArcPad for deployment can make or break a data collection outing. If the geodatabase is not moved over properly, it can slow down the entire process and make field collection very problematic. Moved over properly, however, it usually results in a streamlined method of data collection, with quick response from the software and easy recording practices. This sets up the next step of the exercise, collecting significantly more data points, to be done in the same streamlined manner. The geodatabase used in this exercise was selected for deployment by the entire class in order to provide a standardized method of data collection, which will allow all the data to be merged quickly and easily after collection has concluded.

References Cited
Esri. (2015). ArcPad - Mobile Data Collection & Field Mapping Software. Retrieved March 7, 2015, from http://www.esri.com/software/arcgis/arcpad
Esri. (2015). ArcPad User Guide. Retrieved March 7, 2015, from http://webhelp.esri.com/arcpad/8.0/userguide/index.htm#arcpad_data_manager/concept_datamanager.htm

Fox, L. (2014, March 3). Field Activity #7: ArcPad Data Collection. Retrieved March 8, 2015 from http://uwecleefoxmethods.blogspot.com/2014/03/field-activity-8-arcpad-data-collection.html

Trimble. (2015). Trimble – Juno 3 Series Handheld | Trimble Agriculture. Retrieved March 7, 2015, from http://www.trimble.com/Agriculture/juno-3.aspx

Trimble. (2015). TerraSync. Retrieved March 7, 2015, from http://www.trimble.com/mappingGIS/TerraSync.aspx

Sunday, March 1, 2015

Field Methods: Unmanned Aerial Systems Mission Planning

Unmanned Aerial Systems Mission Planning

Introduction
Unmanned Aerial Systems (UAS) constitute a wide variety of remotely controlled aerial platforms, including fixed-wing, single-rotor, and multi-rotor aerial vehicles (Colomina and Molina, 2014; Military Factory, 2015). The versatility of these systems allows for a plethora of uses, from precision agriculture, to 3D mapping, to search and rescue efforts (Handwerk, 2013). UAS's have been around for quite some time now and were originally developed for military reconnaissance. In recent years, focus has shifted from purely military operations to recreational and commercial use. A multitude of technology exists to outfit these systems for the multitude of tasks they will be used for. Different cameras and sensors that can be mounted on the aircraft have different capabilities: they vary in spectral range, spectral bands, thermal sensitivity, and visible-band resolution (Colomina and Molina, 2014). Armed with this knowledge, it is possible to assess a number of different scenarios and determine the best way to use unmanned aerial systems to solve the issues presented.


Scenario 1
A power line company spends lots of money on a helicopter company to monitor and fix problems on their lines. One of the biggest costs is the helicopter having to fly up to the towers just to see whether there is a problem. Another is the cost of simply figuring out how to reach the towers from the closest airport.

In the situation provided above, a rotary wing UAS, such as the Draganflyer S6, would be the most fitting to complete this task (Figure 1). The S6 retails for $8995 per unit and comes equipped with a Sony QX100 digital camera with HD live stream video capabilities.

Figure 1. This shows an example of a rotary wing UAS. There are four rotors on this aerial system, improving maneuverability, stability, allowing for direct vertical or horizontal movements, and allowing for the ability to hover (Draganfly.com).

Rotary wing systems have a number of strategic advantages over fixed wing systems that make them better suited for this type of job. A rotary wing system can move vertically and horizontally as needed and can maneuver with greater agility than its fixed wing counterparts (UAV Insider, 2013). However, there are a number of questions that need to be answered before a decision is made.
  • What is the necessary flight range?
  • What is the necessary flight duration?
  • What weather conditions can be expected?
Rotary wing systems are more mechanically complex than fixed wing systems, leading to shorter flight ranges and lower speeds (UAV Insider, 2013). If the flight distance is too great, a rotary wing system may not be able to accomplish the task because it will be out of range, or its battery capacity will not be enough to finish the mission. Weather is also a constant factor to be wary of with UASs, as they are small and susceptible to being pushed around by wind at altitude.
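The range question above can be answered with a rough calculation before any flight is planned. The sketch below is a minimal feasibility check; the battery capacity, power draw, cruise speed, and reserve fraction are illustrative assumptions, not specifications of any particular aircraft.

```python
# Rough feasibility check for a rotary-wing inspection flight.
# All numbers below are illustrative assumptions, not real aircraft specs.

def max_range_km(battery_wh, avg_power_w, cruise_speed_ms, reserve_frac=0.25):
    """Estimate one-way reach with a battery reserve held back for landing."""
    usable_wh = battery_wh * (1.0 - reserve_frac)
    flight_time_s = usable_wh / avg_power_w * 3600.0
    # Out-and-back mission: half the total distance flown is the one-way reach.
    return cruise_speed_ms * flight_time_s / 2.0 / 1000.0

# Assumed values: 100 Wh battery, 350 W average draw, 10 m/s cruise.
reach = max_range_km(100, 350, 10)
print(f"one-way reach ~ {reach:.1f} km")  # ~3.9 km
```

If the nearest tower is farther than this figure, the mission would need a closer launch point, a larger battery, or a fixed wing system instead.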

If all necessary conditions are met, the next step is to decide on the proper equipment for the job. In this case, specialized sensors will not be necessary, because only items within the visible spectrum need to be examined. A good camera that can relay live video back to a field headquarters station will be the key piece of equipment. This will allow the operators to see where the downed lines are and record information that lets repair crews know where to go and what needs fixing.

A number of live-feed cameras exist that are mountable on UASs. One in particular, the Sony QX100, retails for $1595 (Figure 2). The Sony QX100 is gyro-stabilized, easily mountable on a UAS, and transmits a live feed that can be synced to a smartphone through a Sony app, letting the user control the zoom and trigger the shutter. The camera has a 20-megapixel sensor, an effective still resolution of 5472 x 3648, and a video resolution of 1440 x 1080 (Draganfly.com).

Figure 2. This camera is mountable on a UAS and provides live feed to a smart phone and the ability for the smart phone user to control the zoom and shutter (Draganfly.com).
This setup should provide everything needed to properly monitor the power lines for damage and a visual avenue to remotely assess that damage.

Scenario 2
A pineapple plantation has about 8000 acres, and they want you to give them an idea of where they have vegetation that is not healthy, as well as help them out with when might be a good time to harvest.

Given the study area in the situation provided above, a fixed wing UAS would be the most fitting to complete this task (Figure 3).

Figure 3. This shows a fixed wing system being launched. The Falcon system can carry a payload of 2 lbs, making it perfect for the equipment that will be explained. It also is able to travel long distances and has over a 3 mile range, making it a prime candidate for this type of mission (Falconunmanned.com).

A fixed wing system provides the longer flight durations and higher speeds that make it better equipped for large areas; however, it does require a runway or launcher for takeoff and landing, unlike a rotary wing system. The fixed wing aircraft in Figure 3 is the Falcon, which retails for $12,000.
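The scale of the job can be sanity-checked with a quick coverage estimate. The sketch below is a back-of-envelope calculation of how many flights it might take to image 8000 acres; the swath width, speed, sidelap, and endurance values are assumptions chosen only to illustrate the arithmetic.

```python
# Back-of-envelope estimate of flights needed to image an 8000-acre
# plantation. Swath width, speed, overlap, and endurance are assumptions.

ACRE_M2 = 4046.86  # square metres per acre

def flights_needed(area_acres, swath_m, speed_ms, overlap, endurance_min):
    effective_swath = swath_m * (1.0 - overlap)   # sidelap shrinks coverage
    area_m2 = area_acres * ACRE_M2
    track_length_m = area_m2 / effective_swath    # total survey-line distance
    total_min = track_length_m / speed_ms / 60.0
    return total_min, -(-total_min // endurance_min)  # ceiling division

minutes, flights = flights_needed(8000, swath_m=120, speed_ms=18,
                                  overlap=0.3, endurance_min=60)
print(f"{minutes:.0f} min of survey lines ~ {flights:.0f} flights")
```

Under these assumed numbers the survey lines alone take roughly six hours of flying, which is why a long-endurance fixed wing platform makes sense here while a rotary wing system does not.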

A number of questions should be addressed to better determine if unmanned aerial systems provide a viable option to assist the needs of the pineapple plantation.
  • What is the potential cause of the unhealthy vegetation?
  • What time of year is it?
  • What have current weather conditions been like?
The key factor that will be examined is the Normalized Difference Vegetation Index (NDVI), which essentially measures the density of green vegetation in an area (NASA EO). Healthy green plants absorb wavelengths in the visible spectrum and strongly reflect wavelengths in the near infrared spectrum. Unhealthy plants with less chlorophyll reflect more of the visible spectrum and less of the near infrared (Figure 4 and Figure 5).

Figure 4. This shows the electromagnetic spectrum from ultraviolet wavelengths to far infrared wavelengths (Vividlight.com). The two wavelength ranges necessary for this study are the visible light range and the near infrared range. The visible range will be used to monitor the color of the pineapples as they grow, while the near infrared range will be used to calculate NDVI.

Figure 5. This shows the amount of absorption of the visible and near infrared light spectrums (NASA EO). The healthy vegetation on the left absorbs most visible light, reflecting roughly 8% of it, while reflecting 50% of the near infrared light. The unhealthy vegetation on the right reflects only 40% of the near infrared spectrum but over three times as much visible light as the healthy vegetation.

A few pieces of equipment will be necessary for this assessment. The Tetracam Lightweight Agricultural Digital Camera (ADC-Lite) images in the 450-1050 nm wavelength range, perfect for capturing visible and near infrared imagery, and retails at $3795 (Figure 6). Its imagery can be used to create an NDVI image in a program such as Erdas Imagine.

Figure 6. This shows the Lightweight Agricultural Digital Camera (ADC-Lite) (Tetracam.com). This camera acquires imagery in the wavelengths between 450-1050 nm (Colomina and Molina, 2014).
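The NDVI calculation itself is simple: (NIR - red) / (NIR + red), giving values near +1 for dense healthy vegetation and values near 0 for stressed plants or bare soil. The sketch below applies the formula to reflectance values approximating the NASA EO illustration described above.

```python
# NDVI from red and near-infrared reflectance: (NIR - red) / (NIR + red).
# Reflectance values approximate the NASA EO example cited above.

def ndvi(nir, red):
    """Normalized Difference Vegetation Index for one pixel or plot."""
    return (nir - red) / (nir + red)

healthy = ndvi(nir=0.50, red=0.08)    # dense green canopy
stressed = ndvi(nir=0.40, red=0.30)   # less chlorophyll, more red reflected
print(f"healthy ~ {healthy:.2f}, stressed ~ {stressed:.2f}")
# healthy ~ 0.72, stressed ~ 0.14
```

Run band-by-band over the ADC-Lite imagery, this same formula is what software such as Erdas Imagine computes to flag the unhealthy patches of the plantation.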
A higher-end visible light camera, such as the Sony NEX-7, would be able to show vegetation colors and potentially help show when the pineapples have ripened and changed color. This unit retails for $1099, and a multitude of lenses can be purchased to enhance its zoom capabilities (Figure 7).

Figure 7. The Sony NEX-7 is a favorite visible light spectrum camera for UAS purposes (Colomina and Molina, 2014).
Conclusion
The use of unmanned aerial systems is a growing industry. Commercial and recreational uses continue to expand as low-altitude airspace fills with more and more users. The ability of a company to use a UAS to assess issues as they arise offers an interesting alternative to problems previously solved with conventional aerial methods or pricey ground reconnaissance. As the industry grows, demand for UAS will grow with it, and everything from precision agriculture, to monitoring protected herds of animals, to monitoring chemicals in the atmosphere will provide a never-ending well of data to analyze. Being informed and understanding the equipment necessary to complete a mission are essential to making the use of a UAS a cost-saving venture.

References Cited
Colomina, I., & Molina, P. (2014). Unmanned aerial systems for photogrammetry and remote sensing: A review. ISPRS Journal of Photogrammetry and Remote Sensing, 92, 79-97. Retrieved March 1, 2015, from http://www.sciencedirect.com/science/article/pii/S0924271614000501

Draganfly.com. (n.d.). Sony QX100 Camera System with Single Axis Stabilized Camera Mount. Retrieved March 1, 2015, from http://www.draganfly.com/sku/DF-QX100I-1B.php5#Zoom

Falconunmanned.com. (n.d.). Retrieved March 2, 2015, from http://www.falconunmanned.com/

Handwerk, B. (2013, December 2). 5 Surprising Drone Uses (Besides Amazon Delivery). Retrieved March 1, 2015, from http://news.nationalgeographic.com/news/2013/12/131202-drone-uav-uas-amazon-octocopter-bezos-science-aircraft-unmanned-robot/

Krock, L. (2002). Spies that fly. Retrieved March 1, 2015, from http://www.pbs.org/wgbh/nova/spiesfly/uavs.html

Military Factory. (2015, January 4). UAV and Drone Aircraft. Retrieved March 1, 2015, from http://www.militaryfactory.com/aircraft/unmanned-aerial-vehicle-uav.asp

NASA EO. (n.d.). Measuring Vegetation (NDVI & EVI). Retrieved March 1, 2015, from http://earthobservatory.nasa.gov/Features/MeasuringVegetation/measuring_vegetation_2.php

Tetracam.com. (n.d.). Tetracam Products. Retrieved March 1, 2015, from http://www.tetracam.com/Products1.htm

UAV Insider. (2013, September 8). Rotary Wing vs Fixed Wing UAVs. Retrieved March 1, 2015, from http://www.uavinsider.com/rotary-wing-vs-fixed-wing-uavs/

Vividlight.com (n.d.). Spectrum of Light. Retrieved March 1, 2015, from http://www.vividlight.com/29/images/Spectrum%20of%20Light.jpg