Assessments Page

About Me

I am a third-year student studying unmanned aerial systems at Purdue University.

Sunday, November 17, 2019

Using a C-Astral Bramor PPX for search and rescue

Introduction

When a person goes missing for any length of time, it is a serious situation. In this scenario, a person has been reported missing in Yellowstone National Park after failing to return home, and they have been gone for over 24 hours with no indication of where they went. Because there is so little information about where the person might be, this search and rescue (SAR) effort will be a more endurance-based operation. Given the expected length of the mission, a fixed-wing aircraft is the best choice, so the C-Astral Bramor PPX will be used. The lack of information also determines the type of search: of the two types, hasty and exhaustive, an exhaustive search will yield the best results, since the person has been missing for over 24 hours with no sign of their direction of travel. An exhaustive search provides systematic coverage of the area to ensure that no location is missed, giving the best chance of finding the missing person.

Method
Flight location
Because this is a national park, flying would normally be restricted, so obtaining a waiver to fly in the area is a necessity. This both complicates the operation and provides benefits: gaining the waiver takes time, but it also means the airspace will be clear of other drones, which minimizes collision risk. Before any flight, it is good practice to check the sectional chart of the area to identify any flight hazards that need to be avoided. Because Yellowstone is a national park, the airspace is fairly clear compared to somewhere like a city, which makes the search and rescue plan (from an airborne perspective) relatively straightforward.

Figure 1, The airspace around Yellowstone

Current Conditions

This scenario is meant to be as realistic as possible, and in search and rescue one does not always get to wait for clear weather. To determine the weather in the area, a METAR is needed, and to get that, an identifier is needed for the nearest weather station. In this case, it is the box-shaped airport symbol to the left of Yellowstone Lake: West Yellowstone, identifier KWYS. This was conveniently gathered from SkyVector.com, which is also the source of Figure 1. Figure 2 is the raw METAR, which gives all of the information needed to fly safely in a compact format.

Figure 2

The previous blog post covered how to decode a standard METAR. The highlights of this one are 13-knot winds, visibility down to 1 statute mile (SM), and light snow. The rest of the information is in Figure 3, a fully decoded version of the METAR.

Figure 3

With winds at 13 knots, it is important to consider the maximum wind speed of the UAS platform being used. This ensures that control can be maintained for the duration of the flight with no "sketchy" moments. There is also light snow, which matters for any equipment onboard the aircraft that cannot get wet. As the system runs, its parts will warm up, melting the snow and allowing water to seep into cracks. The snow can also melt and then refreeze on the airframe, so checking the airframe between launches and keeping a close eye on its flight characteristics will be required. This will be difficult with visibility at 1 SM and will most likely require a live video feed from the aircraft to the ground station to help monitor it.

Aircraft

When choosing a platform for an extended SAR, it is important to keep in mind the operating characteristics of the common types of airframes. Whether fixed-wing or multi-rotor, each has its pros and cons. Given that this is a long-running operation, a fixed-wing will provide the longest flight time while still carrying the necessary equipment. For this operation, a C-Astral Bramor PPX will be used. This aircraft has a maximum wind speed of 23 knots, well above the current 13-knot winds, and it can be operated in extreme cold (down to -25 degrees Celsius). Given the rough conditions the operation will run in, it is important to make sure the aircraft is not pushed to extremes and does not perform outside its envelope.
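To make that envelope check concrete, below is a minimal go/no-go sketch in Python. It uses only the numbers quoted in this post (a 23-knot wind limit, a -25 degree Celsius temperature limit, 13-knot reported winds, and 1 SM visibility); the temperature value and the visibility minimum are placeholders, and real limits should always be confirmed against the manufacturer's documentation.

```python
# Minimal go/no-go sketch using the limits quoted in this post.
# All values are illustrative placeholders, not official specifications.

BRAMOR_MAX_WIND_KT = 23     # quoted maximum wind speed
BRAMOR_MIN_TEMP_C = -25     # quoted minimum operating temperature

def go_no_go(wind_kt: float, temp_c: float, visibility_sm: float,
             min_visibility_sm: float = 1.0) -> bool:
    """Return True only if every reported condition sits inside the quoted envelope."""
    checks = {
        "wind": wind_kt <= BRAMOR_MAX_WIND_KT,
        "temperature": temp_c >= BRAMOR_MIN_TEMP_C,
        "visibility": visibility_sm >= min_visibility_sm,
    }
    for name, ok in checks.items():
        print(f"{name:12s}: {'OK' if ok else 'NO-GO'}")
    return all(checks.values())

# 13 kt wind and 1 SM visibility come from the KWYS METAR above;
# the temperature is a placeholder to be read off the decoded METAR in Figure 3.
print("Flyable:", go_no_go(wind_kt=13, temp_c=-10, visibility_sm=1.0))
```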

The Aircraft, C-Astral Bramor PPX


Sensor

There are two sensor options with the Bramor: an Altum multispectral camera and a Sony RX1. Both are good cameras, but because there are many trees around Yellowstone, the ability to color-filter the images in post-processing will make it much easier to distinguish the missing person from a tree, so the multispectral option is the better fit for this mission.

Search Area

With a bit of searching online, one can find that an average healthy person can walk 20 to 30 miles a day. As Yellowstone is known for being mountainous and hard to traverse, it is reasonable to assume the person covered about half that distance, giving a search radius of about 15 miles. Where the individual was when they went missing determines where the search area should be centered. Figure 4 shows two candidate circles: the red circle is centered on Old Faithful, a common tourist attraction that the person likely visited and may have gone missing from; the orange circle is centered on the West Thumb of the lake, on the assumption that they were near the lake and became lost in that area.
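To make the estimate explicit, here is the quick arithmetic behind that 15-mile figure; the 30-mile daily distance and the 50% terrain penalty are just the assumptions stated above, not measured values.

```python
import math

# Rough search-radius estimate from the assumptions stated above:
# up to ~30 miles walked per day on easy ground, roughly halved for
# Yellowstone's mountainous terrain, over about one day on foot.
max_daily_distance_mi = 30
terrain_factor = 0.5
days_on_foot = 1.0

radius_mi = max_daily_distance_mi * terrain_factor * days_on_foot
area_sq_mi = math.pi * radius_mi ** 2

print(f"Search radius: {radius_mi:.0f} mi")     # 15 mi
print(f"Search area: {area_sq_mi:.0f} sq mi")   # ~707 sq mi
```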

Figure 4


Discussion

Because this person has been missing for longer than 24 hours, it is important to cover an area larger than the distance they could have walked in that time. This will require a base camp and an area to launch and recover the Bramor. The Bramor will fly a lawnmower pattern that gradually covers the entire search area, ensuring no spot is missed and providing the best chance of finding the missing person. The SAR process is to launch the aircraft, have it fly a portion of the search area, land, download the data, and relaunch. Once downloaded, the data is processed and reviewed for any sign of the missing person. Any points of interest found in the imagery are passed to the base team, who direct rescuers from the base camp to those locations. This process repeats until either the search area has been exhausted or the person has been found.
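As a rough illustration of how the lawnmower pattern could be laid out, the sketch below generates parallel passes over a rectangular block in local meters. The block dimensions and the 200 m swath width are placeholder values, not parameters from the Bramor's actual mission planner.

```python
# Sketch of a simple "lawnmower" transect generator over a rectangular
# search block. Coordinates are local meters from one corner of the block;
# the swath width would come from the camera footprint at the planned
# altitude, so the 200 m value is only a placeholder.

def lawnmower(width_m: float, height_m: float, swath_m: float):
    """Return (x, y) waypoints covering a width x height block with parallel passes."""
    waypoints = []
    x = swath_m / 2          # center the first pass half a swath in from the edge
    going_up = True
    while x <= width_m:
        if going_up:
            waypoints += [(x, 0.0), (x, height_m)]
        else:
            waypoints += [(x, height_m), (x, 0.0)]
        going_up = not going_up
        x += swath_m
    return waypoints

for wp in lawnmower(width_m=1000, height_m=2000, swath_m=200):
    print(wp)
```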

Conclusion

With the support of an unmanned platform, a search and rescue operation that would otherwise have required multiple full-sized aircraft, and the large bill that goes with them, can be carried out far more economically. An unmanned platform allows the operation to cover the area quicker and gather data that is more detailed than what can be collected from a more expensive manned aircraft. By using an unmanned platform, SAR operations can be completed faster, cheaper, and with less disruption to normal airspace operations.

Monday, November 11, 2019

Decoding METARs, TAFs, and AIRMETs

Introduction
In the world of aviation, it is helpful to convey a lot of information quickly. One of the ways this is done is through the use of abbreviations. One place these abbreviations really pile up is in METARs (Meteorological Terminal Aviation Routine weather reports) and TAFs (Terminal Aerodrome Forecasts), which are best described as condensed weather reports that include everything a pilot needs to know. With so many abbreviations, there needs to be a set format that everyone understands so the information can be conveyed. In most cases the basic format is as follows:

Location - Date and Time - Wind conditions - Visibility - Temperature - Pressure - and then extra information.

Below is the METAR that will be decoded; the TAF, shown underneath the METAR, will be covered second.


Decoding and Discussion

METAR:



When decoding a METAR, it helps to remember the order in which the information is presented; first up is the location. This is given as the identifier of the airport that released the METAR; in this case it comes from the Purdue University Airport, KLAF. The next block of numbers, 081554Z, is the date and time of the report. The first two digits, 08, are the date, the 8th of November (the current month). The next four digits plus the Z are the time in Zulu time (hence the Z), so 15:54 Z. Zulu time is also known as Coordinated Universal Time (UTC); for a general reference, UTC is 5 hours ahead of Eastern Time. The next block, 25004KT, states the direction the wind is coming from and its speed in knots (KT), so the wind is from 250 degrees at 4 knots. The block after that, 10SM CLR, covers visibility and sky condition: visibility is 10 statute miles and the sky is clear (CLR).

The next two parts are the temperature/dewpoint and the altimeter setting. In this case the temperature group is M02/M09; typically an M would not be included, but here there was cold weather and M indicates a negative value. M02 means the temperature is minus 2 degrees Celsius, and M09 means the dewpoint is minus 9 degrees Celsius. The altimeter setting is given as A3051; the leading letter can be either A or Q and indicates the unit used. "A" denotes inches of mercury (Hg), and "Q" denotes hectopascals (hPa), also called millibars (mb), which are equivalent. In this case it is A3051; a decimal point goes in the middle of the four digits, making this measurement 30.51 inches of Hg.

Finally, there is the remarks section, indicated by RMK; everything after this is extra information that does not have a dedicated spot earlier in the METAR. After RMK it states AO2, which means the METAR came from automated equipment that CAN tell the difference between rain and snow; if it were AO1, the equipment would not be able to tell the difference, and a pilot would need to take that into consideration when flying into an airport with AO1 equipment. The last item is SLP338, the sea level pressure in millibars (mb), or hectopascals (as stated earlier, they are equal). This is probably the hardest part to decode, as it can get a little complicated. To start, place a 10 or a 9 before the three digits (338); a 9 is used if the three digits are greater than 500, and since 338 is less than 500 it gets a 10 in front. A decimal point also goes before the last digit, so the final value is 1033.8 mb (or hPa). Using all the information in this METAR, you have everything needed to understand the weather around a specific airport.
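As a rough illustration, the sketch below decodes those same fields in Python. The raw string is reconstructed from the values described above (the original figure is an image), so both the string and the parsing should be treated as illustrative rather than as a complete METAR parser.

```python
import re

# Raw METAR reconstructed from the fields walked through above.
raw = "KLAF 081554Z 25004KT 10SM CLR M02/M09 A3051 RMK AO2 SLP338"

def decode_temp(token: str) -> int:
    """'M02' -> -2, '05' -> 5 (M indicates a negative Celsius value)."""
    return -int(token[1:]) if token.startswith("M") else int(token)

m = re.match(
    r"(?P<station>\w{4}) (?P<day>\d{2})(?P<time>\d{4})Z "
    r"(?P<wind_dir>\d{3})(?P<wind_kt>\d{2})KT (?P<vis>\d+)SM",
    raw,
)
temp, dew = raw.split(" ")[5].split("/")           # "M02/M09"
altimeter = re.search(r"A(\d{4})", raw).group(1)   # "3051"
slp = re.search(r"SLP(\d{3})", raw).group(1)       # "338"

print("Station:   ", m["station"])
print("Day/time:  ", f"{m['day']}th, {m['time'][:2]}:{m['time'][2:]} Z")
print("Wind:      ", f"{m['wind_dir']} deg at {int(m['wind_kt'])} kt")
print("Visibility:", f"{m['vis']} SM")
print("Temp/dew:  ", decode_temp(temp), "/", decode_temp(dew), "C")
print("Altimeter: ", f"{altimeter[:2]}.{altimeter[2:]} inHg")                        # A3051 -> 30.51
print("SLP:       ", f"{'10' if int(slp) < 500 else '9'}{slp[:-1]}.{slp[-1]} hPa")   # SLP338 -> 1033.8
```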

TAF:


The TAF is a forecast for the day. The first part is the same as a METAR: it gives the location, KLAF, and the date/time in UTC, Nov 8th at 11:25 UTC (081125Z). The next group, 0812/0912, is how long the TAF is valid for (it starts Nov 8th at 1200 UTC and ends Nov 9th at 1200 UTC). The next portion to touch on is the forecast change indicators, stated on their own separate lines. The three listed in this TAF are FM081700, FM090000, and FM091000. These are forecast groups and are read similarly to the date and time on a METAR, but the FM means "from"; it tells anyone reading the TAF the predicted conditions starting at that date and time. There are three types of change indicators: FROM (FM), BECOMING (BECMG), and TEMPORARY (TEMPO). FROM means there will be a rapid change, usually occurring within less than an hour. BECMG is used when a gradual change is expected, taking roughly two hours, and is stated as, for example, BECMG 1416; the four digits are actually two times (minutes are not that important here), 1400 UTC and 1600 UTC. Finally, the TEMPO group is used when a condition is expected to occur occasionally and not last more than an hour; it is mainly used when the weather is in a given state and might occasionally shift to another level at a few points within the TAF's forecast period. An example is SCT030 TEMPO 1923 BKN030, which means scattered clouds are expected at 3,000 ft and will temporarily become broken cloud cover at 3,000 ft between 1900 UTC and 2300 UTC. (Cloud coverage and density will be discussed later, so just understanding the basics of what TEMPO means is fine for now.)
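As a quick illustration of how a FROM group breaks down, here is a tiny sketch that decodes the three groups listed above, assuming the FMDDHHMM format described in this paragraph.

```python
# Sketch: decoding a TAF "FROM" change group such as FM081700.
# Assumed format FMDDHHMM -> conditions change rapidly from day DD at HH:MM UTC.
def decode_fm(group: str) -> str:
    assert group.startswith("FM") and len(group) == 8
    day, hour, minute = group[2:4], group[4:6], group[6:8]
    return f"From the {day}th at {hour}:{minute} UTC"

for g in ("FM081700", "FM090000", "FM091000"):
    print(decode_fm(g))
```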

Back on the top line, next is the wind. Just like a METAR, it is given as the direction the wind is coming from followed by the speed in knots, so 32005KT translates to 320 degrees at 5 knots. VRB is also used in the forecast portion, stated as VRB03KT; this simply means the wind direction is variable rather than from a specific heading.

The next part is visibility, and again it is stated just like a METAR, in statute miles (SM). What is different from a METAR is that it is only stated up to 6 SM; anything greater is given a P prefix, so P6SM means greater than 6 statute miles. For each of the FROM groups and the top line, visibility is greater than 6 SM, or P6SM. The last portion is the cloud condition. The way this works is to consider the sky in eighths: clear means 0/8 of the sky is covered, few (FEW) is 1-2 eighths, scattered (SCT) is 3-4 eighths, broken (BKN) is 5-7 eighths, and overcast (OVC) is when the whole sky is covered with clouds. These codes are listed with numbers after them to tell pilots the height of the clouds in feet; to translate, append two zeros to the number, so FEW250 means a few clouds at 25,000 ft. The final part of the TAF is an amendment, or AMD, and everything afterward describes the amendment. In this particular case, the amendment is limited (LTD) to clouds, visibility, and wind until Nov 9th at 1400 UTC.
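The sketch below applies the eighths table and the two-zero rule described above to a couple of example groups; the coverage wording is paraphrased from this post.

```python
# Sketch: decoding sky-condition groups like FEW250 or BKN030, plus the
# P6SM visibility shorthand described above.
COVERAGE = {
    "SKC": "clear (0/8)", "CLR": "clear (0/8)",
    "FEW": "few (1-2/8)", "SCT": "scattered (3-4/8)",
    "BKN": "broken (5-7/8)", "OVC": "overcast (8/8)",
}

def decode_sky(group: str) -> str:
    cover, height = group[:3], group[3:]
    return f"{COVERAGE[cover]} at {int(height) * 100:,} ft"   # FEW250 -> 25,000 ft

def decode_visibility(token: str) -> str:
    if token.startswith("P"):                                 # P6SM -> greater than 6 SM
        return f"greater than {token[1:-2]} SM"
    return f"{token[:-2]} SM"

print(decode_sky("FEW250"))        # few (1-2/8) at 25,000 ft
print(decode_sky("BKN030"))        # broken (5-7/8) at 3,000 ft
print(decode_visibility("P6SM"))   # greater than 6 SM
```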

AIRMET:

An AIRMET is an advisory that covers an area and describes a specific type of weather condition within it. Figure 1 is a view from SkyVector, a website that offers a digital sectional chart and can overlay AIRMETs so the affected areas can be seen visually, rather than having to make sense of the block of text that makes up the AIRMET itself (see Figure 2).

Figure 1
Figure 2
The text version of an AIRMET (where all the information is found) is similar to a METAR or TAF, but a little different. The AIRMET decoded here is the large green one in Figure 1. Green indicates turbulence, light blue indicates icing, purple (in the bottom corner) indicates instrument flight rules conditions, and one color not in the image but also used is pink, which indicates mountain obscuration. The first part of the AIRMET states where it came from, reported from a few different airports at a specific time. This AIRMET is an update for turbulence and low-level wind shear and is valid until Nov 8th at 2100 UTC. The next large block of text is the description of the areas affected, listed by states and airports: first the affected states in their two-letter abbreviations, then the airports by their abbreviated identifiers. After that block, the next series of red text states that there is moderate turbulence, in this case specifically between two flight levels, 27,000 ft and 37,000 ft (just as with METAR cloud heights, add two zeros to the end to get the altitude in feet). It also notes that the conditions are continuing beyond 2100 UTC and are expected to end sometime between 0000 and 0300 UTC.

Conclusion

Being given a METAR, TAF, or AIRMET and being able to discern information from them is extremely helpful and can determine whether a flight will be both successful and safe. If conditions outside are too rough for a UAS platform, an experienced pilot can quickly tell from reading a METAR. A TAF is just as important because it provides a forecast and shows how quickly conditions are expected to change. AIRMETs, lastly, are not as critical as the other two, since they typically cover much higher altitudes than any UAS platform will legally fly at; still, they can provide important information such as mountain obscuration and icing conditions that may factor into flying the UAS. Together, a METAR and TAF can tell a UAS pilot nearly everything they need to know to fly safely and avoid posing a danger to bystanders and other aircraft.


Wednesday, November 6, 2019

Operating a UAS in complex airspace

Introduction

Sectional charts, especially over crowded cities, can look very complicated. They contain many different symbols and overlapping circles that are hard to interpret if one does not know where to look. In many cases, the necessary information can be found in the chart legend, which explains everything found on the chart for that specific area. The three scenarios below involve complex airspace, and each comes with a series of questions to answer: is approval needed to operate in the area, what are the potential hazards inside the airspace, and what important aspects of the airspace should be noted?

Scenarios and discussion

Scenario 1
In this scenario (Figure 1), a quadcopter is used to inspect the tower circled in orange (the red circle is Trump Tower, a no-fly zone, as this airspace happens to be over New York City). To answer the first question: yes, approval is needed to fly in this area, and it would come from JFK, which has airspace control here. A potential hazard is that two runways face this tower, so traffic will be heavy in the area. Along with the increased runway traffic, there are a few important aspects of this airspace to consider. Trump Tower, as stated earlier, is a no-fly zone, so geo-locking the quadcopter is a must. Because this is a city, there are also tall buildings to avoid, which can cause interference and increase the risk of an accident. Finally, some of those buildings have high-intensity lights; it will be crucial that the flight takes place before they are turned on, as they could blind the sensors being used and make the data unusable.

Scenario 2
The second scenario is to use a fixed-wing to map Carrington Island (circled in red). Approval is not needed to fly in this area, which answers the first question. Potential hazards include the restricted airspace just to the west of the island, as well as a VFR checkpoint and a VFR route that run just to the north and southeast respectively. These are areas to avoid, the restricted airspace especially (so geo-locking the fixed-wing is required). The VFR route will most likely have a higher concentration of aircraft following it; although they will be at a higher altitude than a fixed-wing UAS is legally allowed to fly, it is still good to know there could be a higher volume of air traffic along the route. The last thing to consider in this scenario is the isogonic line that runs right by the island, which marks the local magnetic variation; the aircraft's compass will likely need to be calibrated to account for it.

Scenario 3
In Scenario 3, an analysis of a forest on Fox Island is required (the red oval is Fox Island). Approval for this flight is needed and would come from Gray AAF (GRF); it also would not be a bad idea to alert Tacoma Narrows (TIW) that there will be operations in the area. The potential hazards are that there are two airports (GRF and TIW) right by the area of operation, and there is a group of obstructions just to the east of TIW. Class E airspace begins 700 ft above the island; a UAS should never be flown that high per FAA regulations, but it is still good to note. There will also be a high volume of air traffic in the area due to the surrounding airports, and the tall obstacles just east of TIW are a good thing to avoid.

Conclusion

Being able to identify hazards in the airspace is essential to safe UAS operations. Sectional charts are a useful way to determine what aviation hazards are in the area and what specific areas to avoid due to restrictions and other special cases. Being able to pick these hazards off the chart is what separates a good UAS pilot from a professional.

Tuesday, November 5, 2019

Using False color in ArcGIS

Introduction

This project looked at controlled-burn fields and used false color to visualize healthy and unhealthy vegetation. This is possible because of the sensor used on the aircraft, which, along with the traditional RGB bands, gathers near-infrared (nIR) and infrared (IR) bands. Visualizing these bands is done by replacing specific RGB bands with the IR or nIR bands. In the images below, the red band was replaced with IR or nIR, the green band was replaced with the red band of the visible spectrum, and the blue band was replaced with the green band.
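As a rough sketch of that band substitution, assume the imagery is loaded as a NumPy array with the bands ordered [red, green, blue, nIR]; the real band order depends on how the sensor's data is exported.

```python
import numpy as np

# Band substitution sketch: nIR shown as red, red shown as green,
# green shown as blue. Assumes a (rows, cols, 4) array ordered
# [red, green, blue, nIR]; adjust the indices for the real export order.
def false_color(img: np.ndarray) -> np.ndarray:
    red, green, blue, nir = (img[..., i] for i in range(4))
    return np.dstack([nir, red, green])

rgb_nir = np.random.rand(100, 100, 4)   # placeholder "image"
composite = false_color(rgb_nir)
print(composite.shape)                  # (100, 100, 3)
```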

The first two figures are the standard RGB photos of a field where a controlled burn took place; Figure 1 is the before picture and Figure 2 is the after.

Figure 1
Figure 2

The images are slightly different, as the post-burn image was taken later in the day (the fields had to be burned before the next series of images could be taken). One will notice a small patch of green in one of the fields where the burn did not consume all of the vegetation; this leads to an interesting color splotch in the center of the field when the colors are changed.

One will also notice that the post-burn image is duller and less colorful, since it was taken later in the day and did not have as much sunlight to brighten it. This leads to duller results when the colors are replaced. This is an important fact to keep in mind when taking IR images, as the light available during the day influences how the image turns out.

Method and Discussion

These images were run through the color-filtering process stated earlier: IR appears in the red band, red in the green band, and green in the blue band. The results are seen in Figure 3 (pre-burn) and Figure 4 (post-burn).

Figure 3
Figure 4

In the images, healthy vegetation appears red because healthy vegetation reflects IR strongly (absorbing it would be damaging to the plant). This makes the whole image very pink and red, since there is a lot of healthy vegetation.

In the post-burn images, the five burned fields appear green due to the color filtering. It is much easier to see large green patches of unhealthy vegetation in a farmer's field than to pick out brown patches of dead plants against bare dirt. There is also a small pink patch in one of the fields, touched on earlier; this is just a patch of healthy vegetation that was not burned.

The next capability of the camera is the normalized difference vegetation index (NDVI), which measures plant health. NDVI is considered the classic indicator of plant health: live green plants absorb visible red light and strongly reflect near-infrared, so they appear white (high NDVI) in these renderings, while black, in contrast, indicates dead or unhealthy vegetation, which can be seen in the five burned fields. Figure 5 is the pre-burn and Figure 6 is the post-burn.
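For reference, NDVI is computed as (nIR - red) / (nIR + red). The sketch below shows the calculation on placeholder arrays, assuming the red and near-infrared bands are available as NumPy arrays.

```python
import numpy as np

# Standard NDVI: (nIR - red) / (nIR + red). High values (rendered white here)
# indicate healthy vegetation; low values indicate bare or burned ground.
def ndvi(nir: np.ndarray, red: np.ndarray) -> np.ndarray:
    nir = nir.astype(np.float64)
    red = red.astype(np.float64)
    return (nir - red) / np.clip(nir + red, 1e-6, None)   # avoid divide-by-zero

nir_band = np.random.rand(100, 100)   # placeholder bands; real values would
red_band = np.random.rand(100, 100)   # come from the sensor's exported rasters
index = ndvi(nir_band, red_band)
print(index.min(), index.max())
```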

Figure 5
Figure 6

Conclusion

Using both of these visualizations when viewing farmland can be very beneficial, rather than trying to pick out unhealthy vegetation from all of the other brown, unhealthy-looking things around it. It helps farmers focus so they know where to apply chemicals or other treatments to save struggling crops. This can save farmers time and money, since they no longer have to spray the whole field and can treat just the affected spots.

Friday, October 18, 2019

Learning ArcGIS Online

Introduction

This project was undertaken as a way to learn how ArcGIS works and to get to know the program better. Because the tutorial chosen was based on the online version, ArcGIS Online was used. In this particular example, it was used to look at evacuation routes around Houston, TX before a hurricane.

Method

Because the tutorial held the user's hand throughout the project, it was fairly straightforward, walking through all the steps needed to build a map with the features required to assess where to focus resources. First, it took the evacuation routes published by the federal government and overlaid them onto the map. Second, it had the user locate the bayous, seen in Figure 1, which have a tendency to flood quickly during a hurricane.
Figure 1

After the bayous were identified, the next step was to view the percentage of households without a car. This is important because those families will need extra assistance to evacuate. The data came from government census data and was overlaid on the map once the desired attribute (percentage of households without a vehicle) was selected. In the full map view, the dark blue areas are where the highest percentages of these families live and where the most families are at risk, seen in Figure 2.

Figure 2

Discussion/ Results

This program can be a little tricky; following the tutorial kept the user from getting confused by the complicated procedures the program requires to display data properly. The program can be used to plan and model many different scenarios, and using all of the data displayed after following the tutorial, it would be possible to plan an evacuation if needed.

Conclusion

As one uses this program, it becomes apparent that the user interface is not great, which can make the program difficult to use since many things are not intuitive. Once the user works through the tutorial and gains a good understanding of the program, all sorts of interesting comparisons and data analysis become possible.

Wednesday, October 16, 2019

Thermal Insulation

Introduction

When using an IR sensor, it is important to see how different types of insulation affect heat transfer to and from objects. In this lab, four plastic aircraft were put into four separate bags insulated in different ways. One was placed in a bag with the air removed, another in a bag with air, the third was taken from a refrigerator and placed into a bag with air, and the fourth was wrapped in a cloth. With these four insulation setups and the IR sensor, it is possible to compare the effectiveness of each choice.

Methods

First, the two aircraft pictured are the plain one and the one that had been in the fridge. Number 2 in Figure 1 can be identified as the one from the fridge because it appears darker than number 1, even though they both hit the water at about the same time.
Figure 1

In Figure 2, all four bags are in the water. Numbers 1 and 2 are the same as before, while 3 is the airless bag and 4 is clearly the cloth-wrapped bag.

Figure 2

As the bags sat in the sous vide bath, they appeared paler on the IR sensor, meaning they had started absorbing heat from the water. The only plane that remained dark was number 4; the other three were all about the same shade, as seen in Figure 3.

Figure 3



Discussion/Results

As seen in the figures, different types of insulation work for different amounts of time. The cloth worked best for the longest, keeping its plane the coolest. This is useful to know, especially when using an IR camera.

Conclusion

When using an IR sensor, it is important to keep in mind how insulation will affect the readings. If a flight takes place in the morning, before the heat of the day, some objects will look different based on how well they are insulated against heat.

Wednesday, October 9, 2019

Using an infrared camera



Introduction

Using a forward-looking infrared (FLIR) camera, a series of images were taken in both standard visible light and infrared (IR). This was to demonstrate the properties of an IR camera along with some of the advantages and drawbacks of using one. In this lab, two IR sensors were used: a handheld device that was easy to walk around with, and a DJI Zenmuse XT2 mounted on a DJI M600, where the M600 was used only as a stable platform over a sous vide cooker (Figure 1).

Figure 1

Methods

IR cameras allow some interesting effects to be seen that otherwise would not be visible. For example, in Figure 2 there is a cold can of soda and a mug filled with boiling water. The water appears bright white on the camera, indicating that it is the hottest thing the camera is detecting. After lifting the mug, one can see that some of the heat transferred to the table, leaving a bright white ring where the mug had been (Figure 3).


Figure 2
Figure 3

The next thing the handheld IR sensor was used on was the air conditioning vent in the ceiling. The vent was outputting cold air, which made the surrounding ceiling tiles appear orange and comparatively warm. In the applied color filter, purple/magenta indicates the cold surface being cooled by the air.


Figure 4, a filter was applied to color the image but the sensor still worked the same

A thermal camera can also be used to see the hot exhaust from a jet engine, as well as hot spots on devices that are in use, seen in Figures 5 and 6.

Figure 5, Hot jet exhaust
Figure 6, The FLIR camera generates a lot of heat

Discussion/ Results

An IR sensor can be very helpful for identifying heat, or the lack of it, in specific areas where visible light alone will not help. Although IR seems very beneficial, it does have some drawbacks. Surfaces that reflect light easily can cause issues; for example, a tree's leaves can appear black or white depending on their angle, which can make combing an area for specific heat values and signatures difficult with an unmanned platform. Another thing that makes IR sensors tricky to use is that they cannot "see" through glass. If an IR sensor is pointed at glass, no heat signatures behind it will show up; instead, the glass can act like a mirror and reflect heat from behind the camera, as seen in Figures 7 and 8. Figure 7 is a normal camera view, while Figure 8 is an IR view with an RGB overlay, where the outline of the student on the other side of the glass comes from the overlay rather than from any heat signature.

Figure 8, the thermal view reflecting from behind the camera
The outline is the student behind the glass
Figure 7, a student standing behind a glass window

Conclusion

IR sensors are very helpful in many situations, but they do have drawbacks. Since they do not typically include an RGB sensor, distinguishing objects from one another based solely on their heat signatures can be difficult. Between the inability to see through glass and the way plants can affect readings, there are some challenges in using IR sensors, but as long as these are kept in mind when reviewing the data, and the IR sensor is paired with an RGB sensor where possible, the few drawbacks can be mitigated.

