Monday, May 23, 2016

Warfare by Remote Control

Remote warfare has been the desire of armies for centuries, pursued and achieved through weapons of ever-increasing range.  When your force’s effective range exceeds that of the enemy, you can attack the enemy without putting your own forces at risk.  This was true during Napoleon’s time and continues to be true today.  Lt. Col. John Janiszewski, chief of the experimentation and analysis directorate for the Army's Unit of Action Maneuver Battle Lab, said, "it's important that our Soldiers become capable of using unmanned vehicles efficiently because their use means fewer Soldiers being exposed to dangers of the battlefield” (FIND, 2006).  It may seem to some that holding the enemy at risk without placing your own troops in harm’s way is unfair, but the nature of war is to gain an advantage over the enemy.  Some will question whether unmanned, remote warfare represents a just war.
The philosopher Augustine is credited with developing Just War Theory in the Western tradition (Augustine: Just War, 2002).  He identified “two aspects of war that required moral justification and guidelines: the right to go to war (Jus Ad Bellum) and the right sorts of conduct in war (Jus In Bello)” (2002).  For the most part, the United States (US) Department of Defense (DoD) uses remote warfare within the Laws of Armed Conflict (LOAC).  Because the DoD employs remote warfare in many of the same roles as manned aircraft, Jus Ad Bellum is satisfied; however, some will argue that the means of remote warfare are unjust.  Supporters of this position cite the just war principles that state,
“The weapons used in war must discriminate between combatants and non-combatants and civilians are never permissible targets of war, and every effort must be taken to avoid killing civilians. The deaths of civilians are justified only if they are unavoidable victims of a deliberate attack on a military target” (Ferraro, 2010).
Unmanned aerial systems (UAS) can actually discriminate better between combatants and non-combatants, since they can loiter over and surveil a target for long periods before weapons release.  The operator of the unmanned aircraft (UA) has better knowledge of the situation than the crew of an attack or bomber aircraft, which only flies ingress, release, and egress and has not built up the pattern of life the way a UAS crew will.  Still others argue it is preferable to expose your own troops to danger or even death rather than cause noncombatant deaths.  However, exceptions are made when non-combatant deaths are unavoidable in an attack on a legitimate target.  In unmanned systems the executor of the war is comfortably located thousands of miles from the conflict.  This is the ultimate high ground with regard to range, as discussed at the outset.  In general, UAS employed by the US DoD execute the same missions as manned aircraft; however, they employ their capabilities differently, whether in sortie duration or in weapons tactics, techniques, and procedures.



References

Augustine: Just War. (2002). Retrieved from Great Philosophers: http://oregonstate.edu/instruct/phl201/modules/Philosophers/Augustine/augustine_justwar.html
Ferraro, V. (2010, February 1). Principles of Just War. Retrieved from Mt Holyoke: https://www.mtholyoke.edu/acad/intrel/pol116/justwar.htm
FIND. (2006, August 16). Army Tests Remote Warfare, Soldier Performance. US Department of Defense Information.



Sunday, May 22, 2016

Right Man for the Job

The Insitu ScanEagle and General Atomics Ikhana are significantly different aircraft, albeit with a similar purpose: terrestrial observation.  The smaller ScanEagle is approximately 5 feet long with a wingspan just over 10 feet and a maximum takeoff weight of just under 50 pounds (Insitu, 2015).  The Ikhana, a variant of the Predator B, is significantly larger at 36 feet long with a wingspan of 66 feet, and can take off at 10,000 pounds (Drdla, 2015).  Both aircraft perform similar missions with their electro-optical or infrared imaging sensors.  Since both aircraft have variants that have been type certified by the Federal Aviation Administration (FAA), this case study assumes both will hold certificates of airworthiness (Kasitz, 2015; Aerospace Daily & Defense Report, 2013).
During this hiring effort the company will focus only on the operations crews; the launch and recovery crews are not subject to the same qualifications and training.  The operational concept is to operate the ScanEagle with one crew member, the pilot in command also acting as sensor operator.  The Ikhana will be crewed by two members, one acting as pilot in command and the other as co-pilot and sensor operator.
Due to the nature of the positions being hired, certain qualifications must be met.  Because of the assumption that both unmanned aerial systems (UAS) will maintain a certificate of airworthiness, all applicants will require a pilot certificate to be competitive for the position.  The FAA requires a pilot certificate for the pilot in command of any aircraft with an airworthiness certificate; additionally, a supplemental pilot, which the sensor operator will be, requires at a minimum private pilot ground school (Seipel, 2013).  In addition, applicants must be able to obtain and maintain a Flying Class II physical.
Once an individual is hired, the training program begins.  Each airframe will have a specific plan of instruction (POI) for initial qualification training (IQT).  This follows a logical progression, beginning with fundamental knowledge of the system itself, including the aircraft, ground station, launch equipment, and recovery equipment.  The next area of instruction is nominal operation of the equipment, beginning with launch, then flight and sensor operation, and finally landing or recovery.  Once the knowledge base has been built, the student progresses to simulator operations and finally to flying the real system in an operational environment.  Upon successful completion of a comprehensive final flight checkout, the operator will be certified for unsupervised operations.  The ScanEagle certification covers the single crew position.  In the Ikhana system the initial certification will be in the co-pilot/sensor operator position; a similar POI for upgrade training leads to the pilot in command position, after which the operator is dual certified in both positions.  The initial training courses will be taught by the manufacturer.  Once a team of subject matter experts is developed, training will move in-house and be taught by instructors from within the company.  Periodic rechecks will occur to maintain currency in each certified position.


References
Aerospace Daily & Defense Report. (2013, June 26). FAA Type Certifies First UAS for Commercial Ops. Retrieved from Aviation Week Network: http://aviationweek.com/awin/faa-type-certifies-first-uas-commercial-ops
Drdla, K. (2015, February 3). NASA Airborne Science Program - Ikhana. Retrieved from NASA: https://airbornescience.nasa.gov/aircraft/Ikhana
Insitu. (2015). ScanEagle. Retrieved from Insitu: https://insitu.com/images/uploads/pdfs/ScanEagle_SubFolder_Digital_PR080315.pdf
Kasitz, K. (2015, June 15). Certifiable Predator B Completes Critical Design Review. Retrieved from General Atomics and Affiliated Companies: http://www.ga.com/certifiable-predator-b-completes-critical-design-review
Seipel, J. D. (2013, August 2). Airworthiness Certification of Unmanned Aircraft Systems and Optionally Piloted Aircraft. U.S. Department of Transportation Federal Aviation Administration.



Is It Worth It? Operational Risk Management

I chose to create an Operational Risk Management (ORM) Assessment Tool (Figure 1) for the Boeing/Insitu ScanEagle small unmanned aerial system (SUAS).  The ScanEagle is used by the Department of Defense in an intelligence, surveillance, and reconnaissance (ISR) role.  The ScanEagle is capable of carrying an electro-optic imager or a dual imager (Boeing, n.d.).  It has an operational ceiling above 15,000 feet and a mission duration of 20 hours (Boeing, n.d.).  The ScanEagle launches from a Mark 4 launcher or the more mobile Compact Mark 4 launcher and is auto-recovered by the SkyHook recovery system, which catches the aircraft mid-flight (Army-Technology, 2016).  A unique feature of the launch, control, and recovery systems is that they are interoperable with all other unmanned aircraft developed by Insitu.
The ORM assessment tool development began with the Preliminary Hazard List and Assessment (Figure 2).  To create this list I ran through a mission from launch to recovery, addressing potential known risks of unmanned aerial systems (UAS) and expanding on those specific to the ScanEagle system.  Since the aircraft itself costs less than $100,000, no risk was assessed above marginal (Barnard, n.d.).  It is assumed that during launch and recovery operations, personnel will not be positioned where higher levels of risk would result.
Following the Preliminary Hazard List and Assessment, the Operational Hazard Review and Analysis (Figure 3) was created.  When the mitigation actions of the Preliminary Hazard List and Assessment were integrated into the risks, many of the probability ratings were reduced.  In addition, many of the severity ratings were reduced after mitigation actions were incorporated.
The ORM assessment tool followed similar planning, going from mission planning to launch, then to the mission itself, and finally to recovery of the aircraft.  In addition to the risks of the aircraft, the human aspect is also accounted for.  This includes the pilot crew members as well as the launch and recovery crew members.  The level of mission planning begins the ORM assessment tool because it is the first factor to increase or decrease risk.  Deliberate mission planning is rated the lowest risk because more factors can be assessed and preventative measures taken.  Crews are the next area assessed in the ORM tool, beginning with which portion of a crew rotation they are in.  Earlier in the rotation is rated lower because personnel are more rested and focused in this portion; the latter portion of a rotation will show more fatigue and a greater possibility of personal error.  The level of experience also affects risk.  Launch and pilot crews are assessed separately because of the unique duties they must perform.  The launch crews require fewer launches to reduce their risk due to the automated nature of the launch, while the duration of sorties exposes the pilot crew to greater risk.  Since the ScanEagle can fly for up to 20 hours, the weather at the launch site may vary greatly from the weather in-flight; for this reason each has a specific portion of the assessment.  Day operations receive a lower risk than night operations.  During the mission it is possible for the ScanEagle to lose its link.  A lost-link response that returns the aircraft to base is the lowest threat, while continuing the mission is the highest; during continuation of the mission it is unknown whether the link will be reestablished, increasing the risk.  Shorter missions are rated less risky partly because there is less opportunity for a mishap to occur.  This also ties in with the next assessment, which is crew makeup: a single crew is less of a risk than a mission transferred to a new crew.
A changeover briefing or procedure helps reduce this risk; however, there is no substitute for having experienced the previous portions of the mission.  If a crew is non-current, they must regain their currency before unsupervised operations; current crew members are assessed the lowest risk.  Although the ScanEagle is a small system, enemy threat cannot be ignored.  If intelligence assesses no threat, the lowest value is assigned; in the event of a threat, the level of risk is raised.  The final area of concern is GPS jamming.  Many areas of operation will have no GPS threat; however, in those parts of the world where GPS jammers proliferate, the ORM risk is raised.  Once all areas are assessed, the totals are summed and compared against thresholds to determine the overall level of risk for a particular mission.
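The lookup-and-sum logic described above can be sketched in a few lines of code.  The factor categories follow the discussion, but the specific point values and risk thresholds here are illustrative assumptions, not the actual figures from Figure 1.

```python
# Illustrative sketch of an ORM assessment tool like Figure 1.
# Factor names follow the text; point values and thresholds are hypothetical.
RISK_FACTORS = {
    "mission_planning": {"deliberate": 1, "abbreviated": 3},
    "crew_rotation":    {"early": 1, "late": 3},
    "illumination":     {"day": 1, "night": 2},
    "lost_link_action": {"return_to_base": 1, "continue_mission": 3},
    "crew_makeup":      {"single_crew": 1, "crew_changeover": 2},
    "enemy_threat":     {"none": 0, "assessed": 3},
    "gps_jamming":      {"none": 0, "proliferated": 3},
}

def assess_mission(selections):
    """Sum the point value of each selected condition and bin the total."""
    total = sum(RISK_FACTORS[factor][choice]
                for factor, choice in selections.items())
    if total <= 5:
        level = "LOW"
    elif total <= 10:
        level = "MEDIUM"
    else:
        level = "HIGH"
    return total, level

# A night changeover mission that continues after lost link scores high:
total, level = assess_mission({
    "mission_planning": "deliberate",
    "crew_rotation": "late",
    "illumination": "night",
    "lost_link_action": "continue_mission",
    "crew_makeup": "crew_changeover",
    "enemy_threat": "none",
    "gps_jamming": "none",
})
print(total, level)  # 11 HIGH
```

The appeal of this shape is that mitigation actions from the hazard analysis translate directly into lower point values for the affected conditions.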

Figure 1. Operational Risk Management Assessment Tool




Figure 2. Preliminary Hazard List and Analysis for ScanEagle Small Unmanned Aerial System




Figure 3. Operational Hazard Review and Analysis



References
Army-Technology. (2016). Army-Technology. Retrieved from ScanEagle2 Unmanned Aircraft System, United States of America: http://www.army-technology.com/projects/scaneagle-2-unmanned-aircraft-system-uas/
Boeing. (n.d.). ScanEagle Unmanned Aerial Vehicle. Retrieved from Historical Snapshot: http://www.boeing.com/history/products/scaneagle-unmanned-aerial-vehicle.page
Barnard Microsystems. (n.d.). InSitu Group ScanEagle A15. Retrieved from Barnard Microsystems: http://www.barnardmicrosystems.com/UAV/uav_list/scaneagle.html



Automatic Takeoff and Landing: Manned vs. Unmanned

Takeoff and landing are not only the most stressful portions of each flight, they are also the most dangerous.  Between 2004 and 2013, 80% of fatal aircraft accidents occurred during takeoff and climb or descent and landing (Walker, 2015).  Often the cause of these accidents was determined to be pilot error.  One way to reduce this risk is to remove the pilot from potential pilot errors by automating these segments of the flight.  Automatic takeoff and landing systems are being integrated into manned and unmanned aircraft.  The Boeing 777 is one manned aircraft with this capability.  The Boeing/Insitu ScanEagle unmanned aircraft system (UAS) is one of many UAS that incorporate automated takeoff and landing.  Automated takeoff and landing systems are drastically different for manned and unmanned aircraft.
The Boeing 777 utilizes a complex series of systems and triggers to progress through takeoff, climb, cruise, descent, landing, and rollout.  The first step of the B777 automated takeoff is to set the throttles and begin rolling; the autothrottles take over as speed increases.  As the aircraft accelerates above 100 knots indicated airspeed (KIAS), the flight computer records the barometric altitude, which will feed the vertical navigation (VNAV) system when it is engaged.  The pitch command then rotates the aircraft.  At 50 feet the lateral navigation (LNAV) system engages, and at 400 feet VNAV engages (B777, nd).
The landing portion of the flight begins with runway alignment; the B777 system is capable of correcting for crosswinds and crabbing as necessary.  At 50 feet radio altitude the system begins the flare maneuver.  Once the aircraft descends below two feet radio altitude, the system engages the rollout mode, which allows for touchdown.  The autopilot performs rudder and nose gear steering during rollout (B777, nd).  All of these modes require complex interoperability among many systems to accomplish a safe flight.
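The speed- and altitude-based triggers above can be sketched as a simple event sequence.  The threshold values come from the text; the code structure itself is a hypothetical illustration, not Boeing's actual autoflight logic.

```python
# Hypothetical sketch of the autoflight mode triggers described above.
# Thresholds (100 KIAS, 50 ft, 400 ft, 2 ft) are from the text;
# everything else is an illustrative simplification.
def takeoff_modes(kias, agl_ft):
    """Return the autoflight modes engaged at a given speed and altitude."""
    modes = ["AUTOTHROTTLE"]
    if kias > 100:
        modes.append("BARO_ALT_RECORDED")   # feeds VNAV once engaged
    if agl_ft >= 50:
        modes.append("LNAV")
    if agl_ft >= 400:
        modes.append("VNAV")
    return modes

def landing_modes(radio_alt_ft):
    """Return the landing-phase mode at a given radio altitude."""
    if radio_alt_ft < 2:
        return "ROLLOUT"    # touchdown; rudder and nose-gear steering
    if radio_alt_ft <= 50:
        return "FLARE"
    return "APPROACH"       # runway alignment, crosswind correction

print(takeoff_modes(kias=140, agl_ft=450))
# ['AUTOTHROTTLE', 'BARO_ALT_RECORDED', 'LNAV', 'VNAV']
print(landing_modes(radio_alt_ft=30))  # FLARE
```

Writing it this way makes the contrast with the ScanEagle obvious: the B777 sequence is a cascade of conditional mode changes, while the ScanEagle has essentially one launch state and one recovery state.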
The Boeing/Insitu ScanEagle uses a much more rudimentary automated takeoff and landing system.  The ScanEagle is “catapult launched from a pneumatically operated wedge launcher with a launch velocity of 25m/s [55 mph]” (Naval-Technology, 2016).  Once the ScanEagle is airborne it is commanded over a 900 MHz UHF datalink.  The ScanEagle can fly autonomously back to home station, where “the patented SkyHook recovery system is used for retrieval. SkyHook catches the aircraft’s wingtip with a rope that hangs from a 50-foot-high (15-meter-high) boom” (Boeing, nd).  Essentially the landing is a controlled crash in which the aircraft is snagged out of the sky by the SkyHook; the ScanEagle does not have any landing gear.
Each of these automated takeoff and landing systems is appropriately suited to its aircraft's mission.  The $100,000 unmanned ScanEagle is disposable compared to the $320 million Boeing 777 with up to 365 passengers (Ausick, 2014).  The ScanEagle’s failsafe action is to crash; it has no alternative to the SkyHook recovery.  The B777 automated takeoff and landing system, however, can be interrupted at any portion of the flight profile.  Additionally, portions of the automated system can be disengaged individually, such as VNAV, LNAV, or the autothrottle.  One limitation of the automated system arises in contingency situations such as Flight 1549, where Capt. Sullenberger landed his Airbus A320 on the Hudson River following a bird strike shortly after takeoff.  The automated system does not have the situational awareness or alternative thinking and reactions of a human; in such cases a human pilot is necessary for safe operation.
When it comes to automated takeoff and landing, the complexity of the system is proportional to the aircraft operating it.  The small ScanEagle is actually better served by simplicity, which makes the system much more transportable and usable for forward deployed troops.  The B777, however, must operate in varying environmental conditions and airports.  More than that, it is responsible for the safe transportation of its passengers and therefore requires a much more complex system to ensure safety.


References
Ausick, P. (2014, June 27). 24/7 Wall St. Retrieved from Why a Boeing 777-300ER Costs $320 Million: http://247wallst.com/aerospace-defense/2014/06/27/why-a-boeing-777-300er-costs-320-million/
Boeing. (nd). ScanEagle Unmanned Aerial Vehicle. Retrieved from Historical Snapshot: http://www.boeing.com/history/products/scaneagle-unmanned-aerial-vehicle.page
Naval-Technology. (2016). Naval-Technology. Retrieved from ScanEagle, United States of America: http://www.naval-technology.com/projects/scaneagle-uav/
Walker, R. (2015, March 28). The Globalist. Retrieved from When Do Planes Crash: http://www.theglobalist.com/when-do-planes-crash/

Shift Work


Here's my proposed 6 on - 2 off shift schedule for 4 teams.  I chose to rotate each team one shift earlier each time a change was made.  This schedule works two nights, two swings, two days, two off.   This 2-2-2 rotation is similar to what many space operations squadrons use.  Rotating this way also helps reset the body's circadian rhythm back to a day shift prior to the days off.
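The rotation is easy to express as an 8-day cycle with each team offset by two days.  This sketch reconstructs the schedule from the description above (names like "swing" follow the text; the team offsets are my assumption about how the four teams are staggered).

```python
# Sketch of the proposed 2-2-2 rotation: two nights, two swings,
# two days, then two days off, with each of the 4 teams offset by 2 days.
CYCLE = ["night", "night", "swing", "swing", "day", "day", "off", "off"]

def shift_for(team, day):
    """Shift worked by `team` (0-3) on calendar `day` (0-indexed)."""
    return CYCLE[(day + 2 * team) % len(CYCLE)]

# Every day, each of the three shifts is covered by exactly one team,
# and the fourth team is off:
for day in range(8):
    worked = sorted(shift_for(t, day) for t in range(4) if shift_for(t, day) != "off")
    assert worked == ["day", "night", "swing"]
```

Note the rotation direction matches the description: each team moves one shift earlier (night to swing to day) before its days off, easing the circadian reset.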

Beyond Line of Sight Unmanned Aerial System Ops

Drones, unmanned aerial vehicles (UAV), unmanned aircraft systems (UAS), and remotely piloted aircraft (RPA) have grown exponentially over the two decades since they were introduced in the Balkan conflict.  These systems, which began as large Department of Defense (DoD) assets, have evolved into both advanced DoD systems and consumer-level systems.  Most consumer UAS, from the highly advanced DJI Phantom series to the five-inch Micro Drone, operate strictly in a line-of-sight configuration.  Line of sight (LOS) operations occur when the transmitter and receiver can “see” each other along a straight line, free from obstacles or even the curvature of the earth.  Some common examples of line-of-sight communications are the radio in your car and family radio service “walkie-talkies.”  Line-of-sight operation works well for a consumer product or tactical military operations; however, for greater endurance and range a beyond line of sight (BLOS) capability is required.
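The range limit the earth's curvature imposes on a LOS link can be estimated with the standard radio-horizon approximation.  This sketch assumes the common 4/3-earth refraction formula, d ≈ 4.12·√h (h in meters, d in km); the antenna heights in the example are illustrative, not tied to any particular system.

```python
from math import sqrt

# Radio horizon under standard 4/3-earth refraction:
# each antenna sees roughly 4.12 * sqrt(height in meters) kilometers.
def radio_horizon_km(h_meters):
    return 4.12 * sqrt(h_meters)

def max_los_range_km(tx_height_m, rx_height_m):
    """Maximum LOS link distance: the sum of each end's radio horizon."""
    return radio_horizon_km(tx_height_m) + radio_horizon_km(rx_height_m)

# A ground antenna at 10 m talking to an aircraft at 5,000 m (~16,400 ft):
print(round(max_los_range_km(10, 5000)))  # 304 (km)
```

Even from high altitude the horizon caps a direct link at a few hundred kilometers, which is why a 12,300 nm mission demands satellite relay.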
The Northrop Grumman RQ-4 Global Hawk is an unmanned high-altitude long-endurance intelligence, surveillance, and reconnaissance (ISR) platform capable of flying in excess of 65,000 feet with a 12,300 nm range (Northrop Grumman Systems Corporation, 2011).  This clearly exceeds any LOS communications capability.  To overcome this challenge the Global Hawk utilizes satellite communication for command and control (C2) and voice communications, while “broadband communications via commercial satellites serve as the primary data link for transmitting imagery” (Integrated Systems Western Region, 2007).  Satellite communication can be deceiving, because it is itself a LOS communication; however, the footprint of a geosynchronous communication satellite is much larger, approximately one-third of the earth.  Both the transmitter and the receiver need to be within that footprint, unless crosslinks or other networked communications are used.
The Global Hawk system “is operated by the 12th Reconnaissance Squadron at Beale Air Force Base, California, and the 348th Reconnaissance Squadron at Grand Forks AFB, North Dakota” while “aircraft are rotated to operational detachments worldwide” (Air Force, 2014).  At the launch and recovery elements (LRE) located around the world, LOS communication is used; in the Pacific theater, “the LRE located at Andersen Air Force Base provides line of sight launch and recovery capability and transitions C2 and communications to a remote pilot ground station at Beale AFB for BLOS enroute flight at or above FL500” (International Civil Aviation Organization, 2011).  The Global Hawk system flies autonomously, given preplanned missions or ad hoc flight profile corrections developed by the pilots.
Communication is the number one concern during LOS-to-BLOS transitions of the Global Hawk.  Through voice or data communications and standard operating procedures, command of the aircraft must be positively transferred.  The autonomous flight of the Global Hawk minimizes the human factors risks during the transfer from LOS to BLOS, unless human interaction interferes.
Commercial application of BLOS capabilities can greatly reduce cost and man-hours for inspection or monitoring tasks while also increasing personnel safety.  Burlington Northern Santa Fe (BNSF) has partnered with the Federal Aviation Administration (FAA) to explore UAS operations in rural and isolated areas. Burlington Northern Santa Fe “will explore command-and-control challenges of using UAS to inspect rail system infrastructure” (FAA, 2015).  Another use would be by the National Oceanic and Atmospheric Administration (NOAA) to monitor arctic ice conditions as they impact commercial fishing and merchant ships.  The potential for commercial use of BLOS UAS is nearly limitless; however, regulations currently inhibit the expansion of the capability.


References
Air Force, U. (2014, October 27). U.S. Air Force. Retrieved from RQ-4 Global Hawk: http://www.af.mil/AboutUs/FactSheets/Display/tabid/224/Article/104516/rq-4-global-hawk.aspx
FAA. (2015, May 6). Federal Aviation Administration. Retrieved from Press Release – FAA-Industry Initiative Will Expand Small UAS Horizons: https://www.faa.gov/news/press_releases/news_story.cfm?newsId=18756
Integrated Systems Western Region, N. (2007, March 1). Northrop Grumman. Retrieved from Global Hawk Maritime Demonstration System: http://www.northropgrumman.com/Capabilities/RQ4Block10GlobalHawk/Documents/GHMD-New-Brochure.pdf
International Civil Aviation Organization. (2011). The Twenty-First Meeting of the APANPIRG ATM/AIS/SAR Sub-Group (ATM/AIS/SAR/SG/21). Global Hawk Pacific Operations, (pp. 1-2). Bangkok, Thailand.
Northrop Grumman Systems Corporation. (2011, October 17). Northrop Grumman. Retrieved from Q4-HALE Enterprise: http://www.northropgrumman.com/Capabilities/GlobalHawk/Documents/Brochure_Q4_HALE_Enterprise.pdf

Can Unmanned Aerial Systems Integrate into the National Airspace System?

The Federal Aviation Administration’s (FAA) Next Generation Air Transportation System (NextGen) program seeks to increase efficiency, safety, and performance of the air traffic system within the United States.  The NextGen program centers around a combination of infrastructure, equipment, and procedural updates.  These upgrades include Automatic Dependent Surveillance-Broadcast (ADS-B), Collaborative Air Traffic Management Technologies, Data Communications, National Airspace System Voice System (NVS), NextGen Weather, and System Wide Information Management (SWIM) (FAA, Federal Aviation Administration, 2014).  A main capability these programs provide is the ability to share data among air traffic control centers, aircraft, airports, and commercial airline operations centers.  Increasing the data sharing amongst these agencies allows more precise decision making, enabling greater efficiency of air operations which can reduce emissions and produce savings for the operators as well as passengers.  This paper will address some of the NextGen programs that most directly relate to the introduction of unmanned aircraft systems (UAS) to the national airspace system (NAS).
The air transportation system is finally catching up with what vehicles on the ground have utilized for decades: using GPS to broadcast position, speed, and altitude.  The ADS-B Out portion of the system “uses GPS to determine an aircraft’s location, airspeed and other data. It broadcasts that information to a network of ground stations (which relays the data to air traffic controllers) and to nearby aircraft equipped to receive the data via ADS-B In” (FAA, NextGen Implementation Plan 2015, 2015).  The ADS-B In portion of the system “provides operators of properly equipped aircraft with weather and traffic information delivered directly to the cockpit,” such as the Flight Information Service Broadcast (FIS-B) and the Traffic Information Service Broadcast (TIS-B) (2015).  This system will reduce the potential for human error when determining and broadcasting position, heading, and airspeed.  Pilots with ADS-B In terminals will have more accurate information available without needing to request data or transcribe it from the radio.  Mandatory equipage of ADS-B Out transponders begins in 2020; however, there is no mandatory date for ADS-B In equipage.  This ensures all aircraft feed the system while only those requiring the data must install ADS-B In equipment.  The ADS-B systems will require an increase in data communications on and off the aircraft.
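ADS-B Out is, at heart, a periodic unsolicited broadcast of GPS-derived state to everyone listening.  A minimal sketch of that pattern (the field names and classes here are illustrative assumptions, not the actual 1090ES/DO-260B message layout):

```python
from dataclasses import dataclass, asdict

# Illustrative ADS-B Out style state report; field names are
# assumptions for the sketch, not the real message format.
@dataclass
class StateReport:
    icao_address: str     # unique airframe identifier
    lat_deg: float        # GPS-derived position
    lon_deg: float
    alt_ft: int
    groundspeed_kt: int
    track_deg: int

def broadcast(report, receivers):
    """ADS-B Out: push the same report to every listener in range
    (ground stations relaying to ATC, and ADS-B In equipped aircraft)."""
    for rx in receivers:
        rx.append(asdict(report))

ground_station, nearby_aircraft = [], []
broadcast(StateReport("A1B2C3", 37.6, -122.4, 35000, 450, 270),
          [ground_station, nearby_aircraft])
print(nearby_aircraft[0]["alt_ft"])  # 35000
```

The one-to-many shape is the point: the transmitting aircraft neither knows nor cares who receives the report, which is exactly why only the Out side can be mandated while the In side stays optional.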
We live in an increasingly data dependent world.  No longer is it acceptable to lose data capability during a flight.  NextGen Data Communications (Data Comm) closes that gap.  Data Comm enables “controllers and pilots to communicate with digitally-delivered messages” (FAA, NextGen Implementation Plan 2015, 2015).  Removing the necessity of radio voice communication will reduce miscommunications between air traffic controllers and aircraft.  The Data Comm implementation begins with predeparture clearances and route revisions.  Some realized benefits during initial stages of Data Comm rollout in Memphis and Newark included, “reduced communications time resulting in faster taxi outs, reduced delays and reduced pilot and controller workload” (2015).  With the increased implementation of the ADS-B system, Data Comm will become more important and will realize even greater benefits.
The evolution of the NAS brings with it an increased need for better, networkable communication across the country.  The NAS Voice System (NVS) answers this need.  Rather than the “current voice switches operated independently at individual facilities, NVS will use router-based communications linked through the FAA Telecommunications Infrastructure (FTI) network” (FAA, NextGen Implementation Plan 2015, 2015).  The new router-based communication eliminates the obsolescence problem the current system is facing.  It will also allow shifting “controller workload from one air route traffic control center to another as needed” (2015).  This flexibility will allow centers to adapt to surges caused by weather, maintenance, or unexpected outages.
Unmanned aircraft systems have, in effect, been using many of the features of NextGen in their operations already.  Ground control stations operate in a similar fashion to air traffic control when they control multiple UAS.  This is a benefit NextGen can realize from UAS operations: the human factors lessons of unmanned systems.  The NextGen system relies increasingly on automation and data sharing to fulfill its goals.  Much of this relies upon systems resident on the aircraft, such as ADS-B transceivers and Data Comm modules.  The addition of such equipment does not pose a large burden for manned aircraft, from Cessna 152-sized aircraft up to jumbo airliners like the Boeing 787.  Unmanned aircraft, in contrast, are much more like satellite systems, with strict constraints on size, weight, and power.  In addition, communications equipment associated with NextGen systems may create electromagnetic interference with the command, control, and data links used to operate the UAS.
Communication is key to successful aviation.  With manned aircraft, the pilot in command is resident in the aircraft itself.  Large UAS such as Predators and Global Hawks are often operated from a geographically separated facility.  This may require the UAS itself to act as a communications relay, transmitting data from NextGen systems to the UAS and then back to the command and control node, which increases data and communication latency.  This latency hinders the UAS pilot’s decision-making ability as well as the response time once a decision is made.  In aviation, decisions and responses are time critical and can mean the difference between life and death.
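The latency cost of a satellite relay can be bounded from geometry alone.  This sketch assumes a geostationary satellite at about 35,786 km directly overhead and counts only light travel time, ignoring ground-network and processing delays.

```python
# One-way light travel time through a geostationary relay hop.
# Assumes ~35,786 km altitude, satellite directly overhead (best case),
# and ignores all ground and processing delays.
C_KM_PER_S = 299_792.458
GEO_ALT_KM = 35_786

def one_hop_latency_ms(sat_alt_km=GEO_ALT_KM):
    """Ground -> satellite -> ground: up and back down at light speed."""
    return 2 * sat_alt_km / C_KM_PER_S * 1000

# A command travels GCS -> satellite -> UA, and the acknowledgment or
# telemetry returns the same way: two hops minimum per exchange.
round_trip = 2 * one_hop_latency_ms()
print(f"{round_trip:.0f} ms")  # 477 ms
```

Nearly half a second of delay on every command-response exchange, before any processing is added, is what makes BLOS operation lean so heavily on aircraft autonomy.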
The NextGen system was developed to address commercial and general aviation concerns in an increasingly congested NAS; unmanned systems were not a central concern at its advent.  Yet UAS flight hours increased nearly sixfold between 2005 and 2010 (Weatherington, 2010), and since then the types of UAS and quantity of flights have exploded.  The past six years have seen a boom in civil, consumer, and military UAS use.  Integration of UAS into the NAS will be a difficult task.

References
FAA. (2014, November 18). Federal Aviation Administration. Retrieved from NextGen Programs: http://www.faa.gov/nextgen/programs/
FAA. (2015). NextGen Implementation Plan 2015. Washington, DC: Office of NextGen.
Weatherington, D. (2010). Unmanned Aircraft Systems. OUSD (AT&L)/PSA.


Unmanned Aerial System Ground Control Stations Have Their Issues Too

The RQ-11 Raven is a small hand-launched unmanned aircraft system (UAS) produced by AeroVironment, Inc., capable of performing tactical intelligence, surveillance, and reconnaissance (ISR).  It is currently employed by the United States’ armed forces as well as allied partners and commercial entities.  The UAS is “operated by two Soldiers and has a rucksack-portable design. No specific military occupational specialty is required” (U.S. Army, 2014).  The Raven unmanned aircraft (UA) operates at ranges up to 10 km from the ground control unit (GCU) commanding it (AeroVironment, 2016); however, the UA is capable of transferring command responsibilities to another GCU located beyond its standard range (Headquarters, 2006).  AeroVironment utilizes a common ground control station for its Wasp AE, Raven, and Puma UASs (2016).  By using a common ground control station, AeroVironment reduces the training burden for operators using multiple weapon systems or transferring between UASs.  This research focuses on the Raven’s use of this ground control unit.
The Raven ground control station (GCS) has multiple interfaces.  This configuration helps support the varied conditions users may find themselves in.  The first interface is the hand controller, which resembles a personal gaming system.  The hand controller consists of a unit with a “video screen with text overlays, a joystick, a toggle switch, a four-way “hat” control and four buttons (two on the front, two on the rear)” (Stroumtsos, Gilbreath, & Przybylski, 2013).  Being a hand-held unit, the display is limited in size and in what can be displayed.  The unit can toggle between flight modes, zoom the camera, and display video (2013).  Given the limited number of input devices and displays, interacting with the UA becomes difficult.  Another difficulty in using the handheld GCS is the requirement for a hood to block out extraneous glare.  While this makes the display easier to see, it removes all situational awareness of what is happening around the operator, making them more vulnerable.  AeroVironment offers another GCS solution to help negate some of these drawbacks.
The laptop interface utilizes the capabilities of the handheld GCS and adds increased display and input features.  The laptop connects to the handheld GCS via an Ethernet cable.  This interface can graphically display much more detail, such as a graphical heading, graphical waypoints, and the above-ground altitude of a single waypoint (Stroumtsos, Gilbreath, & Przybylski, 2013).  Stroumtsos, Gilbreath, and Przybylski state that “All of the shortcomings of the current GCS can be addressed using a laptop-based GCS with the appropriate software” (2013).  This is a bold design premise; while software developers are capable of outstanding work, the claim may overstep the realistic bounds of software.  Nevertheless, the Space and Naval Warfare Systems Center (SPAWAR) created a multi-robot operator control unit (MOCU).  The goal of this system is to provide a single control unit for use with unmanned systems regardless of the domain in which they operate.  It utilizes the laptop interface and an Xbox 360 controller.  Both of these pieces of hardware are likely familiar to operators, which helps reduce the training burden; a student operator only needs to learn the software portion of the system.  Yet even if, as the authors say, anything can be addressed with the right software, there are other human factors issues at stake in the Raven’s GCS design.
The U.S. Army field manual on UAS operations acknowledges “a single factor such as human error or materiel failure seldom is the only cause of an aviation accident. Accidents are more likely to result from a series of contributing factors” (Headquarters, 2006), however, immediately following this paragraph it places human factors as the first bullet under “Accident Causes.”
One human factors issue is the lack of situational awareness when an operator is utilizing the handheld GCS.  The operator is unable to monitor his or her surroundings while head-down inside the hood.  Additionally, some operators describe the character-based symbology as difficult to understand.  One may argue that Predator or Reaper operators also lack situational awareness of what is occurring outside their GCS; the big difference, however, is the distance between the UA and the GCS.  Raven operators are within 10 km of the area of interest they are surveilling.  In addition, when switching between the hooded view and the outside world, “the change in brightness causes loss of vision for 5-15 seconds” (Stroumtsos, Gilbreath, & Przybylski, 2013).  This lost time could be crucial during an engagement with the enemy.  Since the need for the hood results from a dim screen, a couple of potential solutions arise.  One is operating the handheld GCS from within a shelter, though this will likely reduce the man-portability of the system if a shelter is also required.  Another option is tinted eyeglasses, goggles, or contact lenses.  A study conducted in 2005 examined the effects of absorptive contact lenses on four subjects with retinal dystrophy.  When the individuals were given the lenses, “all of them expressed great comfort and improvement of their ability to use their remaining vision more effectively” (Fernandes, 2005).  A final option is to provide variable brightness on the handheld GCS in order to overcome daylight.  Any of these solutions, or a combination of them, would increase the operator’s ability to maintain situational awareness of his or her surroundings.
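The variable-brightness option can be sketched as a simple mapping from ambient light to a backlight command.  The lux thresholds and the log-linear interpolation below are invented for illustration; they are not drawn from the Raven’s documentation.

```python
# Illustrative sketch of the variable-brightness mitigation: scale the
# display backlight with ambient illuminance so the screen stays readable
# in daylight without a hood. Threshold values are assumptions.
import math

def backlight_level(ambient_lux, min_level=0.1, max_level=1.0):
    """Map ambient illuminance (lux) to a 0-1 backlight command.

    ~50 lux is taken as dim indoor light, ~10,000 lux as full daylight;
    interpolation between them is linear in log-lux, since perceived
    brightness tracks light level roughly logarithmically.
    """
    lo, hi = math.log10(50), math.log10(10_000)
    x = math.log10(max(ambient_lux, 1.0))
    frac = (x - lo) / (hi - lo)
    frac = min(max(frac, 0.0), 1.0)  # clamp to [0, 1]
    return min_level + frac * (max_level - min_level)
```

A design like this would also soften the 5-15 second adaptation loss quoted above, since the eye never has to jump between a hooded screen and full daylight.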
Another issue that needs to be addressed is the specificity of the hardware to a particular airframe.  The SPAWAR report states, “the majority of GCS hardware is vehicle specific and cannot be used to control other vehicles or used for any other purpose” (Stroumtsos, Gilbreath, & Przybylski, 2013).  This means an operator who flies one UA is unlikely to be able to switch UAs in the field because of differing configurations and hardware.  In battle there may be situations where the Raven is not available but other UASs are without operators.  Given a common GCS, one operator could be reasonably familiar with the use of another UAS in an emergency.  AeroVironment has begun to address this issue with the common GCS that its Wasp, Raven, and Puma UASs all utilize.  The U.S. Navy is also adopting a common GCS in the MOCU system.  The benefit of the MOCU system is that it will allow one type of GCS to operate robots below, on, and above the sea.
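The common-GCS idea amounts to writing the operator-facing station against one abstract vehicle interface and letting each airframe supply its own adapter.  The sketch below is a generic illustration of that pattern; the class names and command set are hypothetical and do not reflect AeroVironment’s or SPAWAR’s actual code.

```python
# Hypothetical illustration of a common GCS: one abstract vehicle
# interface, with each airframe (Raven, Puma, ...) as an adapter.
from abc import ABC, abstractmethod

class UnmannedVehicle(ABC):
    """Minimal vehicle-agnostic command set a common GCS might expose."""

    @abstractmethod
    def goto_waypoint(self, lat, lon, alt_m): ...

    @abstractmethod
    def status(self): ...

class Raven(UnmannedVehicle):
    def goto_waypoint(self, lat, lon, alt_m):
        return f"Raven en route to ({lat}, {lon}) at {alt_m} m"

    def status(self):
        return "Raven: airborne"

class Puma(UnmannedVehicle):
    def goto_waypoint(self, lat, lon, alt_m):
        return f"Puma en route to ({lat}, {lon}) at {alt_m} m"

    def status(self):
        return "Puma: airborne"

def command_vehicle(vehicle, lat, lon, alt_m):
    # The GCS code path is identical regardless of airframe, which is
    # what lets an operator trained on one UAS pick up another.
    return vehicle.goto_waypoint(lat, lon, alt_m)
```

The training benefit falls out of the structure: the operator’s workflow (`command_vehicle`) never changes, so only the airframe-specific behaviors must be relearned.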
The Raven is one of the most popular small UASs in use today.  It has multiple GCS configurations which can be adapted depending on the environment in which it is being employed.  There are issues with the differing interfaces, however, the manufacturer of the system as well as some of the users are working to overcome them in order to apply the Raven’s capabilities to the greatest extent possible.


References
AeroVironment, Inc. (2016). UAS: RQ-11B Raven. Retrieved from http://www.avinc.com/uas/small_uas/raven/
Fernandes, L. C. (2005). Absorptive and Tinted Contact Lenses for Reduction of Glare. Vision 2005 - Proceedings of the International Congress, (pp. 534-538). London.
Headquarters, Department of the Army. (2006). Army Unmanned Aircraft System Operations. Washington, DC: Headquarters, Department of the Army.
Stroumtsos, N., Gilbreath, G., & Przybylski, S. (2013). An Intuitive Graphical User Interface for Small UAS. San Diego: Space and Naval Warfare Systems Center Pacific.
U.S. Army. (2014, November 4). Retrieved from RQ-11B Raven Small Unmanned Aircraft Systems (SUAS): http://www.army.mil/article/137604/



BIO

Welcome to the blog.  This blog will support the research I conduct during my Embry-Riddle Aeronautical University Human Factors of Unmanned Systems class.

I'm Jeff Roberson from Colorado Springs, CO. I'm a Major in the Air Force Reserve. I served about eight years on active duty as an ICBM operator and a space control test analyst. After leaving active duty in 2010 I joined the Air Force Reserve with the 6th Space Operations Squadron, Schriever AFB, CO, operating the Defense Meteorological Satellite Program (DMSP) constellation. I'm currently an Active Guard Reserve (AGR: a full-time, active-duty reservist) serving as the Chief of Training, where I manage all military training for the DMSP program including qualification, upgrade and proficiency.  While I was a traditional reservist I worked as a human factors engineer with Lockheed Martin developing the Space Based Infrared System (SBIRS) ground system, and worked on the mobile system as well.  So, basically, my entire career has been in unmanned systems.

I have a BS in Behavioral Science with a Human Factors focus from the US Air Force Academy. This is my final class before my capstone to complete my MAS with a Human Factors specialty!  I initially began this adventure in 2007, but took a few diversions along the way.

I have a wife, daughter and son. Our family is wrapping up ski/snowboard season and transitioning to baseball, boating and camping season. We stay extra busy with my daughter swimming five days a week and my son's baseball practice five days a week.  I coach one of my son's baseball teams and am a USA Swimming official during my daughter's swim meets.