

HMI Design
Product Design


Aerial Vehicle

August 2021

A Case Study 

The Drive

Joining planes and drones in the sky are autonomous flying vehicles. Organizations continue to invest in autonomous technology that can help reduce traffic, improve order delivery, and much more. While self-driving cars have been gaining traction in the news, ground transportation isn’t the only mode of transportation looking to go autonomous.

Flow for phase 1

[Image: Flow for phase 1]

The Fascination


Flying Carpet, by Viktor M. Vasnetsov, 1880. By 130 BC, a magic carpet supposedly flew King Phraates II of Parthia to battle. Flying carpets have graced folktales from Russia to Iraq. They combine two once-fantastic dreams: autonomous vehicles, and flight.

Companies such as Ford, Mercedes, and Tesla are racing to build autonomous vehicles for a radically changing consumer world. Ford, for instance, recently tripled its investment in its autonomous vehicle fleet and is testing 30 autonomous Ford Fusion hybrids in California, Michigan, and Arizona. And yet, the fingerprints of tech history can be seen in almost every aspect of their exciting new capabilities.

Levels of Autonomy

Level 0 (No Driving Automation)

Most vehicles on the road today are Level 0: manually controlled. The human performs the entire "dynamic driving task," although there may be systems in place to help the driver. An example is an emergency braking system―since it technically doesn't "drive" the vehicle, it does not qualify as automation.

Level 1 (Driver Assistance)

This is the lowest level of automation. The vehicle features a single automated system for driver assistance, such as steering or accelerating (cruise control). Adaptive cruise control, where the vehicle maintains a safe distance behind the car ahead, qualifies as Level 1 because the human driver monitors the other aspects of driving, such as steering and braking.

Level 2 (Partial Driving Automation)

This level involves advanced driver-assistance systems (ADAS). The vehicle can control both steering and accelerating/decelerating. The automation still falls short of self-driving because a human sits in the driver's seat and can take control of the car at any time.

[Image: Levels of driving automation]

Level 3 (Conditional Driving Automation)

Level 3 vehicles have “environmental detection” capabilities and can make informed decisions for themselves, such as accelerating past a slow-moving vehicle. But―they still require a human override. The driver must remain alert and ready to take control if the system is unable to execute the task.

Level 4 (High Driving Automation)

The key difference between Level 3 and Level 4 automation is that Level 4 vehicles can intervene if things go wrong or there is a system failure. In this sense, these cars do not require human interaction in most circumstances. However, a human still has the option to manually override.


Level 4 vehicles can operate in self-driving mode. But until legislation and infrastructure evolve, they can only do so within a limited area (usually an urban environment where top speeds reach an average of 30mph). This is known as geofencing. As such, most Level 4 vehicles in existence are geared toward ridesharing.
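Geofencing reduces, in essence, to a containment check before self-driving mode is enabled. A minimal sketch in Python, assuming a circular operating area; the center coordinates, radius, and the 30 mph speed cap are illustrative placeholders, not values from any real deployment:

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    r = 6371.0  # mean Earth radius, km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def self_driving_allowed(lat, lon, speed_mph,
                         center=(37.7749, -122.4194),  # hypothetical urban center
                         radius_km=10.0, speed_cap_mph=30.0):
    """Enable self-driving only inside the geofence and under the speed cap."""
    inside = haversine_km(lat, lon, *center) <= radius_km
    return inside and speed_mph <= speed_cap_mph
```

Real systems use polygonal boundaries and live map data, but the decision structure is the same: position and speed in, a go/no-go flag out.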


Level 5 (Full Driving Automation)

Level 5 vehicles do not require human attention―the “dynamic driving task” is eliminated. Level 5 cars won’t even have steering wheels or acceleration/braking pedals. They will be free from geofencing, able to go anywhere and do anything that an experienced human driver can do. Fully autonomous cars are undergoing testing in several pockets of the world, but none are yet available to the general public. 
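The six levels above can be summarized in a small lookup table; a sketch in Python, with the role descriptions abbreviated from this section:

```python
# SAE levels of driving automation, condensed from the descriptions above.
SAE_LEVELS = {
    0: ("No Driving Automation", "human performs the entire dynamic driving task"),
    1: ("Driver Assistance", "one automated system, e.g. adaptive cruise control"),
    2: ("Partial Driving Automation", "ADAS controls steering and speed; human supervises"),
    3: ("Conditional Driving Automation", "environmental detection; human must stay ready to take over"),
    4: ("High Driving Automation", "self-driving within a geofenced area; manual override optional"),
    5: ("Full Driving Automation", "no human attention required; no geofencing"),
}

def describe(level):
    """Return a one-line summary for a given automation level."""
    name, role = SAE_LEVELS[level]
    return f"Level {level} ({name}): {role}"
```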

History of Autonomous Vehicles




Slides of History


Da Vinci’s Self-Propelled Cart

Centuries before the invention of the automobile, Leonardo da Vinci designed a cart that could move without being pushed or pulled. Springs under high tension provided the power to the cart, and steering could be set in advance so the cart could move along a predetermined path. A distant precursor to the car, the device is sometimes considered the world’s first robot.​


Self-driving computers shrink

The computer systems that make self-driving cars work have gotten smaller over the years, and morphed from huge installations to discrete units mounted in trunks. They all handle roughly the same functions: taking in raw data from sensors, matching it with known models, and executing the appropriate movements by sending signals to the car’s steering, throttle, brakes, turn signals, and other controls.
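The pipeline described here (raw sensor data in, model matching, control signals out) is the classic sense-plan-act loop. A toy sketch; every function, threshold, and data field below is illustrative rather than taken from any real system:

```python
def sense(raw):
    """Fuse raw sensor readings into a simple world estimate (illustrative)."""
    return {"obstacle_ahead": raw["lidar_min_m"] < 10.0,
            "speed": raw["speed_mps"]}

def plan(world, target_speed=13.0):
    """Match the world estimate against simple driving rules."""
    if world["obstacle_ahead"]:
        return {"throttle": 0.0, "brake": 0.8, "signal": "hazard"}
    if world["speed"] < target_speed:
        return {"throttle": 0.4, "brake": 0.0, "signal": None}
    return {"throttle": 0.0, "brake": 0.0, "signal": None}

def act(command):
    """Send the command to the car's actuators (here, just return it)."""
    return command

def control_step(raw):
    """One iteration of the sense-plan-act loop."""
    return act(plan(sense(raw)))
```

Whether the hardware fills a cargo bay or a small trunk-mounted unit, it runs some version of this loop many times per second.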


Computer in the Stanley car (DARPA), 2005. The computers and electronics that powered Stanley completely filled the car's cargo bay.


Computer in Google's self-driving Prius, 2010 (top), and Lexus, 2012 (bottom)

Autonomous functions in common aerial vehicles

Autopilot in Airplane

An autopilot is a system that manages the aircraft under certain conditions using the vehicle's hydraulic, mechanical, and electronic systems. It can follow a flight plan, holding the aircraft's speed, altitude, and heading steady. Pilots typically let the autopilot control the aircraft for most of the flight, except during takeoff and landing. Autopilots are most common on passenger aircraft.
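Holding speed, altitude, or heading comes down to feedback control. A minimal altitude-hold sketch; the gains and the simplified actuator model are invented for illustration and are not from any real autopilot:

```python
def altitude_hold_step(altitude, climb_rate, target, dt=0.1, kp=0.5, kd=1.0):
    """One control step: command a climb rate toward the target altitude."""
    error = target - altitude
    commanded_climb = kp * error - kd * climb_rate   # derivative term damps oscillation
    climb_rate += (commanded_climb - climb_rate) * dt  # crude actuator lag
    altitude += climb_rate * dt                        # integrate motion
    return altitude, climb_rate

def simulate(start=1000.0, target=1100.0, steps=2000):
    """Run the hold loop until the aircraft settles at the target altitude."""
    alt, vz = start, 0.0
    for _ in range(steps):
        alt, vz = altitude_hold_step(alt, vz, target)
    return alt
```

The same proportional-plus-damping structure, with different sensors and gains, underlies speed and heading hold as well.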


HOV in helicopter

HOV stands for automatic hover. It is included in automatic flight control systems (AFCS) that offer four-axis control. Operating a helicopter requires the pilot to use three main control systems. The pilot uses the collective stick to control the amount of lift generated by the main rotor. The cyclic stick is used to move forward, backward, right, or left. The rudder (anti-torque) pedals are used to control the direction the helicopter's nose is pointing (yaw).

The HOV system controls all these elements, allowing the helicopter to hover in a fixed position. 


Unmanned aerial vehicle/ Drone

An unmanned aerial vehicle (UAV), commonly known as a drone, is an aircraft without any human pilot, crew, or passengers on board. UAVs are a component of an unmanned aircraft system (UAS), which includes:

  • UAV

  • Ground control station

  • Radio controller/ Transmitter

UAVs may be operated under remote control by a human operator, as remotely piloted aircraft (RPA), or with various degrees of autonomy, such as autopilot assistance, up to fully autonomous aircraft that have no provision for human intervention.


VTOL (Vertical Take-Off and Landing)

A vertical take-off and landing (VTOL) aircraft is one that can hover, take off, and land vertically.

This classification includes a variety of aircraft types: fixed-wing aircraft as well as helicopters and other aircraft with powered rotors.

eVTOL (Electric Vertical Take-Off and Landing)

An electric vertical take-off and landing (eVTOL) aircraft uses electric power to hover, take off, and land vertically.

This technology came about thanks to major advances in electric propulsion (motors, batteries, electronic controllers) and the growing need for new vehicles for urban air mobility (air taxis).


Flying car

A flying car or roadable aircraft is a type of vehicle that can function as both a personal road vehicle and an aircraft. 

Many prototypes have been built since the early 20th century, using a variety of flight technologies. Most have been designed to take off and land conventionally using a runway, although VTOL projects are increasing.

Flying cars are also a popular theme in fantasy and science fiction stories.


Urban Air Mobility

Urban air mobility leverages the sky to better link people to cities and regions, giving them more possibilities to connect.

Urban air mobility can positively contribute to a multimodal mobility system. As urban transport heads into the sky, sustainable city development becomes possible.

Multimodal mobility system

 Multimodal mobility is defined as a mobility behavior that is characterized by flexible usage and a combination of different transport modes according to the situation and to the available transport means.


Importance of Urban Air Mobility

By 2030, 60% of the world's population will be urban. This significant population growth is expected to create a real need for innovative mobility options as ground infrastructure becomes increasingly congested. Providing people with a safe, sustainable, and convenient solution that leverages the airspace above cities could be a solution.


Nexa Advisors provides corporate and strategic financial advisory services to the aerospace, transportation, logistics, and homeland security sectors.


Across 75 leading cities worldwide, up to $318 billion will be spent on urban air mobility between 2020 and 2040.

Nexa estimates that as many as 28,000 eVTOL aircraft will enter service over the next 20 years.

Major players in eVTOL Space


CityAirbus

CityAirbus is an all-electric, four-seat, multi-copter vehicle demonstrator that focuses on advancing remotely piloted electric vertical take-off and landing (eVTOL) flight. 

The demonstrator is remotely piloted, paving the way for autonomous flight.

CityAirbus has a multi-copter configuration featuring four ducted high-lift propulsion units. Its eight propellers are driven by electric motors at around 950 rpm to ensure a low acoustic footprint, and its single-failure-tolerant architecture ensures safety. Cruise speed will be approximately 120 km/h on fixed routes, with up to 15 minutes of flight endurance. A four-passenger capacity makes it ideal for aerial urban ridesharing.


Vahana Airbus

Vahana is an all-electric, single-seat, tilt-wing vehicle demonstrator that focused on advancing self-piloted electric vertical take-off and landing (eVTOL) flight.

Its eight electric motors and tandem tilt-wing configuration enabled Vahana to achieve both vertical take-off and landing and cross-city flight range on battery power alone. In total, Vahana flew over 138 full-scale test flights.


The project team de-risked the core elements of autonomy, including real-time, detect-and-avoid capabilities for both airborne and ground-based hazards. 


EHang 216

EHang aims at making safe, autonomous, and eco-friendly air mobility accessible to everyone by providing comprehensive solutions including hardware and software configuration, takeoff and landing platform establishment, and other operational services.

The EHang command & control center is equipped with an intuitive AAV command and control system that provides five core functions: monitoring, dispatching, controlling, early warning, and cluster management.


In case of a failure or malfunction of an EHang AAV, the command & control center can take appropriate measures to remotely control the aircraft and land it safely at the nearest available location.


Volocity by Volocopter

The VoloCity air taxi flies quietly and is emission-free in flight. Combined with VoloPorts, it connects key transport hubs like main stations and airports to enable seamless, faster, and more convenient travel within the city.  

The VoloCity is a two-seat, 18-rotor eVTOL developed to the highest aviation standards and the requirements of the European Union Aviation Safety Agency (EASA).

The 18 motors inside VoloCity are powered by nine rechargeable batteries. The VoloCity battery swapping system enables short flight stops with minimum time on the ground.


Wisk (Kitty Hawk) Cora

Boeing and Kitty Hawk announced the formation of a joint venture called Wisk to solidify their partnership. Wisk focuses entirely on developing the two-seat Cora, which is expected to fly routes of up to around 60 miles at speeds of approximately 112 mph.

The vertical lift system features 12 independent rotors with only one moving part, the fan. An issue with one rotor is automatically handled so the flight can continue.

If a situation arises where it needs to land without the use of lift fans, every aircraft is equipped with a parachute for a safe landing.


Building an ecosystem for eVTOL and ATS

Overview of urban air taxi service (ATS)

An ATS is anticipated to have multiple segments.

An ATS shares the characteristics of common transportation modes, such as regular taxis and subways.

The experience of requesting an ATS would be similar to the process of booking standard on-demand mobility (ODM) service. 

Typically, a registered customer will enter the pickup and drop-off locations using a ridesharing application. 

Depending on the trip information, the platform will estimate the fare and travel time for all applicable transportation modes, such as air taxis and regular taxis.
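The platform's per-mode estimate can be sketched as a simple cost model; all base fares, per-kilometre rates, and average speeds below are hypothetical placeholders, not real pricing:

```python
MODES = {
    # mode: (base fare, price per km, average speed km/h) -- hypothetical values
    "air_taxi":     (30.0, 4.0, 110.0),
    "regular_taxi": (3.0,  1.5,  25.0),
}

def estimate(distance_km):
    """Return an estimated fare and travel time (minutes) for each mode."""
    out = {}
    for mode, (base, per_km, speed) in MODES.items():
        fare = base + per_km * distance_km
        minutes = distance_km / speed * 60.0
        out[mode] = {"fare": round(fare, 2), "minutes": round(minutes, 1)}
    return out
```

For a 20 km trip this model makes the air taxi far more expensive but several times faster, which is exactly the trade-off the booking screen would surface to the customer.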


Main Components of Ecosystem

As a user of urban air mobility, which connects cities to one another and cities to the countryside, one typically encounters three main components.

[Image: Three main components of the ecosystem]

The mobile application and the HMI can both be categorized under human-computer interaction (HCI).

Together, these components give the user a complete interaction with the vehicle.

Phase 1 concentrates on Human-Machine Interaction (HMI).


When different parts, each with its own function, are put together to perform one task, the result is called a machine.

Machine Interface

A machine interface is the medium through which a user interacts with a machine.


Machines come into existence

3000 BC: tools made for hunting

HMI Timeline

[Image: HMI timeline]

Types of HMI

  • Push-button replacer: the interface has buttons, with all functions located in one place.

  • The data handler: the HMI screen is big enough for graphs, visual representations, and production summaries.

  • The overseer: a Windows-based system used for monitoring the entire site.


Channels, Directions, and Types

In the field of human-machine interaction, the potential channels for interaction are predetermined by the available human senses: hearing, seeing, touching, smelling, and tasting.​

Among these, the first three are the ones to be focused on in this study.

Based on the available channels, there are two potential directions for the transfer: either from human to machine or from machine to human.


These interactions can be categorized into two different types - both based on an information transfer:

  • Information transfer without a resulting change of parameters (e.g., displaying the current speed, altitude, or environmental temperature);

  • Information transfer with a resulting change of parameters (e.g., desired cockpit-temperature, change of speed).
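These two types map naturally onto read-only telemetry versus parameter-changing commands. A toy model in Python; the parameter names and values are hypothetical:

```python
class VehicleHMI:
    """Toy model: displays are read-only; commands change parameters."""

    def __init__(self):
        # Hypothetical vehicle state exposed to the interface.
        self.state = {"speed_kmh": 120.0, "altitude_m": 300.0, "cabin_temp_c": 21.0}

    def display(self, key):
        """Type 1: information transfer without changing any parameter."""
        return f"{key}: {self.state[key]}"

    def command(self, key, value):
        """Type 2: information transfer that changes a parameter."""
        self.state[key] = value
        return self.display(key)
```

The distinction matters for interface design: type-1 channels only need to be legible, while type-2 channels also need affordances for input and confirmation.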

User interface - general purpose and requirements

In autonomous cars, the HMI can be seen as a communication medium between (human) driver and autonomous system that incorporates the human, the machine, and data (e.g., ambient temperature, front/side radar sensor data, driver monitoring data from eye-tracking cameras or heart rate monitors) in order to give behavioral recommendations, assess the driver’s state or plan needed information transfer by the system.​

The main requirements for HMI in autonomous vehicles are to:

  • meet expectations of the driver

  • avoid mode confusion by displaying the correct data for a given driver state

  • present information in a user-friendly manner.


Transparency and Trust

Based on the work of Fink and Weyer, one central characteristic of autonomous systems is their partial impenetrability, which leads users to perceive the system's available functionalities as a black box without intervention possibilities.

Studies have suggested that the HMI should provide a transparent representation of the decision-making process of the machine as an appropriate insight into the system provides transparency, which in turn can contribute positively to a feeling of trust.

To make the machine's behavior understandable, and thereby encourage trust, the following process-specific principles should be followed.

The driver should:

  • be informed that the system will control the vehicle based on established practices and traffic laws;

  • know how the system prioritizes its behavior when multiple options are possible;

  • know about maneuvers that might interrupt the current one, so that an upcoming decision of the system does not cause surprise or fear;

  • be able to perceive the intention of the system (e.g., planned maneuvers: why, how, and when?);

  • be shown that the system knows about environmental and temporal constraints (e.g., vehicles above, in front, or behind).

Channels of interaction

Given the human senses available, this communication between driver and system takes place in the following ways: 

  • Visual 

  • Auditory

  • Tactile.



The transfer of information in visual user interfaces is usually done via a form of display - the characteristics range from conventional, mechanical speedometers to software-supported screens such as central displays in cockpit dashboards or head-up displays (HUD) in the driver’s line of gaze.

Based on the research of Reuschenbach et al., status information about the system could be shown to the driver (e.g., a 3D model of the car, information about sensor states).

Also, the display could present detailed information about the ride (the current speed, the current position on a map, the planned route, the estimated time of arrival) as well as information regarding the exterior of the vehicle (the ambient temperature or distance data from front-radar sensors).


With regard to the research of Bazilinskyy, auditory user interfaces have specific added values. 

In direct comparison to visual interfaces, they work omnidirectionally: auditory information can be perceived from any direction, which gives it special importance for automated driving, where the driver may not be focusing on the road or dashboard during the journey. Spoken language also provides a higher possible information density (information conveyed per unit of time).



Tactile user interfaces are widely established in the HMI of current, commercially available cars. They appear in different forms (every type of button that is linked to a function or a touch screen embody tactile user interfaces). 

They are used to transfer information unidirectionally from the user to the system in order to enter commands.


HMI design requirements for autonomous aircraft

Generally, the interaction between passenger and automated aircraft should be designed as simple and minimalistic as possible to make this mobility offer available to a broad target group.

Thus, heterogeneous types of passengers can be expected, including both technology-affine and technology-averse users. A simple and minimalistic interaction concept allows both types of users to handle the system in the same way, with a special focus on target-oriented handling of the user interfaces.

The HMI must show general flight data:

  • current location, altitude, speed over the ground, route, estimated time of arrival, and environmental data (e.g., the outside temperature or the temperature at the destination, time, weather forecast for the destination);

  • entertainment offers (e.g., digital newspapers, audio/video streaming services, information about points of interest near the destination).

Beyond flight data, the HMI should:

  • provide transparency of the autonomous system to further increase the passenger's trust in it: an estimate of the prospective comfort during the flight (the expected extent of turbulence based on the weather forecast), information about upcoming changes of flight state (acceleration or deceleration processes), and special flight circumstances, including the system's intended precautions and reactions (route change, adaptation of altitude or speed);

  • enable human influence on the system: the display should offer touch functionality so the passenger can change the information shown (to another aspect of the journey or another level of detail), enter commands based on current demands (onboard entertainment), or contact off-board human flight supervisors (an external control center) in case of questions or insecurities;

  • monitor the passenger with appropriate monitoring devices, as a fundamental functionality of an autonomous aircraft's HMI;

  • offer a 3D view of the vehicle, giving the user more transparency regarding vehicle safety.

A situation may occur in which the aircraft is in real danger, or in which the passengers merely feel endangered. The above-mentioned monitoring data could then be used to establish different levels of escalation in order to provide support to the passengers.
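The escalation idea can be sketched as a mapping from monitoring data to a support level; the heart-rate thresholds and the actions attached to each level below are purely illustrative assumptions:

```python
def escalation_level(heart_rate_bpm, system_fault):
    """Map passenger monitoring data and system status to an escalation level.

    Levels (illustrative):
      0 - no intervention needed
      1 - subtle support: show flight status and comfort estimate
      2 - proactive reassurance: explain the current maneuver on the display
      3 - hand over to an off-board human flight supervisor
    """
    if system_fault:
        return 3
    if heart_rate_bpm > 120:
        return 2
    if heart_rate_bpm > 100:
        return 1
    return 0
```

A production system would fuse several signals (eye tracking, voice, seat sensors) instead of a single heart-rate reading, but the staged-response structure would be similar.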

HMI design of existing eVTOL players


[Image: HMI designs of existing eVTOL players]



HMI design 

Low Fidelity prototypes



[Image: Low-fidelity prototypes]


The logo and brand name

[Image: Logo and brand name]

The logo is inspired by a sand clock (hourglass), representing time.

Here, the name "Hour" stands for the easy and time-efficient transportation offered by the autonomous vehicle.

Colour Schemes

[Image: Colour schemes]

The colour schemes represent boldness, luxury, and efficiency.



HMI Screens - High Fidelity

[Image: High-fidelity HMI screens]




Towards the end of phase 1, I had to present my research as an interactive piece for my academic exhibition. The research was condensed into two charts printed as a banner, and the HMI prototype was displayed on an iPad placed on a stand that simulated the actual HMI in an autonomous vehicle. Visitors were given a chair to sit in and interact with the HMI, to get an overall feel of being inside the vehicle.

Construction of Stand

Pencil Sketches 


Small Scale Model


End of phase 1

All the additional work for the phase 1 project is recorded here.

Phase 2 is ongoing and will be uploaded at the same link.

