When will your baggage drive itself to the aircraft?

Autonomous ground vehicles (AVs) are being investigated and adopted by airports around the world to fulfil functions ranging from automated car parking to baggage transport. Their expanding capabilities help improve airport operational efficiency and service quality. Additionally, AVs are usually electrically powered, so when they replace existing solutions they provide another step towards greener airports and reduced environmental impact.

One area of current focus is the automation of airside baggage transport. Ground handling of baggage remains a very manual activity in many airports, placing physical demands on handlers and bringing significant numbers of people into high-risk environments. Furthermore, the manual and seasonal nature of the operation leads to constant recruiting challenges, wide disparities in experience and service quality, as well as inevitable ground handling-related incidents and accidents at airports[1]. AVs could be a credible solution to improve the efficiency of the turnaround process, reduce the physical demands on handlers, reduce the dependence on seasonal work, and move the handling role towards a higher-skilled, better-paid turnaround management activity. Potentially, it's a win-win for airports and handlers, so what are the challenges?


The challenge of optimising Human-Computer Interaction in Ground Handling

Currently, baggage is packed by ground handlers onto a ‘baggage dolly’ – an unpowered cart that is attached to a towing truck to transport baggage to aircraft.  This truck is driven by the handlers, but has the potential to be made autonomous.  However, interaction with people is still likely to be required when the baggage is loaded from the sorting area to the dolly, and from the dolly to the aircraft.

The introduction of new automation will never be fully efficient unless we look at the whole system, which includes not only the machines but also the humans behind them. Buying autonomous baggage towing trucks is straightforward, but introducing them into operations will affect the entire ground handling process. In fact, there is evidence that the move towards AVs will be limited by the rate of human acceptance[2], and that human actors will use the system only if they feel they can trust it[3;4]. So how do you ensure that? One way is to introduce new working patterns that ensure staff interact with the AVs efficiently and safely at all times. This includes helping operators develop accurate mental models – if they understand the AV system's principles as well as its limitations, their level of trust is likely to improve.

Building trust in the automation is one challenge. Another is the fading of cognitive skills that are not used, i.e. not frequently retrieved from long-term memory[5]. If operators are trained on the system only at a theoretical level, their recollection of these skills will likely be incomplete compared with practical training and daily use. Switching from driving the towing truck to supervising the AV may also increase workload, including high frustration levels[6]. Many studies have shown that monitoring tasks can be hard mental work and stressful for the operator[7]. Particularly threatening is vigilance decrement: operators' ability to spot abnormal behaviour declines over time, typically within the first 5 to 15 minutes of a task.


It is also important to consider how to react when the system fails – for example, an autonomous baggage truck stopping in the middle of the apron. Another potential malfunction is the truck's detection sensors not working properly, resulting in a collision with an obstacle such as an aircraft. A particular challenge is making the system resilient to external impacts such as adverse weather conditions or cyber-attacks. To achieve this resilience, clear cooperation strategies between people and autonomous vehicles have to be introduced, including contingency procedures and instructions for reverting to manual operation in the event of a system failure.
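As an illustration, the contingency logic described above can be sketched as a simple operating-mode state machine. This is a hypothetical model with made-up signal names, not any vendor's actual control software:

```python
from enum import Enum, auto

class Mode(Enum):
    AUTONOMOUS = auto()  # AV drives itself under remote supervision
    DEGRADED = auto()    # fault detected: stop safely and alert the supervisor
    MANUAL = auto()      # handler takes over and drives the truck themselves

def next_mode(mode: Mode, sensor_ok: bool, operator_override: bool) -> Mode:
    """Decide the operating mode from two hypothetical health/override signals."""
    if operator_override:
        return Mode.MANUAL    # the human supervisor can always take control
    if not sensor_ok:
        return Mode.DEGRADED  # e.g. detection sensors fail: stop on the spot
    if mode is Mode.DEGRADED:
        return Mode.DEGRADED  # stay stopped until a human clears the fault
    return mode

# Example: a sensor fault forces the truck out of autonomous mode,
# and it stays stopped until an operator intervenes.
mode = Mode.AUTONOMOUS
mode = next_mode(mode, sensor_ok=False, operator_override=False)
print(mode)  # Mode.DEGRADED
```

The key design point matches the human-factors argument above: the degraded state is "sticky", so the vehicle never silently resumes autonomous operation – a trained human must explicitly take over or clear the fault.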



The future becomes the present 

So, when will your baggage drive itself to the aircraft? If you've flown from Changi Airport in Singapore recently, there's a high chance that it already has. Heathrow Airport is trialling a self-driving electric baggage dolly developed by British Airways and Aurrigo, capable of carrying 40 bags in one journey while using a complex navigation system to decide on the shortest possible route to the aircraft. This targets improved departure punctuality, as well as a reduction in passengers' total time spent at the airport. Autonomous baggage towing trucks are the first step in incorporating robots into daily airport operations, but it certainly won't stop there. Only a few years ago it was difficult to imagine self-service bag drop stations; nowadays they are becoming standard at larger airports (such as ICM's 'Auto Bag Drop' at Changi Airport or Sydney Terminal 1).
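To give a sense of what "deciding on the shortest possible route" involves, the apron road network can be modelled as a weighted graph and searched with a standard algorithm such as Dijkstra's. The sketch below uses made-up stand names and distances and is not the Heathrow trial's actual navigation system:

```python
import heapq

def shortest_route(graph, start, goal):
    """Dijkstra's algorithm over a dict-of-dicts graph {node: {neighbour: metres}}."""
    dist = {start: 0}
    prev = {}
    queue = [(0, start)]
    while queue:
        d, node = heapq.heappop(queue)
        if node == goal:
            break
        if d > dist.get(node, float("inf")):
            continue  # stale queue entry, a shorter path was already found
        for neighbour, metres in graph[node].items():
            nd = d + metres
            if nd < dist.get(neighbour, float("inf")):
                dist[neighbour] = nd
                prev[neighbour] = node
                heapq.heappush(queue, (nd, neighbour))
    # walk back from goal to start to recover the route
    route = [goal]
    while route[-1] != start:
        route.append(prev[route[-1]])
    return list(reversed(route)), dist[goal]

# Hypothetical apron network: baggage hall, two taxiway crossings, two stands.
apron = {
    "hall": {"crossing_a": 300, "crossing_b": 450},
    "crossing_a": {"hall": 300, "stand_12": 250},
    "crossing_b": {"hall": 450, "stand_12": 80, "stand_14": 120},
    "stand_12": {"crossing_a": 250, "crossing_b": 80},
    "stand_14": {"crossing_b": 120},
}
route, metres = shortest_route(apron, "hall", "stand_12")
print(route, metres)  # ['hall', 'crossing_b', 'stand_12'] 530
```

A real system would of course layer live traffic, aircraft movements and restricted zones on top of the distances, but the underlying idea – shortest path over a network of apron roads – is the same.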

At Think, we look to fully incorporate Human Factors expertise into each innovative change to airport operations.  In doing so, we are helping our customer airports to achieve their goals by considering the needs of all actors affected by new solutions, therefore reducing risk and providing resilience to implementation.

Hanna Neroj, ATM Consultant, Think Research




[1] Tabares, D.A., Mora-Camino, F. (2017) ‘Aircraft Ground Handling: Analysis for Automation’. AIAA Aviation Technology, Integration, and Operations Conference (17th), Denver, United States, doi:10.2514/6.2017-3425. hal-01568979.

[2] Smyth, J. (2021) ‘The road to a transportation revolution’. The Ergonomist. Chartered Institute of Ergonomics & Human Factors.

[3] Körber, M., Baseler, E., Bengler, K. (2018). Introduction matters: Manipulating trust in automation and reliance in automated driving. Applied Ergonomics, 66, pp. 18-31, ISSN 0003-6870.

[4] Pop, V. L., Shrewsbury, A., Durso, F. T. (2015). Individual Differences in the Calibration of Trust in Automation. Human Factors, 57(4), pp. 545-556. doi:10.1177/0018720814564422.

[5] Bainbridge, L. (1983). Ironies of Automation. Automatica, 19(6), pp. 775-779.

[6] Warm, J. S., Dember, W. N., Hancock, P. A. (1998). Workload and Vigilance. Proceedings of the Human Factors and Ergonomics Society Annual Meeting, 42(10), pp. 769-771.

[7] Warm, J. S., Parasuraman, R., Matthews, G. (2008). Vigilance Requires Hard Mental Work and Is Stressful. Human Factors, 50(3), pp. 433-441.