A recent report from the National Transportation Safety Board (NTSB) is a chilling reminder of the “in-between” period we find ourselves in as we hurtle toward a world with near-complete autonomous transportation — but aren’t there quite yet.
According to the investigative agency’s report on a fatal crash in which an autonomous vehicle struck a pedestrian in Tempe, Ariz., on March 18, 2018, the Uber Technologies Inc. test vehicle was operating in computer control mode. It failed to recognize that a pedestrian, who was walking a bicycle, was crossing the road in front of the vehicle. The automated driving system (ADS), which controls a vehicle in autonomous mode, classified her as a vehicle, a bicycle, and “and other.”
In addition, the test vehicle’s occupant, who was unharmed in the crash, was distracted during the vehicle’s 19 minutes of autonomous operation. She was looking down at the center console, where her cell phone was placed. She redirected her attention to the road about one second before impact and tried to adjust the steering, but it was too late.
In its comments on this unfortunate accident, the NTSB noted, among other things, that the federal government has failed to provide needed oversight of autonomous testing on public roads.
In fairness, self-driving cars have successfully driven hundreds of thousands of miles on actual roads, and we are swiftly moving toward a point where most, if not all, cars on the road will have some level of autonomy. This in-between time, however, when autonomous vehicles are on our roads for testing purposes, is still cause for concern, especially for student transportation.
Our industry’s greatest concern about autonomous vehicles must be whether these vehicles recognize schoolchildren boarding or exiting the bus as exactly that: children. A school bus is not a delivery truck or a trash truck, for example, that can be passed. Children are not “and other.” Autonomous vehicles must recognize a school bus, with its distinct operating conditions, and react accordingly every time.
In its findings, the NTSB also cited an inadequate safety culture at Uber. At the time of the crash, according to the agency’s report, Uber showed “a lack of risk assessment mechanisms, of oversight of vehicle operators, and of personnel with backgrounds in safety management.”
Since the crash, Uber has “made changes in organizational, operational and technical areas,” according to the report. In addition, the NTSB recommended in the report that Uber “complete the implementation of its safety management system that at a minimum will include safety policy, safety risk management, safety assurance, and safety promotion.”
The school transportation industry puts safety above all else. We expect that others on the road, and the governing bodies that approve their actions, will do the same. Our children deserve nothing less.
Autonomous vehicles, however, are a whole other level of operation. As they become mainstream, safer travel for all may indeed be a reality. The in-between time, though, remains a learning environment for everyone.
To ensure we’re all as safe as possible when autonomous vehicles are on our roads, you can:
• Encourage your drivers to attend conferences where autonomous vehicles are on display or demonstrated.
• Recommend that your drivers take opportunities to ride in vehicles with this technology to see how it operates.
• Make your drivers aware that autonomous vehicles could be operating on the roads they drive.
• Encourage the schools you serve to include autonomous vehicles in the safety conversation.