
2018/05/26

Tesla Model 3 crashes, driver blames Autopilot

Last updated: 2018/05/27 15:00.


YouYou Xue, the owner of a US-market Tesla Model 3 who was touring Europe on a personal trip to show the car to the many Europeans who have reserved one, has had an accident that he blames on the car's driver-assistance system, known as Autopilot, while travelling south on the E65 motorway in Greece, near Florina, headed for Kozani. The driver is unharmed and the car did not catch fire. No other vehicles were involved.

According to Xue's posts, the vehicle was travelling at 120 km/h with Autopilot engaged when it suddenly veered right without warning and struck the traffic divider at an exit. One wheel is completely destroyed and the driver's door would not open properly.

Electrek obtained a further statement from Xue, who says the car veered right five metres before its left side struck the right edge of the traffic divider visible in this photo:



The divider is visible to the right of the lane in this other photo:



Xue reported the accident directly to Elon Musk via Twitter, describing it as an Autopilot malfunction:


Electrek obtained a statement from Tesla, quoted below:

“While we appreciate You You Xue's effort to spread the word about the Model 3, he had been informed that Tesla does not yet have a presence in Eastern Europe and that no connectivity or service is available for vehicles there. In addition, the Model 3 has not yet been approved and homologated for driving outside the United States and Canada. While we have been unable to retrieve data from the vehicle, given that the accident occurred in an unsupported area, Tesla has always been very clear that the driver must remain responsible for the car at all times when using Autopilot. We are sorry that this accident occurred, and we are glad that You You is safe.”


As far as I know, this is the first Autopilot crash involving a Model 3.

My first impression is that Autopilot, which relies largely on visual recognition of lane markings, mistakenly followed the white lines marking the exit lane instead of the more faded ones of the through lane, and then (from its point of view) “corrected” its trajectory at the last moment, before the driver could intervene. But this is only conjecture.

That's all I know for now; I will update this article if there are any developments.


2018/05/26 14:40


Xue has released a more detailed statement, which I quote in full below. In summary:

  • Autopilot was set with a 120 km/h limit, the speed limit for that motorway.
  • The road, with two lanes in each direction, was dry, clear, well lit, and had well-maintained, clearly visible lane markings.
  • At the accident site the road forked: the right lane became an exit lane. The car was in the left lane.
  • About 8 metres before the divider at the fork, the Model 3 suddenly and very forcefully veered right. Xue was glancing at the navigation on his phone and was caught by surprise. He tried to correct the steering, but intervened too late and so grazed the right edge of the divider.
  • Xue acknowledges that he was keeping only one hand on the wheel instead of two, as Tesla recommends during Autopilot use, and that he was not constantly monitoring the vehicle, but says the accident was “directly caused by the Autopilot software seriously malfunctioning and misinterpreting the road” and “could have happened to anyone who does not expect a car travelling at a fast speed in a straight line to suddenly and without warning veer off course”. He admits that after tens of thousands of kilometres of Autopilot use without significant problems, he had become too trusting of the software.
  • Probably YouYou Xue's most significant reflection on the general idea of Autopilot is this: “Autopilot is marketed as a driver assistance feature that reduces stress and improves safety. However, the vigilance required to use the software, such as keeping both hands on the wheel and constantly monitoring the system for malfunctions or abnormal behaviour, arguably requires significantly more attention than just driving the vehicle normally without use of Autopilot.” In other words, at this hybrid level of driver assistance the driver must in practice monitor the car as if it had been handed to a novice driver. Xue also says that if Autopilot carries even a small risk of misreading a clearly marked fork and crashing into it, “it should not be tested in beta, on the open road, and by normal consumers”.

Statement
Re: Model 3 crash on Autopilot

Statement regarding collision
26 May 2018
FLORINA, GREECE
Thank you everyone for your kind wishes and messages of support following the collision late yesterday night. This is an absolutely devastating loss for me and brings a great journey to a sudden end.

I was driving southbound on highway E65 near the city of Florina, Greece. I was headed towards Kozani, Greece, where I planned to charge and spend the night. At this time, I was not tired after having 8 hours of sleep the previous night. I engaged Autopilot upon entering the highway after crossing the border between Macedonia (FYROM) and Greece. My Autopilot maximum speed was set at approximately 120 km/h, the speed limit for this highway. The highway was well-marked, well-maintained, and well-lit. The conditions were dry, and there was no traffic around me. The highway was two lanes in each direction, separated by a concrete median. The highway in my direction of travel divided at a fork, with the #2 right lane changing into the exit lane, and the #1 left lane remaining the lane for thru traffic. I was travelling in the #1 lane.

My left hand was grasping the bottom of the steering wheel during the drive, my right hand was resting on my lap. The vehicle showed no signs of difficulty following the road up until this fork. As the gore point began, approximately 8m before the crash barrier and end of the fork, my Model 3 veered suddenly and with great force to the right. I was taking a glance at the navigation on my phone, and was not paying full attention to the road. I was startled by the sudden change in direction of the car, and I attempted to apply additional grip onto the steering wheel in an attempt to correct the steering. This input was too late and although I was only a few inches from clearing the crash barrier, the front left of the vehicle near the wheel well crashed into the right edge of the barrier, resulting in severe damage.

I was not harmed in the collision, and no medical attention has been sought. I was wearing my seatbelt before and during the collision. None of the airbags deployed.

My Model 3 is not drivable as the front left wheel is completely shattered, and the axle is out of alignment. The damage is severe on the left of the front bumper, running to the front lip of the driver’s door, and is moderately severe from there to the left of the back bumper. The vehicle has been towed to a shop, and at 09:00 today, I will accompany the vehicle on a tow truck to Thessaloniki. I am towing the car there under recommendation from locals as more resources are available there, including resources to repatriate the vehicle back to the United States. I will make a decision soon as to whether or not it makes sense to bring this car back to the United States in an attempt to fix it, as the cost to repair the vehicle may substantially exceed its value after repair or salvage value.

Tesla states in an on-screen warning that both hands should be on the wheel when Autopilot is activated. Furthermore, Tesla states that drivers should be paying attention and monitoring the performance of Autopilot at all times. It is likely true that if all drivers obeyed the warnings surrounding this software, that most of the collisions we hear about in the press would never happen. Autopilot has limitations that currently can only be overcome through human intervention. For example, it cannot detect stationary objects, which explains collisions where Model S has rear-ended stationary vehicles parked on the side of the road.

By looking at my navigation and by not having both hands on the wheel, I was not paying full attention to the road while the vehicle was in Autopilot and was not following Tesla’s directions in regards to the correct use of the software. I want to make it clear that I take responsibility in regards to my actions. With that being said, I do not believe that there are many Tesla owners who, when using Autopilot, always keep both hands on the wheel and provide their undivided attention to monitoring the road and the software. This collision was directly caused by the Autopilot software seriously malfunctioning and misinterpreting the road. This collision could have happened to anyone who does not expect a car travelling at a fast speed in a straight line to suddenly and without warning, veer off course. After tens of thousands of kilometres worth of Autopilot driving without major incidents, I have learned to trust the software. Autopilot provides users with a strong sense of security and reliability as it takes you to your destination and navigates traffic on your behalf. Clearly, I had become too trusting of the software.

Autopilot is marketed as a driver assistance feature that reduces stress and improves safety. However, the vigilance required to use the software, such as keeping both hands on the wheel and constantly monitoring the system for malfunctions or abnormal behaviour, arguably requires significantly more attention than just driving the vehicle normally without use of Autopilot. Furthermore, I believe that if Autopilot even has the small potential of misreading a clearly marked gore point, and has the potential to drive into a gore point and crash into a barrier, it should not be tested in beta, on the open road, and by normal consumers. My experience is not unique as many drivers have reported similar behaviour from Autopilot, and a fatal crash involving Autopilot on a Model X may have been caused by a disturbingly similar malfunction.

Many Tesla fans will likely dismiss this as fully my fault, but I implore those who believe so to take a full step back and put themselves in my shoes, as a driver who had used this amazing software for so long, and who could not have anticipated such a sudden and violent jerk of the wheel to one direction while travelling at a fast speed. I hope that my fellow owners will be less dismissive of various incidents regarding Autopilot, and understand that the general public views these severe collisions differently from the owner community. Tesla is moving quickly into the mass market, and potential customers in that segment aren’t going to ask, “why were both of his hands not on the wheel while the car was in Autopilot?”, rather, they are going to ask “why did the car swerve into the gore point without warning?”. The autonomous driving movement as well as the Tesla community can only get stronger when we tackle these questions and resolve the issues behind them.

I love my Teslas and I have owned a Tesla since 2014. I am upset with myself for being part of the growing list of individuals who have been involved in collisions while their Tesla was on Autopilot. I strongly believe in the capability of self-driving vehicles to not only eliminate all collisions on the road but to revolutionise our society. However, malfunctions like this greatly reduce the public’s confidence in a technology that should indeed be tested and rolled out to the public as soon as it is safe for use. I do not want to cause Tesla damage to its brand or image as I wholly support its mission and I am a big supporter. I only hope that Tesla will investigate this incident to determine what went wrong with the software, and make improvements that will enhance other people’s experiences with the car.

I am very grateful to be alive after what could have easily been a fatal collision. This incident was not more severe thanks to an excellent crash barrier on the Greek highway. I want to thank everyone again for your messages of support. I am honoured to have had this opportunity to spread the EV movement not only around my country but around the world. In closing, I want to address some of my critics who have used this collision to laugh at me or to otherwise make fun of this incident. On this road trip, there have indeed been crazy posts where I push the limit of my car and I have only done these things to share my excitement about my Model 3 with others. Please understand that there was nothing out of the ordinary occurring before this collision. I was not sleeping at the wheel, I was not tired, I was not eating at the wheel (which by the way, I have not done before), no videos were being filmed - the vehicle was being operated normally. I am truly sorry and deeply regret that some of my actions have caused a bad taste in people’s mouths, I ask those people to judge my road trip thus far as a whole and not by my craziest or worst moments. I have met with over 8000 people on the road in three continents and 25 countries, and have demonstrated that not only is it possible to drive an EV across the world, it is absolutely exhilarating and brings along great adventure along the way. I have also seen the potential and power of the EV owners community, which when leveraged, can make a great difference in our world. I ask those who disapprove of my actions to reconsider their stance, and I want to see what positive things can come of this collision.

What happens next on this road trip is uncertain. I will keep everyone posted.



2018/05/27 7:05


I have added a summary of YouYou Xue's extended statement. Personally, after this accident, after the others I have covered on this blog, and after my own experiences with Autopilot, if I decide to follow through on my Model 3 reservation I will not buy Autopilot and will not use it as long as it has this kind of flaw. Teslas remain excellent long-range electric cars, with an unmatched charging network, but in my opinion their Autopilot is not ready for use and risks causing accidents instead of preventing them, because it “reasons” in a way completely different from a human being: situations that even the clumsiest driver would handle without a second thought can throw it into crisis, and that makes it hard to predict when it might fail unexpectedly.


This article reaches you free of charge and without advertising thanks to readers' donations. If you enjoyed it, you can encourage me to keep writing by making a donation yourself, via Paypal (paypal.me/disinformatico), Bitcoin (3AN7DscEZN1x6CLR57e1fSA1LC3yQ387Pv), or other methods.
