Tesla CEO Elon Musk has admitted creating fully autonomous vehicles is more difficult than he expected at a time when his company is facing greater public scrutiny for Autopilot accidents.

“Generalised self-driving is a hard problem,” Musk recently wrote on Twitter. “As it requires solving a large part of real-world AI.

“Didn’t expect it to be so hard, but the difficulty is obvious in retrospect.”

The billionaire made his comment in response to a Tesla driver making fun of Musk’s claim in April that a new beta for Tesla’s Full Self-Driving (FSD) software would be widely available “in two weeks”.

Select Tesla drivers were given early access to the FSD beta late last year with users sharing mixed results of the software that, despite its name, is far from being fully autonomous and still requires human intervention.

YouTuber AI Addict published a video in March showing the FSD feature struggling to navigate city streets: the car nearly drives into oncoming traffic to avoid cyclists, almost mounts the curb, and at one point ends up driving on the wrong side of the road.

Tesla’s driving assistance technology, and the promise of ‘full self-driving’, has brought the company under increased scrutiny and made it the subject of lawsuits.

A recent suit attempts to hold Tesla responsible for the death of a teenage boy, killed when a Tesla collided with his father’s Ford Explorer.

According to the New York Times, Benjamin Maldonado was driving along a freeway when he saw a vehicle slow down ahead of him. Maldonado indicated, moved into the right-hand lane, and was struck by the Tesla, which did not slow in time.

Earlier this year a Tesla crashed into a tree at high speed, killing its two occupants. Police initially claimed no one was in the driver’s seat at the time of the accident, leading to speculation that the car was driving itself using the Autopilot feature.

Tesla vigorously denied any fault on the vehicle’s part. The company’s VP of vehicle engineering, Lars Moravy, told investors shortly after the incident that Autopilot could not have caused the accident, noting that the seatbelts were found unbuckled post-crash and that there was evidence to suggest someone had been in the driver’s seat.

Two Tesla vehicles were involved in two fatal crashes on a single day in 2020. And a fatal 2018 crash was directly attributed to Autopilot, although Tesla claimed the driver ignored the audible “hands-on” warnings designed to make drivers pay attention.