Tesla’s semi-driverless car technology is under intense scrutiny after its vehicles were involved in two fatal crashes in a single day.

In California last week, a Tesla Model S sedan left a freeway and ran a red light at high speed, hitting a Honda Civic and killing two people in that car.

On the same day, a Tesla Model 3 hit a parked fire truck on an Indiana freeway, killing a passenger in the Tesla vehicle.

The special crash investigation unit of the National Highway Traffic Safety Administration (NHTSA) is now investigating the California crash and is considering whether to also investigate the Indiana incident.

It is currently unclear if the Tesla vehicles were operating in Autopilot mode at the time of the accidents.

The crashes follow an incident in Connecticut last month in which a Tesla Model 3 struck a police car on a highway while operating in Autopilot mode.

There were no injuries as a result of the crash.

Tesla’s Autopilot is a semi-driverless technology that aims to keep the vehicle in its lane at a safe distance from other cars.

It can also change lanes without assistance from the driver.

Tesla says the technology is only meant to assist drivers and that they must keep attention on the road and be ready to intervene at any time.

But the technology has been criticised for allowing drivers to become too reliant on it and not pay attention to the road.

The NHTSA has now launched investigations into 13 separate Tesla crashes involving Autopilot since 2016.

If it is revealed that Autopilot was engaged at the time of the recent fatal crashes, it will intensify pressure and scrutiny on Tesla just weeks after chief executive Elon Musk claimed that fully driverless cars are “imminent” and will be available this year.

The Center for Auto Safety in Washington has said it wants Tesla to restrict the use of Autopilot to four-lane divided highways without any cross traffic, along with better technology to monitor drivers and ensure they are focused on the road.

“At some point, the question becomes: how much evidence is needed to determine that the way this technology is being used is unsafe?” the centre’s executive director, Jason Levine, told the Star Tribune.

“In this instance, hopefully these tragedies will not be in vain and will lead to something more than an investigation by NHTSA.”

It’s not the first time that Tesla’s Autopilot has been involved in fatal car crashes.

In March 2018 the driver of a Tesla Model X was killed when the car crashed head-on into the safety barrier of a road divider on a highway in Mountain View, California.

The subsequent investigation found that the driver had ignored warnings while the car was in Autopilot mode and did not return his hands to the wheel as required.

In late 2017 the National Transportation Safety Board found that a driver’s “over-reliance” on the Autopilot feature had played a major role in a fatal accident in the US in 2016.

A Tesla Model S sedan running on Autopilot mode had crashed into a truck that had turned in front of it without giving way, killing the driver of the Tesla car.

Although the crash stemmed from human error, with the truck driver failing to give way when turning onto the highway, the National Transportation Safety Board found that the Tesla driver had not been watching the road at the time and did not override the Autopilot feature when they could have.