Safety group claims to test Tesla's autopilot and the results weren't good
Featured Image Credit: TikTok/@candlejennerofficial

A video of a Tesla car being tested on autopilot shows it smashing right through a dummy in the road.

Footage of a Tesla, which was reportedly driving on full autopilot, appears to show the car failing to stop and hitting several child-sized mannequins.

When it comes to letting a car drive itself, technology in the sector has come on in leaps and bounds.

Theoretically, a car on autopilot can now drive itself around without a human to tell it what to do, but we're not quite in that utopian future where our cars chauffeur us from place to place. Watch footage from the test below:

In the video of the test, which was conducted by campaign group The Dawn Project, the Tesla is seen accelerating down a road where a child-sized dummy is standing directly in its path.

As it gets closer and closer, the car on autopilot shows no sign of stopping, colliding with the dummy and knocking it down in the road.

The child-sized dummy is smashed apart, left lying in pieces across the road.

Even if the car can drive itself, Tesla have been clear that someone should be behind the wheel.
Imaginechina Limited / Alamy Stock Photo

The Dawn Project described the test results as 'deeply disturbing'.

The group put the latest version of Tesla's self-driving software through its paces and found that it repeatedly crashed into a stationary child-sized mannequin placed in the vehicle's path.

This is not the first time Tesla's autopilot function appears to have gone wrong with dangerous results.

Teslas on autopilot have smashed into other cars, in one case because the driver was watching a movie on his phone.

Earlier this year one even hit a plane after a Tesla owner summoned their car to them using the autopilot function.

It's reached the point where the US government has launched a formal investigation into Tesla's autopilot systems, though Tesla insists its cars on autopilot are involved in accidents at a lower rate than the average driver.

The autopilot function is under investigation after a number of incidents involving Teslas.
Taina Sohlman / Alamy Stock Photo

In the past, Tesla have been very clear that the autopilot function on their cars is not something you're supposed to let handle everything by itself.

The company have said the function is intended to 'assist' drivers with 'the most burdensome parts of driving', not to ferry people around by itself.

Tesla say their autopilot system in its current form requires 'active driver supervision' and does 'not make the vehicle autonomous'.

The autopilot function has also done some good for drivers; it once saved the life of a drunk driver who passed out at the wheel.

UNILAD has reached out to Tesla for comment.

If you have a story you want to tell, send it to UNILAD via [email protected]

Topics: Tesla, Cars, Technology