A Tesla Model Y driver has filed a complaint, claiming their car was in Full Self-Driving (FSD) mode when it crashed into another vehicle. Despite its name, FSD is not a truly autonomous driving system. Instead, it is Tesla’s latest iteration of its premium assisted driving technology, building upon its Autopilot and Enhanced Autopilot offerings. Currently available as a beta product, FSD includes all the features of the lower tiers while also promising additional functionality, such as automated steering on city streets and the ability to stop at traffic lights.

The US National Highway Traffic Safety Administration (NHTSA) uses a six-level scale to describe driverless vehicles. Level 0 means no autonomous features, while Level 5 refers to full automation, whereby no human driver is necessary for a vehicle to travel safely on public roads. Based on its present features, FSD is classified as Level 2, which is partial automation. Indeed, Tesla mandates that all users of its driver-assistance systems keep their hands on the steering wheel when driving.

Related: Elon Musk & Tesla Don't Agree On When Full Self-Driving Will Be A Reality

Earlier this month, in Brea, California, a 2021 Model Y was hit by another vehicle as the driver was turning left. As noted by The Next Web, a statement on the NHTSA’s website explains that the unnamed driver claims their car was in FSD mode and that it steered into the wrong lane during the turn. The individual says the Model Y provided a warning halfway through the maneuver, but that their attempt to correct the trajectory was to no avail and the vehicle was struck on the driver’s side. Fortunately, there were no reported injuries, but the NHTSA says it is now investigating the incident.

Tesla Recalled Vehicles Last Month

The incident comes after Tesla recalled almost 12,000 cars last month due to an FSD issue. The company released an FSD beta 10.3 update on October 23rd, but some drivers soon discovered issues relating to their vehicles’ forward collision warning and emergency braking features. The company temporarily rolled users back to an earlier build before issuing a patch. Of course, Tesla’s software is explicitly being offered as a beta, so it’s unclear whether the company can or will be held liable.

Whatever standards Tesla will be held to for its nascent technology, it’s worth noting that regulators are paying attention. This latest action by the NHTSA follows a probe into Tesla cars reportedly crashing into emergency vehicles while in Autopilot mode. It’s also of note that this is not even the first investigation relating to FSD: the California Department of Motor Vehicles is investigating Tesla over its use of the phrase “full self-driving” and whether it amounts to false advertising.

Next: Tesla Model 3 And Y Are Nearly $10,000 More Expensive Than A Year Ago

Sources: NHTSA, The Next Web