TikTok has begun tagging videos of people using Tesla's Autopilot or Full Self-Driving capabilities dangerously with a warning that the actions could cause serious injury. The move is a worrying reflection of the number of such videos that are being posted to the platform. However, TikTok's community guidelines suggest the content shouldn't remain on the platform at all.

Tesla's Autopilot feature is a driver assistance system that helps with things like matching a vehicle's speed to the traffic around it, keeping a vehicle within a lane, and automatic steering. Its Full Self-Driving capability, meanwhile, is ultimately intended to provide fully autonomous driving without the need for any action by the person in the driver's seat. It's currently in beta and is still a long way from providing a fully autonomous experience, but sadly that hasn't stopped users vacating the driver's seat while it or the less advanced Autopilot is engaged, all in the name of some likes online. Predictably, it seems this may already be causing fatal accidents.

Related: What Is The Tesla Roadster SpaceX Package?

As the Wall Street Journal reports, lots of videos are now cropping up online of people pushing the boundaries and carrying out stunts with the Autopilot and Full Self-Driving features in Tesla cars. One individual was arrested by police for riding in the back of a moving Tesla, released on bail, and then videoed himself doing it again. As The Next Web reports, the trend for videos showing people using the Autopilot and Full Self-Driving features without following the guidance provided by Tesla has become so pronounced that TikTok is adding disclaimers to them warning of the dangers, such as: "The action in this video could result in serious injury."

Are TikTok's Autopilot & Full Self-Driving Disclaimers Enough?


The answer to that is, surely: no. The very nature of online trends is that they encourage others to try the same — whether that's a charitable Ice Bucket Challenge video or one of someone operating a car with no one in the driver's seat. Disclaimers about safety will be ignored by people who want to ignore them and, at worst, could even encourage more of the same just as Parental Advisory labels can make music more appealing to the young people they're intended to protect.

While TikTok can hardly be blamed for the videos being made, the most responsible course of action it could take would be removing them altogether. Indeed, its own community guidelines seem to suggest that's precisely what it should be doing. Under a section about 'dangerous acts,' they read: "We do not allow content that depicts, promotes, normalizes, or glorifies such behavior, including amateur stunts or dangerous challenges." The dangerous use of vehicles is even referenced as something that shouldn't be posted.

Ultimately, though, it is Tesla that will need to address these issues. It should not be possible for its Autopilot or Full Self-Driving features to be operated without a person in the driver's seat and it should find a way to address this. Hopefully, news of its in-car driver monitoring cameras being brought online will help to tackle the proliferation of these videos.

More: Everything Tesla Drivers Can Do With The Tesla Mobile App

Sources: Wall Street Journal, The Next Web, TikTok