Hugging Face expands its LeRobot platform with training data for self-driving machines


Hugging Face, the AI dev platform, last year launched LeRobot, a collection of open AI models, datasets, and tools for building real-world robotics systems. On Tuesday, Hugging Face teamed up with AI startup Yaak to expand LeRobot with a training set for robots and cars that can autonomously navigate environments like city streets.

The new set, called Learning to Drive (L2D), is over a petabyte in size and contains data from sensors installed on cars in German driving schools. L2D captures camera, GPS, and "vehicle dynamics" data from instructors and students navigating roads with construction zones, intersections, highways, and more.

There are several open self-driving training sets available from companies including Alphabet's Waymo and Comma AI. But many of these focus on motion planning tasks like perception and tracking, which L2D's creators assert require high-quality annotations, making them difficult to scale.

A sample of the data in the L2D dataset captured by its various sensors. Image Credits: Hugging Face

L2D, by contrast, is designed to support the development of "end-to-end" learning, which its creators claim predicts actions (e.g. when a pedestrian might cross the road) directly from sensor inputs (e.g. camera footage).
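
To make the contrast concrete, here is a minimal, illustrative sketch of the end-to-end idea: a single learned function maps raw sensor input straight to a driving action, with no separate hand-built perception or tracking stages in between. All names, shapes, and the linear model are assumptions for illustration; this is not the L2D or LeRobot API.

```python
import random

# Toy "camera" resolution; real frames would be full images.
FRAME_H, FRAME_W = 4, 6

def fake_camera_frame(seed: int) -> list[list[float]]:
    """Stand-in for a camera image: a small grid of pixel intensities in [0, 1)."""
    rng = random.Random(seed)
    return [[rng.random() for _ in range(FRAME_W)] for _ in range(FRAME_H)]

def end_to_end_policy(frame, weights, bias):
    """Map pixels directly to a single steering command.

    A linear model stands in for the neural network an end-to-end system
    would learn; the key point is that no intermediate "detect pedestrian,
    then decide" pipeline exists -- pixels in, action out.
    """
    pixels = [p for row in frame for p in row]
    return sum(w * p for w, p in zip(weights, pixels)) + bias

rng = random.Random(0)
weights = [rng.uniform(-1, 1) for _ in range(FRAME_H * FRAME_W)]
steering = end_to_end_policy(fake_camera_frame(seed=1), weights, bias=0.0)
print(f"steering command: {steering:+.3f}")
```

In a dataset like L2D, the "episodes" of driving-school footage would supply the supervision for training such a policy: the recorded instructor and student actions serve as target outputs for each sensor frame.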

"The AI community can now build end-to-end self-driving models," Yaak co-founder Harsimrat Sandhawalia and a member of Hugging Face's robotics team wrote in a blog post. "L2D aims to be the largest open-source self-driving dataset, empowering the AI community with unique and diverse 'episodes' for end-to-end training."

Hugging Face and Yaak are planning real-world, "closed-loop" testing of models trained with L2D and LeRobot this summer, deploying them on a vehicle with a safety driver present. The companies are calling on the AI community to submit models and tasks on which they'd like the models to be evaluated, such as navigating roundabouts and parking.
