The official Tesla Optimus account doesn’t post often, but it did today. The latest post includes a new video of the TeslaBot that shows the progress the company is making on its humanoid robot.
In the post, we learn that TeslaBot is now running an end-to-end neural network: video in, controls out. This mirrors the approach Tesla uses in its cars, referred to as Version 12 (V12), which we saw live-streamed by Elon Musk for the first time last month.
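To make the "video in, controls out" idea concrete, here is a toy sketch of what such a mapping looks like in code. This is not Tesla's network, just an illustration of the input/output shape of the idea: pixels go in one end, bounded joint commands come out the other, with no hand-written rules in between. The function name, frame size and number of joints are all made up for the example.

```python
import numpy as np

def end_to_end_policy(frame: np.ndarray, weights: np.ndarray) -> np.ndarray:
    """Toy 'video in, controls out' mapping: flatten pixels, apply one linear layer.

    A real system would use a deep network trained on many demonstrations;
    this single layer only illustrates the interface of the idea.
    """
    features = frame.astype(float).flatten()   # raw pixels as the only input
    controls = np.tanh(weights @ features)     # bounded joint commands in [-1, 1]
    return controls

# A fake 8x8 greyscale frame and random weights producing 4 joint commands.
rng = np.random.default_rng(0)
frame = rng.random((8, 8))
weights = rng.standard_normal((4, 64)) * 0.1
commands = end_to_end_policy(frame, weights)
```

In a trained system the weights would be learned from recorded demonstrations rather than drawn at random, but the shape of the pipeline is the same.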
The video starts by showing that TeslaBot is now able to self-calibrate, which will be important if you unbox your new robot in a new location.
We then move on to a view from the on-board cameras, showing the robot identifying critical joints on the hands and wrists, indicated by circles. We also see the video rendered with different layers, with a blue overlay of structures very similar to human fingers, hands and forearms, but without the wiring and cabling necessary to make the robot work.
The description tells us that, using only vision and joint position encoders, the robot can precisely locate its limbs in space. This will be important not only for interacting with objects, but also for getting itself off the floor should it ever fall over.
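The joint-encoder half of that claim is classic forward kinematics: if you know each joint's angle and each link's length, you can compute where the limb ends up in space. A minimal sketch for a two-link planar arm, with hypothetical link lengths chosen purely for illustration:

```python
import math

def limb_endpoint(shoulder: float, elbow: float,
                  upper_len: float = 0.3, fore_len: float = 0.25):
    """Locate a limb endpoint from joint-encoder angles (planar 2-link arm).

    Angles in radians, lengths in metres. The link lengths here are
    made up; they only illustrate how encoder readings fix a limb's
    position in space.
    """
    # Elbow position from the shoulder angle alone.
    x1 = upper_len * math.cos(shoulder)
    y1 = upper_len * math.sin(shoulder)
    # Endpoint adds the forearm, rotated by shoulder + elbow.
    x2 = x1 + fore_len * math.cos(shoulder + elbow)
    y2 = y1 + fore_len * math.sin(shoulder + elbow)
    return x2, y2

# Arm fully extended along the x-axis: endpoint at (0.55, 0.0).
x, y = limb_endpoint(0.0, 0.0)
```

Vision then serves as a cross-check on this estimate, which is presumably how the overlay of circled joints in the video is produced.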
The video then shows TeslaBot standing in front of a blue and a yellow tray, tasked with sorting blue Duplo blocks into the blue tray and green Duplo blocks into the yellow tray. TeslaBot grasps each block with ease and, importantly, at human-like speed, proceeding to sort them successfully.
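The sorting rule itself is trivial to state; what makes the demo impressive is that the robot learns it from video rather than from a lookup table like the hypothetical one below, which just spells out the task's logic:

```python
# Hypothetical colour-to-tray rule mirroring the demo's task.
TRAY_FOR_COLOUR = {"blue": "blue tray", "green": "yellow tray"}

def sort_blocks(block_colours):
    """Return (tray, colour) placements; unknown colours are left unsorted."""
    placed, skipped = [], []
    for colour in block_colours:
        tray = TRAY_FOR_COLOUR.get(colour)
        if tray:
            placed.append((tray, colour))
        else:
            skipped.append(colour)
    return placed, skipped

placed, skipped = sort_blocks(["blue", "green", "blue"])
```

The hard part the neural network handles end to end is everything this snippet takes for granted: recognising each block's colour from pixels, and executing the grasp and placement.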
At the 35-second mark, a human intervenes in the task to create additional complexity and diversity for TeslaBot, dynamically adjusting the environment. The human moves the blocks right before the robot is about to pick them up. We see the robot rapidly adapt to this change and continue the task.
Optimus also demonstrates autonomous corrective action, picking up a block that had landed on its side and rotating it to place it face down. The dexterity of TeslaBot’s fingers is really on show here and is very impressive; as we imagine all the potential use cases where a robot may be deployed, this dexterity is critical to future success.
Finally, we see the robot effectively showing off with a couple of yoga poses that have no correlation to actual workloads, but do demonstrate its ability to balance on one leg and counterbalance the weight of extended limbs.
Ultimately, the final line in the X post is the most telling: the video is being released as an effort to recruit more engineers to work on TeslaBot.
While there’s no further information on a timeline for when TeslaBot will start taking on production workloads in the factory, or become a commercial product, it is impressive and promising that Tesla has confirmed the robot is now running on V12 of its software, like the cars.
In response to the Optimus post, Elon Musk replied simply with ‘Progress’.
Four months ago, in Tesla’s last TeslaBot update, the company shared this video.
Copyright for syndicated content belongs to the linked Source : TechAU – https://techau.com.au/tesla-shows-off-teslabot-running-fsd-v12-autonomously-sorting-objects-using-video-in-controls-out/