Tesla Inference Computing And The Problem It Solves

▶️ Watch on 3Speak


What is inference computing? How does it play into AI and LLMs?

In this video I discuss how Tesla, if they keep building out their fleet, will have the largest distributed inference computing network in the world.




Have you heard anything about how the data would be transmitted to and from the vehicles' computers? Would it use the owner's WiFi or cellular? Would Tesla pay the vehicle owners for use of the computers and the battery it uses?


Haven't heard any details. That is a long way off.

As for accessing the computers, it would use the same method as they do now. I am not sure what the over-the-air updates use, for example. But the computers are connected to the Tesla network.

Would Tesla pay the vehicle owners for use of the computers and the battery it uses?

I would think it would be more about the applications that are using the computers for inference. If you have "Inference-as-a-Service", there would be different applications that could use the infrastructure. They would have to pay for the service.
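To make the "Inference-as-a-Service" idea concrete, here is a minimal sketch of how such a service might route jobs to idle fleet computers and bill the applications that use them. Everything here is a hypothetical assumption for illustration: the class names, the per-token billing unit, and the pricing are invented, not anything Tesla has announced.

```python
# Hypothetical "Inference-as-a-Service" sketch: applications submit
# inference jobs to idle vehicle computers and are billed per token.
# All names and rates are illustrative assumptions, not a real API.

from dataclasses import dataclass, field

@dataclass
class FleetNode:
    node_id: str
    idle: bool = True  # whether this vehicle's computer is available

@dataclass
class InferenceService:
    price_per_token: float                       # assumed billing unit
    nodes: list = field(default_factory=list)    # the distributed fleet
    ledger: dict = field(default_factory=dict)   # app_id -> amount owed

    def submit(self, app_id: str, tokens: int) -> str:
        """Route a job to the first idle node and bill the application."""
        node = next(n for n in self.nodes if n.idle)
        node.idle = False  # node is busy while serving this job
        self.ledger[app_id] = (
            self.ledger.get(app_id, 0.0) + tokens * self.price_per_token
        )
        return node.node_id

svc = InferenceService(price_per_token=0.000002,
                       nodes=[FleetNode("car-001"), FleetNode("car-002")])
svc.submit("chat-app", 1_000_000)
print(round(svc.ledger["chat-app"], 6))  # prints 2.0 for one million tokens
```

The point of the sketch is the billing relationship the comment describes: the application, not the vehicle owner directly, pays the service operator, who could then share revenue with owners under whatever terms are eventually set.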


Ah, thank you.

I'm pretty sure the OTA updates use the owner's home WiFi... which is likely to have better connections than an in-car cellular service and is less likely to have data limits (in the US at least).

Oh... I hadn't realized that the applications themselves could potentially pay the owners for the cloud computing. That's a super interesting concept... I think it could be super hard to do technically, but I'll be super curious about the pricing to see how competitive it might be with AWS, Azure, etc.


I'll be super curious about the pricing to see how competitive it might be with AWS, Azure, etc.

I don't think it is a mistake that Amazon put $4 billion into Anthropic (Claude) and Microsoft has a huge stake in OpenAI. We are seeing the LLMs tied to those with the training compute.

What about inference? That is going to be the next infrastructure they are all going to have to build out. We will see how it goes.


I don't know what the requirements will be to reach inference at scale, but if it is just data, then Tesla will have a lot of it from their cars. I do think we are still years away, but once we do get there, AI should be able to help us answer some of the difficult problems we face.


This is quite interesting. AI innovations are promising a wonderful future, though I don't know how long it will take for that to materialize in reality.
