Coding, Data Science, A.I., Robots

Decent article on AI driving (found while trying to figure out if the above video was real, or AI):

My favorite bit.


While Waymo is the first company to deploy a self-driving fleet, they’re far from the only company to try. In the early 2010s, Uber spent hundreds of millions of dollars on autonomous vehicles, which executives believed were essential to profitability. Raffi Krikorian, former director of Uber’s Advanced Technologies Center, who led those efforts from 2015 to 2017, said that the mechanical aspect of teaching cars to drive was one challenge, but unspoken social norms were another entirely.

“When you follow all the rules of the road to the letter, you’re actually a dangerous driver,” Krikorian says. “You actually need to know when to speed, when to roll through a yellow. When we first deployed, we would slam on the brakes at red lights.”

Around the same time, Google was ramping up its own self-driving project, operating buggy-like Firefly prototypes with plastic windshields around the company's Bay Area campuses. Being within Google's massive corporate umbrella gave engineers the time and resources to experiment without rushing to market. In 2016, Google spun this project, called Waymo, into its own company within Alphabet.

Dmitri Dolgov, an engineer who worked on the Firefly and is now Waymo’s co-CEO, says those days were characterized both by technological breakthroughs and painstaking cautiousness. “We would build something with goals for what the Waymo driver would be capable of, and then run it through our rigorous safety framework—and see that it fell short of where we set the bar,” he says. “And we would not deploy.”

Mawakana joined the company in 2017, initially as vice president of public policy and government affairs, ultimately moving into running business operations. Waymo's road safety mission was personal to her: Her uncle, a long-haul trucker, had died in his vehicle from a heart attack. She relates to customers' stories of family members killed by drunk driving or other accidents. "People don't think they need a safer alternative. But the status quo is not acceptable," Mawakana says.
 
Scuse! lol


Here's why this sort of thing happens. The way these things are trained is to put sensors on a car with real drivers. In Tesla's case, that sensor is a camera and in Waymo's case those sensors are cameras, sonar, lidar, maybe radar.

So you have a human giving inputs to the car: wheel turns left, wheel turns right, touch the brake 30%, touch the accelerator 50%, etc. These inputs are recorded at the same time as the sensor outputs.

Over millions of hours of driving, the AI model gets an idea about what to do. It doesn't see a red light but it does see pixels of red coming from the camera and it figures out that whenever the pixels look like this, the drivers tend to do that. It can then try to duplicate those actions later in self-driving mode.

It works very well for normal driving. It's trained on millions of hours of following the road, stopping at stoplights, and merging into a different lane when there's a slower car in front of you. But it doesn't do a good job with edge cases, rare situations it has never seen. It definitely sees flashing police lights in the normal course of driving and knows to avoid them, but it very rarely sees a shootout happening, so it doesn't really know what to do. The same thing happened a few years ago when a car encountered horses or an overturned vehicle: it hadn't seen that before and behaved badly.
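The training loop described above is basically imitation learning: log (sensor features, driver controls) pairs, then fit a model that maps one to the other. Here's a minimal sketch of that idea using synthetic data and a linear least-squares fit in place of the deep networks these companies actually use; all the variable names and shapes are illustrative assumptions, not anything from Waymo's or Tesla's real pipelines.

```python
import numpy as np

# Toy behavior-cloning sketch (hypothetical data, not a real AV pipeline).
# Each "frame" is a flattened sensor feature vector; each label is the
# driver's recorded control input: [steering, brake %, accelerator %].

rng = np.random.default_rng(0)

n_frames, n_features = 500, 16
frames = rng.normal(size=(n_frames, n_features))  # stand-in for camera/lidar features

# Pretend the human driver's behavior is some unknown mapping we log.
true_policy = rng.normal(size=(n_features, 3))
controls = frames @ true_policy + rng.normal(scale=0.01, size=(n_frames, 3))

# "Training" = fit a model that imitates the logged (sensors -> controls) pairs.
# A linear least-squares fit stands in here for a deep network.
learned_policy, *_ = np.linalg.lstsq(frames, controls, rcond=None)

# In "self-driving mode", the model predicts controls for a new frame.
new_frame = rng.normal(size=(1, n_features))
predicted = new_frame @ learned_policy

print(predicted.shape)  # one row of [steering, brake, accelerator]
```

The edge-case problem falls out of this setup directly: the fit is only constrained where training data exists, so a frame unlike anything in the logs (a shootout, horses on the highway) lands in a region where the model's output is essentially an extrapolation.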
 
More on this:

 
This was actually an Elon call not to put lidar and other non-optical sensors on Tesla cars. Lidar is really expensive. It also used to have big maintenance issues because there were more moving parts that broke. So while Tesla's cars can be sold at a more realistic price in line with the consumer car market, the self-driving outcomes may not be as good long-term. Waymo's cars cost about $200,000 apiece to make; that could come down with mass production, but it's doubtful it would get down to regular-car-market prices.
 
I took 2 Waymo trips last month in San Francisco. Some of the motion was abrupt, but it was great. I even witnessed some confusion/stalling due to an empty, parked car jutting too far into the road. We actually hit the support button and chatted with a human just as it was doing a 10-point turn to figure out its move to slightly swerve around it.

$200k seems like a bunch unless mileage is low and trips are high-cost anyway.
 
Youth poll:

[attached image: IMG_1504.jpeg]

 