> Waymo, Cummings says, “should not be allowed to operate around schools during school pickup and drop-off until they get this problem fixed and can demonstrate it with specific tests.”
This commenter must misunderstand the situation. School buses regularly stop to pick up and drop off students on the streets near where students live, and there are generally schools all around. If Waymos can't properly respond to school bus signals, they need to not operate in areas where these pickups and drop-offs happen, which is not exclusively near schools.
If a cop notices this, who gets the ticket? Asking because I've noticed Waymos starting to go above the speed limit now. They're generally just matching the flow of traffic like everyone else, but it does raise the question: who gets fined? And if the fleet as a whole racks up more than 4 points in 12 months, would Waymo lose its license like a human driver would?
I saw a Waymo pull into a nonexistent rightmost lane at a stop light. I thought it was going to turn, but it instead proceeded straight and forced the driver in the actual rightmost lane to brake to let it merge; otherwise it would have caused an accident, as there was no lane in front of it.
This was on El Camino in Santa Clara. I was highly surprised, as I was under the assumption they were pretty much production-ready, since they have been expanding their area a lot.
Use statistical incidence rates, not "I saw a thing...", to make that call. I'm sure most drivers regularly think "wow, maybe humans shouldn't be allowed to drive" every time they go out on the road.
The thing about human drivers is we’re all unique little stupid snowflakes.
If a software-powered car is vulnerable to a certain condition, presumably all cars running that software system are. It's the rare case where we actually can generalize from a single bad-driving story.
I don't think this checks out. Would the model do the same thing when presented with the exact same inputs? Yes. Is it more likely to do the same thing at the same intersection? Probably. But if you repeat a similar setup somewhere it might not. Bad behavior still exists and should be fixed, but it doesn't mean they're bad drivers in general.
People have trouble seeing outside of their own biases and understanding how different another view can be with a different background and context to the situation. I have no problem confidently saying the parent poster has definitely made worse and more questionable driving decisions under more constrained and more dangerous situations on the road, and then never thinks twice about it after that moment because it had no consequences. All they need to do is look at driver safety statistics of autonomous vehicles vs humans to immediately reject their flawed understanding, and they never will.
Luckily, cars and driving in general aren't enshrined as an early amendment of the constitution (in the US) and aren't even considered a legal right, so pushback to change won't be artificially inflated several decades by heavily motivated interest groups seeking to spread misinformation about their safety. Not a bang, but a whimper.
You're missing that the difference is incentives, specifically incentives scaled up. If we were talking about an individual hacker who programmed their own car for automated driving and it made the above wrong decision, most people would be comfortable attributing fault to the individual and leaving it at that. The problem here is that large corpos, who eagerly tout their right to do whatever they want without further consideration when it's within the law, go beyond even that and break the law with impunity.
We can easily imagine a crash from such a thing being declared "no fault" (or even the fault of the turning driver!) by corpo-sympathetic police, judiciary, and regulators. That perceived lack of justice is the problem: when another individual does something wrong (either accidentally or willfully) and gets away with it, we can brush it off, as their bad behavior will eventually catch up with them. Whereas with corpos, it has been thoroughly demonstrated that this will not happen.
> when a Waymo vehicle is driving itself, Waymo may be legally considered the operator, even if a human passenger sits inside
Source: https://www.vazirilaw.com/faqs/whos-liable-in-a-waymo-self-d...
That page addresses tort liability, not liability for driving infractions or crimes. Liability for damages when a company causes them is the more settled situation.
It still isn't quite as clear who or if anyone is liable when traffic laws are broken:
https://web.archive.org/web/20251025055924/https://www.nytim...
Often, they are simply getting away with it.
> Asking because I’ve noticed Waymos starting to go above the speed limit now
Where at? I'm curious, because I see a lot of people say this, but I've never seen them go more than 1 mph over the limit when riding in them, and I watch them do 65 on the freeway every day, even when people are passing.
I remember when they told us that autonomous cars wouldn’t break laws and wouldn’t speed.
I always felt this was just a strategy, and that soon enough fleet operators would turn up the dials on speed and aggressiveness. After all, the only people who can complain are the people outside the car, and they will be dead.
There are highways in the US where drivers regularly go 10-20 over the speed limit, if not more. Maintaining the speed limit on a road that's posted as a 45 mph zone but treated as a 65 will be dangerous for everyone involved: both the cars approaching the slowpoke at 20+ miles an hour, and the slowpoke itself.
I don't know how Waymo is going to square that circle.
I used to live in a place where this was common -- the issue was not just speed, but a general disregard for traffic law because traffic law was unenforced. You could be going 50 in a 35 and someone would aggressively pass you. At some point, the road is simply occupied by unsafe drivers and there's not much you can do other than hold your line and be as predictable as possible to the aggressive drivers around you.
I understand this phenomenon and experienced it when I used to drive. What I found so revealing was it ultimately meant that the people weren’t actually driving their cars.
Each ostensibly independent driver was being forced to drive a certain way by the most aggressive driver behind them, and in turn they were required to force the driver ahead of them to drive in the same way.
That's Phoenix; it's happening here. Waymos nominally commit to keeping to the speed limit, but it is _extremely noticeable_ that that's the case, because literally NO ONE drives 65 on the freeways here. Everyone is going at least 74. It's a rite of passage in Arizona: it's not even a speeding ticket until 75. That goes back to the '70s, when the feds tried to force speed limit laws by threatening to revoke highway funding. Arizona said "fine, but it's not a speeding ticket, it's 'misuse of a finite resource.'"
So you'll see the Waymos kind of puttering along at 65 as everyone zooms around them. They DO say they'll occasionally exceed the limit when it's safer to do so, but it's obvious they don't want a narrative of being speed demons flying around over the speed limit.
> a road that's labeled as 45MPH zone, but is treated as a 65
If this is the case, then the speed limit is too low. To control speed on such a road you either need draconian enforcement or you need to change the road so people aren't comfortable driving that fast. Make the lanes narrower, introduce lane shifts or reduce the number of lanes, etc.
> If this is the case, then the speed limit is too low.
I don't disagree with you, but it's still a problem if there are drivers on that road who are driving so slowly as to be unsafe, robot or human.
Sometimes bad road design (e.g., lanes that are too wide) is to blame, but in miserable neighborhoods with no traffic enforcement, you can also end up in a rush-hour situation where the majority of people on the road are simply aggressive drivers who are familiar with the road. At some point you do need to enforce the law if it isn't being respected. There is a growing subset of people in the US who not only disregard traffic law but pride themselves on their disdain for it.
IDK if it's draconian, but speed cameras, or simply requiring cars to have modules that report speeds at certain points and issue fines automatically, should be standard by now. What's the point of having smarter cars if they can't be forced to stay below the legal speed limit?
I would call speed cameras draconian.
There's a road near me that just dropped its speed limit to 40. It's a divided road, two 12-foot lanes in each direction, good visibility, with turning lanes at intersections. It's highway-class. Most people drive 55 or 60, because that speed feels appropriate and reasonably safe (search for the "85th percentile" rule in setting speed limits to read more about this).
Reducing the speed limit to 40 IMO makes the road less safe, because there are always some people who very conscientiously do not exceed the posted limit. So now you have some people driving 40 while most people still want to go 55 or 60, and that creates an unsafe mix of vehicle speeds.
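The 85th-percentile rule mentioned above is easy to state in code: collect free-flow speeds on the road and take the speed that 85% of drivers do not exceed. A minimal sketch, with made-up sample speeds (not real measurements from any road):

```python
import math

# Hypothetical observed free-flow speeds (mph) on the road in question.
speeds = [52, 55, 58, 60, 57, 54, 61, 59, 56, 53, 62, 58, 55, 60, 57]

def percentile(values, p):
    """Nearest-rank percentile: the smallest observed value such that
    at least p% of observations are at or below it."""
    ordered = sorted(values)
    rank = math.ceil(p / 100 * len(ordered))  # 1-based nearest rank
    return ordered[rank - 1]

# Under the 85th-percentile rule, this is the candidate posted limit.
limit_suggestion = percentile(speeds, 85)
```

With this sample the rule would suggest posting something near 60, which is exactly the tension the comment describes: the engineering guideline and the newly posted 40 disagree by a wide margin.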
I don't think building enforcement into cars would be a good idea, or even effective, but a few speed cameras work wonders for changing the overall 'temperature' of driving in an area.
How would setting the max speed of a car to the speed limit be a bad idea?
Falsehoods programmers believe about speed limits:
1. The speed limit of a road is always marked by a sign
2. The speed limit of a road is in a database
3. You can look up the GPS location of a vehicle to determine what road it is on
4. Roads have exactly one speed limit at any one moment in time
5. Speed limits rarely change
6. Well, maybe speed limits do change, but only during certain fixed times
7. Roads have speed limits
8. Cars are only driven on roads
9. There are no exceptions for following speed limits
10. Well maybe there are but we can safely ignore those without any real consequences
[...]
I've personally done some software experimentation with speed-limit detection in vehicles. The combined accuracy of automatic traffic-sign recognition plus speed-limit databases and GPS is far less than 100% in real-world driving conditions.
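To make the problem concrete, here is a toy sketch of the fusion step such an experiment ends up needing. Every name, threshold, and value here is invented for illustration, not any real system's API; the point is that once the two imperfect sources disagree (falsehoods 1, 2, and 5 above), you are forced into a tie-breaking policy, and every policy is wrong somewhere:

```python
def fuse_speed_limit(sign_reading, map_value):
    """Return a conservative speed-limit estimate in mph, or None if unknown.

    sign_reading: (limit_mph, confidence in 0..1) from camera-based sign
                  recognition, or None if no sign was seen recently.
    map_value:    limit_mph from a map database, or None if the road is unmapped.
    """
    if sign_reading is None and map_value is None:
        return None                # falsehoods 2 and 7: sometimes you just don't know
    if sign_reading is None:
        return map_value           # falsehood 5: the database may be stale
    limit, confidence = sign_reading
    if map_value is None or limit == map_value:
        return limit               # only one source, or both agree
    # Sources disagree: trust a confident sign over the database,
    # otherwise fall back to the lower (more conservative) value.
    return limit if confidence >= 0.9 else min(limit, map_value)
```

Even this toy embodies a policy choice (confident camera beats database, ties go conservative) that fails when the camera confidently misreads a truck-only or exit-ramp sign, which is a large part of why the combined accuracy stays well under 100%.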
> turn up the dials on speed and aggressiveness
You literally cannot drive on public roads unless you match the speed, flow, and maneuvering of other traffic.
Never been stuck behind someone doing 45 in a 55? Really?
You don’t have to speed. It’s a choice. You shouldn’t make the choice in the passing lane, though.
I'm fairly certain "slower traffic keep right" is part of the expected flow.
Maybe the Waymo is technically speeding, but so is everyone else, because speed limits aren't magic; if the de facto limit ends up being 50 when the posted limit is 40 or 45, going slower creates extra conflict points for accidents.
Get it straight: it is going faster than the speed limit that creates extra conflict points for accidents. That's the problem. If better enforcement is needed via cameras, radar, etc., then that's the solution, not everyone speeding. Speed kills.
Just slightly over half of US states require you to move right to yield to faster traffic. In some places it is completely allowable to drive the speed limit in the left lane.
https://www.mit.edu/~jfc/right.html
>After all, the only people who can complain are the people outside the car, and they will be dead.
I'm not sure how you can earnestly make this claim while reading people complaining about the speed and aggressiveness. Do you suspect you're replying to ghosts?
People are getting wise that they can abuse these cars on the road: cut them off, don't let them in. If Waymos want to merge lanes, they need to respond like other drivers in the city: force their way into the lane and demand that space is created.
And school buses go all sorts of places, carrying kids to field trips and sporting events. Along with police cars, fire trucks, and ambulances, school buses are just another special type of vehicle that ALL drivers must learn to deal with. If you cannot act properly around a school bus, you shouldn't be on the road.
(Funny story: I was in Ottawa over the winter. There, snow plows, ambulances, and fire trucks all use blue flashing lights. I thought I was being pulled over by a giant police truck... it was a snow plow that really did not appreciate me stopping on the side of the road. Yet another special-case vehicle.)
So... it sounds like they're doing a lot better to me? 19 cases in the fall, 4 between the recall in Novemberish and January, and 1 between then and now, which occurred in January?
Also, lol at this quote in the article: "Six vehicles passed the school bus while it was stopped, the agency said. It is still investigating." What it doesn't note is that the other 5 seem to have been human-driven passenger vehicles. From the NTSB report: "located in Novi, Michigan, replied “No” to the prompt. The ADS-equipped vehicle then resumed travel and passed the school bus while its stop arms were still extended. A passenger vehicle following the ADS-equipped vehicle similarly passed the school bus. In total, six vehicles passed the school bus while it was stopped. A crash did not occur." So it sounds to me like 4 people passed it, the Waymo went "wtf, I'm pretty sure that's a stopped bus," a human incorrectly identified it as not a bus, the Waymo passed it, and then one more person passed after the Waymo.
https://archive.ph/1BblR
"A School District Tried to Help Train Waymos to Stop for School Buses. It Didn’t Work."
https://web.archive.org/web/20260329110357/https://www.wired...
> A preliminary report by the NTSB published in early March found that one ensuing incident, on January 12, occurred after a Waymo remote assistant, a Michigan-based human tasked with “helping” the software when it was struggling on the road, incorrectly told the robotaxi that the school bus ahead of it didn’t have active signals on. Six vehicles passed the school bus while it was stopped, the agency said. It is still investigating.
I will let you judge for yourself here what the "right" thing for the Waymo to do was... but let's think critically about how Waymos work in the real world, benchmarked against other real drivers dealing with real life issues.
Stop making bad human drivers an excuse for these machines to also be bad drivers. We're striving to do better, that's the whole point.
And they are doing better.
It's obviously unacceptable to flout the law. I do wonder what the risk profile is. Obviously kids can be erratic and unexpected, and a kid racing out from behind a flat-nosed pusher bus wouldn't be totally unheard of. But I also low-key wonder if the Waymo's response time and speed might be enough that there's not much real risk. The law and expectations ought to be followed! But I am low-key curious, too, whether the Waymo's infinite attention and seeming caution would mitigate the risk adequately.
The fact that it is passing stopped school buses does rather suggest that perhaps as cautious as it is, it still isn't smart enough to be cautious in the right ways.
How come Waymo keeps getting to break traffic laws repeatedly but everyone else does not?
Can we get these Waymo death traps off the road?
So I assume Waymo will be immediately banned from any residential areas until they can demonstrate the ability to follow the laws of the road?
The problem is there is zero enforcement. We know the vehicle is not safe around schoolchildren so the appropriate incentive needs to be applied to get the issue addressed.
> So I assume Waymo will be immediately banned from any residential areas until they can demonstrate the ability to follow the laws of the road?
Why do you apply a different standard to waymos than to humans?
> Why do you apply a different standard to waymos than to humans?
Show me waymo's driving license and the test it passed to get it
I don't. If a human repeatedly violates the traffic laws, they lose their license. Waymo apparently doesn't.
Humans. Humans repeatedly violate traffic laws. Humans behind the wheel are killing tens of thousands every year. Yet we keep giving these drugged-up meatbags licenses.
I would settle for a fine each time, about $1,000 in CA, and a point on some employee's license.
Just impound the vehicle and crush it. The free market will solve it. ;)
Don't forget to remove the battery first
Or leave it in there, and sell profitable tickets to the show.
Fines plus license suspension are authorized in Texas law [1]
[1] https://texas.public.law/statutes/tex._transp._code_section_...
$1,000 is not a meaningful amount of money to Google. Maybe, given that the entire fleet runs the same software, it should be fined $1,000 per car in the fleet each time an incident occurs?
Bear in mind $1,000 per incident is not enough money to justify paying a software developer to fix it.
If this behavior actually is a prevalent issue, then there will be many fines that add up. If Google doesn't rack up many fines, then this problem is evidently rare.
Well, you can just treat them like anybody else: a $1,000 fine plus a point on Waymo's license. And as suggested by another commenter in the thread, if the cars in the fleet collectively accumulate more than 4 points within 12 months, Waymo loses its license. As in, all cars operated by Waymo.
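The proposed policy is simple enough to sketch directly. This is just the thread's hypothetical rule ("more than 4 points in any trailing 12 months suspends the whole fleet"), not anything any regulator has actually enacted:

```python
from datetime import date, timedelta

SUSPENSION_THRESHOLD = 4       # fleet-wide points allowed per 12 months (hypothetical)
WINDOW = timedelta(days=365)   # trailing 12-month window

def fleet_suspended(point_events, today):
    """point_events: (date, points) tuples pooled across every car in the fleet.

    Returns True if the fleet's points in the trailing 12 months exceed the
    threshold -- i.e., under this proposal, all cars lose their license at once.
    """
    recent = sum(points for when, points in point_events
                 if timedelta(0) <= today - when <= WINDOW)
    return recent > SUSPENSION_THRESHOLD

# Example: five 1-point infractions across the fleet within the last year.
events = [(date(2025, month, 1), 1) for month in range(1, 6)]
```

Pooling points across the fleet is the interesting design choice: because every car runs the same software, a single systematic bug (like the school-bus one) generates correlated infractions and trips the threshold quickly, which is exactly the incentive the commenters upthread are asking for.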
Is that how any corporate fleet works?
Corporate fleets have a different driver per vehicle, not the same code running everything.
What would justify it? A full year's salary for a developer plus fringe benefits? Probably what, a $300k fine?
Per passenger on the bus, paid to their families.
Ticket and require a company lawyer and programmer to show up in traffic court for every infraction and explain current status of self-driving software.