Self Driving Cars


Dean Pomerleau


So I'm amused. I made it into the SXSW conference - in this talk by my former colleague at Carnegie Mellon University, Chris Urmson, who is now leading Google's self-driving car project.

 

I'm the chubby guy on the left at the 1:20 mark in this video, holding the "California or Bust" sign at the start of our "No Hands Across America" trip in a self-driving minivan we built over 20 years ago.

 

Those were fun times. It's really cool to see that research finally coming to fruition, on the verge of (hopefully) saving lives - if they can get the cars to stop driving into the side of buses :-).

 

--Dean

 


  • 3 weeks later...

Wow, Dean! Very very cool! It must be gratifying that you were part of this cool movement, and I'm sure you have interesting perspectives about how self-driving vehicles will affect us in the near future.

 

I think self-driving trucks will have an enormous impact on the economy. All those truckers currently have solid, middle class jobs. Where will these modern cowboys go to support themselves and their families when they're replaced by robots? And how soon do you reckon they'll indeed be replaced?


Yeah - it's pretty cool to watch. But also a bit frustrating, especially when your former business partner tweets you a link to stories like yesterday's announcement that GM (a former customer of ours) just acquired, for $1 billion, a company nearly identical to what ours was 10 years ago.

 

Don't get me wrong. I'm not complaining. We're both doing pretty well. But that kind of cash would allow me to do a lot more good in the world, and buy me a lot of durian ;-).

 

As for the timing and impact of self-driving cars - see this story for some of my perspective on the history and future of automated vehicles. In a nutshell, 20 years ago we were saying self-driving cars would be common in 20 years. That's still not too far off: the majority of cars (and trucks) on the road won't be fully autonomous for at least another 10-15 years. If you watch the whole video from my former colleague Chris Urmson of Google embedded above, you'll see he's backing off some of his very optimistic (e.g. next 3-5 years) predictions as well. It appears he's coming to recognize the hard realities of the technical challenges (e.g. the recent collision of one of his self-driving cars with a bus) and government regulation.

 

But as self-driving cars do become a reality, I believe (along with all the other economic pundits) that they will have a tremendously disruptive effect on society, by taking the jobs of truck, bus, taxi and delivery drivers. In general, I'm pretty concerned about technological unemployment as the AIs and robotics get better and better at what used to be exclusively human endeavors. 

 

I'm just hoping that we'll reach a post-scarcity economy (a la Star Trek, without the spaceships), come to our senses about the possibility of human dignity without work as we've always defined it, and institute a universal basic income to distribute the abundance equitably.

 

That will leave people in a situation like I enjoy, able to pursue their passions without having to worry about meeting their basic survival needs.

 

--Dean


Very cool connection Dean.  And BTW, you really hold your weight well considering your BMI is as low as it is.

 

I'll be curious to see what happens from a liability perspective, the first time a driverless car causes a fatal collision or runs over a child.  Who is at fault? The car manufacturer? The driver? 

 

Businessman Kevin O'Leary was on television this week saying he feels 7 years from now will be the tipping point for driverless cars - that at night they will drive themselves out to car lots past the city limits, only to return the next morning. While it's fascinating, it also gives me an eerie feeling.

 

You are correct to say that this technology (as well as rapidly developing AI in many areas) will disrupt the industry greatly.


Thanks Drew. About carrying my weight well - I'm not sure where you are looking, since I don't think I've posted any recent pictures of myself. The picture accompanying that CMU story, and in Chris Urmson's SXSW talk, is from 20 years ago, a few years before I started CR, when I was around 165 lbs (BMI ~25) and approaching my peak of 172. We actually made Burger King the unofficial food sponsor of the No-Hands Across America trip by stopping at one every chance we got. Ah, the carefree days of youth (when I was 30)... I'm 118 lbs (BMI ~17.5) now, and look a heck of a lot scrawnier.
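(For the quantitatively inclined, those BMI figures check out. Here's the back-of-the-envelope arithmetic in Python - note the ~1.75 m height is my assumption for illustration, not something stated above:)

```python
# Back-of-the-envelope BMI check: BMI = weight_kg / height_m^2.
# NOTE: the 1.75 m (~5'9") height is an assumption for illustration.
LB_TO_KG = 0.45359237
height_m = 1.75

for label, pounds in [("NHAA-era", 165), ("peak", 172), ("now", 118)]:
    bmi = pounds * LB_TO_KG / height_m ** 2
    print(f"{label}: {pounds} lb -> BMI {bmi:.1f}")

# NHAA-era: 165 lb -> BMI 24.4
# peak: 172 lb -> BMI 25.5
# now: 118 lb -> BMI 17.5
```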

 

Regarding timing of the deployment of self-driving cars - technology pundits like Kevin (and Chris Urmson) are wildly optimistic. They don't know the automotive / truck industry. This article has a good explanation of what it would be like if we passed a law today mandating all cars become autonomous (or electric) as soon as possible. Here is a chart of such a timeline:

 

[Image: timeline chart showing how long it would take a technology mandated on all new vehicles today to spread through the US fleet]

 

So it would be ~17 years from now - i.e. 2033 - before half the vehicles on the road would have any given new technology.

 

That may be a bit pessimistic, given pressure from the tech industry on the auto and truck makers to speed up their design & deployment cycles. But it also assumes people will be quick to adopt the technology by buying a new car with the feature and junking their old one. I can imagine many people being reluctant to turn over their keys to robot drivers, so it might take longer. And since nothing is being mandated now or anytime soon regarding self-driving vehicles - in fact the government is putting the brakes on the technology to some degree - 2033 is likely to be pretty accurate, if not on the optimistic side.
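To make that fleet-turnover logic concrete, here's a toy model (my own illustration - the fleet size, sales rate, and adoption ramp are round-number assumptions, not figures from the article):

```python
# Toy model: how fast could a new technology penetrate the US fleet?
# ASSUMPTIONS (round numbers, for illustration only): ~250M vehicles
# on the road, ~17M new-vehicle sales per year, and the self-driving
# share of new sales ramping linearly from 0% to 100% over 5 years.
FLEET = 250e6
SALES_PER_YEAR = 17e6
RAMP_YEARS = 5

equipped, year = 0.0, 0
while equipped / FLEET < 0.5:
    year += 1
    share = min(year / RAMP_YEARS, 1.0)  # fraction of this year's sales with the tech
    equipped += SALES_PER_YEAR * share   # assume only un-equipped cars get scrapped
print(f"Half the fleet equipped after ~{year} years")  # -> ~10 years
```

Even under those rosy assumptions - an immediate ramp-up and every buyer willing - you're a decade out. Add realistic design cycles and reluctant buyers, and ~17 years looks about right.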

 

This pessimistic timeline reminds me of the dream of radical life extension. It is likely quite distant, since the medical/pharmaceutical industry is even more conservative in roll-out than the auto industry, with good reason.

 

As another tech pundit & sci-fi author (William Gibson) said: "The future is already here - it's just not evenly distributed."

 

The future of self-driving cars is already here (see Tesla), and in a sense was here 20 years ago when Todd and I drove (actually, were driven) across America in a self-driving car. But widespread, even distribution of the technology is still in the distant future.

 

--Dean

 

For anyone interested in reading an amusing and detailed account of the self-driving trip across the country we did 20 years ago, here is the ancient web page with highlights, including the detailed travel log of our adventure (one of the first blogs on the internet, in 1996) and a photo gallery.


  • 4 weeks later...
  • 2 months later...

Well, as mentioned in the footnote of this post, AI and self-driving cars can't do everything perfectly yet, and the result this time was the tragic death of a big proponent of the technology at the hands of his Tesla in AutoPilot mode, as reported all over the news the last couple of days (e.g. here).

 

According to this Wall Street Journal article, there are even some so-called "experts" who think that "the fatal crash of a Tesla Motors Inc. vehicle in self-driving mode will provoke additional regulatory oversight and slow deployment on U.S. roads of the rapidly advancing technology".

 

The pundits go further, speculating:

 

“I think NHTSA is going to want Tesla to turn off Autopilot at least until they learn more.”

 

“Anyone who has worked in this area realized that this was inevitable,” he said.

 

I'm not surprised. I've thought something similar since Tesla turned on the AutoPilot feature, and drivers started posting videos of themselves taking crazy risks, and of AutoPilot responding erratically in difficult driving circumstances. I'm actually surprised it's taken this long for a fatality to occur.

 

Perhaps the robot uprising and the singularity aren't quite as close as they might appear after all. Or perhaps this is just the robots' way of picking us off, one by one...

 

--Dean


  • 1 month later...

For anyone who likes to geek-out on technology, here is a good blog post (and accompanying paper) by researchers at NVIDIA about their efforts to develop self-driving cars based on machine learning and artificial neural networks.

 

I was about to make a snide comment to former colleagues about how "what's old is new again", when I was stopped in my tracks. It turns out they actually said nice things in their paper about the work we did way back in the dark ages, and have modelled their approach to self-driving cars on ours (my emphasis):

 

In a new automotive application, we have used convolutional neural networks (CNNs) to map the raw pixels from a front-facing camera to the steering commands for a self-driving car. This powerful end-to-end approach means that with minimum training data from humans, the system learns to steer, with or without lane markings, on both local roads and highways. The system can also operate in areas with unclear visual guidance such as parking lots or unpaved roads.
 
...
 
 We developed a system that learns the entire processing pipeline needed to steer an automobile. The groundwork for this project was actually done over 10 years ago in a Defense Advanced Research Projects Agency (DARPA) seedling project known as DARPA Autonomous Vehicle (DAVE) [5], in which a sub-scale radio control (RC) car drove through a junk-filled alley way. DAVE was trained on hours of human driving in similar, but not identical, environments. The training data included video from two cameras and the steering commands sent by a human operator.
 
In many ways, DAVE was inspired by the pioneering work of Pomerleau [6], who in 1989 built the Autonomous Land Vehicle in a Neural Network (ALVINN) system. ALVINN is a precursor to DAVE, and it provided the initial proof of concept that an end-to-end trained neural network might one day be capable of steering a car on public roads.
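For fellow geeks, here's a minimal sketch of what such an end-to-end pixels-to-steering network can look like. This is my toy PyTorch rendition, loosely following the convolutional/fully-connected layout described in the NVIDIA paper rather than their actual code, and the input dimensions are illustrative guesses:

```python
# Minimal end-to-end steering network: raw camera pixels in,
# a single steering command out, trained by supervised regression
# against a human driver's recorded steering.
import torch
import torch.nn as nn

class SteeringNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 24, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(24, 36, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(36, 48, kernel_size=5, stride=2), nn.ReLU(),
            nn.Conv2d(48, 64, kernel_size=3), nn.ReLU(),
            nn.Conv2d(64, 64, kernel_size=3), nn.ReLU(),
        )
        self.head = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 1 * 18, 100), nn.ReLU(),
            nn.Linear(100, 50), nn.ReLU(),
            nn.Linear(50, 1),                    # steering command
        )

    def forward(self, x):                        # x: (batch, 3, 66, 200) frames
        return self.head(self.features(x))

model = SteeringNet()
frames = torch.randn(8, 3, 66, 200)              # stand-in camera images
human_steering = torch.randn(8, 1)               # recorded human steering angles
loss = nn.functional.mse_loss(model(frames), human_steering)
loss.backward()                                  # learn to imitate the human
```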
 
Nice to see not everyone ignores history, even in fast-moving areas of technology!
 
--Dean

  • 4 weeks later...

All,

 

I've been chatting via email with a bunch of my self-driving car buddies, and I did a little analysis of Tesla Autopilot safety <sic> that I thought you might be interested in, and that might be worth having a weblink to.

 

Enjoy!

 

--Dean

 

--------------------

[Note: the identities of the colleagues involved in this email exchange have been redacted to protect the innocent, and the guilty ☺]

 

I [Dean] opened the thread with this post, praising Mobileye, Tesla's (former) supplier for self-driving car technology, for standing up to Tesla's irresponsible release of Autopilot:

 

Check out this article:
 
Mobileye broke ties with Tesla Motors because the Silicon Valley firm was “pushing the envelope in terms of safety” with the design of its Autopilot driver-assistance system, its chairman has said.
 
Good for them!
 
Funny - I held onto our Mobileye stock when I sold Tesla. I'm not sure if it was the wisest investment decision, but at least they are standing up to Elon's lunacy. And I don't mean his endorsement of the idea that we are living in a simulation. He's right about that ☺.
 
All except one of them probably thought I was joking about the simulation stuff. I haven't talked to most of them about my really crazy ideas. In fact, most of them think I'm a relatively normal guy, besides the whole diet thing... ☺
 
I had earlier sent them all the Hype Cycle graph for 2015 from Gartner showing self-driving car technology at the peak of hype, ready to crash headlong into the "Trough of Disillusionment":

 

[Image: Gartner's 2015 Hype Cycle chart, with autonomous vehicles at the peak of inflated expectations]

 

Speaking of crashing, one of my colleagues (who happens to own a Tesla and loves it) wrote the following:

 

So, according to this 2014 article
 
“Whatever the vehicle, highways are by far the most common place of transportation fatalities in the United States: 94%. If deaths that take place at rail-highway grade crossings are included, the total is even higher, at just over 95%” (This is across all transportation modes).
 
If this is accurate, then it may be fair to compare Tesla’s autopilot fatality rate against the national average.
 
“The rate in 2010 is just one-third of that in 1975 (1.11 versus 3.35 fatalities per 100 million vehicle miles). The 1980s and early 1990s were the era of the greatest rate of improvement.”
 
So this implies the US highway fatality rate is 1.11 fatalities per 100 million vehicle miles (or does it? Are they using “highway” to mean all roads vs. railways, etc?).
 
 
So optimistically, autopilot fatality rates are ½ non-autopilot fatality rates.
 
There are a few big assumptions / questions in here:
  1. Is that 1.11 number accurate for highways only? If that’s across all road types, then maybe not.
  2. With only 1 Tesla fatality, you don’t have real statistics yet on autopilot fatality rates. There could have been 3 people in that car just as easily.
But it does seem like these numbers are in the same magnitude. And while it’s unlikely people will get safer – it’s very likely the autopilot will get safer.
 
The Tesla owner has in the past been skeptical of the wisdom of releasing Autopilot into the wild, but seemed above to be making a cogent argument that maybe Autopilot wasn't doing so badly. So I took him to task, writing: 
 
Regarding crash statistics - the figure of ~1.1 deaths per 100 million miles of "highway" driving really means "cars and trucks" vs. "trains" vs. "planes" - i.e. it includes driving of ground vehicles (maybe just passenger vehicles) on all types of roads. Limited-access divided highways (where AutoPilot is supposed to be used) are actually the safest type of road, and the fatality rate on real highways (in the way we usually think of that word) is much lower than one death per 100 million miles.
 
So the fact that Tesla has had one fatal crash under Autopilot - plus what looks like a second one in China in January, just being revealed now - means Tesla's fatality rate (2 in <200 million miles, or ~1 per 100 million miles) is no better, on what should be mostly (relatively safe) highway driving, than the all-roads fatality rate of vehicles under human control (1.1 per 100 million vehicle miles).
 
Two additional facts further undermine Tesla's bogus claim that AutoPilot is as safe as or safer than a human driver at the helm:

  • First, consider the NHTSA crash-test data:

[Image: Tesla Model S five-star safety rating chart]

showing just how far to the right (better) the Model S sits in the "5-star" crash rating bucket (which includes 123 cars), and its associated very low risk of injury (a 43% relative risk of injury) compared with the best 3-star (or worst 4-star) car.

Given its crashworthiness, the Tesla Model S should have a far lower fatality rate than your average new car on the road (57% lower, if risk of injury can be directly extrapolated to risk of fatality). Model Ss should have an even greater advantage over the average of all cars on the highway, since older cars are much less crashworthy on average than newer ones, both because safety technology keeps improving and because older cars fall into disrepair.

So if AutoPilot were simply as good as a human, you'd expect Teslas to be involved in at least 60%, and probably closer to 75%, fewer fatalities to the driver or passengers than the NHTSA average - putting it at about 1 in every 400 million miles, I'd estimate. So 2 fatalities in 200 million miles (1 per 100 million miles) could be approaching a statistically significant, higher-than-expected fatality rate, even with only two data points. But there's more...

  • As you know better than the rest of us [Mr. Tesla owner] - Teslas are expensive, trendy, hard-to-get cars. You don't just go down to the lot and pick one up. People who buy Teslas are rich, intelligent, and very rational - they do their homework. How do you think they'd drive relative to the average schmuck? Think they might be more prone to wearing their seatbelts? How about driving drunk? Yeah - me neither.

Alcohol is a factor in 40% of fatal crashes. 65% of people involved in fatal crashes are not wearing seatbelts.

Given these numbers, you'd expect seatbelt-wearing, usually-sober Tesla drivers to be involved in far fewer fatal crashes than the average car for this reason alone. By my estimate, this "good behavior" bias among smart Tesla drivers should drop Tesla's fatal crash rate by at least another 50-60%, putting it close to 1 in every billion miles if humans were at the wheel.

 

So two fatal Tesla crashes in 200 million miles of Autopilot driving is an abysmal record - nearly 10x what you would expect. I'd be happy to testify to all this in court, and argue that AutoPilot creates a tremendous increase in fatal crash risk relative to a human driver.

 
Anybody want to hire me!?
 
I didn't get any takers among this group of old automated car geeks for hiring me - but I'm still looking ☺.
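For anyone who wants to check the "nearly 10x" arithmetic, here's a quick Poisson back-of-the-envelope using the numbers from my email above (a sketch, not a rigorous analysis - with only two events the error bars are huge):

```python
# P(observing >= 2 fatal crashes in ~200M Autopilot miles by chance)
# under the three expected fatality rates argued for above.
from math import exp

def p_at_least(k, lam):
    """P(X >= k) for X ~ Poisson(lam)."""
    term = exp(-lam)
    cdf = term
    for i in range(1, k):
        term *= lam / i
        cdf += term
    return 1.0 - cdf

MILES = 200e6  # approximate Autopilot miles driven to date

for label, fatals_per_mile in [
    ("all-roads human average (1.1 per 100M mi)", 1.1 / 100e6),
    ("crashworthiness-adjusted (1 per 400M mi)", 1.0 / 400e6),
    ("+ sober, belted drivers (1 per 1B mi)", 1.0 / 1e9),
]:
    lam = fatals_per_mile * MILES
    print(f"{label}: expect {lam:.2f}, P(>=2) = {p_at_least(2, lam):.3f}")

# all-roads human average (1.1 per 100M mi): expect 2.20, P(>=2) = 0.645
# crashworthiness-adjusted (1 per 400M mi): expect 0.50, P(>=2) = 0.090
# + sober, belted drivers (1 per 1B mi): expect 0.20, P(>=2) = 0.018
```

In other words, against the raw all-roads average, two deaths prove nothing; against the adjusted one-per-billion-mile expectation, two deaths would already be a p < 0.02 surprise.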
 
Then Mr. Tesla owner responded to my statistics argument as follows:
 
Fascinating! I’ve been trying to get a good sense of better / worse for a while. Thanks Dean..
 
I’ll take a bit of issue with the second point (more intelligent, should be safer). Have you seen some of the youtube videos of people doing really dumb things with autopilot? :)
 
To which I responded:
 
You're not thinking about this clearly (Mr. smart Tesla owner ☺).
 
Imagine what completely insane stunts the average yahoo in [state redacted] would be doing with an Autopilot-equipped car, if they were much cheaper than Model Ss so the average yahoo could afford one.
 
I can imagine people standing in the middle of the road while a friend barrels up on them in their Tesla, to see if it would stop automatically. That kind of sh*t.
 
Of course, Elon is excited to be deploying Autopilot on the much more affordable Model 3s...

 

To which he responded (tongue-in-cheek):

 

Hah! “Hey Cletus, watch this!”
 

And another of my geek car friends shared the following anecdote and speculation about the possibility of truly reckless Autopilot usage:

 

My good friend Steve discovered, 30+ years ago, that a '67 Ford Galaxie 500 will run on railroad tracks if you let some air out of the tires. So he got his car on the tracks, put a brick on the accelerator, opened the window, and climbed up to sit on the roof. I asked what he did about trains, and he said "there are no trains any more in [state redacted]".
 
Now, if you went back to when he was a teenager in [state redacted], and gave him a Tesla, would he be sitting on the roof?
 

Scary thought. But Mr. Tesla owner came back with the following reasonable point in defense of self-driving cars in the long-run:

 

On the flip side – if autopilot will continue to get safer over time, then is it, long term, the right thing to do to deploy something that is 10x less safe for a relatively brief period, if that’s the only real way to get the data and experience needed to get to something that is 10x safer “forever”? It’s the same concept behind clinical trials. A new drug should have a reasonable basis for being at least as good as the current gold standard before going to large-scale trial – but that’s not guaranteed.
 
Of course, in that case, there’s all kinds of questions about consent, regulation, data collection, etc. It’s not a decision a single company can make any more than a drug company can just decide to go to Phase II or Phase III of a trial without FDA approval. But it’s a reasonable question for a society.
 
So I’m coming around to the view that Tesla was irresponsible in deploying this. Yet I’m still using mine…
 
His point about Musk flouting ideas of consent and regulation (or at least taking advantage of the lack of standards for these - NHTSA has yet to release anything...) is the key.
 
Whether or not Elon actually believes Autopilot to be safer than humans (which it isn't), it shouldn't be up to him and his small, not-unbiased team to decide whether Autopilot is ready for deployment when public safety is at risk. His customers clearly don't and can't know whether Autopilot is safe (no informed consent), they clearly aren't getting the instructions they need on how/when to use it, and they clearly don't, won't, or can't stay engaged enough to avoid crashes. In fact, it's human nature / human physiology to have lapses of attention and get distracted when we aren't required to remain constantly engaged in a task. Asking drivers to passively monitor Autopilot while remaining vigilant is like asking your dog to make you breakfast. Ain't gonna happen...
 
This might not even be so egregious a violation of the public trust if Elon were just putting his own customers at risk. I'm as libertarian as the next guy. Buyer beware. Darwin Awards and all that. [BTW - do you know the motto of the Darwin Awards? "Chlorinating the gene pool" ☺]
 
But what if one of these rogue Tesla missiles plows into a mom in her minivan driving her children to school, or worse, a schoolbus full of kids? It could happen any day. Truly irresponsible of Musk, if you ask me.
 
In response to Mr. Tesla owner who said (above) that in the long run self-driving cars will eventually get safer than humans, I said:
 
Don't get me wrong. I think in the long term we all may have self-driving cars. But to be honest, when I leave the neighborhood a couple of times per week to pick up fruits and veggies at my farm co-op and Aldi (I get everything else via Amazon and don't get out much otherwise ☺), I still drive my 2002 Honda minivan. It has 220K miles, but at this rate it should last at least another 10 years.
 
I'm hoping in 10 years, when it's time to trade it in, we'll have flying cars, or better yet, mind uploading. ☺

 

Finally, one of my rich self-driving car geek friends who doesn't own a Tesla yet, but is apparently looking, asked:
 
Is there a convertible Tesla besides the gen 1 one?
 
To which I responded:
 
Yes - there's one convertible model S. Cheap too. But you'll need to wipe the blood off the seats first...
 
Sorry, I couldn't resist. Just a bunch of us old-timers shootin' the breeze. Thought some of you might be amused to listen in...
 

--Dean


I'm wondering how soon the self-driving car will become obsolete due to drones (vertical takeoff and landing, pilotless) that can carry humans. It seems drone tech is rapidly advancing, and the military is already using large versions that could transport people. A navigation system for a drone is SO much easier than that of a car - relatively simple GPS-based point-A-to-point-B travel with basic collision-avoidance radar and vehicle-to-vehicle communications. It would be the end of traffic, and travel times would be a fraction of what they are now. Cars are so 20th century.


Thanks Daniel,

 

Elon is one amazing individual, with a pretty big head as that video portrays. He is literally out to change the world, and seems to be doing it.

 

Speaking of impressive individuals, I've never met you in person, so all I've got to go by is your tiny avatar picture, but has anyone ever told you that you look like an even younger version of the 16th-century German-Swiss artist Hans Holbein the Younger?

 

[Image: side-by-side comparison of the forum avatar and a Holbein the Younger portrait]

 

Sorry - I happened to be looking at my Twitter feed and your profile picture at the same time, and it suddenly struck me.

 

--Dean


Hey all you fans of AI and self-driving cars,

 

There is an interesting feature article in Nature today entitled Can we open the black box of AI?

 

It discusses an amusing anecdote from the early history of self-driving cars that learn from example. It happened when the one I had trained (using a neural network), and was riding in, almost hurled itself off a bridge, taking me with it. Sort of like all those Teslas diving into the backs of parked cars and other slower-moving vehicles...

 

Be sure to check out the accompanying podcast, where I discuss how this problem not only remains after nearly 30 years, but is actually getting worse, thanks to the increasing complexity of the deep neural networks that Google, Facebook, NVIDIA and others are building into products everyone uses today.

 

Unfortunately, except for a hint at the end of the podcast, most of my discussion of the solutions I'm working on was left on the cutting room floor...

 
--Dean

Very cool, Dean.

 

Are you looking for the AI's mental image within the black box?  Is the problem similar to the AI language problem where instead of the same word having different meanings depending on context, the next driving action required can depend on context? 

 

-Pea


Pea,

 

I'm actually working now to extend it to natural language processing, but the original idea was to add extra outputs in the form of an image, through which the network tries to reconstruct the image it is being fed as input. The better it can reconstruct it, the better suited its internal representation is to that kind of input image. And the parts of the scene it can reconstruct most accurately are the ones it has encoded in its hidden units - and therefore the features it is paying attention to. Here is the diagram from the 1993 paper on it, called Input Reconstruction Reliability Estimation (pdf):

 

[Image: diagram of the Input Reconstruction Reliability Estimation architecture from the 1993 paper]

 

 

As you can see, in addition to producing a steering direction as output, the network is trained to reproduce the input image from its hidden representation using an autoencoder set of output units. This reconstructed image is compared with the original image to judge the network's confidence.
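For the curious, here's a minimal modern-day sketch of the idea (a toy PyTorch rendition of the IRRE scheme, not the 1993 code; the 30x32 input retina and 30 steering outputs follow ALVINN's layout):

```python
# Toy Input Reconstruction Reliability Estimation (IRRE): one hidden
# layer feeds BOTH a steering output and an autoencoder-style
# reconstruction of the input; reconstruction error gauges confidence.
import torch
import torch.nn as nn

class IRRESketch(nn.Module):
    def __init__(self, h=30, w=32, hidden=64, steering_bins=30):
        super().__init__()
        self.encode = nn.Sequential(nn.Flatten(), nn.Linear(h * w, hidden), nn.Tanh())
        self.steer = nn.Linear(hidden, steering_bins)  # steering direction units
        self.decode = nn.Linear(hidden, h * w)         # reconstruction output units

    def forward(self, image):                          # image: (batch, 30, 32)
        hid = self.encode(image)
        recon = self.decode(hid).view_as(image)
        return self.steer(hid), recon

model = IRRESketch()
images = torch.rand(4, 30, 32)                         # stand-in road images
steering, recon = model(images)

# Low reconstruction error => the scene resembles what the hidden units
# have learned to encode => more confidence in the steering output.
per_image_error = ((recon - images) ** 2).mean(dim=(1, 2))
print(per_image_error)  # lower = more confident
```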

 

--Dean


  • 5 weeks later...
I posted several comments on Medium today about the naiveté surrounding self-driving cars, in response to this tongue-in-cheek article about abusing self-driving cars to keep them in their place.

 

Here is what I wrote first:

 

What people aren’t thinking about is the mischief and vandalism that will be perpetrated against driverless Uber or Tesla taxis. Imagine the following scenario.

 

I own a $50–100K Tesla and want to make a few extra bucks by loaning it out to customers on the “Tesla Network” Elon Musk is touting.

 

First off, who knows what the yahoo who rents my expensive car for an hour is going to do in it — I don’t really want to think about the details, but I’m pretty sure it will be more than crumbs I’ll be cleaning off the seats after he and his girlfriend return my car…

 

But even before that, think about the poor defenseless car on its way to pick up the yahoo who has hailed my car. I have little doubt that a driverless car on an empty road will become the target of abuse and sport. Imagine kids, hoodlums, or out-of-work taxi/truck drivers seeing one of these driverless cars coming their way. Do you really think they’ll have a whole lot of reservations about pelting it with rocks, or spray-painting it with graffiti when it comes to a stoplight?

 

Either of these incidents will be enough for me to never loan out my car again, and to sue Elon Musk and his Tesla Network for the cost of damages to my beloved car. If the cars are all company-owned (i.e. Uber’s model) it will quickly become a losing proposition for the company, as the repair costs to fix the vandalism quickly outstrip the fares collected from rides.

 

Do you think all the sensors on the cars will catch the perpetrators in the act, and therefore discourage vandalism? A simple mask, or throwing the rock (or brick) from behind a bush is all it will take to thwart even the best biometrics.

 

It’s hard for me to fathom just how naive Elon Musk and the folks at Uber are being, thinking they’ll be sending cars out into the wild without anyone inside anytime soon. It’s a jungle out there, and it only promises to get worse if/when technological unemployment kicks in. Expensive automated cars tooling around without anyone inside will quickly become victims…

 

To which someone named Rick Stewart responded:

 

Why would there be any empty automated cars driving around? Has the computer gone for a joy ride?

 

You won’t be renting out your car, because you won’t own one (and if you do, it certainly won’t be allowed to leave your private property, so plan on owning a ranch). If you want the luxury of (not) driving around in a Tesla, you’ll just order one on your phone, you won’t buy it (and the built in cleaning robot will deliver it in impeccable condition).

 

As for the entire world suddenly becoming vandals — stop thinking like a human. AI will be able to identify you without seeing your face, silly. If someone throws a rock at a car, it will be the last car seen within rock throwing distance of that person until the self driving squad car comes along and orders the rock thrower to get in so s/he can be driven back to her/his mother for a spanking.

 

To which I responded:

 

Rick

 

> Why would there be any empty automated cars driving around?

 

Because that is exactly what Tesla and Uber are promising — cars that drive themselves to pick up passengers, either as part of the Uber fleet of self-driving cars being developed by colleagues of mine at Uber’s Pittsburgh research center [1], or the “Tesla Network” of cars owned by Tesla customers [2]. Elon Musk has even shown a car driving itself around Palo Alto [3], and claims you’ll soon be able to “summon” your car from far away, perhaps even all the way across the country [4] — such is the hyperbole…

 

I’m not sure where you’ve been Rick, but renting out your own Tesla to earn extra money is *exactly* what Musk is talking about [2].

 

I’m also not sure what you think autonomous cars can do, Rick, but identifying a vandal wearing a mask (or even funky glasses [5]) and chasing a rock-throwing kid through the woods to corral him for the police *aren’t* among their capabilities. Self-driving cars won’t be able to do either of these things, at least until the robot uprising brought to you by humanoid robots from Google-owned Boston Dynamics [6] or by swarms of killer drones with Intel inside [7]. Until then, they’ll be hapless and helpless targets of human abuse, and hence not a viable part of anyone’s business model.

It continues to amaze me just how gullible and naive people are in thinking truly driverless cars will be roaming the roads anytime soon. This especially includes the developers/promoters of this technology, who should know better. Even the Secretary of the USDOT has drunk the Kool-Aid right along with the rest of them...

 

--Dean


Clever idea, Gordo,

 

Just like the blow-up autopilot from Airplane ☺:

 

[Animated GIF: the inflatable autopilot from the movie Airplane!]

 

I could be wrong, but I don't think people will be fooled for long. And given that people already throw rocks at the Google bus (literally and figuratively) with people inside, I don't think it will be much of a deterrent when people really start to get upset about economic disparity and technological unemployment - trends for which driverless cars will certainly become the poster child should they be deployed.

 

Look at the anger we're seeing now during Trump rallies, and imagine it directed towards "job-stealing driverless cars and trucks."

 

--Dean


[Admin Note: I've moved the post below by Gordo to a new thread on Dysfunctional US Politics where we can vent our frustrations over the societal death spiral we appear to be mired in. I hope Gordo won't mind. Please follow up with posts there. - DP]

 

 

Look at the anger we're seeing now during Trump rallies, and imagine it directed towards "job-stealing driverless cars and trucks."

 

--Dean

 

The anger at Trump rallies was as fake as those driverless car drivers will be:

https://youtu.be/5IuJGHuIkzY

Funny how this video had millions of views and the guy exposed in it resigned, and Hillary shut it all down, then the Trump rally violence completely stopped. No idea why Google reset the view count... haha, this election is such a joke. A buffoon vs. a serial criminal who will only avoid prosecution if she has enough friends in high places. Great choices we have!

