Autonomous Vehicles

Yes, but they are calling it AI; it's not AI by the scientific definition.

Who is calling it AI? Media? Playing to the uninformed?

We don't need AI, far from it.

The driving-the-car bit is simple; they did that on Scrapheap Challenge with bits of rubbish in a weekend. The difficulty comes from observing and interacting with the environment. There are so many variables that it will take time to cover them all; some will be missed, then found and fixed.
 
It happened to a few people afaik, but only one death. Plus all the other incidents, like the Google car that drove into a bus.

Humans won't all make the same mistakes when presented with the same situation. A computer script can't learn or think for itself (which is all current "AI" really is). That's why real AI is the missing piece to make it all work, but it could be a very long way off yet. They will struggle with it until we're there.


All the software in all the autonomous vehicles I work with day in and day out, from lots of manufacturers, learns as it goes.

NONE of them operates on a fixed "script". They all have algorithms that are constantly learning from their environment and rewriting themselves to allow for situations and instances they come across, so that the next time they see a similar situation they react differently or similarly, or come up with a completely new reaction that was not pre-programmed and has not occurred previously, depending upon which reaction will give the best outcome.

If that is not intelligence, I would like to know what your definition is.
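To make that concrete, here is a toy sketch of the kind of learn-as-you-go loop I mean, in the spirit of tabular Q-learning. The states, actions and reward are made up purely for illustration and are nothing like any manufacturer's actual stack:

```python
# Illustrative only: a toy "learn from experience" loop. The states, actions
# and reward below are hypothetical, not from any real AV software.
import random
from collections import defaultdict

ACTIONS = ["brake", "slow", "maintain", "change_lane"]
q_table = defaultdict(float)           # (state, action) -> learned value
alpha, gamma, epsilon = 0.1, 0.9, 0.1  # learning rate, discount, exploration

def choose_action(state):
    """Mostly exploit what has been learned, occasionally try something new."""
    if random.random() < epsilon:
        return random.choice(ACTIONS)
    return max(ACTIONS, key=lambda a: q_table[(state, a)])

def update(state, action, reward, next_state):
    """Adjust the stored value toward the observed outcome (the 'learning')."""
    best_next = max(q_table[(next_state, a)] for a in ACTIONS)
    q_table[(state, action)] += alpha * (reward + gamma * best_next - q_table[(state, action)])

# One simulated encounter: a situation is seen, acted on, and the outcome
# changes how the system reacts the next time it sees something similar.
state = "pedestrian_near_kerb"
action = choose_action(state)
reward = 1.0 if action in ("brake", "slow") else -1.0   # toy reward signal
update(state, action, reward, "clear_road")
```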
 
It happened to a few people afaik, but only one death. Plus all the other incidents, like the Google car that drove into a bus.

Humans won't all make the same mistakes when presented with the same situation. A computer script can't learn or think for itself (which is all current "AI" really is). That's why real AI is the missing piece to make it all work, but it could be a very long way off yet. They will struggle with it until we're there.

You're aware of machine learning, right? We're way beyond static scripts; machines are already learning for themselves.
 
Where are we going to put all those people who no longer have a job? You are talking about 600,000 jobs wiped out in the UK alone. Place them in the cab and you are not really gaining anything.

You may have noticed that autonomous vehicles are very much on the mind of the Chancellor of the Exchequer in his Budget this week.

The Guardian article cited below suggests in its title that it will be up to drivers to retrain and that "driverless cars will be on British roads by 2021". Retraining will be key for the jobs that this new economy creates.

Specifically, he told the BBC Today programme: "It will happen, I can promise you. It is happening already ... It is going to revolutionise our lives, it is going to revolutionise the way we work. And for some people this will be very challenging."

The Guardian article touches upon a number of controversies and issues likely to arise between here and there. Of course, the comments section of the article is by turns amusing, factual, factless and a little frightening.

https://www.theguardian.com/world/2...less-cars-by-2021-and-warns-people-to-retrain

The job loss and retraining need will be a worldwide phenomenon. In the link below, Goldman Sachs sees autonomous trucks eliminating 300,000 jobs per year in the US. Keep in mind that this is a huge issue, as the US economy currently creates approximately 200,000 jobs per month. Some believe autonomous trucks with platooning capabilities will arrive on roads ahead of autonomous cars.

https://www.usatoday.com/story/mone...st-u-s-economy-300-k-jobs-per-year/868027001/
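Putting the two figures quoted above into the same units (back-of-envelope only, using the numbers as cited):

```python
# Back-of-envelope comparison of the figures cited above (not a forecast).
trucking_jobs_lost_per_year = 300_000   # Goldman Sachs estimate, per the article
us_jobs_created_per_month = 200_000     # approximate current US job creation

lost_per_month = trucking_jobs_lost_per_year / 12
share_of_monthly_creation = lost_per_month / us_jobs_created_per_month

print(f"{lost_per_month:,.0f} trucking jobs lost per month")          # 25,000
print(f"= {share_of_monthly_creation:.0%} of monthly job creation")   # ~12%
```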
 
lol, 100% AV is coming. You don't need everything mapped; they have enough sensors on them. Same with weather conditions, construction, unusual roads. Tesla's machine learning is specifically capturing that data and learning at the moment. Several people have analysed what the car is sending back to Tesla, and in the last few months it's been capturing photos/videos and labelling them construction/confusing lane etc.
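Roughly, a capture-and-label loop like that might look like the sketch below. The labels, threshold and function names are my own invention for illustration, not Tesla's actual pipeline:

```python
# Hypothetical sketch of a fleet data-capture loop like the one described above.
# Labels, thresholds and names are invented; this is not Tesla's pipeline.
from dataclasses import dataclass

LABELS = ["construction", "confusing_lane", "unusual_road", "bad_weather"]

@dataclass
class Snapshot:
    image_id: str
    label: str
    confidence: float   # how sure the on-board model was about the scene

def should_upload(snapshot: Snapshot, threshold: float = 0.6) -> bool:
    """Send back the ambiguous cases -- those are the ones worth learning from."""
    return snapshot.label in LABELS and snapshot.confidence < threshold

captured = [
    Snapshot("frame_001", "construction", 0.45),
    Snapshot("frame_002", "clear_motorway", 0.97),
    Snapshot("frame_003", "confusing_lane", 0.30),
]

to_upload = [s for s in captured if should_upload(s)]
print([s.image_id for s in to_upload])   # ['frame_001', 'frame_003']
```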

I think people will be surprised by how quickly people get used to new technology. You just have to look at studies done on cars like Teslas: people go from unsure to trying to turn Autopilot back on when the car is saying it can't do it, in a matter of days.

I also think people will be surprised by how fast, and how many, people will gladly give up owning a car and all the ball-ache that comes with it (obviously not everyone, but that's not the point).
I would happily give up car ownership for a self-driving Uber experience as long as it's cost-effective. Seeing as cars spend what, 95% of the time not moving, I can't see how it wouldn't be cost-effective.

I can readily think of two groups that would welcome a safe autonomously driven vehicle experience:

1. People who are blind, visually or hearing impaired, or have Down syndrome. They form an important lobbying group in favour of approval in developed markets such as the United States. It is estimated there are 60 million people who are visually or hearing impaired. As early as 2015, Google demoed an autonomously driven vehicle for Steve Mahan, a blind American. There are many design issues involved in serving this demographic.

2. The aging population.

http://www.sunherald.com/news/nation-world/national/article186262348.html
 
There are other places in the country than London; you are just clutching at straws now. If you seriously think that car companies are going to create one equal car and everyone else will be happy to buy it, you are living in cloud cuckoo land.

Obviously, because I have no idea what you’re talking about.

The only person that’s mentioned anything about “equal cars” is you. I also have no idea what London has to do with motorways, which is what we’ve been discussing.

So, to summarize, I'll ask again: what do hard braking, acceleration and cornering have to do with driving on motorways at busy periods?

Most people now don’t do any of those on motorways, whether they’re in a micro or an AM. In fact that’s the whole point of motorways.

The AM driver may do it on empty minor roads now, however, where there is unlikely to be platooning of vehicles in the future. Ergo the point you're trying to make is broadly irrelevant, because the AM driver of the future is going to be driving similarly to today, except on the motorways; and when he doesn't want to drive he can just sit back and relax while the vehicle takes him to his destination.
 
That would be an absolute ballache of a system to maintain.

Tesla are aiming for (basically within the next year, and they have already shown it in their test cars) complete automation from the road edge to the garage. The basic part available now is called "Summon", which will pull the car in and out of your garage, but the eventual aim is to allow the garage to be away from the house. The vehicle drops you off at your door, then drives itself to the garage.
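As a toy illustration of that drop-off-then-park flow (the states and events below are invented for illustration, not Tesla's implementation):

```python
# Toy state machine for the drop-off-then-self-park flow described above.
# States and transitions are illustrative only, not Tesla's implementation.
TRANSITIONS = {
    ("driving", "arrive_at_door"): "dropping_off",
    ("dropping_off", "passenger_exits"): "driving_to_garage",
    ("driving_to_garage", "garage_reached"): "parking",
    ("parking", "parked_ok"): "idle",
    ("idle", "summon_requested"): "driving_to_owner",
}

def step(state: str, event: str) -> str:
    """Advance the vehicle's high-level state; ignore events that don't apply."""
    return TRANSITIONS.get((state, event), state)

state = "driving"
for event in ["arrive_at_door", "passenger_exits", "garage_reached", "parked_ok"]:
    state = step(state, event)
print(state)   # 'idle' -- parked in the remote garage, waiting to be summoned
```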

Basically companies are working on just this.

As for whether full automation will ever be achieved: I'm guessing it probably will, but it's 50 years or more away. The idea of a level 5 vehicle with no steering wheel or controls that can go anywhere is a pipe dream, for now. But level 4 vehicles that can go most places, most of the time, for most people are much closer. Those we'll be seeing in the next decade, IMO.
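For reference, the levels being thrown around in this thread, paraphrased from the SAE J3016 definitions (my wording as a summary, not the standard's exact text):

```python
# Quick reference for the SAE J3016 driving-automation levels discussed above
# (descriptions paraphrased, not the standard's exact wording).
from enum import IntEnum

class SAELevel(IntEnum):
    L0 = 0   # No automation: the human does everything
    L1 = 1   # Driver assistance: steering OR speed assisted (e.g. adaptive cruise)
    L2 = 2   # Partial automation: steering AND speed, human must stay alert (Autopilot)
    L3 = 3   # Conditional automation: car drives itself, human must take over on request
    L4 = 4   # High automation: no human needed, but only within a defined domain (geo-fenced)
    L5 = 5   # Full automation: goes anywhere, any conditions, no controls needed

print(SAELevel.L4 >= SAELevel.L3)   # True: the levels are ordered by capability
```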
 
This has been the same for the past 100 years. Why is it such a problem now? All those things have got better over time, so again, why is it not sustainable if we can sustain it now and everything is getting better?

The cost of motoring is not going to get any cheaper with autonomous cars either. The government will not allow it to happen. My current cost of motoring is more taxation than anything else. In fact, my VED is more expensive than my insurance now.

Pollution doesn't even have anything to do with autonomous cars either!

Ask yourself why they have become better.

Everything from regulation on speed and drinking, to mandatory safety features on and in vehicles, to the pushing of new technology (from "semi-automated" technology like ABS to speed cameras).

What has been the same for the last 100 years is a drive to make motor vehicles safer. When technology becomes available it is adopted. Automation is just another technology to be adopted to make the roads safer for all. It’s a continuation down the path we have been on for a century or more (if we consider the move from cart to car as a way of speeding up and making journeys more comfortable).

Cost-wise I agree. I honestly doubt it's going to be any cheaper in the long run. Governments still need to pay to build and maintain infrastructure, companies still need to build the vehicles and pay for R&D.

That said, it may be cheaper for some, especially if you don't need to own a vehicle in future and there are fewer vehicles on the road?
 
Where are we going to put all those people who no longer have a job? You are talking about 600,000 jobs wiped out in the UK alone. Place them in the cab and you are not really gaining anything.

Now that is the real question.

We’ve moved on from whether the technology is there, to legislation and now to the important question.

What is increased automation going to do to society? It's something governments are going to be grappling with for the next few decades. It's not just drivers, but factory workers and other manual and blue-collar jobs, perhaps even moving into the office world as well. How does society cope? Fewer hours for everyone? Job sharing? Retraining? Universal basic income? All of these and more are being looked at right now.
 
It happened to a few people afaik, but only one death. Plus all the other incidents, like the Google car that drove into a bus.

Humans won't all make the same mistakes when presented with the same situation. A computer script can't learn or think for itself (which is all current "AI" really is). That's why real AI is the missing piece to make it all work, but it could be a very long way off yet. They will struggle with it until we're there.

Well no, the human(s) did make the mistake, as the National Transportation Safety Board report points out.

There was a software "problem" with Tesla's Autopilot, which has now been fixed. BUT it was never meant to be used by someone who is not alert. The driver of the Tesla was at fault for not being aware of a large lorry on the road. It's level 2 technology after all (Tesla even call it a beta product).

The driver of the lorry was also at fault for crossing a main highway and not giving way to the Tesla in the first place. That lorry probably wouldn’t have been there if it was automated.
 
Yes, but they are calling it AI; it's not AI by the scientific definition.

It seems the preferred terms among researchers are machine learning and deep neural nets rather than AI.

From a NY Times article this week:

"Machine learning isn’t just one technique. It encompasses entire families of them, from “boosted decision trees,” which allow an algorithm to change the weighting it gives to each data point, to “random forests,” which average together many thousands of randomly generated decision trees. The sheer proliferation of different techniques, none of them obviously better than the others, can leave researchers flummoxed over which one to choose. Many of the most powerful are bafflingly opaque; others evade understanding because they involve an avalanche of statistical probability. It can be almost impossible to peek inside the box and see what, exactly, is happening."

And within machine learning, the further preferred term appears to be deep neural nets. From the same NY Times article this week:

"Deep neural nets.....are now the class of machine learning that seems most opaque. Just like old-fashioned neural nets, deep neural networks seek to draw a link between an input on one end (say, a picture from the internet) and an output on the other end (“This is a picture of a dog”). And just like those older neural nets, they consume all the examples you might give them, forming their own webs of inference that can then be applied to pictures they’ve never seen before. Deep neural nets remain a hotbed of research because they have produced some of the most breathtaking technological accomplishments of the last decade, from learning how to translate words with better-than-human accuracy to learning how to drive."

Yes, the last application: learning how to drive. So deep neural nets are actually teaching themselves, hence the so-called "black box."
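Here is a toy sketch of that "picture in, label out" idea in PyTorch; the sizes and data are invented, it's purely to show the shape of the thing, not a driving model:

```python
# Toy sketch of the "picture in, label out" mapping described in the quote,
# using PyTorch. Sizes and data are made up; this is illustrative only.
import torch
import torch.nn as nn

model = nn.Sequential(            # a small "deep" network: stacked layers
    nn.Flatten(),
    nn.Linear(3 * 32 * 32, 256),  # input: a 32x32 RGB image, flattened
    nn.ReLU(),
    nn.Linear(256, 64),
    nn.ReLU(),
    nn.Linear(64, 2),             # output: scores for two labels, e.g. dog / not-dog
)

images = torch.randn(8, 3, 32, 32)    # a fake batch of 8 "photos"
labels = torch.randint(0, 2, (8,))    # fake ground-truth labels

loss_fn = nn.CrossEntropyLoss()
optimizer = torch.optim.SGD(model.parameters(), lr=0.01)

# One training step: compare predictions to examples, adjust the weights.
optimizer.zero_grad()
loss = loss_fn(model(images), labels)
loss.backward()
optimizer.step()

print(model(images).argmax(dim=1))   # the network's current guesses
```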

I have read that Google engineers have spent several years creating new ways to visualise the inner workings of deep neural networks!
 
Now that is the real question.

We’ve moved on from whether the technology is there, to legislation and now to the important question.

What is increased automation going to do to society? It's something governments are going to be grappling with for the next few decades. It's not just drivers, but factory workers and other manual and blue-collar jobs, perhaps even moving into the office world as well. How does society cope? Fewer hours for everyone? Job sharing? Retraining? Universal basic income? All of these and more are being looked at right now.

You raise some intriguing questions about whether it is possible or desirable for regulators to really get out in front of the changes we are embracing as a result of technological breakthroughs.

As a businessman, I must say up front that I am not generally a fan of the EU or most regulators. In fact, the EU has spent the past decade developing its General Data Protection Regulation, which goes into effect in May 2018. It is a large, complex piece of legislation which starts from the following premise: the protection of personal data is a universal human right. Facebook, Google and Amazon will be affected by Article 21 of the Regulation, which gives users the ability to opt out of personally tailored ads. Article 22 confronts artificial intelligence head on: EU citizens can contest "legal or similarly significant" decisions made by algorithms and appeal for human intervention.

Taken together, Articles 21 and 22 introduce the idea that people are owed an understanding when they are faced with decisions made by machine learning. What will be the practical application of these regulations?
 
Tesla are aiming for (basically within the next year, and they have already shown it in their test cars) complete automation from the road edge to the garage. The basic part available now is called "Summon", which will pull the car in and out of your garage, but the eventual aim is to allow the garage to be away from the house. The vehicle drops you off at your door, then drives itself to the garage.

Basically companies are working on just this.

As for whether full automation will ever be achieved: I'm guessing it probably will, but it's 50 years or more away. The idea of a level 5 vehicle with no steering wheel or controls that can go anywhere is a pipe dream, for now. But level 4 vehicles that can go most places, most of the time, for most people are much closer. Those we'll be seeing in the next decade, IMO.


I think level 4 will be here sooner than a decade.

The first mass-produced level 3 cars will be on the roads next year: Audi A8s.

Ford's aim, according to their CEO at their investor day last September, is to have high-volume (in excess of 100,000 vehicles per year) production of a level 4 vehicle by 2022.

Technology is advancing at an exponential rate, so I am confident level 5 cars will be on our roads by around 2030.
 
I believe Ford's aim is for those vehicles to be for commercial use (taxis and shuttles). That's where the main focus seems to be at the moment regarding level 4: buses, taxis, lorries etc.

I guess in fleets they will be easier to control and make sure they aren’t used inappropriately, while gaining significant amounts of data to improve the performance when let loose on the public as a whole.

I was just reading that Singapore is aiming to have full-size (80-person) driverless buses in some areas by 2022, as an example. They're also pushing for platooning with only one driver for every three vehicles.
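The control side of platooning is conceptually simple: each follower holds a set time gap to the vehicle ahead. A toy version of that rule, with gains and distances I've made up for illustration:

```python
# Toy constant-time-gap follower control for platooning, as mentioned above.
# Gains and distances are invented for illustration, not from any real system.
def follower_accel(gap_m, v_leader, v_follower,
                   standstill_m=5.0, headway_s=0.6, k_gap=0.5, k_speed=1.0):
    """Accelerate/brake to hold a gap of standstill + headway * own speed."""
    desired_gap = standstill_m + headway_s * v_follower
    return k_gap * (gap_m - desired_gap) + k_speed * (v_leader - v_follower)

# Follower is 20 m behind a leader doing 25 m/s while it does 24 m/s:
print(follower_accel(gap_m=20.0, v_leader=25.0, v_follower=24.0))
# positive value -> speed up slightly to close the gap
```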

Edit:

No driver required. Thanks to Ford, that statement will be possible in 2021, the year that we will have a fully autonomous vehicle in commercial operation. To make this possible, we have partnered or invested with four different technology companies, along with doubling our Silicon Valley presence.

The effort to build fully autonomous vehicles by 2021 is a main pillar of Ford Smart Mobility: our plan to be a leader in autonomy, connectivity, mobility, customer experience and analytics. The vehicle will operate without a steering wheel, gas pedal or brake pedal within geo-fenced areas as part of a ride sharing or ride hailing experience. By doing this, the vehicle will be classified as a SAE Level 4 capable-vehicle, or one of High Automation that can complete all aspects of driving without a human driver to intervene.

https://corporate.ford.com/innovation/autonomous-2021.html

So unfortunately I don’t think they’ll be available to the general public for another few years.

All I want at the moment is a vehicle I can drive to the main road and set to "auto", then sit back for a couple of hours (probably sleeping) at 4 am. :D
 
There is a significant legal event starting soon in the world of autonomous vehicles: on 5 December, a lawsuit filed by Google's Waymo against Uber will begin to be argued before a civil court in California. Waymo claims Uber stole its trade secrets and is using them in its self-driving car technology. Separately, Google is suing a former executive, Anthony Levandowski, who secretly downloaded highly confidential files from Google servers before leaving Google to start work at Uber.

Waymo's main priority is to win a permanent injunction against Uber using any Waymo intellectual property, as well as financial damages.

Anyone else following developments in this lawsuit?
 
Won't neural implants and improved car radar sensors render the AI car unnecessary in the future?

I assume you are asking this question tongue-in-cheek? Neural implants? The self-driving vehicle is expected to reach the highest level of autonomy, called Level 5, in the near future. That means that not only is the human not involved in navigating the vehicle in a controlled space or lanes (Level 4), but the vehicle will be able to travel on any public road without any human involvement. Therefore the human can engage in a range of non-navigation activities (some of which will be quite productive and contribute to the growth of the economy), so there is no need for the human to have a neural implant in this scenario.

If, on the other hand, you expect the computer's artificial intelligence to exceed human intelligence (the so-called "singularity"), perhaps a neural implant into the human brain could be appropriate. If you want to read about this further, I suggest you Google the topic of the singularity and, in particular, Ray Kurzweil, such as in this recent note:

http://www.dailymail.co.uk/sciencet...create-super-humans-Google-expert-claims.html

On your second point about "car radar sensors", a Level 4 or Level 5 vehicle will likely have a range of hardware including radar, other sensors and LIDAR. These devices will be necessary in any Level 4 or Level 5 vehicle.
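Something like the sketch below is roughly what I mean by a sensor suite; the counts and ranges are placeholders I've invented, not any real vehicle's specification:

```python
# Hypothetical sensor-suite description for a Level 4/5 vehicle, as discussed
# above. Names, counts and ranges are invented placeholders, not a real spec.
from dataclasses import dataclass

@dataclass
class Sensor:
    kind: str       # "radar", "lidar", "camera", "ultrasonic"
    count: int
    range_m: float

SENSOR_SUITE = [
    Sensor("radar", 5, 160.0),       # long range, works in rain/fog
    Sensor("lidar", 1, 100.0),       # precise 3D point cloud of surroundings
    Sensor("camera", 8, 80.0),       # lane markings, signs, traffic lights
    Sensor("ultrasonic", 12, 8.0),   # close-range parking/manoeuvring
]

print(sum(s.count for s in SENSOR_SUITE), "sensors in total")
```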
 
Google's Waymo announces it has racked up 4 million self-driven miles

https://techcrunch.com/2017/11/27/waymo-racks-up-4-million-self-driven-miles/

Many believe the most important metric of building successful autonomous driving technology is actual miles driven on roads. Waymo continues to press its advantage.

Google began its self-driven miles in 2009, and it took six years to accumulate the first 1 million miles. Waymo went from 3 million miles to 4 million miles in only six months (from May to November 2017). And the 4 million miles stat excludes the 2.5 billion miles that Waymo has "self-driven" in its lab simulation.
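Putting rough monthly rates on those figures (back-of-envelope only, from the numbers above):

```python
# Rough monthly rates from the figures quoted above.
first_million_rate = 1_000_000 / (6 * 12)   # ~6 years for the first million miles
recent_rate = (4_000_000 - 3_000_000) / 6   # 3M -> 4M miles in ~6 months

print(f"early pace : {first_million_rate:,.0f} miles/month")    # ~13,900
print(f"recent pace: {recent_rate:,.0f} miles/month")           # ~166,700
print(f"speed-up   : {recent_rate / first_million_rate:.0f}x")  # ~12x
```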

Expect the launch of Waymo's commercial ride hailing service very soon.
 