
Self driving buses, trains, trucks etc


Comments

  • Registered Users, Registered Users 2 Posts: 20,110 ✭✭✭✭cnocbui


    That's so reassuring given their 'we know better than you, Mr stupid pilot' software just killed 189 people.

    How would an autonomous system even know of, let alone decide on, a diversion to a nearby airport due to an on-board medical emergency?

    It might happen for an air taxi as mentioned in the article, but even that I doubt very much. I usually avoid absolutes like saying 'never', but this is one case where I think never would apply to autonomous commercial flights carrying large numbers of passengers.


  • Closed Accounts Posts: 1,452 ✭✭✭Twenty Grand


    cnocbui wrote: »
    That's so reassuring given their 'we know better than you, Mr stupid pilot' software just killed 189 people.
    .

    You might want to call Air Crash Investigates with your information, because there's no evidence so far of a software fault.
    The last report is that it was a faulty sensor that gave incorrect information to pilots who were inadequately trained.


  • Registered Users, Registered Users 2 Posts: 20,110 ✭✭✭✭cnocbui


    You might want to call Air Crash Investigates with your information, because there's no evidence so far of a software fault.
    The last report is that it was a faulty sensor that gave incorrect information to pilots who were inadequately trained.

    And you might want to stow your ill-informed sarcasm. There is a desperate attempt on Boeing's part to paint this as a maintenance failure by the airline, because of the damages they are potentially facing, not to mention the reputational damage and the considerable potential sales damage, which is likely in the many hundreds of millions.

    The sensor looks to have mis-reported the plane's angle of attack, causing the computers to suddenly and severely nose the plane down in order to prevent the imminent stall it thought was about to occur, due to the wrong angle of attack being reported by the sensor.
    If the sensor fails to send correct information, it can confuse both the plane's computer and its pilots, causing an aircraft to take a sudden dive.
    And from the moment they retracted the wing flaps at about 3,000 feet, the two pilots struggled -- in a 10-minute tug of war -- against a new anti-stall flight-control system that relentlessly pushed the jet's nose down 26 times before they lost control.

    Though the pilots responded to each nose-down movement by pulling the nose up again, mysteriously they didn't do what the pilots on the previous day's flight had done: simply switched off that flight-control system.

    So, in the absence of human pilots trying to override the computer, a completely AI system, given the malfunctioning sensor, would have driven the plane into the ground on the first of the 26 nose downs.

    There appears to have been pilot error in not disengaging the autopilot after it originally malfunctioned, but perhaps there is some reason they didn't/couldn't that will emerge.
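The failure mode described in the posts above can be sketched as a toy control loop. To be clear, this is purely an illustration of the *kind* of loop being discussed: the thresholds, function names, and sensor offset below are invented, not Boeing's actual MCAS logic.

```python
import math

# Toy illustration of how a faulty angle-of-attack (AoA) sensor can drive an
# automated anti-stall system into repeated nose-down commands. All numbers
# and behaviour here are invented for illustration only.

STALL_AOA_DEG = 15.0  # assumed stall threshold for this sketch

def faulty_aoa_sensor(true_aoa_deg: float) -> float:
    """Simulate a miscalibrated sensor that over-reads by 20 degrees."""
    return true_aoa_deg + 20.0

def anti_stall_command(reported_aoa_deg: float, system_enabled: bool) -> str:
    """Command nose-down whenever the *reported* AoA exceeds the threshold."""
    if system_enabled and reported_aoa_deg > STALL_AOA_DEG:
        return "NOSE_DOWN"
    return "NEUTRAL"

# Level flight at a safe 3-degree AoA: the bad sensor reports 23 degrees,
# so the system commands nose-down on every single cycle until switched off.
commands = [anti_stall_command(faulty_aoa_sensor(3.0), system_enabled=True)
            for _ in range(5)]
print(commands)  # ['NOSE_DOWN', 'NOSE_DOWN', 'NOSE_DOWN', 'NOSE_DOWN', 'NOSE_DOWN']

# Disabling the system (what the previous day's crew reportedly did) stops it.
print(anti_stall_command(faulty_aoa_sensor(3.0), system_enabled=False))  # NEUTRAL
```

The point the sketch makes is that the automation never doubts its input: with no cross-check against a second sensor, a single bad reading is repeated indefinitely until a human intervenes.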


  • Closed Accounts Posts: 1,452 ✭✭✭Twenty Grand


    cnocbui wrote: »
    And you might want to stow your ill-informed sarcasm. There is a desperate attempt on Boeing's part to paint this as a maintenance failure by the airline, because of the damages they are potentially facing, not to mention the reputational damage and the considerable potential sales damage, which is likely in the many hundreds of millions.

    There appears to have been pilot error in not disengaging the autopilot after it originally malfunctioned, but perhaps there is some reason they didn't/couldn't that will emerge.
    My sarcasm is pointing out the incredible jumps to conclusions you're making based on the scant evidence available.

    If I had to put money on it, I'd place the blame on the aircraft mechanics who fitted the sensor, as the plane also had malfunctioning airspeed sensors on previous flights that were not fixed for some reason.

    Time will tell anyways.


  • Registered Users, Registered Users 2 Posts: 20,110 ✭✭✭✭cnocbui


    My sarcasm is pointing out the incredible jumps to conclusions you're making based on the scant evidence available.

    If I had to put money on it, I'd place the blame on the aircraft mechanics who fitted the sensor, as the plane also had malfunctioning airspeed sensors on previous flights that were not fixed for some reason.

    Time will tell anyways.

    I'm not jumping to conclusions, I am going on a commercial-aviation-oriented site's reporting of what the flight data recorders revealed. These are facts, not conjecture. I even provided a direct quote, which seems to have slipped your attention: https://www.aviationpros.com/news/12437907/lion-air-pilots-struggled-against-boeings-737-max-flight-control-system-black-box-data-shows

    The relevant issue here is that an AI flight control system relies wholly on what the various flight attitude sensors report to it. In this case its response was to kill 189 people. The human pilots were fighting the system to counteract its complete failure. Had they disconnected the autonomous system, they could have saved the aircraft and everyone on it. With no human pilots to discern that the AI was malfunctioning, the plane would have crashed far sooner.

    The failed sensors and perhaps faulty maintenance highlight how dangerous wholly AI flight control systems would be, because maintenance glitches and faulty sensors happen, which is only one of many reasons why completely autonomous flight control systems in charge of commercial passenger aircraft is a bad idea.


  • Posts: 31,118 ✭✭✭✭ [Deleted User]


    Which does of course beg the question, "who or what is likely to make more mistakes, a human or an AI device?"
    In the case of the air crash it appears the AI failure was compounded by the human error of not switching off the autopilot.


  • Registered Users, Registered Users 2 Posts: 20,110 ✭✭✭✭cnocbui


    Which does of course beg the question, "who or what is likely to make more mistakes, a human or an AI device?"
    In the case of the air crash it appears the AI failure was compounded by the human error of not switching off the autopilot.

    The pilots didn't 'compound' the AI failure - make the situation worse - they failed to mitigate it. Without the pilots there wouldn't have been any attempt at mitigation.


  • Posts: 31,118 ✭✭✭✭ [Deleted User]


    cnocbui wrote: »
    The pilots didn't 'compound' the AI failure - make the situation worse - they failed to mitigate it. Without the pilots there wouldn't have been any attempt at mitigation.
    You're being oversensitive about the words used!!!
    There were two failures, one by the AI and one by the pilots. Does it matter if the word was 'compound' rather than 'mitigate'? The end result is the same.


  • Closed Accounts Posts: 1,452 ✭✭✭Twenty Grand


    From the same link.
    Note that most of the report you linked blames human error, either the pilots or maintenance. It does say there are "potential" design flaws, but more with there being no backup than with the actual sensor or software.

    The data points to three factors that seem to have contributed to the disaster:

    * A potential design flaw in Boeing's new anti-stall addition to the MAX's flight-control system and a lack of communication to airlines about the system.

    * The baffling failure of the Lion Air pilots to recognize what was happening and execute a standard procedure to shut off the faulty system.

    * And a Lion Air maintenance shortfall that allowed the plane to fly repeatedly without fixing the key sensor that was feeding false information to the flight computer on previous flights.


    Their failure to shut off the automated tail movements is baffling.

    "No one would expect a pilot to sit there and play tag with the system 25 times" before the system won out, said Lemme. "This airplane should not have crashed. There are human factors involved."

    Lion Air has a very poor safety record and has been accused of skimping on maintenance to cut costs.

    One would imagine the reason for no redundancy is that there are pilots to take control if anything happens.

    The pilots failed here.


  • Moderators, Category Moderators, Arts Moderators, Sports Moderators Posts: 50,181 CMod ✭✭✭✭magicbastarder


    Bikes put spanner in works of Dutch driverless car schemes
    Report highlights problems bicycles cause to self-driving cars’ detection systems
    https://www.theguardian.com/world/2019/feb/13/bikes-put-spanner-in-works-of-dutch-driverless-car-schemes


  • Posts: 31,118 ✭✭✭✭ [Deleted User]


    Bikes put spanner in works of Dutch driverless car schemes
    Report highlights problems bicycles cause to self-driving cars’ detection systems
    https://www.theguardian.com/world/2019/feb/13/bikes-put-spanner-in-works-of-dutch-driverless-car-schemes

    I think that we're many years away from trusting autonomous vehicles on a shared use road, on motorways or other powered vehicles only roads, then only a few years.


  • Registered Users, Registered Users 2 Posts: 15,107 ✭✭✭✭loyatemu




  • Closed Accounts Posts: 2,891 ✭✭✭prinzeugen


    loyatemu wrote: »

    They are not trackless though. Yes there is no "track" as in steel rails and sleepers, but it still only runs on a predetermined path.

    A track in other words! I would love to know how these things would behave on icy tarmac. I imagine they would be jackknifing all over the place.


  • Posts: 31,118 ✭✭✭✭ [Deleted User]


    loyatemu wrote: »
    prinzeugen wrote: »
    They are not trackless though. Yes there is no "track" as in steel rails and sleepers, but it still only runs on a predetermined path.

    A track in other words! I would love to know how these things would behave on icy tarmac. I imagine they would be jackknifing all over the place.
    Sounds like a great system for cities that have a large sprawl that would be prohibitively expensive to service using a conventional tram system, but unless a huge amount of investment is made in ensuring the road where it runs is kept in near perfect condition, the ride would soon deteriorate to that of a bus.


  • Registered Users, Registered Users 2 Posts: 29,452 ✭✭✭✭AndrewJRenko


    Bikes put spanner in works of Dutch driverless car schemes
    Report highlights problems bicycles cause to self-driving cars’ detection systems
    https://www.theguardian.com/world/2019/feb/13/bikes-put-spanner-in-works-of-dutch-driverless-car-schemes

    Isn't it funny how the bikes put the spanner in, and not the idiot designers who failed to design for bikes. In the Netherlands.


  • Registered Users, Registered Users 2 Posts: 32,136 ✭✭✭✭is_that_so


    Isn't it funny how the bikes put the spanner in, and not the idiot designers who failed to design for bikes. In the Netherlands.

    It's not an idiot designer issue. Cyclists, like pedestrians, are an unpredictable variable, hence the suggestions in the article to explore larger road options first.


  • Registered Users, Registered Users 2 Posts: 29,452 ✭✭✭✭AndrewJRenko


    is_that_so wrote: »
    It's not an idiot designer issue. Cyclists, like pedestrians, are an unpredictable variable hence the suggestions in the article to explore larger road options first.

    Cars are fairly unpredictable too, in my experience.


  • Registered Users, Registered Users 2 Posts: 20,110 ✭✭✭✭cnocbui


    Isn't it funny how the bikes put the spanner in, and not the idiot designers who failed to design for bikes. In the Netherlands.

    I don't think it's so much the bikes putting a spanner in it so much as the complexity and skill set required to do the task of driving has been grossly underestimated and that the general confidence in the ability to create systems that can do the task has been severely misplaced.

    You can't design for bikes because the task of driving, IMO, requires general intelligence, so it's a task that is too complex for our current level of technology.


  • Posts: 31,118 ✭✭✭✭ [Deleted User]


    is_that_so wrote: »
    It's not an idiot designer issue. Cyclists, like pedestrians, are an unpredictable variable hence the suggestions in the article to explore larger road options first.
    It is these variables that will ultimately stop autonomous vehicles from ever using mixed-use routes.


  • Posts: 31,118 ✭✭✭✭ [Deleted User]


    Cars are fairly unpredictable too, in my experience.
    True, but an order of magnitude less so than pedestrians: cars generally don't run across pedestrian crossings, cycle off the footpath in front of you, or perform a thousand other such unpredictable actions.


  • Posts: 31,118 ✭✭✭✭ [Deleted User]


    cnocbui wrote: »
    I don't think it's so much the bikes putting a spanner in it so much as the complexity and skill set required to do the task of driving has been grossly underestimated and that the general confidence in the ability to create systems that can do the task has been severely misplaced.

    You can't design for bikes because the task of driving, IMO, requires general intelligence, so it's a task that is too complex for our current level of technology.
    For mixed use routes, I would agree, motorways on the other hand require a much lower level of "intelligence" to programme an autonomous vehicle to drive safely.


  • Registered Users, Registered Users 2 Posts: 20,110 ✭✭✭✭cnocbui


    I am just stunned by how much people are paying extra for the advanced autopilot package in Teslas. They will never get an autonomous driving software update that delivers what they seem to think they will get. It's 2019 and Musk still hasn't had a Tesla drive coast to coast. It may be doable in a carefully orchestrated one-off, but the sales pitch of reclining your seat and snoozing most of the way, or playing with a laptop or phone, is never going to be achievable in the lifetime of their cars.

    If you have to have your hands on the steering wheel and be paying close attention, you might as well be the one moving the steering wheel.


  • Registered Users, Registered Users 2 Posts: 29,452 ✭✭✭✭AndrewJRenko


    cnocbui wrote: »
    I don't think it's so much the bikes putting a spanner in it so much as the complexity and skill set required to do the task of driving has been grossly underestimated and that the general confidence in the ability to create systems that can do the task has been severely misplaced.

    You can't design for bikes because the task of driving, IMO, requires general intelligence, so it's a task that is too complex for our current level of technology.


    I'm not seeing why bikes would be a particular challenge, relative to other cars or pedestrians.

    True, but an order of magnitude less so than pedestrians: cars generally don't run across pedestrian crossings, cycle off the footpath in front of you, or perform a thousand other such unpredictable actions.


    No, but they do tend to make fast, dangerous manoeuvres without looking or indicating while scrolling through Instagram. Why would there be a particular difficulty with cyclists, relative to motorists and pedestrians?


  • Closed Accounts Posts: 1,452 ✭✭✭Twenty Grand


    No, but they do tend to make fast, dangerous manoeuvres without looking or indicating while scrolling through Instagram. Why would there be a particular difficulty with cyclists, relative to motorists and pedestrians?

    Anything to do with physical size, quick manoeuvres, small turning circle, quick acceleration?

    I don't know much about it, but I would have thought bikes were more likely to make extreme manoeuvres, like a sharp 90-degree turn for instance, where a car can't make such a turn safely.


  • Posts: 31,118 ✭✭✭✭ [Deleted User]


    I'm not seeing why bikes would be a particular challenge, relative to other cars or pedestrians.





    No, but they do tend to make fast, dangerous manoeuvres without looking or indicating while scrolling through Instagram. Why would there be a particular difficulty with cyclists, relative to motorists and pedestrians?
    The active sensors on the autonomous vehicles need to be far more sensitive to detect and identify pedestrians & cyclists than ones that only need to identify other vehicles.

    It's asking too much of the technology to make split-second decisions and then change the direction of something that's probably moving at 100 km/h and weighing 1-2 tonnes, based on the interpretation of a small object moving across the road 100 metres ahead.

    It could be a crisp packet or a cyclist; if it's a cyclist, will it be safely across before you reach it? These thought processes are natural to a human, not to a computer. So my belief that autonomous vehicles will be restricted to motorways (with active transponders) for the foreseeable future is technically sound, based on the abilities of AI today.

    In reality, the roads would need compatible transponders that communicate the local topology to the vehicles' onboard computers, as opposed to relying on the vehicle's sensors to make sense of the world around them.


  • Registered Users, Registered Users 2 Posts: 29,452 ✭✭✭✭AndrewJRenko


    Anything to do with physical size, quick manoeuvres, small turning circle, quick acceleration?

    I don't know much about it, but I would have thought bikes were more likely to make extreme manoeuvres, like a sharp 90-degree turn for instance, where a car can't make such a turn safely.
    Maybe, but I'd have thought that cyclists would largely fall somewhere between motorists and pedestrians on all of those attributes, so I'm wondering why they are set up as 'the problem'.


  • Posts: 31,118 ✭✭✭✭ [Deleted User]


    Maybe, but I'd have thought that cyclists would largely fall somewhere between motorists and pedestrians on all of those attributes, so I'm wondering why they are set up as 'the problem'.
    The "problem" is for the system to keep track of multiple small objects that are detected by the sensors, identify them, predict their future motion and whether your presence and direction of motion will influence their future movements and how these movements will influence the movement of vehicles around you.

    These are things humans are able to do almost without thinking.
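The tracking-and-prediction task described above can be illustrated with a toy constant-velocity model. This is only a sketch of the *shape* of the problem; real perception stacks are vastly more sophisticated, and every name and number here is invented.

```python
from dataclasses import dataclass

@dataclass
class Track:
    """A tracked object's state: position (metres) and velocity (m/s)."""
    x: float   # distance ahead along the road
    y: float   # lateral offset from our lane centre
    vx: float
    vy: float

def predict(track: Track, dt: float) -> Track:
    """Extrapolate position assuming constant velocity over dt seconds."""
    return Track(track.x + track.vx * dt, track.y + track.vy * dt,
                 track.vx, track.vy)

def will_cross_path(track: Track, lane_y: tuple, horizon_s: float) -> bool:
    """Check whether the object enters our lane (a y-interval) in the horizon."""
    steps = 10
    for i in range(1, steps + 1):
        p = predict(track, horizon_s * i / steps)
        if lane_y[0] <= p.y <= lane_y[1]:
            return True
    return False

# A cyclist 30 m ahead and 2 m to our right, drifting left at 1 m/s,
# enters our lane (y between -1 and 1) within a 3-second horizon.
cyclist = Track(x=30.0, y=2.0, vx=-4.0, vy=-1.0)
print(will_cross_path(cyclist, lane_y=(-1.0, 1.0), horizon_s=3.0))  # True
```

Even this crude version has to run for every detected object, every frame, with noisy inputs; the hard part the post identifies is that pedestrians and cyclists do not actually move at constant velocity, which is exactly where simple models break down.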


  • Registered Users, Registered Users 2 Posts: 20,110 ✭✭✭✭cnocbui


    I'm not seeing why bikes would be a particular challenge, relative to other cars or pedestrians.

    Try designing and making sensors and software to detect them and differentiate them from background clutter. They tend to be close to the kerb making that task especially hard. Is it a kid riding down a footpath or is it on the road? Is it one bike, two, three? Is one faster than the other and likely to imminently pull out and go around the others? Bicycles and pedestrians in general are very hard to detect, classify and predict for.


  • Posts: 0 [Deleted User]


    Great to see the ostriches alive and well on this thread


  • Posts: 31,118 ✭✭✭✭ [Deleted User]


    Great to see the ostriches alive and well on this thread
    Depends on what you mean by ostriches?

    Are they those who believe that technology will overcome all, or the more realistic thinkers who see the utilisation of autonomous vehicles being limited to controlled environments?


  • Registered Users, Registered Users 2 Posts: 20,110 ✭✭✭✭cnocbui


    Great to see the ostriches alive and well on this thread

    Great to see the lemmings are as oblivious to reality as Teslas are to large stationary things in front of them...

    Totaled-Tesla.jpg
    I walked away from this crash last night and am so thankful for the safety of Tesla cars. However, I am a little disappointed to learn that its autopilot feature is unable to identify a stationary construction vehicle.
    The person is a Professor of Medicine.


  • Registered Users Posts: 3,522 ✭✭✭paleoperson



    "And then I said:

    'Your cars will be driving by themselves in 2020, and Space X will be taking people on holidays to Mars by 2025!'"


    MW-CZ625_elonmu_20141119142911_ZH.jpg


  • Closed Accounts Posts: 2,891 ✭✭✭prinzeugen




  • Moderators, Education Moderators, Technology & Internet Moderators Posts: 35,100 Mod ✭✭✭✭AlmightyCushion


    prinzeugen wrote: »

    Just like human drivers can't drive in all conditions.


  • Closed Accounts Posts: 2,891 ✭✭✭prinzeugen


    Just like human drivers can't drive in all conditions.

    Exactly. All these claims that AI and self driving trains, cars etc would be safer is bollox.

    And they are programmed by humans. The X-Files did a good episode on it.


  • Moderators, Education Moderators, Technology & Internet Moderators Posts: 35,100 Mod ✭✭✭✭AlmightyCushion


    prinzeugen wrote: »
    Exactly. All these claims that AI and self driving trains, cars etc would be safer is bollox.

    And they are programmed by humans. The X-Files did a good episode on it.

    The point I was making is that self driving cars won't be able to drive in all conditions but neither can humans. That doesn't mean self driving cars will fail or that they won't be safer than human drivers.

    Being programmed by humans also doesn't mean it can't outperform a human. AlphaGo was programmed by humans and it has beaten the best humans at Go. Watson was programmed by humans and it beat the best Jeopardy! contestants.


  • Registered Users, Registered Users 2 Posts: 20,110 ✭✭✭✭cnocbui


    AlphaGo used neural networks to program itself, so it wasn't really programmed by humans. One major problem with this approach is that humans can't troubleshoot such systems if things go wrong, as there is no way to know how they work, let alone what went wrong. It's called the black box problem. If you deploy autonomous vehicles that were neural-network trained, and then mysteriously some of them start causing fatal crashes, you can't troubleshoot the code to try and find out what went wrong, as humans didn't write the code, or more correctly, craft the algorithm. Probably unlikely, but it is worth keeping in mind.
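The black-box point can be illustrated with even a tiny network: every weight is fully visible, yet none of them reads as a rule a human could step through when debugging. The weights below are arbitrary toy values, not a trained model.

```python
import math

# A 2-input, 3-hidden-unit, 1-output network. In a real learned system these
# numbers come from training, not from any programmer's reasoning, so there is
# no "line of code" expressing the decision rule to inspect or fix.
W1 = [[0.9, -1.3], [0.4, 2.1], [-0.7, 0.8]]
B1 = [0.1, -0.2, 0.05]
W2 = [1.5, -0.6, 0.9]
B2 = 0.0

def forward(x):
    """Compute the network's single output for a 2-element input."""
    hidden = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + b)
              for row, b in zip(W1, B1)]
    return sum(w * h for w, h in zip(W2, hidden)) + B2

# The output varies with the input, but asking *why* it produced a given value
# has no answer beyond "that is what the weights compute" - which is the crux
# of the troubleshooting problem described above.
print(forward([0.5, 0.2]))
```

Scale this up to millions of weights and the post's point follows: there is no algorithm in human-readable form to audit after a failure, only the weights themselves.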


  • Registered Users Posts: 3,522 ✭✭✭paleoperson


    The point I was making is that self driving cars won't be able to drive in all conditions but neither can humans.

    No, they are clearly talking about conditions that are driveable by humans but not by AI. They are not talking about conditions that are undriveable by humans.
    That doesn't mean self driving cars will fail or that they won't be safer than human drivers.

    Nobody suggested self driving cars could never be safer than human drivers in any specified conditions. I don't mean to come across as impolite but you're really not making any point here.


  • Posts: 31,118 ✭✭✭✭ [Deleted User]


    prinzeugen wrote: »
    Exactly. All these claims that AI and self driving trains, cars etc would be safer is bollox.

    And they are programmed by humans. The X-Files did a good episode on it.
    Self-driving trains are real and there are several driverless rail networks around the world. In controlled environments, where the "system" is separated from humans, autonomous systems are perfectly safe.

    Flying is another form of transport that can be automated without too much difficulty, the pilot is totally dependent on the instruments anyway.


  • Posts: 31,118 ✭✭✭✭ [Deleted User]


    prinzeugen wrote: »
    Classic Daily Mail type of story, all about the locals wanting to scupper the vehicles.


    But in the middle there's this:
    'Autonomy always will have some constraints,' said John Krafcik, head of the self-driving car unit of Google parent company Alphabet.

    Police in Arizona have recorded 21 incidents in the past two years concerning vigilante citizens who have cast rocks, pointed guns at and slashed the tires of Waymo's autonomous vans

    'It's really, really hard,' Krafcik said.
    'You don't know what you don't know until you're actually in there and trying to do things.'
    While self driving cars may not become widespread soon, Krafcik said trucking is one area where self-driving vehicles could appear, due to a shortage of drivers.
    'The trucking shortage is now,' Krafcik said.
    'Moving goods on freeways to hub to hub is fairly straightforward.'


    As I have said multiple times before, on motorways autonomous vehicles will work.


  • Moderators, Category Moderators, Arts Moderators, Sports Moderators Posts: 50,181 CMod ✭✭✭✭magicbastarder


    cnocbui wrote: »
    The person is a Professor of Medicine.
    a fantastic expert to speak about self driving cars!


  • Registered Users, Registered Users 2 Posts: 14,863 ✭✭✭✭markodaly


    Self-driving cars won't be able to drive in all conditions or environments. Just like today, you can't drive a car up the Slieve Bloom mountains. However, the vast majority of driving can be automated. People drive to work/school/shops and home. This is easy to automate in the next 5-10 years.


  • Moderators, Education Moderators, Technology & Internet Moderators Posts: 35,100 Mod ✭✭✭✭AlmightyCushion


    No, they are clearly talking about conditions that are driveable in by humans but not by AI. They are not talking about conditions that are undriveable in by humans.

    John Krafcik, the person quoted in the article, said afterwards that that was what he was talking about.
    Nobody suggested self driving cars could never be safer than human drivers in any specified conditions. I don't mean to come across as impolite but you're really not making any point here.

    prinzeugen said it in the post I was quoting.
    Exactly. All these claims that AI and self driving trains, cars etc would be safer is bollox.


  • Registered Users, Registered Users 2 Posts: 20,110 ✭✭✭✭cnocbui


    Self-driving trains are real and there are several driverless rail networks around the world. In controlled environments, where the "system" is separated from humans, autonomous systems are perfectly safe.

    Flying is another form of transport that can be automated without too much difficulty, the pilot is totally dependent on the instruments anyway.

    What a load of BS. Flying is another area where only general intelligence can possibly handle exceptions and malfunctions. You can't just apply the brake, pull over and turn on the hazards at 10,000m. The Lion Air disaster was caused by the automated system pushing the plane into the ground. People have been critical of the pilots for NOT turning the automated system off!

    I have not the slightest fear of flying, I have flown planes, but even the prospect of an AI pilot terrifies me.


  • Registered Users, Registered Users 2 Posts: 20,110 ✭✭✭✭cnocbui


    a fantastic expert to speak about self driving cars!

    What do you mean?


  • Registered Users Posts: 3,522 ✭✭✭paleoperson


    Classic Daily Mail type of story, all about the locals wanting to scupper the vehicles.

    Bringing up something notable and interesting as a small point of the article? :confused:
    John Krafcik, the person quoted in the article, said after wards that was what he was talking about.

    Really, about conditions so bad humans wouldn't be able to drive either? Can you show me where he said that please?

    You realize what the weather is like in Phoenix, Arizona, where his operation is based, right? Not exactly what you see on Ice Road Truckers.

    Clearly someone is getting the situation totally wrong here and stating that to others, either Daily Mail or you, who is full of ****. Maybe have a think about that when you realize the answer instead of going around saying things like "daily fail LOL".
    prinzeugen said it in the post I was quoting.

    I felt like he was saying it as a general rule but alright. I hope you notice also that the "AI" it takes to ride on the freeway is as simple as "if object in front of you going slower check if object near you in fast lane, change lanes if possible"... that type of thing - it can barely be termed "AI" at all. They also don't use optics or anything like that for the location of the vehicles, it's a glorified radar system.
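The rule described in that last paragraph ("if the object in front is going slower, check the fast lane, change lanes if possible") can be written out almost verbatim. This is a deliberately bare sketch of the poster's characterisation; names and thresholds are invented, and real highway autopilots layer many more checks on top.

```python
# Minimal sketch of the rule-based freeway logic described above. The gap
# threshold and function names are invented for illustration only.

def choose_action(lead_gap_m: float, lead_speed: float, own_speed: float,
                  fast_lane_clear: bool) -> str:
    """Decide between keeping the lane, changing lane, or slowing down."""
    SAFE_GAP_M = 50.0  # assumed following-distance threshold
    if lead_gap_m < SAFE_GAP_M and lead_speed < own_speed:
        # Object in front is slower and too close: overtake if we can.
        return "CHANGE_LANE" if fast_lane_clear else "SLOW_DOWN"
    return "KEEP_LANE"

print(choose_action(40.0, 25.0, 30.0, fast_lane_clear=True))   # CHANGE_LANE
print(choose_action(40.0, 25.0, 30.0, fast_lane_clear=False))  # SLOW_DOWN
print(choose_action(80.0, 25.0, 30.0, fast_lane_clear=True))   # KEEP_LANE
```

The simplicity is the point being made: on a limited-access motorway the decision space collapses to a handful of cases like these, which is why the thread keeps returning to motorways as the tractable deployment.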


  • Moderators, Category Moderators, Arts Moderators, Sports Moderators Posts: 50,181 CMod ✭✭✭✭magicbastarder


    cnocbui wrote: »
    What do you mean?
    in that it didn't take a professor of medicine to cop that the tesla failed to detect a truck. i.e. his job doesn't add weight to that observation.


  • Moderators, Education Moderators, Technology & Internet Moderators Posts: 35,100 Mod ✭✭✭✭AlmightyCushion


    Bringing up something notable and interesting as a small point of the article? :confused:



    Really, about conditions so bad humans wouldn't be able to drive either? Can you show me where he said that please?

    You realize what the weather is like in Phoenix, Arizona, where his operation is based, right? Not exactly what you see on Ice Road Truckers.

    Clearly someone is getting the situation totally wrong here and stating that to others, either Daily Mail or you, who is full of ****. Maybe have a think about that when you realize the answer instead of going around saying things like "daily fail LOL".



    I felt like he was saying it as a general rule but alright. I hope you notice also that the "AI" it takes to ride on the freeway is as simple as "if object in front of you going slower check if object near you in fast lane, change lanes if possible"... that type of thing - it can barely be termed "AI" at all. They also don't use optics or anything like that for the location of the vehicles, it's a glorified radar system.

    Maybe you're thinking of someone else but I didn't say anything negative about the daily mail here.

    https://twitter.com/johnkrafcik/status/1063281250088042496?s=21
    Yeah some context missing here, but it did make for a fun headline. 🤪 (I said the same thing about my own driving.) The point is that autonomous driving, like human driving, will always have constraints.

    That tweet was actually in response to a cnet article where the headline was:
    It'll be decades before autonomous cars are widespread on the roads -- and even then, they won't be able to drive themselves in certain conditions - John Krafcik CEO of Waymo


  • Registered Users, Registered Users 2 Posts: 20,110 ✭✭✭✭cnocbui


    in that it didn't take a professor of medicine to cop that the tesla failed to detect a truck. i.e. his job doesn't add weight to that observation.

    It was his car that was totalled, I thought it added a bit of credibility in case anyone were to suggest the reported incident was fake.


  • Registered Users Posts: 3,522 ✭✭✭paleoperson


    https://twitter.com/johnkrafcik/status/1063281250088042496?s=21

    That tweet was actually in response to a cnet article where the headline was:
    It'll be decades before autonomous cars are widespread on the roads -- and even then, they won't be able to drive themselves in certain conditions - John Krafcik CEO of Waymo

    He seems to be trying to walk back his remarks a bit also. The point remains that in his original comment he stated they won't be able to drive themselves in certain conditions. Now I'm sure he didn't mean extreme weather conditions where the car is physically unable to move, right? I think it's at least fair to say that AI will never be driving in icy weather.

