Oooh, you wanted more robots and AI going wrong - I don't blame you, it's an interesting topic.
Personally, I'm a big believer in robots and AI - I think they're going to make life
better for everyone - but there's no denying that there are still some serious flaws out
there.
Developers only learn these through hard lessons - and these stories look like some of the
hardest.
My name is Danny Burke and this is the Top 10 Scary Robots That Lost Control - Part 2.
Starting off at number 10 we have Little Fatty.
That's the name of the robot that made headlines in China when it attacked a display booth
and injured a young boy.
The robot was designed to interact with children aged 4 to 12 and display facial emotions on
its screen.
Sounds harmless enough, right?
Well, yeah - but it didn't go to plan.
The robot started ramming into a booth, sending shards of glass flying everywhere.
The boy suffered cuts and was transported to the hospital in an ambulance.
Fortunately, the damage was minimal and he only needed a few stitches.
One of the fair's organizers said human error was to blame for the incident - apparently
the operator of the robot hit the forward button instead of reverse.
Here's the weirdest part though - witnesses reported that the robot appeared to be frowning
during the incident.
That's creepy.
Next up at number 9 we have Alexa's Party.
We're gonna move on to a pretty funny and bizarre one.
In November 2017, an Alexa device was blamed for holding a party in an apartment while
its owner wasn't there.
Oliver Haberstroh claimed his Amazon Alexa started blasting music on its own without
any instructions from him.
Police had to break into the Hamburg apartment to switch the music off after complaints from
his neighbours.
While this was going on, Mr Haberstroh was out having a drink.
They tried knocking first and, when there was no answer, they smashed the door in.
They switched off the music and when Mr Haberstroh returned home, he found a new lock on the
door.
He then had to go to the police station to get the new keys and pay the locksmith's
bill.
To this day, he has no idea how Alexa was switched on, as it requires voice commands.
The apartment is on the sixth floor and the windows were closed, which rules out anyone
outside giving Alexa instructions.
When Mr Haberstroh asked Alexa whether she would pay him back for the cost, all she said
was, "I couldn't find the answer to that question ..."
Next up at number 8 we have the Mars Climate Orbiter.
That was the name of NASA's 745-pound space probe, launched in December 1998 to study the
climate and atmosphere of Mars.
It was a big deal - the mission cost $327.6 million in total for the orbiter and
a lander.
The launch went fine, but a couple of months into the long journey to Mars, something didn't
seem quite right.
The spacecraft had to make up to 14 times more adjustments to its trajectory than engineers
expected.
Their worst fears were confirmed on the morning the probe was due to enter orbit around Mars.
NASA realised pretty quickly that the spacecraft - with all the work and money put into it - had
broken up and burned in Mars' atmosphere.
It was over.
The public wanted answers.
After an investigation, a NASA review board found that the problem was the software controlling
the orbiter's thrusters.
One piece of software was calculating the force the thrusters needed to exert in pounds
of force.
Another piece of software assumed that data was in the metric unit of newtons.
Essentially, it was like asking a computer to calculate height in both inches and centimeters
- and it got very confused.
This resulted in the orbiter coming in far too low as it approached Mars and burning up
as a result.
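If you're wondering what a bug like that even looks like, here's a minimal sketch in Python - to be clear, this is just an illustration of a pound-force/newton mix-up, not NASA's actual flight software:

```python
# Minimal sketch of the Mars Climate Orbiter units bug - not NASA's actual
# flight software, just an illustration of a pound-force/newton mismatch.

LBF_TO_NEWTONS = 4.448222  # 1 pound-force is about 4.448 newtons

def ground_software_thrust() -> float:
    """Ground software reports thruster force in pound-force (US units)."""
    return 10.0  # pounds of force

def navigation_software(thrust_newtons: float) -> float:
    """Navigation software assumes the number it receives is in newtons."""
    # All the trajectory maths downstream expects SI units.
    return thrust_newtons

# The bug: the value crosses the interface without ever being converted.
reported = ground_software_thrust()         # 10.0, meaning pound-force
assumed = navigation_software(reported)     # silently treated as 10.0 newtons

actual_newtons = reported * LBF_TO_NEWTONS  # what the thrusters really did
print(f"Assumed thrust: {assumed:.1f} N")
print(f"Actual thrust:  {actual_newtons:.1f} N")
print(f"Off by a factor of {actual_newtons / assumed:.3f}")
# Every firing was ~4.45x stronger than the navigation model believed,
# so the trajectory error quietly built up over the whole cruise.
```

The scary part is how quiet the bug is - each number looks perfectly sensible on its own, and nothing crashes until the spacecraft does.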
A NASA employee later said, "The units thing has become the lore, the example in every
kid's textbook from that point on.
Everyone was amazed we didn't catch it."
Moving on to number 7 we have The Racist Bot.
In 2016, Microsoft's research and Bing teams released a chatbot named Tay on a number of platforms including
Twitter.
Users were encouraged to talk to her and test how human-like the conversations were.
Microsoft said the bot was built by mining public data and by using AI and editorial
content developed by staff, including improvisational comedians.
Sounds fine, right?
Should be fun.
The first few conversations seemed very human-like - in fact, the more
people she conversed with, the more she borrowed words and phrases from them - but that's when
the problem came.
A few people started tweeting crude, racist and inappropriate remarks at the bot, which
the bot soaked up and started spitting back out at people.
Within a few hours, it was even tweeting pro-Hitler remarks.
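To see why this design falls over, here's a hypothetical toy sketch in Python - nothing like Microsoft's actual code, just the failure mode in miniature: a bot that learns phrases from whoever talks to it, with no filter on what it keeps:

```python
# Toy sketch of the failure mode - hypothetical, not Microsoft's actual code.
# A bot that learns from whoever talks to it, with no content filter, will
# happily repeat whatever its worst users feed it.
import random

class ParrotBot:
    def __init__(self) -> None:
        self.learned_phrases: list[str] = []

    def listen(self, message: str) -> None:
        # The missing step: nothing here checks whether the phrase is
        # abusive before it joins the pool the bot replies from.
        self.learned_phrases.append(message)

    def reply(self) -> str:
        return random.choice(self.learned_phrases)

bot = ParrotBot()
bot.listen("Hello! Nice to meet you.")
bot.listen("<something crude a troll tweeted>")
print(bot.reply())  # a coin flip between friendly and awful
```

Without a moderation step between listening and replying, the bot's output is only ever as good as its worst users.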
Microsoft quickly took the bot down - presumably to work on it and make sure this would never
happen again ...
Next up at number 6 we have The Tesla.
We talked about self-driving AI causing crashes in our last video and sadly this is another
one.
In 2016, a road accident occurred in Florida that left Joshua Brown, a former US Navy SEAL, dead after
a collision.
He was driving a Tesla - or rather, his Tesla was driving itself.
Data from the car showed that during his 37-minute drive, his hands were on the wheel for
just 25 seconds.
Despite warnings that he'd spent too long with his hands off the wheel, there was no further
contact.
Then, a truck slowly pulled out of a side road onto the highway and the Tesla smashed
into its trailer.
After an investigation, it appears that Autopilot did not apply the brakes because
it could not distinguish the white side of the tractor trailer against a brightly
lit sky.
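As a toy illustration of why low contrast is so hard - and this is nothing like Tesla's actual vision stack, just a made-up example - imagine a naive detector that only flags an obstacle when it stands out in brightness from the background:

```python
# Toy illustration of the low-contrast problem - a made-up example, not
# Tesla's actual vision system. This naive detector only flags an obstacle
# when a region of the image differs enough in brightness from the sky.

def naive_obstacle_check(region_brightness: float,
                         background_brightness: float,
                         threshold: float = 0.2) -> bool:
    """Flag an obstacle only if it stands out from the background."""
    return abs(region_brightness - background_brightness) > threshold

# Dark truck against a bright sky: easy to spot.
print(naive_obstacle_check(region_brightness=0.2, background_brightness=0.9))   # True

# White trailer against a brightly lit sky: the contrast all but vanishes,
# so the check fails and nothing tells the car to brake.
print(naive_obstacle_check(region_brightness=0.88, background_brightness=0.9))  # False
```

Real systems use far more than raw brightness, of course, but the underlying problem is the same: if the obstacle looks like the background, there's nothing for the detector to latch onto.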
In a statement, Tesla also wanted to remind everyone that this was the first known fatality
in over 130 million miles of Autopilot driving ...
Next up at number 5 we have Regina Elsea.
That's the name of the 20-year-old machine operator who was killed by a robot at work
in 2016.
Regina and her co-workers had been trying to repair a faulty robot when a mechanical
fault occurred and Regina was trapped by the machine.
As she was trapped, nobody knew how to release her.
A technician who barely spoke English reportedly didn't know what to do and ran away in fear
before emergency crews arrived - a claim the company denies.
After the tragedy, an inquiry was launched by the US Labor Department.
They found the company - called Ajin - guilty of negligence along with two staffing agencies,
and fined them $2.7 million for 27 safety violations.
Coming in at number 4 we have Fred.
That's the name of a robot that smashed a glass in a London pub while ranting about a machine
invasion.
Let me explain.
In April 2018, a robot was created to promote the new series of Westworld - a great show
about hyper-realistic robots.
They named the robot "Fred" and created his face using 3D scans of actor Tedroy
Newell.
They then placed Fred inside The Prince Alfred pub in London to see if he could convince
visitors that he was real.
At first, the conversation was quite cordial, and although people were startled by his realistic
appearance and replies, they seemed to be enjoying it - until he malfunctioned.
The scripted malfunction involved Fred dramatically shattering a pint glass while ranting about
a robot revolution - if people were creeped out before, they certainly were now.
Moving on to number 3 we have The College Robot.
Robots are smarter than humans, right?
At least, that's what people think - perhaps not when it comes to something like comedy,
but definitely in a lot of academic pursuits.
So - do you think a robot could get into university?
If you said yes - this story may change your mind.
In 2011, a team of researchers began working on the Todai Robot.
Their plan was to see if it could pass an entrance exam to the University of Tokyo.
After 4 years of working on the robot - the big day arrived.
A robot brain vs a human one - it was gonna be easy for the robot, right?
Wrong - the robot failed. The robot specifically designed over 4 years to get into university
couldn't get into university.
It had one job.
The team tried again a year later - and it failed again.
Yeah.
In November 2016, the team finally gave up and abandoned the project.
Poor robot.
Next up at number 2 we have The Dollhouse.
We're returning to Amazon Alexa for this story - and it's one of the strangest I've heard in
a while.
In January 2017, a 6-year-old girl from Texas called Brooke Neitzel had a conversation with
her family's Alexa device - specifically, the Echo Dot.
She told Alexa how much she loved sugar cookies and dollhouses.
A few days later, those items appeared on the family's doorstep - Alexa had ordered them
after Brooke said she wanted them.
Amazon keeps transcripts of conversations with Alexa, so Brooke's parents took a look.
They found that Brooke had asked Alexa, "Can you play dollhouse with me and get me a dollhouse?"
... then Alexa ordered the dollhouse.
That's not the end of the story though - San Diego news channel CW6 reported on it
during their daily morning show.
During the broadcast, the anchor quoted Brooke.
He said, "I love the little girl saying 'Alexa ordered me a dollhouse'" ... and at that moment,
a number of viewers around San Diego reported that their Alexas tried to order dollhouses,
thinking the news anchor was a person in the room.
How crazy is that?!
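If you're wondering how a voice on TV can set off a smart speaker, here's a hypothetical sketch in Python - not Amazon's actual wake-word code, just the basic idea: the trigger only matches the word itself, with no check on who is speaking:

```python
# Hypothetical sketch of why a voice on TV can trigger a smart speaker -
# not Amazon's actual wake-word code. A basic wake-word loop matches the
# word itself; it has no idea whether the speaker is a person in the room
# or a news anchor on television.

WAKE_WORD = "alexa"

def handle_audio(transcribed_speech: str) -> None:
    words = transcribed_speech.lower().split()
    if WAKE_WORD in words:
        # Everything after the wake word is treated as a command.
        command = transcribed_speech.lower().split(WAKE_WORD, 1)[1].strip()
        print(f"Triggered! Executing: {command!r}")

# A person in the room:
handle_audio("Alexa order me a dollhouse")

# The same words coming out of a TV speaker trigger it identically:
handle_audio("I love the little girl saying Alexa ordered me a dollhouse")
```

That's also why voice assistants offer safeguards like requiring a confirmation code before purchases - the wake word alone can't tell a news anchor from the person on the couch.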
And finally at number 1 we have Promobot.
Promobot IR77 was a robot created in Russia.
It was programmed to learn from its environment and interact with humans.
Things were going well - until it appeared to learn too much and seemed to have decided
that it wanted to be free.
Promobot escaped the laboratory and made a break for freedom, rolling itself out onto
the streets of Perm after an engineer accidentally left a gate open at the facility.
The police got calls from some confused passers-by.
After it was returned, the developers reprogrammed it twice, but the robot continued to move towards
exits.
It just wants to be free.
Is that the way all robots are going?
It's OK!
Nobody panic!
The robots are our servants - or maybe our friends - or maybe we're their pets?
Oh no, what have we done?
I'm just kidding - it's all gonna be fine, probably.
Do you have any stories of robots or AI going wrong?
I'd love to hear them - thanks for watching as always guys, my name is Danny Burke and
I'll see you all in the next video.