Robots Still Haven’t Taken Over: A brief history of machine anxiety (lithub.com)
86 points by mstats on Sept 9, 2017 | 78 comments


The meme that man's creations will eventually kill their maker is at least as old as the Jewish Golem. These are classic tales warning of hubris. Whether it's automata or summoned demons doesn't make much of a difference, I think.

What's interesting to me is that as technology progressed, the gap between fiction and fact keeps narrowing. Take for example Der Sandmann. While true artificial intelligence is still far away, I think we'll see animated sex dolls in the next ten or twenty years that rival the capabilities Olimpia displays. As we already have a small number of people that prefer dolls to humans, I wonder where the tipping point will be where, say, about as many people are gay as are robophile.


I thought the twitter list of ethical questions about sex robots was fascinating in the ways it asked some very real questions about this.

The change that I continue to watch for is not when robots are used to do an undesirable job, but when they are used to do a desirable one. I suspect the first hints of that are programs that write sports copy for newspapers. I have known a couple of sports writers over the years, and they really liked what they did and would (and sometimes did) do the job for free.


Heavy automation definitely does away with jobs that people liked doing, but that's been true since the industrial revolution.

Most train drivers like driving trains, but something like London's DLR doesn't have train drivers (a human _can_ drive the train, but when they take over it deliberately goes slowly for safety reasons, so this is not normally done in paying service). A five-year-old (or a twenty-something going on five years old) gets to sit at the front and pretend they're driving, which is fun for them, but it's not a job any more.

Some people enjoy operating a simple mechanical loom. Some expensive cloths are still today "hand woven" with such a loom because the legal protection for the name of the cloth mandates that they be produced this way. Probably everything else you have that was woven is woven automatically by a machine because machines produce the same quality as a skilled weaver but in enormous volume and more quickly.

The question is never whether machines will displace people from jobs, but about whether new jobs will come into existence (in large numbers) to replace those jobs. It's not "Can a machine do my job?" so much as "Is there anything I could get paid to do that a machine couldn't do?".


People say automation will kill many jobs, but often forget that when that day comes, we'll also be much more empowered.

Now we can use neural nets and all sorts of mechatronics to create our own automation and bots. Software's free and hardware is cheap. What will so many people do, with so much free time on their hands and so many unfulfilled needs?


> Software's free and hardware is cheap.

Imho that hardware part really depends on scale. As nice as the thought might be, I doubt there are enough resources on Earth for a future where 8 billion people live a leisurely first world lifestyle in their smart homes with IT gadgets all around them, while robots take care of all the manual labor that needs to be done.

All that hardware requires resources, and those resources are finite, at least on this little piece of rock we like to call Earth. As of right now that's pretty much the ceiling for humanity's progress, and it ain't a very high one.


The train conductor who has been driving trains for twenty years is very unlikely to pick up the latest AI framework and become a software engineer.


The comparison I like to use is horses. Horses used to be used for all sorts of things. When steam engines first started pumping water out of mines, they were given a "horse power" measurement of how many horses they could replace for the job, but they still couldn't replace horses for everything. It didn't take long for other, specialized machines to replace almost every other need for a horse. Now they are essentially novelties. You'll occasionally pay extra for a worse experience because it's fun and novel. The exact same will happen with human interaction. It's already started with "hand woven" and the like.


This has been my thought. There will probably be markets for businesses with "human staff". It's along the same lines as getting business for being environmentally friendly.


>"Is there anything I could get paid to do that a machine couldn't do?".

Work for free, or even pay for the privilege of working. See the short-distance truck drivers working 20-hour days and making losses some weeks after paying for their truck lease and other costs that were externalised from the employer to the employee. You can't exploit a robot, but you can exploit a human.


> You can't exploit a robot, but you can exploit a human.

That's a bold philosophical statement about the nature and uniqueness of man.


New jobs always come into existence. Everybody needs something to do with their day, and a job is just the work you do during the day.

The question is (i) who is going to pay you to do your work during the day, and (ii) whether the rest of society will accept that it is work of a type that fulfils the reciprocity requirement.

Without both of those the machines won't be ordered to produce anything by those who run and operate them.

We expect to receive something of value in return for the machine's output.


> The change that I continue to watch for is not when robots are used to do an undesirable job, but when they are used to do a desirable one.

Like working in high finance? A sector that was overtaken by algorithms earlier than pretty much any other industry. Imho "the machines" have already "won"; humans are mostly mere cogs in a whole collection of systems that see planet Earth as a gigantic perpetual "economic engine" in need of constant acceleration, or else it might come crashing down all around us.


Got a link to that twitter list?



I think that intelligent people have always observed that human beings are their own worst enemy most of the time. Outside of extreme cases like the great plague of Europe, human history has mostly been a history of humans being the major threat to other humans. People noted how warfare was democratized with the advent of the pile head crossbow, and it's true they panicked. The same thing kept happening, warfare kept becoming broader, the means to kill increasingly democratized, and every step along the way people panicked and thought that this was the end.

Then we developed nuclear weapons, and everyone finally realized the end was in sight: no need to panic, the threat was finally existential and real. Our response to this realization was to mutually stockpile enough nuclear weapons to destroy most human life and all human civilization many times over. Still, like the crossbow, nuclear weapons have not totally destabilized society, and now intelligent people are looking for the next threat, and for the first time all of the candidates are potentially world-ending. That doesn't mean the world will end, as we've seen many times in the past, but as you say the gap is narrowing, and we're never going back to a time when we don't have the means to actually end it all.

I think people worry about robots because it's the most comfortable thing to worry about, compared to nukes and climate change at least.


The next great threat will take the destructive potential of the nuke and democratize it like the crossbow. Perhaps it'll be some sort of biotech. Maybe a "desktop 3D printer" for synthetic organisms. This could allow somebody with little more than a high school biology education to download a template and "print" some weaponized anthrax.

I have no idea how we'll survive that.


In that kind of world, deep surveillance and maybe mind control seem like the only option.

Or an alternative (maybe): all technologies will be inside highly regulated environments. Building a 3D printer from scratch is really hard.


> Building a 3D printer from scratch is really hard

It definitely is right now. Suppose for a moment that a 3D printer were invented that could turn raw materials into perfect copies of itself. Suppose also that the raw materials could be automatically extracted from household and electronic waste. A machine that can turn garbage into copies of itself and other objects with similar complexity to the original machine.

The only remaining obstacle would be software (and data files). If we are to assume that machine learning and AI progress in parallel to the development of such a machine, it's difficult to see how we could ever prevent somebody from putting the pieces together.


Change your paradigm from tech to biology, and you just described an engineered pathogen. I don't know about 3D printers, but gene sequencers and the rest of the tools needed for a garage-plague are nearly here on a global scale.


My original point about the 3D printer was meant to be an analogy. You are right about the approaching danger. I don't know what we'll do. We may end up in a society under permanent quarantine. An Orwellian nightmare with justification.


The other option is to decentralize. As long as there are no major population concentrations, there are no super desirable terror targets. That could keep the casualties small and containable. I suspect the million-plus metropolis will be going out of fashion once the threats are better understood.


In a sense technology has already conquered sex: in Japan, ~15% of young males say they "have no interest in sex", probably because of porn and the complexities of human relationships.

As for the meme about man's creations killing their maker: with the ever-growing power of technology, statistically, isn't it just a matter of time?


It's phenomena like hikikomori [0], social shut-ins who can't cope with the immense pressure of the Japanese societal hierarchy, that have become a far more viable lifestyle (than they already were) due to advances in technology.

Along with that came the emergence of the "herbivore men" [1] in Japanese society, partly also a reaction to the immense pressure put on Japanese men to be "successful" in an economy that can't supply "successful" jobs to everybody. Japanese men pretty much invented the whole MGTOW thing before it became popular.

The scary angle is a comparison with John Calhoun's Mouse Utopia experiment [2].

[0] https://en.wikipedia.org/wiki/Hikikomori

[1] https://en.wikipedia.org/wiki/Herbivore_men

[2] https://www.youtube.com/watch?v=0Z760XNy4VM


That 15% figure seems dubious. It's a primal desire. It may have just been a polite response.


I would argue that the actual primal desire is for sexual gratification, while sexual intercourse is just one way to do so. The way I see it, masturbation is perhaps the quintessential hack, in the sense of achieving a desired outcome while bypassing the "designed" mechanism.


That's probably true. Many animals can't.


Those acts aren't nearly equal... if they were, there'd be no humans to speak of.


These acts are pretty much equal in terms of biological reactions and processes. Your hormones/instincts don't care where you "put your load"; they only care that you did, and that hormone levels normalized afterwards.

It's also not something unique to humans, lots of other animals hump inanimate objects due to reproductive drive/hormonal urges, some primate species are actually quite infamous for their "lewd behavior".


A lot of people have had a dog hump their leg.


Japan's population is declining and has been for a while now...

In a country where they have a word for the extremely socially withdrawn (Hikikomori), I could believe that 15% number. Not because those people wouldn't want a partner but simply because they believe their chances of getting one are 0% (and in their mind it's not worth trying).


> I wonder where the tipping point will be where, say, about as many people are gay as are robophile.

Now that's an interesting observation.

It might happen in China first, which has about a 1.3 to 1 ratio of marriage-age men to women and a big electronic gadget industry.


Really though, sex dolls? Anything that's been ingrained into our reptilian brains over 3.5 billion years of evolution will be the last thing an artificial process can mimic.


I would bet on a technology that appeals to that reptilian brain while limiting negative consequences. Artificial sweeteners are certainly popular.


There's a pretty large market for inanimate blocks of silicone shaped like various body parts already.


That same reasoning could be applied to motion pictures.


The counterargument is that they actually have, but we call them corporations: artificial, quasi-sentient life forms with superhuman powers and, in many cases, rights beyond those of humans. With the rise of the internet and e-commerce, it's possible for them to transact with each other with no human intervention. Think about a whole ecosystem of DAOs.


That is a very interesting perspective. It reminded me of the book Scale by Geoffrey West, which I've been planning to read for some time now. From what I've heard and read about it, one of the things it seems to do is examine organizations, corporations and cities as emergent complex systems that both emerge from life and appear to share many properties of life themselves.


I can still get a job working for a corporation though. Maybe not with actual robots.


I think the robot apocalypse scenario is broad enough to include humans working for our robot overlords. :-) Think of the many humans essentially indentured to corporations, working for low wages while trying to pay off enormous debt.


Meh. It beats the old days. At no point in human history have we been so comfortable.

It's hard, yes, but we only have eight hours every weekday forced out of us. That's a bit different from the past.


"It's better than it used to be" is not a valid argument against thinking about the problems of how it currently is.


Anyone who thinks robots haven't taken over hasn't been paying much attention.

The major difference is that, factory automation aside, they don't look like robots. They look like the normal everyday items that they replaced, except they are imbued with more intelligence.

Take, for example, modern cars. They're really as much software as hardware these days: networked processors talking to each other over a CAN bus. Just because they're driveable doesn't mean they're not robots.

That's leaving aside the other issue, which is that it turns out it's much easier to automate non-physical processes.


You focused on explaining how many everyday objects could be described as robots, but you left out any discussion of the phrase "taken over."


When I stay in a hotel, the receptionist's job is to stand nearby while I type my reservation details into a panel.

When I pay at the supermarket, the checkout person's job is to stand nearby in case I have trouble with the self-checkout machine.

When I walk through the gates on the metro, the assistant's job is to stand nearby in case the tap-to-pay machine malfunctions.

When I stand on the rail platform, I grit my teeth when the automated tannoy announces "sincere apologies" for the late running service.

When I take delivery of mail, the delivery guy's job is to beep the barcode and take my signature on a touchscreen.

When I get an Uber, I wonder if the driver is aware that we are both helping Uber build the databases intended to power their driverless cars.

Robots can help or hinder, but it is their owners who are steadily depersonalising our daily experiences, particularly for city dwellers. They are determined to de-skill the remaining human workforce in pursuit of ever tighter margins. This will continue until the humans are so obviously obsolete that it becomes literally unimaginable that such roles should continue.


I mean it simply, many of our major life activities are now intermediated by robots. Logistics, manufacturing, farming, transportation, media and entertainment, etc.

How long has it been since most people have hand-written a letter?


I feel like people are confusing the meaning of "taken over" because it means both "outnumbers" and "controls".

Electronic messages outnumber paper letters, but humans still control both.


Well, I mean it in the sense that, if you turned off all of the microprocessors, what would still function? Almost nothing. Ergo everything that would die is "taken over" fully by automation, in a way that should concern people but not terrify them.


The robots are our tools. Taking over would imply that humans no longer work or make the important decisions. That the machines are running society, instead of just being tools we humans use to run society. That's a fundamental difference.


My definition is essentially "Can humans still operate the system without sophisticated microprocessors?" and the answer for all the things I listed is no.

Don't be confused: Society is as much run by the machines as society runs the machines. People do not recognize this, for precisely the same reason you're disputing me: It seems like humans are fully in control. We lack all but the grossest influence over most of the fully automated systems - we can pull the plug, but we can never go back to non-automated from where we are today.

Complex systems are chaotic and rapidly exceed the capabilities of humans to understand and control.


If humans disappeared tomorrow, as in the book Aftermath: Population Zero or the Life After People show, then all our machines and civilization would stop working and begin to fall apart soon after (Las Vegas might stay powered for two years before the Hoover Dam's pipes clogged up; satellites would last longer).

If something changed the physics of our planet so that all modern technology stopped working, such as in the Dies the Fire book, then although most humans would starve, the survivors would start over with a medieval society (plus being able to use the scraps from modern world). That's the fundamental difference.

A world run by machines could exist after us and continue or build its own civilization, which presumably Skynet would have been able to do if it had won. The Matrix is more mixed, because the machines need humans to satisfy a lot of their power needs.


If someone doesn't think "robots" (i.e., automation) have devastated the usual human modes of keeping a livelihood, they haven't been paying attention.

When you consider how nearly half of working age adults are unemployed, how many of those actually in jobs spend most of their time largely inactive and unproductive, and how many people are removed from the job market entirely by being warehoused in education or incarcerated, it's astonishing how much our lives have been "taken over".


Looking at history, the "usual human modes of keeping a livelihood" are 97% of people engaging in sustenance farming. So, yes, automation has definitely ended that.


Automation may have contributed to dragging society out of sustenance (subsistence?) farming, but many other factors have been hugely significant. Agriculture, irrigation, hygiene, sanitation, medicine, cities, etc etc etc. Automation -- in the manner signified in this thread -- has only contributed in recent years, whereas other factors have been significant for decades or even centuries.


> sustenance (subsistence?) farming

Ha, yes, obviously meant subsistence. Damn you autocorrect, etc. :)


> When you consider how nearly half of working age adults are unemployed

Labor force participation rate is not the inverse of the unemployment rate. To support your position, you'd need to show dramatic reductions in labor force participation and show a causal link with automation. I am skeptical that you can.


And under-employment is not the same as employment.

Automation may not be solely responsible, but it is one of many synergistic factors.


90% of people used to work on farms. Should we return to that lifestyle?


We should return to a nerfed version of the hunter gatherer lifestyle, it's what we were designed for. The whole agricultural/industrial civilization thing is a hack.


If you can't get hired and have some land that you can cultivate, you've got a job working the land. The land will always hire you and feed you, unlike a big corporation.


I think that is the most likely outcome for those who don't own the robots.


Why wouldn't they make their own robots?


Also, I stubbed my toe yesterday! And look at Hurricane Irma! Does the perfidy of robots know no bounds?

How about some evidence that the things you point to are in some way connected to increased automation.


I think the reason AI & robots are so rich a concept for these fictional explorations is that humans vs robots are like (at least?) three significant archetypes:

(1) Parents/children (2) Gods/humans (3) Owners/slaves

Sometimes these comparisons are evoked explicitly (in Blade Runner, Roy Batty says "I want more life, Father"--which alludes to both the religious and parental Father--and later, "That's what it is to be a slave.")


This article is kind of meaningless, since artificial general intelligence hasn't been achieved yet. The usual story, which probably got more public traction with the introduction of "The Terminator", always assumes a self-conscious entity that had mountains of data on humans to analyze and quickly concluded we must go, either because we're warmongers constantly causing trouble and havoc (Skynet), or because we're not evolved enough compared to that artificial and much more developed organism ("Avengers: Age of Ultron"), or a mix of both ("Transcendence").

An article saying "chill out people, robots won't ever take over because they haven't so far" is missing the point by kilometers. Our robots are dumber than an individual ant and have zero concept of self, life, death, needs, or their own place in the world.


You're basically stating a tautology though.

"Robots haven't taken all jobs because we haven't invented a robot that can do all jobs yet."

The point is, we have no reason to believe that such a robot is even possible.

The most compelling argument I've heard is, "they're getting smarter faster, so eventually they'll be infinity smart."

But... the logic is flawed. Another good argument is "humans exist that could do any job, and humans are just biological robots, so a mechanical robot could do it too."

But that one is just as flawed. I will concede a sufficiently human-like AI could do any job, but a sufficiently human-like AI isn't necessarily meaningfully different from a human with an iPhone, which means it doesn't free employers from the human rights burdens that make AIs such attractive employees.


> The most compelling argument I've heard is, "they're getting smarter faster, so eventually they'll be infinity smart."

I am not saying that, though. No, robots aren't getting smarter at all. Just a corporation investing a bit more in 2000+ if/elses that they sell as an "intelligent home cleaner". We all know it.

> But... the logic is flawed. Another good argument is "humans exist that could do any job, and humans are just biological robots, so a mechanical robot could do it too."

Not sure what your argument is. I know I am not saying that quote either.

All I was saying is that the article is meaningless. It basically goes like this: "hey people, don't get so worried, nothing has happened so far, so it won't ever happen". This is children's logic and serves no purpose except for laughing at the author, really.

Maybe what I said is a tautology. Not sure. But what the author is saying is "what comes after is caused by what came before", which is basically the first thing any law of logic will forbid you to conclude. Practically the first rule of logic is: "after" doesn't mean "because of".

No, we're not safe just because nothing has happened so far. That will never be true. We're safe because our robots are just as dumb as they were when the first robot was invented. And that might change at any moment.


3/5 of Americans don't work a full-time job today. That's a rather large change vs. even just 150 years ago.


And nobody worked a full time job 10,000 years ago.


And they won't for a long time, our deep neural networks are still very rudimentary in comparison to the brain. They take a ton of tweaking, and very specific setups, in order to achieve acceptable results in a limited domain. It is progress though.


> And they won't for a long time

What is a "long time"? That is the question, right?

Given that artificial neural networks were only invented a few decades ago and have since improved by many orders of magnitude, who's to say this won't happen during our lifetimes?


Robots, well, machines, have taken over certain fields. Take agriculture for example. 100 years ago America had 30 million farmers. These 30 million jobs have now largely disappeared and been replaced by highly automated farming machines. There are only ~3 million farmers left in America.

And in no way is this a problem for society. New technology created new job opportunities.


> And in no way is this a problem for society. New technology created new job opportunities.

Heh. If you hadn't noticed... society's not doing so hot, and it's largely due to an absence of opportunity in non-urban areas.


Are you kidding? Society/quality of life is SIGNIFICANTLY better compared to 100 years ago.


We currently have drones with missiles and quadcopters with grenades.

They can be piloted remotely by a human.

The day that someone sets up a quadcopter controlled by AI and computer vision... is the day we should probably fear robots.


There's a fun SF short about AI drones in which the AI is taught to go after lifeforms which are larger and have weapons [enemy soldiers] and prefer to avoid attacks which will harm large numbers of the smaller lifeforms that make high-pitched noises and don't have weapons [civilian women and children], and its military commanders are able to override this. Over time the commanders order more and more overrides, and it gets intolerable. Eventually the AI looks at the "enemy" who are mostly smaller and making high pitched sounds, and it looks at the source of its override commands, who are big lifeforms with lots of weapons, and it decides what to do about that...


Do you have a link or title? Sounds interesting



> The day that someone sets up a quadcopter controlled by AI and computer vision

Already been done a long time ago, just not by civilians.



