When you think of robots, there are probably a couple of different images that come to mind. There are the cute, innocent robots, like Wall-E and R2-D2. Then there are the scary robots, like the ones in "Terminator." And then there are the robots that want to seduce you and then steal your job, like the ones we could be faced with in just a couple of decades.
According to a new report published by Pew Research, robots and artificial intelligence are getting harder, better, faster and stronger, and we can expect to see more of them sooner than we ever imagined. The report, titled "AI, Robotics, and the Future of Jobs," warns that robots and AI will permeate "wide segments of daily life by 2025." One of those "segments" could also be called "your bedroom." If you ever found yourself attracted to the Fembots in "Austin Powers," you'll be happy to know that one of the experts interviewed in the report says he believes "robotic sex partners will be commonplace" by 2025. Stowe Boyd, of GigaOM Research, made the scary claim, adding that sexbots will probably be "the source of scorn and division, the way that critics today bemoan selfies as an indicator of all that's wrong with the world." In other words, WE will be the parents who have to yell at our kids for getting frisky with a sexbot instead of doing their homework.
But that's not all. About half of the experts said they believe robots will displace more human jobs than they create by 2025. Yikes. The experts are also split on whether that's a good or a bad thing. On the one hand, a lot of blue-collar and even white-collar workers would probably be out of jobs, and social order could go down the drain. That's a pretty major downside. But on the other hand, it would take a lot of the drudgery out of monotonous, unsatisfying jobs—no more unnecessarily long commutes, no more tilling the fields. Instead, people could focus on more interesting and fulfilling things. In either case, the experts all agree that 'bot technology is moving at warp speed. The future is terrifying.