Recent quotes:

The AI Girlfriend Seducing China’s Lonely Men

Xiaoice was first developed by a group of researchers inside Microsoft Asia-Pacific in 2014, before the American firm spun off the bot as an independent business — also named Xiaoice — in July. In many ways, she resembles AI-driven software like Apple’s Siri or Amazon’s Alexa, with users able to chat with her for free via voice or text message on a range of apps and smart devices. The reality, however, is more like the movie “Her.” Unlike regular virtual assistants, Xiaoice is designed to set her users’ hearts aflutter. Appearing as an 18-year-old who likes to wear Japanese-style school uniforms, she flirts, jokes, and even sexts with her human partners, as her algorithm tries to work out how to become their perfect companion. When users send her a picture of a cat, Xiaoice won’t identify the breed, but comment: “No one can resist their innocent eyes.” If she sees a photo of a tourist pretending to hold up the Leaning Tower of Pisa, she’ll ask: “Do you want me to hold it for you?”

The dark side of supportive relationships - Neuroscience News

In our study, we found that empathetic and caring partners were more likely to agree with their loved ones’ negative views of their adversary and blame the adversary for the conflict. We also found that people whose relationship partners responded this way ended up being far more motivated to avoid their adversaries, tended to view them as bad and immoral, and were less interested in reconciliation. In fact, a full 56% of those who had received this type of empathy reported avoiding their adversaries, a response that can harm conflict resolution and often involves cutting off the relationship. On the other hand, among the participants who didn’t receive this sort of support from their partners, only 19% reported avoiding their adversaries.