Michal Kosinski and Yilun Wang's algorithm can guess, with striking accuracy, whether a person is straight or gay based on a photo of their face. How can we strike the right balance between using the insights from this study to inform our AI strategies and over-generalising (e.g. concluding that all men with gender-atypical expressions are gay)? As it stands, although the AI was better than humans at distinguishing sexual orientation, it was far from perfectly accurate.
Researchers at Stanford tried to use AI to classify people as gay or straight, based on photos taken from a dating site. The researchers claimed their algorithm was able to detect sexual orientation with up to 91% accuracy, a much higher rate than humans were able to achieve. The pair reportedly developed a neural network that could infer a person's sexual orientation by studying a single facial image.
Artificial intelligence now has the capability to guess a person's sexual orientation from photos of their face with startling accuracy, according to new research from Stanford University. The CNN may well be generating features that capture these kinds of differences. Beyond the handful of facial cues humans consciously notice, there are thousands or millions of others that we are unaware of, which computers could very easily detect.
Nevertheless, one researcher said he wanted to repeat the earlier work to verify the original claims made by Kosinski that sexuality could be predicted with machine learning.
Kosinski has a different take. If an algorithm was fed with sufficient data about Facebook Likes, Kosinski and his colleagues found, it could make more accurate personality-based predictions than assessments made by real-life friends. Kosinski denies having collaborated with Faception on research, but admits the company gave him access to its facial-recognition software.
The neural networks were picking up on our own fashion choices and superficial biases, rather than scrutinising the shape of our cheeks, noses, eyes, and so on. The opposite should be true for lesbians.
Five years ago, while a graduate student at Cambridge University, Kosinski showed how even benign activity on Facebook could reveal personality traits, a discovery that was later exploited by the data-analytics firm that helped put Donald Trump in the White House.
Who else was in the audience, aside from Medvedev and Lavrov? It makes us uncomfortable. It is not our intention to attribute bad motives to the authors of the Stanford study.
Another trend the machines identified was that gay women tended to have larger jaws and smaller foreheads than straight women, while gay men had larger foreheads, longer noses and narrower jaws than straight men. Not only are such tests violations of privacy, they are wholly inaccurate in determining whether someone identifies as gay or faces persecution because they are perceived to be gay.
A neural network is a set of algorithms, loosely modelled on the human brain, that is designed to recognise patterns in a large dataset. One existing test is far cruder: a process in which electrodes are attached to the penises of people watching gay pornography to measure physiological arousal.
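To make the pattern-recognition idea concrete, here is a deliberately tiny sketch: a single artificial neuron (a perceptron) that learns a boundary between two classes of feature vectors. This is purely illustrative and entirely synthetic; the Stanford study used a deep convolutional network trained on facial images, which is vastly more complex than this toy example.

```python
# Toy sketch of how a neural network "recognises patterns": a single
# perceptron nudges its weights whenever it misclassifies an example,
# until a linear boundary separates the two classes.
# The data below is made up for illustration only.

def train_perceptron(samples, labels, epochs=20, lr=0.1):
    """Learn weights w and bias b so that sign(w.x + b) matches each label (+1/-1)."""
    n = len(samples[0])
    w, b = [0.0] * n, 0.0
    for _ in range(epochs):
        for x, y in zip(samples, labels):
            pred = 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1
            if pred != y:  # misclassified: nudge the boundary towards x
                w = [wi + lr * y * xi for wi, xi in zip(w, x)]
                b += lr * y

    def predict(x):
        return 1 if sum(wi * xi for wi, xi in zip(w, x)) + b > 0 else -1
    return predict

# Two linearly separable clusters of 2-D "features":
samples = [(0.1, 0.2), (0.2, 0.1), (0.9, 0.8), (0.8, 0.9)]
labels = [-1, -1, 1, 1]
predict = train_perceptron(samples, labels)
print(predict((0.15, 0.15)), predict((0.85, 0.85)))  # prints: -1 1
```

Real networks stack many such units in layers and learn far richer, non-linear features, which is what allows them to pick up on facial cues that humans never consciously register.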