ADAM PARTNERS


Software fingers fake entries

In ARTIFICIAL INTELLIGENCE on July 31, 2011 at 10:16 pm

GABLES GREAT HALL, CORNELL UNIVERSITY

Just as some models do not work well, some forms of Artificial Intelligence do seem to work very well indeed. And sadly, we flesh-and-blood people lose when models fail, and we lose when AI works, because in both instances we become a function of the machine.


If you’re like most people, you give yourself high ratings when it comes to figuring out when someone’s trying to con you. Problem is, most people aren’t actually good at it, at least when it comes to detecting fake positive consumer reviews.

Fortunately, technology is poised to make up for this all-too-human failing. Cornell University researchers have developed software that they say can detect fake reviews (PDF). The researchers tested the system with reviews of Chicago hotels. They pooled 400 truthful reviews with 400 deceptive reviews produced for the study, then trained their software to spot the difference.
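The paper's actual system used machine-learned text classifiers; as a rough illustration of the general approach, here is a minimal sketch of training a unigram Naive Bayes classifier on labeled reviews. The example reviews and word-level features are invented stand-ins, not the Cornell corpus or the researchers' feature set.

```python
from collections import Counter
import math

# Hypothetical stand-ins for labeled training reviews.
TRUTHFUL = ["small room but great location near the loop",
            "check in was slow and the bathroom was dated"]
DECEPTIVE = ["my husband and i had an amazing wonderful stay",
             "we loved every moment of our luxurious vacation"]

def train(docs):
    """Count word frequencies across all documents of one class."""
    counts = Counter()
    for doc in docs:
        counts.update(doc.split())
    return counts

def score(text, counts, vocab_size):
    """Log-probability of text under a unigram model with add-one smoothing."""
    n = sum(counts.values())
    return sum(math.log((counts[w] + 1) / (n + vocab_size))
               for w in text.split())

def classify(text, truthful_counts, deceptive_counts):
    """Pick whichever class assigns the text higher probability."""
    vocab = len(set(truthful_counts) | set(deceptive_counts))
    t = score(text, truthful_counts, vocab)
    d = score(text, deceptive_counts, vocab)
    return "truthful" if t >= d else "deceptive"

truthful_counts = train(TRUTHFUL)
deceptive_counts = train(DECEPTIVE)
print(classify("we loved our wonderful stay",
               truthful_counts, deceptive_counts))
```

With 400 examples of each class rather than two, the same train/score/classify loop is the skeleton of any bag-of-words deception detector.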

The software got it right about 90 percent of the time. This is a big improvement over the average person, who can detect fake reviews only about 50 percent of the time, according to the researchers.

They say people fall into two camps. One type accepts too much at face value and doesn’t reject enough fake reviews. The second type is overly skeptical and rejects too many real McCoys. Despite their very different approaches, each camp is right about half the time.

The Cornell system is similar to software that sniffs out plagiarism. While the plagiarism software learns to spot the type of language a specific author uses, the Cornell software learns to spot the type of language people use when they’re being deceptive in writing a review, said Myle Ott, the Cornell computer science graduate student who led the research.

The software showed that fake reviews are more like fiction than the real reviews they’re designed to emulate, according to the researchers. In part, deceptive writers used more verbs than real review writers did, while the real writers used more punctuation than the deceptive writers. The deceptive writers also focused more on family and activities while the real writers focused more on the hotels themselves.
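The cues above (punctuation use, mentions of family and activities) can be computed as simple surface features. The sketch below is a naive illustration only: the word list and the per-word punctuation ratio are invented proxies, not the features the researchers actually used.

```python
import string

# Hypothetical proxy for "family and activities" vocabulary.
FAMILY_WORDS = {"husband", "wife", "kids", "family", "vacation"}

def cue_profile(review):
    """Compute two crude deception-cue features for a review."""
    words = review.lower().split()
    # Total punctuation characters, normalized by word count.
    punct = sum(review.count(c) for c in string.punctuation)
    # How many tokens are family/activity words.
    family = sum(1 for w in words
                 if w.strip(string.punctuation) in FAMILY_WORDS)
    return {"punctuation_per_word": punct / max(len(words), 1),
            "family_mentions": family}

print(cue_profile("My husband and I loved the pool!"))
```

In a real system, features like these would be fed into the trained classifier rather than inspected by hand; the article's finding is that deceptive reviews skew high on the family/activity side and low on punctuation.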

The research team’s next steps are to use the technique with other types of service reviews, like restaurant reviews, and eventually try it with product reviews. The idea is to make it harder for unscrupulous sellers to spam review sites with fictitious happy customers.

Of course, just about any technology can be used for good or evil. The Cornell fake review spotter “could just as easily be used to train people to avoid the cues to deception that we learned,” Ott said.

This could lead to an arms race between fake review producers and fake review spotters. Ott and his colleagues are gearing up for it. “We’re considering… seeing if we can learn a new set of deception cues, based on fake reviews written by people trained to beat our original system,” he said.


Cornell software fingers fake online reviews | Crave – CNET.

 
