Why The Originality.AI Checker Keeps Giving Wrong Answers (Plus Alternatives)

The growth of AI content generation tools has inevitably fueled the growth of AI checker tools that try to detect whether content was written by humans or by AI. One of these tools that’s gaining traction is the Originality.AI content detector, and it certainly makes bold claims on its website, including 94% accuracy in detecting AI content generated through ChatGPT.

But there are increasing reports of the Originality.AI tool giving wrong answers, falsely classifying AI content as human written and vice versa. Also, ChatGPT isn’t the only AI content tool, and Originality.AI doesn’t seem great at detecting content written by other tools (more on this below). Why is this happening, and what can we do about it when using these tools?

The main reason Originality.AI wrongly classifies content is that the technology for detecting AI-generated content is still in its infancy, so these detection tools are not infallible. Accuracy will continue to improve over time, but for now, the Originality.AI tool does not give the most reliable results, and it’s better to use other detectors like Crossplag or Sapling.ai.

In other words, Originality.AI hasn’t quite nailed it yet with its detection tool, and it’s never a good idea to rely on it alone to determine whether content is AI or human written. If you do use it, be sure to cross-reference results with other AI detection tools, and also read the content yourself to look for obvious signs of AI vs human input.

AI Content Detector Tools Tested (How Does Originality.AI Fare?)

The current limitations of Originality.AI’s content detection tool as of 2023 have been revealed in the great video below from Alex at WP Eagle, where he tests 8 pieces of content (2 human and 6 AI written) in 4 different detection tools – Originality.ai, Crossplag, Content At Scale Detector and Sapling.

Put simply, if you’re having problems with the Originality.ai tool giving you incorrect answers on content, you’re not alone:

Originality.ai and other AI detector tools tested (surprising results)


Here are some summary insights pulled from the video:

  • Overall, none of the tools are very good at consistently and reliably detecting AI content. They can mostly correctly detect human content, but a lot of AI content passes through undetected as human.
  • However, in comparison, the Originality.AI content detector was by far the worst, wrongly classifying AI content as human and vice versa, so it’s worth considering alternatives. It only got 2 out of 8 tests correct (and even then only sort of, still flagging bits of 100% human content as AI), which is poor for a paid service.
  • ChatGPT seems to be the easiest AI content generator to detect (however, Originality.AI’s checker failed even at this in the above test, indicating some of the claims on their homepage on accuracy are at least questionable).
  • Other AI content generation tools such as Jasper AI and Content At Scale AI are harder to detect (oddly, Content At Scale’s detector sometimes couldn’t even detect its own content). Jasper AI seems particularly good at producing AI content that passes through the tools as human.
  • The other 3 AI detection tools performed better, but none were anywhere near perfect in consistently detecting AI generated content.

Therefore, although ChatGPT is probably the most talked about AI content generation tool at the moment (and certainly the most crowded – it’s hard to even get on the tool sometimes), it also seems to be the easiest for tools to detect, and therefore you might want to use other AI tools instead for generating content.

Should You Even Use The Originality.ai Detector Tool?

Of course, the test done in the video above is just one experiment and it’s arguable you’d need a lot more data points to come to a firm conclusion on how effective the Originality.ai tool is.

However, these results are a bit disappointing, given that Originality.AI is a paid-for tool whilst the others are free, yet the free detectors seemed to do a better job of picking up whether articles were human or AI generated.

Therefore, it’s worth doing your own tests: run some of your own articles (both human and AI generated) through the tool and see if you get the same results. If you do, then you’re probably not getting your money’s worth out of the tool yet (it’s early days though, and the technology will likely improve quickly).

You’re likely better off trying these tools instead:

  • Crossplag
  • Content At Scale AI Detector
  • Sapling

None of these are perfect either, but they fared better than Originality.AI, so if you use all 3 together to cross reference and check, you might get a reliable answer sometimes.
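The cross-referencing idea above can be sketched in a few lines of code. Everything here is hypothetical: the detector names are just labels, the scores are made-up numbers, and real tools each have their own interfaces rather than returning a simple probability like this. It only illustrates the principle of taking a majority vote across detectors instead of trusting any single one.

```python
def majority_verdict(scores, threshold=0.5):
    """Classify content as 'ai' if most detectors score it above the
    threshold, otherwise 'human'. `scores` maps a detector label to a
    (hypothetical) probability that the content is AI generated."""
    ai_votes = sum(1 for s in scores.values() if s > threshold)
    return "ai" if ai_votes > len(scores) / 2 else "human"

# Made-up example scores for one article from three detectors:
scores = {"crossplag": 0.20, "content_at_scale": 0.35, "sapling": 0.60}
print(majority_verdict(scores))  # only 1 of 3 flags it, so: human
```

A majority vote is deliberately conservative: one detector disagreeing with the other two gets outvoted, which matches the article’s point that no single tool is reliable on its own.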

But overall, AI content detection tools are still in their infancy, so nothing beats reading content as a human and using common sense to decide whether a human or an AI tool wrote it. Does it read naturally and relatably, with humor, anecdotes and passion for the topic? Or is it dry and encyclopedia-like in its tone? Are certain phrases used repetitively? Is searcher intent fully anticipated and dealt with, or are there obvious gaps in the content that would signal an AI wrote it?

These are the things to consider when looking at content, but for now at least, it’s safe to say you can’t really rely on the Originality.AI tool to give you reliable answers on content. Maybe in 2024!

