According to a recent Search Engine Journal post entitled Google on Effect of Low Quality Pages on Sitewide Rankings, the amount of low-quality content hosted on a website can impact the rank of even the best pages on the same site. In other words, it suggests that if you hide your masterpieces under a mountain of rubbish, the former will be hard to find. Who would have thought?
What does low quality even mean? Google’s interpretation of what it means for pages to be qualitatively weak is, at best, vague. This is what the Google representative said on the subject of determining the quality of a page’s content:
[D]epending on the website, sometimes that’s possible. Sometimes that’s not possible.
The truth is, Google doesn’t read web pages: it parses the copy and media, then counts the words it finds and evaluates how relevant and consistent they are against the expected intent (aka keywords). Search engines can’t discriminate between low and high quality, between good and bad. That is, unless there are obvious, formal reasons to disqualify content.
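To make the "counting words against expected intent" idea concrete, here is a deliberately naive sketch of keyword-frequency scoring. The function name and scoring formula are illustrative assumptions, not Google's actual method; the point is only that such a mechanism measures word overlap, not quality.

```python
from collections import Counter
import re

def keyword_score(page_text: str, keywords: list[str]) -> float:
    """Toy relevance score: the fraction of a page's words that match
    the expected keywords. Purely illustrative -- real ranking systems
    combine many more signals, none of which judge 'good writing'."""
    words = re.findall(r"[a-z']+", page_text.lower())
    if not words:
        return 0.0
    counts = Counter(words)
    hits = sum(counts[kw.lower()] for kw in keywords)
    return hits / len(words)

# A focused page scores higher than one diluted with filler,
# regardless of which one a human would call "better".
focused = "python tutorial covering python basics and python syntax"
diluted = "welcome to our site click here subscribe now python maybe later"
print(keyword_score(focused, ["python"]))  # higher
print(keyword_score(diluted, ["python"]))  # lower
```

Notice that a badly written but keyword-dense page can outscore a well-written one under this kind of metric, which is exactly why "quality" in the human sense is invisible to it.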
So what can, and should, site owners do about it?
Google [...] appeared to say that [it] tries to focus on page quality instead of overall site quality, when it comes to ranking.
The conclusion is rather simple and self-evident: from a business point of view, site administrators should always try to provide as much quality content as possible to their real users. Those users are humans. As long as this goal is fulfilled, there should be little SEO work left to do.