Back in the ’90s, when web search was first introduced to Internet users, website owners started competing to be found in search indexes more easily.
Over the following decade, the competition between websites grew fierce. People would use any tactic to get a higher search rank. You’ll see some of those tactics below.
Google — now the absolute leader in the search engine market — was the first to introduce technologies for ranking quality content.
Since 1998, search engines have been competing to deliver the best results. That’s why they keep improving their algorithms (see Google’s history of doing so) and ignoring or banning websites that try to “hack” their way to the top results with poor content.
Thankfully, the constant updates have largely put an end to practices that sacrifice content quality for the sake of a good rank. Let’s see what those practices are:
1. Keyword stuffing
Yes, web search is and has always been conducted through keywords.
That’s why marketers and content creators focus on keeping the connection between content and keywords clear. And that’s why they came up with the term keyword density.
The idea was that the keyword had to be repeated several times on a given page for search engines to pick it up.
Different sources suggested different optimal densities, ranging from 0.5% (which is fine even today if you use the words naturally) to as much as 4%, where the keyword appears four times in every 100 words.
That is, every 25th word in the text is the keyword, and that just never sounds organic.
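If you want to sanity-check your own text, the density calculation is simple enough to script. Here’s a minimal sketch (the function name and regex are my own, not part of any SEO tool):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Return the keyword's share of the total word count, as a percentage."""
    words = re.findall(r"[\w'-]+", text.lower())
    if not words:
        return 0.0
    # Count whole-phrase occurrences so multi-word keywords work too.
    keyword_words = keyword.lower().split()
    n = len(keyword_words)
    hits = sum(
        1 for i in range(len(words) - n + 1) if words[i:i + n] == keyword_words
    )
    return 100 * hits * n / len(words)

# 100 words, half of them the keyword: a blatantly stuffed text.
text = "stuffing stuffing is bad " * 25
print(round(keyword_density(text, "stuffing")))  # 50
```

Anything that scores in the stuffing range is a signal to rewrite, not a target to hit.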
If “keyword stuffing” was our keyword, a stuffed paragraph would’ve looked something like this: “Keyword stuffing is bad for SEO. Stop keyword stuffing today, because keyword stuffing makes keyword-stuffed pages unreadable.”
Besides this obvious “hack”, there was an advanced form of keyword stuffing.
Marketers would use the keywords repeatedly in the meta information or go as far as putting the stuffed text in the same color as the background to make it invisible.
Today, there’s no popular search engine that doesn’t recognize and ignore stuffed content. You’ll be totally fine using synonyms of the keyword and varying the wording, as long as the content actually covers the keyword/query you want to rank for.
2. Re-publishing content (syndicated articles)
Let’s start this part by explaining the concept of link building. It’s the process of acquiring links from other websites to your own.
When search engines crawl the web, hyperlinks help them navigate from other pages to yours. If you want a higher rank, you ask a well-ranking website to publish your content and include a hyperlink to your site.
Back in the ancient times of the late 1990s and 2000s, a lot of marketers abused this system. They would write an article with a link to one of their pages and publish the article on multiple sources to increase their own ranking.
Search engines soon figured out that users often see the same content in several top results and took measures to prioritize useful and diverse content.
Now duplicate content won’t rank and can often be marked as spam.
Today, syndicated content only works if you want to deliver basic info to several audiences and are willing to follow a few rules, such as adding a “noindex” tag or linking back to the original.
There are acceptable ways to republish your old content, though. Update it every now and then, change the format (text, video, podcast, presentation, etc.), and republish it as brand-new content.
Keep the reader in focus: when would you read an article about something you already know? Only when it has added value, right? Here’s an article from RSWebsols on re-publishing old content on your own website for some fresh traffic.
3. Paraphrasing duplicate content
As soon as marketers realized that duplicate content had stopped ranking, they had to look for alternatives. The 2000s were when rewriting became a popular SEO tool.
Facing the same problem as students trying to mislead plagiarism checkers, marketers found the same solution: paraphrasing and rewriting. This practice even allowed them to steal content from competitors without facing copyright infringement cases.
Today’s plagiarism and duplicate checkers, such as Copyleaks, are much more advanced and can easily detect synonyms and slight alterations.
Popular search engines use similar algorithms to filter content and deliver truly unique pages.
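Commercial checkers use far more sophisticated methods than anything shown here, but the core idea — scoring how much two texts overlap — can be illustrated with Python’s standard library (the sample texts are invented for this toy example):

```python
from difflib import SequenceMatcher

def similarity(a: str, b: str) -> float:
    """Rough character-level similarity between two texts, from 0.0 to 1.0."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

original = "Duplicate content stopped ranking, so marketers turned to rewriting."
rewrite  = "Duplicate content no longer ranked, so marketers switched to rewriting."

# A light paraphrase still scores high, which is why simple rewording fails.
print(f"{similarity(original, rewrite):.2f}")
```

Real detectors go further, matching synonyms and reordered sentences, so the margin for “spinning” content is even slimmer than this sketch suggests.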
4. Separate pages for each keyword
You might’ve noticed how today’s content writers tend to optimize each subheading/paragraph separately. That’s become a bigger trend with the rise of voice search.
Well, 20 years ago, the opposite was trendy. To rank high for every given keyword, marketers would create new, separate pages. This looked especially ridiculous when e-commerce websites started creating multiple pages for a single product just for the sake of keywords.
You can still find similar tactics on websites like AliExpress where the platform doesn’t fully control the ways sellers market their products.
Now it’s much easier for a single page to rank for multiple keywords (although 2-3 are preferable). A proper description and meta information will solve the problem, especially with search engines recognizing synonymous phrases.
5. Focusing on text length
All the tips you can find today suggest writing more than a given amount: more than 500 words, more than 1,000, more than 1,900, and so on.
The problem was, SEO specialists used to take it way too literally. Need 1500 words? 1500 it is, no more and no less. There were also guides defining the ideal length of paragraphs and sentences.
Just like keyword stuffing, these approaches made texts inorganic, dull, and sometimes stretched to no point.
Today, we focus on the quality of content and try to deliver comprehensive information which is also easy to skim.
The only parts that require a specific word count are those with design restrictions: snippets, social media posts, email subject lines.
6. Comment spamming
Just like republishing content, comment spamming was considered an easy way to build backlinks.
To be fair, it’s widely used today as well. That’s why we have to keep 10Web Blog’s comments on hold until we approve them or mark them as spam.
The number of comments left just for the link’s sake is extreme. Sometimes the links are hidden in the commenter’s username, but the spam becomes obvious when the comment makes no sense.
Many publishers simply add “nofollow” to comment links or disable hyperlinks in comments completely to avoid spam attacks.
7. Link exchange
This is the #followforfollow of the 2000s. Just like asking for followers in exchange for following them back, marketers used to insert a hyperlink into one of their pages in exchange for a backlink. There was even software that automated link exchange.
The problem is, you have to be very careful about choosing the websites you want to work with. Are they in the same niche as you? Do they allow the links to index? What’s the context of the page that includes the hyperlink?
Today’s alternative to link exchange is guest blogging. You provide quality material to the publisher and talk over all the conditions of your cooperation, including the number of indexed links, content quality, cost if there’s any, and so on.
8. Exact-match anchor texts
Anchor text is the clickable text a reader sees in a hyperlink. It matters to search engine crawlers, which read the text and associate it with the linked page. Marketers of the 2000s used to attach links to exact keywords.
Back when Google announced its Penguin update, natural-sounding anchor texts gained priority over exact-match ones.
9. Exact-match domains
When you’re searching for a brand name like Nike, you’d expect to see Nike.com among the top results. That’s how search used to work 15 years ago.
This system was extremely easy to abuse. Marketers would buy domain names matching their top keywords and search queries.
So, looking for SEO advice for small businesses, you could stumble upon seo-advice-for-small-businesses.com. That’s how exact-match domains worked.
Google soon realized this compromised the quality of the top search results and moved domain names way down the priority list.
In 2021, you still want to optimize the slugs (the part of the URL that comes after the domain name) but not the entire domain.
10. Only optimizing for text search
Voice search has become a global trend over the last few years. Home assistants like Alexa, along with various desktop and mobile assistants, have made search easy and fast through voice commands.
Regular text search is still important, but now you have to make additional efforts because the rules work a bit differently for voice search. You can find some of the differences and new rules here on our blog.
10Web cares about your ranking and has developed software that helps you optimize your content and avoid technical SEO issues.