15 Common On-Site SEO Mistakes
Search is a big deal. Consumers all over the globe rely on search engines to find what they want, and today people access information and purchase products with search engines as their starting point. So ensuring your web pages are properly optimized for SEO (search engine optimization) isn't optional; it's critical.
There are two parts to SEO. These are on-site (or on-page) and off-site (or off-page). On-site SEO is more technical in nature than off-site. Also, it’s the primary reason many websites experience irregular ranking fluctuations. As a business owner, you should be aware of the on-site SEO mistakes that may have you thinking Google hates you.
You see, a basic understanding of SEO just isn't enough to identify costly SEO errors anymore. SEO is technical, marketing-oriented, and user experience (UX) focused in nature. Fortunately, SEO Explode's got your back. We'll be discussing fifteen of the most common on-site SEO mistakes to avoid.
1. Duplicate Web Copy
An SEO study by SEMrush revealed some interesting data: 50 percent of all the websites they analyzed had a problem with duplicate content. Although content duplication has been the subject of much discussion in the SEO industry, it's clearly still a common issue.
Worth noting is that the copy in question has to be substantive for content to be considered duplicate. So site-wide navigations or general short disclaimers aren’t necessarily duplicate content.
For example, copying and republishing an entire article from CNN would be considered duplicate content. Similarly, republishing the same articles on your website more than once is content duplication.
That being noted, duplicate content alone won't cause your site to be penalized by Google. Search engine algorithms (a form of AI) can detect content duplication, and it usually isn't considered deceitful in intent. However, in most instances, the logical response is to demote the 'copy cat' and elevate the original.
Is Duplicate Content Hurting You?
Duplicate content may be affecting your rankings in the following ways.
- AI might be demoting some of your web pages because they're duplicates of another known copy. Search engines prefer displaying unique content to searchers instead of regurgitated web copies. They use a variety of factors, such as date of discovery and other historical data, to show the most suitable results.
- Duplicating content without proper attribution may have placed your site in a 'spam bucket'. Basically, a website's potential for spam or being harmful to users increases whenever the domain has plenty of duplicate content.
- You might have a silent or algorithmic penalty if you’ve got a lot of duplicate content.
2. Failure to Hire Good Writers
Surprisingly, this is a common on-site SEO mistake. Good copy is the bedrock of any effective marketing campaign. SEO is a marketing discipline but it’s easy to become too focused on the technical aspects.
You can't convince visitors to stick around with badly written web copy, let alone convert them into paying customers. And convincing people to link back to your content and share it with others is excruciatingly difficult when the copy is bad.
Compelling copy enhances engagement, converts better, broadens your potential audience, and elevates your reputation.
3. Targeting The Wrong Keywords
What is the purpose of your content? What does the searcher want to achieve? These are questions you should ask yourself every time the team decides on a keyword target. Answering them is how you ensure that you’re choosing the right keyword.
You’ll make sure your content satisfies the user by knowing its intended purpose. Knowing the searcher’s objective will help you create a more effective web page or piece of content. A great way of doing this is to analyze existing search results for your target term.
4. Unspecified and/or Missing Meta Title Tags
Metadata is additional information about a web page that is used in search engine results pages (SERPs). The meta title is what is shown to searchers as your headline.
For years, the meta title has been one of the most influential on-site SEO factors, and it's a search engine's starting point for understanding a page's context.
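In HTML, the title lives inside the page's head element. A minimal sketch (the page name and brand here are just placeholders):

```html
<head>
  <!-- Shown as the clickable headline in the SERPs -->
  <title>15 Common On-Site SEO Mistakes | SEO Explode</title>
</head>
```

Every page should have its own unique title element.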
5. Not Optimizing The Meta Title
People generally read your headline before deciding what to do next. Whether someone clicks through to your pages from the SERPs hinges on the first impression. Specifically, the reason your title gives them to proceed.
Your headline must be optimized to attract clicks. Killer titles include one or two of the following elements.
- A perceived benefit
- Implied instant gratification
- An element of curiosity
Let's take a look at the following two titles. Which one are you most likely going to click on?
“13 Facts You May Not Know About Bon Jovi – Ultimate Classic Rock”
“Jon Bon Jovi Forms a Band – Today In History | Like Totally 80s”
You're probably more attracted to the former because it sparks curiosity.
Keep meta titles around 50-60 characters so they don't get cut off on the SERPs, and avoid duplicating titles across pages.
6. Unspecified and/or Missing Meta Description Text
Like titles, meta descriptions are important because they help users decide what to click on from the SERPs. Whenever one isn't specified, search engines will generate a description from your page's content.
This isn't good because you'll often end up with meta descriptions that perform poorly. Search engines like Google use CTR (click-through rate) as another indication of a web page's value: the idea is that a page must be good if users keep clicking through to it without bouncing back.
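A hand-written description sits in the head element alongside the title. A minimal sketch (the wording is just a placeholder):

```html
<head>
  <title>15 Common On-Site SEO Mistakes | SEO Explode</title>
  <!-- Roughly 150-160 characters usually displays without truncation -->
  <meta name="description"
        content="Avoid these fifteen common on-site SEO mistakes, from duplicate copy to slow pages, and keep your rankings steady.">
</head>
```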
7. Missing Image Alt or Alternative Text
The alt attribute is an HTML image attribute used to describe an image whenever it cannot be rendered. The keywords found within your alternative text are used for ranking your page and image. Missing alt text also affects the visually impaired, because it's what screen readers read aloud to them.
You won't be maximizing keyword usage if your images have no alt text, and it's a disservice to blind users. When an image fails to load, web browsers display the alt text in its place.
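For example (the file path and description below are hypothetical):

```html
<!-- The alt text describes the image for screen readers and failed loads -->
<img src="/images/red-running-shoes.jpg"
     alt="Pair of red running shoes on a wooden floor"
     width="600" height="400">
```

Describe what's actually in the image; don't just stuff keywords into the attribute.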
8. Not Monitoring For Broken Links
Broken links are a common on-site SEO issue that isn’t likely going to disappear. They’re inevitable as your website grows but can be managed. Do so by ensuring you have a means of monitoring links so problems can be corrected quickly.
Is your website built using the WordPress platform? If so, use the Broken Link Checker plugin for monitoring. Install, activate and set up the plugin to start receiving email alerts whenever a broken link is discovered on your site.
Not on WordPress? No problem.
Use Screaming Frog to scan your site for broken links once every month or quarter depending on how large it is. The tool lets you crawl a maximum of 500 URLs per scan on the free version.
Websites with tons of broken links are perceived as low quality, which makes sense. You’d be sending bots and humans to dead ends.
9. Missing 404 Page
A 404 page is a specific web page shown to visitors whenever a destination page can't be found. Web hosting providers typically generate a default 404 page that may also include ads and links to their own homepage. The problem with this is that it isn't user-friendly and eliminates any opportunity you have for recapturing a user's attention.
A well-optimized 404 page helps retain traffic by professionally rerouting visitors to other sections of your site.
Here are some quick 404-page tips you can use right now.
- Use a strong headline that lets users know why they've arrived at a 404 page.
- Add an obvious CTA (call-to-action), letting people know what to do next.
- Enable search. Even if your site doesn't have a search function, you can add one with Google Custom Search, and it's free!
- Include your site navigation.
- Keep information minimal and to the point. There'll be fewer distractions, which increases the likelihood that visitors will take a favorable action.
Effective 404 pages are usable.
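Putting those tips together, the body of a custom 404 page might look something like this sketch (the search endpoint and copy are placeholders):

```html
<!-- Sketch of a usable custom 404 page -->
<main>
  <h1>Sorry, we can't find that page.</h1>  <!-- strong, explanatory headline -->
  <p>It may have been moved or deleted.</p>
  <a href="/">Back to the homepage</a>      <!-- obvious CTA -->
  <form action="/search">                   <!-- let users search instead -->
    <input type="search" name="q" placeholder="Search this site">
    <button type="submit">Search</button>
  </form>
</main>
```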
10. Accidental Keyword Stuffing
Sometimes we get so excited about ranking for a particular term that we unknowingly stuff keywords. Search engine algorithms are more sophisticated than ever, so it's near impossible to get away with keyword stuffing. Websites that partake in such behavior are usually penalized or de-indexed.
Always check how often you’ve used your keyword target and whether or not your content sounds natural. Readability is more important than exact match term repetition.
11. Slow Site
We're in the era of the smartphone, and Google in particular uses site speed as a ranking factor for both mobile and desktop searches. Hence, pages that load fast perform better on the SERPs.
Consumer expectations have also gone up and people expect things to happen quickly. Ideally, your web pages should take less than 3 seconds to render.
12. Not Using HTML Heading Elements
HTML (Hypertext Markup Language) has six levels of headings that are hierarchical in nature. Each begins with 'h' followed by a number that indicates its importance: h1 carries the most weight, then h2, h3, and so on.
<h1>My Greatest Title Here</h1>
Another common on-site SEO mistake is when these elements aren’t used. Your headings and subheadings need to be wrapped by these tags. Note that the H1 element should only be used once per page but the others can be implemented several times.
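A page's heading structure should descend in order, like an outline. A small sketch (indentation added only to make the hierarchy visible):

```html
<h1>15 Common On-Site SEO Mistakes</h1>     <!-- only one h1 per page -->
  <h2>Duplicate Web Copy</h2>               <!-- major sections -->
    <h3>Is Duplicate Content Hurting You?</h3>
  <h2>Targeting the Wrong Keywords</h2>     <!-- h2 and below can repeat -->
```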
13. Bulky Code
Poorly coded websites often include a significant amount of unnecessary code. This poses a problem because, unlike humans, bots read everything! Bulky code slows a website down and affects keyword density, which is how often a keyword occurs relative to the page's total word count. It's best to keep things lightweight by removing anything that's unnecessary.
14. Low Word-count
Word-count isn’t a ranking factor but a metric to consider based on the purpose of a page. Some web pages perform well with low word-count, while others do not.
For example, an image gallery probably doesn’t need a lot of text to rank highly on the SERPs because its purpose is clear. However, an article about a topic that should have a higher word-count might be demoted since things won’t add up. Word-count needs to make sense.
15. Too Many External Links
PageRank (PR) is a metric Google still uses internally even though it's no longer updated publicly. PR estimates the value of a web page based on links. Each external link present on your page passes link equity to the linked web page, which dilutes your overall ranking power.
External links are good for SEO because they can make your content more valuable to visitors. But, too many external links can prevent a page from ranking. You should have a balance between external and internal links. One to three of each is recommended.