Search Engine Optimization is an ongoing process: you have to keep refining your optimization strategy to keep your site working well and appearing at the desired position on the web. We make a lot of changes to our sites, from changing the theme to adding new plugins, from updating plugins to refreshing old content. All these things ultimately affect your ON Page SEO health, which reflects in your site's performance and ranking.
To make sure that your site always performs at its best, you should perform ON Page Optimization at regular intervals. Link building is the only truly ongoing process; everything other than link building changes frequently. That's why you should carry out the optimization process often and carefully to keep the site's performance at its peak.
But before diving into the ON Page Optimization Checklist, you should have a clear idea of what changes come under White Hat SEO (the legitimate way to perform SEO) and what comes under Black Hat SEO (the unethical way to perform SEO).
What Is White Hat SEO?
White Hat SEO techniques are practices that follow the webmaster guidelines, and you should follow them to improve your site's performance and search results. Applied consistently, these practices deliver positive results.
1. Follow up the Guidelines
The first and most important thing is to follow the Search Engine Guidelines to get the best out of SEO. While making any change on the site, go through the guidelines to cross-check it.
Make sure that you are not doing anything that is against the guidelines, because if you implement changes that violate them, knowingly or unknowingly, it will negatively affect the site.
If a site gets penalized by the search engines, it becomes really tedious and difficult to detect the reason and recover from the loss caused by the penalty. So it's very important to keep ourselves updated with all recent search engine guidelines. For a complete reference on search engine guidelines, Click Here.
2. Stay away from Duplication!!!
The secret of well-ranking websites is originality and trust. If you want good rankings and traffic, you must serve your readers genuine content without any kind of deception or duplication.
If you ever try to fool your readers and crawlers, you will get nowhere and will only lose trust, authority and ranking. Search engine crawlers are very smart and detect black hat SEO practices very quickly.
SEO has changed drastically over the past few years. Tactics that worked very well a few years back for boosting site rankings don't work today. Search engines are getting smarter at a rapid pace, and trying to fool them invites a loss that is difficult to recover from.
3. First priority is your readers
Every site's foremost purpose is to serve its visitors, not the search engine crawlers. If you write something that solves your readers' problems, you hardly need to implement every SEO practice, because you will already attract a good amount of organic traffic. Your readers will keep coming back in search of solutions, simply because of the trust and authority you have built. The classic examples are NeilPatel and Backlinko.
4. Information-oriented useful web pages
Always create web pages that lean towards useful information rather than carrying tons of ads. Too many ads ruin the browsing experience, which leads to a higher bounce rate.
Nobody likes to see ads on a site, especially while reading something or searching for a solution at the eleventh hour. We all know that ads are a great source of income, but when there are too many of them, nobody likes to visit the site again.
Apart from these techniques, there are also many unethical practices, which lead to penalties and loss of reputation.
What Is Black Hat SEO?
Black Hat SEO techniques are those which go against the webmaster guidelines to manipulate rankings. They may work for a short while, but they eventually attract penalties.
1. Keyword Stuffing
What is the correct keyword density? This is one of the most important and debatable questions, and there is no guaranteed answer. When you research this dilemma, you will find points like the following:
- Drop your keyword in the first 100 words.
- Keep keyword density around 1-3%.
- Only use a keyword where it looks natural, not stuffed… and so on.
But if you overuse your keyword, it becomes Black Hat SEO. Keyword Stuffing is the unethical practice of using the focus keyword too many times purely for crawling purposes.
Using a keyword too many times distracts both users and crawlers, which leads to a penalty from the search engines and hurts your ON Page SEO health. It is one of the black hat On-Page SEO techniques.
The ideal way to use the main focus keyword is to place it in the title tag of the blog post, in the first paragraph (if possible), in the meta description, and in the Alt tag of an image.
To avoid stuffing your focus keyword, use LSI (Latent Semantic Indexing) keywords alongside it. These are synonyms and closely related phrases of the main keyword, and they are a blessing for your site's SEO.
For example, suppose you run a blog on motivational stories and a user searches for "inspirational stories" instead of "motivational". If you have used "inspirational stories" as an LSI keyword, chances are your blog will also rank for that query.
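A quick way to sanity-check the 1-3% density guideline is to compute it directly. Here is a minimal Python sketch; the function name and the exact definition of "word" are my own illustrative choices, not a standard:

```python
import re

def keyword_density(text, keyword):
    """Percentage of words in `text` that are the given (single-word) keyword."""
    # Split on anything that is not a letter, digit or apostrophe.
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    hits = words.count(keyword.lower())
    return round(100 * hits / len(words), 2)

print(keyword_density("seo tips for seo beginners learning seo", "seo"))
# 3 hits out of 7 words, roughly 42.86% - far too stuffed
```

If the number this returns is well above the low single digits, the copy almost certainly reads as stuffed to both users and crawlers.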
2. Buying Backlinks
If you buy backlinks in the hope of increasing traffic and rankings, it will decrease your ranking rather than improve it, because such backlinks are usually of very poor quality and futile. Don't buy backlinks from anyone; it will only hurt your ranking.
Apart from buying backlinks, creating a PBN (Private Blog Network) can also harm the site in the long term. A private blog network is a group of low-quality sites that all point to the main site to inflate its backlink profile. This practice may lift rankings for a while, but once a search engine notices the trap, it will penalize the site badly.
3. Gateway pages
These are useless, low-quality web pages with no substantial content, made purely for keyword stuffing.
4. Mirror Websites
These are websites hosted multiple times with similar, duplicate content which is of no value to readers or search engines.
There may be only a slight difference between White and Black Hat SEO techniques, but their results are completely opposite. For a stable and better-performing site, you must perform On Page Optimization on a regular basis.
Here are a few audits for the best ON Page SEO health results.
1. Link Checkup
Links are one of the most important factors in SEO, and they break very frequently, so you should do a link check on a periodic basis. Broken links damage site performance badly; you can easily fix internal broken links with the help of a broken link checker, but broken external links are much harder to manage.
That is because you have no control over external broken links: the page you linked to may no longer exist. Such links return a 404 (Page Not Found) error, which affects your site's ranking negatively.
If you are a WordPress user, there are many plugins that detect broken links and let you fix them with a simple redirect. To fix a broken internal link, the first and easiest way is to use a 301 redirect.
If you are not a WordPress user, you can find broken links with Google Search Console. Simply log in and go to Crawl -> Crawl Errors, then check for "404" and "not found" errors in the URL Errors section.
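For a small list of external links, you can also run a rough check yourself. The sketch below uses only the Python standard library; `check_link` and `broken_links` are hypothetical helper names, and a real crawler would need politeness delays and retries:

```python
import urllib.request
import urllib.error

def check_link(url, timeout=10):
    """Return the HTTP status code for a URL, or None if unreachable."""
    req = urllib.request.Request(url, method="HEAD",
                                 headers={"User-Agent": "link-checker"})
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code            # e.g. 404 Page Not Found
    except (urllib.error.URLError, OSError):
        return None              # DNS failure, timeout, etc.

def broken_links(urls, checker=check_link):
    """Map each broken URL (4xx/5xx or unreachable) to its status."""
    results = {}
    for url in urls:
        status = checker(url)
        if status is None or status >= 400:
            results[url] = status
    return results
```

Anything this flags as 404 should get a 301 redirect or have the link removed, exactly as described above.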
2. Check Robots.txt files
It is very important to check your Robots.txt file regularly. We often block important resources unknowingly, so they never get crawled by the search engines. In a CMS like WordPress it's very easy to block something important, because WordPress ships with many default settings.
To check your Robots.txt file go to Google Search Console -> Google Index -> Blocked Resources.
If any of these resources are blocked, you can enable them by adjusting the Allow and Disallow rules in your Robots.txt file. Once allowed, crawlers will be able to crawl your site's important content. To verify that these resources are now crawlable, go to Crawl -> Robots.txt Tester in Google Search Console, enter the URL, and hit the Test button.
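You can also test your rules offline with Python's built-in robots.txt parser. The robots.txt content below is a hypothetical WordPress-style example, not your actual file (note that Python's parser applies the first matching rule, so the Allow line comes before the broader Disallow):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical WordPress-style rules: block the admin area,
# but keep admin-ajax.php crawlable.
ROBOTS_TXT = """\
User-agent: *
Allow: /wp-admin/admin-ajax.php
Disallow: /wp-admin/
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

print(parser.can_fetch("*", "https://example.com/blog/my-post/"))        # True
print(parser.can_fetch("*", "https://example.com/wp-admin/options.php")) # False
```

This mirrors what the Robots.txt Tester does: feed it a URL and see whether your rules allow crawlers through.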
3. Regularly check your indexed pages
It is very important to check that all your pages are properly indexed by the search engines. To check this, simply type "site:sitename.com" into the search bar and hit enter. This shows you whether all your quality pages are properly indexed, and whether any low-quality pages have slipped in.
What are low-quality pages?
If you have a search box on your site, every search made through it generates a result URL, and such pages are considered low quality: they are just links, and there is no need for crawlers to index them. Pages that duplicate the same content are also considered low quality.
You can simply discard these pages by disallowing them in Robots.txt; once disallowed, the search engines will no longer crawl them, which improves site performance. You can also block certain URLs from crawling in Google Search Console under Crawl > URL Parameters.
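As an illustration, WordPress internal search results typically live under URLs like /?s=query, so a hedged Robots.txt fragment to keep such low-quality pages out of the crawl might look like this (adjust the paths to match your own site's search URLs):

```
User-agent: *
Disallow: /?s=
Disallow: /search/
```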
4. Do a HTML Source Check
An HTML source check is the best way to verify that all meta tags are added to the right pages. It's also the best way to catch errors that need to be resolved.
You can check the following things to make sure everything is working fine.
- Check whether the page has a meta robots tag, and ensure it is set up properly and working well.
- Check whether the page has a rel="canonical" tag and make sure it points to the proper canonical URL.
- For mobile responsiveness, check whether the page has a viewport meta tag.
- Check for the OG (Open Graph) tags in the pages.
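The four checks above can be automated with the standard library's HTML parser. This is a minimal sketch, assuming well-formed head markup; the class name and the sample page are illustrative:

```python
from html.parser import HTMLParser

class MetaAudit(HTMLParser):
    """Collect the head tags that matter for an On Page audit."""
    def __init__(self):
        super().__init__()
        self.found = {}

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name") == "robots":
            self.found["robots"] = a.get("content")
        elif tag == "meta" and a.get("name") == "viewport":
            self.found["viewport"] = a.get("content")
        elif tag == "meta" and a.get("property", "").startswith("og:"):
            self.found.setdefault("og", []).append(a["property"])
        elif tag == "link" and a.get("rel") == "canonical":
            self.found["canonical"] = a.get("href")

# A tiny sample page with all four tags present.
page = """<html><head>
<meta name="robots" content="index, follow">
<meta name="viewport" content="width=device-width, initial-scale=1">
<link rel="canonical" href="https://example.com/post/">
<meta property="og:title" content="My Post">
</head><body></body></html>"""

audit = MetaAudit()
audit.feed(page)
print(audit.found)
```

Any key missing from `audit.found` after feeding a real page is a tag you still need to add.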
5. Mind Your Down Time
It's necessary to watch your site's downtime patterns. Downtime ruins your visitors' experience, which hurts the site's ON Page SEO health in the long term. Frequent downtime negatively affects the site's ranking and your whole Search Engine Optimization strategy, so you should audit your site's availability regularly.
Ask your hosting provider to send performance reports regularly so that you can easily track site performance. If you are experiencing too much downtime, you should switch to a faster, more reliable hosting service.
Numerous hosting services offer hosting at a very low cost, but their performance is not consistently good and they are often unreliable. Switching to better hosting gives better performance and less downtime, which means a lower bounce rate.
6. Blocking Scripts Check
Make sure your CSS and JavaScript files are not blocked from crawlers, for example through Robots.txt. Search engines render pages much like a browser does, so if important scripts are blocked, crawlers may see a broken version of the page, which hurts your ON Page SEO health. The Blocked Resources report mentioned above will show you exactly which scripts are affected.
7. Check Site Loading Speed
The site's loading time matters the most for keeping the bounce rate low. For a good ON Page score, the site must load fast, i.e. within 3 seconds. According to research, 75% of users will not revisit a site that takes more than 4 seconds to load. You can easily conduct a site speed audit with Google's free tool.
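For a very rough first check from your own machine, you can time how long the server takes to start responding. This standard-library sketch measures time to first byte only, not full render time, so treat it as a supplement to Google's speed tool, not a replacement; the function name is my own:

```python
import time
import urllib.request

def time_to_first_byte(url, timeout=10):
    """Rough load-time check: seconds until the server starts responding."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        resp.read(1)  # wait for the first byte of the body
    return time.monotonic() - start
```

If this alone is already approaching a second or more, the full page load is almost certainly outside the 3-second budget.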
Here are some brief, proficient tips for On Page SEO.
1. Make sure all your URLs are SEO friendly, which means both humans and search crawlers can easily understand what a URL is about.
2. To take maximum benefit from On Page SEO, try to put your main keyword at the start of the title, because according to Google the first 3-5 words hold more importance.
3. Add modifiers to the title tag to make it more attractive and eye-catching, such as "Best", "Complete Guide", or "Cheat-Sheet".
4. Put your main heading in an H1 tag, because the H1 tag represents the main, most powerful heading. If you use a CMS like WordPress, it adds the post title to an H1 tag by default.
5. Use your focus keyword in the alt tag of your images so that you also get traffic from image search.
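Tip 1 above (SEO-friendly URLs) is easy to enforce programmatically. Here is a minimal slug generator, assuming you want short, lowercase, hyphen-separated slugs; the exact cleanup rules are a common convention, not a standard:

```python
import re
import unicodedata

def slugify(title):
    """Turn a post title into a short, readable, SEO-friendly URL slug."""
    # Normalize accented characters to plain ASCII where possible.
    text = unicodedata.normalize("NFKD", title).encode("ascii", "ignore").decode()
    text = text.lower()
    # Keep letters, digits, spaces and hyphens; drop everything else.
    text = re.sub(r"[^a-z0-9\s-]", "", text)
    # Collapse runs of whitespace/hyphens into single hyphens.
    return re.sub(r"[\s-]+", "-", text).strip("-")

print(slugify("10 Best On Page SEO Tips!"))
# 10-best-on-page-seo-tips
```

A slug like this tells both humans and crawlers at a glance what the URL is about.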
From the above, we can conclude that On Page Optimization is an ongoing, time-consuming process with many aspects. But if you audit each aspect carefully, you will optimize your site well without resorting to anything related to Black Hat SEO. Don't wait for a perfect time; make a checklist and perform your On-Page Optimization now to ensure smooth and swift working.
Apart from On Page Optimization, the other most important thing is Off Page Optimization. You could say Off Page Optimization is the fuel that keeps your blog posts running effectively. From social sharing to link building, all these aspects matter for keeping traffic coming to your blog.