Recently, a client came to me perplexed as to why his company’s website was not showing up in Google when searching for the brand name. That would concern any company, and some causes are simple to diagnose—for instance, a site map that was never submitted to the major search engines. But many factors affect a site’s placement on a SERP, or Search Engine Results Page, and understanding each one will help improve a company’s overall search engine optimization (SEO).
The first thing to check when improving SEO is whether the company has submitted its site map to the search engines. There are plenty of site-map solutions out there that will scan your site and produce a file that tells search engines what to look for; we use Yoast Premium for many of our WordPress clients. Regardless of how your website is built, Google’s Search Console is an excellent solution for anyone looking to improve their online visibility. The platform lets you see how Google is crawling your site (remember last week’s post about crawl bots known as “spiders”) and provides insight into how your website is being rendered and displayed to the masses. It also lets you control which pages are visible in a SERP and which pages you’d rather not let people find, at least not easily.
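For reference, a site map is just an XML file following the sitemaps.org protocol. Here is a minimal sketch (the URLs are made up for illustration); tools like Yoast generate this for you, and you submit the file’s address through Search Console:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Hypothetical site map with two made-up pages -->
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2019-01-15</lastmod>
  </url>
  <url>
    <loc>https://www.example.com/about-us/</loc>
  </url>
</urlset>
```

Each `<url>` entry points the crawler to one page; the optional `<lastmod>` date hints at when it last changed.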
Once your site map is effectively searchable, the next step is to look at the meta data on each page and edit it to best describe the content found on that particular page. Search engines have become increasingly intelligent in understanding a user’s intent and purpose for a search, and will display information accordingly. They understand what you are looking for—sometimes before even you do! Because of this, you don’t need to include your company name in every page title, nor do you need to repeat the same keyword 30 times on the page. Use synonyms and write for the user’s experience, not for Google or any other search engine. This is a big shift from what was previously considered good practice: filling your copy with the same word again, and again, and again.
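In practice, the meta data that matters most is the page title and meta description in each page’s `<head>`. A hedged sketch for a hypothetical plumbing company’s services page might look like this—note that it describes this specific page, once, without stuffing:

```html
<!-- Hypothetical example: describes the page's actual content,
     uses the keyword naturally, no repetition -->
<title>Drain Cleaning &amp; Pipe Repair Services | Smith Plumbing</title>
<meta name="description"
      content="Licensed plumbers offering drain cleaning, pipe repair,
               and emergency service in the greater Springfield area.">
```

The title commonly appears as the blue link on a SERP and the description as the snippet beneath it, so both should read naturally to a person, not a crawler.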
After these two steps have been accomplished, your site can accurately tell the search engines where to look for web pages and what content is on each of those pages. During this process you may have identified certain folders or pages that you do not want search engines to display as a result. These pages could range from staff-only pages and PDFs to admin and backend pages that can modify existing content. The best proactive step is to create a robots.txt file and place it on the root level of your website.
The robots.txt file is something that all well-behaved crawlers, or spiders, will scan in order to learn the rules to follow while crawling through your site. If you don’t need to hide specific assets, then there’s no need to add a robots.txt file to your site. If you do need to keep certain files or folders out of search results, then it would be a good step to include a file disallowing crawling of those pages or folders (pages that must never appear in results may also warrant a noindex meta tag, since a page blocked from crawling can still be listed). Be very careful when using a robots.txt file: a slight misconfiguration can make an entire site unfriendly to spiders looking to index your site, leading to the exact opposite result that you wished to accomplish at the beginning of this process.
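As a sketch, here is a hypothetical robots.txt (the paths are invented for illustration) together with a quick sanity check using Python’s built-in `urllib.robotparser`—a cheap way to confirm your rules block what you intend, and nothing more, before deploying:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt: hide a staff-only folder and the admin area,
# leave the rest of the site open to all crawlers ("User-agent: *").
robots_txt = """\
User-agent: *
Disallow: /staff-only/
Disallow: /admin/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A crawler honoring these rules may fetch public pages...
print(parser.can_fetch("*", "/about-us/"))    # True
# ...but not anything under the disallowed folders.
print(parser.can_fetch("*", "/admin/login"))  # False
```

Testing the rules locally like this helps catch the misconfiguration mentioned above—e.g. an accidental `Disallow: /` that would shut out every spider.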
Here’s a quick snapshot of a few other issues to investigate to improve SEO.
- Page speed (run your site through Google’s PageSpeed Insights)
- Secure HTTPS connections
- Mobile-friendly web pages
The last point to make is that crawling and indexing can take a long time—sometimes months. It can be disheartening to put in all this effort editing meta data and creating a site map only to see no immediate results. Google crawls at its own pace, and that is out of your control. But, just like a New Year’s resolution, if you stay strong and keep at it, you’ll reap the benefits for the foreseeable future.