
Why You Can’t Find Your Website On Google Search

January 19, 2016 (updated March 3, 2016) · Inbound Marketing, Search Engine Optimization, Webmasters

It’s believed there are over 200 different factors Google takes into consideration when ranking websites. Some are weighted more heavily than others, but they all count to some degree. A capable webmaster should be able to get your website appearing in Google’s search results.

Below are five basic reasons why your website may not be appearing in Google’s search results.

Google Can’t Find It

You can’t show up in Google’s results if you have never been found by Google before.

There are two ways that Google finds your site: 1) submitting it directly, and 2) links to your site from other sites Google has already crawled. A capable webmaster will know how to submit your site to Google, and Google’s own documentation explains the process.

The second way is to build links to your site (backlinks) manually. This is a time-consuming and sometimes very difficult task that deserves careful planning and a strategy behind it. A capable webmaster should also know how to go about building quality backlinks.

[Image: a visual representation of a backlink profile, via Cognitiveseo]

Google Can’t Read It

Does your site use Flash, and does your robots.txt file allow Google to read the site?

Flash is frequently used to display videos and animations, but its use on websites has declined since the 2000s, in part because search engines have a hard time reading it. Essentially, Google only reads text: letters and numbers. An easy way to see whether Google can read your content is to try highlighting it with your mouse. If the letters and numbers cannot be highlighted, Google probably can’t crawl (or read) that content. Here is an example.

[Image: Google can’t read words written on images]

The robots.txt file implements a web standard (the Robots Exclusion Protocol) that regulates crawler behavior and search engine indexing. Google will not crawl the parts of your website that your robots.txt file disallows. Having a capable webmaster check and/or edit your robots.txt file is the best solution here.
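To see how a robots.txt rule blocks a crawler, here is a minimal sketch using Python’s standard-library robots.txt parser. The rules and the example.com URL are hypothetical; this simply checks whether a crawler named Googlebot would be allowed to fetch a page under a given rule set.

```python
from urllib import robotparser

# A robots.txt that blocks every crawler ("User-agent: *") from the whole site:
blocking_rules = [
    "User-agent: *",
    "Disallow: /",
]

rp = robotparser.RobotFileParser()
rp.parse(blocking_rules)
# Googlebot is refused access to every page under these rules:
print(rp.can_fetch("Googlebot", "https://example.com/"))  # False

# By contrast, an empty Disallow line permits crawling of everything:
permissive_rules = [
    "User-agent: *",
    "Disallow:",
]

rp2 = robotparser.RobotFileParser()
rp2.parse(permissive_rules)
print(rp2.can_fetch("Googlebot", "https://example.com/"))  # True
```

If your site shows `Disallow: /` (or disallows the pages you want ranked), that is one concrete reason Google never indexes them.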

Google Doesn’t Want To Touch It

Since Google’s job is to serve users the most reliable sites on the internet, Google is sensitive to sites that might have dangerous content, malware, and spam. If you have installed the Google Search Console verification code, then you can see if Google thinks your site contains malware and how Google sees your site.

Google Doesn’t Think It Is Relevant Enough

Google will always try to serve up the most trusted and authoritative sites it can. Highly competitive industries will obviously produce very competitive search engine results. General, broad keywords are also “competitive” because Google is tasked with serving up the most relevant, trusted, and applicable results for what could be an ambiguous search term. For example, say you are an audio-visual installation company specializing in automotive and boat installs, and you are trying to rank in Google for the search term “Car Audio”.

The search term “Car Audio” is so broad that Google has to figure out whether the user is looking for:

  • The definition of Car Audio
  • The history of Car Audio technology
  • Recent advancements in Car Audio technology
  • Legalities regarding Car Audio systems
  • Car Audio Manufacturing Companies
  • A local provider of Car Audio Installation services

Now just imagine how many websites across the entire internet could be competing for the same top positions on Google for searches using “Car Audio”. In this case, choosing a more specific, targeted search term like “Car Audio Installers” would give you a far better chance of ranking well in Google.

Google Has Penalized You

Google is constantly making updates and changes to the way its search engine functions. As such, there are two reasons why Google may be intentionally leaving your site out of the SERPs: 1) algorithmic penalties, and 2) manual penalties.

Having an algorithmic penalty means Google does not promote your site via the SERPs because its algorithm no longer values sites like yours.

Having a manual penalty means a person actually hit the dump button on your site, and you have been completely removed, or all but removed, from the search engine.

You can find out whether you have a manual penalty in Google Search Console’s Manual Actions report.

[Image: Google penalty graph, a visual timeline]