Website Technology Blockers That Can Affect Your Keyword Ranking

If your pages are not getting indexed by search engines, the technology blockers below may help you figure out what is causing the problem. Several advances in technology in recent years give a website great visual appeal but prevent search engines from listing its contents and prevent some browsers from displaying all of the content. Several of the websites I have redesigned were originally built with these methods, leaving their owners wondering why their sites could not be found in search engines even after years of being online.

WYSIWYG or CMS Programs

These are the worst technology blockers of all. Most programs that let you design a site with little or no knowledge of HTML, such as a host's free site-builder program or Dreamweaver (even the most up-to-date versions), produce an enormous amount of code bloat and deprecated code, e.g., <center> and <font> tags, which some browsers will eventually stop supporting, destroying your formatting. The code from these programs often will not validate, and depending on the errors, search engines may avoid your site. Run the code your program puts out through the W3C validator to see whether your WYSIWYG or CMS program produces up-to-date code free of bloat (I've never seen one that does).

Password-Protected Websites

Search engines can't crawl password-protected pages because they don't have the password.

Frames

Search engines can't read the contents of sites built with frames. If you use frames, you should also provide the same pages in non-frame versions (which increases the workload for the webmaster and the cost for the client).

Image Maps

Links provided only inside image maps can trap search engine crawlers, which may keep your pages from being listed. You need to provide the same links in standard HTML somewhere else on the page, for example in a site map or in the footer, though links that low on the page won't carry as much weight as links higher up.
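As a sketch of that workaround, the snippet below uses Python's standard-library HTML parser to pull the hrefs out of an image map so they can be repeated as plain links in the footer (the map and file names are hypothetical):

```python
# Sketch: extract the links hidden inside an image map so they can be
# duplicated as ordinary HTML links. Standard library only; the sample
# HTML, map name and file names are hypothetical.
from html.parser import HTMLParser

class AreaLinkExtractor(HTMLParser):
    """Collects href values from <area> tags inside image maps."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "area":
            href = dict(attrs).get("href")
            if href:
                self.links.append(href)

html = """
<img src="nav.png" usemap="#mainnav" alt="Site navigation">
<map name="mainnav">
  <area shape="rect" coords="0,0,100,40" href="/about.html">
  <area shape="rect" coords="100,0,200,40" href="/contact.html">
</map>
"""

parser = AreaLinkExtractor()
parser.feed(html)

# Re-emit the same destinations as crawlable plain-text footer links.
footer = " | ".join(f'<a href="{link}">{link}</a>' for link in parser.links)
print(footer)
```

Dropping the generated footer markup onto the page gives crawlers a standard-HTML path to every destination the image map covers.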

Flash

Flash has been discontinued: Adobe ended support for it at the end of 2020, and modern browsers no longer run it. Those who still have Flash videos on their websites may have noticed that the videos now show black screens; they need to be converted to another format, such as HTML5 video.

JavaScript Menus

Sites whose menu links are generated entirely by JavaScript may not be crawled reliably by search engines unless those links are also provided elsewhere on the page in standard hyperlink format (the menu above is built with JavaScript, CSS and plain HTML, so all search engines can read it).

Dynamic HTML

Search engines can't list sites built entirely with DHTML (classifieds, databases, etc.) because the pages don't exist as static URLs: they are assembled from database content only when someone searches for it. Other arrangements need to be made for search engines to list such content, such as having the program generate static HTML pages for part of the database.
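As a rough sketch of that arrangement, the snippet below turns a list of database records (a stand-in for a real query; the field names are hypothetical) into static HTML pages a crawler could index:

```python
# Sketch: generate static HTML pages from database records so search
# engine crawlers have real URLs to index. The record list stands in
# for a database query; slugs and field names are hypothetical.
def render_static_pages(records):
    """Return a {filename: html} mapping, one static page per record."""
    pages = {}
    for record in records:
        filename = f"{record['slug']}.html"
        pages[filename] = (
            f"<html><head><title>{record['title']}</title></head>"
            f"<body><h1>{record['title']}</h1>"
            f"<p>{record['body']}</p></body></html>"
        )
    return pages

records = [
    {"slug": "red-widget", "title": "Red Widget", "body": "A fine widget."},
    {"slug": "blue-widget", "title": "Blue Widget", "body": "Also fine."},
]

pages = render_static_pages(records)
print(sorted(pages))  # ['blue-widget.html', 'red-widget.html']
```

A scheduled job could write these files to disk whenever the database changes, giving crawlers stable pages while visitors still use the dynamic search.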

Server-Side Data

Search engines can't read information that lives only on the server; they can only read the web pages the server sends out (see Dynamic HTML above).

Multimedia Files

Media files are audio and video presentations. Any important text spoken or shown in these files cannot be indexed by most search engines, so provide a transcript or summary in the page text.

Forms

Content reachable only by submitting a form has historically been invisible to crawlers, although Google has since figured out how to crawl the content behind some simple forms.

Pages That Are Too Large

An acceptable page size used to be 40K. Today, neither visitors nor search engine bots will wait for a page that loads too slowly, and page load speed is now a ranking factor.
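A minimal sketch of a size check, assuming a hypothetical 100 KB budget (there is no official limit, and the sample pages are made up):

```python
# Sketch: flag pages whose HTML payload exceeds a size budget.
# The 100 KB threshold is an illustrative assumption, not a rule.
def oversize_pages(pages, budget_bytes=100 * 1024):
    """Return the URLs of pages whose HTML exceeds the budget."""
    return [url for url, html in pages.items()
            if len(html.encode("utf-8")) > budget_bytes]

pages = {
    "/index.html": "<html>" + "x" * 50_000 + "</html>",
    "/bloated.html": "<html>" + "x" * 200_000 + "</html>",
}

print(oversize_pages(pages))  # ['/bloated.html']
```

Real pages also pull in images, scripts and stylesheets, so a full audit would count those too; this only screens the raw HTML.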

Errors in Canonical Tags

If the URL in your canonical tag is wrong, i.e., it points to a different page, that page will not rank well, and the page the canonical tag points at may rank for the wrong keywords.
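As a sketch of such a check, the snippet below uses Python's standard-library HTML parser to compare a page's canonical tag against its own URL (the URLs and markup are hypothetical):

```python
# Sketch: detect a canonical tag that points at the wrong page.
# Standard library only; the page URL and HTML are hypothetical.
from html.parser import HTMLParser

class CanonicalFinder(HTMLParser):
    """Records the href of the page's <link rel="canonical"> tag."""
    def __init__(self):
        super().__init__()
        self.canonical = None

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "link" and a.get("rel") == "canonical":
            self.canonical = a.get("href")

page_url = "https://example.com/widgets.html"
finder = CanonicalFinder()
finder.feed('<link rel="canonical" href="https://example.com/gadgets.html">')

if finder.canonical and finder.canonical != page_url:
    print(f"Mismatch: {page_url} declares canonical {finder.canonical}")
```

Run over every page of a site, this catches the common copy-paste mistake where one canonical URL gets pasted into every template.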

Capitals in File Names

Capitals in file names can also cause a drop in keyword ranking: most web servers treat URLs as case-sensitive, so /Page.html and /page.html can be indexed as two separate pages, creating a duplicate content issue in search engines.
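A small sketch of how such duplicates can be detected, using a hypothetical URL list:

```python
# Sketch: find URLs that differ only by letter case, which most web
# servers treat as distinct pages and search engines can therefore
# index as duplicate content. The URL list is hypothetical.
from collections import defaultdict

def case_duplicates(urls):
    """Group URLs whose lowercased forms collide."""
    groups = defaultdict(list)
    for url in urls:
        groups[url.lower()].append(url)
    return [group for group in groups.values() if len(group) > 1]

urls = ["/Contact.html", "/contact.html", "/about.html"]
print(case_duplicates(urls))  # [['/Contact.html', '/contact.html']]
```

Feeding in the URL list from your server logs or sitemap shows at a glance which file names need renaming (plus redirects from the old casing).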

A Hierarchy of Directories

Search engines and crawlers will usually not dig deeper than three or four directories or folders, so keep your most important information in the upper levels or at the root level of your website.
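A quick sketch for flagging pages that sit too deep, using only the Python standard library (the URLs are hypothetical):

```python
# Sketch: measure how many directory levels deep each URL sits, so
# pages buried past three or four levels can be flagged and moved up.
from urllib.parse import urlparse

def directory_depth(url):
    """Count the directories between the site root and the file."""
    path = urlparse(url).path
    # '/a/b/page.html' -> ['a', 'b', 'page.html'] -> depth 2
    parts = [p for p in path.split("/") if p]
    return max(len(parts) - 1, 0)

print(directory_depth("https://example.com/page.html"))          # 0
print(directory_depth("https://example.com/a/b/c/d/page.html"))  # 4
```

Anything reporting a depth above three or four is a candidate for being moved closer to the root.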

Affiliate Links

Some search engines are beginning to demote sites full of affiliate links with no original content, because such sites produce too much duplicate content and the engines are trying to reduce spam.

Java Applets

Important information in Java applets should also be provided in the text of the page, because search engines can't read applet content and modern browsers no longer support applets at all.

Text in Images

Important information, such as article titles or your major keywords, should not be put into images alone; include it in the page text as well, because search engines can't read the text inside images. This may seem redundant to a human reader, but we're dealing with search engines here, not people. Large images that haven't been optimized properly can also slow a page down, and with page speed now a ranking factor, this is more important than ever.

Text-only Browsers or Text-to-voice Converters

These are used by visually impaired visitors. The same things that stop these browsers will stop most search engine crawlers, so it's a good idea to have your websites tested for accessibility, provide an alt attribute on all images, and view your site in a text-only browser (you can do this by finding your site in Google, clicking on the cached copy and then on the text-only option).
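As a sketch of one such accessibility check, the snippet below lists images that are missing alt text (the sample HTML and file names are hypothetical):

```python
# Sketch: list <img> tags with no (or empty) alt attribute, which both
# screen readers and search engine crawlers depend on. Standard library
# only; the sample HTML is hypothetical.
from html.parser import HTMLParser

class MissingAltChecker(HTMLParser):
    """Records the src of every image lacking usable alt text."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            a = dict(attrs)
            if not a.get("alt"):
                self.missing.append(a.get("src", "(no src)"))

checker = MissingAltChecker()
checker.feed('<img src="logo.png" alt="Acme logo"><img src="banner.jpg">')
print(checker.missing)  # ['banner.jpg']
```

Running this across every page gives a fix-list of images that a text-only browser (and a crawler) would render as nothing at all.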

Template-Driven Hosting Companies

Most code produced by template-driven hosting companies comes from software so old that it relies on deprecated markup: the pages will not validate, may one day stop displaying properly in newer browsers, and search engines may avoid the site because of the errors.

Your Site Is Not Set Up for Mobile

About half of all websites on the Internet are now designed for mobile, and search engines favor mobile-friendly sites, so if your site isn't one of them it will start to rank behind sites that are.

Not All Pages Set up for SSL/HTTPS

If you purchase an SSL certificate and miss a few pages while changing everything over to HTTPS, it can cause indexing problems in search engines and broken links on your website, so be sure to run your site through a broken-link checker once you're done.
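A minimal sketch of such a check, scanning a page for internal links that still use http:// (the domain and sample HTML are hypothetical):

```python
# Sketch: after moving a site to HTTPS, scan each page for internal
# links still using http://, which cause redirects, mixed-content
# warnings, or broken links. Domain and HTML are hypothetical.
import re

def insecure_internal_links(html, domain="example.com"):
    """Return internal links that still use the http:// scheme."""
    return re.findall(rf'href="(http://{re.escape(domain)}[^"]*)"', html)

html = ('<a href="http://example.com/old.html">old</a>'
        '<a href="https://example.com/new.html">new</a>')

print(insecure_internal_links(html))  # ['http://example.com/old.html']
```

A full broken-link checker does more (it follows each link and checks the response), but this catches the leftover http:// references that an HTTPS migration most often misses.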

XML Sitemaps

Every time you add or remove a page from your website, update your XML sitemap to match; otherwise Google will report errors, and too many errors can affect your ranking.
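As a sketch, the snippet below regenerates a minimal sitemap from the current page list using the Python standard library, so the sitemap never drifts out of sync with the site (the URLs are hypothetical):

```python
# Sketch: rebuild a minimal XML sitemap from the current list of page
# URLs. Standard library only; the URL list is hypothetical.
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Return a sitemap XML string for the given page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for url in urls:
        entry = ET.SubElement(urlset, "url")
        ET.SubElement(entry, "loc").text = url
    return ET.tostring(urlset, encoding="unicode")

sitemap = build_sitemap([
    "https://example.com/",
    "https://example.com/about.html",
])
print(sitemap)
```

Hooking this into whatever publishes your pages means adding or removing a page automatically updates the sitemap, so Google never sees stale entries.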

If you would like your website analyzed for keyword ranking, please check out the In-Depth SEO Analysis Reports page. If you are looking for a web designer to set up a new website, check out my home page.

order an In-Depth SEO Analysis Report

Other Articles on What Can Prevent Indexing by Search Engines

10 Reasons why Google may not index Pages on your Site

Lori Eldridge
Copyright © July 5, 2001, updated 11-04-20
All rights reserved.

Technology that Blocks Search Engines
Dynamic vs Static Pages
Keyword Strategy
Optimize the Title Tag
How to Get Good Backlinks
Submitting websites to Search Engines and Directories
How Long Does It take to Rank in Google

Return to Search Engine Optimization Tips