The robot exclusion standard, also known as the Robots Exclusion Protocol or robots.txt protocol, is a convention that prevents cooperating web spiders and other web robots from accessing all or part of a website that is otherwise publicly viewable. Robots are often used by search engines to categorize and archive websites, or by …
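Python's standard library includes a parser for this protocol, which makes it easy to see the convention in action. A minimal sketch, using a hypothetical robots.txt that blocks one directory:

```python
from urllib.robotparser import RobotFileParser

# A hypothetical robots.txt: all robots are asked to stay out
# of /private/ but may fetch everything else.
robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# A cooperating spider checks each URL before fetching it.
print(parser.can_fetch("*", "https://example.com/index.html"))  # True
print(parser.can_fetch("*", "https://example.com/private/x"))   # False
```

Note that this is purely a convention: nothing technically stops a badly behaved robot from ignoring the file, which is why robots.txt is not a security mechanism.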
META tags are HTML tags inserted into the “head” area of your web pages that describe the content of a webpage and (sometimes) provide instructions to visiting search engine spiders and browsers. Essentially, META tags communicate information that a human visitor is typically not concerned with. Below I will discuss the most common and recommended META tags.
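To make the placement concrete, here is a sketch of a page head carrying a few of the most common META tags (the values are illustrative, not recommendations for any particular site):

```html
<head>
  <!-- Declares the character encoding used by the page -->
  <meta charset="utf-8">
  <!-- Describes the page; often shown as the snippet in search results -->
  <meta name="description" content="A short summary of this page's content.">
  <!-- Instructs visiting spiders not to index this page or follow its links -->
  <meta name="robots" content="noindex, nofollow">
</head>
```

Because these tags live in the head, they are read by spiders and browsers but never rendered as visible page content.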
Talking about redirects could quickly become very technical and quite boring, so I will limit myself to the basics – why you would want or need to use a redirect, and what you need to consider once you do use one.
There are many instances where you might want to use a redirect – refreshing a page, transferring from one domain to another (e.g. from mydomain.com to …
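For the domain-transfer case, a permanent (301) redirect is the usual choice, since it tells search engines to pass the old URL's standing to the new one. A minimal sketch using an Apache .htaccess file – the target domain here is a hypothetical stand-in, and mod_rewrite is assumed to be enabled:

```apache
RewriteEngine On
# Match requests arriving on the old host (with or without www)
RewriteCond %{HTTP_HOST} ^(www\.)?mydomain\.com$ [NC]
# Issue a permanent (301) redirect to the same path on the new domain
RewriteRule ^(.*)$ https://www.example.com/$1 [R=301,L]
```

A temporary move would use `R=302` instead, which tells search engines to keep the old URL indexed.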