Check the meta tags within your page. Determine whether meta title tags, meta description tags, meta robots tags and meta keywords tags are in place and offer the important information about your page to the major search engines. Meta tags are not like ordinary HTML tags: they do not affect how your page is displayed, but they provide valuable information about your page to the search engines.
Check how your page might look in the Google search results page. A Google search result uses your webpage title, URL and meta description to display relevant summarized information about your site. If these elements are too long, Google will truncate their content, so you are advised to keep your webpage title under 70 characters and your webpage description under 160 characters in order to optimize readability.
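For example, a title and description within those limits might look like this (the domain and wording are placeholders):

    <head>
      <title>Blue Widgets | Affordable Widgets for Every Home</title>
      <meta name="description" content="Shop durable, affordable blue widgets with free shipping and a two-year warranty on every order.">
    </head>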
Check the most common keywords and their usage (number of times used) on your web page.
The Keyword Cloud is a visual representation of keywords used on your website. This will show you which words are frequently used in the content of your webpage. Keywords having higher density are presented in larger fonts and displayed in alphabetic order.
Check the contents of any h1 & h2 tags within your web page. Header tags are an important On Page SEO factor because they’re used to communicate to the search engines what your website is about. Search engines recognize the copy in your header tags as more important than the rest. This starts with your h1 and works its way down in importance to the h2, h3 and so on. These tags will help support the overall theme or purpose of your page.
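For example, a page about widgets might structure its headers like this (the wording is illustrative):

    <h1>Blue Widgets</h1>                      <!-- the main topic of the page -->
    <h2>Choosing the Right Widget Size</h2>    <!-- a supporting subtopic -->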
Check if your website is using a robots.txt file. Search engines send out tiny programs called spiders or robots to search your site and bring information back so that your pages can be indexed in the search results and found by web users. If there are files and directories you do not want indexed by search engines, you can use the “robots.txt” file to define where the robots should not go.
These files are very simple text files placed in the root folder of your website: www.yourwebsite.com/robots.txt.
There are two important considerations when using “robots.txt”:
– the “robots.txt” file is a publicly available file, so anyone can see what sections of your server you don’t want robots to use;
– robots can ignore your “robots.txt”, especially malware robots that scan the web for security vulnerabilities.
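For example, a minimal robots.txt that keeps all robots out of two private directories (the paths are placeholders) looks like:

    User-agent: *
    Disallow: /admin/
    Disallow: /tmp/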
This test will check if your website is using a “sitemap” file: sitemap.xml, sitemap.xml.gz or sitemap_index.xml.
Sitemaps are an easy way for webmasters to inform search engines about pages on their sites that are available for crawling. In its simplest form, a sitemap is an XML file that lists URLs for a site along with additional metadata about each URL (when it was last updated, how often it usually changes, and how important it is, relative to other URLs in the site) so that search engines can more intelligently crawl the site.
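A minimal one-URL sitemap (with placeholder values) looks like:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/</loc>
        <lastmod>2024-01-01</lastmod>
        <changefreq>monthly</changefreq>
        <priority>0.8</priority>
      </url>
    </urlset>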
Check your website for broken or dead links. This tool scans your website to locate internal and external broken links, which are not only frustrating to your visitors but damaging to your website’s overall ranking with the major search engines.
Check your URL internal links for underscore characters. Google’s suggestions for URL structure specify using hyphens or dashes (-) rather than underscores (_). Unlike underscores, Google treats hyphens as separators between words in a URL. For example, Google reads example.com/blue-widgets as the two words “blue widgets”, but treats example.com/blue_widgets as a single token.
Check images on your webpage for required alt attributes. If an image cannot be displayed (wrong source, slow connection, etc), the alt attribute provides alternative information. Using keywords and human-readable captions in the alt attributes is a good SEO practice because search engines cannot really see the images. For images with a decorative role (bullets, round corners, etc) you are advised to use an empty alt or a CSS background image.
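For example (the file names and caption are placeholders):

    <img src="blue-widget.jpg" alt="Blue widget with chrome handle">   <!-- descriptive alt for a content image -->
    <img src="bullet.gif" alt="">                                      <!-- empty alt for a decorative image -->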
Check your webpage HTML tags for inline CSS properties. An inline CSS property is added using the style attribute on a specific tag. By mixing content with presentation you may lose some of the advantages of style sheets. It is good practice to move all inline CSS rules into an external file in order to make your page “lighter” in weight and improve the code-to-text ratio.
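For example, an inline rule can be replaced by a class defined in an external style sheet (the class name is a placeholder):

    <!-- before: inline CSS -->
    <p style="color: red; font-weight: bold;">Sale ends Friday</p>

    <!-- after: the same styling moved to an external file -->
    <p class="sale-notice">Sale ends Friday</p>

    /* in styles.css */
    .sale-notice { color: red; font-weight: bold; }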
Check if your webpage is using old, deprecated HTML tags such as <center>, <font> or <strike>. These tags will eventually lose browser support, and your web pages may then render differently.
Check if your page is connected with Google Analytics. Google Analytics is the most popular analytics package for websites; it provides great insights about your site’s visitors and demographics, along with very comprehensive metrics that help you analyze every aspect of your site. It is good practice to use analytics in order to learn how your visitors behave and to continuously improve your website.
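Connecting a page usually means adding Google’s tracking snippet to the <head> of every page. The current gtag.js snippet looks like the following, where G-XXXXXXXXXX is a placeholder for your own measurement ID:

    <script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXXXXX"></script>
    <script>
      window.dataLayer = window.dataLayer || [];
      function gtag(){dataLayer.push(arguments);}
      gtag('js', new Date());
      gtag('config', 'G-XXXXXXXXXX');
    </script>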
Check if your site is using and correctly implementing a favicon. Favicons are small icons that appear in your browser’s URL navigation bar. They are also saved next to your URL’s title when bookmarking that page and they can help brand your site and make it instantly recognisable for users to navigate to your site among a list of bookmarks.
An important part of SEO is to check whether your website URL and internal URLs are SEO friendly. For links to be SEO friendly, they should be clearly named for what they are and contain no spaces, underscores or other special characters. You should also avoid the use of parameters when possible, as they make URLs less inviting for users to click or share.
Check to see if your website has any backlinks. Backlinks are any links to your website, webpages or posts from an outside source. It is important to check backlink quality to avoid penalties from the search engines.
Check your source code for JavaScript errors. These errors may prevent users from properly viewing your pages and impact their user experience. Sites with poor user experience tend to rank poorly in search engine algorithms.
Check if your page is connected to at least one of the most important social networks. Social signals are becoming increasingly important ranking factors for search engines because they leverage social intelligence (our interactions) to determine more accurate relevancy for searches. That’s why connecting your website to a social network is a must nowadays to make sure your site is social enabled.
Check the activity on social media networks of your website or URL. This activity is measured as the total number of shares, likes, comments, tweets, +1s and pins. It covers only your URL, not the social media accounts linked with your webpage.
Check your page’s HTML size. HTML size is the size of all the HTML code on your web page – this size does not include images, external JavaScript files or external CSS files.
Check if your page is correctly using HTML compression. Compression works by finding similar strings within a text file, and replacing those strings temporarily to make the overall file size smaller. This form of compression is particularly well-suited for the web because HTML and CSS files usually contain plenty of repeated strings, such as white spaces, tags, and style definitions.
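On an Apache server, for example, HTML compression is commonly enabled with mod_deflate; a minimal sketch for .htaccess (assuming mod_deflate is available) is:

    <IfModule mod_deflate.c>
      AddOutputFilterByType DEFLATE text/html text/css application/javascript
    </IfModule>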
Test your website using real browsers to determine its load time. How fast your page loads is one of the most important factors in search engine rankings. Pages that take longer than 5 seconds to load can lose up to 50% of their users. Faster-loading webpages bring higher traffic, better conversions and increased sales compared with slower-loading pages.
Check if the full list of objects requested by your page can be retrieved. If your page contains objects that cannot be retrieved, it will not be displayed correctly; this hurts the user experience, and search engines will penalize you accordingly.
Check if your page is serving cached pages. A page cache is a mechanism for the temporary storage (caching) of web documents, such as HTML pages and images, to reduce bandwidth usage, server load, and perceived lag. A web cache stores copies of documents passing through it; subsequent requests may be satisfied from the cache if certain conditions are met. Common caching methods are Quickcache and jpcache.
Check if your page uses Flash. Flash is an outdated technology that was widely used in the past to deliver rich multimedia content. It has since given way to newer, more mature technologies and standards based on HTML5, so using it is no longer considered good practice. Flash content does not work well on mobile devices, and it is not search engine friendly.
Checks if your page is using an image expires tag, which specifies a future expiration date for your images. Browsers will see this tag and cache the image in the user’s browser until the specified date (so that it does not keep re-fetching the unchanged image from your server). This speeds up your site the next time that user visits your site and requires the same image.
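On Apache, for example, image expiry headers can be set with mod_expires; a minimal sketch (assuming the module is enabled, with an illustrative one-month lifetime) is:

    <IfModule mod_expires.c>
      ExpiresActive On
      ExpiresByType image/png "access plus 1 month"
      ExpiresByType image/jpeg "access plus 1 month"
    </IfModule>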
Check if your external JS and CSS files are minified.
Minification is the process of removing all unnecessary characters from source code without changing its functionality. These unnecessary characters usually include white space characters, new line characters, comments, and sometimes block delimiters, which are used to add readability to the code but are not required for it to execute. Removing those characters and compacting files can save many bytes of data and speed up downloading, parsing, and execution time.
The compressed code may be harder to debug because it is usually bunched together on one line. This is why we always recommend keeping a backup copy of your JS or CSS files to use when debugging is required.
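As a simple illustration, these two CSS rules are functionally identical; the second is the minified form:

    /* original */
    .menu {
        color: #000000;   /* whitespace and comments aid readability */
        margin: 0px;
    }

    /* minified */
    .menu{color:#000;margin:0}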
Check if your site is using nested tables, which can slow down page rendering in the user’s browser.
Check to see if your website is using frames. The “frameset” tag is used to display multiple HTML documents in one page. When search engines use robots or spiders to get information from your page, they have to sort through a bunch of unrelated pages, making it difficult to index a single page. This can decrease your search engine page rankings.
Check for doctype declaration. A document type declaration, or DOCTYPE, defines which version of (X)HTML your webpage is actually using and this is essential to a proper rendering and functioning of web documents in compliant browsers.
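For example, the HTML5 doctype, which must be the very first line of the document, is simply:

    <!DOCTYPE html>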
Test your site for potential URL canonicalization issues. Canonicalization describes how a site can use slightly different URLs for the same page (for example, if http://www.example.com and http://example.com display the same page but do not resolve to the same URL). If this happens, search engines may be unsure as to which URL is the correct one to index.
Test your site for potential IP canonicalization issues. Canonicalization describes how a site can use slightly different URLs for the same page (for example, if your site’s IP address and domain name display the same page but do not resolve to the same URL). If this happens, search engines may be unsure as to which URL is the correct one to index.
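On Apache, one common fix for URL canonicalization is a 301 redirect in .htaccess that sends every variant to a single preferred hostname; a sketch assuming mod_rewrite and the placeholder domain example.com:

    RewriteEngine On
    RewriteCond %{HTTP_HOST} ^example\.com$ [NC]
    RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]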
Check if your website is using a secure communication protocol over the Internet. Using an HTTPS URL indicates that an additional encryption/authentication layer was added between client and server. The data transferred is encrypted so that it cannot be read by anyone except the recipient. HTTPS must be used by any website that collects sensitive customer data such as banking or purchasing information, so if you are making a transaction online, you should make sure it is done over HTTPS to keep the data secure. Even for sites that do not collect sensitive customer information, search engines suggest that switching to HTTPS is an increasingly good idea and may help improve rankings.
Check if your website is listed for malware or phishing activity. Any site containing malware or suspected of phishing activity is seen as a threat and a risk to the online community, and will therefore receive a lower ranking. This test checks whether the most relevant online databases that track malware and phishing list your website.
Check if your server’s signature is ON. A server signature is the public identity of your web server and contains sensitive information that could be used to exploit any known vulnerability, so it’s considered a good practice to turn it OFF as you don’t want to disclose what software versions you are running.
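On Apache, for example, the signature is turned off in the main server configuration with:

    ServerSignature Off
    ServerTokens Prod    # also strips version details from the Server header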
Check if your server allows directory browsing. If directory browsing is disabled, visitors will not be able to browse your directory by accessing the directory directly (if there is no index.html file). This will protect your files from being exposed to the public. Apache web server allows directory browsing by default. Disabling directory browsing is generally a good idea from a security standpoint.
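On Apache, for example, directory browsing can be disabled with a single line in .htaccess:

    Options -Indexes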
Check if your server allows access from User-agent Libwww-perl. Botnet scripts that automatically look for vulnerabilities in your software are sometimes identified as User-Agent libwww-perl. By blocking access from libwww-perl you can eliminate many simpler attacks.
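A common Apache sketch for this (assuming mod_rewrite) returns a 403 Forbidden response to that user agent:

    RewriteEngine On
    RewriteCond %{HTTP_USER_AGENT} libwww-perl [NC]
    RewriteRule .* - [F,L]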
Check your webpage for plaintext email addresses. Any e-mail address posted in public is likely to be automatically collected by computer software used by bulk emailers (a process known as e-mail address harvesting). A spam harvester can read through the pages in your site and extract email addresses which are then added to bulk marketing databases and the result is more spam in your inbox.
Check if your page implements responsive design functionalities using media query techniques.
The ‘@media’ rule allows different style rules for different media types in the same style sheet. Media query techniques allow content presentation to be optimized for the output device, and this is a must nowadays to make sure your website looks good on ALL devices and platforms.
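For example, the following rule applies a narrower layout only on small screens (the breakpoint and class name are illustrative):

    .sidebar { width: 300px; }

    @media (max-width: 600px) {
      .sidebar { width: 100%; }   /* stack the sidebar on small screens */
    }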
Check how your page renders on a mobile device by providing a snapshot for you to quickly check if it looks good.
This test will check if your web page takes advantage of the HTML Microdata specification in order to mark up structured data. By using microdata in your web pages, you can help search engines better understand your content and create rich snippets in search results.
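For example, marking up a person with the schema.org Person vocabulary looks like this (the name and job title are placeholders):

    <div itemscope itemtype="https://schema.org/Person">
      <span itemprop="name">Jane Doe</span> works as a
      <span itemprop="jobTitle">web developer</span>.
    </div>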
Check if your webpage is using the noindex meta tag. The usage of this tag instructs search engines not to show your page in search results.
Check if your webpage is using the canonical link tag. This tag is used to nominate a primary page when you have several pages with duplicate content.
Check if your webpage is using the nofollow meta tag. This tag will tell search engines not to crawl any outgoing links from your webpage.
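All three tags are placed in the <head> of the page; for example (the canonical URL is a placeholder):

    <meta name="robots" content="noindex">                             <!-- keep this page out of search results -->
    <link rel="canonical" href="http://www.example.com/main-page/">    <!-- nominate the primary version of a page -->
    <meta name="robots" content="nofollow">                            <!-- do not follow outgoing links -->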
Check if the robots.txt file is excluding some parts of your website. Search engines will look for a robots.txt file in the root of your domain whenever they crawl your website. The Disallow directive is used when you want to advise a search engine not to crawl and index a file, page, or directory.