33 tips to improve your website's ranking
Site architecture has long been a focus for SEO specialists, and in recent years the issue has become even more pressing. A site's architecture is the foundation that determines how visitors interact with your resource and how effectively search engines can crawl and analyze its content. But architecture should concern not only SEO specialists but also the developers who build the site. Whenever the design of a new website begins, we recommend building a strong, search-optimized resource from the start, even if you do not plan to hire SEO specialists right away. Failing to do so can create problems that undermine every later optimization effort and ultimately raise the cost of developing the site.
Web managers: if you are responsible for developing and/or marketing your company's website and do not want to pay for hired SEO professionals, this guide will help you. Hand this information to your developer, and if they follow it, your site will gain a solid search-optimized foundation that improves the performance and revenue of your business.
Encryption with HTTPS.
In the past, HTTPS/SSL protection was reserved exclusively for the e-commerce sections of a site, where these protocols protected confidential personal information such as credit card numbers. Now, however, Google is pushing for HTTPS everywhere and has made it part of its ranking algorithm. At present this signal does not play a key role, but the situation may change as more and more sites adopt the protocols. The key is to secure the site completely. You should also make sure the switch does not hurt the resource's speed, which is a second and perhaps more important issue.
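As a sketch of what "HTTPS everywhere" can look like in practice, here is one common way to force HTTPS on an Apache server with mod_rewrite enabled (your hosting setup may differ, and some hosts provide a one-click option instead):

```apache
# .htaccess: permanently redirect every HTTP request to its HTTPS equivalent
RewriteEngine On
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://%{HTTP_HOST}/$1 [L,R=301]
```

The 301 status tells search engines the move is permanent, so ranking signals follow the secure address.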
Keep track of your certificate's expiration date.
An expired certificate causes browsers to display intrusive warnings, which is likely to scare off customers. Renew your security certificate before it expires.
Allow indexing using the robots.txt file.
When a new site is launched, developers often forget to update the robots.txt file that allows search engines to crawl its pages. If the web marketer does not think to check this file, you could spend months without ever getting the traffic you expected. Double-check the robots.txt file to make sure you are not "disallowing" search engines from crawling your site.
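A minimal robots.txt that allows full crawling looks like the sketch below; the classic launch mistake is a leftover blanket block copied from the staging server:

```text
# robots.txt at the site root: allow all crawlers to access the whole site
User-agent: *
Disallow:

# A single stray character turns this into a total block -- the mistake to check for:
# Disallow: /
```

Note that an empty Disallow line means "nothing is disallowed", while "Disallow: /" blocks everything.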
Declare the document type.
The page's "DOCTYPE" declaration tells the browser how to interpret each web page. Without a correctly declared DOCTYPE, the browser has to guess. Search engines also use the DOCTYPE to make sure they analyze every part of your site correctly.
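For modern pages the HTML5 DOCTYPE is the usual choice; it must be the very first line of the document, before any markup:

```html
<!DOCTYPE html>
<html lang="en">
  <head>
    <title>Example page</title>
  </head>
  <body>
    <!-- page content -->
  </body>
</html>
```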
Use Valid HTML.
Invalid HTML may not affect ranking directly, but it can cause your page to render incorrectly in browsers or be misread by search engines. Only a correctly rendered page guarantees a true interpretation of your information.
Use Valid CSS.
The same logic applies to your stylesheets: invalid CSS can break the layout and hide or distort content for both visitors and search engines.
Make your CSS and JavaScript files accessible.
Do not hide your CSS and JavaScript files from search engines. These files help the engines render your pages properly, that is, analyze each part the way a visitor would see it. When search engines cannot process this content, key components may well be interpreted incorrectly.
Avoid HTML frames.
Admittedly, HTML frames are used only by old-school web developers, who are hard to find these days. Treat this as a precaution in case you run into one of them. But frankly, if you hired a developer who uses frames, you hired the wrong person.
Add descriptive ALT attributes to images.
Every image placed in the page's code (rather than applied through CSS) should carry a corresponding, descriptive alt attribute. It is a small detail, but the more you practice it, the easier it is to remember every time an image is added.
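For illustration, with made-up file names, the two usual cases look like this:

```html
<!-- Content image: the alt text describes what the image shows -->
<img src="/images/red-running-shoes.jpg" alt="Red running shoes, side view">

<!-- Purely decorative image: an empty alt tells screen readers and crawlers to skip it -->
<img src="/images/divider.png" alt="">
```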
Redirect old URLs.
With any site redesign there will inevitably be URL changes, however minor. Before taking down the old site, capture all of its current addresses and set up 301 redirects for every URL that changes or becomes invalid. With 301 redirects in place, you preserve the ranking those pages earned in the past and transfer it to the corresponding new pages.
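On an Apache server this can be sketched in the .htaccess file; the paths below are placeholders for your own old and new addresses:

```apache
# .htaccess: map old addresses to their new equivalents with permanent redirects
Redirect 301 /old-page.html /new-page/
Redirect 301 /products/old-category/ /products/new-category/
```

Other servers (nginx, IIS) have equivalent directives; the important part is the 301 status, which signals a permanent move.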
Return a 404 error for bad URLs.
If you missed any old addresses when setting up your 301 redirects, every invalid address should return a 404 status code along with a well-designed error page.
Forget about separate print pages.
Developers used to create dedicated printer-friendly versions of pages. This is no longer necessary, and in fact the technique is bad practice. Use CSS to make sure any page of your site looks fine in its "print" version: remove elements that are not needed on the printed page and use formatting that gives it a presentable, readable appearance.
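A minimal sketch of a print stylesheet; the class names here are assumptions and should match your own markup:

```css
/* Rules applied only when the page is printed */
@media print {
  /* Hide elements that serve no purpose on paper */
  nav, footer, .sidebar, .ad-banner {
    display: none;
  }
  /* Readable defaults for paper */
  body {
    font-size: 12pt;
    color: #000;
    background: #fff;
  }
}
```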
Underlined links.
Underlined text still signals to visitors that it is a hyperlink. Breaking with this convention, whether by underlining plain text or by stripping the underline from links, is generally unwelcome, since it violates the established protocol (and the visitor's expectation) in this area.
Differentiate the text of the link.
Beyond underlining, link text should differ from the surrounding body text in at least some way. Visitors should not have to hover the mouse over every word to discover which ones are links.
Use canonical URLs in your breadcrumb navigation.
Your pages should consistently point to a single canonical URL. Quite often the same content can be reached from several URLs depending on how the visitor arrived at the page. Do not let the breadcrumb trail echo the visitor's particular path; instead, always reflect the canonical address.
Set the correct page hierarchy.
Page URLs should follow an established hierarchical format that mirrors the site's navigation. Navigation categories and subcategories should be present in all URLs.
Build a balanced directory structure.
When laying out the page navigation hierarchy, strike a good balance between shallow and deep. You do not want visitors to make too many clicks before they find the content they are looking for. But offering too many choices from the main page usually leads to loss of interest and to random content choices: visitors stop looking for what they want and simply click an arbitrary link.
Write unique title tags.
Each page of the site should have its own unique title tag. You do not need to stuff it with every SEO keyword; a title that accurately reflects the page's content is what the site's promotion needs.
Write a unique meta description.
The same applies here. A good description runs 100-155 characters.
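Together, the two previous tips come down to a head section like this sketch; the store name and wording are invented for illustration:

```html
<head>
  <!-- Unique per page, accurately reflecting the page's content -->
  <title>Handmade Leather Wallets | Example Store</title>
  <!-- Unique per page, roughly 100-155 characters -->
  <meta name="description" content="Browse our handmade leather wallets, crafted from full-grain leather. Free shipping on orders over $50.">
</head>
```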
Use correctly coded lists.
Use the correct HTML tags (ul, ol, li) for bulleted and numbered lists. This shows browsers and search engines that the content is an actual list, which can affect the text displayed in search result snippets.
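The two list types look like this (item text is illustrative):

```html
<!-- Bulleted (unordered) list -->
<ul>
  <li>Fast crawling</li>
  <li>Clean markup</li>
</ul>

<!-- Numbered (ordered) list -->
<ol>
  <li>Capture the old URLs</li>
  <li>Set up 301 redirects</li>
</ol>
```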
Reduce code bloat.
As a site develops and new features are added, its code becomes bloated, and that starts to slow down page speed. Keep the code as lean as possible.
Reduce the use of HTML tables.
Like frames, layout tables are on their way out. Although tables are easy to create and manage, avoid using them for layout wherever possible and rely on CSS style sheets for both presentation and page layout instead.
Use absolute links in the navigation.
Developers like to use relative links, which make a site easy to move from the development server to the live URL. However, relative links can lead to problems when pages are interpreted or their content is copied elsewhere. It is better to use absolute links where possible, at least in the site navigation.
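The difference in a nutshell, with example.com standing in for your own domain:

```html
<!-- Relative link: resolves differently depending on the current page's location -->
<a href="about.html">About us</a>

<!-- Absolute link: always points to the same address, wherever it appears -->
<a href="https://www.example.com/about.html">About us</a>
```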
Disallow pages that should stay out of the search engines.
Use the robots.txt file to keep search engines away from pages they should not access. Disallowing these pages keeps the engines from reading any content on them. However, links to such pages can still appear in search results if the engine receives another signal indicating the page's value.
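A sketch of such a block; the paths are examples and should match your own private sections:

```text
# robots.txt: keep crawlers out of sections that should not be read
User-agent: *
Disallow: /admin/
Disallow: /cart/
```

Keep in mind that robots.txt only blocks crawling. To keep an already-known URL out of the results entirely, a meta robots "noindex" tag on the page itself is the stronger signal.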
Nofollow.
Nofollow is a link attribute that tells search engines not to pass ranking value through the link. If you do not want a specific link to pass value to another page, add the nofollow attribute to the link's code. Keep in mind what this costs: the link equity simply will not be passed to the page you refer to.
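In markup it is a single rel attribute on the anchor; the URL here is a placeholder:

```html
<!-- This link passes no ranking value to the target page -->
<a href="https://example.com/untrusted-page/" rel="nofollow">Example link</a>
```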
Check all links.
Before launching the site, be sure to check all links for broken ones. You do not want Google to find errors when it crawls your site, because broken links are one of the factors that lower a site's overall value. Repeat the check as soon as the site goes live, just to be sure everything works correctly.
Find ways to increase the speed of loading pages.
There is always something you can do to improve the site's speed. Look for even the smallest opportunities to make your pages load faster.
Decrease the number of links on the page.
Search engines recommend that any single page contain no more than 100 links. That does not mean you should treat that number as a target: remove the links that serve no purpose.
Eliminate duplication of content.
Do your best to prevent duplicate content. This rule is especially important for e-commerce sites, where the same content can often be reached in several ways. Each page should have one canonical URL; the rest should be eliminated. If you cannot eliminate every extra URL that produces duplicate content, use the canonical tag as a stopgap.
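The canonical tag is one line in the head of each duplicate variant; the URL below is a placeholder for your preferred address:

```html
<!-- In the <head> of every duplicate variant, point to the one true URL -->
<link rel="canonical" href="https://www.example.com/products/blue-widget/">
```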
Implement a proper hierarchy of heading tags.
Each page should have one, and only one, H1 tag. The remaining top-level heading tags (H2-H4) should be used only for content areas, leaving H5-H6 for navigation headings.
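Following the scheme described above (one H1, H2-H4 for content, H5-H6 for navigation), a page skeleton might look like this sketch, with indentation added only to show nesting:

```html
<h1>Page title (one per page)</h1>
  <h2>Main content section</h2>
    <h3>Subsection</h3>
  <h2>Another content section</h2>

<h5>Navigation heading</h5>
```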
Do not use session IDs.
Session IDs in URLs are another old technique that mysteriously survives to this day. There are far better ways to track visitors to your site, so this method should not be used on any resource.
Use search-engine-friendly link code.
Make sure all your links (except those you deliberately keep away from the search engines) use crawlable, search-friendly code. An incorrectly coded link can inadvertently cut the search engines off from very valuable content.
Implement structured data.
Structured data is additional markup of key elements in your content that helps search engines understand the purpose or value of that content. It can affect how your site appears in search results and what information the engines display about it overall.
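One common form is a JSON-LD block in the page head; the organization details below are invented for illustration and should be replaced with your own:

```html
<!-- JSON-LD structured data describing a hypothetical organization -->
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Organization",
  "name": "Example Store",
  "url": "https://www.example.com/",
  "logo": "https://www.example.com/logo.png"
}
</script>
```

schema.org defines many other types (Product, Article, BreadcrumbList, and so on) for marking up other kinds of content.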
Conclusion
Implementing each of the points above will move your site a step closer to the top positions in the search results. The purpose of this guide was to draw your attention to the key components of a successful site launch from the standpoint of search engine optimization. Skipping even one item can have dire consequences for further progress. If you miss some detail and the site goes live with errors, you may feel the negative consequences for a long time (even if you quickly fix the errors after launch).
It is best to go through this list with your developer and make sure each step is complete before approving the site and sending it live (even if that takes several weeks). Better to deploy the site later than to release a resource that burdens your business and creates bigger problems for its operation down the road.