Do a quick Google search for “SEO tips” and you’ll get over 14 million results. That is a lot of advice to wade through when you’re trying to determine the focus of your SEO strategy, and it’s all just a click away.

Every year, new posts appear listing the “hot,” “guaranteed” tips to work on. While many of these tips are great for getting real results, you first need a good foundation. In this post, I want to go back to the basics of SEO and talk about why they matter for long-term success.

When it comes to optimizing your website for search, the basics are some of the most important yet most often overlooked parts of SEO. The recent “content is king” push has caused a lot of people to forget the fundamentals and focus solely on content creation and distribution.

Here’s the deal: you can invest in all the content you want, but if your site isn’t optimized, you won’t get the rankings you’re after. So here are some basics to dig into before you turn to the more complex elements of search.

Crawlers

If it’s difficult for search engine crawlers to crawl your site, it’s difficult for them to index and rank your pages as well. As a website owner or SEO, your first and most important task is to make sure your site can be crawled and indexed. Using a robots.txt file, you can guide crawlers as they index your site.

There may be specific pages on your site that you don’t want crawled, such as a login page or private directories. You can block crawlers from files, pages, and/or directories by marking them as disallowed, like this:

User-agent: *
Disallow: /cgi-bin/
Disallow: /directory/
Disallow: /private.html

You can also block a specific crawler from accessing your site with the following (replacing “BadBot” with the actual name of the bot you’re trying to keep out):

User-agent: BadBot
Disallow: /

Be careful when blocking crawlers from your site; in fact, don’t do it unless you know for certain which bot is causing problems. Otherwise, you may end up blocking crawlers you actually want visiting your site, which can interfere with indexing and ranking.

If you use WordPress, there are plugins that let you do this. If you don’t use WordPress, you can easily set up a robots.txt file on your server yourself. More information about robots.txt here.
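
To tie it together, here is a minimal robots.txt sketch that allows all crawlers, keeps them out of a private directory, and points them to a sitemap (the directory and sitemap URL are just assumed examples, not real paths):

# Allow all crawlers, but keep them out of /private/
User-agent: *
Disallow: /private/

# Assumed location of the sitemap (covered in the next step)
Sitemap: https://www.example.com/sitemap.xml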

Once you’ve created the robots.txt file, it’s important to make sure that Google crawls your site. To do this, you need to create a sitemap. This can be done manually or with third-party tools. (If you run a WordPress site, there are many plugins that will generate the sitemap for you.)
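
If you build it manually, a bare-bones XML sitemap listing a single page looks something like this (the URL and date are placeholders, not real values):

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <!-- One <url> entry per page you want indexed -->
  <url>
    <loc>https://www.example.com/</loc>
    <lastmod>2017-01-01</lastmod>
  </url>
</urlset>

It’s common practice to save this as sitemap.xml in the root of your site so crawlers can find it.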

After creating the sitemap, log in to Google Webmaster Tools. (If you have not set up your site in Webmaster Tools, see this.) To submit the sitemap, go to “Crawl” and then “Sitemaps” in the left navigation, then click the “Add/Test Sitemap” button in the top right corner of the page. There you can test the sitemap and submit it to Google for indexing. (Please note that it takes some time before Google crawls and indexes your site.)

If you’ve already submitted a sitemap and just want to test or submit an individual page of your site, you can use the “Fetch as Google” function, which is also under “Crawl” in the left menu.

Once you log in, click “Crawl” in the left menu.
Then select “Fetch as Google.”
From there, enter the URL path of the page you want to test and click “Fetch.” (Leave it blank if you want to test the home page.)
Check the status. It should show a green check mark and say “Complete.”
Click “Request crawl,” if needed.

Making sure Google can crawl your site is critical to getting it indexed. And without being indexed, your site won’t rank no matter what else you do.

Site structure

In today’s mobile-first, web-obsessed consumer culture, a simple and practical site structure is sometimes overlooked. While I agree that a great mobile user experience is crucial and should come first, I also believe we shouldn’t forget about search engines. A solid site structure delivers a better user experience and helps you rank better.

While this seems like a simple idea, building a good website structure takes time and planning. Not only does it affect navigation and rankings, it also helps crawlers understand your content and its context. Site structure is all about organizing your content in a logical way. Don’t make users, or search engines, dig to find what they came to your site for; a hypothetical example of what that can look like is sketched below. Learn how to create a great website structure here.
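
As a purely illustrative sketch (these URLs are made up, not a prescription), a logical structure groups related pages under clear, shallow paths:

example.com/                           (homepage)
example.com/services/                  (category page)
example.com/services/seo-audits/       (individual service page)
example.com/services/link-building/    (individual service page)
example.com/blog/                      (blog index)
example.com/blog/seo-basics/           (individual post)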

Meta title and description

Title tags and meta descriptions are some of the most basic elements of SEO. While only the title is factored into the ranking algorithm, both are still very important. Google may not use the meta description for ranking, but that doesn’t mean you should ignore it. Crawlers still read descriptions, and any opportunity you have to tell a crawler what a page is about is one you should take.
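
For reference, here is roughly what these two elements look like in a page’s HTML (the title and description text are placeholders):

<head>
  <!-- The title tag appears as the clickable headline in the SERPs -->
  <title>SEO Basics: Crawling, Site Structure, and Meta Tags</title>
  <!-- The meta description is the snippet shown beneath the title -->
  <meta name="description" content="A quick guide to the SEO fundamentals: crawlability, sitemaps, site structure, and meta tags.">
</head>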

Titles and descriptions are often the first point of contact potential visitors have with your site in the SERPs. Here are some tips for writing better ones.

Titles

Optimize your page’s title tag around its focus keyword.
Don’t keyword stuff.
Stay within 50-60 characters.
Make it relevant to users.
Avoid duplicates.

Descriptions

Make it action-oriented.
Include your primary keyword.
Write easy-to-understand copy.
Stay within 135-160 characters.
Avoid duplicates.

Better titles and descriptions can lead to higher clickthrough rates and increased visibility for your site in search results. It’s important to note that when Google decides the metadata you’ve provided doesn’t match the user’s intent, it may rewrite what is shown in the results.

Before jumping into the latest and greatest SEO tactics, make sure the basics are covered first. It’s amazing what a few simple improvements to your website and overall online marketing strategy can do. Make sure your site can be indexed, create a structure that is both user and search engine friendly, and take the time to write better titles and descriptions. Getting the basics right helps build the solid foundation you need for long-term success.