Basic Search Engine Practice

When a new website or blog is created for a business, getting it found on Google is usually the first thing the owners think about. Search Engine Optimisation (SEO) is one of the most effective ways to make your website easy to find in Google's search results. You will still have to wait for your website to be crawled and indexed by Googlebot, but there are ways to make the process quicker. This article covers the basics of search engine optimisation: it describes how websites are added to and indexed by Google, and gives some of the most effective ways to get Googlebot to crawl your site and index your content as quickly as possible.

Adding & Indexing Your Website with Google

Getting your website indexed by Google may seem like an easy task, but it can be confusing to work out how to get your site discovered by Googlebot in the first place. Adding and indexing your website with Google is the foundation of any internet marketing effort. Here are some good ways to do it as quickly as possible; better still, some of these steps will also help you maintain a healthy volume of traffic to your new website.


  • Create a Sitemap – A sitemap is an XML file that lists the web pages of your site. It tells Google and other search engines which pages exist, when new content has been added, and how often to check particular pages for updates, so they can easily find new updates, products and content on your site. If your website was built with WordPress, you can install the Google XML Sitemaps plugin, which will automatically generate a sitemap for your website and submit it to search engines.


  • Submit the Sitemap to Google Webmaster Tools – After creating the sitemap, you will need to submit it to Google Webmaster Tools (now known as Google Search Console). If you don't have an account, create a free Google account and then sign up for Webmaster Tools. After adding your new website, go to the Sitemaps option under Optimisation and add the link to your website's sitemap. This notifies Google about your website and the pages already published on it.


  • Submit your website's URL to Search Engines – Some people advise against this, since there are other ways to bring a search engine's crawler to your website, but it is certainly quick and easy. To do so, sign in to your Google account and go to the Submit URL option in Webmaster Tools.
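To make the first step above more concrete, here is a minimal sketch of what a sitemap file actually contains, generated with Python's standard library. The domain and page paths are invented placeholders, and a real sitemap would list your own pages and dates; in practice a plugin or your CMS usually builds this file for you.

```python
# Minimal sketch: build a basic sitemap.xml from a list of page entries.
# The domain and paths below are placeholders, not real URLs.
import xml.etree.ElementTree as ET

def build_sitemap(pages):
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc, lastmod, changefreq in pages:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc                # page address
        ET.SubElement(url, "lastmod").text = lastmod        # date last changed
        ET.SubElement(url, "changefreq").text = changefreq  # hint: how often to re-check
    return ET.tostring(urlset, encoding="unicode")

pages = [
    ("https://www.example.com/", "2015-01-01", "weekly"),
    ("https://www.example.com/blog/", "2015-01-02", "daily"),
]
xml = build_sitemap(pages)
print(xml)
```

The `changefreq` value is only a hint to search engines about how often a page is worth re-crawling; it does not guarantee a crawl schedule.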


On-Site SEO Basics

On-Site Search Engine Optimisation is the practice of optimising individual web pages so that your site ranks higher in search engines and attracts more relevant traffic. It covers both the content and the HTML source code of a page. On-site SEO basics focus on all the attributes of a webpage that can improve your rankings in the search results; above all, the content on your website should be relevant to users' queries.

Google sends out its search bot software, called Googlebot, to collect information about your web documents and add it to Google's searchable index. Googlebot moves from one website to another looking for fresh and updated information to report back to Google. This is the process of crawling, whereby Googlebot works its way through a website by following links.

The information Googlebot collects is then processed through indexing. Once the files have been processed, they are added to Google's searchable index based on the quality and quantity of their content. During indexing, Googlebot processes the words on each page; title tags and ALT attributes are also analysed.

To find new content such as blog posts or pages, the webpages gathered during previous crawls are combined with the sitemap data supplied by webmasters. As Googlebot browses previously crawled pages, it identifies the links on those pages and adds them to the list of pages to be crawled. This is how new content on a website is discovered: through links and sitemaps.
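The link-following step described above can be sketched in a few lines of Python using the standard library's HTML parser: extract the links from a page's HTML and queue the ones that have not been crawled yet. The HTML snippet and the set of already-crawled pages are invented examples; a real crawler like Googlebot is vastly more sophisticated, but the basic discovery mechanism is the same.

```python
# Minimal sketch of link-based content discovery: parse a page's HTML,
# collect its links, and keep only the ones not yet crawled.
# The HTML and the crawled set below are invented examples.
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # Record the href of every anchor (<a>) tag we encounter.
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

page_html = '<a href="/blog/new-post">New post</a> <a href="/about">About</a>'
already_crawled = {"/about"}

parser = LinkCollector()
parser.feed(page_html)
to_crawl = [link for link in parser.links if link not in already_crawled]
print(to_crawl)  # links queued for the next crawl round
```

Pages found this way are merged with the URLs listed in your sitemap, which is why submitting a sitemap helps new content get discovered sooner.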



I've been looking for a provider who would help us develop a social network website for quite some time. We received over 50 proposals and in the end decided to go with a1dezine, and I couldn't be more happy with the decision. The work they did is simply amazing and I would recommend them for any project you are working on. They didn't only listen to what I wanted but added their own ideas and improvements, so I can just say one thing – fantastic guys to work with, and the website looks and works great!

Mr. Mitch