9 Ways to Reduce Your Shopping Cart Abandonment


Studies show that up to 75% of all online purchases that are started are never completed after the goods are placed in the basket.

In this article, we will talk about how to solve this problem on a site running WordPress.

What is shopping cart abandonment?

Cart abandonment occurs when a customer of an online store puts goods into a virtual basket and then leaves the site without completing the purchase. This is also known as a purchase refusal.

This can happen on a product page, during checkout, or at any other stage of the purchase.

Reasons for cart abandonment

There are many reasons why a user may not complete a purchase. On a slow site, for example, the abandonment rate can reach 75%.

At the same time, most customers who abandon their carts are simply browsing the virtual storefront and doing research. Only 2-3% of customers make a purchase on their first visit to an online store.

Here are the main reasons customers abandon a purchase:

  • Large additional costs for shipping, taxes, etc. (55%).
  • The need to register an account to make a purchase (34%).
  • An overly complicated registration process (26%).
  • Errors and failures on the retailer's website (17%).
  • Lack of a preferred payment option (6%).

How to reduce cart abandonment

Let’s start with technical optimization.

1. Increase the speed of the site

Long load times increase the abandonment rate, especially among mobile users.


A few years ago, Amazon published the results of a study showing that an extra second of page load time could cost the corporation $1.6 billion in lost sales.

2. Make the online store mobile-friendly

More than half of all users now access the Internet from mobile devices. At the same time, these are the users who abandon purchases most often.


Mobile optimization is an important usability factor for e-commerce. Use the Google Mobile-Friendly Test to find out how your site appears on mobile devices.

3. Start tracking

Use Google Analytics and its Enhanced Ecommerce tracking capabilities to measure your abandonment rate.

To do this, go to Conversions > Ecommerce > Shopping Behavior in the Google Analytics interface. Here you can find out at what point users abandon their orders.


4. Improve the checkout process

To reduce the abandonment rate, simplify the checkout process:

  1. Use one-page checkout – all stages of ordering should take place on one page. If you cannot fit everything on one page, at least reduce the number of steps.
  2. Remove mandatory registration – the need to register an account is one of the main reasons customers leave. Let customers purchase goods without registering. The WooCommerce plugin supports guest checkout out of the box (see the sketch after this list).
  3. Reduce the number of clicks – reduce the number of actions the user must take to buy a product. For example, customers should not be automatically redirected to the cart after adding a product: returning to the product catalogue is another click they have to make.
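
If you use WooCommerce, the last two points can be enforced with a couple of filters. Below is a minimal sketch for a theme's functions.php; it assumes WooCommerce's standard option names ('woocommerce_enable_guest_checkout' and 'woocommerce_cart_redirect_after_add'), so verify them against your WooCommerce version.

```php
<?php
// Allow customers to check out without creating an account (guest checkout).
// 'pre_option_{$option}' is a core WordPress filter that short-circuits
// get_option(), so the setting cannot be switched off by accident.
add_filter( 'pre_option_woocommerce_enable_guest_checkout', function () {
    return 'yes';
} );

// Keep customers on the product page instead of redirecting them to the
// cart every time they add an item.
add_filter( 'pre_option_woocommerce_cart_redirect_after_add', function () {
    return 'no';
} );
```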

Also, use a progress indicator to show customers how far they have come in the checkout process.


5. Increase customer confidence

Here are some signs that indicate the reliability of the online store:

  • Professional web design.
  • Several communication channels with company representatives.
  • Privacy policy.
  • Trust badges proving the reliability of the site (for example, antivirus or data protection logos).
  • Money-back guarantee.


6. Offer free shipping

Customers care more about avoiding additional shipping fees than about fast delivery.

Free shipping is a great way to reduce cart abandonment. If you can't offer free shipping to all customers, at least set a spending threshold above which customers qualify for free shipping.
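
As an illustration, here is a hedged WooCommerce sketch that shows shoppers how close they are to the free-shipping threshold with a cart notice. The $50 threshold is a made-up example; wc_print_notice() and WC()->cart->get_subtotal() are standard WooCommerce helpers.

```php
<?php
// Display a "spend X more for free shipping" notice at the top of the cart.
add_action( 'woocommerce_before_cart', function () {
    $threshold = 50; // Hypothetical free-shipping threshold.
    $subtotal  = (float) WC()->cart->get_subtotal();

    if ( $subtotal < $threshold ) {
        wc_print_notice(
            sprintf(
                'Spend %s more to qualify for free shipping!',
                wc_price( $threshold - $subtotal )
            ),
            'notice'
        );
    }
} );
```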


7. Offer your customers multiple payment options

A single payment system will not suit every customer.


For example, PayPal is not available in some countries. Therefore, users from these regions will not be able to make purchases in your online store.
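
One way to handle this in WooCommerce is to enable several gateways and hide the ones that do not work for the customer's country. A sketch, assuming the default 'paypal' gateway ID and placeholder country codes:

```php
<?php
// Hide the PayPal gateway for billing countries where it is unavailable.
add_filter( 'woocommerce_available_payment_gateways', function ( $gateways ) {
    if ( is_admin() || null === WC()->customer ) {
        return $gateways;
    }

    $unsupported = array( 'XX', 'YY' ); // Placeholder ISO country codes.
    $country     = WC()->customer->get_billing_country();

    if ( in_array( $country, $unsupported, true ) ) {
        unset( $gateways['paypal'] ); // 'paypal' is the stock gateway ID.
    }

    return $gateways;
} );
```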

8. Add pop-ups on exit

Many users who abandon purchases are simply checking prices. They plan to buy the goods but are looking for the best deal.

With a little push, you can convince such buyers to purchase from you. This can be done with pop-ups that appear during checkout when the user is about to close the browser window.


Possible plugins for implementing this feature: Hustle, Popup Maker, and Popup Builder.

9. Send shopping cart abandonment emails

Abandoned cart emails are an additional tool for reminding customers of the items left in their cart.
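
Dedicated plugins handle this end to end, but the core mechanics fit in a short sketch built on WordPress's scheduler and mailer. The 'my_cart_reminder' hook name and the one-hour delay are illustrative assumptions; a production version would also track guests and cancel reminders once an order is placed.

```php
<?php
// When a logged-in customer adds an item, (re)schedule a reminder email.
add_action( 'woocommerce_add_to_cart', function () {
    $user_id = get_current_user_id();
    if ( ! $user_id ) {
        return; // Guests would need separate tracking, e.g. an email capture form.
    }

    // Replace any previously scheduled reminder for this user.
    wp_clear_scheduled_hook( 'my_cart_reminder', array( $user_id ) );
    wp_schedule_single_event( time() + HOUR_IN_SECONDS, 'my_cart_reminder', array( $user_id ) );
} );

// The scheduled event sends the actual email via WP-Cron.
add_action( 'my_cart_reminder', function ( $user_id ) {
    $user = get_userdata( $user_id );
    if ( $user ) {
        wp_mail(
            $user->user_email,
            'You left something in your cart',
            'Your items are still waiting. Complete your order any time.'
        );
    }
} );
```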


Summarizing

Above, we examined the reasons why people abandon purchases. Once again, here are the steps that will help increase sales:

  1. Increase the speed of the site.
  2. Make your online store mobile-friendly.
  3. Track the stage of the purchase at which customers are “lost”.
  4. Improve the checkout process.
  5. Use trust indicators.
  6. Offer free shipping.
  7. Provide more payment options.
  8. Use exit pop-ups.
  9. Send abandoned cart emails.

Now you need to decide how to put these steps into practice.

 

 

The Impact of Crawlability and Indexability of a Site on a Top Position in Search Results

Today we’ll look at the impact of two important factors on SEO: crawlability (visibility to search engines) and indexability. But first, let’s see how search engines find web pages on the Internet.

How search engines find pages on the Internet

To find new web pages, search engines use special bots that search and index content. They browse sites like regular users, following links.

But search bots also see code that is not visible to users: alt attributes, meta description tags, structured data, and other elements that help search engines index content.

Brief definitions of the basic concepts:

  • Site crawlability – whether a site's content is accessible for crawling by search robots.
  • Indexability – whether a site's content can be indexed and displayed in search results.

Determining site availability for crawling and indexing

If the site does not appear in search results for queries related to your goods or services, then its content is either not indexed or is inaccessible to search robots.

What affects the availability of the site for search engines?

Here are just a few of the factors:

  1. Site structure: Can you get to the main page of your site from any other page? Think about how to optimize navigation through the site's sections and make it more convenient.
  2. Internal links to useful information: If the site has a blog post on a topic you already covered in another article, link the two pages. This shows search bots that the content is interconnected and helps them crawl the site more thoroughly.
  3. Deprecated and unsupported technologies: Make sure you are not using Ajax or JavaScript in ways that prevent search bots from crawling your content.
  4. Code errors that reduce the availability of the site: Robots.txt is the file through which content availability for crawling by search robots is configured. But content can also be unavailable for indexing because of errors in the web page's markup.
  5. Server errors and broken redirects: These not only increase the bounce rate but also prevent search bots from accessing your content.

How to help search bots find and index a site?

After you solve the above problems, use the following recommendations:

Step 1: Submit Sitemap to Google

It is important to build the sitemap correctly. This will increase the site's visibility in search results.

The sitemap sits in the root folder of the site and contains direct links to every page. Whenever content is updated, it alerts search engines that the changes need to be crawled.
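
Since WordPress 5.5, a core sitemap is served at /wp-sitemap.xml out of the box. A small sketch that advertises its location through the core 'robots_txt' filter so crawlers can find it:

```php
<?php
// Append a Sitemap directive to the virtual robots.txt WordPress generates.
add_filter( 'robots_txt', function ( $output ) {
    $output .= "\nSitemap: " . home_url( '/wp-sitemap.xml' ) . "\n";
    return $output;
} );
```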

Step 2: Update Your Content Regularly

Diversify your content: add images, videos, slides, audio, and other formats. This also gets your content indexed faster, since Google and other search engines spend more time crawling and indexing sites that are updated regularly.

Step 3: Increase The Number Of Internal Links

Internal links help search engines better understand the context and structure of your site. Search robots start crawling the site from the main page. Then they click on the internal hyperlinks, finding out the relationship between the various posts and pages.

Step 4: Speed Up Page Loading

Page loading speed and site speed are different concepts. Page loading speed is the time it takes to display content hosted on a specific web page.

Google provides a tool for measuring page loading speed. Search engines allocate a limited amount of time to crawling and indexing any site; this is called the search robot's crawl budget. It is therefore important for pages to load as quickly as possible, before the allotted time runs out.

In addition, slow loading increases the bounce rate and reduces conversion rates. It can also push the site's position down in search results.

Step 5: Avoid Duplicate Content

Duplicate content is a copy of the same material published in different places. It is hard for Google to figure out which instance of an article is more relevant to a given search query, so it is best to avoid duplicating content altogether.

What else can be done?

Extra Tip #1: Limit Redirects

Redirects are automated using special status codes defined by the HTTP protocol. They are usually used when a company changes its name, two sites merge, landing pages are split-tested, and so on.

Each page of the site should have no more than one redirect. Always use 302 redirects for temporary moves and 301 redirects for permanent ones.
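
In WordPress, a single-hop permanent redirect takes only a few lines. The '/old-page' and '/new-page/' paths below are hypothetical; the point is to send the old URL straight to the final destination so no chain forms.

```php
<?php
// Redirect a retired URL to its replacement with a single 301 hop.
add_action( 'template_redirect', function () {
    if ( '/old-page' === untrailingslashit( $_SERVER['REQUEST_URI'] ) ) {
        wp_safe_redirect( home_url( '/new-page/' ), 301 ); // 301 = permanent.
        exit;
    }
} );
```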

Extra Tip #2: Turn on Compression

Compression reduces the size of the files your server sends, so pages take less time to load. As a rule, GZIP compression is already enabled on the hosting. If it is not available, try other compression tools.

Extra Tip #3: Optimize Your Images

On average, images account for 60% of a web page's weight. A large amount of heavy graphic content significantly reduces page loading speed, so make sure all images are compressed.
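
One low-effort lever in WordPress is the JPEG quality used when intermediate image sizes are generated. A sketch using the core 'jpeg_quality' filter (the core default is 82); tune the value to your images:

```php
<?php
// Compress generated JPEG sizes more aggressively than the default.
add_filter( 'jpeg_quality', function () {
    return 75; // 0-100; lower means smaller files and lower quality.
} );
```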

Other recommendations for using images:

  • Use unique images that are relevant to the web page.
  • Strive for the highest quality possible.
  • Use alt text for accessibility.

Extra Tip #4: Watch the Size of Above-the-Fold Content

The top half of the page is what visitors see as soon as they land on the site. Put the most interesting and compelling information you have in this area. Make sure your HTML markup is organized so that this content renders quickly; its size should not exceed 148 KB compressed.

Extra Tip #5: Configure Caching

Caching can reduce the loading time of web pages. This reduces the bounce rate and improves the SEO performance of the site.

Google experts found that an extra half a second of load time can reduce traffic volume by 20%. Loading speed is therefore an important ranking factor.
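
In WordPress, the Transients API is the simplest caching layer. A sketch that caches a hypothetical remote API response for an hour, so repeat page loads skip the slow request:

```php
<?php
// Fetch exchange rates, serving a cached copy when one is still fresh.
function my_get_exchange_rates() {
    $rates = get_transient( 'my_exchange_rates' );

    if ( false === $rates ) {
        // Cache miss: make the slow request. The URL is illustrative only.
        $response = wp_remote_get( 'https://api.example.com/rates' );
        if ( is_wp_error( $response ) ) {
            return array();
        }
        $rates = json_decode( wp_remote_retrieve_body( $response ), true );
        set_transient( 'my_exchange_rates', $rates, HOUR_IN_SECONDS );
    }

    return $rates;
}
```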

Extra Tip #6: Minify Resources

Minification is the removal of whitespace, comments, and other elements of the code that do not affect how the site works.

To minify your WordPress site, use the WP Fastest Cache plugin or other specialized tools, such as the Google Closure Compiler for JavaScript.

Conclusion

It is necessary to constantly optimize and effectively manage your site in order to improve its accessibility for search engines. The tips and tricks given in this article will help you with this.

 

 

How I Increased Website Traffic by 600% in 24 Months


Each business niche has its own characteristics. An SEO strategy that worked for one site may not be successful for another.

 

In this article, I will share several SEO optimization methods that allowed me to multiply the traffic my client’s site receives. Remember that this is just one case study; still, the methods described here can be applied to almost any promotion campaign.

 


The Starting Point

Initially, my client’s site was visited by about 20 thousand people a month who came from search engines.


To reach this level of traffic, the client promoted his site for competitive keywords with a monthly search volume of 1,000 to 5,000 queries.

But there was potential for growing the audience. The client's team wrote quality content. The problem was a lack of SEO optimization and quality links.

 

4 SEO techniques that allowed traffic to take off

1. An in-depth audit of content

First, I identified the articles that generated the most traffic (using Google Analytics and the Google Search Console). At that time, hundreds of posts were published on the site’s blog. But only ten of them attracted traffic.

Therefore, I removed 90% of the content from the blog.

I put each post in one of two “buckets”:


1. Modify, optimize, and republish: this bucket was intended for content with untapped potential. A post fell into it if:

  • It had excellent potential for keywords it didn’t yet rank for.
  • It offered unique content with the potential to be expanded.

Only five previously published posts fell into this category. I asked the author to expand each of them with new sections targeting long-tail keywords. He then updated the publication date in WordPress and promoted them as new publications (while keeping the same URLs).

2. Unpublished drafts and redirects: this bucket was for content without much SEO potential. After deleting such posts, I set up 301 redirects. The articles were saved as unpublished drafts in WordPress so their content could be reused in future posts.

By the end of the audit, the blog had 30 posts (including the ones we updated). But over the next few months, they generated far more traffic than before.


2. Continuous link building

To earn new links, I began emailing owners of related sites on behalf of my client, offering free access to his courses. Over time, I learned a simple rule: if you have great content, contact people who publish similar content and offer them something exclusive.


3. Ongoing creation of new SEO-oriented content

In parallel, I worked with the author to create new SEO-oriented content based on the best keywords in the niche. Before writing a new article, I did in-depth keyword research, then sorted the list and selected keywords with low competition. Each article targeted a group of related keywords rather than a single one.

Thanks to the Hummingbird search algorithm update in 2013, a single piece of content can now rank in Google for thousands of keywords. In two years, we created 25 such articles. Within a few months of publication, they began reaching the top of search results for hundreds of queries.


Today 72 published blog posts attract more than 140 thousand visitors from search engines every month.


The important thing is not the number of publications, but how valuable and optimized the content is.

4. Implementing an internal linking strategy


When I started working with the client, most pages on his site had only a few internal links, placed without a clear strategy. After the content audit, I increased the number of internal links and optimized the anchor text of the existing ones.

In total, about ten internal links were added to each post. The anchor text used included:

  • The target keyword of another post.
  • A variant of the target keyword.
  • A long-tail secondary keyword.

To speed up the process of optimizing your internal linking:

1. Sort the posts by publication date and enrich them one by one with internal links. To do this, you will need a spreadsheet with links to all posts published on the blog.

2. Use the internal WordPress search to find specific phrases. Search for your target keywords to find publications that already use them (see the sketch below).


Then review the publications you found and add internal links to the post.
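
The same lookup can be scripted. A sketch using WP_Query's 's' (search) parameter to list published posts that already mention a target phrase; the phrase is a placeholder, and you would run this from WP-CLI or a throwaway template:

```php
<?php
// List posts that already contain the phrase, as internal-link candidates.
$query = new WP_Query( array(
    's'              => 'target keyword', // Phrase you want to interlink.
    'post_type'      => 'post',
    'post_status'    => 'publish',
    'posts_per_page' => 50,
) );

foreach ( $query->posts as $post ) {
    echo esc_html( $post->post_title ) . ' => ' . get_permalink( $post ) . "\n";
}
```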

Traffic grows with the right SEO strategy, not just SEO tactics!

Initially, I used strategies that did not bring results:

  • Sending templated outreach emails.
  • Creating content that was not unique or compelling.
  • Analyzing keywords after receiving an article from the author rather than before it was written.

So the problem was that I was applying the right SEO tactics in the wrong way. I wasn't focused on helping the author create valuable content that would set him apart from other authors, and I wasn't contacting people who had already linked to similar resources or written quality material on the topic.

Fixing these shortcomings helped me make a breakthrough in my SEO work, not only with this particular client but with all the others.

 

 

5 Common Problems That You Can Avoid Using A Dedicated WordPress Hosting


Let’s consider the problems that can be avoided using dedicated hosting for WordPress.

In this article, we will explain the differences between the most popular types of web hosting.


Different types of website hosting

There are five types of hosting:

  • Shared hosting – the site is hosted on a server along with other sites. This is the cheapest option, but the other sites on the server can adversely affect your resource.
  • Virtual Private Server (VPS) – a shared server with isolated space for each site.
  • Dedicated hosting – the site is hosted on its own physical server, so you don't have to worry about the impact of other sites. But this type of hosting is expensive.
  • Cloud hosting – multiple servers work as one big one. Cloud hosting handles large volumes of traffic well, and you only pay for the resources you use.
  • Managed WordPress hosting – the hosting provider handles configuration and technical support of the site.

5 Common Problems That You Can Avoid Using Dedicated WordPress Hosting

Next, we’ll look at a few issues that you can avoid by choosing dedicated WordPress hosting:

1. Security threats from other sites and obsolete files

Placing your site on a shared server is risky: performance and security issues on other sites can affect yours.

Hosting the site on a dedicated server eliminates these threats. In addition, the service usually includes automatic WordPress updates. Regularly updating the CMS is essential for maintaining a high level of site security.

2. Poor site performance

On a shared server, your site shares capacity and space with other resources. This slows it down, since the server handles a large number of user requests.

A slow site is rarely successful, so a dedicated server is a wise investment. It also solves the downtime problem: traffic spikes on neighbouring sites can overload a shared server and take your site down with them.

3. Difficulties with site restoration

You always need backup copies of the site so that you can restore it quickly. But this process can take a long time on shared hosting.

Dedicated hosting with automatic backup service speeds up the recovery process.

Hosts that provide this service sometimes also offer one-click site recovery options, which let you quickly bring your site back in case of a serious problem.

4. Not enough room for the growth of the site

Over time, the site grows and develops. This requires more disk space than a shared server offers, as well as a server that can handle growing traffic volumes.

5. Lack of full-scale technical support

Most hosting providers offer minimal support. Providers of WordPress-specific features and services are better qualified, so they can help you when a WordPress error occurs.

Conclusion

Dedicated hosting for WordPress solves the following problems:

  1. Security threats from other sites and obsolete files.
  2. Poor site performance.
  3. Difficulties with site restoration.
  4. Disk space shortage.
  5. Lack of support for fixing bugs and other problems.

 

 

Harm or Benefit: Google to Consider Nofollow Links When Ranking


A few weeks ago, Google announced on its blog changes to how it treats the “nofollow” attribute. It is now advisory in nature: the search engine itself will decide whether to take a link into account when ranking.

The news was unexpected and drew a heated and mixed reaction from the SEO industry. One gets the feeling that, for now, only Google fully understands the reasons for these measures.

What is “nofollow” and why did Google decide to change it?

The “nofollow” attribute was introduced in 2005 to combat link spam in comments. It told the search engine not to follow a link, pass weight through it, or count it as a ranking signal. Over time, forums and large sites began to use it too, since many users registered there only to drop a link. Closing off links spread across the web, and for a long time SEOs passed around treasured lists of “dofollow” sites with open links, which, due to the influx of spammers, eventually closed as well.

Currently, almost all large sites with lots of user-generated content (Wikipedia, Twitter, Facebook, etc.) automatically add “nofollow” to every URL on the page. Less popular sites adopt the practice in the hope that the attribute will make them look “cleaner” and more attractive to search engines. In the end, everyone does it, but few understand why.

The problem is that by using “nofollow”, sites simply prevent links from being crawled and evaluated by the search engine. Because of this, search engines lose sight of a large amount of information that could be used to improve their algorithms. Having finally recognized the problem, Google decided to radically change the rules for handling nofollow links and give them a chance to participate in ranking. Now the search engine itself will determine whether to take a link into account.

What exactly is changing

For Google, rel=”nofollow” will no longer be a strict prohibition on crawling a link and passing its weight/authority (although, as some studies and experiments have shown, the attribute was never an absolute block). It will now act as a hint that the search engine takes into account when deciding the “fate” of a link.

For now this applies to passing weight to a page and consideration in ranking, but from March 1, 2020, the search engine will also crawl and index these URLs at its own discretion.

In addition, there are two new attributes for marking advertising and user-generated content: rel=”sponsored” and rel=”ugc”. They also serve only as hints, but they give the search engine useful information.

The new attributes can be combined with “nofollow”. For example, if a webmaster doesn’t want the search engine to take links from blog comments into account, they can use rel=”nofollow ugc” (see the sketch below).
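
In WordPress this is nearly a one-liner: wp_rel_ugc(), added in WordPress 5.3 as far as I can tell, appends rel="nofollow ugc" to the links in whatever text it is given. A guarded sketch:

```php
<?php
// Mark every link inside comment text as user-generated content.
if ( function_exists( 'wp_rel_ugc' ) ) {
    add_filter( 'comment_text', 'wp_rel_ugc' );
}
```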

What to do now?

Nothing. Google representatives have assured that there is no need to make changes if everything was done according to the rules. Even if you used “nofollow” to mark advertising links and continue to do so after the new attributes appear, it’s no crime. Using “sponsored” and “ugc” helps search engines understand the type of link, but it is not mandatory.

Why some have perceived the changes negatively

To many, the measure may seem half-hearted: Google introduced new attributes that help it understand the nature of links, yet left their use at the discretion of webmasters, giving them little incentive to help. Realizing that the innovations add a headache, the search engine does not demand decisive action. Perhaps over time Google will start rewarding sites for using “sponsored” and “ugc”, but for now the benefit to webmasters is unclear.

As for the “sponsored” attribute, many members of the SEO community are wary of it: what if, after a massive “coming out”, Google introduces a penalty for those who, in its opinion, post too much advertising?

Another concern among SEOs relates to the changes coming next year, when Google will begin crawling “nofollow” links at its own discretion. A common practice is to close off links to utility pages (online store filters, shopping carts, etc.) so that the robot does not waste time on them. If Google itself chooses whether to crawl a URL, will this lead to indexing problems?

Search engine representatives say there is no need to abandon the usual practice. As Danny Sullivan said on Twitter, in most cases the transition to the new advisory model does not change how search treats such links.


 

This information was confirmed by John Mueller on September 20 during a Hangouts chat with webmasters. He said that the changes affect external links; for internal links, everything remains the same.


Why this is actually good

The search engine's blog posts and social media replies share one common message: the innovations will not bring any major changes. But whatever Google's assurances, there are still some nuances to consider.

Getting links from authoritative resources is becoming increasingly difficult, and the use of “nofollow” devalues that work. Yet if you look closely, even links from forums can be useful, since experts, not just spammers, often post there.

If Google itself evaluates the usefulness of links regardless of the attribute, this can play into the hands of honest link builders, giving them more of an advantage in the rankings. Over time, the algorithm will learn to recognize low-quality links even better and will make grey schemes close to impossible.

Many SEOs believe that Google took nofollow links into account before, and has only now openly admitted to the practice. The innovations will put an end to years of disputes among SEO specialists and bring greater clarity to the link-building process.

Conclusion

The changes have only recently come into force, so we can only guess what they will actually bring. It is obvious that Google is moving by leaps and bounds toward better search and is using every means available to it. So, corny as it may sound, it's time to move to the bright side and invest more effort in building a quality link profile for your site.

 

 

Great Ideas About Search Engine Optimization


So, you have made the decision to utilize SEO tactics for your site. That is a great thing! But, with all the information out there, finding a good place to begin can be difficult.

There’s no need to worry: the search engine optimization tips you need are right here.

The following SEO tips will assist you as you get started on this journey. 

Search engine optimization is perhaps one of the greatest marketing tools to come about online, but without proper article submissions it won’t work out to your liking. That is why it’s imperative to search and find the best article directories to submit your hard-earned work and watch the numbers start to add up.

Use alt tags for images and span element titles to your advantage.

Search engines look at a site’s code, not what is actually visible to a user, so if your keyword is “cat” and there is a picture of a calico cat on your site, using an alt tag of “a calico cat” for the image will expose the search engine to your keyword even if the user never sees it. The title of a span element works in the same way.
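
As a safety net, a short WordPress sketch can fall back to the attachment's title whenever an image is missing alt text. Hand-written, descriptive alt text is still the better option:

```php
<?php
// Supply a fallback alt attribute for images rendered from the media library.
add_filter( 'wp_get_attachment_image_attributes', function ( $attr, $attachment ) {
    if ( empty( $attr['alt'] ) ) {
        $attr['alt'] = get_the_title( $attachment->ID );
    }
    return $attr;
}, 10, 2 );
```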

Websites need to be regularly refreshed with new content and pages to help with search engine optimization.

Keywords are great, yet they can only help your site to an extent. If you are writing about popular subjects, it is easy to get lost in the crowd. You do not want your website ranked low on a search engine. Keep your content fresh by linking to appropriate and influential high-ranking websites on a regular basis.

Avoid relying on Flash content, because it won’t help with SEO.

While it looks great and can be impressive, you cannot link to individual pages within a Flash site. For best results, don’t rely completely on Flash. If you want to use it, do so sparingly.

Include the most important keywords for your site in the left-hand navigation bar and title of your homepage.

These texts will be scanned before the main text on your website, so you should include the keywords with which you would like your site to be most closely associated.

Don’t use generic words in your keywords list, like “computers” and “books”.

This will generate too many results, and your site will most likely not show at the top. Using more specific phrases like “digital marketing course in Chennai” is less competitive and can be more effective for your site.

It should go without saying that one of the keys to search engine optimization is to promote your website.

Make great use of all the social networking tools at your disposal and don’t forget to set up a newsletter and RSS feeds to give your visitors new reasons to keep coming back for more.

URL extensions are like differently shaped light bulbs.

They all light up a room. In other words, using .html, .htm, or .php will not change how a search engine views your website. You can use whichever extension you choose. There is no distinction, and it has no impact on your ratings.

You should make use of the keyword tool from Google AdWords to optimize for search engines.

The keyword tool will find the most popular keywords related to your website. It shows you the number of searches for a word or phrase that you enter. Use this tool to find the best overall words or phrases to use for your site.

Sometimes it’s helpful to ask yourself what key words you would type in to search for your particular business, and then include those in your site.

Add keywords to both your title tag and main content, but keep your keyword density to a sane level to avoid getting the dreaded “keyword stuffer” label.

If you come across favorable reviews, stories, or mentions of your brand or product on another site, capitalize on the free publicity by linking back to that site (and possibly even returning the favor). This is a popular and highly successful tactic that increases your exposure to online visitors to other sites that may not even be directly related to your business.

Increase your visibility to search engines by taking steps to ensure that your site’s title, keyword tags, and page descriptions are not duplicated anywhere within the domain.

Each and every page must have its own unique title, meta description, and meta keywords tag embedded within the site’s HTML code.
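
SEO plugins handle this comprehensively, but as an illustration, here is a minimal WordPress sketch that prints a unique meta description for each post from its excerpt:

```php
<?php
// Output a per-post meta description in the document head.
add_action( 'wp_head', function () {
    if ( is_singular() && has_excerpt() ) {
        printf(
            '<meta name="description" content="%s">' . "\n",
            esc_attr( wp_strip_all_tags( get_the_excerpt() ) )
        );
    }
} );
```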

All the SEO in the world won’t help you if your website host is unreliable!

Before you choose a host, check their reviews to ensure they’re stable and easy to work with. A website that is down is one that is NOT making money. Also, make sure they have good customer service. You never know what issue might come up that you’ll need help with!

If you are optimizing a blog, your post title tag should be optimized separately from your blog title.

It is important to try to use the major keywords you’ve selected for the topic of your blog in the post title tag as major Internet search engines will index those tags and put a high priority on them.

A powerful search engine optimization tool is Google’s Webmaster Tools.

This program allows you to see how Google’s search engine robots experience your site so that you can change things to make it easier for them to navigate as well as discover what weaknesses your site may have so that you can address them.

Optimize your internal links, too.

Not only does using keywords for internal linking increase the ease of navigation throughout your site, but it can also boost your search engine rankings. Use intelligent internal links, such as “Contact [business name]” rather than “Contact Us”, or “View our [item keywords]” rather than “View our listings”.

After reading this article, you should have a working knowledge of search engine optimization. That was a lot to read and think through, but at least you should have an idea of what to do and where to begin with the SEO of your site. If need be, take a look at this piece again.
