Thursday, June 21, 2012

What is duplicate content?

Internet Marketing Service
In the field of search engine optimization, one term has been making the rounds for many years, triggering fear and insecurity among optimizers: so-called duplicate content. But what is this all about? Let's get to the bottom of this question.

The fact is that search engines, and especially market leader Google, place special importance on finding unique content on each web page. Conversely, this means that search engines do not like it at all when the same text content appears twice on one page or on different sites across the Internet.

The rationale for this approach is clear. A searcher will certainly not be pleased if, for a specific topic, the search engine proposes five different sites that all carry the same text content. For this reason, search engines try to detect duplicate content on the Internet and filter it out accordingly, for example by banning the affected web pages from the index, or at least placing them a whole lot further back in the results.

Professional search engine optimization (SEO) for business websites.

For webmasters it obviously spells disaster if they, consciously or unconsciously, produce such duplicate text content. It is therefore important to know how duplicate content comes about and how it can be avoided from the outset.

Let us assume that you wrote the text content of your website yourself and did not copy it from another website. If you did copy it, you should delete the content in question quickly, otherwise you risk serious legal consequences (warning letters, lawsuits, etc.). On the other hand, you should also investigate whether another webmaster has copied text content from your page. Simply copy any passage from your text and enter it, enclosed in quotation marks, into the search engine. If nothing turns up, you can at least assume that nobody in the search engine's index is using texts copied from your site.

Next step: check whether your site's domain is accessible at several different addresses on the Internet. It might look as follows:

http://yoursite.com

http://www.yoursite.com

http://www.yoursite.com/index.html

If this is the case, there is a risk of producing duplicate content internally on your site. The workaround is to set up a permanent redirect using the .htaccess file. Place a text file named ".htaccess" on your web server and fill it with the following content:

RewriteEngine On

RewriteCond %{HTTP_HOST} ^www\.yoursite\.com$ [NC]

RewriteRule ^(.*)$ http://yoursite.com/$1 [R=301,L]

A problem could also arise from dynamic URLs. This is particularly the case if your website is written in PHP and you convert the cryptic PHP URLs into "speaking" (human-readable) URLs. If so, make sure that old or unneeded pages are removed from the server immediately, so that in the end only the new URLs remain available.
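How such a conversion can work may be sketched with a mod_rewrite rule in the same .htaccess file; the script name article.php and the /article/ URL pattern below are placeholder assumptions for illustration, not taken from any particular site:

```apache
RewriteEngine On
# Map the "speaking" URL /article/123 internally to the PHP script.
# (article.php and the /article/ prefix are placeholder names.)
RewriteRule ^article/([0-9]+)/?$ article.php?id=$1 [L,QSA]
```

With a rule like this, only the speaking URL should be linked and indexed; the old cryptic version (e.g. article.php?id=123) must then no longer be offered anywhere, otherwise both variants exist side by side and create exactly the internal duplicate content described above.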

Next problem: have you recently moved your website to a new domain? Then duplicate content could also arise because the old server, which is still reachable under its IP address, automatically serves the same content as the new one. Solve the problem by deleting the content from the old server or switching it off completely.
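As a common alternative to switching the old server off (not part of the original advice, but standard practice), you can permanently redirect the old domain to the new one; old-domain.com and yoursite.com below are placeholder names:

```apache
# In the .htaccess of the OLD domain: send every request
# permanently (301) to the same path on the new domain.
# (old-domain.com and yoursite.com are placeholders.)
RewriteEngine On
RewriteCond %{HTTP_HOST} ^(www\.)?old-domain\.com$ [NC]
RewriteRule ^(.*)$ http://yoursite.com/$1 [R=301,L]
```

A 301 redirect tells the search engine that the content has moved for good, so the old URLs are replaced in the index instead of competing with the new ones.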

If you follow these tips and tricks, the risk of producing duplicate content should be vanishingly small. You thereby create the best possible conditions for your website to be positioned as high as possible in the search results.

Friday, June 15, 2012

New algorithm promises to change the Google search results

Basically, Panda is an update of the Google search algorithm, and its main mission is to penalize so-called "content farms" and "link farms", which are increasingly common on the web, and thus to benefit blogs and sites that produce original, quality content.

Many websites and blogs today survive on revenue from clicks on banners and on ads generated through affiliate programs and services like Google AdSense. To generate enough traffic and fast indexing, they use mass publication of content (articles, news, reviews and even videos) as a tactic. Many do so without worrying about its quality, origin or even veracity, using content copied in full. Others do not even publish real content: they just create a new URL and a title using the most popular keywords of the moment, publish a brief paragraph and surround it with a lot of ads (I hate this).

Fortunately, this change will be quite ruthless, in algorithmic terms, in the indexing and ranking of these sites and blogs, which will have to change their strategies or disappear quickly.

According to the article by Peter Day on the Official Google Blog, only something between 6 and 9% of organic searches will be affected by the new algorithm. In the United States, when it was released, the new algorithm affected about 12% of search engine results. The most affected sites are article directories where any user can freely publish content; for this reason, most of their content is a copy of what was originally published on the websites or blogs of the authors.

In the image below we can see the decline in traffic to U.S. sites that use "content farm" and "link farm" strategies:

[Image: traffic decline for U.S. content-farm and link-farm sites after the update]

This new Google algorithm should improve the user experience in every way. Everyone is tired of finding among the first results links that lead to low-quality sites, feeling deceived and wasting their time. This had already started to affect Google's credibility as a search engine.

Olimpia's tips for creating a quality website or blog and improving your results are:

  • Consider the user first: This is nothing new, but it has become paramount. Google considers relevant the sites that are relevant to their users, and it measures this through indicators we can access in Google Analytics itself, such as average pages per visitor, the time users spend on the site and the bounce rate (and therefore engagement, i.e., how often the same user returns to your site);
  • Establish a "theme" for your site or blog: Sites and blogs specializing in a specific subject become more relevant to users and, consequently, to the search engines;
  • Publish original, quality content: Your users will not return to your site if they find nothing different there;
  • Write correct Portuguese: Panda will also penalize content with many errors of grammar, spelling and agreement on the same page. Errors are common and acceptable, but do not overdo it;
  • Avoid copying content and news, in whole or in part: Quotations are still useful and enrich content, but without exaggeration. Citing your sources remains essential;
  • Continue using standard SEO formatting: You can continue using H1, H2 and H3 tags for titles and subtitles; bold and italics to highlight important parts of the article; the "title" attribute on links and "alt" on images; hyperlinks and related items;
  • Avoid excessive use of banners and advertisements;
  • Avoid images and applications that make your site heavy: Page loading time becomes more relevant from now on;
  • Create a clean, easy-to-navigate site without advertising hype: Usability is also a key factor. Keeping users on the site longer and avoiding "bounces" are important ways to demonstrate to Panda that your site is relevant.

The Panda algorithm is just one of more than 500 changes that Google has announced it will make to its search algorithm by the end of the year. This demonstrates that it is increasingly essential to improve the quality of your site in every respect.