AnalyticaHouse

Marketing tips, news and more

Explore expert-backed articles on SEO, data, AI, and performance marketing. From strategic trends to hands-on tips, our blog delivers everything you need to grow smarter.

What Is a 301 Redirect and Why Is It Important for SEO?
Sep 4, 2022 13418 reads

What Is a 301 Redirect and Why Is It Important for SEO?

Websites may need to move pages to other URLs for various reasons. In some cases, an entire domain is even moved to another domain name. In these and similar situations, visitors who open the old page are sent to the new page or site with a 301 redirect.

What Is a 301 Redirect?

As stated above, a 301 redirect ensures that visitors and search engine bots arriving at an old URL are taken to the new page when a page, or an entire website, is permanently moved to another address. If a page is only being redirected temporarily, a 302 redirect should be used instead.

A 301 redirect matters both for site visitors and for search engine bots. If a 301 redirect is not set up for a removed page, visitors and search bots will land on a 404 error page. This leads to a poor user experience and a crawl budget problem.

The name "301" comes from the HTTP status code returned by the redirected page. For instance, https://analyticahouse.com redirects to https://www.analyticahouse.com/.

In its simplest form, a 301 redirect tells the browser that the page at the requested address has been moved permanently, that the redirect target is the new location of the page, and that the old address is not intended to be used again.
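To see this in practice, you can request a redirected URL without following the redirect and inspect the status code and the Location header. The sketch below is a minimal example that assumes the third-party requests library is installed; the URLs are the ones mentioned above.

```python
# A minimal sketch for inspecting a redirect, assuming the third-party
# "requests" library is installed (pip install requests).
import requests

response = requests.get("https://analyticahouse.com", allow_redirects=False)

# For a permanent redirect we expect status code 301 and a Location
# header pointing at the new address.
print(response.status_code)               # e.g. 301
print(response.headers.get("Location"))   # e.g. https://www.analyticahouse.com/
```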
How To Do a 301 Redirect?

There are several ways to set up a 301 redirect. The most common methods are covered under the headings below.

htaccess 301 Redirect

The first method is redirecting through the .htaccess file in the site's file manager. When you open the file manager, this file is usually located under the public_html folder. If you cannot find the .htaccess file in your site's file manager, it is probably due to one of three reasons:

- The .htaccess file may be hidden. This file is usually marked as hidden by default. To fix this, open the "Settings" section of the file manager and enable the "Show Hidden Files (dotfiles)" option. The .htaccess file should then appear under the public_html folder.
- The .htaccess file may not exist. If the file is still not visible after showing hidden files, it probably does not exist yet. There is nothing to worry about: creating one is easy. Open an empty Notepad file on Windows, or an empty TextEdit file on macOS, and save it with the name .htaccess.
- The website may not be running on an Apache server. Only Apache servers use .htaccess files, so if the site is hosted on a different web server, redirects are configured with that server's own directives. Besides Apache, the most widely used web servers are Nginx and Windows/IIS, each of which has its own redirect configuration.

Before moving on to the redirect rules, remember that the .htaccess file is read line by line. When adding rules, it is good practice to leave a blank line after the added code; this blank line indicates that the rule block is finished.

Different rules are used for different purposes when redirecting with the .htaccess file. Let's take a closer look at these situations and the rules they require.

Redirecting an Old Web Page to a New Page

The rule that needs to be added to the .htaccess file to redirect an old page to a new one is:

Redirect 301 /old-page.html /new-page.html

Redirecting a Non-www Address to a www Address

The rules for redirecting an address without www to the www version are:

RewriteEngine on
RewriteCond %{HTTP_HOST} ^example\.com [NC]
RewriteRule ^(.*)$ https://www.example.com/$1 [L,R=301]

Redirecting a www Address to a Non-www Address

The rules for redirecting an address entered with www to the non-www domain name are:

RewriteEngine on
RewriteCond %{HTTP_HOST} ^www\.example\.com [NC]
RewriteRule ^(.*)$ https://example.com/$1 [L,R=301]

How to Make a WordPress 301 Redirect?

WordPress is the most widely used content management system (CMS) in the world, largely because its plugins make many different operations easy. A 301 redirect on a site with WordPress infrastructure can likewise be done with a free plugin called Redirection. You can install it from the Plugins menu; once activated, its menus let you set up redirects quickly.

After installation, the plugin appears under the Tools menu and can be opened by clicking the Redirection button. On first use, the plugin asks you to complete its setup. Then, clicking the "Add New" button opens the redirect form. Enter the old URL in the Source URL field and the new URL in the Target URL field, then press the "Add Redirect" button to complete the redirect.

How to Do a 301 Redirect with Cloudflare

Many websites rely on third-party services for security and for controlling the traffic that reaches the site. The best known, and the most preferred in terms of user experience, is undoubtedly Cloudflare. For sites that route their traffic through Cloudflare's servers, 301 redirects are set up from the Cloudflare dashboard. The steps for a Cloudflare 301 redirect are as follows:

1) First, click "Page Rules" under the "Rules" menu on the left side of the Cloudflare panel.
2) On the screen that opens, enter the URL to be redirected in the indicated field.
3) Then enter the exact target URL in the specified field.
4) In the "Select status code" menu, choose whether the redirect is permanent or temporary: 301 for permanent redirects, 302 for temporary ones.
5) Finally, click the "Save and Deploy Page Rule" button to save and complete the rule.
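Whichever method you use (.htaccess, the Redirection plugin, or Cloudflare), it is worth verifying that each old URL really answers with a 301 and points at the intended target. The sketch below is only an illustrative check: the redirect map is a hypothetical example, and the third-party requests library is assumed.

```python
# Minimal sketch: verify a set of 301 redirects after setting them up.
# The mapping below is a hypothetical example; replace it with your own
# old-URL -> new-URL pairs. Requires the third-party "requests" library.
import requests

redirect_map = {
    "https://example.com/old-page.html": "https://example.com/new-page.html",
}

for old_url, expected_target in redirect_map.items():
    response = requests.get(old_url, allow_redirects=False, timeout=10)
    status = response.status_code
    location = response.headers.get("Location", "")
    ok = status == 301 and location == expected_target
    print(f"{old_url} -> {status} {location} {'OK' if ok else 'CHECK'}")
```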
What is the Importance of 301 Redirects for SEO?

The 301 redirect is, of course, a concept closely related to SEO, since it directly concerns the link structure of the website, and setting 301 redirects up correctly directly affects the website's rankings.

This is closely tied to PageRank, which Google used actively until 2014: a value that reveals how valuable a page is in Google's eyes, even though it is no longer evaluated directly after that date. Although Google does not use the PageRank score as a standalone ranking factor today, Google representatives have stated at various times that PageRank is still taken into consideration alongside many other ranking factors. John Mueller, who has worked on Google Search for many years, confirmed in a 2020 exchange with a Twitter user that PageRank is still being evaluated.

301 redirects are directly related to this PageRank value. Until 2016, it was commonly held that a page lost roughly 15% of its PageRank through a single 301 redirect, and more than that when several redirects were chained. However, in a 2016 tweet, Google's Gary Illyes stated that 301 and 302 redirects no longer cause a loss of PageRank. Therefore, redirecting a page with a 301 makes the destination page equivalent to the redirected page; in essence, 30X redirects no longer create a difference between pages in the eyes of Google's bots.

Even so, there are some other points to consider in terms of SEO when setting up 301 redirects.

The HTTP version of a page should be redirected to the HTTPS version.

On most websites built with a CMS, this happens automatically. However, a website's secure connection depends on HTTPS and an SSL certificate, so if the site is still reachable over the HTTP version, that URL should be redirected to the HTTPS version.

301-redirected pages should be removed from the sitemap file.

The sitemap is an .xml file that contains all the links of a website and guides Google's bots so they can crawl the site more easily. In some cases, this file still contains URLs that now return a 301 redirect. Since redirected URLs are technically no longer valid destinations, there is no point in keeping them in the sitemap; removing them is important for preserving the crawl budget.

Redirect chains should be avoided.

A redirect chain is a URL that is redirected multiple times in a row. Google warns site owners about this: where Googlebot or a site crawler would have to follow more than one redirect (page 1 > page 2 > page 3), the recommendation is to redirect directly to the final page; where that is not possible, the chain should ideally not exceed 3 redirects and should in any case stay below 5.

Redirect loops should be fixed.

A redirect loop occurs when a URL in a redirect chain redirects back to another URL earlier in the chain, for example: page 1 > page 2 > page 3 > page 2 > page 3 > page 2 > page 3 ... The redirect chain turns into a redirect loop. In such a case, the browser can never reach a final page and shows a warning that the page is redirecting too many times.

This situation is very bad for both Googlebot and site visitors: it directly harms the user experience, and it forces Googlebot into an absurd, endless loop of URLs, which is very damaging for the crawl budget. If a website has such a problem, the URLs caught in the loop should be untangled and a healthy redirect path should be put in place.
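A quick way to spot chains and loops is to follow redirects hop by hop and record every URL visited along the way. The sketch below is a minimal illustration of that idea, assuming the third-party requests library; it is not a full crawler.

```python
# Minimal sketch: follow a URL's redirects hop by hop to reveal
# chains and loops. Assumes the third-party "requests" library.
from urllib.parse import urljoin
import requests

def trace_redirects(url: str, max_hops: int = 10):
    chain = [url]
    seen = {url}
    for _ in range(max_hops):
        response = requests.get(url, allow_redirects=False, timeout=10)
        if response.status_code not in (301, 302, 303, 307, 308):
            return chain, False  # final page reached, no loop
        # Location may be relative, so resolve it against the current URL.
        url = urljoin(url, response.headers.get("Location", ""))
        if url in seen:
            chain.append(url)
            return chain, True   # we have been here before: redirect loop
        seen.add(url)
        chain.append(url)
    return chain, False          # gave up after max_hops redirects

chain, is_loop = trace_redirects("https://example.com/old-page.html")
print(" > ".join(chain), "(loop detected)" if is_loop else "")
```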
Sources

https://developers.google.com/search/docs/advanced/crawling/301-redirects
https://blog.hubspot.com/blog/tabid/6307/bid/45/the-importance-of-google-pagerank-a-guide-for-small-business-executives.aspx
https://developers.google.com/search/docs/advanced/crawling/site-move-with-url-changes?hl=en&visit_id=637868258041107034-3922323913&rd=1
https://ahrefs.com/blog/301-redirects/
https://www.contentkingapp.com/academy/crawl-budget/

How To Do Toxic Backlink Analysis? Disavow Action
Sep 4, 2022 6841 reads

How To Do Toxic Backlink Analysis? Disavow Action

With the importance of search engine optimization (SEO) for brands and platforms growing day by day, backlinks, one of the most influential SEO metrics, have come to the fore. Backlinks, which turned into a big market between 2012 and 2018, are among the SEO signals that search engine algorithms evaluate against many criteria, and they significantly affect a website's visibility.

What is a Backlink?

A backlink, by definition, is a link that a website receives from a different website. For example, if an article shared on a blog links to articles or content on other sites it refers to, those sites have received a backlink. Search engines treat backlinks as a reference for a website's value. Between 2012 and 2018, the more backlinks a website received, the more valuable and visible it was. However, after 2018, when backlinks had turned into a market, users were selling backlinks from networks of artificial websites and the evaluation was being abused, so search engines decided to change their evaluation strategy. At this point the concepts of toxic and useful backlinks emerged.

Useful Backlinks

As the name suggests, useful backlinks are backlinks that positively affect a website's visibility and authority. The most important criteria that determine whether a backlink is useful are the authority and relevance of the linking site. As mentioned at the beginning of this article, search engines changed their backlink evaluation strategy: instead of the sheer number of backlinks, they now treat links obtained naturally (not artificial or purchased) from high-authority sites as a positive signal. To judge whether a backlink is useful, pay attention to the following criteria for the linking site:

- The website's authority is high.
- Its content is original.
- It does not give out so many links that it could be considered spam.
- It has high DR and PR authority values.
- It is closely relevant to the linked page and site.
- It does not contain illegal content (betting, violence, etc.).
- It has no search engine penalty (sandbox, ban, etc.).
- The domain is not newly registered.

Toxic Backlinks

Toxic backlinks, which we can think of as the opposite of useful backlinks, are links that harm a website's visibility and authority. If the linking website shows more than one of the following traits, the backlink brings more harm than good to the linked site:

- The website's authority is low.
- It contains duplicate content.
- It links out to so many websites that it can be considered spam.
- It has low DR and PR authority values.
- It is not relevant to the linked page and site.
- It contains illegal content (betting, violence, etc.).
- It has a past or current search engine penalty (sandbox, ban, etc.).
- The domain has only just been registered.
- The link comes from forums or blog comments with artificial user reviews.

How to Do Toxic Backlink Analysis?

Search engines expect website owners to consciously monitor the backlinks pointing to their sites and to reject the harmful ones via the Search Console disavow tool, which we cover at the end of this article. When performing toxic backlink analysis, the criteria above must be taken into account. Otherwise, disavowing links indiscriminately will also strip the authority passed by useful links and may seriously damage the website's visibility. As an example, let's analyze the toxic backlinks of analyticahouse.com together.

Step 1 - Choosing the Platform to List Backlinks

First of all, we need to see which backlinks point to our website and what their authority is.
Usually, the Links report in Search Console or paid SEO analysis tools are used for this. The links in Search Console are often updated late, and not every platform with an active backlink is shown there, which is why many SEO specialists prefer paid SEO tools for toxic backlink analysis. The best-known options are Ahrefs and Semrush; in this example, we will analyze our website's toxic backlinks in Ahrefs.

Step 2 - Crawling the Domain

We start the crawl and analysis by typing the relevant domain into the domain/URL input in the Ahrefs panel. As the overview shows, analyticahouse.com has a total of 1810 backlinks from 172 websites.

Step 3 - Identifying Linking Domains

In this step, we click the "Linking Domains" section on the left and look at which domains the backlinks come from. The page that opens lists the linking domains together with their DR values. Our first task here is to identify websites with a DR value below 10 and set those domains aside. When we do, we see that most of them are domain-list sites with .pw extensions. If natural, useful websites appear among them, we remove those from the list and keep the remaining domains.

While some of the domains here are domain-list sites, others are sites focused on digital marketing, SEO, and performance advertising. Although it is not entirely fair to put domain-list sites in the toxic backlink category, since search engines can recognize them for what they are, it is a healthy practice to reject the low-authority ones.

Step 4 - Outbound Links of the Backlinking Sites

Another important factor in toxic link analysis is how many other sites the linking domain links out to. The "Dofollow linking domains" column in the Ahrefs report shows how many different sites a domain links to with dofollow links. The important signal here is low- and medium-DR websites that give dofollow links to a very large number of sites. Because sites with an excessive number of dofollow outbound links are likely to be treated as spam by search engines, it makes sense to add them to our disavow list.

Step 5 - Anchor Texts of Backlinks

Another factor that determines the quality of the backlinks a website receives is the anchor text those links use, because users reach the site by clicking these phrases and the site effectively takes its reference from them. To check this, click the "Link Texts" button in the left-hand menu. The view that opens shows which anchor texts the links use and how many domains link with the same anchor text. The key task here is to identify domains that are unrelated to the website or that use empty or misleading anchor texts, and add them to the disavow list.

After adding all of these domains to our disavow list, we can prepare the disavow file that we will upload to Search Console.
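If you export the linking-domains report to a CSV, the DR and outbound-link checks from Steps 3 and 4 can be pre-filtered with a short script before the manual review. The sketch below is only an illustration: the file name, column names, and thresholds are assumptions you would adapt to your own export.

```python
# Minimal sketch: shortlist potentially toxic referring domains from an
# exported report. File name, column names and thresholds are hypothetical;
# adjust them to match your own CSV export.
import csv

candidates = []
with open("linking-domains-export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        domain = row["Referring domain"]
        dr = float(row["DR"] or 0)
        dofollow_targets = int(row["Dofollow linked domains"] or 0)
        # Flag low-authority domains and domains linking out excessively.
        if dr < 10 or dofollow_targets > 1000:
            candidates.append(domain)

print("Review these domains manually before disavowing:")
for domain in sorted(set(candidates)):
    print(domain)
```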
Step 6 - Preparing the Disavow.txt File

The disavow file is the mechanism through which we reject toxic backlinks in Search Console so that those domains can no longer harm our website. The list of domains must be uploaded in a specific format as a .txt file. For this example, we first create a file named "analyticahouse-disavow.txt". After creating the file, we add the domains we want to reject, one per line (a small script that generates this file is sketched at the end of the article).

The point to note here is that whole domains must be entered with the domain: prefix. For example, to reject every link from the domain behind https://sitename.com/page, the line should read domain:sitename.com. If you want to reject links from only one page of a domain rather than the whole domain, paste that URL on its own line in the .txt file. For example, to reject only the links coming from /page instead of everything from sitename.com, adding "https://sitename.com/page" to the relevant .txt file is enough.

Step 7 - Uploading the Disavow File to Search Console

After preparing the disavow.txt file, we log in to the address below:

https://search.google.com/search-console/disavow-links

Then we choose our website from the "Select Property" section. After choosing the website, we click "Upload disavow list" and select the .txt file we prepared.

That is all! We have uploaded our disavow file and rejected the backlinks we believe are toxic to our website. This process should be repeated roughly once a month: detect the new harmful links, add them to the disavow file, and update the uploaded file by choosing "Replace".
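As a small helper for Step 6, the sketch below writes a disavow file in the format described above from two Python lists. The domain and URL values are placeholders from the example; replace them with the entries from your own analysis.

```python
# Minimal sketch: write a disavow.txt in the format Search Console expects
# (domain:example.com for whole domains, full URLs for single pages).
# The entries below are placeholders from the example above.
domains_to_disavow = ["sitename.com"]
urls_to_disavow = ["https://sitename.com/page"]

with open("analyticahouse-disavow.txt", "w", encoding="utf-8") as f:
    for domain in domains_to_disavow:
        f.write(f"domain:{domain}\n")
    for url in urls_to_disavow:
        f.write(f"{url}\n")

print("Disavow file written: analyticahouse-disavow.txt")
```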

What are Google Penalties and How to Remove Them?
Sep 4, 2022 11439 reads

What are Google Penalties and How to Remove Them?

Is your site not getting the traffic it deserves even though you have done your digital marketing work correctly? This is one of the worst scenarios site owners or digital marketers can face, because there is a possibility that the site is under a Google penalty.

Thanks to the power of its search engine, Google effectively controls the search results. In other words, your site must meet certain standards in order to be listed on Google, and sites that do not meet those standards are pushed into the background.

What is a Google Penalty?

A Google penalty means that your site is not appearing properly in search results or that your rankings for target keywords have dropped drastically. When your site is hit by a Google penalty, you cannot reach your target audience, so your traffic and revenue decrease.

A Google penalty is something every site can face, and the reason is often the use of the wrong SEO tactics. Once your site receives a penalty, getting rid of it and rebuilding your reputation is not easy. You may need to take many different actions: review your site, remove whatever caused the penalty, and then try to regain rankings with correct SEO work. While doing so, you should learn what search engines expect from a website so that your site can appear in the search results again. In some cases you may need professional support to recover from a Google penalty.

Google's Fight Against Spam and Poor Results

Why are Google penalties applied? Google's main task is to provide its users with the most relevant search results. A site that tries to trick the search engine and manipulate the results is treated by Google as spam and low quality, and low quality means a worse user experience.

The Google penalty is a natural outcome of the advanced algorithms that help Google crawl and audit sites. Penalties are generally applied by the algorithm, although Google occasionally applies immediate manual actions as well. All penalties exist to protect Google's search results. Google clearly states what it expects from sites in its webmaster guidelines; anything done against these guidelines is considered spam, and sites are penalized when necessary.

Google detects spam sites during the crawling phase. Pages that Google deems spam are not indexed, or if they are, they are kept in the background. When sites seriously violate the guidelines, Google's review team takes manual action.

The Most Common Spam Actions That Cause Google Penalties

Most Google penalties (90% or more) are applied by the algorithm. You may suspect a Google penalty when your site sees sudden traffic drops or ranking losses. The most common actions that can get you penalized are:

- Hosting spyware, adware, viruses, or other malicious software on your site.
- Including hidden links or hidden text on your site.
- Showing search engines different content than you show site visitors.
- Redirecting users to different pages even though they take no action on your site.
- Filling a page with unrelated keywords to gain a ranking advantage.
- Overusing target keywords in the content.
- Having a significant amount of duplicate or copied content on your pages.

The list of situations that can earn you a Google penalty is quite extensive.
However, the reasons above cover most of the penalties applied to sites, and as you can see, the common thread is trying to deceive the search engine.

Actions That Cause Manual Google Penalties

A manual Google penalty is the type you should fear most. If your site has been penalized by a manual action, you will have to work hard to restore it, so stay away from anything that could result in one:

- Tolerating abuse of your site through spam.
- Allowing spam content (comments, forum messages, etc.) created by users.
- Taking advantage of free services whose main purpose is to create spam.
- Abusing structured data or implementing it incorrectly.
- Acquiring unnatural links to your site or buying links in bulk.
- Placing unnatural outbound links on your site or selling links.
- Creating weak content that adds no value for the visitor.
- Using sneaky redirects that users do not notice.
- Hiding keyword-stuffed text in the site content.
- Violating Google News and Google Discover policies.

These are the most important actions that result in a manual Google penalty, even though Google regularly revises its policies and keeps them up to date. In general, acquiring unnatural links and giving unnatural links are the most heavily penalized.

How to Detect Google Penalties?

The process of finding out whether your domain has been penalized by Google is often called a Google ban query. If you have received a manual action, you will be notified through Google Search Console. If your site has been penalized algorithmically, Google will not inform you, but you can use various tools to find out:

- Semrush: You can audit your site with Semrush Sensor. Once the audit is complete, Semrush shows which pages have high volatility, which helps you work out whether your site has been hit by an algorithm update.
- Panguin: The Panguin tool lets you check very quickly whether your site has been penalized by algorithms. To use it, you grant access to your Google Analytics account so you can check your organic traffic data against update dates.
- FE International: With the FE International tool, you simply add your site and review the historical data, checking the relationship between rises and falls in your traffic and algorithm updates.
- MozCast: MozCast gives you an estimate of how turbulent the algorithm currently is. If your traffic has dropped sharply but MozCast looks normal, your site may be facing a potential Google penalty.
- Google Search Console: Google Search Console is your best friend for monitoring your site. If your site has been penalized by manual actions, you can see all the notifications in your property here. You may have to make an extra effort to clear every penalty action you find.
- Rank Ranger: Rank Ranger is ideal if you do not want to wade through too much statistical information. If a report shows a lot of red values and your traffic has dropped, be careful: your site may be a target of the algorithm.
- AccuRanker Grump: The AccuRanker Grump tool tracks Google for you. The grumpier the tiger figure on the site, the greater the volatility in the rankings. If the tiger is grumpy and your site's traffic has decreased, the root cause may be changes in the algorithm.
- Fruition: The Fruition tool is another good option for checking Google penalties. It analyzes traffic changes against historical site data; if it shows many negative signals in your traffic data, you may have received an algorithmic penalty.

Getting a Google penalty is scary, but what should really worry you is not being aware of it while your traffic drops. Using Google penalty check tools, you can find out whether something is wrong on your site.

How to Review Google Penalties Based on Links?

Links are still among the most important ranking factors for Google, and you can face penalties if links point to your site unnaturally. If your site has been penalized because of an unnatural link structure, there are two important checks to run before you can fix it.

1. Checking Link Texts

If Google has penalized you over links, the first thing to check is the anchor texts. After crawling your site in any link analysis tool, examine the anchor text section. In a natural link profile, links mostly use plain URLs or branded terms; in an unnatural profile, the anchor texts are filled with target keywords, which is a clear sign of purchased links. Google's algorithms can detect this behavior, which is why diversity matters so much when links are built.

2. Checking the Links

To review inbound links to your site, use a link analysis tool; its data will clearly show which sites link to you. You can also do this through Google Search Console, but a dedicated link tool is more functional. Do the same check for outbound links. If you have 100 pages on your site and you link out to 1,000 different sites, you are doing something questionable, and getting links from such sites can also cause problems later. Pay attention both to the links you receive and to the links you give.

Checking the anchor texts, the inbound links, and the outbound links is the first step toward avoiding link-based Google penalties. After the checks, get rid of the problematic links and anchor texts.

How to Remove Google Penalties?

Removing a Google penalty is a long process. You may have been penalized for security problems, unnatural links, weak content, duplicate or copied content, or other major problems, and the removal process depends on which type of penalty you received.

1. Security Problems

Security problems on your site lead to very serious consequences. When Google detects that your site has been hacked or carries malware, your search results take a heavy hit, because what happened to your site is flagged in the results.

- Shut down your site immediately: Block access to your site so the incident does not cause more damage; you can do this by returning a 503 error to users. Then change all usernames and passwords associated with the site.
- Inform the necessary people: Tell the other people who work on your site, and any employees, what has happened. You should also contact your hosting provider, explain the situation, and help them take precautions on their side.
- Find the source of the problem: A wide variety of tools can help you understand what is going on on your site. Check your site's health in your Google Search Console property; if Google detects malicious software on your site, it will give you the relevant information there.
- Begin the cleanup: Remove malware, compromised files, and spam from your site. If you have a clean backup taken earlier, you can restore it, provided you will not lose too much data.

After cleaning up the security problems, take the necessary measures to prevent a similar situation. You may not be able to stop every attack on your site, but you can make sure it is not an easy target.

2. Unnatural Links

We have already covered some of what can be done about unnatural links. If Google penalizes you for unnatural links, it can do so for two different reasons: unnatural links pointing to your site, or unnatural links going out from your site.

- Unnatural inbound links: Google thinks the links pointing to your site are unnatural. You can be penalized for things such as purchased links, bulk indexed links, spam blog comments, and involvement in link networks.
- Unnatural outbound links: Google thinks the links leaving your site are unnatural. You can receive this penalty if you have sold links to irrelevant sites or added outbound links in a spammy way.

As soon as Google concludes that the links to and from your site are out of control, you will be hit with a manual penalty. Buying links, exchanging links, acquiring links with keyword-stuffed anchor text, and creating links with automated programs are the main causes of this problem.

- Removing inbound links to your site: This is a long process. First examine your link profile and determine which links are problematic, then reject them in accordance with Google's guidelines.
- Removing outbound links from your site: This is quite simple: remove the links from your site to other sites and everything returns to normal. If malicious software keeps inserting links into your site automatically, remove it as well.

Use capable link tools to decide which links are bad or risky. Ahrefs, Semrush, and Majestic are the right tools to help with this, but remember that each tool has its own evaluation standards.

3. Weak Content

If you have been penalized manually for weak content, this is a relatively easy problem to fix. Weak content is especially common on sites with many affiliate links, because product or service promotions often reuse the seller's own text.

- Remove the content: Do not use text taken from the manufacturer or supplier on your site; every piece of content you publish must be original.
- Check the content: Review pages with too many links and keyword-stuffed copy so they can rank properly.
- Write new content: Write new content to replace what you removed, and remember that you are writing for people.
- Create a content plan: Create a content plan to bring more content onto your site, and try to take advantage of all content types as you do.
If you are not able to create content yourself, consider outsourcing it. It is very important that the content on your site reflects your brand and informs visitors; this is how you get rid of the weak content problem.

4. Duplicate and Copied Content

Google's most important goal is to give its users the best search experience. To do that, it answers many different questions and ranks the sources for each query. Having two different sites rank with exactly the same content at the same time spoils that experience, so Google tries to decide which version of the content is the original. Sites with more original content rank higher, while sites with duplicate and copied content rank lower. There are three different duplicate content situations, and it is important to understand them well.

Your own content repeated on your own site

This is a situation usually seen on e-commerce sites. If your system creates different URLs for the same product, Google can perceive this as an attempt to game the algorithm and you will run into problems. For example, consider the following links:

https://www.site.com/tr/urun/seo/
https://www.site.com/tr/urun/seo?sort=newest
https://www.site.com/tr/urun/seo?sort=best
https://www.site.com/tr/urun/seo?session-id=123

Each of these links actually belongs to the same page; they differ only in the sort parameter and a session token. When these pages all appear in the search results, your own content competes with itself as a copy. To solve the problem, you should use canonical tags.
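To illustrate the idea, the short sketch below derives a single canonical URL by stripping parameters such as sort and session-id from the variants above. It is only a simplified illustration using Python's standard library; in practice the canonical URL is declared with a canonical tag in the page's HTML or chosen by your CMS.

```python
# Simplified illustration: collapse parameterized variants of the same
# product page into one canonical URL by dropping the query string.
# Uses only the Python standard library.
from urllib.parse import urlsplit, urlunsplit

variants = [
    "https://www.site.com/tr/urun/seo/",
    "https://www.site.com/tr/urun/seo?sort=newest",
    "https://www.site.com/tr/urun/seo?sort=best",
    "https://www.site.com/tr/urun/seo?session-id=123",
]

def canonical(url: str) -> str:
    scheme, netloc, path, _query, _fragment = urlsplit(url)
    if not path.endswith("/"):
        path += "/"
    return urlunsplit((scheme, netloc, path, "", ""))

print({canonical(u) for u in variants})  # all four collapse to one URL
```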
Your content being published on other sites

After you write very good content for your site, you may find it published elsewhere. A site that copies your content may even outrank you with it because its other metrics are better than yours. The main thing you can do in such a situation is trust Google: its algorithm is quite capable of identifying the original source of content. When it cannot, you can try a few different routes:

- Request removal of the content: Contact the owner of the site that copied your content and ask them to remove it. If the answer is negative or there is no response within a reasonable time, move on to the next step.
- Request removal from Google: You can file a copyright-based removal request through Google's Remove Content tool. When reporting copied content, you must report each page individually.
- Contact the copying site's host: Once you find the site that copied your content, you can contact its hosting provider. Such providers may terminate the copying site's service because they do not want to deal with legal issues.
- Get support from a lawyer: Legal help can strain your budget, but if the copied content is doing serious damage to your brand or cutting into your income, taking legal action is the best option.

Other people's content on your site

If you use other people's content on your site, your Google rankings will drop, because Google has no interest in offering its users several sites with the same content. This problem is most common on e-commerce sites, where owners fill product pages with descriptions taken directly from the manufacturer, so the same text ends up on hundreds of different sites selling the same product. Either way, you should not make this mistake.

- Completely remove the duplicate content: Remove, one by one, every piece of content on your site that was published elsewhere first. The Copyscape tool makes this easier by detecting duplicate content quickly.
- Write or commission new content: After removing duplicate content, replace it. You can write it yourself using the right keywords, or get support from others.
- Review product descriptions: When major e-commerce sites use product descriptions exactly as they come from the manufacturer, you are unlikely to outrank them with the same text. Add one or two unique paragraphs to make each product description your own.
- Do not forget redirects: If you remove a page entirely, use a 301 redirect to send it to a meaningful, related page. Google will understand that the page has been removed permanently.

Be careful when replacing duplicate content with newly written content, especially if you hire a writer: ask for a guarantee that the text has not been published anywhere else. If it has, your site will continue to carry duplicate content.

Submitting a Reconsideration Request to Google

If your site has received an algorithmic penalty, this part will not help you. If you received a manual penalty, you must request reconsideration after removing whatever caused it. Once you have taken every opportunity to improve the quality of your site, you can submit a reconsideration request, which is simply a request for Google to review your site's compliance with its guidelines.

When you request reconsideration in your Google Search Console property, the most important thing is to tell Google what you have done. Google gives you a text field for this, and you should make the most of it:

- Detail what you have done: You do not need to write dozens of pages. State what you did, why you did it, and what result you expect; adding dates also helps your case.
- Share your solutions: If you were penalized for unnatural links, spell out the fix. If you removed 1,234 unnatural links from your site, say so; Google will understand more clearly what you did.
- Name those responsible: If your site ended up like this after working with an SEO agency, you can place the responsibility on them, state that you no longer work with them, and explain that you are now trying to do things correctly.

Be mindful of the language you use in a reconsideration request: do not be angry, impatient, or accusatory.
After requesting reconsideration, you should wait patiently.

Rejection of a Reconsideration Request

There is no guarantee that the reconsideration request you send to Google will end positively. If it is denied, the Google penalty remains in effect, but the rejection usually comes with more detailed information about the problem on your site.

When your reconsideration request is denied, get back to work. Make the necessary changes using the hints Google gives you; if the problem is link-related, analyze your links again. After you have completely eliminated the problems, you can send another reconsideration request. The more serious the problems on your site, the more likely you are to be rejected repeatedly, so do not neglect any of the necessary steps.

Google penalties are the last thing you want to face. If you do not want to receive them, follow the general rules, be aware that any over-optimized tactic can become a problem, and remember that the rules can change constantly.

What is Keyword Cannibalization and How to Solve It?
Sep 4, 2022 1398 reads

What is Keyword Cannibalization and How to Solve It?

Keyword cannibalization occurs when two or more pages on a website compete with each other for the same keyword or keyword group, undermining each other's rankings. Depending on how these pages were created, the solutions will vary. With the Diversity update, Google aimed to prevent multiple pages from the same domain from filling the top search results for the same query.

What Is the Diversity Update?

Introduced in June 2019, this update took a significant step to curb powerful domains dominating search results. The Diversity update ensures that only one page from a given domain appears in the search results for a specific query, allowing smaller domains to earn organic traffic and offering users more choices. Rather than targeting each keyword with multiple pages as before, the focus shifted to consolidating content into a single strong page that covers the topic comprehensively, which then ranks higher.

https://twitter.com/searchliaison/status/1136739062843432960

Negative Effects of Keyword Cannibalization

- Organic traffic is split between similar pages.
- Conversion rates drop and user experience suffers.
- Backlink equity is divided among competing pages.
- Crawl budget is wasted on redundant pages.

What Keyword Cannibalization Is Not

For two pages ranking for the same keyword to truly cannibalize each other, they must both appear in similar ranking positions. If one page ranks well while the other sits far down the results, that is not cannibalization. Addressing a single keyword by consolidating pages can inadvertently remove traffic from other keywords, causing an overall drop. Be sure true cannibalization exists before you act.

How to Detect Keyword Cannibalization

If a page's traffic suddenly drops or fluctuates, there could be many causes. However, if search volume and average ranking remain stable while impressions and clicks fall, cannibalization may be at work. Your goal is to find pages targeting the same queries or serving the same intent. By merging such pages, you can combine their traffic and help the consolidated page climb in the SERPs. If two pages target similar intent but different queries, differentiate them so each page serves a distinct purpose, for example separate pages for "buy running shoes" and "running shoe reviews".

1. Google Search Console

In Search Console's Performance report, filter by your keyword (Exact Query) and date range, then compare the competing URLs in the Pages tab. If the ranking or traffic of one page drops whenever the other rises, cannibalization is likely (see the sketch after this list for a way to automate the grouping).

2. "site:" Search

Use a site search (e.g., site:example.com "your keyword") to see which pages are indexed for that term. If multiple pages appear, review them for overlap.

3. Advanced Web Ranking

AWR shows how many URLs a domain ranks with for a keyword. Multiple URLs indicate potential cannibalization.

4. Ahrefs

In Ahrefs' Organic Keywords report (Movements), track keyword ranking changes. Sudden swaps between two URLs for the same keyword suggest cannibalization.
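If you can export query and page pairs from Search Console (for example through the Search Console API or a reporting connector), the grouping step can be automated. The sketch below is a minimal illustration; the CSV file name and its Query, Page, and Clicks column names are assumptions to adapt to your own export.

```python
# Minimal sketch: flag queries for which more than one URL gets clicks.
# Assumes a CSV export of query-page pairs with "Query", "Page" and
# "Clicks" columns; adjust the names to match your own export.
import csv
from collections import defaultdict

pages_per_query = defaultdict(set)
with open("gsc-query-page-export.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        if int(row.get("Clicks", 0) or 0) > 0:
            pages_per_query[row["Query"]].add(row["Page"])

for query, pages in pages_per_query.items():
    if len(pages) > 1:
        print(f"Possible cannibalization for '{query}':")
        for page in sorted(pages):
            print(f"  {page}")
```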
Causes & Remedies for Keyword Cannibalization

Cannibalization can stem from various issues; the solution depends on the root cause.

– Duplicate Content

If identical content lives on multiple URLs, consolidate them into one page (301 redirect the rest) so only a single URL ranks.

– Similar Content

Google may treat highly similar pages as duplicates. Differentiate the content, or merge and redirect weaker pages to the strongest version.

– Thin Content

Pages with thin content fail to provide unique value and may be seen as duplicates. Enrich product and category pages with unique details and user intent signals.

– No Dedicated Landing Page

If no page is optimized for a keyword, Google may default to your homepage. Create a dedicated landing page for each target keyword.

– Indexable URL Parameters

Indexable parameters can create multiple URLs for the same content. Use canonical tags, noindex, or Search Console's parameter handling to address this.

– Internal Linking Structure

Internal links and anchor text guide Google's understanding. Link relevant pages consistently to highlight your primary URL for each keyword.

– Backlink Profile

Inbound links and their anchor text also signal relevance. Make sure external links use the correct URL and keyword anchor to avoid splitting link equity.

Conclusion

Detecting and fixing cannibalization will improve your visibility and rankings. Before creating new content, check whether you already have pages targeting the same topic; it is often better to expand and optimize an existing page than to create a new one. If no competing page exists, include related keywords and topics to broaden the page's coverage. Optimize internal linking and site hierarchy to focus authority on your primary page for each keyword, and offer a single, comprehensive, well-optimized page for each search intent.

How Google Ads Keyword Match Types Work with 10 Examples
Sep 4, 2022 7988 reads

How Google Ads Keyword Match Types Work with 10 Examples

Knowing how match types work is crucial for getting effective results from Search Network campaigns, the most used campaign type in Google Ads. The match type you choose when setting up your campaign determines how efficiently you reach your audience. To speak directly to user intent and secure a spot on the search results page, you need to understand the match types thoroughly. By using different match types, you can minimize unwanted matches with irrelevant queries and achieve a higher conversion rate. There are three keyword match types: broad match, phrase match, and exact match.

1 – Broad Match

Broad match is the most expansive match type, helping you reach the widest audience. Your ad may show whenever the user's query contains any part of your keyword, whether before, after, or between other words. While broad match can drive maximum reach, it also risks matching irrelevant searches, which can waste budget. Better alignment between ad relevance and user intent improves click-through rate (CTR) and lowers cost-per-click (CPC).

If your goal is to drive maximum traffic to your page, you can use broad match. However, you must regularly review search terms and add negatives to exclude unwanted queries; otherwise, your budget may be spent on users who are unlikely to convert. Broad match makes sense if you do not have a detailed keyword list, since Google's algorithm can help find the most relevant matches for you. You can then exclude irrelevant terms via your negative keyword list.

2 – Phrase Match

Phrase match is ideal for targeting your keyword plus its close variations, with more control than broad match. Your ad will appear when a user's query includes your keyword phrase in the exact order you specify, though extra words can come before or after. To set up phrase match, enclose your keyword in quotation marks: "keyword phrase". Phrase match offers less volume than broad match but delivers higher-quality traffic, improving conversion rates and budget efficiency.

3 – Exact Match

Exact match is the most restrictive match type, letting you reach a very specific audience with precise search intent. Your ad displays only when a user's query exactly matches your keyword or its close variants. This yields a high CTR and strong conversion rate, though it may raise CPC due to limited reach. To use exact match, wrap your keyword in square brackets: [keyword].

Negative Keyword Lists

Negative keyword lists help you exclude multiple unwanted terms at once, rather than adding them one by one to individual campaigns. After choosing your match types, compile a list of irrelevant terms and apply it across campaigns. This saves effort and prevents budget waste on non-converting searches.

For example, if you sell custom-designed T-shirts and use phrase match "custom designed t-shirt", you might see unrelated queries about DIY designs or tutorials. Regularly review your search terms and add negative terms to keep your campaign focused. Likewise, if you are a store selling purple potatoes, use exact match [purple potatoes] to avoid recipe or planting queries. Better yet, add common irrelevant terms directly to your negative list when you launch the campaign.

Example Setups

1. Keyword: "custom designed t-shirt" (phrase match)
2. Keyword: [purple potatoes] (exact match)
3. Keyword: "Germany to Turkey flight ticket" (phrase match)
4. Keyword: [engagement ring] (exact match)
5. Keyword: "sunflower oil" (phrase match)
6. Keyword: [red dress] (exact match)
7. Keyword: "faux leather blazer" (phrase match)
8. Keyword: [women's shoes] (exact match)
9. Keyword: "Turkish coffee" (phrase match)
10. Keyword: [bungalow house prices] (exact match)
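As a rough mental model of the phrase and exact match behavior described above (and of the quote and bracket syntax used in the example setups), the sketch below classifies a query against a keyword. It is deliberately simplified: real Google Ads matching also considers close variants, misspellings, and semantic similarity, which this illustration ignores.

```python
# Deliberately simplified illustration of the matching logic described
# above. Real Google Ads matching also handles close variants and
# semantics; this sketch only checks word order and exact wording.

def exact_match(keyword: str, query: str) -> bool:
    # [keyword]: the query must be the keyword itself.
    return query.lower().split() == keyword.lower().split()

def phrase_match(keyword: str, query: str) -> bool:
    # "keyword": the keyword's words must appear in the query,
    # in the same order, with extra words allowed before or after.
    kw, q = keyword.lower().split(), query.lower().split()
    return any(q[i:i + len(kw)] == kw for i in range(len(q) - len(kw) + 1))

print(phrase_match("custom designed t-shirt", "buy custom designed t-shirt online"))  # True
print(exact_match("purple potatoes", "how to plant purple potatoes"))                 # False
print(exact_match("purple potatoes", "purple potatoes"))                              # True
```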