AnalyticaHouse

Marketing tips, news and more

Explore expert-backed articles on SEO, data, AI, and performance marketing. From strategic trends to hands-on tips, our blog delivers everything you need to grow smarter.

Effects of IP Class and Server on SEO
Sep 4, 2022 1248 reads


Search Engine Optimization (SEO) is a field whose importance grows every day with digitalization and the growth of e-commerce. Website owners who conduct SEO work can sometimes overlook important metrics. The IP class of a domain and server-side scans are among these important SEO metrics.

What is a Class IP?

Class IPs are the names given to the classes into which IP addresses, and therefore the websites hosted on them, are grouped. There are 5 class IP categories:

- Class A IP: 1.0.0.1 to 126.255.255.254
- Class B IP: 128.1.0.1 to 191.255.255.254
- Class C IP: 192.0.1.1 to 223.255.254.254
- Class D IP: 224.0.0.0 to 239.255.255.255
- Class E IP: 240.0.0.0 to 254.255.255.254

The Impact of Class IPs on SEO

Class IPs represent the IP ranges where websites are located. Class D addresses are reserved for multicast and Class E addresses for experimental use, so they are not assigned to public websites. Among Class A, B, and C addresses, the most valuable are Class A IPs, because they are harder to obtain and represent the group of IPs where many valuable websites on the internet are hosted. Access to these IPs is difficult and costly. For this reason, websites with Class A IPs are highly valued by search engines, and they are said to receive index and authority priority. About 75–80% of the websites on the web sit on Class B and Class C IPs, which shows how scarce Class A IPs are.

Server and SEO Relationship

One of the common mistakes made after publishing a website is starting SEO efforts without checking which server the website is hosted on. Hosting companies generally place websites on shared servers, so multiple websites can be hosted on the same server as yours. This can hurt your website's visibility if one of the websites on the same server:

- is penalized by search engines,
- hosts illegal, gambling, violent, or similar content,
- contains duplicate content,
- falls into spam or sandbox status.

In such cases, the server's IP address may be restricted by search engines, negatively impacting your website's visibility as well. To prevent this, a server scan must be performed.

How to Perform a Server Scan?

There are many free platforms you can use to perform a server scan and learn which other websites are hosted on the same server as yours. Our recommendation is to use the two platforms below.

https://wmaraci.com/sunucu-tarama
https://www.hosttescil.com/seo/sunucu-tarama/

After visiting either address, simply paste your website into the query box and run the scan. Once the scan is complete, the other websites hosted on the same server will be listed. Check these websites, and if any of them is harmful, report it to your hosting company immediately.

How to Move Off a Server Hosting Harmful Sites?

If you think your website may be harmed by other sites hosted on the same server, you can purchase a static IP address for your website, ensuring it has its own IP address instead of the shared server IP. Another method is to contact your hosting company, inform them of the situation, and request that your website be moved to a different server.
Hosting companies usually handle such cases with understanding and will move your website to a different server without charging you. Additionally, during your scan you can view the websites sharing the same Class C IP address as yours; if any of them look harmful, you can again contact your hosting provider and request to be moved to a different IP range.
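If you want to check your own server's IP class quickly, the classful ranges above reduce to a lookup on the first octet. Below is a minimal Python sketch; example.com is a placeholder for your own domain.

    import socket

    def ip_class(ip: str) -> str:
        """Return the classful class (A-E) of an IPv4 address from its first octet."""
        first = int(ip.split(".")[0])
        if 1 <= first <= 126:
            return "A"
        if first == 127:
            return "loopback"
        if 128 <= first <= 191:
            return "B"
        if 192 <= first <= 223:
            return "C"
        if 224 <= first <= 239:
            return "D"  # multicast, not used for public websites
        return "E"      # experimental / reserved

    domain = "example.com"  # placeholder: use your own domain
    ip = socket.gethostbyname(domain)
    print(f"{domain} resolves to {ip} (Class {ip_class(ip)})")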

How To Do Toxic Backlink Analysis? Disavow Action
Sep 4, 2022 6841 reads


As the importance of search engine optimization (SEO) for brands and platforms grows day by day, backlinks, one of the most influential SEO metrics, have come to the fore. Backlinks, which became a big market between 2012 and 2018, are among the SEO metrics that search engine algorithms evaluate against many criteria, and they significantly affect website visibility.

What is a Backlink?

A backlink, by definition, is a link that a website receives from a different website. For example, if an article shared on a blog links to articles on the other sites it references, each of those links is a backlink for the referenced site. Search engines treat backlinks as references to a website's value. Between 2012 and 2018, the more backlinks a website received, the more valuable and visible it was. After 2018, however, when backlinks had turned into a market, users were selling backlinks from networks of artificial websites, and backlink evaluation was being abused, search engines decided to change their evaluation strategy. This is where the concepts of toxic and useful backlinks come in.

Useful Backlinks

As the name suggests, useful backlinks are backlinks that positively affect a website's visibility and authority. The most important criteria that determine the usefulness of a backlink are the authority and relevance of the linking site. As mentioned above, search engines have changed their strategy in backlink evaluation: they now treat naturally obtained (not artificial or purchased) backlinks from high-authority sites as a positive signal, rather than counting raw backlink numbers. To judge that a backlink is useful, look for the following criteria:

- The linking website has high authority.
- It publishes original content.
- It does not give out so many links that it could be considered spam.
- It has high DR and PR authority values.
- It is closely relevant to the linked page and site.
- It does not contain illegal content (betting, violence, etc.).
- It has no penalty (sandbox, ban, etc.) from search engines.
- Its domain is not newly registered.

Toxic Backlinks

Toxic backlinks, the opposite of useful backlinks, are links that harm a website's visibility and authority. If the linking website shows more than one of the following traits, the backlink brings more harm than good:

- The linking website has low authority.
- It contains duplicate content.
- It links out to so many websites that it could be considered spam.
- It has low DR and PR authority values.
- It is not relevant to the linked page and site.
- It contains illegal content (betting, violence, etc.).
- It has a past or current penalty (sandbox, ban, etc.) from search engines.
- Its domain has only just been registered.
- The link comes from forums or blog comments with artificial user reviews.

How to Do Toxic Backlink Analysis?

Search engines expect website owners to consciously monitor the backlinks pointing to their sites and to reject harmful ones via the Search Console Disavow tool, covered at the end of this article. When performing toxic backlink analysis, the criteria above must be taken into account. Otherwise, a careless disavow operation will also strip the authority of useful links and may greatly damage the visibility of the website. A simple way to pre-filter candidate domains is sketched below.
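As one way to apply these criteria at scale, this sketch filters a hypothetical CSV export from a backlink tool. The column names (domain, dr, dofollow_outbound_domains, relevant) and the thresholds are assumptions for illustration, not a fixed format; following the rule of thumb above, a domain is flagged only when it trips more than one criterion.

    import csv

    DR_THRESHOLD = 10          # assumed cutoff for "low authority"
    OUTBOUND_THRESHOLD = 1000  # assumed cutoff for spam-level outbound linking

    def disavow_candidates(path: str) -> list[str]:
        """Flag domains that match two or more toxicity criteria."""
        candidates = []
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                low_dr = int(row["dr"]) < DR_THRESHOLD
                spammy = int(row["dofollow_outbound_domains"]) > OUTBOUND_THRESHOLD
                irrelevant = row["relevant"].strip().lower() == "no"  # manual review column
                if sum([low_dr, spammy, irrelevant]) >= 2:
                    candidates.append(row["domain"])
        return candidates

    print(disavow_candidates("backlinks.csv"))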
For example, let's analyze the toxic backlinks of analyticahouse.com together.

Step 1 - Choosing the Platform to List Backlinks

First of all, we need to see which backlinks point to our website and what their authority is. Usually the Search Console > Links report or paid SEO analysis tools are used for this. The links in Search Console are updated quite late, and not all platforms with active backlinks are displayed there. That is why many SEO experts prefer paid SEO tools when analyzing toxic backlinks; the best-known options are Ahrefs and Semrush. Today, we will analyze the toxic backlinks of our website in Ahrefs.

Step 2 - Crawling the Domain

We start the crawl and analysis by typing our domain into the domain/URL input in the Ahrefs panel. In our crawl, analyticahouse.com has a total of 1810 backlinks from 172 websites.

Step 3 - Identifying Linking Domains

Next, we click on the "Linking Domains" section on the left to see which domains the backlinks come from. On the page that opens, we see the list of domains giving backlinks and their DR values. Our first task here is to identify websites with a DR value below 10 and set those domains aside. When we do, we see that most of the links come from domain-list sites with .pw extensions. If any of these links come from natural and useful websites, we remove them from the list and keep only the remaining domains. While some of the domains here are domain-list sites, others are sites focused on digital marketing, SEO, and performance advertising. Although domain-list sites do not strictly belong in the toxic backlink category, and search engines can recognize and make sense of them, it is a healthy practice to reject low-authority domain-list sites.

Step 4 - Outbound Link Counts of Backlinking Sites

Another important factor in toxic link analysis is how many other sites a linking domain links out to. The "Dofollow linking domains" column in the Ahrefs report shows how many different sites a domain gives dofollow links to. The red flag here is a low- or medium-DR website giving dofollow links to a very large number of sites. Since websites with too many dofollow outbound links will be treated as spam by search engines, it is logical to add these websites to our disavow list.

Step 5 - Anchor Texts of Backlinks

Another factor that determines the quality of the backlinks a website receives is the anchor text those links use, because users reach the site by clicking on these keywords and the site effectively takes its reference from them. To review anchors, click the "Link Texts" button in the left-hand menu. In the window that opens, we can see which anchor texts the links use and how many domains link with the same anchor text. The goal here is to identify domains that are unrelated to the website or that use empty or misleading anchor texts, and add them to the disavow list. After adding all of these domains to our disavow list, we can prepare the disavow file that we will upload to Search Console.

Step 6 - Preparing the Disavow.txt File

The disavow file is the mechanism through which we tell Search Console to reject toxic backlinks, so that these domains can no longer harm our website.
The domain list uploaded to this tool must follow a specific format and have a .txt extension. First we create a file named "analyticahouse-disavow.txt". After creating the file, we add the domains we want to reject, one per line. The important point is that whole domains must be entered as domain entries only: for example, to reject every link from the domain at https://sitename.com/page, it must be entered as domain:sitename.com. If you want to reject links from only one page of a domain rather than the whole domain, paste that URL on its own line in the txt file. For example, to reject only the links from /page instead of all links from the sitename.com domain, it is sufficient to add "https://sitename.com/page" to the .txt file.

Step 7 - Uploading the Disavow File to Search Console

After preparing the disavow.txt file, we log in at the address below.

https://search.google.com/search-console/disavow-links

Then we choose our website from the "Select Property" section. After choosing the website, we click "Upload disavow list" and select the .txt file we prepared. That is all! We have successfully uploaded our disavow file and rejected the backlinks we believe are toxic to our website. This process should be repeated once a month: detect newly appeared harmful links, add them to the disavow file, and update the file by clicking "Replace".
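Writing the file by hand works fine; as a convenience, here is a minimal Python sketch that generates a disavow file in the format described above (domain: entries for whole domains, full URLs for single pages, and # for comment lines). The domain and URL values are hypothetical.

    # Hypothetical inputs gathered during the analysis above
    domains = ["spammy-list-site.pw", "low-dr-site.com"]
    single_urls = ["https://sitename.com/page"]

    with open("analyticahouse-disavow.txt", "w", encoding="utf-8") as f:
        f.write("# Disavow list - review before uploading to Search Console\n")
        for domain in domains:
            f.write(f"domain:{domain}\n")   # rejects every link from the domain
        for url in single_urls:
            f.write(f"{url}\n")             # rejects links from a single page only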

What are Google Penalties and How to Remove Them?
Sep 4, 2022 11439 reads


Is your site not getting the traffic it deserves even though you have done your digital marketing work correctly? This is one of the worst scenarios site owners and digital marketers can face, because the site may be facing a Google penalty. Google's power as a search engine gives it the ability to control search results. In other words, your site must meet certain standards in order to be listed on Google, and sites that do not meet those standards are pushed into the background.

What is a Google Penalty?

A Google penalty means that your site isn't appearing properly in search results or that your rankings for target keywords have dropped drastically. When your site is hit by a Google penalty, you cannot reach your target audience, so your traffic and income decrease. A Google penalty is a situation every site can face, and the cause is often incorrect SEO work. Once your site gets a Google penalty, getting rid of it and rebuilding your reputation is not easy. You may need to take many different actions to recover. You should review your site, remove whatever caused the penalty, and then try to regain rankings with correct SEO work. While doing so, learn what search engines require of a website so your site can appear in search results again. In some cases, you may need professional support.

Google's Fight Against Spam and Poor Results

Why are Google penalties applied? Google's main task is to provide its users with the most relevant search results. A site that tries to trick the search engine and manipulate the results is treated by Google as spam and poor quality, and poor quality means a worse user experience. The Google penalty is a natural product of the advanced algorithms that help Google crawl and audit sites. Penalties are generally applied by the algorithm, although Google occasionally applies immediate manual actions as well. All penalties exist to protect Google's search results. Google states clearly in its webmaster guidelines what it expects from sites; anything done against these guidelines is considered spam, and sites are penalized when necessary. Google detects spam sites during the crawling phase. Pages Google deems spam are not indexed, or if they are, they are kept in the background. Manual action is taken by Google's review team when sites seriously violate the guidelines.

The Most Common Spam Actions That Cause Google Penalties

Most Google penalties (90% or more) are applied by the algorithm. You may suspect a Google penalty when your site shows sudden traffic drops or ranking losses. The most common actions that can get you penalized are:

- Hosting spyware, adware, viruses, or other malicious software on your site.
- Including hidden links or hidden text on your site.
- Showing search engines different content than you show site visitors (cloaking).
- Redirecting users to different pages even though they take no action on your site.
- Filling a page with unrelated keywords to gain a ranking advantage.
- Overusing targeted keywords in the content.
- Having a significant amount of duplicate or copied content on your pages.

The list of situations that can earn you a Google penalty is quite extensive.
However, the reasons above cover most penalties applied to sites, and as you can see, the common thread is trying to deceive the search engine.

Actions That Cause Manual Google Penalties

The manual Google penalty is the type of penalty you should fear most. If your site has been penalized by a manual action, you will have to work hard to restore it, so stay away from anything that could result in one:

- Tolerating spam abuse of your site.
- Allowing spam content (comments, forum messages, etc.) created by users.
- Taking advantage of free services whose main purpose is to create spam.
- Abusing structured data or implementing it incorrectly.
- Acquiring unnatural links to your site or buying links in bulk.
- Creating unnatural outbound links from your site or selling links.
- Creating weak content that adds no value to the visitor.
- Performing sneaky redirects that users don't notice.
- Using hidden text stuffed with keywords in the site content.
- Violating Google News and Google Discover policies.

These are the most important actions that result in a manual Google penalty, even as Google regularly revises its policies to keep them up to date. In general, acquiring unnatural links and giving unnatural links are the most heavily penalized.

How to Detect Google Penalties?

The process of learning whether your domain has been penalized by Google is often called a Google ban query. If you have received a manual-action penalty, you will be notified through Google Search Console. If your site has been penalized by an algorithm, Google will not inform you, but you can use various tools to find out:

- Semrush: You can audit your site by going to Semrush Sensor. Once the audit is complete, Semrush shows which pages have high volatility, and from that volatility you can infer whether your site has been hit by an algorithm.
- Panguin: The Panguin tool lets you find out very quickly whether your site was penalized by an algorithm. To use it, you grant access to your Google Analytics account so it can check your organic traffic data.
- FE International: With the FE International tool, you add your site and review historical data, checking the link between the rises and falls in your traffic and algorithm updates.
- MozCast: MozCast gives you an estimated value for changes in the algorithm. If your traffic has dropped drastically but MozCast data looks normal, your site may be facing a Google penalty.
- Google Search Console: Search Console is your best friend for monitoring your site. If your site has been penalized by manual actions, you can see all notifications in your property. Each penal action you detect may require extra effort to remove.
- Rank Ranger: Rank Ranger is ideal for those who don't want to wade through statistics. If a report shows lots of red values and your traffic has dropped, be careful: your site may be a target of the algorithm.
- AccuRanker Grump: The AccuRanker Grump tool tracks Google's mood. The grumpier the tiger figure on the site, the greater the volatility in the rankings. If the tiger is grumpy and your site traffic has decreased, the root cause may be changes in the algorithm.
- Fruition: The Fruition tool is ideal for checking Google penalties. It analyzes traffic changes against historical site data. If your traffic data shows many negative swings, you may have received an algorithm penalty.

Getting a Google penalty can be scary, but what should really bother you is being unaware of it. As your site's traffic drops you may not notice what's going on; the Google penalty check tools above will tell you whether there is a problem.

How to Review Google Penalties Based on Links?

Links are still among the most important ranking factors for Google, and you may face penalties if your site acquires links unnaturally. If your site has been penalized because of an unnatural link structure, you should perform two important checks.

1. Checking Link Texts

If you have been penalized by Google for links, the first thing to check is the link texts. After crawling your site in any link analysis tool, examine the anchor text section. In a natural link profile, the links are mostly explicit URLs or branded words. In an unnatural link profile, the link texts are stuffed with target keywords, a clear indication of purchased links. Google's algorithms can detect this behavior, which is why anchor diversity matters so much when building links.

2. Checking the Links

To check inbound links to your site, use a link analysis tool; the data will clearly show the sites linking to yours. You can also do this in Google Search Console, but a dedicated link tool is more functional. Do the same check for outbound links. If you have 100 pages on your site and link out to 1,000 different sites, you are doing something objectionable; likewise, receiving links from such sites can cause problems later. Pay attention both to the links your site receives and the links it gives. Checking the link texts, the inbound links, and the outbound links is the first step to avoiding link-based Google penalties. After the checks, get rid of the problematic links and link texts. A quick way to check anchor diversity is sketched below.
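As a quick way to eyeball anchor diversity, this sketch tallies anchor texts from a hypothetical CSV export of a link tool; the file name and the "anchor" column are placeholder assumptions.

    import csv
    from collections import Counter

    with open("backlinks.csv", newline="", encoding="utf-8") as f:
        anchors = [row["anchor"].strip().lower() for row in csv.DictReader(f)]

    counts = Counter(anchors)
    total = sum(counts.values())
    for anchor, n in counts.most_common(10):
        print(f"{anchor!r}: {n} links ({n / total:.0%})")
    # In a natural profile, bare URLs and brand names dominate; a large share
    # of exact-match commercial keywords is the warning sign described above.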
How to Remove Google Penalties?

Removing a Google penalty is quite a long process. You may be penalized for security problems, unnatural links, weak content, duplicate or copied content, and other major problems; whichever penalty you received, your removal process should match it.

1. Security Problems

Security problems cause very serious trouble. When Google detects that your site has been hacked or carries malware, you will be hit hard in the search results, because what happened to your site is flagged there.

- Shut down your site immediately: Block access so the problem doesn't spread; you can do this by returning a 503 error to users. Then change all usernames and passwords associated with the site.
- Inform the necessary people: Tell anyone else working on the site about the situation. Contact your server provider, explain what happened, and let them take precautions on their side.
- Find the source of the problem: A wide variety of tools can help you understand what's going on. Check your site's health in your Google Search Console property; if Google detects malicious software on your site, it will give you the necessary information about it.
- Begin the cleaning process: Remove malware, compromised files, and spam from your site. If you have a clean backup taken earlier, you can restore it, provided you won't lose much data.

After clearing the security problems, take measures to avoid a repeat. You may not be able to prevent every attack, but you can make sure your site is not an easy target.

2. Unnatural Links

We have already covered some of what can be done about unnatural links. If Google has penalized you for them, it does so for one of two reasons: unnatural links pointing to your site, or unnatural links going out from it.

- Unnatural inbound links: Google thinks the links pointing to your site are unnatural. You can be penalized for purchased links, link indexes, spam blog comments, and involvement in link networks.
- Unnatural outbound links: Google thinks the links leaving your site are unnatural. You can get this penalty if you sold links to irrelevant sites or created spam-like outbound links.

As soon as Google concludes that the links to and from your site are out of control, you get a manual penalty. Buying links, exchanging links, acquiring links with keyword-stuffed anchor texts, and creating links with automated programs are the main causes.

- Removing inbound links: This takes a long time. First examine your link profile and determine which links are problematic, then reject them in accordance with Google's guidelines.
- Removing outbound links: This is simple. Remove the links from your site to other sites and everything returns to normal. If malicious software keeps placing links on your site automatically, remove it as well.

Use advanced link tools to decide which links are bad or risky; Ahrefs, Semrush, and Majestic are the right tools for this. Keep in mind that each tool has its own evaluation standards.

3. Weak Content

If you've been penalized manually for weak content, the problem is relatively easy to fix. Weak content is especially common on sites with many affiliate links, because product or service promotions reuse the selling company's text.

- Remove content: Don't use text from the manufacturer or supplier on your site; everything you publish must be original.
- Check the content: Review content with too many links and keyword-stuffed text written just to chase rankings.
- Write new content: Replace what you removed with new content, and remember that you are writing for people.
- Create a content plan: Plan for more content on your site, drawing on every content type. If you can't produce it yourself, consider outsourcing.
It is very important that the content on your site reflects your brand and informs visitors; that is how you escape the weak content problem.

4. Duplicate and Copied Content

Google's most important goal is to give its users the best search experience. To do that, it answers different questions and ranks the sources for each query. Two different sites ranking with the same content at the same time spoils that experience, so Google tries to decide which content is more original: sites with original content rank higher, while sites with duplicate or copied content rank lower. There are three duplicate-content situations, and it is necessary to understand each of them.

Your own content repeated on your own site

This is common on e-commerce sites. If your system creates different URLs for the same product, Google perceives it as an attempt to cheat the algorithm and you will have problems. For example, consider the following links:

https://www.site.com/tr/urun/seo/
https://www.site.com/tr/urun/seo?sort=newest
https://www.site.com/tr/urun/seo?sort=best
https://www.site.com/tr/urun/seo?session-id=123

Each of these links actually belongs to the same page; only the sort method or session token differs. When these pages all reach the search results, you are effectively competing against copies of your own content. To solve the problem, use canonical tags, as sketched below.
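As an illustration of what the canonical target should be, here is a small Python sketch that maps the parameterized variants above back to one canonical URL. The parameter names are taken from the example links, and the resulting URL is what belongs in the page's canonical tag.

    from urllib.parse import urlsplit, urlunsplit, parse_qsl, urlencode

    # Parameters that change presentation only, not content (from the example URLs)
    DUPLICATE_PARAMS = {"sort", "session-id"}

    def canonical_url(url: str) -> str:
        """Drop presentation-only query parameters so every variant maps
        back to the single URL the canonical tag should point to."""
        parts = urlsplit(url)
        kept = [(k, v) for k, v in parse_qsl(parts.query) if k not in DUPLICATE_PARAMS]
        return urlunsplit((parts.scheme, parts.netloc, parts.path, urlencode(kept), ""))

    print(canonical_url("https://www.site.com/tr/urun/seo?sort=newest"))
    # -> https://www.site.com/tr/urun/seo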
Publishing your content on other sites

After you write very good content for your site, you may find it published elsewhere. A site that copies your content may even outrank you with it because its overall metrics are better than yours. The main thing you can do is trust Google: its algorithm is quite capable of identifying the original source of content. In the cases where it fails, you can try several routes:

- Request removal of the content: Contact the owner of the site that copied your content and ask for its removal. If there is no positive answer within a reasonable time, move to the next step.
- Request removal from Google: You can request removal for copyright reasons through Google's Remove Content tool. When reporting copies, you must report each page individually.
- Contact the copying site's host: Once you find the site that copied your content, you can contact its hosting provider. Hosts often terminate service to content-copying sites because they don't want to deal with legal trouble.
- Get support from a lawyer: Legal help can strain your budget, but if the copied content is doing real damage to your brand or your income, taking legal action is best.

Including other people's content on your site

If you use other people's content on your site, your Google rankings will drop, because Google has no interest in offering its users different sites with the same content. This problem, too, is typical of e-commerce sites: site owners fill product descriptions with text taken directly from the manufacturer, so the same description appears on hundreds of sites selling the same product. Don't make this mistake.

- Completely remove duplicate content: Remove, one by one, everything on your site that was published elsewhere. The Copyscape tool makes this easier by detecting duplicate content quickly.
- Write or commission new content: After removing duplicates, replace them. You can write the content yourself using the right keywords, or get support from others.
- Review product descriptions: When major e-commerce sites use the manufacturer's stock description, you are unlikely to outrank them with the same text. Add one or two original paragraphs to make each product description unique.
- Don't forget redirects: If you remove a page entirely, use a 301 redirect to point it at a meaningful related page; Google will understand the page is gone permanently.

Be careful when substituting new content for duplicates. Especially if you hire a writer, ask for a guarantee that the content is not published elsewhere; otherwise your site will simply go on hosting duplicates.

Submitting a Reconsideration Request to Google

If your site received an algorithm-based penalty, this part will not help you. If you received a manual penalty, you must request reconsideration after removing whatever caused it. Once you have taken every opportunity to improve your site's quality, you can turn to a reconsideration request: a simple request asking Google to re-review your site for compliance with its guidelines. When you file it from your Google Search Console property, the most important thing is to tell Google what you did. Google gives you a text field for this, and you should make the most of it:

- Detail what you've done: You don't need dozens of pages. Write down what you did, why, and what results you expect; specifying dates also strengthens your case.
- Share your solutions: If you were penalized for unnatural links, state your remedy. If you removed 1,234 unnatural links, write that, so Google understands exactly what you did.
- Name those responsible: If your site ended up like this after working with an SEO agency, you can place the responsibility on them, say you no longer work together, and explain that you are now trying to do the right things.

Mind your language when requesting reconsideration: nothing angry, impatient, or accusatory. After submitting, wait patiently.

Rejection of a Reconsideration Request

There is no guarantee that the reconsideration request you send to Google will be answered positively.
If your reconsideration request is denied, the Google penalty remains in effect, but the rejection usually gives you more detailed information about the problem on your site. You must then get back to work: make the necessary fixes using the hints Google provides, and if the problem concerns links, analyze your links again. After you have completely eliminated the problems, send another reconsideration request. The more serious your site's problems, the more likely you are to be rejected repeatedly, so don't neglect any of the necessary steps. Google penalties are the last thing you want to face. To avoid them, follow the general rules, remember that any over-optimization can become a problem, and keep in mind that the rules change constantly.

What is Keyword Cannibalization and How to Solve It?
Sep 4, 2022 1398 reads


Keyword cannibalization occurs when two or more pages on a website compete for the same keyword or keyword group, undermining each other's rankings. Depending on how these pages were created, the solutions vary. With the Diversity update, Google aimed to prevent multiple pages from the same domain from filling the top search results for the same query.

What Is the Diversity Update?

Introduced in June 2019, this update took a significant step to curb powerful domains dominating search results. The Diversity update ensures that, as a rule, only one page from a given domain appears in the results for a specific query, allowing smaller domains to earn organic traffic and offering users more choice. Rather than targeting each keyword with multiple pages as before, the focus shifted to consolidating content into a single strong page that covers the topic comprehensively and therefore ranks higher.

https://twitter.com/searchliaison/status/1136739062843432960

Negative Effects of Keyword Cannibalization

- Organic traffic is split between similar pages.
- Conversion rates drop and user experience suffers.
- Backlink equity is divided among competing pages.
- Crawl budget is wasted on redundant pages.

What Keyword Cannibalization Is Not

For two pages ranking for the same keyword to truly cannibalize each other, they must appear in similar ranking positions. If one page ranks well while the other sits far down the results, that is not cannibalization. Consolidating pages to fix a single keyword can inadvertently remove traffic from other keywords and cause an overall drop, so be sure true cannibalization exists before you act.

How to Detect Keyword Cannibalization

If a page's traffic suddenly drops or fluctuates, there could be many causes. But if search volume and average ranking remain stable while impressions and clicks fall, cannibalization may be at work. Your goal is to find pages targeting the same queries or serving the same intent. By merging such pages, you can combine their traffic and help the consolidated page climb in the SERPs. If two pages target similar intent but different queries, differentiate them so each page serves a distinct purpose, for example user-intent pages for "buy running shoes" versus "running shoe reviews."

1. Google Search Console

In Search Console's Performance report, filter by your keyword (Exact Query) and date range, then compare the two URLs in the Pages tab. If one page's rankings or traffic drop whenever the other's rise, cannibalization is likely.

2. "site:" Search

Use a site search (e.g. site:example.com "your keyword") to see which pages are indexed for that term. If multiple pages appear, review them for overlap.

3. Advanced Web Ranking

AWR shows how many URLs from a domain rank for a keyword. Multiple URLs indicate potential cannibalization.

4. Ahrefs

In Ahrefs' Organic Keywords report, track keyword ranking movements. Sudden swaps between two URLs suggest cannibalization.

Causes & Remedies for Keyword Cannibalization

Cannibalization can stem from various issues; the solution depends on the root cause.

– Duplicate Content

If identical content lives on multiple URLs, consolidate it into one page (301 redirect the rest) so only a single URL ranks.

– Similar Content

Google may treat highly similar pages as duplicates. Differentiate the content, or merge and redirect the weaker pages to the strongest version.

– Thin Content

Pages with thin content fail to provide unique value and may be treated as duplicates.
Enrich product and category pages with unique details and user-intent signals.

– No Dedicated Landing Page

If no page is optimized for a keyword, Google may default to your homepage. Create a dedicated landing page for each target keyword.

– Indexable URL Parameters

Indexable parameters can create multiple URLs for the same content. Use canonical tags, noindex, or Search Console's parameter handling to address this.

– Internal Linking Structure

Internal links and anchor text guide Google's understanding. Link relevant pages consistently to highlight your primary URL for each keyword.

– Backlink Profile

Inbound links and their anchor text also signal relevance. Make sure external links use the correct URL and keyword anchor to avoid splitting link equity.

Conclusion

Detecting and fixing cannibalization will improve your visibility and rankings. Before creating new content, check whether you already have pages targeting the same topic; it is often better to expand and optimize an existing page than to create a new one. If no competing page exists, include related keywords and topics to broaden the page's coverage. Optimize internal linking and site hierarchy to focus authority on your primary page for each keyword, and offer a single, comprehensive, well-optimized page for each search intent.
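To make the Search Console check above more systematic, this sketch groups a performance export by query and lists queries where more than one URL collects impressions. It assumes a CSV with one row per query/page pair (columns query and page), such as data pulled via the Search Analytics API; the file name and columns are assumptions.

    import csv
    from collections import defaultdict

    pages_by_query = defaultdict(set)
    with open("gsc_performance.csv", newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            pages_by_query[row["query"]].add(row["page"])

    # Queries where several URLs compete are cannibalization candidates;
    # confirm manually that the URLs rank in similar positions before merging.
    for query, pages in sorted(pages_by_query.items()):
        if len(pages) > 1:
            print(query)
            for page in sorted(pages):
                print("  ", page)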

What is Pinterest SEO? How to Do Pinterest SEO?
Sep 3, 2022 4993 reads


Pinterest has risen in popularity in recent years and is indispensable for e-commerce sites. Pins on Pinterest have much longer lifespans than posts on other social platforms, so Pinterest SEO deserves your full attention. Even if your follower count is modest, strategic Pinterest SEO can drive far more traffic to your profile than your audience size suggests: with the right SEO, you could send over 50,000 visits to your profile even with only 1,000 followers.

Why Pinterest SEO Matters

There are several compelling reasons to invest in Pinterest SEO. Over 2 billion searches happen on Pinterest each month, and more than 400 million users log in at least once a month.

Source: https://www.oberlo.com/blog/pinterest-statistics

Those numbers make Pinterest a significant traffic source. Yet many users and businesses misuse the platform; to outperform competitors on Pinterest, you need solid SEO.

How Pinterest Works

Pinterest is more a visual search engine than a social network. According to its Help Center, it's "a visual discovery engine for finding ideas." It behaves much like Google: users come seeking ideas, inspiration, and useful content, not just social interaction. If users can't find what they want, they quickly leave, and that makes SEO critical.

Pinterest vs. Other Social Platforms

- Traffic from Pinterest is long-lasting. On other platforms, posts often "expire" quickly from users' feeds.
- With consistent, strategic pinning, you can build a steady stream of traffic to your site.
- Done right, Pinterest SEO is a free way to drive organic traffic, which you can then funnel to your calls to action.

How Pinterest Surfaces Content

- Pinterest's main feed uses the Smart Feed algorithm, showing each user pins picked for relevance and quality.
- You can also view content only from people you follow.
- Hashtags are supported, but older pins won't retroactively gain traction from newly added hashtags.
- Most discovery happens via Pinterest's search bar: users type queries and browse the results.

Key Ranking Signals

- Domain authority: If your pins link to your own website, Pinterest checks your site's popularity.
- Pin quality: User engagement (saves, clicks) is a major quality signal, since Pinterest wants users to spend more time on the platform.
- Activity level: Regularly save others' content and grow your network to boost your own visibility.
- Relevance: In board and pin descriptions, use the right keywords for your niche.

Getting Started with Pinterest SEO

Pinterest SEO closely parallels Google SEO: you need authority, expertise, and relevance. Follow these first steps:

1. Create a Business Account

You need a free Pinterest Business account to do SEO effectively. If you already have a personal account, you can convert it or add a linked business profile.

- Business accounts can customize their header image.
- You gain access to analytics: impressions, saves, and click data.
- You can run Promoted Pins (ads) once you're set up.
- Claim your website and social profiles so your logo appears on pins from your site.

2. Optimize Your Profile

Your displayed name, username, and bio all affect SEO and discoverability.

- The username is fixed, but your display name should include top keywords.
- Your bio is prime real estate: use focused keywords and a call to action.

3. Build Authority

Your profile's authority determines how many eyes your pins get. To become an authority:

- Post consistently high-quality, niche-relevant content.
- Create multiple boards for different topics.
- Repin ~10 top posts from other authorities to each board.
- Space out your pinning over days, not all at once.

Next, strengthen your site signals:

- Most Pinterest traffic is mobile, so ensure a responsive, fast site. If your site loads slowly, Pinterest visitors will bounce back to the platform.
- Encourage users to save your pins from your site (via the Save button).

Keyword Optimization for Pinterest

Pinterest's constantly evolving smart feed rewards the best SEO: pin and board descriptions must contain the right keywords.

1. Find Relevant Keywords

- Use Pinterest's search autocomplete: type a seed keyword and note the suggestions.
- Long-tail phrases (e.g. "best running shoes for flat feet") beat generic terms for discoverability.
- Track your keywords in a spreadsheet to stay organized.

2. Optimize Boards

Empty board descriptions are a wasted opportunity. Do this:

- Create boards matching your main topics (e.g. "DIY Home Décor," "Healthy Recipes").
- Use your target keywords in the board name and description.
- Fill boards with your own pins plus a few top pins from others.

3. Optimize Pins

- Start with a keyword search in Pinterest's search bar.
- In the results, note the color-highlighted suggested keywords (e.g. "minimalist living room," "living room ideas").
- Use those suggestions in your pin title and description.
- Also optimize your image file name, add relevant hashtags, and include a clear call to action.

Driving High-Value Traffic

To see real ROI on Pinterest, focus on:

1. Engagement

- Follow complementary accounts to get on their followers' radar.
- Pin when people are most active (evenings and breaks).
- Link your other social profiles to quickly build your Pinterest base.
- Reply to comments on your pins to foster engagement.

2. Conversion Goals

Pinterest-driven conversions often happen over time. Monitor your click data and avoid pressuring new visitors with hard sells; first build trust with consistent, helpful content.

3. Clear, Informative Descriptions

Write descriptions that tell users exactly what they'll find on your site. SEO is important, but clarity and a promise of value come first.

4. Strong Visuals

Stock photos underperform, since Pinterest's visual search looks for real, unique images. Use custom photos or eye-catching graphics. Images with embedded text can help the algorithm understand your pin's topic.

5. Leverage Video

- Users love video tutorials in categories like food, beauty, and décor.
- Pinterest places video pins higher in search results.
- Videos boost engagement, but produce them at a sustainable pace.

Pinterest is a powerful platform when used correctly. With thorough Pinterest SEO, you can attract new audiences and grow traffic to your site.

What is JavaScript SEO and How is it Done?
Sep 3, 2022 1403 reads


One of the most common problems faced by people doing SEO work is JavaScript content going undiscovered by search engines. If you deal with sites built with JavaScript, the issues you encounter will be quite different from those of classic content management systems. If you want to succeed in search engines with a JavaScript-heavy site, you need JavaScript SEO: you must ensure that your site's pages are rendered correctly, indexed, and search-engine friendly.

What is JavaScript?

JavaScript is highly valuable in web development. HTML and CSS are the foundation, but many developers lean on JavaScript because it makes sites more interactive. With JavaScript, it becomes easy to update page content dynamically. For example, websites that stream constantly changing data, like live match scores, use JavaScript so the data updates in real time with minimal delay. Without it, you would need to refresh the page constantly to follow such data. So even if you build the foundation of a site with HTML and CSS, you need JavaScript for interactivity and real-time updates.

What is JavaScript SEO?

JavaScript SEO is part of technical SEO and is indispensable for sites built with JavaScript. JavaScript is popular, used especially on e-commerce sites to generate main content or to link to similar products. Despite that popularity, JavaScript-built sites often perform poorly in search engines, mainly because JavaScript SEO has not been carried out correctly.

Google Index and JavaScript

Using JavaScript is great for users; the same cannot easily be said for search engines like Google. JavaScript content does not always get indexed. Google can easily index some JavaScript-generated content and fail on other content, so it matters that a JavaScript site is structured correctly, though you can hit similar issues on an HTML site too.

- JavaScript crawling difficulty: On HTML sites, crawling is easy and search engine crawlers scan everything quickly. On JavaScript sites, the crawler visits the site but cannot find the links until it downloads, renders, and then examines the JS files.
- Crawler limitations: Google's crawler does not always crawl all content. If content on your site depends on cookies or other stored data, the crawler may never see it. So it is not easy to say Google is excellent at JavaScript rendering.

Despite these challenges, Google keeps improving its crawler. As long as content matters to Google, it is considered worth rendering; but when rendering takes too long, Google's crawler is designed to skip it.

Is Using JavaScript Bad for SEO?

JavaScript makes various SEO problems harder to detect, because it is not guaranteed that Google will execute every JavaScript snippet on a page. To make your site successful you must put in extra effort and, above all, apply JavaScript SEO methods. JavaScript is not inherently bad for SEO: many sites that use it extensively still enjoy strong organic visibility, and in modern web development JavaScript is a necessity, just like HTML and CSS.

Is JavaScript SEO Mandatory?

If you own a JavaScript-heavy site, JavaScript SEO is mandatory.
Without this work, you'll struggle to get your content discovered by Google, and if the content inside the JavaScript-rendered section isn't discovered, the page itself may not be either. Even though Google gets better at reading and interpreting JavaScript day by day, you still need to take deliberate steps to be discoverable: if your site content depends on JavaScript, use JavaScript SEO methods to ensure it gets indexed.

JavaScript SEO: Core Requirements

Once you understand the relationship between JavaScript and SEO, you can take the first step. If you want your JavaScript-built site to perform well on Google, dedicate time to three requirements:

- Google should be able to crawl your site, understand its structure, and discover its valuable assets.
- Google should be able to render your site without difficulty.
- Google should not consume excessive crawl budget while crawling and rendering your site.

JavaScript rendering is a serious burden on crawl budget; once you exhaust the budget Google allocates to you, some of your pages will be ignored.

Search-Engine-Friendly JavaScript Content

There are two useful ways to check whether JavaScript content is detected and indexed by Google: the quick "site:" search operator, and your Google Search Console property. First, run these checks:

- Make sure Google can technically render your JavaScript content. Opening the site in Chrome is not enough; go to your Google Search Console property and inspect the page with the URL Inspection tool.
- During the checks, verify that the main content appears, and that Google can reach the related-posts or similar-products section.

If you notice issues, make the necessary adjustments as part of JavaScript SEO. If Google has trouble rendering your site, the reasons may include timeouts, various errors, or blocked JavaScript files. After these basic checks, verify that your page is on Google: search for "site:https://www.analyticahouse.com/tr/blog/seo-terimler-sozlugu" and check whether the content appears. If the URL shows up, move on to a detailed review: take a snippet of text from the JavaScript-rendered section of the page and search for it on Google to learn whether that content was crawled. This is the simplest method; for a much more advanced examination, turn to Google Search Console:

- Log in to Google Search Console and paste the relevant URL into the URL Inspection tool.
- Review the inspected URL and check the content rendered with JavaScript.
- Repeat the same steps for different URLs on your site; a single URL is not sufficient verification.

During these checks, you may find that JavaScript content is not being crawled.
The reasons can include Googlebot timing out, rendering issues in the content, skipped resources, low-quality content, or the page simply not being discovered.

Delivering JavaScript Content to Google

After making your JavaScript content search-engine friendly, you need to deliver it to Google correctly. Two methods are used here: server-side rendering and client-side rendering.

- Rendering: Rendering is the process of producing the site's content, templates, and other features for the user. It comes in two types: server-side rendering (SSR) and client-side rendering (CSR).
- Server-side rendering: With SSR, when a user visits the site, the page is rendered on the server and sent to the browser. Since the JavaScript doesn't need to be rendered separately by the client, this is generally the most SEO-friendly method.
- Client-side rendering: CSR can be problematic for performance, and slow-loading pages hurt rankings. To avoid problems, JavaScript SEO methods and CSS should be used effectively.

Some sites combine the two main rendering methods, an approach called dynamic rendering: the site switches between rendering techniques depending on who is accessing it, so pre-rendered pages can be served to crawlers. When JavaScript content is delivered correctly, Google notices the JavaScript immediately and processes it properly. Google's crawler attempts to crawl millions of sites, so each site gets a limited crawl budget. It handles JavaScript sites in two stages: first the crawler looks at the HTML content and evaluates the site from it; then the JavaScript that needs rendering is processed. With SSR, indexing is easier. Anyone who wants to benefit from JavaScript SEO should ship as much content as possible in HTML, so that critical information reaches the crawler in the first stage and the site can be ranked on it.

Common Mistakes in JavaScript SEO Work

Although JavaScript is uniquely important in site development, it becomes a headache when misused. No matter how good the site is, misusing JavaScript leaves technical shortcomings, so watch for these common mistakes:

- Ignoring HTML: The most important information on the site should be delivered with HTML, not JavaScript. Search engine crawlers process the initial HTML; if you want your site indexed quickly, put critical information in HTML.
- Incorrect use of links: Links help people interact with your site, and they must be structured correctly when using JavaScript. Even on a JavaScript-built site, Google recommends using standard HTML <a href> elements for links rather than JavaScript click handlers.
- Blocking Google's bots: Unless Google's crawlers can revisit your site, they cannot detect JavaScript content. Some developers leave a "noindex" tag in place, which makes this impossible; make sure your site doesn't have this kind of issue.

JavaScript and Pagination

Many sites use pagination to spread long content across multiple pages, but most of them effectively let Google visit only the first page, so when Google crawls, it never notices the other valuable content. The cause of this error is the link structure: instead of implementing pagination with <a href> links, sites trigger the next page from a user's click action.
As a result, Google’s crawler must “click” to view other pages.When Google’s crawler visits a site, it does not click or scroll. For Google to notice the next page, links must be used. When links are not used in this way, they are not noticed by Google and your content cannot be discovered.Use of Hashes and RedirectsOne of the most common situations in JavaScript sites is creating URLs using a hash (#). Google may have trouble crawling a page where a hash is used. You should use the correct URL structure to make Google’s job easier.Incorrect URL: https://www.analyticahouse.com/tr/#/seo-glossary Incorrect URL: https://www.analyticahouse.com/#seo-glossary Correct URL: site:https://www.analyticahouse.com/tr/blog/seo-terimler-sozlugu If you use incorrect URLs of the types mentioned, you are quite likely to face various crawling problems. Even overlooking a tiny detail means taking a problematic step in terms of JavaScript SEO. You should perform the necessary checks to avoid such situations.In addition to the use of hashes, you should pay attention to various redirects. If you are implementing redirects via JavaScript on your site, you may run into issues. Therefore, it is much better to perform redirects with server-side 301s.JavaScript SEO work is part of technical SEO. While trying to improve your site in terms of technical SEO, you should not forget the JavaScript side of things. Not every error is caused by JavaScript. Therefore, performing the correct SEO audits is very important.