AnalyticaHouse

Marketing tips, news and more

Explore expert-backed articles on SEO, data, AI, and performance marketing. From strategic trends to hands-on tips, our blog delivers everything you need to grow smarter.

The Use of First-Party Data in E-commerce Analytics
Aug 6, 2024 414 reads

The Use of First-Party Data in E-commerce Analytics

First-party data is the information a company collects from its customers through its own channels. This data is typically gathered through customer interactions, website visits, transactions, and other direct engagements. Here’s why first-party data is so valuable.

What is first-party data?

First-party data is the information collected directly by a company from its own customers and target audience. It comes from visits to your website, purchases, and other user interactions. It is considered the most valuable type of data for your business because it comes directly from the source, making it accurate and reliable.

Why should we use first-party data?

Accuracy and Reliability: Since first-party data comes directly from your customers, it is the most accurate and reliable form of data and best reflects customer behavior and preferences.

Data Control: First-party data is fully under your control, giving you greater oversight in terms of data privacy and security. This makes it easier to comply with privacy regulations such as KVKK and GDPR.

Personalization: The data collected from your customers allows you to offer more personalized and relevant content, which enhances the customer experience and boosts customer loyalty.

Competitive Advantage: You gain access to exclusive data that your competitors cannot reach, making your marketing strategies and business decisions more effective.

So what is third-party data?

Third-party data is information collected, aggregated, and sold by entities other than the original data source. In digital marketing and data analytics, third-party data is typically sourced externally and may include a broad range of demographic, behavioral, and interest-based data about individuals. It is gathered by data brokers, aggregators, or other organizations that specialize in collecting and selling data.

Recently, there have been limitations on third-party data usage, particularly in performance marketing and e-commerce analytics. KVKK requires user consent for data collection, which affects the creation of third-party cookies. Mozilla Firefox and Apple Safari already block third-party cookies by default through features such as Enhanced Tracking Protection (ETP) and Intelligent Tracking Prevention (ITP). Google Chrome planned to phase out third-party cookies by 2022, but the deadline was pushed to the second half of 2024, aiming to strike a balance between user privacy and an ad-supported web.

However, with Google’s recent announcement, “A new path for Privacy Sandbox on the web,” the deprecation and blocking of third-party cookies will now be left to user choice. The main reasons for this update are the combination of default cookie blocking in other browsers, pressure from ad-tech platforms, and underwhelming performance in tests conducted on the Privacy Sandbox APIs.

For more comprehensive and technical e-commerce analytics, contact us.
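To make the collection side concrete, here is a minimal sketch of capturing a purchase as first-party data by posting it from your own site to your own backend. The /collect endpoint and all field names are illustrative assumptions, not a specific product’s API.

// Minimal sketch: record a purchase as first-party data by posting it
// from your own site to your own endpoint. '/collect' and the field
// names below are illustrative assumptions.
async function trackPurchase(order) {
  await fetch('/collect', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    credentials: 'same-origin', // first-party context only
    body: JSON.stringify({
      event: 'purchase',
      transactionId: order.id,
      value: order.total,
      currency: order.currency,
      timestamp: Date.now(),
    }),
  });
}

trackPurchase({ id: 'T-1001', total: 249.9, currency: 'TRY' });

Because the data lands in your own storage, it stays under your control and can feed your e-commerce analytics without relying on third-party cookies.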

Profitable Organic Content Production with N-Gram Analysis
Jul 25, 2024 524 reads

Profitable Organic Content Production with N-Gram Analysis

How to Generate Profitable Organic Content with N-Gram Analysis?

In the e-commerce projects we consult on for SEO, we have all encountered the expectation: “We don’t want to create blog content only to gain organic traffic; our goal is also to generate revenue from the blog content we produce.” After performing the analyses below, tracking user behavior and observing the results of the planned strategy will be the most critical part, but I hope this strategy sparks an idea in your mind. :)

1. Setting Up N-Gram Analysis

For this strategy, the pool of search queries resulting in sales on the website is first subjected to N-Gram analysis. This analysis breaks search queries down into 1-, 2-, and 3-word components and evaluates each component separately (see the sketch at the end of this post).

By ranking according to metrics such as ROAS, conversions, and CPO, you obtain the top-performing 1-, 2-, 3-, or 4-gram terms. This lets you see which terms perform best and which carry the highest spend. The analysis provides insights for targeting high-performing terms more heavily and for removing terms with high spend but low contribution to performance from the strategy.

2. List of High-Performing Search Terms via N-Gram Analysis

In Google Ads Search Network campaigns, we identify main categories for organic content revenue targeting through high-performing generic and short-tail keywords (for example: men, women, children, men’s shoes, children’s t-shirts, women’s blouses, etc.).

Encountering the main keywords identified as high-performing in the N-Gram analysis also in their 3- and 4-gram versions provides a kind of validation of the strategy. Although our SEO efforts target long-tail keywords corresponding to 3- and 4-grams, analyzing 1- and 2-gram main keywords is essential for identifying high-performing umbrella keywords that may drive results.

TIP: Branded queries still have higher sales potential than non-branded queries, so pay extra attention to umbrella keywords within branded queries; these keywords are already familiar to your users and carry higher purchase potential.

3. Organic Performance Analysis of High-Performing Keywords

The website’s organic traffic performance is then analyzed for the 3- and 4-gram keywords identified as high-performing in Google Ads Search Network campaigns. This way, in an organic content strategy built around keywords that perform well in Google Ads, the transfer of organic performance for queries aligned with SEO targeting can also be evaluated.

4. Organic Content Creation

Finally, blog content with informational and transactional intent is created for the keywords identified through the Google Ads and organic performance analyses, with a focus on sales potential.

1. For profitable queries ranking in the organic top 3, a blog strategy aligned with transactional intent can be developed, as it is possible to gain authority through blog content.
2. For queries with average organic traffic performance, ranking 4-10, a blog strategy aligned with transactional intent can also be developed.

In summary, although targeting long-tail queries is still the primary SEO strategy, you can design your blog content strategy for profitability by identifying revenue-generating keywords from N-Gram analysis of generic and short-tail queries and leveraging the existing organic performance for those queries. Even if N-Gram analysis is just one method for profitability analysis, it is an essential step toward constructing the right strategy.

Content sharing alone is not enough in the user’s purchase journey; you can move users toward conversion with effective internal linking within your blog content.
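To make the first step concrete, here is a minimal sketch of the n-gram tokenization and metric aggregation described above. The input rows and metric fields (query, cost, revenue, conversions) are illustrative assumptions about a search-term export, not a fixed Google Ads format.

// Split a search query into n-word components.
function nGrams(query, n) {
  const words = query.toLowerCase().split(/\s+/).filter(Boolean);
  const grams = [];
  for (let i = 0; i + n <= words.length; i += 1) {
    grams.push(words.slice(i, i + n).join(' '));
  }
  return grams;
}

// Accumulate cost/revenue/conversions per gram and rank by ROAS.
function aggregateByGram(rows, n) {
  const stats = {}; // gram -> accumulated metrics
  for (const row of rows) {
    for (const gram of nGrams(row.query, n)) {
      stats[gram] = stats[gram] || { cost: 0, revenue: 0, conversions: 0 };
      stats[gram].cost += row.cost;
      stats[gram].revenue += row.revenue;
      stats[gram].conversions += row.conversions;
    }
  }
  return Object.entries(stats)
    .map(([gram, s]) => ({ gram, roas: s.cost > 0 ? s.revenue / s.cost : 0, ...s }))
    .sort((a, b) => b.roas - a.roas);
}

// Illustrative input: one row per converting search query.
const rows = [
  { query: 'buy women blouse online', cost: 120, revenue: 960, conversions: 8 },
  { query: 'women blouse models', cost: 80, revenue: 240, conversions: 3 },
];
console.log(aggregateByGram(rows, 2).slice(0, 10)); // top 2-grams by ROAS

The same aggregation can be re-run with n = 1, 3, or 4 to surface umbrella keywords and long-tail candidates from the same export.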

Preparing for Privacy Sandbox: What is Storage Access API?
Jul 18, 2024 641 reads

Preparing for Privacy Sandbox: What is Storage Access API?

Chrome is gradually phasing out support for third-party cookies to reduce cross-site tracking. This creates a challenge for sites and services that rely on cookies in embedded contexts for user journeys such as authentication. The Storage Access API (SAA) allows these use cases to continue while limiting cross-site tracking as much as possible.

What is the Storage Access API?

The Storage Access API is a JavaScript API that lets iframes request access to storage permissions that would otherwise be denied by browser settings. Embedded elements whose use cases depend on loading cross-site resources can use this API to request access from the user when needed. If the storage request is granted, the iframe can access cross-site cookies just as it would if the user visited that site as a top-level context. While the API prevents the general cross-site cookie access often used for user tracking, it allows specific access with minimal burden on the user.

Use cases

Some third-party embedded elements require access to cross-site cookies to provide a better user experience, something that will no longer be possible once third-party cookies are disabled. Use cases include:

Embedded comment widgets that require login session details.
Social media “Like” buttons that require login session details.
Embedded documents that require login session details.
A top-level experience delivered within an embedded video player (e.g., not showing ads to logged-in users, knowing user caption preferences, or restricting certain video types).
Embedded payment systems.

Many of these use cases involve maintaining login access within embedded iframes.

Using the hasStorageAccess() method

When a site first loads, the hasStorageAccess() method can be used to check whether access to third-party cookies has already been granted.

// Set a hasAccess boolean variable which defaults to false.
let hasAccess = false;

async function handleCookieAccessInit() {
  if (!document.hasStorageAccess) {
    // Storage Access API is not supported so best we can do is
    // hope it's an older browser that doesn't block 3P cookies.
    hasAccess = true;
  } else {
    // Check whether access has been granted via the Storage Access API.
    // Note on page load this will always be false initially so we could be
    // skipped in this example, but including for completeness for when this
    // is not so obvious.
    hasAccess = await document.hasStorageAccess();
    if (!hasAccess) {
      // Handle the lack of access (covered later)
    }
  }
  if (hasAccess) {
    // Use the cookies.
  }
}
handleCookieAccessInit();
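The snippet above leaves “Handle the lack of access” open. As a companion, here is a minimal, hedged sketch of the requestStorageAccess() call that typically fills that gap; the browser requires it to run in response to a user gesture. The #login-button selector and the doThingsWithCookies() helper are illustrative placeholders, not part of the API.

// Minimal sketch: request cross-site cookie access on a user gesture.
// doThingsWithCookies() is a hypothetical placeholder for your own logic.
async function handleCookieAccess() {
  if (!document.hasStorageAccess) {
    // API not supported; assume an older browser with 3P cookies enabled.
    doThingsWithCookies();
    return;
  }
  if (await document.hasStorageAccess()) {
    doThingsWithCookies();
    return;
  }
  try {
    // Must be called from a user gesture (e.g., inside a click handler),
    // otherwise the returned promise rejects.
    await document.requestStorageAccess();
    doThingsWithCookies();
  } catch (err) {
    // The user or browser denied access; fall back gracefully.
    console.log('Storage access denied:', err);
  }
}

document.querySelector('#login-button')
  .addEventListener('click', handleCookieAccess);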

The Power of AI and Automation at AnalyticaHouse
Jun 27, 2024 1266 reads

The Power of AI and Automation at AnalyticaHouse

We are delighted to announce that AnalyticaHouse has been selected as a finalist for the coveted “Best Use of AI in Search” award at the European Search Awards 2024!

The Fusion of AI and Automation at AnalyticaHouse

Automation plays a crucial role in enhancing our efficiency at AnalyticaHouse. By automating controllable and standardized operations, we can redirect our efforts toward more creative and strategic thinking. The systems we’ve established reduce our daily workload, allowing us to engage more deeply with the core concepts that define marketing.

We have begun integrating AI into our automation processes, particularly leveraging developments from OpenAI and the user-friendly nature of its API. This integration brings the creative power and processing capacity of AI into the workflows at AnalyticaHouse. Today, we’ll discuss one of our projects in this realm: the system we developed for SOVOS Turkey, which made it a finalist at the European Search Awards 2024.

Leveraging AI to Overcome Marketing Challenges in a Digital Age

In today’s world, users’ attention spans and engagement thresholds are diminishing day by day. Accelerating technology and intensifying competition make capturing a user’s interest at the first point of contact increasingly challenging. Coupled with constantly evolving user psychology and desires, crafting marketing communications that adapt to changing consumer behavior presents a significant challenge.

At this juncture, we merged traditional technologies with the power of AI to create a system that understands user needs and psychological states in real time. This system enables us to identify evolving user personas and generate personalized advertising communications that dynamically adapt to these changes.

Revolutionizing Ad Strategies with AI and Real-Time Data

First, we created a Google Sheets document to organize all the data and feed Google Ads with the generated ad copy.

Using Ads scripts, we identified other search terms that the user group interacted with. The script also provided age and gender distribution data for this user group, helping us understand the predominant demographics. Through Apps Script, we fetched location and interest data for the relevant user group from GA4 and organized these into tables for persona creation. Additionally, using the SERP API, we retrieved the topics and terms most frequently searched by this user group from Google Trends and added them to our persona table.

We then fed this structured data into GPT-4 Turbo via the OpenAI API using Apps Script. Thanks to predefined business criteria and the relevant signals, GPT-4 Turbo was able to generate a detailed persona. After the Persona Creator script produced the persona, it was fed back into GPT-4 Turbo along with the target main keyword, with a request for five unique keywords for this persona. A simplified sketch of this Apps Script step follows.
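Below is a minimal, hedged sketch of what such an Apps Script call to the OpenAI API could look like. The sheet name ('PersonaSignals'), the prompt, the output cell, the script-property key, and the container-bound setup are illustrative assumptions, not the production script.

// Simplified sketch (Google Apps Script, bound to a spreadsheet): send
// structured persona signals to the OpenAI Chat Completions API.
// Sheet/range names, prompt, and model choice are illustrative assumptions.
function generatePersona() {
  const sheet = SpreadsheetApp.getActiveSpreadsheet().getSheetByName('PersonaSignals');
  const signals = sheet.getDataRange().getValues(); // demographics, interests, trends...

  const payload = {
    model: 'gpt-4-turbo',
    messages: [
      { role: 'system', content: 'You are a B2B marketing persona generator.' },
      { role: 'user', content: 'Build a detailed persona from these signals:\n' + JSON.stringify(signals) },
    ],
  };

  const response = UrlFetchApp.fetch('https://api.openai.com/v1/chat/completions', {
    method: 'post',
    contentType: 'application/json',
    headers: {
      // API key stored in script properties, never hard-coded.
      Authorization: 'Bearer ' + PropertiesService.getScriptProperties().getProperty('OPENAI_API_KEY'),
    },
    payload: JSON.stringify(payload),
  });

  const persona = JSON.parse(response.getContentText()).choices[0].message.content;
  sheet.getRange('E1').setValue(persona); // hand the persona to the next step
  return persona;
}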
The brief given to GPT-4 included dopamine-triggering communication and neuromarketing techniques suitable for B2B marketing, enhancing the impact of the generated ad copy. The resulting ad texts were applied to the relevant ads through Google Ads ad customizers and updated weekly so that the data reflected current user behavior and demographics, producing dynamically tailored ad copy.

Compared with a static generic ad, the dynamic ad demonstrated a 27% increase in CTR, a 20% reduction in CPC, and a 40% rise in conversions over the testing period.

Evolving with AI: Shaping the Future of Digital Marketing

The world is changing at an increasingly rapid pace, and this acceleration is set to continue. The ease of use of artificial intelligence has driven rapid change in digital marketing, as it has in many other industries. Enhancing and optimizing traditional technological approaches, combined with AI and reinforced with validated knowledge, offers incredible potential to keep pace with this changing world, and perhaps even to help create the new one.

“In today’s world, executing user-centric marketing operations and being able to optimize them instantaneously based on changing signals carries great significance for achieving both strong performance and cost-effective marketing spend. Our approach in this project not only highlights the contributions and value of personalized marketing operations but also demonstrates how we can leverage the most advanced technological solutions of the new world in familiar and simple ways. This endeavor may inspire brand-new projects and ideas, and it sets a new benchmark for personalized marketing and the use of AI in the digital marketing realm. At this transformative moment, we, the AnalyticaHouse team, are both inspired and emboldened by the recognition and appreciation this work has received from a prestigious and respected organization like the European Search Awards. It has been a source of great pride for us and has fueled our motivation on this journey toward what comes next: the future.”

Project Owner | Emrecan Karakus - Performance Marketing Manager @AnalyticaHouse

Being selected as a finalist at the European Search Awards is a profound validation of the strategies we’ve deployed and the innovations we’ve cultivated along the way. As an organization, we take great pride in the achievements of our team and the milestones we’ve reached together.

This project is a snapshot of our overarching methodology, showcasing the exceptional service we deliver to our clients. Our diverse team spans Performance Marketing, Data Science, Product Analytics, SEO, Media & Planning, and Marketing Communications, forming a cohesive unit dedicated to overseeing and enhancing every aspect of our clients’ marketing strategies. With this integrated approach, we not only help our clients meet their goals across various industries but also drive them toward pioneering outcomes by implementing operational and strategic initiatives that lead the market and foster innovative solutions.

Write e-commerce Purchases to Firestore with sGTM
May 24, 2024 1343 reads

Write e-commerce Purchases to Firestore with sGTM

Server-side Google Tag Manager (sGTM) offers enhanced flexibility and security for tracking and handling data in your e-commerce applications. One powerful application of sGTM is writing purchase data directly to Firestore, Google Cloud's NoSQL database. This blog post walks you through setting up sGTM to capture e-commerce purchases and store them in Firestore.

Step 1: Creating a New Server-side Google Tag Manager (sGTM) Template

In this step, you'll create a custom template in your server-side Google Tag Manager (sGTM) container. This template defines the logic for capturing e-commerce purchase events and sending the data to Firestore. By creating a reusable template, you streamline handling and managing purchase data across your e-commerce platform.

You can use the following code to write data to Firestore from sGTM (the document path and project ID are left empty as placeholders for your own values):

const Firestore = require('Firestore');
const Object = require('Object');
const getTimestampMillis = require('getTimestampMillis');

let writeData = {
  timestamp: getTimestampMillis()
};

// Copy the tag's custom data fields into the document to be written.
if (data.customData && data.customData.length) {
  for (let i = 0; i < data.customData.length; i += 1) {
    const elem = data.customData[i];
    if (elem.fieldValue) {
      writeData[elem.fieldName] = elem.fieldValue;
    } else {
      Object.delete(writeData, elem.fieldName);
    }
  }
}

const rows = writeData;

Firestore.write('', rows, {
  projectId: '',
  merge: true,
}).then((id) => {
  data.gtmOnSuccess();
}, data.gtmOnFailure);

Step 2: Configuring the Firestore Database

Open Google Cloud Console: Navigate to the Firestore section.
Create Database: Follow the prompts to set up a Firestore database in "production mode" or "test mode" based on your requirements.

While running in test mode, you may also create a rule like this:

rules_version = '2';
service cloud.firestore {
  match /databases/{database}/documents {
    match /{document=**} {
      allow read, write: if request.auth != null;
    }
  }
}

Now you can send any purchase data to Firestore in the same way.

Contact us for more use cases using server-side Google Tag Manager.
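As a complement, here is a hedged sketch of reading a stored purchase back inside an sGTM template with Firestore.read. The document path 'orders/ORDER_ID' and 'GCP_PROJECT_ID' are illustrative placeholders, not values from the setup above.

// Hedged sketch: read a previously written purchase back from Firestore
// inside an sGTM sandboxed template. Path and project ID are placeholders.
const Firestore = require('Firestore');

Firestore.read('orders/ORDER_ID', {
  projectId: 'GCP_PROJECT_ID',
}).then((doc) => {
  // doc.data holds the stored fields, e.g. doc.data.transaction_id.
  data.gtmOnSuccess();
}, data.gtmOnFailure);

Reading documents back this way is useful, for example, for deduplicating purchase events before forwarding them to downstream tags.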