Marketing tips, news and more
Explore expert-backed articles on SEO, data, AI, and performance marketing. From strategic trends to hands-on tips, our blog delivers everything you need to grow smarter.

How to Write SEO-Friendly Content in 5 Steps
Search engine optimization (SEO) refers to the improvements made to increase organic traffic and comply with search engine parameters. The priority of SEO and of search engines alike is to provide users with a great experience and high-quality content. Search engines reward websites that deliver quality content by ranking them higher.

SEO-friendly content is:
- optimized according to search engine parameters,
- built around a primary keyword,
- targeted to the right audience,
- responsive to user intent,
- valuable to users,
- easy to read,
- engaging,
- aligned with the brand's voice,
- enriched with images and infographics,
- comprehensive, natural, and original.

How to Write SEO-Friendly Content

There are steps to follow before, during, and after writing SEO-friendly content. The first two are defining your topic and researching your keywords.

1. Define Your Topic

Choose niche topics that your target audience will easily consume and find interesting. Your content should relate to your website and the services you offer. Search engines emphasize content that adds value to users, so your articles need to provide up-to-date information.

You can discover what interests your audience by examining competitors' blogs, as well as forums and news sites, to find questions users ask but have not had answered. In short, when choosing your topic, look at your competitors' blog pages, forums, and news sites.

2. Analyze and Research Keywords

After choosing your topic, identify relevant keywords. These fall into primary and secondary keywords. Once your content is indexed, it will rank for those queries. Your main target query is your primary keyword; it summarizes and best represents your content's focus. Secondary keywords (used in subheadings, for example) support the main topic.
For instance, if "search engine optimization" is your primary keyword, "organic traffic" could be a secondary keyword.

Three factors are crucial when selecting keywords: user intent, search volume, and competition.

User Intent: Your chosen keywords should match user intent. To verify intent suitability, search your keyword and compare the results to your planned content. For example, if your article covers "digital marketing trends," targeting "digital marketing" is too broad, as Google returns "What is Digital Marketing?" pages for that query.

Search Volume: Average monthly searches indicate how often a keyword is queried. High search volume does not guarantee more traffic for you; it also means stronger competition.

Competition: Difficulty levels (low, medium, high) show how hard it is to rank for a keyword. New or smaller sites should aim for low- to medium-difficulty terms first.

Tools like Google Ads Keyword Planner, Ahrefs Free Keyword Generator, and Answer the Public help you find search volumes and competition.

Google Ads – Keyword Planner

Google holds about 86.64% of the global search market (Statista, Sept. 2021). Its Keyword Planner suggests new keyword ideas and shows average search volumes and competition levels per term. Enter your product or service to get related suggestions. You can enter multiple terms, filter by language and region, or paste your site's or a competitor's URL to see which keywords they rank for. "Avg. monthly searches" shows volumes; "competition" shows how contested each term is.

Ahrefs Free Keyword Generator

Ahrefs' Free Keyword Generator shows search volumes and keyword difficulty (KD) for each term. You can filter by country (e.g., Turkey) or by other search engines (Bing, Amazon, YouTube). KD is on a logarithmic scale: 0–10 (easy), 11–30 (moderate), 31–70 (hard), 71–100 (very hard).

Answer the Public

Answer the Public organizes queries around your keyword, showing alphabetical combinations and questions.
It does not display volume or competition data, and it lacks Turkish localization, but it can still surface Turkish queries if you set the region accordingly.

3. Research User Intent

User intent explains why a user makes a search. Four main types exist:
- Informational: seeking knowledge, e.g., "What is SEO?"
- Navigational: seeking a specific site, e.g., "AnalyticaHouse"
- Commercial: researching before purchase, e.g., "best SEO tools"
- Transactional: ready to buy, e.g., "buy iPhone 13"

4. Outline Your Content

Decide on your content's length and headings. Your article must be comprehensive and up-to-date. To outrank competitors, cover topics more thoroughly than they do.

Headings

Use competitor posts to identify key subtopics, and include them plus new insights. Tools like Answer the Public's "Questions" view and Google's "People also ask" box can reveal popular questions to answer. Maintain a clear hierarchy with H1, H2, and H3 tags and matching font sizes. Include your primary keyword in headings.

Length

Backlinko's analysis of 912 million blog posts suggests 1,000–2,000 words is ideal. Longer posts also attract more backlinks. Quality matters more than sheer length.

5. Ensure Readability

Your SEO-friendly article must read naturally and engagingly. Avoid robotic language. Keep paragraphs around 100–150 words, avoid repetition, and present each main idea clearly.

Use and Optimize Images

Enhance clarity with relevant images, videos, or infographics. Name files with keywords, e.g., "sunflower-oil.jpg" rather than "IMG1234.jpg."

Benefits of SEO-Friendly Content

High-quality content differentiates you from competitors and builds authority. Ranking for multiple keywords boosts visibility, making it easier to attract prospects. In short, SEO-friendly articles enhance your conversion rate, organic traffic, brand awareness, and loyalty.

SEO-Friendly Content Checklist

Here's a 14-point checklist for SEO-friendly content:
1. Outperforms competing articles in depth and breadth.
2. Accurate, reliable information.
3. Written by a subject-matter expert.
4. Enriched with images and infographics.
5. Focused on a high-volume, medium-competition primary keyword.
6. Uses semantically related secondary keywords.
7. Each heading covers one main idea plus supporting points.
8. No redundant repetition.
9. Primary keyword in headings.
10. Consistent heading hierarchy and font sizes.
11. Consistent tone of address ("you" vs. "we").
12. Consistent verb tenses.
13. Clear, engaging writing with no spelling or grammar errors.
14. No duplicate content across platforms.

What Should You Pay Attention to for SEO When Creating a New Website?
The best time to make the easiest SEO optimizations and infrastructure setups is while the website is being built. Many SEO factors, such as speed, images, canonical tags, and meta tags, can be organized during the construction phase and coded within certain rules, which provides great convenience after the website goes live.

1- Speed Factors

Speed is one of the most important factors for successful search engine optimization and user experience. The most significant factors affecting website speed are:
- Visual and media content
- CSS files
- JS files
- Server quality

When preparing a website, first of all, every image on the site must be optimized and sized so that efficiency is not compromised. For this, it is important to use images in WebP, JPEG 2000, JPEG, or SVG formats. Since the WebP format is not displayed in older versions of Safari, an onerror fallback can be used in the HTML, e.g. `<img src="image.webp" onerror="this.onerror=null; this.src='image.jpg'">`, so that the JPEG version of the image is shown in Safari while the WebP version is shown in other browsers (Chrome, Firefox, Opera, etc.).

It is extremely important that visual and media content is optimized as much as possible. Compressing images to under 150 KiB positively affects browser loading speed. Visual and media content should also not be larger than the area it is placed in. For example, if an image is to be added to a 400x400-pixel area in a grid module on the homepage, the image should not be larger than 400px. Otherwise, while the image only covers a 400px area in the browser, it will load at a higher size and resolution, which negatively affects page speed.

The most common mistake made with JS (JavaScript) and CSS files is using ready-made JS and CSS libraries without optimization. For example, the jQuery JS library and the Bootstrap CSS library contain many classes and functions.
However, we generally use only a few of these classes and functions in our websites. Therefore, before such files are included on a page, unnecessary classes and functions should be removed and the files should be minified.

Another factor affecting website speed is server quality. When a user visits a web page, slow loading of DOM elements by the server and late responses to requests negatively affect page performance. It is therefore important for the website to be hosted on a quality server in terms of processor (CPU) and RAM, with high bandwidth.

2- Meta Edits

A web page has indispensable meta tags, chiefly the meta title and meta description (plus meta keywords for search engines other than Google). First of all, the meta title and description fields must be editable and customizable on the CMS side. However, a page's meta tags must never be empty, and some pages may be overlooked; so if no custom meta description is entered, the tags can be generated by a defined rule, for example:

Page H1 + Site Name

In this way, on forgotten or overlooked pages the meta tags will be generated from a standard rule instead of appearing empty. Meta tags must be placed among the `<head>` tags of the page.

3- Canonical Tagging

Canonical tags are among the most important markers that determine a web page's crawling and indexing behavior. Canonical tagging is used to avoid duplicate-page issues and to indicate whether a page is the original version or an alternative version of another page. Canonical tags must be editable and customizable on the CMS side. However, since pages may be forgotten or overlooked, just as with meta edits, a rule should be defined so that a canonical tag is created on every page. In this way, a canonical tag that automatically points to the page itself will be created on every page that is not manually edited.
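Such a self-referencing canonical rule amounts to a single tag in each page's `<head>`; a sketch with a placeholder URL:

```html
<!-- Self-referencing canonical tag; href is the page's own absolute URL (placeholder shown) -->
<link rel="canonical" href="https://sitename.com/current-page/" />
```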
Canonical tags must also be placed among the `<head>` tags of the page.

4- H1 Heading Tag

The H1 heading tag is the main title of a web page. It directly carries the targeted keyword of the relevant page and reflects the title of all the content on the page. Each page must have exactly one H1 tag. H1 tags must be editable and customizable on the CMS side. However, as with meta and canonical tags, a rule can be defined so that the H1 does not appear empty on pages where editing was forgotten, for example: Page Name. Since there must be at most one H1 heading tag per page, any areas on the page that contain additional H1 tags should be defined with `<h2>` or `<h3>` tags instead.

5- Robots.txt File

The robots.txt file is a kind of entry point that search engine bots visit first on a website. The pages and folders that should or should not be crawled within the site, and the path of the sitemap, are specified in the robots.txt file. A line starting with the "Allow" directive indicates a crawlable path, while the "Disallow" directive indicates paths that should not be crawled. The "User-agent" directive specifies which bots the directives apply to; for example, a website can be open to crawling for Google's bots but closed to crawlers like Screaming Frog. The "*" symbol means the directives apply to all bots. For a website with an admin panel, a standard robots.txt file can be created as follows:

```
User-agent: *
Allow: /
Disallow: /admin
Sitemap: https://sitename.com/sitemap.xml
```

The Effect of Robots.txt on the Site

When a robots.txt file is created, the directives defined in it should have a dynamic effect on the pages within the site to help with the crawl budget. For example, suppose we disallow the /myaccount URL in the robots.txt file to block it from crawling. In this case, to help search engine bots, a "nofollow" attribute should be dynamically added to all links pointing to the /myaccount address within the site.
Otherwise, search engine bots may follow internal links into a page that robots.txt says should not be crawled, which wastes crawl budget. In addition to the nofollow attribute, if a page is disallowed in robots.txt, a meta robots tag, e.g. `<meta name="robots" content="noindex, nofollow">`, should be added directly to that page. Pages disallowed in the robots.txt file should also not be included in the sitemap. All of this should be set up dynamically during website construction: if a page disallowed in robots.txt is linked within the site, adding nofollow to all of those links and adding a meta robots tag to the page itself should happen automatically.

6- Sitemap – Sitemap.xml

The sitemap is a kind of navigation file that facilitates crawling for search engine bots visiting the website. Sitemaps should be generated dynamically, not manually. All pages listed for the user, such as service, blog, and product pages, should be included in the sitemap. While generating the sitemap dynamically, the primary reference must be the robots.txt file: if a page or URL is blocked from crawling with a disallow rule, it must not be included in the sitemap. Sitemaps must have the .xml extension and be readable. For an example sitemap model, see the Google documentation:
https://developers.google.com/search/docs/advanced/sitemaps/build-sitemap

7- Pagination & Infinite Scroll

Pagination must be used, especially on e-commerce sites' product listing and blog listing pages. Listing all products or content of a category on a single page negatively affects browser performance and user experience. To prevent this and increase page efficiency, a pagination structure should be used. There are several types: pagination by page numbering, and infinite scroll pagination, where content loads as the page is scrolled.
Among these types of pagination, the most widely used today are Load More and Infinite Scroll, because content loading as the page is scrolled, instead of jumping between separate pages, improves the user experience. Google also supports and recommends search-friendly infinite scroll; see the Google documentation:
https://developers.google.com/search/blog/2014/02/infinite-scroll-search-friendly

8- Language / Hreflang Tags

If a website offers multiple languages, hreflang tags must be used to avoid duplicate-content issues and to serve more relevant results in location-based searches. On each language version of a page, these tags should mark the alternative language pages. For example, if a page that opens in Turkish by default has an English version, the Turkish page would carry `<link rel="alternate" hreflang="en" href="https://sitename.com/en/" />`, and the English page would carry the corresponding `hreflang="tr"` tag pointing back to the Turkish version.

9- Structured Data Markup

Structured data markup is a kind of schema markup that allows a web page to be interpreted more easily by search engines. There are many types of structured data markup for web pages; the most commonly used ones include the following.

Organization Markup

Organization markup should be placed only on the homepage of a website and represents the website's business card: the organization behind the site, contact information, and so on. Organization markup can be done as follows:

```json
{
  "@context": "http://schema.org",
  "@type": "Organization",
  "name": "SITE NAME",
  "logo": "LOGO URL",
  "url": "https://sitename.com/",
  "email": "SITE EMAIL",
  "address": "COMPANY ADDRESS",
  "telephone": "CONTACT PHONE"
}
```

Breadcrumb Markup

Breadcrumb markup presents the existing breadcrumb structure more neatly to search bots, so we strongly recommend using it on all pages to show hierarchy. The breadcrumb markup placed on an inner page should mark all categories that precede it hierarchically.
Example:

```json
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    { "@type": "ListItem", "position": 1,
      "item": { "@id": "https://sitename.com/", "name": "Home" } },
    { "@type": "ListItem", "position": 2,
      "item": { "@id": "https://sitename.com/category", "name": "Category Name" } },
    { "@type": "ListItem", "position": 3,
      "item": { "@id": "https://sitename.com/category/product-name", "name": "Product Name" } }
  ]
}
```

Product Markup (For E-Commerce Sites)

Product markup is used on product pages and directly contains information about the product. Example:

```json
{
  "@context": "http://schema.org/",
  "@type": "Product",
  "name": "PRODUCT NAME",
  "url": "PRODUCT URL",
  "description": "PRODUCT DESCRIPTION",
  "sku": "PRODUCT SKU",
  "brand": { "@type": "Brand", "name": "BRAND NAME" },
  "offers": {
    "@type": "Offer",
    "url": "PRODUCT URL",
    "priceCurrency": "CURRENCY, e.g. USD",
    "price": "PRODUCT PRICE",
    "availability": "http://schema.org/InStock",
    "seller": { "@type": "Organization", "name": "BRAND NAME" }
  }
}
```

Service Markup (For Service Pages)

For sites that sell services such as consulting or education, Service markup can be used instead of Product. Example usage:

```json
{
  "@context": "https://schema.org/",
  "@type": "Service",
  "name": "SERVICE NAME",
  "description": "SERVICE DESCRIPTION",
  "provider": { "@type": "Organization", "name": "PROVIDER NAME" },
  "url": "SERVICE URL"
}
```

BlogPosting Markup (For Blog Pages)

BlogPosting markup is used for blog pages on the website.
Example usage:

```json
{
  "@context": "https://schema.org",
  "@type": "BlogPosting",
  "mainEntityOfPage": { "@type": "WebPage", "@id": "BLOG URL" },
  "headline": "BLOG TITLE",
  "image": [ "FEATURED IMAGE URL" ],
  "url": "BLOG URL",
  "datePublished": "PUBLISH DATE",
  "dateModified": "LAST UPDATE DATE",
  "author": { "@type": "Organization", "name": "SITE NAME" },
  "publisher": {
    "@type": "Organization",
    "name": "SITE NAME",
    "logo": { "@type": "ImageObject", "url": "SITE LOGO" }
  },
  "description": "BLOG DESCRIPTION"
}
```

FAQ Markup (For FAQ Pages)

FAQ markup is used for frequently-asked-questions pages. Its most important benefit is that the questions can appear directly in Google's search results as rich snippets. Example FAQ markup usage:

```json
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    { "@type": "Question", "name": "QUESTION 1",
      "acceptedAnswer": { "@type": "Answer", "text": "ANSWER 1" } },
    { "@type": "Question", "name": "QUESTION 2",
      "acceptedAnswer": { "@type": "Answer", "text": "ANSWER 2" } },
    { "@type": "Question", "name": "QUESTION 3",
      "acceptedAnswer": { "@type": "Answer", "text": "ANSWER 3" } }
  ]
}
```

10- Breadcrumb Structure

A breadcrumb is a navigation element that makes moving around the site easier. It lets users see hierarchically which page they are on and which category the page belongs to, and navigate easily between categories and pages. The breadcrumb structure must be generated dynamically and placed at the top of the page. An example breadcrumb for an e-commerce site:

Home > Category > Product Name

For more information, see the W3Schools documentation:
https://www.w3schools.com/howto/howto_css_breadcrumbs.asp
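Such a breadcrumb can be sketched in plain HTML (class names are arbitrary):

```html
<!-- Breadcrumb placed at the top of a product page -->
<nav class="breadcrumb">
  <a href="/">Home</a> &gt;
  <a href="/category">Category</a> &gt;
  <span>Product Name</span>
</nav>
```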

How to Prepare Your Brand for Black Friday in 3 Steps
The big November sales are starting! Is your brand ready for Black Friday? Follow these 3 steps to get the most efficient results for your brand during this hectic period.

1. Plan Your Black Friday Campaign Calendar

The planning stage before Black Friday begins is crucial. First, define your campaign calendar and decide which audience you will contact, when, and how. Create different strategies and plans for before, during, and after Black Friday.

2. Forecast Your Budgets

To make accurate forecasts, use tools like Performance Planner, Keyword Planner, and Google Trends, and review last year's Black Friday data in Google Analytics. This will help you estimate more accurately and manage your budgets effectively. When planning budgets, consider all dynamics: media mix, channel distribution, brand and programmatic plans for maximum visibility, and include third-party channels in your plan.

3. Divide Your Plans into Phases

Historical data shows that shopping interest ramps up before November, peaks during Black Friday week, and then tapers off afterward. Therefore, split your planning into three phases: pre-sale, sale week, and post-sale.

a) Pre-Black Friday Phase

Because competition and costs spike during the sale week, start your campaign early to build awareness and expand your remarketing audiences.
By driving traffic to your site before the sale at lower cost, you'll have data ready to retarget more efficiently when costs rise. Be sure your promotional visuals and videos are ready well before peak week; dynamic creatives work best for awareness. Update your CRM segments with last year's online and offline purchasers in Google Ads and Facebook, so you don't miss any high-value customers. Also, ensure your website infrastructure can handle peak traffic: verify Google Tag Manager, conversion tags, and Facebook events before the rush. On the product side, check inventory levels and variety for your best-sellers and planned discount items to avoid stockouts during the surge.

b) Black Friday Week

In Google Ads, watch for "limited by budget" alerts; manual-bidding campaigns can run out of budget quickly when volume spikes. Wherever possible, use automated bidding strategies to stay competitive. To broaden your reach, diversify campaigns: alongside programmatic and third-party buys, use Google Discovery, YouTube, and Dynamic Search Ads, and add all relevant ad extensions. On Facebook, build separate campaigns for awareness, consideration, and conversion, each with tailored messaging.

Prioritize low-competition categories to avoid skyrocketing costs. In search campaigns, favor long-tail keywords over ultra-competitive terms like "Black Friday" or "Cyber Monday." Launch new campaigns early so they finish learning before peak week. On Facebook, use Dynamic Creative; in Google Ads, use Responsive Display and Responsive Search Ads with varied extensions.

Traffic during peak week often lingers for a while afterward. Segment users by behavior, e.g. cart abandoners, and set up a post-sale phase with special offers like "We Miss You" coupons for those who added to cart but didn't purchase.

c) Post-Black Friday Phase

Since sales taper off gradually after Black Friday, plan for a smooth transition.
Prepare your post-sale creatives and messaging in advance.A common mistake is forgetting to pause sale campaigns afterward. Use automated rules in all platforms to turn off or adjust campaigns, and audit any leftover promotions from last season.

React & Google Analytics: How to Integrate GA4 in React?
It is undeniable that React and Google Analytics are among the most popular tools and libraries in the web-analytics community. Google Analytics is the most widely used web analytics tool, and it helps you easily track and retarget your users. Google Analytics 4 (GA4) blends your web and mobile app data together and provides more comprehensive measurement methods, such as personalized reports and analytics intelligence. It also gives you better insight into your digital marketing strategies compared to Universal Analytics (UA).

Known for being fast and simple, React is the most popular front-end JavaScript library in web development. Using Google Analytics on your React-based website gives you the following benefits:
- You can find out which countries your users drive traffic from and what demographics they have.
- You can see how much time your users spend on which pages.
- You can measure Enhanced Ecommerce and custom events.
- You can report bugs in your React application.
- You can measure user behavior for A/B tests in your application.

Assuming you have a Google Analytics 4 account and a React-based website, let's see how you can set up a healthy React Google Analytics property step by step. First of all, you need to create a GA4 property within your current Universal Analytics account. You can use the GA4 Property Setup Assistant for this: click Get Started and it will be set up instantly without any pre-configuration. You can confirm that the GA4 setup was successful from the Connected Property section.

Google Analytics 4 Measurement ID

Half of our work is done.
We now have the most important part, the GA4 Measurement ID, which starts with G-; with it, you can complete the GA4 installation on your React-based website.

React Google Analytics Integration

In the React ecosystem, Static Site Generators (SSGs) such as Gatsby and Next.js are generally used for page management, plugin support, CMS integration, site speed, and SEO. Let's go through the possible integration methods.

Adding the Gtag Script

First, install the react-ga package in your application:

```
yarn add react-ga
```

Then add the react-ga package to index.js or app.js:

```javascript
import ReactGA from 'react-ga';

const TRACKING_ID = "UA-12341234-1"; // YOUR_OWN_TRACKING_ID
ReactGA.initialize(TRACKING_ID);
```

One of the most common problems in React applications occurs in the rendering methods, specifically CSR (Client-Side Rendering) and SSR (Server-Side Rendering). You can access detailed information on rendering methods here. In SPA (Single-Page Application) setups, you should send your pageviews with history.listen, using react-router-dom, in order to prevent these rendering problems:

```javascript
import React from 'react';
import { withRouter } from 'react-router-dom';
import ReactGA from 'react-ga';

const RouteChangeTracker = ({ history }) => {
  // Report a pageview on every route change
  history.listen((location, action) => {
    ReactGA.set({ page: location.pathname });
    ReactGA.pageview(location.pathname);
  });
  return null;
};

export default withRouter(RouteChangeTracker);
```

Gatsby GTAG Plugin

If your website runs on the Gatsby engine, adding gatsby-plugin-google-gtag is recommended. First, install the plugin:

```
yarn add gatsby-plugin-google-gtag
```

Then update the gatsby-config.js file as follows:

```javascript
module.exports = {
  plugins: [
    {
      resolve: `gatsby-plugin-google-gtag`,
      options: {
        trackingIds: [
          "GA-TRACKING_ID",
        ],
        gtagConfig: {
          optimize_id: "OPT_CONTAINER_ID",
          anonymize_ip: true,
          cookie_expires: 0,
        },
        pluginConfig: {
          head: false,
          respectDNT: true,
          exclude: ["/preview/**", "/do-not-track/me/too/"],
        },
      },
    },
  ],
}
```

Because of SSR, you may also have to guard your custom events as shown below:

```javascript
typeof window !== "undefined" && window.gtag("event", "click", { ...data })
```

Adding the GTAG Script in Next.js

In the folder where your Next.js application is located, open the .env.local file and add your Measurement ID:

```
NEXT_PUBLIC_GOOGLE_ANALYTICS=
```

(If you deploy on Vercel, for example, you can easily define this variable there as well.) Then it is enough to add the snippet to the _document.js file:

```javascript
import Document, { Html, Head, Main, NextScript } from 'next/document'

export default class MyDocument extends Document {
  render() {
    return (
      <Html>
        <Head>
          {/* Global Site Tag (gtag.js) - Google Analytics */}
          <script
            async
            src={`https://www.googletagmanager.com/gtag/js?id=${process.env.NEXT_PUBLIC_GOOGLE_ANALYTICS}`}
          />
          <script
            dangerouslySetInnerHTML={{
              __html: `
                window.dataLayer = window.dataLayer || [];
                function gtag(){dataLayer.push(arguments);}
                gtag('js', new Date());
                gtag('config', '${process.env.NEXT_PUBLIC_GOOGLE_ANALYTICS}');
              `,
            }}
          />
        </Head>
        <body>
          <Main />
          <NextScript />
        </body>
      </Html>
    )
  }
}
```

To send custom events, you can use the following method:

```javascript
export const event = ({ action, params }) => {
  window.gtag('event', action, params)
}
```

As you can see, there are different Google Analytics 4 integration methods for different frameworks. However, no matter which React framework you use, you should first understand how Google Analytics pulls data from a web application; this will prevent measurement and integration errors on your website. You can access the source on how Google Analytics 4 works with the gtag script here.
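Building on the guarded call shown earlier, a minimal SSR-safe event helper might look like this; the function name and return values are illustrative, not part of any library:

```javascript
// Sketch: SSR-safe wrapper around gtag custom events.
// Assumes the gtag snippet has already defined window.gtag in the browser.
function sendEvent(action, params) {
  if (typeof window !== "undefined" && typeof window.gtag === "function") {
    window.gtag("event", action, params);
    return true;  // dispatched in the browser
  }
  return false;   // no-op during server-side rendering
}
```

On the server, `window` is undefined, so the helper silently skips the call instead of throwing.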

Adopting a New Tool for Our Automated Jobs: Apache Airflow
Performance marketing combines advertising and innovation to help merchants and affiliates expand their businesses in every aspect. Each retailer's campaign is carefully targeted, ensuring that everyone has a chance to succeed and win. When the operations on each side are done correctly, performance marketing offers a win-win situation for both merchants and affiliates.

As the Tech Team, we decided to write new blog posts about the digital marketing projects we build with our software engineering skill sets. We mainly produce new solutions for different brands, and our main goal is to increase their performance with data analysis and automation projects.

What do we produce as a team?

As a tech team, we adopted the modern performance marketing ideology, and as a result we built an automation project using Airflow to schedule product reports for our customers. In these product reports, we gather information such as unique codes, availability, discounted prices, discount percentages, and many more product features by visiting the product URLs. We also rely heavily on Google Sheets to add monitoring and reporting power to our automation projects, and we used Docker container technology to run the Airflow environment in an isolated environment, avoiding problems we might face during the deployment phase.

To be clear, Airflow is an automation tool for creating data pipelines for multiple purposes. The main reason we work with Airflow is that, rather than cron jobs, Airflow provides a UI to monitor all processes in almost real time. In addition, analyzing the logs on the platform has a significant impact on catching and fixing errors during the process.
Moreover, when an error occurs during processing, the configuration file lets us receive an email for any kind of error, which allows instant intervention. The whole process depends on scraping; on the performance side, we mainly use parallel threads working asynchronously, so we can scrape all the data for a large set of URLs in minutes. We designed multiple virtual machine templates on Google Cloud that can run all the required tasks in a given order. The most significant part of that order is delivering data through the processes in a single template for the upcoming reports on the Airflow platform.

What is Airflow?

Let's dig deeper into the main structure of Airflow to understand how it works and how we can adapt this data pipeline environment to different projects in the future. "A DAG specifies the dependencies between Tasks, and the order in which to execute them and run retries; the Tasks themselves describe what to do, be it fetching data, running analysis, triggering other systems, or more" [1]. Basically, we can think of Airflow as a far more capable version of cron jobs. In Airflow, workers operate all the Tasks as threads.

First of all, all Airflow tasks run inside a pre-structured object named a DAG, the term for each job we schedule on the Airflow system.

1- We can define DAGs using a context manager. For example:

```python
with DAG(
    "Company_X_Product_Report",
    schedule_interval='@daily',
    catchup=False,
    default_args=default_arguments,
) as dag:
    ...
```

The DAG structure has many features, such as time scheduling, naming, and retry options in case of errors, which help us regulate each job in the data pipeline environment. In addition, we have default arguments to add more features to the DAG structure.
For example:

```python
default_arguments = {
    'owner': 'AnalyticaHouse',
    'start_date': days_ago(1),
    'sla': timedelta(hours=1),
    'email': ['analyticahouse@analyticahouse.com'],
    'email_on_failure': True,
}
```

2- We used multiple functions to operate each task in a DAG. The whole process can be expressed as:

```python
url_task >> scrape_task >> write_to_sheet_task >> find_path_task >> parse_message_task >> write_message_task
```

3- We separate each task into a corresponding function using PythonOperators. These operators are responsible for the Python functions handling the required duties. The most used operators are BashOperator and PythonOperator, for executing bash and Python scripts; in our project, we mainly used PythonOperators to execute each function one by one. Each scheduled Python function needs to be wrapped in this format so that Airflow will interpret it as a job. We used more than five main functions to operate the whole scheduled job; you can think of it as pushing the scheduled jobs onto a stack: all the PythonOperators are added to a line and executed one by one, in a row. If one of the tasks fails (an exception is raised), the whole run fails and Airflow informs us.

```python
url_task = PythonOperator(
    task_id='get_url_data',
    python_callable=getUrlData,
)
```

Combining the Airflow Architecture with Docker

Docker is a free and open platform for building, delivering, and operating apps. Docker allows you to decouple your apps from your infrastructure, so you can release software swiftly. We used Docker in our development stages, and at the end of the day we created a Docker image based on Apache's official image on Docker Hub [3], with small changes such as network bridging and port mappings.
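A minimal sketch of such a customized image; the file name and version tag here are illustrative, not our exact setup:

```dockerfile
# Extend the official Airflow image from Docker Hub
FROM apache/airflow:2.3.0
# Illustrative: add the Python packages the scraping tasks need
COPY requirements.txt /requirements.txt
RUN pip install --no-cache-dir -r /requirements.txt
```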
Basically, while developing the Airflow environment, we faced various minor and major problems, including package incompatibilities and version conflicts. Before we started working with Airflow, we researched how we could adapt Docker to the Airflow project, and then held several meetings to reach a consensus on the advantages Docker could bring. During development we got stuck at many points, and every time we hit a wall, rather than removing Airflow from the server, we simply deleted the container we had created and re-deployed it from the same Docker image. On top of that, composing and creating Docker containers saved us a great deal of time during the development part of the project.

Advantages:
- Time saved in case of unexpected errors during the development phase
- Easy configuration of networking and volume features for the project
- An isolated environment, which is good for managing small bugs and errors
- A containerized structure that makes configuration easy for each build

Challenges during the development

Before we started implementing the project, we brainstormed extensively to figure out its critical parts, because every time we came up with an architectural design, we kept adding more features on top of it to make it the best in the industry. We are a team that shares the ideology of building products impeccably, and as a result, the starting point of the project differs from the end product. First, we built the project on a Google Cloud VM; then, as we developed other projects, it became harder to manage and deploy all of them, so as a team we decided to work with Docker to make everything easier. We are a young team eager to learn and build new projects with the latest, reliable technologies.
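Crashes like these also made defensive error handling a priority. The sketch below is a hedged illustration of the idea, not our actual module: each task is wrapped so that any exception is recorded as an alert (standing in for the e-mail Airflow sends via `email_on_failure`) before being re-raised, so the task is still marked as failed. The names are hypothetical.

```python
import traceback

alerts = []  # stand-in for the e-mail alert channel configured in Airflow

def notify_on_error(task_name):
    # Wrap a task so any exception is captured as an alert record and
    # then re-raised; re-raising keeps the task marked as failed.
    def decorator(func):
        def wrapper(*args, **kwargs):
            try:
                return func(*args, **kwargs)
            except Exception as exc:
                alerts.append({
                    "task": task_name,
                    "error": repr(exc),
                    "trace": traceback.format_exc(),
                })
                raise
        return wrapper
    return decorator

@notify_on_error("write_report")
def write_report():
    # Illustrative failure: the real task writes the daily product report.
    raise ValueError("missing product data")

try:
    write_report()
except ValueError:
    pass

print(alerts[0]["task"])  # write_report
```

The captured traceback is what makes the alert actionable: instead of discovering an undelivered report the next morning, we see which task failed and why.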
In that manner, we also implemented error-catching modules, because while developing the project we faced server and Airflow crashes for no apparent reason. Catching errors is essential for us: the customers we work with need to receive product reports daily, and in the flow of the whole automation project, any uncaught error results in undelivered reports.

References
[1], [2] https://airflow.apache.org/docs/apache-airflow/stable/concepts/overview.html
[3] https://hub.docker.com/r/apache/airflow

The Key to Personalizing your Content: Persona Building for Digital Marketing
If you’ve ever wondered how classical marketing has evolved over the years into personalized marketing, this blog post presents one of the building blocks of the matter. The digital age presents us with a constantly evolving environment when it comes to marketing practices. While some brands are flexible and agile in keeping up with these developments, others have a much harder time adapting. Our deep dive will begin by explaining how classical marketing is evolving into personalized marketing, and continue by explaining the importance of building spot-on personas that align with our key performance indicators. Classical Marketing vs. Personalized Marketing We’ve been observing different ways of managing masses ever since the 1928 release of Edward Bernays’ Propaganda, which details how people can be manipulated into taking action around a common goal. It is safe to say that the classical understanding of marketing formed around this idea, long before the concept of digital. Before, products were presented by creating a hype around them, a hype that assures individuals they are missing out on something by not owning them. However, in modern society, where there are innumerable products and services, is there really one type of hype that motivates all individuals? This is exactly where personalized marketing comes into play. Individuals motivated by the exact same needs no longer exist in modern society. Sure, there are some commonalities between different individual needs, but there are many varieties that did not exist beforehand. Whether you are a world-famous, well-rooted brand or a beginner, one fact applies to all: we must acknowledge the existence of not one but multiple different target audiences that our brand can attract. Later on, we must find the common purchasing motivations within these audiences and form our ad creatives accordingly. But let’s not get ahead of ourselves. What is a Persona?
A persona is basically a fictional character that we create in marketing studies in order to get to know our target audiences. Personas are defined by demographic details along with their interests, behaviors, motivations, fears and goals. These characteristics should align with our brand’s value proposition and be identified accordingly. Creating personas can be interpreted as designing interactions with our audiences. In our personal interactions, as we understand what someone is motivated by, we shift the way we talk to them. It is a very similar case for a brand. As a brand, we have an established tone of voice, and we shift our messages according to the wants and needs of our personas. Now that we know the “what” of personas, let’s move on to the “how” of creating them. How to create personas for marketing operations We’ve established that personas are fictional characters formed to represent certain percentages of a population. However, we must not rely on our imagination while creating them. Characteristics of a persona should be defined according to research and/or data gathering processes. If you are a well-established brand already engaged in digital operations, the best thing to do is to analyze your existing customers and group them under certain criteria. This part may sound like classical CRM methodology, and we cannot say the two are totally unrelated. The real challenge is for new brands, or brands that wish to reach a completely different audience profile than the ones that existed beforehand. The first thing we need to do is market research. As any source on marketing will tell you, knowing your audience is key. However, the research should not stay limited to the people who are most likely to use our products. If our product is a rather expensive one, we can assume that it will be more desirable to people with higher income. But is that all?
Aren’t there moments in other people’s lives where they save up intensely in order to become an owner of our product? The answer is yes. Therefore, our biggest challenge is to determine these moments for our audience and choose where we will catch their attention. That is why, when creating personas for marketing operations, aspects such as motivations, fears and interests are much more relevant and important than generic demographic segmentations. In order to presume our audience’s moments of intent, we need to look into their other habits so that we can decide where in their life journey our products or services would fit best. If you wish to dig into this way of thinking further, you can start by researching what an empathy map is, but that is not what we are mapping out today. Getting to know our audience is done through data gathering. Whether the data comes from website cookies or general publicly available research findings, we need to make sure that our presumptions about our audience are as accurate as can be. Not sure where to start? Well, there is always Google Trends, which shows the related searches your audience is more likely to make if they show an interest in what you offer. There are websites such as Statista that publish research reports and give you a wider understanding of your audience’s habits. Keyword Planner is also a crucial source, as it offers related search terms that might be outside of our scope when trying to understand our audience’s behavior. When you think you have enough distinct characteristics to compile under different personas, start creating the structure. If some personas have many more motivations or fears than others, perhaps they can still be divided into separate personas. If they have fewer motivations or fears than others, perhaps there isn’t much need to take them into account separately. After your personas are done, start building your creative structure accordingly.
How to create personalized communication for different personas Now that we have decided on our personas, it’s time to decide how we communicate with them. Since we know that all of our personas are differentiated by their motivations, we need to come up with different visual assets and texts that will speak to those motivations the most as they encounter our brand. Do we use video assets or static ones? Do we ask intriguing questions of our audience, or do we simply state how awesome we are? The answers all depend on how we’ve built our personas and how we think we can best catch their attention. Let’s talk about Spotify for a moment. It provides the ability to stream music, and everyone likes music, right? Then why does Spotify use different ad creatives? Well, it has several different premium packages that apply to different types of people. Everyone listens to music, but not everyone’s motivations for listening to music are the same. Even our own motivations may differ during the day! That’s why it is crucial to show different ad visuals that will appeal to different kinds of emotions and users. Taking a look at some of these visuals, it is possible to compile them under four different motivations to use Spotify. The first image reflects capacity, implying that it is an all-in-one platform for your music needs and you will most likely find what you are looking for there. The second image talks about finding your own beat, which emphasizes your individual music taste: no matter how typical or edgy you are, you have a place within Spotify! The third image talks about finding your own rhythm, suggesting that no matter how your mood varies throughout the day, Spotify has the song to match your level of emotion. The last image focuses on the price aspect of Spotify, which appeals to nothing more particular than financial benefit. Depending on our mood, any of these visuals may have an effect on us at some point.
Regardless, by choosing different benefits to talk about all at once, they are likely to hit all their potential users in the right spot. It is important to note that there is a prejudice against personalized marketing: it is believed to be the costlier way to go. While that may be true in some cases, spending a large budget on a standard mass audience does not allow you to test what you are doing right. Although mass targeting is a viable option for the awareness stage of the marketing funnel, we still need to understand what makes the funnel narrower and narrower.