Roleplay as a fearless Technical SEO who must pass through Google's Web Rendering Service (WRS), a legendary construct, as part of a mission to protect site visibility.
Panel: 'Think like a bot, rank like a boss' from BrightonSEO September 2019
PubCon, Lazarina Stoy. - Machine Learning in Search: Google's ML APIs vs Open...LazarinaStoyanova
The document discusses Google's ML APIs versus OpenAI's APIs and their applications for SEO and digital marketing tasks. It provides examples of how natural language processing APIs from Google and OpenAI can be used for tasks like text analysis, sentiment analysis, document classification, translation and content transformation. While both Google and OpenAI APIs are useful, the document recommends choosing the right API for each specific task based on its capabilities and limitations in order to get the best results.
Automate The Technical SEO Stuff by Michael Van Den Reym
In this talk, Michael will show you how to automate technical SEO tasks. You will learn how to schedule and compare crawls to spot technical errors faster, how to use RPA to speed up technical SEO audits and how to automatically optimize images. You will get inspired to execute technical SEO better and faster.
No More "It Depends" - Learn to Set your Visual SEO Resources #LondonSEOMeetu...Aleyda Solís
This document discusses how SEOs often answer questions with the vague response of "it depends" and provides better alternatives. It recommends developing reusable resources like diagrams, charts and frameworks to more clearly explain SEO scenarios, processes and criteria. This helps avoid vague answers, establish trust, and facilitate decision making. It also encourages analyzing activities and setting replicable systems to improve services and grow expertise.
Kleecks - AI-Martech as a game changer-DEF.pdfKleecks
This document discusses how AI and automation can revolutionize the SEO value chain. It argues that AI can enable a symbiotic relationship between SEO strategy and technical implementation by powering automated competitive analysis, strategy suggestions, and one-click fixes. This represents a new approach that can improve time-to-market and optimize implementation timescales through a no-code model.
SEO low hanging Fruit - Identifying High Impact Opportunities Fast #SEOforUkr...Aleyda Solís
Learn how to identify high-impact SEO opportunities in your SEO Process fast, going through common scenarios that you can use to maximize your SEO results.
Cost Effective Multilingual Content Optimization in An International SEO ProcessAleyda Solís
How to optimize your content in an international / Multilingual SEO process? Take a look at the criteria to take into consideration and tips to maximize results.
Martin McGarry - SEO strategy c/o England manager Gareth SouthgateMartin McGarry
This document outlines an SEO strategy inspired by Gareth Southgate's plan to improve the England football team. It presents a three step target setting framework: 1) Be competitive in 6-12 months, 2) achieve tournament success in 1-2 years, and 3) be number one in 2-3 years. Key metrics such as visibility, site health, traffic, and time on page are identified and targets are set for each step. Accompanying tasks are mapped to the metrics and timeline to execute the plan, such as link campaigns, technical fixes, and content updates. The strategy aims to provide structure through measurable goals and accountability.
The document discusses strategies for content creation targeting low search volume keywords. It notes that while some marketers ignore these keywords, they can be high intent terms that are likely to convert if addressed with relevant content. The document advocates mapping out related low search volume topics, creating templates with rules for metadata, and programmatically launching many pages to cover niche topics. When this was tested with a 100-page pilot, it led to 105% traffic growth and 25% higher conversions after expanding the program to over 5,000 pages. The conclusion is that low search volume keywords should not be ignored as they can find "precious" intent if addressed properly.
What Makes your SEO Fail (and how to fix it) #BrightonSEO Aleyda Solís
In this session I'll go through the main causes of SEO processes failure and provide actionable tips to overcome the most common challenges to achieve SEO success.
Thriving as an SEO Specialist: Frameworks & Tips to Manage Complex SEO ProcessesAleyda Solís
How to successfully manage an SEO process? Is about having Influence to earn support, Be fast and agile and Be consistent and error free. I explain how in this presentation!
Product, service and category page links (and how to get them) - Rebecca Moss...Rebecca Moss
Rebecca heads up the Digital PR team at JBH, delivering creative digital PR strategies for lifestyle brands. Having worked in SEO for more years than she would care to admit, Rebecca reveals in this presentation how the SEO industry has fallen out of love with large-scale hero campaigns and shifted back to the fundamentals of earning links using content marketing techniques.
How to Implement Machine Learning in Your Internal Linking Audit - Lazarina S...LazarinaStoyanova
This document contains the transcript of a presentation about incorporating machine learning into internal linking audits. The presentation discusses analyzing a website's internal link structure using machine learning techniques like topic modeling and fuzzy matching to identify opportunities for new or improved internal links. It provides a 6-step process for discovery, analysis, clustering content by topic, identifying link opportunities, prioritizing where to link, and measuring the impact of implemented links. The goal is incremental improvements to internal linking that can boost SEO over time through better content organization and discoverability.
BrightonSEO - Master Crawl Budget Optimization for Enterprise WebsitesManick Bhan
For every website on the internet, Google has a fixed budget for how many pages their bots can and are willing to crawl. The internet is a big place, so Googlebot can only spend so much time crawling and indexing our websites. Crawl budget optimization is the process of ensuring that the right pages of our websites end up in Google’s index and are ultimately shown to searchers.
Google’s recommendations for optimizing crawl budget are rather limited, because Googlebot crawls through most websites without reaching its limit. But enterprise-level and ecommerce sites with thousands of landing pages are at risk of maxing out their budget. A 2018 study even found that Google’s crawlers failed to crawl over half of the webpages of larger sites in the experiment.
Influencing how crawl budget is spent can be a more difficult technical optimization for strategists to implement. But for enterprise-level and ecommerce sites, it’s worth the effort to maximize crawl budget where you can. With a few tweaks, site owners and SEO strategists can guide Googlebot to regularly crawl and index their best-performing pages.
Improving Crawling and Indexing using Real-Time Log File InsightsSteven van Vessum
Interesting insights from log files are often kept within silos, not shared with content teams.
This prevents content teams from reaching their full potential.
Learn how they can improve crawling and indexing through leveraging insights from log file analysis, all in real-time.
SEO Tool Overload😱... Google Data Studio to the rescueNils De Moor
Google Sheet Template >>> http://bit.ly/seotooloverload-sheet
Ask any person in SEO what tools they use, and you'll more likely than not get a long list in response. SEOs need different perspectives, the right tool for the right job, but with an explosion of data produced by these tools, things get overwhelming really fast. To be able to tie things together, Nils will explore ways to streamline the data from your tools and build a single source of truth with Google Data Studio, helping you to make the right decisions.
You'll learn about using QUERY functions in Google Sheets, applying Machine Learning to do fuzzy matching on keywords and search queries, and much more...
---
Want access to the Google Sheets and Google Data Studio TEMPLATES --> bit.ly/seotooloverload-sheet
---
SEO-Campixx 2022 | Suchoperatoren auf SteroidenPaul Schreiner
Useful search operators for your next technical audit, content research, or for finding more backlinks. And all of it directly in Google Search.
Swipe left: Why your content is getting ghostedEleni Cashell
Creating engaging content is a tricky thing, and even if your work is perfectly targeted to your audience, with all the SEO research to back it up, it doesn’t guarantee engagement.
In this talk, Eleni will showcase some of the mistakes she’s made and seen first-hand, explain why this isn’t getting engagement, and reveal how to fix it. From making content more accessible and inclusive, to key research methods that are often ignored, this talk will show you how to turn content that’s being ghosted and ignored, into something that creates meaningful connections.
This document discusses how to build links that satisfy Google's E.A.T. (Expertise, Authoritativeness, Trustworthiness) guidelines. It introduces the concepts of E.A.T. and explains why links that increase the linked page's perceived quality are more valuable. Specific linking strategies are presented such as guest blogging, Wikipedia links, and Google My Business citations. The key takeaway is that links following E.A.T. principles help pages by boosting their perceived quality in search engine algorithms.
GretaMunari - The redemption of content automationGretaMunari1
In this presentation, you'll be walked through how Trainline scaled high-quality, human-written content to 4M+ pages without using AI tools and created customer-oriented pages while improving rankings along the way.
Rendering: Or why your perfectly optimized content doesn't rankWeLoveSEO
The document discusses how Google renders webpages for indexing. It explains that Google uses the Chromium browser and its components like Blink, V8, and the headless Chrome browser to render pages. The rendering process involves crawling the initial HTML, extracting links and resources, loading necessary scripts and files, and finally rendering the fully assembled page. Issues like undiscoverable links, blocking resources, dependencies, and client-side rendering can cause content to be missing from what Google indexes. The document provides tips to improve rendering such as ensuring visibility, adding diagnostics, and taking iterative steps.
Are you there Page Experience? It's Me, DevTools.Rachel Anderson
This document summarizes a presentation about using Chrome DevTools to test and optimize websites for Google's upcoming Page Experience update. It discusses the components of the update like mobile-friendliness and core web vitals. It provides guidance on using DevTools to test for issues in areas like intrusive interstitials, HTTPS security, and core web vitals metrics. The presentation emphasizes that field data may differ from lab tests in DevTools, and it outlines many resources for further information on the Page Experience update and related topics.
Are you there Page Experience? It's me, DevToolsJamie Indigo
BrightonSEO, March 2021
With Google's Page Experience ranking signal update rolling out in May 2021, you're running out of time to put in the budget line items for all the fancy SEO tools you'll need! Don't panic. Rachel Anderson and Jamie will show you how to optimize for humans (and algorithm updates) using an underestimated SEO ally: Chrome DevTools.
JavaScript SEO Ungagged 2019 Patrick Stoxpatrickstox
Patrick Stox is a product advisor, technical SEO expert, and brand ambassador at Ahrefs. He speaks at various conferences and organizes several meetup groups. He has judged various search awards and is a founder of the Technical SEO Slack group. Stox provides advice on JavaScript frameworks, headless CMS, code splitting, and best practices for JavaScript sites to be search engine friendly. He notes the challenges search engines face in rendering JavaScript content at scale.
1) JavaScript is widely used on modern websites to make pages more interactive and dynamic. It can impact search engine optimization (SEO) by modifying the HTML that search engines see.
2) Search engines like Google handle JavaScript by executing it to see the final rendered page, but this is complex and resource-intensive. It involves a two-pass indexing process where content is initially indexed from the static HTML then re-indexed after JavaScript rendering.
3) JavaScript can impact SEO positively or negatively depending on how it is used. Examples of positive impacts include rendering important product links or content, while negative impacts include missing metadata, errors during rendering, or slow performance.
This document summarizes best practices for architecting and optimizing Ajax applications. It discusses how application architectures have evolved from traditional MVC to a more dynamic model with code running in both the server and browser. It provides guidance on improving performance through proper markup, understanding browser specifics, optimizing network usage, caching strategies, and reducing DOM manipulation. The document emphasizes that performance must be a primary consideration in Ajax application design.
Split into two parts: the first covers the basics of web user interfaces; the second covers Protractor/Jasmine testing (structure, setting configs, spec and test examples).
STP 2014 - Lets Learn from the Top Performance Mistakes in 2013Andreas Grabner
1) Performance issues often stem from architectural decisions, disconnected teams, flawed implementations, pushing changes without proper planning, blindly reusing components, and lack of agile deployment practices.
2) Common metrics that help identify performance problems include number of requests/user, log messages, exceptions, objects allocated/in cache and cache hit ratio, images, SQL statements, SQLs per request, HTTP status codes, and page size.
3) Tracking key performance indicators and metrics across automated unit and performance tests can help identify regressions and keep performance/architecture in check.
Do SEOs Need to Know About Chromium? Of CORS! Extended Edition - BrightonSEO ...Jamie Indigo
Presented at BrightonSEO September 2021
Did you know that secrets about Google's Web Rendering Service are hiding in plain sight? Discover the relationship between Chromium and Google Search so you can leverage this open-source technology to discover technical SEO issues on your site.
Let us share with you a deep love of Chromium. Chromium runs Chrome. It also runs Google Search's Web Rendering Service. If Chromium adopts it, Google Search adopts it. Join in the love story so you can leverage this open-source technology to discover technical SEO issues on your site.
How to make React Applications SEO-friendlyFibonalabs
SEO (search engine optimization) is important for businesses to be visible in search results. React applications can be challenging for SEO because search engines cannot see content rendered by JavaScript. Some key challenges are delays in indexing content, slow page loads, and inability to read metadata and create sitemaps. Techniques like prerendering, server-side rendering, and tools like React Helmet and Next.js help make React apps more SEO-friendly by rendering content on the server so search engines can index pages fully.
Search engines have come a long way in understanding JavaScript, but issues with rendering and load times can still impact your crawl budget and prevent search engines from indexing valuable content!
Finding the optimal solution that provides the best user experience, whilst also satisfying the bots can be a challenge. This talk will cover the differences between these solutions, a number of tools and metrics you can use, and other significant considerations to take into account when proposing a rendering solution to your developers.
SMX Advanced 2018 SEO for Javascript Frameworks by Patrick Stoxpatrickstox
The document discusses SEO considerations for JavaScript frameworks. It notes that SEOs need to understand how JavaScript works and how search engines handle it, as many developers are not familiar with SEO. It provides tips for SEOs, including that search engines don't interact with the page content in the same way users do, and content should be loaded by default without user interaction. It also discusses different approaches to rendering pages for search engines like server-side rendering versus client-side rendering.
BrightonSEO, July 2021 - To better understand a website's content, search engines developed Web Rendering Services and are now able to render pages more or less like a normal user. Those Web Rendering Services are closely connected to the other phases of the crawling-indexing-ranking pipeline: if rendering fails, it may affect all of them. In this session Giacomo will guide you through the process of understanding why rendering can be a problem even for non-JavaScript pages, how to manually debug page rendering, the difference between understanding WRSs' capabilities and debugging problems on a website, and eventually how to test pages at scale.
The document provides an overview of web development. It discusses how the web was created in 1989 by Tim Berners-Lee and the initial technologies of HTTP, HTML, and URLs. It then explains how a basic web application works with a browser connecting to a web server to request and receive HTML files and other resources. The document also summarizes key concepts in web development including front-end versus back-end code, common programming languages and frameworks, database usage, and standards that allow interoperability across systems.
Deep crawl the chaotic landscape of JavaScript Onely
The document discusses the challenges of indexing JavaScript-powered websites by search engines. It notes that JavaScript rendering takes significant computational resources, straining crawlers' budgets. It also suggests that client-side rendered JavaScript websites have difficulties with search engine indexing and ranking, as content may be missed during Google's two-wave indexing process for JavaScript. The document recommends using server-side rendering, hybrid rendering, or prerendering to help search engines properly index JavaScript websites.
Scraping the web with Laravel, Dusk, Docker, and PHPPaul Redmond
Jumpstart your web scraping automation in the cloud with Laravel Dusk, Docker, and friends. We will discuss the types of web scraping tools, the best tools for the job, and how to deal with running selenium in Docker.
Code examples @ https://github.com/paulredmond/scraping-with-laravel-dusk
PrairieDevCon 2014 - Web Doesn't Mean Slowdmethvin
Web sites can be fast and responsive once you understand the process web browsers use to load and run web pages. We'll look at using tools like WebPageTest to analyze and optimize web pages.
This document discusses SEO challenges for JavaScript applications and solutions used at ratemyagent.com.au. It covers using hashbangs and escaped fragments to allow search engines to crawl single-page apps. Prerendering techniques are explained like snapshot, on-the-fly, and server-side rendering. Their middleware uses PreRender_MVC and attempted various prerendering services. Coding tips provided include understanding Google's AJAX crawling specification, URL types for SEO, setting canonical URLs, and tools like Google Webmaster Tools.
Similar to How Googlebot Renders (Roleplaying as Google's Web Rendering Service-- D&D style) (20)
Rendering strategies: Measuring the devil's details in core web vitals - Jam...Jamie Indigo
Core Web Vitals are the result of how we render a page. For all this buzz, the battlefield fits in your pocket.
The battlefield for CWV is the initial viewport, AKA above the fold.
CWV are diagnostic output, the result of how quickly we complete the critical rendering path.
How we render affects how quickly we complete the critical rendering path.
What happens when you combine Mobile First Index, Performance, and JavaScript? You find the critical rendering path. This talk will look at how these 3 major components of search can guide your strategy and tactical ways to improve them.
Crafting Expertise, Authority and Trust with Entity-Based Content Strategy - ...Jamie Indigo
At SMXL, I presented a talk about crafting effective, authoritative content by understanding entities. People, places, objects, and ideas have facets. Human users have unique perspectives, and their language changes as their relationship to an entity changes. It's time we stop chasing keywords, a byproduct of search intent, in favor of a strategic, entity-based approach.
This deck includes insights into how to access the data behind Google's knowledge graph, use external links to boost the search engine's understanding, and ways to become an authoritative and trusted source.
Tech SEO + Site Migrations - SMX MunichJamie Indigo
The document provides a detailed checklist for SEO best practices when conducting a website migration. It outlines numerous steps to take during pre-migration planning such as conducting content inventories, setting key performance indicators, and establishing analytics and technical baselines. It also lists critical steps for go-live including verifying domain preferences, checking server logs and CDNs, and submitting sitemaps. Post-migration activities include monitoring redirects and ongoing site defense. The comprehensive checklist is intended to help SEOs effectively plan and execute a migration while minimizing negative impacts to organic traffic.
Technical Foundations of Successful Internationalization - SMX MunichJamie Indigo
Reaching the next billion users means breaking out of the high-speed, desktop-focused, Google-centric experience. In order to take the idea of going global from c-suite vision board to tactical reality, you’ll need to get technical. From servers to CDNs, hreflangs to JavaScript, Jamie will share the real-life lessons you'll need to identify, launch, and successfully monitor your growing international market. In this session Jamie will talk about server stability, location, and parity (or how I learned to stop worrying and love the world's trickiest game of telephone), the real cost of ccTLDs vs subfolders vs subdomains and shrinking screens and the rising costs of JavaScript.
Render v Rank SEO for JavaScript - SEMPDX EngagePDX 2019Jamie Indigo
Today, Jamie will go into one of the most valuable topics in technical SEO: rendering and JavaScript. 95% of sites use JS, so many that Google has had to reconsider how they crawl and index JavaScript-generated content. Let's look at the new rules for a dynamically generated web.
Optimizing with Server Logs | Jamie Alberico @ #TechSEO Boost 2018Jamie Indigo
We've all spent hours listening and researching how Google says they interact with our sites. Server logs are a critical view into how Googlebot actually interacts with your sites. Learn how to identify different Googlebot behaviors, crawl waste, and optimization opportunities.
Creating Effective Ecommerce Information Architecture #SearchLove 2018Jamie Indigo
Users come to your site on a quest. When they feel smart and sophisticated on your website, they tend to stick around. Information architecture focuses on organizing content and functionality so that your site presents the best user experience it can. Delve into how information architecture can help your website level up with this #SearchLove 2018 presentation from Jamie Alberico
Site structure is a vital part of any successful site. Crafting an optimal site architecture that provides logical navigation for visitors and enables Google to understand how your site is connected is one of the best ways to significantly increase your performance in search.
We're in a cross-device mobile-first world. Here are detailed steps on how to lift and shift your desktop design to meet user needs on the new (and much smaller) default experience.
How Googlebot Renders (Roleplaying as Google's Web Rendering Service-- D&D style)
1. Think like a bot,
Rank like a boss:
How Googlebot renders
Jamie Alberico // Not a Robot
SLIDESHARE.NET/JAMIEALBERICO
@JAMMER_VOLTS
2. Jamie Alberico
My name means Usurper Elf
King.
I’m a Technical SEO, Search
Advocate, & Wood Elf Druid.
Oh yeah, and I’m Not a
Robot.
#brightonSEO @Jammer_Volts
3. Masters of unlocking magic in everyday
objects, Technical SEOs are extremely
resourceful.
They see magic as a complex system
waiting to be decoded and controlled.
Proficiencies (recommended)
Chrome Developer Tools, Lighthouse,
Google Search Console, webcrawlers
Technical SEOs
Class Details
#brightonSEO @Jammer_Volts
4. Our Technical
SEO Quest
To protect site visibility by delivering
our content to Google’s index.
To do this, we must pass
through a powerful construct.
#brightonSEO @Jammer_Volts
5. When Googlebot retrieves your pages,
Googlebot runs your code and assesses your
content to understand the layout or structure of
your site.
What is Rendering?
#brightonSEO @Jammer_Volts
6. All information Google collects during the
rendering process is then used to rank the quality
and value of your site content against other sites
and what people are searching for with Google
Search.
How Google Search Works, Search Console Help Center
Rendering’s role in Rank
#brightonSEO @Jammer_Volts
7. Initial HTML
(1st wave of indexing)
Rendered HTML
(2nd Wave of indexing)
Rendering
#brightonSEO @Jammer_Volts
8. If Google cannot render the pages on
your site, it becomes more difficult to
understand your web content because we
are missing key visual layout information
about your web pages.
As a result, the visibility of your site
content in Google Search can suffer.
Rendering Risks
#brightonSEO @Jammer_Volts
9. Until 2018, we thought our quest looked
like this
Crawl
Index
Rank
#brightonSEO @Jammer_Volts
10. Now, we know that Rendering is part of the process
and that Google has two waves of indexing.
Crawl Index
Render
Rank
First Wave
Second Wave
#brightonSEO @Jammer_Volts
11. If Google can’t render content, we fail our quest
Crawl Index
#brightonSEO @Jammer_Volts
13. Google Web Rendering Service
Large Construct (legendary), lawful neutral
Languages HTML, CSS, JavaScript, Images
Skills Perception +12, Dexterity +10
Senses Robots.txt, Robots directives
#brightonSEO @Jammer_Volts
14. Takes action using threads
Each request is made by a thread. A thread is a single
connection. It moves sequentially through each action,
one at a time, until its task is complete.
Features & Traits Actions Equipment
#brightonSEO @Jammer_Volts
15. SEOs call this Crawl Budget
“Simply put, [crawl budget] represents the number of
simultaneous parallel connections Googlebot may use to
crawl the site, as well as the time it has to wait between
the fetches.”
What Crawl Budget Means for Googlebot, Google Webmaster Blog
Features & Traits Actions Equipment
#brightonSEO @Jammer_Volts
16. Stateless
● Does not retain state across page loads
● Local Storage and Session Storage data are cleared
across page loads
● HTTP Cookies are cleared across page loads
Features & Traits Actions Equipment
#brightonSEO @Jammer_Volts
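A minimal sketch of why statelessness matters. The page, key names, and promo content here are hypothetical; the point is that anything gated behind values saved on a previous visit never appears for a client that starts every load with empty storage.

  // Anti-pattern: this content only renders if a value was stored on a
  // previous visit. A stateless client (like Googlebot's renderer) starts
  // every page load with empty Local/Session Storage, so it never sees it.
  const hasVisited = localStorage.getItem('hasVisited'); // always null for a stateless bot

  if (hasVisited) {
    document.body.insertAdjacentHTML('beforeend', '<section>Welcome back! Here is the content.</section>');
  } else {
    localStorage.setItem('hasVisited', 'true'); // persists for humans, not for Googlebot
  }

  // Safer pattern: render the important content unconditionally, and only use
  // stored state for progressive enhancement (e.g. hiding a banner on repeat visits).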
17. Obedient
Obeys HTML/HTML5 protocol
Literal
“Googlebot, go to the apothecary and buy a
healing potion. If they have shields, buy 2. “
Googlebot comes back with 2 potions.
Features & Traits Actions Equipment
#brightonSEO @Jammer_Volts
18. Politeness is priority 0
Crawling is its main priority while making sure it doesn't
degrade the experience of users visiting the site. We call
this the "crawl rate limit," which limits the maximum
fetching rate for a given site.
What Crawl Budget Means for Googlebot, Google Webmaster Central Blog
Features & Traits Actions Equipment
#brightonSEO @Jammer_Volts
19. Multi-thread
Googlebot can execute more than one request at a time
if demand and server stability allow.
Features & Traits Actions Equipment
#brightonSEO @Jammer_Volts
20. Request URI
Googlebot sends a request for content at a uniform
resource identifier (URI).
Googlebot can discover a URL
via link or submission
Features & Traits Actions Equipment
#brightonSEO @Jammer_Volts
21. Read HTTP response and headers
Q. Does the thing I asked for exist?
A. HTTP Status Codes
Q. Anything I should know before looking at this?
A. Cache-Control, and Directives
Features & Traits Actions Equipment
#brightonSEO @Jammer_Volts
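A hedged sketch of asking those same two questions from a script (Node 18+ with the global fetch API; the URL is a placeholder): does the resource exist, and what do the headers say before we look at the body?

  // Run as an ES module with Node 18+ (global fetch). URL is a placeholder.
  const response = await fetch('https://example.com/some-page', { redirect: 'manual' });

  // Q. Does the thing I asked for exist?  A. The HTTP status code.
  console.log('Status:', response.status); // e.g. 200, 301, 404, 503

  // Q. Anything I should know before looking at this?  A. The headers.
  console.log('Cache-Control:', response.headers.get('cache-control'));
  console.log('X-Robots-Tag:', response.headers.get('x-robots-tag')); // robots directives can live here too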
23. Identify Resources
Googlebot identifies resources
needed to complete the request.
It feeds identified resources into
the crawling queue.
Features & Traits Actions Equipment
Use Network tab to see how many
resources a page calls
#brightonSEO @Jammer_Volts
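Alongside the Network tab, the Resource Timing API can give you the same count from the page itself. A minimal sketch to paste into the DevTools console:

  // Count the sub-resources this page requested, grouped by type.
  // (Resource Timing only reports what the browser actually fetched.)
  const resources = performance.getEntriesByType('resource');
  console.log('Total sub-resources requested:', resources.length);

  const byType = {};
  for (const entry of resources) {
    byType[entry.initiatorType] = (byType[entry.initiatorType] || 0) + 1;
  }
  console.table(byType); // e.g. { script: 14, img: 32, css: 3, fetch: 6 }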
24. Cache
If the requested website implements a cache, a copy of
the data is made or requested
Features & Traits Actions Equipment
#brightonSEO @Jammer_Volts
25. Actions
WRS, web rendering service
Features & Traits Equipment
Googlebot queues pages for both crawling and rendering. It is not
immediately obvious when a page is waiting for crawling and when it is
waiting for rendering.
WRS is the name used to represent the collective elements involved in
Google’s rendering service. Many details are not publicly available.
#brightonSEO @Jammer_Volts
27. Actions
WRS process
Features & Traits Equipment
1. A URL is pulled from the crawl queue
2. Googlebot requests the URL and downloads the initial HTML
3. The Initial HTML is passed to the processing stage which extracts links
4. Links go back on the crawl queue
5. Once resources are crawled, the page queues for rendering
#brightonSEO @Jammer_Volts
28. Actions
WRS process
Features & Traits Equipment
6. When resources become available, the request moves from the render
queue to the renderer
7. Renderer passes the rendered HTML back to processing
8. Processing indexes the content
9. Extracts links from the rendered HTML to put them into the crawl
queue
#brightonSEO @Jammer_Volts
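A toy model of the two queues in steps 1-9 above. This is an illustration only, not Google's implementation; the page objects, the link lists, and the strict alternation between queues are all assumptions made for the sketch.

  // Toy model: pages wait in a crawl queue, then in a render queue.
  // Links found in the initial HTML go back onto the crawl queue (first wave);
  // links that only exist after rendering are discovered later (second wave).
  const crawlQueue = ['/'];
  const renderQueue = [];
  const indexed = new Set();

  // Hypothetical site: links present in initial HTML vs. links injected by JavaScript.
  const site = {
    '/':             { htmlLinks: ['/about'], renderedLinks: ['/js-only-page'] },
    '/about':        { htmlLinks: [],         renderedLinks: [] },
    '/js-only-page': { htmlLinks: [],         renderedLinks: [] },
  };

  while (crawlQueue.length || renderQueue.length) {
    if (crawlQueue.length) {
      const url = crawlQueue.shift();                        // 1-2. pull URL, fetch initial HTML
      site[url].htmlLinks.forEach(l => crawlQueue.push(l));  // 3-4. extract links, re-queue them
      renderQueue.push(url);                                 // 5. queue the page for rendering
    } else {
      const url = renderQueue.shift();                       // 6-7. render when resources allow
      indexed.add(url);                                      // 8. index the rendered content
      site[url].renderedLinks.forEach(l => crawlQueue.push(l)); // 9. extract links from rendered HTML
    }
  }
  console.log([...indexed]); // '/js-only-page' is only discovered after rendering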
29. Chromium, headless browser
Equipment Actions Features & Traits
● Headless means that there is no GUI (visual representation)
● Used to load web pages and extract metadata
● reading from and writing to the DOM
● observing network events
● capturing screenshots
● inspecting worker scripts
● recording Chrome Traces
#brightonSEO @Jammer_Volts
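You can approximate this part of the pipeline yourself with headless Chromium via Puppeteer. A minimal sketch (this is not Google's WRS, and the URL is a placeholder): load a page with no GUI, grab the rendered HTML, and capture a screenshot.

  // npm install puppeteer
  const puppeteer = require('puppeteer');

  (async () => {
    const browser = await puppeteer.launch({ headless: true }); // no GUI
    const page = await browser.newPage();

    await page.goto('https://example.com/', { waitUntil: 'networkidle0' }); // placeholder URL

    const renderedHtml = await page.content();            // DOM after scripts have run
    await page.screenshot({ path: 'rendered-page.png' }); // what the renderer "saw"

    console.log('Rendered HTML length:', renderedHtml.length);
    await browser.close();
  })();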
30. Blink, browser engine
● Allows for querying and manipulating the rendering
engine settings (ex: mobile vs. desktop)
● Blink loves service workers. Blink may create multiple
worker threads to run Web Workers, ServiceWorker
and Worklets
Equipment Actions Features & Traits
#brightonSEO @Jammer_Volts
31. Blink, browser engine
Blink is responsible for 2 major elements:
Memory heap: stores the result of script execution
(Memory Heap results are added to DOM.)
Call stack: queue of sequential next steps
(Each entry in the call stack is called a Stack Frame.)
Equipment Actions Features & Traits
#brightonSEO @Jammer_Volts
32. Blink, browser engine
Local storage and Session storage are key-value stores
in the browser; values are saved as strings, so JS objects
must be serialized to be stored.
These keys are a weak point in your rendering offense
against a stateless Googlebot.
Equipment Actions Features & Traits
#brightonSEO @Jammer_Volts
33. V8, JavaScript engine
JavaScript is a single-threaded process and each entry or
execution step is a stack frame.
Googlebot can opt to run simultaneous parallel
connections.
Equipment Actions Features & Traits
#brightonSEO @Jammer_Volts
34. V8, JavaScript engine
Each thread runs through a process of:
1. Loading
2. Parsing
3. Compiling
4. Executing
Equipment Actions Features & Traits
#brightonSEO @Jammer_Volts
35. V8, JavaScript engine
● open-source JavaScript engine and WebAssembly
engine
● developed by Google & The Chromium Project
● Use in Node.js, Google Chrome, and Chromium web
browsers
Equipment Actions Features & Traits
#brightonSEO @Jammer_Volts
36. V8’s components
● Ignition, a fast low-level register-based JavaScript
interpreter written using the backend of TurboFan
● TurboFan, one of V8’s optimizing compilers
● Liftoff, a new baseline compiler for WebAssembly
Equipment Actions Features & Traits
#brightonSEO @Jammer_Volts
41. Define it for your site, by template
#brightonSEO @Jammer_Volts
42. Use clean, consistent signals
Googlebot won’t see past a noindex directive in initial HTML
to see an index placed in DOM.
Duplicative content without a canonical in initial HTML is
crawl waste until rendering.
Inconsistent title tags and descriptions can result from
overwriting the initial HTML with rendered HTML.
#brightonSEO @Jammer_Volts
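A sketch of the first trap above, with hypothetical markup: a noindex in the initial HTML is honored before rendering, so a script that flips it afterwards comes too late.

  // Initial HTML ships with:  <meta name="robots" content="noindex">
  // This script then tries to flip it after load. Because the noindex in the
  // initial HTML can be honored before the page is ever rendered, the "index"
  // written into the DOM may never be seen.
  const robotsMeta = document.querySelector('meta[name="robots"]');
  if (robotsMeta) {
    robotsMeta.setAttribute('content', 'index, follow'); // too late for a page skipped at the first wave
  }

  // Keep the signal clean instead: put the directive you actually want
  // (noindex or index, canonical, title, description) in the initial HTML.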
43. Focus rendering efforts with nofollow
If a resource is not necessary or beneficial to the
construction of the page, add a nofollow directive to it so
rendering effort stays focused on what matters.
#brightonSEO @Jammer_Volts
44. Mobile vs Desktop Rendering
Layout matters for both.
If you want to rank for
position zero, remember that
the content must be exposed
on initial mobile load.
#brightonSEO @Jammer_Volts
45. Choose the rendering strategy that’s
right for your business and stack.
You don’t have to be 100% client-side, 100% server-side, or
100% both (dynamic).
Load what matters when it matters.
#brightonSEO @Jammer_Volts
51. More page resources require more
rendering resources
Each resource must be fetched independently before the
page can be accurately rendered.
This is a major part of the issue with client-side rendering.
More client-side calls mean more blindspots for you.
#brightonSEO @Jammer_Volts
52. Excessive scripts
run the risk of
hitting thread/rest
thresholds.
This is most often
observed as an
"Other error".
#brightonSEO @Jammer_Volts
53. Call Stacks have a maximum size
While the Call Stack has functions to execute, the browser
can’t actually do anything else — it’s getting blocked.
#brightonSEO @Jammer_Volts
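A small sketch of the difference: one long synchronous loop holds the call stack (and blocks rendering) until it finishes, while chunking the same work with setTimeout lets the browser breathe between slices. The workload here is made up.

  // Blocking anti-pattern: the call stack is occupied until every iteration finishes.
  function processAllAtOnce(items) {
    for (const item of items) {
      // ...expensive work per item...
    }
  }

  // Non-blocking: do a slice of work, then yield so the browser can render
  // and handle events before the next slice runs.
  function processInChunks(items, chunkSize = 10000, index = 0) {
    const end = Math.min(index + chunkSize, items.length);
    for (let i = index; i < end; i++) {
      // ...expensive work per item...
    }
    if (end < items.length) {
      setTimeout(() => processInChunks(items, chunkSize, end), 0); // yield the call stack
    }
  }

  processInChunks(new Array(5000000).fill(0));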
54. Session and Local web storage limits
5MB per object, and 50MB per system
If your CSR resources are too large, you risk hitting the upper
limit. Elements in queue once the limit is reached may not be
considered by Googlebot.
#brightonSEO @Jammer_Volts
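Exact quotas vary by browser, but you can estimate how close a page is to its Web Storage limit. A rough sketch, approximating each character as two bytes (UTF-16):

  // Rough estimate of current localStorage usage for this origin.
  let usedChars = 0;
  for (let i = 0; i < localStorage.length; i++) {
    const key = localStorage.key(i);
    usedChars += key.length + (localStorage.getItem(key) || '').length;
  }
  const approxMB = (usedChars * 2) / (1024 * 1024);
  console.log(`~${approxMB.toFixed(2)} MB of local storage in use`);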
55. Load scripts & images without blocking
Asynchronous calls are supported with async attributes
<script src="myscript.js" async defer></script>
Lazy load images in Chrome with native attributes
<img src="the-traveler.jpg" loading="lazy">
#brightonSEO @Jammer_Volts
57. Don’t trust document.write()
Dynamic code (such as script elements containing
document.write() calls) can add extra tokens, so the parsing
process actually modifies the input.
#brightonSEO @Jammer_Volts
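If you need to add a script at runtime, appending a script element avoids document.write() and keeps parsing predictable. A minimal sketch with a placeholder URL:

  // Avoid: document.write('<script src="late-script.js"><\/script>');
  // Prefer: create the element and let it load without blocking the parser.
  const script = document.createElement('script');
  script.src = 'https://example.com/late-script.js'; // placeholder URL
  script.async = true;
  script.onload = () => console.log('late-script.js loaded');
  document.head.appendChild(script);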
66. Resources
● Get started with Chrome Developer
Tools
● HTML/HTML5 Parsing Standards
● Debugging your pages
● SimpleHTTPServer
● Ngrok
● Fix Search-related JavaScript
problems
● TurboFan overview
● Liftoff overview
● Tame the Bots Portals
● Blink Rendering, life of a pixel
● The Rendering Critical Path
● JavaScript Sites in Search Working Group
#brightonSEO @Jammer_Volts