Local SEO
Bradley Benner on Why Simple Websites Beat Keyword-Stuffed Pages | Local Marketing Secrets with Dan Leibrandt
Jul 8, 2024


I recently sat down with Bradley Benner, and this conversation completely changed how I think about local SEO. Bradley is the founder of Tree Care HQ, Semantic Mastery, Semantic Links, and Local Fury. He's been doing local SEO for over 15 years, and he started in the most relatable way possible: as an electrician who couldn't afford to hire a marketing agency.
We talked about the massive shift in Google's algorithm from keyword-based to entity-based search, why most SEO professionals are still stuck optimizing websites like it's 2020, and how Bradley discovered that reducing content actually improves rankings. He also shared his exact site structure that consistently ranks tree service companies with zero backlinks in low-competition markets.
From Pulling Wire to Building Websites
Bradley didn't start out as a marketing guy. He was an electrical contractor in Virginia who started his business around 2007 or 2008. For the first year or two, he was primarily a subcontractor working for general contractors and remodeling companies. But eventually, he wanted to generate his own leads instead of relying on other people to feed him work.
The problem was that Bradley had learned traditional marketing from his time in real estate: print advertising, direct mail, flyers, newspaper ads, and canvassing neighborhoods. He knew almost nothing about digital marketing and could barely operate a computer. So he started asking the contractors he worked with how they were getting leads, and most of the successful ones had hired someone to help them rank on Google.
Bradley couldn't afford to hire anyone, so he did what any determined contractor would do. He decided to teach himself. He built a test website for his electrical business, then created two more test sites for carpet cleaning and locksmith services. He had no clients in those other two industries; the sites were purely for experimentation.
Within about six months, Bradley got all three sites ranking at the top of Google. He was generating leads for his own electrical company and creating leads for carpet cleaning and locksmith jobs that he didn't even have businesses for yet. That's when he realized he could rent these sites out to contractors in those industries, which is called the rank and rent model.
Once Bradley rented out those first two sites to actual contractors, he saw the potential for a legitimate side hustle. From 2010 to around March 2012, he focused entirely on lead generation by building WordPress websites and Google Places profiles (what we now call Google Business Profiles), ranking them, and then pitching contractors to rent the sites from him.
The interesting thing about Bradley's early business was that everything was local to Virginia. That worked great during most of the year, but Virginia winters are tough enough that home construction and related work slow down significantly. After two winter seasons in which his SEO side income declined dramatically, he decided he needed to stabilize that income stream. That's when he opened Big Bamboo Marketing LLC in 2012, which he later rebranded as Tree Care HQ, and started offering traditional retainer-based services to local businesses.
About six months after starting his agency, Bradley had replaced his electrical contractor income completely. A year later, in 2013, he launched Semantic Mastery to teach other people how to do local SEO.
The Seven Dollar PDF That Built a Four Year Business Model
What really fascinated me about Bradley's early journey was how simple his ranking method was back then. He stumbled onto something called Warrior Special Offers, which was basically a marketplace for cheap digital marketing products. Bradley bought a seven dollar PDF guide written by a guy named David Kacic about something called automatic links. The guide taught him how to use IFTTT, which stands for If This Then That.
The guide showed Bradley how to build what he calls syndication networks. He would go out and create 15 to 20 web properties like social media accounts, external blogs (Blogger, Tumblr, WordPress, Weebly), bookmarking sites, and other platforms, all branded for a specific company. Then he would connect all of them to IFTTT so that whenever he published a blog post on the client's main website, it would automatically republish or syndicate to all those other branded properties.
Bradley told me something that blew my mind: "I swear to God for about the first four years of my local SEO agency that was all I did."
For four years, his entire business model was building these syndication networks. He would build links to those network properties using link building tools (which was safe to do back then), publish content consistently, and watch his clients rank like clockwork. He said within about two months he would rank every single client he touched.
Bradley eventually got smart and hired people to build these networks for him instead of doing it himself. He created training to teach virtual assistants how to build them so he could spend more time landing clients. That training he created for employees actually became Semantic Mastery's first product, which they launched as IFTTT SEO. About a year after they launched that course, they got a cease and desist letter from IFTTT's attorneys for trademark infringement, so they changed the name to Syndication Academy.
Why Bradley Only Works With Tree Companies Now
For years, Bradley thought he had niched down by only working with contractors. He was doing SEO for pest control companies, septic service companies, HVAC companies, plumbers, tree services, roofers, and all kinds of home service businesses. But he eventually realized that wasn't really a niche at all.
Each of those industries has its own keywords, vocabulary, and customer intent, and its own sense of what makes compelling content that actually gets results. After years of this grind, Bradley finally asked himself why he was doing this to himself. In 2019, he created a DBA for his main corporation and called his local SEO agency Tree Care HQ. From that point on, he would only take on tree service contractors as new clients.
That decision to truly niche down into a single vertical made a huge difference in his business. Once he did that, he was really able to start scaling because he only needed to master one industry.
In 2022, Bradley started a link building business called Semantic Links, which provides white label link building services specifically for local SEO agencies. That business has been really successful for the last two and a half years and now actually generates more revenue for Bradley than even his own local SEO agency does. He also started Local Fury, an app specifically designed to help improve organic maps ranking.
Bradley has now been doing digital marketing for about 15 years, almost exclusively focused on local SEO, and he's still fascinated with it.
The Algorithm Shift That Changed Everything
When I asked Bradley why he thinks local SEO is easier now than it was three or four years ago, he dropped some serious knowledge about how Google's algorithm has fundamentally changed.
The algorithm has shifted from a keyword-based system to an entity-based algorithm. For most people who aren't SEO professionals, Bradley explains this as a shift from keyword focus to topic and category focus.
Bradley explained that Google is nothing but a document retrieval algorithm at its core. It's very advanced, but at its simplest form, it's just a document retrieval algorithm. Google treats web pages as documents, and the SEO titles or meta titles of web pages are the document titles.
The way Google used to return results was based on a search query (a keyword). It would look in its database and try to identify which documents or pages were most relevant based on factors like keyword density, meaning how many times the keyword appeared within the content as a ratio of the total words on the page.
It was a very crude algorithm, and that's why it was so easy to manipulate back then. You could just stuff keywords into a page and rank based on having higher keyword density.
But around the end of 2021, Google's algorithm shifted. We're in the semantic web now; semantic web technologies have been fully adopted by Google. The semantic web is an entity-based algorithm, where an entity is a person, place, or thing. The way semantic search works is that artificial intelligence and machine learning understand what an entity is, and then determine how those entities relate to each other.
Google now has a deeper understanding of things than it did before. Because of large language models integrated into the search algorithm, it now has an understanding of human language, primarily in English.
Now, if you optimize for topics or categories, you can rank a page for every single search query that Google already has an understanding of that is associated with that topic or category, even if that keyword's not present on the page.
When Bradley talks about categories, he means the Google Business categories assigned to particular industries. In pest control, the actual Google Business category is "Pest Control Service." Topic optimization is the parallel concept for ranking in organic search. According to Bradley, Google created its Business categories from Wikipedia entities.
In Bradley's industry, the top level category is "tree service," but the top level topic term is "tree maintenance." He told me, "I can rank one page for every single search query associated with the tree service business industry if I optimize for tree service and tree maintenance, those two top level terms."
Most SEO professionals are still stuck in keyword mode. You can still optimize for keywords if you want, but it's limited in performance. If you optimize a document or page for a specific keyword, that page can only rank for that keyword and close variants. But if you optimize for a topic or a category, you can rank that page for every single search query that Google has an understanding of that's in its database for that topic or category.
Bradley has spent two years now studying and testing topic and category optimization, which is entity-based SEO. He's done those two years of testing exclusively in the tree service industry. Because he also does a lot of consulting with other SEO professionals and agencies, he's been fortunate to see how his process and methods can be applied to other industries, primarily for service area businesses.
Why Your Twenty Page Website Is Hurting Your Rankings
This section completely changed how I think about website structure, and it probably describes a lot of pest control websites right now.
Bradley walked me through an example. Let's say a pest control company does termite control, ant control, roach control, and spider control. And let's say in their service area there are five locations or city names that they want to target.
The traditional SEO approach would be to create 20 pages. You'd have Termite Control City 1, Termite Control City 2, Termite Control City 3, and so on. Four services times five cities equals 20 pages. Each one of those 20 pages probably has 1,000 words or 1,500 words or even 2,000 words of content.
Here's the problem: that's redundant. It's bloat. It's duplicated content. Three or four years ago, that's what everyone in the SEO industry did and it worked. But now that actually causes problems.
Google had what's called the Helpful Content Update. The simplified version is that Google now crawls a site and comes up with an overall site quality score. That quality score is now a multiplier that gets applied to every page on the site.
Before the Helpful Content Update, Google didn't rank websites; it ranked web pages for search queries. Remember, it's a document retrieval algorithm, and a web page is a document.
Before the Helpful Content Update, you could have a junk site that was not optimized well at all, but you could have one page on that site that was optimized to perfection, and that one page could rank really well. The rest of the pages wouldn't rank, but that one page could.
The Helpful Content Update eliminated that. Now, even if you had one page on the site that was optimized to perfection, if the overall site quality score was 70 percent, then Google's going to apply a 70 percent modifier to that page. It was Google's way of demoting websites that weren't optimized and efficient.
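The multiplier idea can be sketched as a toy model. This is purely illustrative; Google's actual scoring formula is not public, and the function below is invented to show the mechanic Bradley describes.

```python
# Toy model only — not Google's actual formula. The idea: a sitewide
# quality score acts as a multiplier on each page's own score.

def effective_score(page_score: float, site_quality: float) -> float:
    """Hypothetical effective ranking strength of a single page."""
    return page_score * site_quality

# A perfectly optimized page on a 70%-quality site is still held back:
print(effective_score(1.0, 0.70))  # 0.7
```

The takeaway from the model: no single page can outrun a weak sitewide score, which is why pruning redundant pages lifts everything at once.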
Going back to that example of four services with five locations where you now have 20 pages, you're saying the same damn thing over and over and over again. That's absolutely unnecessary. It's redundant. It's bloat. It also causes what's called crawl resistance, because Google has a harder time identifying which page is the best answer for which query.
Bradley's solution is simple. If you've got four services in this example, then you have four service pages that are not optimized for any one location. It should be service plus brand. So you have Termite Control plus brand name, Ant Control plus brand name, and so on. Each one of those pages is just optimized for the service and the brand, and then you optimize that page thoroughly with descriptions about the service and what to expect.
Then you have separate location pages optimized for each of the cities that you want to target. Those are optimized for the top level category and topic term, meaning Pest Control Service City plus brand name in the SEO title.
Then internal linking ties it all together. You only need one location page per city, not a page for every service-city combination like in the example earlier.
On that location page, you don't need to have 1,000 or 1,500 or 2,000 words of content. You literally just need to meet the intent of the searcher, which would be who you are, how to contact you, and then if they want more information, here are the products or services provided in this area. That's just internal links that link over to the pages that thoroughly describe those services.
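The difference in page count is easy to see if you enumerate both inventories. The service and city slugs below are made up for illustration; only the four-services, five-cities shape comes from Bradley's example.

```python
# Hypothetical page inventories following the example above; the
# service and city slugs are invented for illustration.

services = ["termite-control", "ant-control", "roach-control", "spider-control"]
cities = ["springfield", "shelbyville", "ogdenville", "north-haverbrook", "capital-city"]

# Old approach: one page per service-city combination
old_pages = [f"/{service}-{city}" for service in services for city in cities]

# Lean approach: one brand-level page per service, plus one page per city
lean_pages = [f"/{s}" for s in services] + [f"/{c}" for c in cities]

print(len(old_pages))   # 20 largely redundant pages
print(len(lean_pages))  # 9 focused pages
```

Same services, same cities, but fewer than half the pages, and every remaining page has one clear job.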
"Do you think anybody who needs pest control services is going to read 2,000 words of content on a page about pest control?" Bradley asked me. "They're looking to find: do you do this service, are you located near me, and how do I contact you. That's all they're looking for."
Bradley sees SEO professionals all the time who ask him why their project isn't performing, and it's because they're still optimizing like it's 2020. He told me, "In the last two years, the algorithm has changed more than it had in the previous 10 combined, and that is not a joke."
If you keep doing what you've always done and your results aren't getting any better, that's the definition of insanity.
Google Doesn't Read Your Content The Way You Think
One of the most surprising things Bradley shared was about how Google actually processes local SEO pages. Google has been very public about saying that their servers are overloaded. They cannot keep up with crawling and indexing all the new content on the web because of the explosion from ChatGPT and AI writing tools.
Because of that, part of the reason why this new optimization method works so well is because it reduces content and makes it easier for Google. It reduces crawl resistance. There are fewer pages and fewer things for Google to process.
If we can very clearly convey to Google who the business is, what they do, and where they do it in an efficient manner, Google will reward that project with better ranking performance.
Bradley has spent two years testing and found what he thinks is the perfect site structure for tree service contractors. It can be applied to any type of service area business. If you understand direct response marketing, you know that whatever your best performing campaign is becomes your control. Your sole purpose is to beat the control. In SEO, it's similar. You find your best performing structure, duplicate it with one variable changed, and see if it improves.
Bradley has spent two years working on just tree service sites on topic and category optimization methods. Being able to coach others on how to apply that to other types of service area businesses has been really rewarding. And this approach reduces the work significantly. You don't have to create as much content. There are fewer link building targets. It lowers overall overhead costs.
The Perfect Site Structure Bradley Spent Two Years Developing
Bradley has identified three things when it comes to on-page optimization: page structure, site structure, and internal linking.
Page structure is first because it's pages that rank, not sites. Within page structure, Bradley focuses on optimization points that provide the most leverage, the biggest performance gains with the least amount of work.
For page structure, the most important elements are:
The URL is always the first point. Bradley's advice? Don't over-optimize it. De-optimize it. Make it succinct and short without duplicating SEO terms or keywords. He sees domain names with pest control in them, then URLs like /pest-control-city/pest-control-city-termite-control. Years ago that worked. Now it's a negative ranking factor.
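The over-optimization is easy to quantify: count how many times the keyword is repeated across the domain and path. The domain and city below are invented for the example.

```python
# Hypothetical URLs illustrating the over-optimization Bradley describes.
# The domain and city names are invented for this example.

domain = "springfieldpestcontrol.example.com"  # brand already implies the keyword

over_optimized = f"https://{domain}/pest-control-springfield/pest-control-springfield-termite-control"
succinct = f"https://{domain}/termite-control"

# The keyword is repeated twice more in the over-optimized path:
print(over_optimized.count("pest-control"))  # 2
print(succinct.count("pest-control"))        # 0
```

The succinct slug names only the one thing the page is actually about, and leaves the category signal to the domain, SEO title, and headings.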
The second element is the SEO title (the meta title or document title) and the H1 (the page title). These are both critically important. Bradley sees SEO professionals duplicate the exact same title in both. Don't do that anymore.
Optimize for the top level category and/or topic term in the SEO title, and then use the other one in the H1, or vice versa. You can interchange them. Google sometimes displays an H1 as the SEO title or even the first H2 as the search results title.
In Bradley's industry, the top level category is "tree service," but the top level topic term is "tree maintenance." He usually does tree service in the SEO title and tree maintenance in the H1. He can rank that one page for every single search query even if it's not on the page.
So for page structure: URL, then SEO title, then H1, then headings (H2s), then subheadings (H3s, H4s, H5s). The hierarchy is critical.
In the early days of the internet, page structure with proper HTML markup was very important. Over the years it became less important, and web designers started using H tags just to style elements. Now that SEO has shifted back to entity-based search, proper use of HTML markup is absolutely critical again.
Bradley can rank a page with very little content, almost exclusively with just proper page structure. He wants to test ranking a page with nothing but headings and images.
For example, in pest control, a location page should be optimized for Pest Control Service in the SEO title and pest control or some variant in the H1. Going back to that example where termite control, ant control, roach control, and spider control are the four services, those would each be an H2. Those are the services that fall under Pest Control Service.
Then H3s, H4s, and H5s are subheadings that support their respective H2s. The H2s should support the H1. The H1 should support the document title.
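The hierarchy above can be sketched as a heading skeleton. The brand, services, and subtopics below are invented for illustration; only the structure — category in the title, topic in the H1, services as H2s, supporting subheadings as H3s — comes from Bradley's description.

```python
# A sketch of the heading hierarchy described above, using an invented
# pest control brand and services. The snippet just prints the skeleton;
# a real page would carry short supporting copy under each heading.

brand = "Springfield Pest Solutions"  # hypothetical brand
services = {
    "Termite Control": ["Termite Inspections", "Treatment Plans"],
    "Ant Control": ["Interior Treatments", "Perimeter Barriers"],
}

lines = [
    f"<title>Pest Control Service | {brand}</title>",  # SEO title: top-level category
    "<h1>Pest Control in Springfield</h1>",            # H1: topic variant, not a duplicate
]
for service, subtopics in services.items():
    lines.append(f"  <h2>{service}</h2>")              # services that fall under the category
    for sub in subtopics:
        lines.append(f"    <h3>{sub}</h3>")            # subheadings support their H2

print("\n".join(lines))
```

Each level supports the one above it, so the outline alone tells Google what the page is about before any body copy is read.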
Don't use HTML elements to style content. Use them to tell Google what this page is about.
For local SEO, Bradley told me, "Google is not crawling the page and reading the content. It's not. It's looking at the big elements of the page."
Bradley tested this. He published a page with just an SEO title, H1, two H2s, and a couple of H3s under each H2 with no content or images. He ran it through Google's Natural Language API. It came back and identified exactly what he wanted with no content at all. Using Google's own API, Bradley knows for a fact that Google is not reading the content for local SEO.
So that's page structure. Then site structure and internal linking is about the other pages not having redundancy or duplication, and how you interlink pages so it's clear to Google and visitors how to navigate.
That's why Bradley says to have a location page optimized for top level topics or categories. Then you have an H2 that says "We provide the following services in" followed by the city and state. Underneath, H3s link to their respective service pages.
Now that's perfectly logical for Google. When it crawls the page, it's looking at headings and subheadings. The subheadings are links that follow to pages that thoroughly describe each service.
You don't need 13 paragraphs on the location page explaining each service. All you need to do is satisfy the intent of the searcher.
Those three elements (page structure, site structure, internal linking), if you nail them, then all other SEO efforts become easier. You need far less content, far fewer pages, far fewer backlinks. If you do internal linking properly, it doesn't matter which page you build links to because it distributes out through the internal links.
In the tree service industry, Bradley has a site structure with 10 pages: homepage, four service pages, five location pages. Plus standard pages like contact, about, privacy. But literally 10 content pages.
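That 10-page layout can be written out as a simple site map. Bradley's exact page slugs aren't given in the conversation, so the service and city names below are hypothetical.

```python
# Bradley's 10-content-page layout, with hypothetical service and city
# slugs filled in (his exact pages aren't given in the conversation).

site = {
    "home": ["/"],
    "services": ["/tree-removal", "/tree-trimming", "/stump-grinding", "/emergency-tree-service"],
    "locations": ["/springfield", "/shelbyville", "/ogdenville", "/north-haverbrook", "/capital-city"],
}

content_pages = sum(len(pages) for pages in site.values())
print(content_pages)  # 10 content pages (contact, about, privacy not counted)
```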
Bradley can rank those sites in low competition areas with zero backlinks. They rank number one for every search query with no backlinks because it's all about page structure, site structure, and internal linking.
Building Google's Knowledge Graph For Your Clients
Bradley uses a great visualization to explain how Google understands local businesses. Google develops a Knowledge Graph for each local business, which is Google's understanding of who the business is, what they do, and where they do it.
Bradley uses that statement because in the semantic web, you want to feed data to Google in semantic triple format (subject, predicate, object). In layman's terms, that's who you are, what you do, and where you do it.
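Semantic triples follow the standard subject-predicate-object form. Here's what that looks like for an invented tree service business; the business name and facts are made up for illustration.

```python
# "Who you are, what you do, where you do it" expressed as semantic
# triples (subject, predicate, object) for an invented business.

triples = [
    ("Springfield Tree Pros", "is a", "tree service company"),    # who you are
    ("Springfield Tree Pros", "provides", "tree removal"),        # what you do
    ("Springfield Tree Pros", "serves", "Springfield, Virginia"), # where you do it
]

for subject, predicate, obj in triples:
    print(f"{subject} {predicate} {obj}.")
```

Citations, directory listings, and structured data that consistently repeat these same triples are what let Google assemble the Knowledge Graph quickly.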
If we can convey that clearly to Google, then it makes everything easier because Google rewards it when it has a good understanding.
A Knowledge Graph is like a jigsaw puzzle. When a business first launches, Google has a vague understanding. It has to go out and find corroborating evidence online that supports who that business is, what they do, and where they do it.
If you leave Google on its own to develop that Knowledge Graph, it takes a long time.
When you buy a puzzle, you know from the box what the image is supposed to be. But when you dump it onto a table, it's just scattered pieces. It doesn't look like anything until you start organizing and fitting them together. As you get more pieces in the proper way, that image becomes clearer.
It's the exact same thing with Google's Knowledge Graph. Google has no idea at first. It has a vague understanding.
But can we as SEO professionals quickly inform the Knowledge Graph and help Google develop that understanding? Yes, we absolutely can.
One way is to force Google to recognize accurate information by providing URLs: business directory listings, citations, published content, and press releases that all consistently state who the business is, what they do, and where they do it.
We can extract those URLs and put them in strategic places that force Google to recognize that and put all those puzzle pieces together quickly. Now Google has a clear understanding, which is why it rewards those businesses.
As SEO professionals, we're puzzle builders. We should be putting the puzzle pieces together to help Google have that understanding quicker so our clients benefit.
My Main Takeaways
1. The biggest thing I learned from Bradley is that Google's algorithm has fundamentally shifted from keywords to entities, and most SEO professionals are still optimizing like it's 2020. If you optimize a page for a specific keyword, that page can only rank for that keyword and close variants. But if you optimize for a topic or category using proper page structure, that page can rank for every search query Google associates with that topic, even if those exact words never appear on the page. This isn't theoretical. Bradley has proven this with two years of testing in the tree service industry and through consulting with agencies in other industries. The key is understanding that Google is looking at structural elements like your URL, SEO title, H1, and headings to determine what a page is about, not reading through walls of content. Bradley tested this by creating a page with just an SEO title, H1, two H2s, and some H3s with no body content or images, then ran it through Google's Natural Language API. The API correctly identified what the page was about with zero content, proving that for local SEO, Google relies on structural elements far more than actual written content.
2. The second takeaway is that reducing pages actually improves rankings in the current algorithm, which is completely counterintuitive to what most SEO professionals learned and practiced for years. The old approach of creating 20 pages for four services across five locations is now actively hurting websites because of Google's Helpful Content Update. Google now creates an overall site quality score that acts as a multiplier for every page on your site. When you have redundant, duplicated content across dozens of pages saying essentially the same thing over and over, Google lowers that quality score and applies it to every single page. None of your pages perform as well as they should. Instead, you should have one service page per service optimized for service plus brand with no location mentioned, and one location page per city optimized for the top level category plus location plus brand. On the location page, you just need to meet the searcher's intent (who you are, where you are, how to contact you) and then use internal links to point to your service pages. This reduces your page count by more than half while actually improving performance.
3. The third insight is that proper HTML structure and hierarchy is critically important again for local SEO. Web designers started using H tags just to style elements of a page, but now that we're in entity-based SEO and Google is using large language models integrated into the search algorithm, proper use of headings and subheadings is absolutely critical to communicate what your page is about. Your H1 should be your page title optimized for the top level topic term. Your H2s should be the main services or topics that fall under that category. Your H3s through H5s should support their respective H2s. The H2s should support the H1, and the H1 should support the document title. This hierarchy tells Google exactly what your page is about in a very clear, efficient way. You should never duplicate your SEO title and H1. Instead, use one for the top level category and the other for the top level topic term. Don't use HTML elements to style your content, use them to convey meaning and structure to Google.
4. The fourth major takeaway is the concept of being a puzzle builder for Google's Knowledge Graph, which is one of the best visualizations for understanding how local SEO actually works. When a local business first launches, Google has a very vague understanding of who they are, what they do, and where they do it. It's like dumping out a jigsaw puzzle onto a table. You know what the picture is supposed to be from looking at the box, but all you see is scattered pieces until you start organizing and fitting them together. If you leave Google on its own to figure this out by searching for corroborating evidence online, it takes forever for that picture to become clear. But as SEO professionals, we can dramatically speed up this process by creating business directory listings, citations, press releases, and other strategic URLs that provide accurate, consistent information to Google about who the business is, what they do, and where they do it. We extract those URLs and put them in strategic places that force Google to recognize them and put the puzzle pieces together quickly. When we help Google develop a clear Knowledge Graph fast, Google rewards those businesses with better rankings.
5. The fifth and final insight is that this new approach is massively more efficient for everyone involved. You need far less content because you're not writing 2,000 words about ant control for five different cities. You're writing one thorough piece of content about ant control on a service page, then using location pages with minimal content that link to those service pages. You have fewer pages to maintain and optimize, which means lower overhead. You have fewer link building targets, so you can focus your efforts on fewer pages and let the proper internal linking distribute that authority throughout the site. Your costs go down, your workload goes down, but your results go up. And critically, it's also more efficient for Google, which is why it works. Google has publicly stated their servers are overloaded with the explosion of AI-generated content and they can't keep up with crawling and indexing everything. When you reduce crawl resistance by having a lean, well-structured site that clearly conveys who the business is, what they do, and where they do it, Google rewards you. The sweet spot Bradley has found for tree service companies is 10 content pages total: homepage, four service pages, five location pages. He has examples of sites ranking number one for every relevant search query in low competition markets with zero backlinks, just from nailing page structure, site structure, and internal linking.
Where to Find Bradley Benner
Bradley runs a free weekly webinar every Wednesday at 4 PM Eastern called Hump Day Hangouts. They just hit their 500th episode and are about 20 episodes away from their 10 year anniversary. You can watch replays, post questions ahead of time, and even interact with an AI version of Bradley that's been trained on 10 years of him answering SEO questions live on the show. Check it out at semanticmastery.com/hdh.
You can also find Bradley's work at:
Tree Care HQ (his local SEO agency specializing in tree service companies)
Semantic Mastery (his coaching and consulting company for local SEO professionals)
Semantic Links (white label link building services for agencies)
Local Fury (app specifically for improving Google Maps rankings)
Latest
More Blogs By Danny Leibrandt
Get the latest insights on business, digital marketing, and entrepreneurship from Danny Leibrandt.
Connect to Content
Add layers or components to infinitely loop on your page.
Within about six months, Bradley got all three sites ranking at the top of Google. He was generating leads for his own electrical company and creating leads for carpet cleaning and locksmith jobs that he didn't even have businesses for yet. That's when he realized he could rent these sites out to contractors in those industries, an approach known as the rank-and-rent model.
Once Bradley rented out those first two sites to actual contractors, he saw the potential for a legitimate side hustle. From 2010 to around March 2012, he focused entirely on lead generation by building WordPress websites and Google Places profiles (what we now call Google Business Profiles), ranking them, and then pitching contractors to rent the sites from him.
The interesting thing about Bradley's early business was that everything was local to Virginia. That worked great during most of the year, but Virginia winters are tough enough that home construction and related work slow down significantly. After two winter seasons where his additional SEO income declined dramatically, he decided he needed to stabilize that income stream. That's when he opened Big Bamboo Marketing LLC in 2012, which he later rebranded as Tree Care HQ, and started offering traditional retainer-based services to local businesses.
About six months after starting his agency, Bradley had replaced his electrical contractor income completely. A year later, in 2013, he launched Semantic Mastery to teach other people how to do local SEO.
The Seven Dollar PDF That Built a Four Year Business Model
What really fascinated me about Bradley's early journey was how simple his ranking method was back then. He stumbled onto something called Warrior Special Offers, which was basically a marketplace for cheap digital marketing products. Bradley bought a seven dollar PDF guide written by a guy named David Kacic about something called automatic links. The guide taught him how to use IFTTT, which stands for If This Then That.
The guide showed Bradley how to build what he calls syndication networks. He would go out and create 15 to 20 web properties like social media accounts, external blogs (Blogger, Tumblr, WordPress, Weebly), bookmarking sites, and other platforms, all branded for a specific company. Then he would connect all of them to IFTTT so that whenever he published a blog post on the client's main website, it would automatically republish or syndicate to all those other branded properties.
Bradley told me something that blew my mind: "I swear to God for about the first four years of my local SEO agency that was all I did."
For four years, his entire business model was building these syndication networks. He would build links to those network properties using link building tools (which was safe to do back then), publish content consistently, and watch his clients rank like clockwork. He said within about two months he would rank every single client he touched.
Bradley eventually got smart and hired people to build these networks for him instead of doing it himself. He created training to teach virtual assistants how to build them so he could spend more time landing clients. That training he created for employees actually became Semantic Mastery's first product, which they launched as IFTTT SEO. About a year after they launched that course, they got a cease and desist letter from IFTTT's attorneys for trademark infringement, so they changed the name to Syndication Academy.
Why Bradley Only Works With Tree Companies Now
For years, Bradley thought he had niched down by only working with contractors. He was doing SEO for pest control companies, septic service companies, HVAC companies, plumbers, tree services, roofers, and all kinds of home service businesses. But he eventually realized that wasn't really a niche at all.
Each of those industries has its own keywords, vocabulary, and customer intent, and its own sense of what makes content compelling enough to actually get results. After years of this grind, Bradley finally asked himself why he was doing this to himself. In 2019, he created a DBA for his main corporation and called his local SEO agency Tree Care HQ. From that point on, he would only take on tree service contractors as new clients.
That decision to truly niche down into a single vertical made a huge difference in his business. Once he did that, he was really able to start scaling because he only needed to master one industry.
In 2022, Bradley started a link building business called Semantic Links, which provides white label link building services specifically for local SEO agencies. That business has been really successful for the last two and a half years and now actually generates more revenue for Bradley than even his own local SEO agency does. He also started Local Fury, an app specifically designed to help improve Google Maps rankings.
Bradley has now been doing digital marketing for about 15 years, almost exclusively focused on local SEO, and he's still fascinated with it.
The Algorithm Shift That Changed Everything
When I asked Bradley why he thinks local SEO is easier now than it was three or four years ago, he dropped some serious knowledge about how Google's algorithm has fundamentally changed.
The algorithm has shifted from a keyword-based system to an entity-based algorithm. For most people who aren't SEO professionals, Bradley explains this as a shift from keyword focus to topic and category focus.
Bradley explained that, at its core, Google is a document retrieval algorithm. It's very advanced, but in its simplest form it retrieves documents: Google treats web pages as documents, and a page's SEO title (its meta title) is the document title.
The way Google used to return results was based on a search query (a keyword). It would look in its database and try to identify which documents or pages were most relevant based on factors like keyword density, meaning how many times the keyword appeared within the content as a ratio of the total words on the page.
It was a very crude algorithm, and that's why it was so easy to manipulate back then. You could just stuff keywords into a page and rank based on having higher keyword density.
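That older keyword-density heuristic is easy to illustrate. The sketch below is a simplified model of the idea, not Google's actual formula; the page text and keyword are made up:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Crude keyword-density score: occurrences of the keyword
    phrase as a fraction of the total words on the page."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    phrase = keyword.lower().split()
    n = len(phrase)
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == phrase)
    return hits * n / len(words)

page = ("Pest control experts. Our pest control team offers "
        "pest control for homes and businesses.")
print(round(keyword_density(page, "pest control"), 2))  # 0.43
```

Under a scoring model like this, stuffing the keyword in more often directly raises the score, which is exactly why the old algorithm was so easy to game.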
But around the end of 2021, the Google algorithm shifted. We're in the semantic web now, and semantic web technologies have been fully adopted by Google. The semantic web is an entity-based algorithm. An entity is a person, place, or thing. Semantic search works by having artificial intelligence and machine learning understand what an element or entity is, and then determine how those entities relate to each other.
Google now has a deeper understanding of things than it did before. Because of large language models integrated into the search algorithm, it now has an understanding of human language, primarily in English.
Now, if you optimize for topics or categories, you can rank a page for every single search query that Google already has an understanding of that is associated with that topic or category, even if that keyword's not present on the page.
When Bradley talks about categories, he means the Google Business categories assigned to particular industries; in pest control, the actual Google Business category is "Pest Control Service." Topic optimization, by contrast, is for ranking in organic search. Google originally built its Google Business categories from Wikipedia entities.
In Bradley's industry, the top level category is "tree service," but the top level topic term is "tree maintenance." He told me, "I can rank one page for every single search query associated with the tree service business industry if I optimize for tree service and tree maintenance, those two top level terms."
Most SEO professionals are still stuck in keyword mode. You can still optimize for keywords if you want, but it's limited in performance. If you optimize a document or page for a specific keyword, that page can only rank for that keyword and close variants. But if you optimize for a topic or a category, you can rank that page for every single search query that Google has an understanding of that's in its database for that topic or category.
Bradley has spent two years now studying and testing topic and category optimization, which is entity-based SEO. He's done those two years of testing exclusively in the tree service industry. Because he also does a lot of consulting with other SEO professionals and agencies, he's been fortunate to see how his process and methods can be applied to other industries, primarily for service area businesses.
Why Your Twenty Page Website Is Hurting Your Rankings
This section completely changed how I think about website structure, and it probably describes a lot of pest control websites right now.
Bradley walked me through an example. Let's say a pest control company does termite control, ant control, roach control, and spider control. And let's say in their service area there are five locations or city names that they want to target.
The traditional SEO approach would be to create 20 pages. You'd have Termite Control City 1, Termite Control City 2, Termite Control City 3, and so on. Four services times five cities equals 20 pages. Each one of those 20 pages probably has 1,000 words or 1,500 words or even 2,000 words of content.
Here's the problem: that's redundant. It's bloat. It's duplicated content. Three or four years ago, that's what everyone in the SEO industry did and it worked. But now that actually causes problems.
Google had what's called the Helpful Content Update. The simplified version is that Google now crawls a site and comes up with an overall site quality score. That quality score is now a multiplier that gets applied to every page on the site.
Prior to the Helpful Content Update, Google didn't rank websites. Google ranks web pages for search queries. Remember, it's a document retrieval algorithm, and a web page is a document.
Before the Helpful Content Update, you could have a junk site that was not optimized well at all, but you could have one page on that site that was optimized to perfection, and that one page could rank really well. The rest of the pages wouldn't rank, but that one page could.
The Helpful Content Update eliminated that. Now, even if you had one page on the site that was optimized to perfection, if the overall site quality score was 70 percent, then Google's going to apply a 70 percent modifier to that page. It was Google's way of demoting websites that weren't optimized and efficient.
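Bradley's description of that sitewide multiplier can be written as a toy model. This is purely illustrative of how he describes it; Google publishes no such formula and the numbers here are invented:

```python
def effective_score(page_score: float, site_quality: float) -> float:
    """Toy model of Bradley's description: a sitewide quality
    multiplier applied to each page's own optimization score."""
    return page_score * site_quality

# Per the example in the text: a perfectly optimized page (1.0)
# on a 70-percent-quality site effectively performs like a 0.70 page.
print(effective_score(1.0, 0.70))
```

The point of the model is that no single page can outrun a weak site anymore; raising the sitewide quality lifts every page at once.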
Going back to that example of four services with five locations where you now have 20 pages, you're saying the same damn thing over and over and over again. That's absolutely unnecessary. It's redundant. It's bloat. It also causes what's called crawl resistance, because Google now has a harder time identifying which page is the best result for which query.
Bradley's solution is simple. If you've got four services in this example, then you have four service pages that are not optimized for any one location. It should be service plus brand. So you have Termite Control plus brand name, Ant Control plus brand name, and so on. Each one of those pages is just optimized for the service and the brand, and then you optimize that page thoroughly with descriptions about the service and what to expect.
Then you have separate location pages optimized for each of the cities that you want to target. Those are optimized for the top level category and topic term, meaning Pest Control Service City plus brand name in the SEO title.
Instead of the four or five pages per service-city combination from the earlier example, you only need one location page per city, connected to the service pages through internal linking.
On that location page, you don't need to have 1,000 or 1,500 or 2,000 words of content. You literally just need to meet the intent of the searcher, which would be who you are, how to contact you, and then if they want more information, here are the products or services provided in this area. That's just internal links that link over to the pages that thoroughly describe those services.
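The arithmetic behind the two structures is simple to verify (the service and city names below are hypothetical):

```python
from itertools import product

services = ["Termite Control", "Ant Control", "Roach Control", "Spider Control"]
cities = ["City 1", "City 2", "City 3", "City 4", "City 5"]

# Old approach: one page for every service-city combination.
old_pages = [f"{s} {c}" for s, c in product(services, cities)]

# Bradley's approach: one page per service (service + brand),
# plus one location page per city (category + city + brand).
new_pages = ([f"{s} | Brand" for s in services] +
             [f"Pest Control Service {c} | Brand" for c in cities])

print(len(old_pages), len(new_pages))  # 20 vs 9
```

The page count drops by more than half, and it keeps dropping relative to the old model as you add services or cities, since the new structure grows additively rather than multiplicatively.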
"Do you think anybody that needs pest control services is going to read 2,000 words of content on a page about pest control?" Bradley asked me. "They're looking to find: do you do this service, are you located near me, and how do I contact you. That's all they're looking for."
Bradley sees SEO professionals all the time who ask him why their project isn't performing, and it's because they're still optimizing like it's 2020. He told me, "In the last two years, the algorithm has changed more than it had in the previous 10 combined, and that is not a joke."
If you keep doing what you've always done and your results aren't getting any better, that's the definition of insanity.
Google Doesn't Read Your Content The Way You Think
One of the most surprising things Bradley shared was about how Google actually processes local SEO pages. Google has been very public about saying that their servers are overloaded. They cannot keep up with crawling and indexing all the new content on the web because of the explosion from ChatGPT and AI writing tools.
Because of that, part of the reason why this new optimization method works so well is because it reduces content and makes it easier for Google. It reduces crawl resistance. There are fewer pages and fewer things for Google to process.
If we can very clearly convey to Google who the business is, what they do, and where they do it in an efficient manner, Google will reward that project with better ranking performance.
Bradley has spent two years testing and found what he thinks is the perfect site structure for tree service contractors. It can be applied to any type of service area business. If you understand direct response marketing, you know that whatever your best performing campaign is becomes your control. Your sole purpose is to beat the control. In SEO, it's similar. You find your best performing structure, duplicate it with one variable changed, and see if it improves.
Bradley has spent two years working on just tree service sites on topic and category optimization methods. Being able to coach others on how to apply that to other types of service area businesses has been really rewarding. And this approach reduces the work significantly. You don't have to create as much content. There are fewer link building targets. It lowers overall overhead costs.
The Perfect Site Structure Bradley Spent Two Years Developing
Bradley has identified three things when it comes to on-page optimization: page structure, site structure, and internal linking.
Page structure is first because it's pages that rank, not sites. Within page structure, Bradley focuses on optimization points that provide the most leverage, the biggest performance gains with the least amount of work.
For page structure, the most important elements are:
The URL is always the first point. Bradley's advice? Don't over-optimize it. De-optimize it. Make it succinct and short without duplicating SEO terms or keywords. He sees domain names with pest control in them paired with URLs like /pest-control-city/pest-control-city-termite-control. Years ago that worked. Now it's a negative ranking factor.
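One quick way to spot the kind of keyword repetition Bradley is describing is to count how often the same tokens recur across a URL's domain and path. This is a rough heuristic sketch, not any official tool, and the example URLs are invented:

```python
import re
from collections import Counter

def repeated_tokens(url: str) -> list[str]:
    """Return tokens that appear more than once across the domain
    and path of a URL -- a rough over-optimization signal."""
    tokens = re.findall(r"[a-z0-9]+", url.lower())
    stop = {"http", "https", "www", "com", "net", "org"}
    counts = Counter(t for t in tokens if t not in stop)
    return sorted(t for t, n in counts.items() if n > 1)

print(repeated_tokens(
    "https://citypestcontrol.com/pest-control-springfield/"
    "pest-control-springfield-termite-control"))
```

A clean, de-optimized URL like /termite-control would come back empty, while the stuffed example above flags every duplicated term.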
The second element is the SEO title (the meta title or document title) and the H1 (the page title). These are both critically important. Bradley sees SEO professionals duplicate the exact same title in both. Don't do that anymore.
Optimize for the top level category and/or topic term in the SEO title, and then use the other one in the H1, or vice versa. You can interchange them. Google sometimes displays an H1 as the SEO title or even the first H2 as the search results title.
In Bradley's industry, the top level category is "tree service" and the top level topic term is "tree maintenance." He usually puts tree service in the SEO title and tree maintenance in the H1, and he can rank that one page for every single search query associated with the topic, even if the query's words never appear on the page.
So for page structure: URL, then SEO title, then H1, then headings (H2s), then subheadings (H3s, H4s, H5s). The hierarchy is critical.
In the early days of the internet, page structure with proper HTML markup was very important. Over the years it became less important, and web designers started using H tags merely to style elements. Now that entity-based SEO has taken over, proper use of HTML markup is absolutely critical again.
Bradley can rank a page with very little content, almost exclusively with just proper page structure. He wants to test ranking a page with nothing but headings and images.
For example, in pest control, a location page should be optimized for Pest Control Service in the SEO title and pest control or some variant in the H1. Going back to that example where termite control, ant control, roach control, and spider control are the four services, those would each be an H2. Those are the services that fall under Pest Control Service.
Then H3s, H4s, and H5s are subheadings that support their respective H2s. The H2s should support the H1. The H1 should support the document title.
Don't use HTML elements to style content. Use them to tell Google what this page is about.
For local SEO, Bradley told me, "Google is not crawling the page and reading the content. It's not. It's looking at the big elements of the page."
Bradley tested this. He published a page with just an SEO title, H1, two H2s, and a couple of H3s under each H2 with no content or images. He ran it through Google's Natural Language API. It came back and identified exactly what he wanted with no content at all. Using Google's own API, Bradley knows for a fact that Google is not reading the content for local SEO.
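The idea behind Bradley's experiment can be approximated with the standard library: strip a page down to its title and headings and look at what remains. This is a sketch of the concept only; Bradley fed the result to Google's Natural Language API, which isn't reproduced here, and the page content is hypothetical:

```python
from html.parser import HTMLParser

class HeadingOutline(HTMLParser):
    """Collect only the title and heading text from a page --
    roughly the 'big elements' Bradley says Google looks at."""
    TAGS = {"title", "h1", "h2", "h3", "h4", "h5"}

    def __init__(self):
        super().__init__()
        self.outline = []
        self._current = None

    def handle_starttag(self, tag, attrs):
        if tag in self.TAGS:
            self._current = tag

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current and data.strip():
            self.outline.append((self._current, data.strip()))

page = """
<html><head><title>Pest Control Service Springfield | Acme</title></head>
<body><h1>Pest Control in Springfield</h1>
<h2>Termite Control</h2><p>Long sales copy that may carry little weight...</p>
<h2>Ant Control</h2></body></html>
"""

parser = HeadingOutline()
parser.feed(page)
for tag, text in parser.outline:
    print(tag, text)
```

Even with the paragraph copy discarded, the outline alone states the category, the city, the brand, and the services, which is Bradley's point about structure doing the heavy lifting.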
So that's page structure. Site structure and internal linking are then about keeping the other pages free of redundancy and duplication, and about interlinking pages so that it's clear to both Google and visitors how to navigate.
That's why Bradley says to have a location page optimized for top level topics or categories. Then you have an H2 that says "We provide the following services in" followed by the city and state. Underneath that are H3s that link to their respective service pages.
Now that's perfectly logical for Google. When it crawls the page, it's looking at headings and subheadings. The subheadings are links that follow to pages that thoroughly describe each service.
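Putting the location-page recipe together looks something like this sketch. The brand, city, phone number, and service names are all hypothetical placeholders:

```python
def location_page(brand: str, city: str, state: str, services: list[str]) -> str:
    """Render a minimal location page per the structure described:
    category + city + brand in the SEO title, a lean body that meets
    searcher intent, and H3s that are internal links to service pages."""
    links = "\n".join(
        f'<h3><a href="/{s.lower().replace(" ", "-")}">{s}</a></h3>'
        for s in services
    )
    return f"""<title>Pest Control Service {city} | {brand}</title>
<h1>Pest Control in {city}, {state}</h1>
<p>{brand} -- call (555) 000-0000 for service in {city}.</p>
<h2>We provide the following services in {city}, {state}</h2>
{links}"""

print(location_page("Acme", "Springfield", "VA",
                    ["Termite Control", "Ant Control"]))
```

Note how little body copy there is: who the business is, how to contact it, and links out to the pages that describe each service thoroughly.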
You don't need 13 paragraphs on the location page explaining each service. All you need to do is satisfy the intent of the searcher.
Those three elements (page structure, site structure, internal linking), if you nail them, then all other SEO efforts become easier. You need far less content, far fewer pages, far fewer backlinks. If you do internal linking properly, it doesn't matter which page you build links to because it distributes out through the internal links.
In the tree service industry, Bradley has a site structure with 10 pages: homepage, four service pages, five location pages. Plus standard pages like contact, about, privacy. But literally 10 content pages.
Bradley can rank those sites in low competition areas with zero backlinks. They rank number one for every search query with no backlinks because it's all about page structure, site structure, and internal linking.
Building Google's Knowledge Graph For Your Clients
Bradley uses a great visualization to explain how Google understands local businesses. Google develops a Knowledge Graph for each local business, which is Google's understanding of who the business is, what they do, and where they do it.
Bradley uses that statement because in the semantic web, you want to feed data to Google in semantic triple format (subject, predicate, object). In layman's terms, that's who you are, what you do, and where you do it.
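Those who/what/where statements can be written down literally as triples. The business name and values below are illustrative; the article doesn't prescribe a specific serialization:

```python
# Semantic triples: (subject, predicate, object).
business = "Acme Tree Service"  # hypothetical entity
triples = [
    (business, "is a",     "Tree Service"),      # who you are
    (business, "provides", "Tree Maintenance"),  # what you do
    (business, "serves",   "Springfield, VA"),   # where you do it
]
for subject, predicate, obj in triples:
    print(f"{subject} {predicate} {obj}")
```

Every citation, directory listing, and press release Bradley mentions later is, in effect, restating these same three triples so Google can confirm them.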
If we can convey that clearly to Google, then it makes everything easier because Google rewards it when it has a good understanding.
A Knowledge Graph is like a jigsaw puzzle. When a business first launches, Google has a vague understanding. It has to go out and find corroborating evidence online that supports who that business is, what they do, and where they do it.
If you leave Google on its own to develop that Knowledge Graph, it takes a long time.
When you buy a puzzle, you know from the box what the image is supposed to be. But when you dump it onto a table, it's just scattered pieces. It doesn't look like anything until you start organizing and fitting them together. As you get more pieces in the proper way, that image becomes clearer.
It's the exact same thing with Google's Knowledge Graph. Google has no idea at first. It has a vague understanding.
But can we as SEO professionals quickly inform the Knowledge Graph and help Google develop that understanding? Yes, we absolutely can.
One way is to force Google to recognize accurate information by providing URLs: business directory listings, citations, published content, and press releases that all state accurate information about who the business is, what they do, and where they do it.
We can extract those URLs and put them in strategic places that force Google to recognize that and put all those puzzle pieces together quickly. Now Google has a clear understanding, which is why it rewards those businesses.
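Consistency across those citations is the whole point: conflicting listings are puzzle pieces that don't fit. A small check like this, over hypothetical listing data, shows the idea of verifying that every listing states the same who/what/where:

```python
def inconsistent_fields(listings: list[dict]) -> set[str]:
    """Return the fields whose values differ across citation
    listings -- conflicting puzzle pieces for Google."""
    fields = {key for rec in listings for key in rec}
    return {f for f in fields
            if len({rec.get(f) for rec in listings}) > 1}

listings = [
    {"name": "Acme Tree Service", "phone": "555-0100", "city": "Springfield"},
    {"name": "Acme Tree Service", "phone": "555-0100", "city": "Springfield"},
    {"name": "Acme Tree Svc",     "phone": "555-0100", "city": "Springfield"},
]
print(inconsistent_fields(listings))  # {'name'}
```

Cleaning up the mismatched name so all three listings agree would, in Bradley's framing, hand Google a pre-assembled corner of the puzzle.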
As SEO professionals, we're puzzle builders. We should be putting the puzzle pieces together to help Google have that understanding quicker so our clients benefit.
My Main Takeaway
1. The biggest thing I learned from Bradley is that Google's algorithm has fundamentally shifted from keywords to entities, and most SEO professionals are still optimizing like it's 2020. If you optimize a page for a specific keyword, that page can only rank for that keyword and close variants. But if you optimize for a topic or category using proper page structure, that page can rank for every search query Google associates with that topic, even if those exact words never appear on the page. This isn't theoretical. Bradley has proven this with two years of testing in the tree service industry and through consulting with agencies in other industries. The key is understanding that Google is looking at structural elements like your URL, SEO title, H1, and headings to determine what a page is about, not reading through walls of content. Bradley tested this by creating a page with just an SEO title, H1, two H2s, and some H3s with no body content or images, then ran it through Google's Natural Language API. The API correctly identified what the page was about with zero content, proving that for local SEO, Google relies on structural elements far more than actual written content.
2. The second takeaway is that reducing pages actually improves rankings in the current algorithm, which is completely counterintuitive to what most SEO professionals learned and practiced for years. The old approach of creating 20 pages for four services across five locations is now actively hurting websites because of Google's Helpful Content Update. Google now creates an overall site quality score that acts as a multiplier for every page on your site. When you have redundant, duplicated content across dozens of pages saying essentially the same thing over and over, Google lowers that quality score and applies it to every single page. None of your pages perform as well as they should. Instead, you should have one service page per service optimized for service plus brand with no location mentioned, and one location page per city optimized for the top level category plus location plus brand. On the location page, you just need to meet the searcher's intent (who you are, where you are, how to contact you) and then use internal links to point to your service pages. This reduces your page count by more than half while actually improving performance.
3. The third insight is that proper HTML structure and hierarchy is critically important again for local SEO. Web designers started using H tags just to style elements of a page, but now that we're in entity-based SEO and Google is using large language models integrated into the search algorithm, proper use of headings and subheadings is absolutely critical to communicate what your page is about. Your H1 should be your page title optimized for the top level topic term. Your H2s should be the main services or topics that fall under that category. Your H3s through H5s should support their respective H2s. The H2s should support the H1, and the H1 should support the document title. This hierarchy tells Google exactly what your page is about in a very clear, efficient way. You should never duplicate your SEO title and H1. Instead, use one for the top level category and the other for the top level topic term. Don't use HTML elements to style your content, use them to convey meaning and structure to Google.
4. The fourth major takeaway is the concept of being a puzzle builder for Google's Knowledge Graph, which is one of the best visualizations for understanding how local SEO actually works. When a local business first launches, Google has a very vague understanding of who they are, what they do, and where they do it. It's like dumping out a jigsaw puzzle onto a table. You know what the picture is supposed to be from looking at the box, but all you see is scattered pieces until you start organizing and fitting them together. If you leave Google on its own to figure this out by searching for corroborating evidence online, it takes forever for that picture to become clear. But as SEO professionals, we can dramatically speed up this process by creating business directory listings, citations, press releases, and other strategic URLs that provide accurate, consistent information to Google about who the business is, what they do, and where they do it. We extract those URLs and put them in strategic places that force Google to recognize them and put the puzzle pieces together quickly. When we help Google develop a clear Knowledge Graph fast, Google rewards those businesses with better rankings.
5. The fifth and final insight is that this new approach is massively more efficient for everyone involved. You need far less content because you're not writing 2,000 words about ant control for five different cities. You're writing one thorough piece of content about ant control on a service page, then using location pages with minimal content that link to those service pages. You have fewer pages to maintain and optimize, which means lower overhead. You have fewer link building targets, so you can focus your efforts on fewer pages and let the proper internal linking distribute that authority throughout the site. Your costs go down, your workload goes down, but your results go up. And critically, it's also more efficient for Google, which is why it works. Google has publicly stated their servers are overloaded with the explosion of AI-generated content and they can't keep up with crawling and indexing everything. When you reduce crawl resistance by having a lean, well-structured site that clearly conveys who the business is, what they do, and where they do it, Google rewards you. The sweet spot Bradley has found for tree service companies is 10 content pages total: homepage, four service pages, five location pages. He has examples of sites ranking number one for every relevant search query in low competition markets with zero backlinks, just from nailing page structure, site structure, and internal linking.
Where to Find Bradley Benner
Bradley runs a free weekly webinar every Wednesday at 4 PM Eastern called Hump Day Hangouts. They just hit their 500th episode and are about 20 episodes away from their 10 year anniversary. You can watch replays, post questions ahead of time, and even interact with an AI version of Bradley that's been trained on 10 years of him answering SEO questions live on the show. Check it out at semanticmastery.com/hdh.
You can also find Bradley's work at:
Tree Care HQ (his local SEO agency specializing in tree service companies)
Semantic Mastery (his coaching and consulting company for local SEO professionals)
Semantic Links (white label link building services for agencies)
Local Fury (app specifically for improving Google Maps rankings)
Why Bradley Only Works With Tree Companies Now
For years, Bradley thought he had niched down by only working with contractors. He was doing SEO for pest control companies, septic service companies, HVAC companies, plumbers, tree services, roofers, and all kinds of home service businesses. But he eventually realized that wasn't really a niche at all.
Each of those industries has its own keywords, vocabulary, customer intent, and its own sense of what makes compelling content that actually gets results. After years of this grind, Bradley finally asked himself why he was doing this to himself. In 2019, he created a DBA for his main corporation and called his local SEO agency Tree Care HQ. From that point on, he would only take on tree service contractors as new clients.
That decision to truly niche down into a single vertical made a huge difference in his business. Once he did that, he was really able to start scaling because he only needed to master one industry.
In 2022, Bradley started a link building business called Semantic Links, which provides white label link building services specifically for local SEO agencies. That business has been really successful for the last two and a half years and now actually generates more revenue for Bradley than even his own local SEO agency does. He also started Local Fury, an app specifically designed to help improve organic maps ranking.
Bradley has now been doing digital marketing for about 15 years, almost exclusively focused on local SEO, and he's still fascinated with it.
The Algorithm Shift That Changed Everything
When I asked Bradley why he thinks local SEO is easier now than it was three or four years ago, he dropped some serious knowledge about how Google's algorithm has fundamentally changed.
The algorithm has shifted from a keyword-based system to an entity-based algorithm. For most people who aren't SEO professionals, Bradley explains this as a shift from keyword focus to topic and category focus.
Bradley explained that Google is nothing but a document retrieval algorithm at its core. It's very advanced, but at its simplest form, it's just a document retrieval algorithm. Google treats web pages as documents, and the SEO titles or meta titles of web pages are the document titles.
The way Google used to return results was based on a search query (a keyword). It would look in its database and try to identify which documents or pages were most relevant based on factors like keyword density, meaning how many times the keyword appeared within the content as a ratio of the total words on the page.
It was a very crude algorithm, and that's why it was so easy to manipulate back then. You could just stuff keywords into a page and rank based on having higher keyword density.
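Keyword density is trivial to compute, which is part of why it was trivial to game. A minimal sketch of that old-style metric (the sample text and keyword are made up):

```python
def keyword_density(text: str, keyword: str) -> float:
    """Occurrences of a single-word keyword as a share of all words."""
    words = text.lower().split()
    return words.count(keyword.lower()) / len(words)

# A crudely stuffed page scores "better" under this old-style metric.
stuffed = "plumber plumber best plumber cheap plumber near you plumber"
print(round(keyword_density(stuffed, "plumber"), 2))  # 0.56
```

Stuffing more repetitions of the keyword directly inflated the score, which is exactly the manipulation the old algorithm rewarded.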
But around the end of 2021, Google's algorithm shifted. We're in the semantic web now; semantic web technologies have been fully adopted by Google. The semantic web is an entity-based algorithm. An entity is a person, place, or thing. Semantic search technology works because the underlying artificial intelligence and machine learning understand what an entity is and then determine how those entities relate to each other.
Google now has a deeper understanding of things than it did before. Because of large language models integrated into the search algorithm, it now has an understanding of human language, primarily in English.
Now, if you optimize for topics or categories, you can rank a page for every single search query that Google already has an understanding of that is associated with that topic or category, even if that keyword's not present on the page.
When Bradley talks about categories, he means the Google Business categories assigned to particular industries. In pest control, the actual Google Business category is "Pest Control Service." Topic optimization, by contrast, is for ranking in organic search. Google created its Google Business categories from Wikipedia entities.
In Bradley's industry, the top level category is "tree service," but the top level topic term is "tree maintenance." He told me, "I can rank one page for every single search query associated with the tree service business industry if I optimize for tree service and tree maintenance, those two top level terms."
Most SEO professionals are still stuck in keyword mode. You can still optimize for keywords if you want, but it's limited in performance. If you optimize a document or page for a specific keyword, that page can only rank for that keyword and close variants. But if you optimize for a topic or a category, you can rank that page for every single search query that Google has an understanding of that's in its database for that topic or category.
Bradley has spent two years now studying and testing topic and category optimization, which is entity-based SEO. He's done those two years of testing exclusively in the tree service industry. Because he also does a lot of consulting with other SEO professionals and agencies, he's been fortunate to see how his process and methods can be applied to other industries, primarily for service area businesses.
Why Your Twenty Page Website Is Hurting Your Rankings
This section completely changed how I think about website structure, and it probably describes a lot of pest control websites right now.
Bradley walked me through an example. Let's say a pest control company does termite control, ant control, roach control, and spider control. And let's say in their service area there are five locations or city names that they want to target.
The traditional SEO approach would be to create 20 pages. You'd have Termite Control City 1, Termite Control City 2, Termite Control City 3, and so on. Four services times five cities equals 20 pages. Each one of those 20 pages probably has 1,000 words or 1,500 words or even 2,000 words of content.
Here's the problem: that's redundant. It's bloat. It's duplicated content. Three or four years ago, that's what everyone in the SEO industry did and it worked. But now that actually causes problems.
Google had what's called the Helpful Content Update. The simplified version is that Google now crawls a site and comes up with an overall site quality score. That quality score is now a multiplier that gets applied to every page on the site.
Before the Helpful Content Update, Google didn't rank websites; it ranked web pages for search queries. Remember, it's a document retrieval algorithm, and a web page is a document.
Before the Helpful Content Update, you could have a junk site that was not optimized well at all, but you could have one page on that site that was optimized to perfection, and that one page could rank really well. The rest of the pages wouldn't rank, but that one page could.
The Helpful Content Update eliminated that. Now, even if you had one page on the site that was optimized to perfection, if the overall site quality score was 70 percent, then Google's going to apply a 70 percent modifier to that page. It was Google's way of demoting websites that weren't optimized and efficient.
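The multiplier idea is simple arithmetic. This sketch is illustrative only, a model of the concept rather than Google's actual internals, and the scores are made up:

```python
# Illustrative model: a sitewide quality score scales every page's
# effective ranking strength, so bloat on some pages drags down all pages.
site_quality = 0.70  # hypothetical overall site quality score
page_scores = {"/termite-control": 1.00, "/old-duplicate-page": 0.40}

effective = {url: score * site_quality for url, score in page_scores.items()}
print(effective["/termite-control"])  # 0.7: even a "perfect" page is capped
```

Under this model, the only way to let your best page perform at full strength is to raise the sitewide score, which means cutting the redundant pages.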
Going back to that example of four services with five locations where you now have 20 pages, you're saying the same damn thing over and over and over again. That's absolutely unnecessary. It's redundant. It's bloat. It also causes what's called crawl resistance, because Google now has a harder time identifying which page is the best result for which query.
Bradley's solution is simple. If you've got four services in this example, then you have four service pages that are not optimized for any one location. It should be service plus brand. So you have Termite Control plus brand name, Ant Control plus brand name, and so on. Each one of those pages is just optimized for the service and the brand, and then you optimize that page thoroughly with descriptions about the service and what to expect.
Then you have separate location pages optimized for each of the cities that you want to target. Those are optimized for the top level category and topic term, meaning Pest Control Service City plus brand name in the SEO title.
Through internal linking, you only need one location page for each service area city, not a separate page for every service-city combination like in the earlier example.
On that location page, you don't need to have 1,000 or 1,500 or 2,000 words of content. You literally just need to meet the intent of the searcher, which would be who you are, how to contact you, and then if they want more information, here are the products or services provided in this area. That's just internal links that link over to the pages that thoroughly describe those services.
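The arithmetic of Bradley's lean structure is easy to see in a small sketch. The service and city names here are hypothetical, chosen to match the pest control example:

```python
# Hypothetical services and cities from the pest control example.
services = ["termite-control", "ant-control", "roach-control", "spider-control"]
cities = ["springfield", "arlington", "fairfax", "reston", "vienna"]

# Old model: one page per service per city -> redundant, bloated content.
old_pages = [f"/{city}-{service}" for service in services for city in cities]

# Lean model: one brand-level page per service, one page per city.
service_pages = [f"/{service}" for service in services]
location_pages = [f"/{city}" for city in cities]

# Each location page just links internally to every service page.
internal_links = {loc: list(service_pages) for loc in location_pages}

print(len(old_pages))                            # 20 pages in the old model
print(len(service_pages) + len(location_pages))  # 9 in the lean model
```

The lean model cuts the service/location page count by more than half, and the internal links do the work the redundant pages used to do.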
"Do you think anybody that needs pest control services is going to read 2,000 words of content on a page about pest control?" Bradley asked me. "They're looking to find: do you do this service, are you located near me, and how do I contact you. That's all they're looking for."
Bradley sees SEO professionals all the time who ask him why their project isn't performing, and it's because they're still optimizing like it's 2020. He told me, "In the last two years, the algorithm has changed more than it had in the previous 10 combined, and that is not a joke."
If you keep doing what you've always done and your results aren't getting any better, that's the definition of insanity.
Google Doesn't Read Your Content The Way You Think
One of the most surprising things Bradley shared was about how Google actually processes local SEO pages. Google has been very public about saying that their servers are overloaded. They cannot keep up with crawling and indexing all the new content on the web because of the explosion from ChatGPT and AI writing tools.
Because of that, part of the reason why this new optimization method works so well is because it reduces content and makes it easier for Google. It reduces crawl resistance. There are fewer pages and fewer things for Google to process.
If we can very clearly convey to Google who the business is, what they do, and where they do it in an efficient manner, Google will reward that project with better ranking performance.
Bradley has spent two years testing and found what he thinks is the perfect site structure for tree service contractors. It can be applied to any type of service area business. If you understand direct response marketing, you know that whatever your best performing campaign is becomes your control. Your sole purpose is to beat the control. In SEO, it's similar. You find your best performing structure, duplicate it with one variable changed, and see if it improves.
Bradley has spent two years working on just tree service sites on topic and category optimization methods. Being able to coach others on how to apply that to other types of service area businesses has been really rewarding. And this approach reduces the work significantly. You don't have to create as much content. There are fewer link building targets. It lowers overall overhead costs.
The Perfect Site Structure Bradley Spent Two Years Developing
Bradley has identified three things when it comes to on-page optimization: page structure, site structure, and internal linking.
Page structure is first because it's pages that rank, not sites. Within page structure, Bradley focuses on optimization points that provide the most leverage, the biggest performance gains with the least amount of work.
For page structure, the most important elements are:
The URL is always the first point. Bradley's advice? Don't over-optimize it. De-optimize it. Make it succinct and short without duplicating SEO terms or keywords. He sees domain names with pest control in them, then URLs like /pest-control-city/pest-control-city-termite-control. Years ago that worked. Now that's a negative ranking factor.
The second element is the SEO title (the meta title or document title) and the H1 (the page title). These are both critically important. Bradley sees SEO professionals duplicate the exact same title in both. Don't do that anymore.
Optimize for the top level category and/or topic term in the SEO title, and then use the other one in the H1, or vice versa. You can interchange them. Google sometimes displays an H1 as the SEO title or even the first H2 as the search results title.
Putting that into practice in his own industry, Bradley usually puts "tree service" (the category) in the SEO title and "tree maintenance" (the topic term) in the H1. He can then rank that one page for every single associated search query, even queries that never appear on the page.
So for page structure: URL, then SEO title, then H1, then headings (H2s), then subheadings (H3s, H4s, H5s). The hierarchy is critical.
In the early days of the web, page structure with proper HTML markup was very important. Over the years it became less important, and web designers started using H tags just to style elements. Now that we're back to entity-based SEO, proper use of HTML markup is absolutely critical again.
Bradley can rank a page with very little content, almost exclusively with just proper page structure. He wants to test ranking a page with nothing but headings and images.
For example, in pest control, a location page should be optimized for Pest Control Service in the SEO title and pest control or some variant in the H1. Going back to that example where termite control, ant control, roach control, and spider control are the four services, those would each be an H2. Those are the services that fall under Pest Control Service.
Then H3s, H4s, and H5s are subheadings that support their respective H2s. The H2s should support the H1. The H1 should support the document title.
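One way to keep that hierarchy honest is to generate the headings from a nested outline rather than hand-styling tags. A minimal sketch, with made-up outline content for the pest control example:

```python
# Hypothetical page outline: (H1 text, [(H2 text, [H3 texts]), ...]).
outline = ("Pest Control", [
    ("Termite Control", ["Inspection", "Treatment"]),
    ("Ant Control", ["Baiting", "Prevention"]),
])

def render(outline):
    """Render the outline as properly nested <h1>/<h2>/<h3> tags,
    so the markup always reflects the intended hierarchy."""
    h1_text, sections = outline
    parts = [f"<h1>{h1_text}</h1>"]
    for h2_text, subs in sections:
        parts.append(f"<h2>{h2_text}</h2>")
        parts.extend(f"<h3>{h3_text}</h3>" for h3_text in subs)
    return "\n".join(parts)

html = render(outline)
print(html)
```

Generated this way, an H3 can never appear except under its H2, which is exactly the supporting relationship Bradley describes.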
Don't use HTML elements to style content. Use them to tell Google what this page is about.
For local SEO, Bradley told me, "Google is not crawling the page and reading the content. It's not. It's looking at the big elements of the page."
Bradley tested this. He published a page with just an SEO title, H1, two H2s, and a couple of H3s under each H2 with no content or images. He ran it through Google's Natural Language API. It came back and identified exactly what he wanted with no content at all. Using Google's own API, Bradley knows for a fact that Google is not reading the content for local SEO.
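Bradley's experiment is straightforward to reproduce. Here's a sketch of that kind of test using Google's google-cloud-language client library; the headings are hypothetical, and the API call requires Google Cloud credentials, so it is shown but left commented out:

```python
# Build a page that is nothing but structure: no body copy, no images.
headings = [
    "<h1>Tree Maintenance</h1>",
    "<h2>Tree Removal</h2>", "<h2>Tree Trimming</h2>",
    "<h3>Stump Grinding</h3>", "<h3>Crown Reduction</h3>",
]
page_html = "<html><body>" + "".join(headings) + "</body></html>"

def analyze_entities(html: str):
    """Send a heading-only page to Google's Natural Language API and
    return the entities it extracts. Requires Google Cloud credentials."""
    from google.cloud import language_v1
    client = language_v1.LanguageServiceClient()
    doc = language_v1.Document(
        content=html, type_=language_v1.Document.Type.HTML)
    return client.analyze_entities(request={"document": doc}).entities

# analyze_entities(page_html)  # uncomment with credentials configured;
# Bradley's finding was that the entities come back from structure alone.
```

The point of the test is that the document contains zero body content, yet the entity analysis still identifies what the page is about.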
So that's page structure. Site structure and internal linking are then about the other pages not having redundancy or duplication, and about how you interlink pages so it's clear to both Google and visitors how to navigate.
That's why Bradley says to have a location page optimized for top level topics or categories. Then you have an H2 that says "We provide the following services in" and repeat the city and state. Underneath, those are H3s that link to their respective service pages.
Now that's perfectly logical for Google. When it crawls the page, it's looking at headings and subheadings. The subheadings are links that follow to pages that thoroughly describe each service.
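Concretely, the location-page pattern Bradley describes might look like this; the city, state, and URL slugs are placeholders:

```python
city, state = "Springfield", "VA"  # placeholder service area
services = {  # hypothetical service names -> their dedicated service pages
    "Termite Control": "/termite-control",
    "Ant Control": "/ant-control",
}

# The H2 repeats the city and state; each H3 is an internal link that
# leads to the page that thoroughly describes the service.
lines = [f"<h2>We provide the following services in {city}, {state}</h2>"]
for name, url in services.items():
    lines.append(f'<h3><a href="{url}">{name}</a></h3>')
snippet = "\n".join(lines)
print(snippet)
```

The location page stays thin on content while the linked headings hand Google a clean map from place to service.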
You don't need 13 paragraphs on the location page explaining each service. All you need to do is satisfy the intent of the searcher.
Those three elements (page structure, site structure, internal linking), if you nail them, then all other SEO efforts become easier. You need far less content, far fewer pages, far fewer backlinks. If you do internal linking properly, it doesn't matter which page you build links to because it distributes out through the internal links.
In the tree service industry, Bradley has a site structure with 10 pages: homepage, four service pages, five location pages. Plus standard pages like contact, about, privacy. But literally 10 content pages.
Bradley can rank those sites in low competition areas with zero backlinks. They rank number one for every search query with no backlinks because it's all about page structure, site structure, and internal linking.
Building Google's Knowledge Graph For Your Clients
Bradley uses a great visualization to explain how Google understands local businesses. Google develops a Knowledge Graph for each local business, which is Google's understanding of who the business is, what they do, and where they do it.
Bradley uses that statement because in the semantic web, you want to feed data to Google in semantic triple format (subject, predicate, object). In layman's terms, that's who you are, what you do, and where you do it.
If we can convey that clearly to Google, then it makes everything easier because Google rewards it when it has a good understanding.
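One widely used way to state that who/what/where data in machine-readable form, a technique the conversation doesn't name directly, is schema.org LocalBusiness markup in JSON-LD. A sketch with made-up business details:

```python
import json

# Hypothetical business details; schema.org LocalBusiness JSON-LD is one
# common vehicle for the who / what / where triple.
business = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Springfield Tree Pros",  # who the business is
    "description": "Tree service and tree maintenance contractor.",  # what
    "address": {  # where they do it
        "@type": "PostalAddress",
        "addressLocality": "Springfield",
        "addressRegion": "VA",
    },
    "areaServed": ["Springfield", "Fairfax"],
    "telephone": "+1-555-555-0100",
}

json_ld = json.dumps(business, indent=2)
script_tag = f'<script type="application/ld+json">\n{json_ld}\n</script>'
print(script_tag)
```

Dropped into a page's head, that script tag gives Google the subject (the business), its predicates (what it does), and its objects (where it does it) without relying on crawled prose.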
A Knowledge Graph is like a jigsaw puzzle. When a business first launches, Google has a vague understanding. It has to go out and find corroborating evidence online that supports who that business is, what they do, and where they do it.
If you leave Google on its own to develop that Knowledge Graph, it takes a long time.
When you buy a puzzle, you know from the box what the image is supposed to be. But when you dump it onto a table, it's just scattered pieces. It doesn't look like anything until you start organizing and fitting them together. As you get more pieces in the proper way, that image becomes clearer.
It's the exact same thing with Google's Knowledge Graph. Google has no idea at first. It has a vague understanding.
But can we as SEO professionals help Google develop that understanding more quickly? Yes, we absolutely can.
One way is to force Google to recognize accurate information by providing URLs: business directory listings, citations, published content, and press releases that all provide accurate information about who the business is, what they do, and where they do it.
We can extract those URLs and put them in strategic places that force Google to recognize that and put all those puzzle pieces together quickly. Now Google has a clear understanding, which is why it rewards those businesses.
As SEO professionals, we're puzzle builders. We should be putting the puzzle pieces together to help Google have that understanding quicker so our clients benefit.
My Main Takeaway
1. The biggest thing I learned from Bradley is that Google's algorithm has fundamentally shifted from keywords to entities, and most SEO professionals are still optimizing like it's 2020. If you optimize a page for a specific keyword, that page can only rank for that keyword and close variants. But if you optimize for a topic or category using proper page structure, that page can rank for every search query Google associates with that topic, even if those exact words never appear on the page. This isn't theoretical. Bradley has proven this with two years of testing in the tree service industry and through consulting with agencies in other industries. The key is understanding that Google is looking at structural elements like your URL, SEO title, H1, and headings to determine what a page is about, not reading through walls of content. Bradley tested this by creating a page with just an SEO title, H1, two H2s, and some H3s with no body content or images, then ran it through Google's Natural Language API. The API correctly identified what the page was about with zero content, proving that for local SEO, Google relies on structural elements far more than actual written content.
2. The second takeaway is that reducing pages actually improves rankings in the current algorithm, which is completely counterintuitive to what most SEO professionals learned and practiced for years. The old approach of creating 20 pages for four services across five locations is now actively hurting websites because of Google's Helpful Content Update. Google now creates an overall site quality score that acts as a multiplier for every page on your site. When you have redundant, duplicated content across dozens of pages saying essentially the same thing over and over, Google lowers that quality score and applies it to every single page. None of your pages perform as well as they should. Instead, you should have one service page per service optimized for service plus brand with no location mentioned, and one location page per city optimized for the top level category plus location plus brand. On the location page, you just need to meet the searcher's intent (who you are, where you are, how to contact you) and then use internal links to point to your service pages. This reduces your page count by more than half while actually improving performance.
3. The third insight is that proper HTML structure and hierarchy is critically important again for local SEO. Web designers started using H tags just to style elements of a page, but now that we're in entity-based SEO and Google is using large language models integrated into the search algorithm, proper use of headings and subheadings is absolutely critical to communicate what your page is about. Your H1 should be your page title optimized for the top level topic term. Your H2s should be the main services or topics that fall under that category. Your H3s through H5s should support their respective H2s. The H2s should support the H1, and the H1 should support the document title. This hierarchy tells Google exactly what your page is about in a very clear, efficient way. You should never duplicate your SEO title and H1. Instead, use one for the top level category and the other for the top level topic term. Don't use HTML elements to style your content, use them to convey meaning and structure to Google.
4. The fourth major takeaway is the concept of being a puzzle builder for Google's Knowledge Graph, which is one of the best visualizations for understanding how local SEO actually works. When a local business first launches, Google has a very vague understanding of who they are, what they do, and where they do it. It's like dumping out a jigsaw puzzle onto a table. You know what the picture is supposed to be from looking at the box, but all you see is scattered pieces until you start organizing and fitting them together. If you leave Google on its own to figure this out by searching for corroborating evidence online, it takes forever for that picture to become clear. But as SEO professionals, we can dramatically speed up this process by creating business directory listings, citations, press releases, and other strategic URLs that provide accurate, consistent information to Google about who the business is, what they do, and where they do it. We extract those URLs and put them in strategic places that force Google to recognize them and put the puzzle pieces together quickly. When we help Google develop a clear Knowledge Graph fast, Google rewards those businesses with better rankings.
5. The fifth and final insight is that this new approach is massively more efficient for everyone involved. You need far less content because you're not writing 2,000 words about ant control for five different cities. You're writing one thorough piece of content about ant control on a service page, then using location pages with minimal content that link to those service pages. You have fewer pages to maintain and optimize, which means lower overhead. You have fewer link building targets, so you can focus your efforts on fewer pages and let the proper internal linking distribute that authority throughout the site. Your costs go down, your workload goes down, but your results go up. And critically, it's also more efficient for Google, which is why it works. Google has publicly stated their servers are overloaded with the explosion of AI-generated content and they can't keep up with crawling and indexing everything. When you reduce crawl resistance by having a lean, well-structured site that clearly conveys who the business is, what they do, and where they do it, Google rewards you. The sweet spot Bradley has found for tree service companies is 10 content pages total: homepage, four service pages, five location pages. He has examples of sites ranking number one for every relevant search query in low competition markets with zero backlinks, just from nailing page structure, site structure, and internal linking.
Where to Find Bradley Benner
Bradley runs a free weekly webinar every Wednesday at 4 PM Eastern called Hump Day Hangouts. They just hit their 500th episode and are about 20 episodes away from their 10 year anniversary. You can watch replays, post questions ahead of time, and even interact with an AI version of Bradley that's been trained on 10 years of him answering SEO questions live on the show. Check it out at semanticmastery.com/hdh.
You can also find Bradley's work at:
Tree Care HQ (his local SEO agency specializing in tree service companies)
Semantic Mastery (his coaching and consulting company for local SEO professionals)
Semantic Links (white label link building services for agencies)
Local Fury (app specifically for improving Google Maps rankings)