Read about the latest SEO news and updates

Wednesday, August 30, 2017

Google says we don’t need no stinking location modifiers… or do we?

Google recently stated that users are beginning to drop location qualifiers when searching for local businesses. But is it time for local SEOs to stop focusing on these terms?

Last week, Google declared that the “near me” search query and other “geo-modifiers” (e.g., ZIP code, city name) were, if not dead, then certainly not worth spending your valuable SEO time worrying too much about:
In September 2015, we shared that “near me” or “nearby” searches on Google had grown 2X in the previous year. Now, just two years later, we see that behavior has continued to change. Make no mistake, people still use “near me” to discover places of interest around them. But we’re now seeing a shift toward dropping location qualifiers (like zip [sic] codes, neighborhoods, and “near me” phrasing) in local searches, because people know that the results will automatically be relevant to their location — thanks to their phone. It’s kind of magical. In fact, this year, search volume for local places without the qualifier “near me” has actually outgrown comparable searches that do include “near me.” [see data] Over the last two years, comparable searches without “near me” have grown by 150% [see data].
But, as you can see through Google Trends, it’s not just that “implicit” local search queries (searches for local places without the local qualifier) are growing rapidly — it’s that they have always had a much higher base volume as well.

Source: Google Trends
So, we get it — the search term “tacos” is a better term to target than “tacos costa mesa.” However, Google treats implicit/explicit/”near me” searches differently. Just look at these different results (searches were all done from the same location with an incognito browser):
Tacos (located in Costa Mesa, CA)

Tacos Costa Mesa

Costa Mesa Tacos

Tacos Near Me (located in Costa Mesa)

While there’s overlap, no two of the result sets above are identical. That tells us that Google evaluates each of these queries differently. Not only that, but according to our 2016 Local SEO Ranking Factors study (2017 version coming soon), it appears Google weighs different ranking factors for each as well.
Here’s how various factors correlated with high rankings in the Local Pack for implicit and explicit local search queries:

So, this requires a more holistic approach to location-based SEO. Local SEO isn’t just about fixing data accuracy problems; it’s about making sure that clients are effectively optimized for the myriad terms and paths that will generate them business.
That means implicit local queries like “tacos” may have the highest search volume and the best growth opportunity, but if you’re skipping the other term variations, you are leaving money on the table.
For example, here is the monthly search volume for these searches in Pleasanton, Calif.:
Tacos — 210
Tacos Near Me — 140
Tacos Costa Mesa — 50
Costa Mesa Tacos — 10
If you are only focusing on “tacos,” that’s 210 monthly searches, or 51 percent of the potential traffic out there for core taco-related terms (and there are likely hundreds of long-tail “taco” queries for a city, each with their own unique results). Now, start applying that to, say, a 100-location business, and that’s 21,000 potential customers you are trusting Google to send to the right place.
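For the skeptical, here is that arithmetic as a few lines of Python, using the volumes listed above:

```python
# Monthly search volumes for the core taco-related terms above.
volumes = {
    "tacos": 210,
    "tacos near me": 140,
    "tacos costa mesa": 50,
    "costa mesa tacos": 10,
}

total = sum(volumes.values())     # 410 core searches/month
share = volumes["tacos"] / total  # ~0.51
print(f"'tacos' alone covers {share:.0%} of {total} core monthly searches")

# Scaled to a 100-location business:
print(f"{volumes['tacos'] * 100:,} monthly searches hinge on 'tacos' alone")
```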
This is part of the reason why we treat ALL of these terms as “taco” searches: given a particular searcher, their search behavior, and their proximity to your location, they could theoretically be running any one of these queries.
By the way, the #1 tacos-related search term?
Tacos Nearby — 590

So I wouldn’t abandon “near me” optimizations just yet.

Source URL: http://searchengineland.com/google-says-dont-need-no-stinking-location-modifiers-281428

Friday, August 25, 2017

Phrase match is dead. Long live phrase match!

Reports of the demise of phrase match have been greatly exaggerated, finds contributor Steve Cameron. It turns out that there are certain situations in which it comes in pretty handy. 

When Google launched its broad match modified match type some years ago, some marketers hailed the demise of phrase match as something of a foregone conclusion.
Many, ourselves included, quickly shifted gears and rolled out new campaigns using a combination of broad match modified (BMM), exact match and negative match keywords. And others did not.
Let’s consider why we did this. First — and this should be writ large wherever AdWords advertisers congregate — no one likes broad match. No one except, perhaps, Google.
Using broad match is essentially lazy, the equivalent of holding out your hand with all your money in it and letting Google take what they want. They’re getting better — broad match is nowhere near as bad as it used to be — but it still allows Google a freedom you wouldn’t want your car mechanic or your electrician to have.
As a result, many advertisers instead opted for phrase match and exact match for their keywords, often having to develop hundreds of phrase match keywords to cover a multitude of possible search terms. For example, if we were trying to promote our blue widgets, we might start with “blue widgets” as a search term but soon have to add “widgets in blue,” “blue color widgets,” and so on.
With the introduction of BMM, advertisers were suddenly able to cover hundreds of possible search terms by simply using +blue +widgets. In our agency, we then monitor the Search Query Performance reports, pull out those search terms that deserve special attention and move them directly into exact match status, thus making the intermediate phrase match version redundant.
This streamlines the process and, with new build campaigns, seems to be the simplest way to get campaigns live and generating real data. We can then crunch this data faster to get to the kernels of optimization that will drive ongoing success.
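To make the mechanical difference concrete, here is a simplified Python sketch of the two matching behaviors. This is an illustration only; it ignores close variants, plurals, and everything else Google's real matcher does:

```python
def phrase_match(keyword: str, query: str) -> bool:
    """Phrase match (simplified): the keyword's words must appear in
    the query as a contiguous phrase, in the same order."""
    kw = keyword.lower().split()
    q = query.lower().split()
    return any(q[i:i + len(kw)] == kw for i in range(len(q) - len(kw) + 1))


def bmm_match(keyword: str, query: str) -> bool:
    """Broad match modified (simplified): every +term must appear
    somewhere in the query, in any order."""
    terms = [w.lstrip("+").lower() for w in keyword.split()]
    words = set(query.lower().split())
    return all(t in words for t in terms)


print(phrase_match("blue widgets", "cheap blue widgets for sale"))  # True
print(phrase_match("blue widgets", "widgets in blue"))              # False
print(bmm_match("+blue +widgets", "widgets in blue"))               # True
```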

Phrase match is still holding its own

We reached out to other marketers to discover how phrase match fits into their account management approach and found that reports of phrase match’s demise may be a little premature.
As Mark Kennedy from SEOM Interactive comments:
“While the changes to match types over the last few months have leant towards the demise of phrase match, it still has its place. I won’t remove it from older accounts where phrase match KWs have good history and conversion value (although I will prune those with no value). And if they are in accounts where ad groups are broken out by match type, I tend to keep them as well — again, especially where there is historical data. The other time I use them is when I am mining search query reports for converting phrases that I want to test. So I’ll use different match types on these new terms (including phrase match) to see what performs well. So while phrase match isn’t ‘necessary’ anymore we haven’t taken it totally out of the utility belt.”
Kennedy’s comments about historical value are particularly significant. Google loves historical data, especially when it comes to Quality Score, and it could take a new BMM keyword some time to achieve the high QS already enjoyed by a long-term phrase match keyword that has consistently performed well.
While others echo these sentiments, Julie Friedman Bacchini of Neptune Moon feels that keyword matching is becoming ever fuzzier. She cites the changes in the Keyword Planner, where match types are no longer segmented, as an indication of how Google now sees match types. Coupled with the end of the close variant opt-out, this is blurring the battle lines between marketers and the search engines. Interestingly, she concludes:
“It’s a little ironic, really, because it seems like now more than ever true phrase match would be helpful to advertisers. With matching getting fuzzier, the ability to zero in on a particular phrase or way of expressing a search term without having to sculpt it with negatives would be welcomed by many, I suspect!”
James Svoboda, partner at WebRanking, also agrees that last year’s close variant rollout essentially made phrase match and Broad Match Modified the same, with one significant exception: word order.
He explains:
“Since we typically see similar results for search terms containing the same words, but in different word order, such as PPC Agency Minneapolis and Minneapolis PPC Agency, both of which can be targeted using one BMM Keyword: +Minneapolis +PPC +Agency, we’ve opted to simplify our account structure by using BMM instead of phrase match for most keywords. There are a few case-by-case situations that may arise where word order has a significant impact on performance, such as “Used Cars” and “Cars Used,” where the first one typically will be used by searchers with queries for preowned vehicles and the second one by searchers looking for cars used for specific tasks, such as cars used for off-roading. Even in these cases, we can use Negative Keywords to shape the queries that we are targeting through BMM Keywords.”
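Here is that negative-keyword shaping expressed under the same simplified matching rules as the earlier sketch (again, an illustration, not Google's actual matcher):

```python
def bmm_match(keyword: str, query: str) -> bool:
    """Every +term present in the query, in any order (simplified)."""
    words = query.lower().split()
    return all(t.lstrip("+").lower() in words for t in keyword.split())


def contains_phrase(phrase: str, query: str) -> bool:
    """The phrase appears contiguously and in order (simplified)."""
    p, q = phrase.lower().split(), query.lower().split()
    return any(q[i:i + len(p)] == p for i in range(len(q) - len(p) + 1))


def shaped_match(keyword: str, query: str, negatives: list) -> bool:
    """One BMM keyword, with negative phrases carving out unwanted intent."""
    return bmm_match(keyword, query) and not any(
        contains_phrase(n, query) for n in negatives)


negatives = ["cars used for"]
print(shaped_match("+used +cars", "used cars minneapolis", negatives))      # True
print(shaped_match("+used +cars", "cars used for off-roading", negatives))  # False
```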
As a result, they have effectively removed phrase match from their arsenal and are looking forward to the day when both Google and Bing follow suit.
Whether that day will ever come is debatable. There are, perhaps, too many accounts that have been built and optimized over the years around a solid core of well-performing phrase match keywords. And, since phrase match isn’t broken, there may be no reason to try to fix it.
And for some of us, like Gil Hong from Seer, phrase match may simply be a habit that is hard to shake:
“I still use phrase match! Partly out of habit of how I build campaigns in Excel, and in some instances, phrase match still has a place for specific keyword intent.”
The consensus, therefore, seems to be that there is some life in phrase match after all. Where a well-performing phrase match keyword with great historical data and a good Quality Score is present, it could take some time for a Broad Match Modified keyword to catch up — and indeed, it would be hard to imagine the performance ever quite being matched. But where there are well-performing phrase match keywords, there are Exact Match versions willing and eager to usurp them.

So poor phrase match is getting pressure from both sides, and while it might be able to hold its own for the time being based on past glories, going forward, its days are still probably numbered. But then, we said that a couple of years ago!

Tuesday, August 22, 2017

Load time, static site generators & SEO: How we lowered our own site’s load time by 74%

Google's upcoming transition to mobile-first indexing, combined with its raised expectation of mobile site performance, should convince site owners to consider static site generation. 


Google has raised the bar on site load times consistently over the last decade, and the upcoming transition to mobile-first indexing, combined with its rising expectations of mobile site performance, should be a clear warning sign to site owners.
Site owners generally, however, seem not to be listening.
Unfortunately, based on our analysis of 10,000+ mobile web domains, we found that most mobile sites don’t meet this bar: The average load time for mobile sites is 19 seconds over 3G connections.
— DoubleClick research published in September 2016
At our company, we have been experimenting over the last year with static site generation. Our tests on our own site are aimed at allowing us to assess the challenges facing site owners, to understand the scope of opportunity and potential for performance improvement, and also to explore the practical limitations in content management — one of the key criticisms of static site generation.
Our site, QueryClick.com, was a small, fairly well-optimized B2B site, but it averaged a ~6.99-second load time in the month prior to our deployment of static site generation (July 2016), dropping to ~1.8 seconds in the month following. That represented a load time reduction of 74.29 percent, despite some server response issues experienced during the period we were actively developing the site.

One month before and after switching to a static site generation infrastructure.
We performed further server optimization improvements over the year, reaching our sub-one-second mobile device target even while testing the impact of less efficient elements driven by JavaScript.

Yes, we know! We didn’t even use sprites, gzipping, or other such techniques — which highlights the impact of a platform-first approach to solving the page speed problem.

A platform-first approach to page speed

I’ve written before about the varying levels of importance of the different aspects of page speed on SEO and about how Google’s algorithm employs data about SERP bounce-backs (when users bounce back to its SERPs after losing patience with a slow-loading site). But it’s worth making the point again as we head to a mobile-first world: server response times and the critical render path event (the point at which everything in the initial device viewport is rendered) are key to delivering high-performance SEO, especially for enterprise-level sites.
Any developer worth their salt will look at the asset load requests in QueryClick’s site displayed in the above image and shake their head at all the unoptimized elements. But that is the point. High performance was achieved despite a lack of rigorous optimization in the code and asset deployment. It was driven by the platform and the high-level architecture decisions.

Google would like us to improve further — and we will, but real performance change has already been delivered.
So, what architecture did we use? As Python and Django evangelists, we write the copy in Markdown and push it to our staging server build via GitHub, where we can check that it’s all OK. We then use Celery to schedule a time for the staging copy to be pushed to the live Git repository. Then Cactus regenerates the pages and, voilà, the live server is populated with the static pages.
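In rough outline, that scheduled publish step might look something like the sketch below. The broker URL, repository path, and task name are illustrative placeholders, not our production code:

```python
import subprocess
from celery import Celery

app = Celery("deploy", broker="redis://localhost:6379/0")


@app.task
def publish_site(repo_path="/srv/site"):
    """Push the approved staging copy to the live repo, then rebuild."""
    # Push the reviewed content from staging to the live repository.
    subprocess.run(["git", "push", "live", "master"], cwd=repo_path, check=True)
    # Cactus regenerates every page as a flat file; the web server then
    # serves static output, with no code executing per request.
    subprocess.run(["cactus", "build"], cwd=repo_path, check=True)


# Schedule the publish for a specific time, e.g.:
# from datetime import datetime, timezone
# publish_site.apply_async(eta=datetime(2017, 8, 23, 22, 0, tzinfo=timezone.utc))
```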
Of course, without some technical know-how, this infrastructure is not as simple for the average content producer to create or maintain as a standard CMS. That is the most common criticism of static site deployments, and many enterprise clients consider it a deal-breaker when they’re looking at static site solutions.
Certainly, if you manage product inventory that changes by tens of thousands of items a day, as one of our clients does, then a robust management back end is essential.
That’s why anyone deploying a static site performance solution in the enterprise must leverage Oracle ATG or the like, which can easily generate and manipulate static web pages using its API. When you think about it, live dynamic site management requires significantly more hardware infrastructure than static.
If you need more convincing, take a look at the variety of static site generator codebases already in flight. They are written in a variety of programming languages, and many of them are fully capable of being fitted into an enterprise environment. When you also use a content delivery network (CDN) in production, you can offer a robust solution that delivers both blazing speed (even over poor 3G mobile connections) and total redundancy, eliminating server failure challenges.
Dynamic static asset provision and modern caching controls on static generators allow clean, live adjustment of content that is exactly comparable to dynamic site generation at a fraction of the hardware demand.
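As one simplified example of what those caching controls can look like (the header values below are a reasonable starting point, not a universal rule), fingerprinted assets can be cached for a year while the HTML itself keeps a short TTL so edits propagate quickly:

```python
def cache_headers(path: str) -> dict:
    """Choose Cache-Control headers for a statically generated site."""
    if path.endswith((".css", ".js", ".png", ".jpg", ".svg", ".woff2")):
        # Fingerprinted assets (e.g., main.3f9c2e.css) never change in
        # place, so browsers and the CDN can cache them for a year.
        return {"Cache-Control": "public, max-age=31536000, immutable"}
    # HTML is regenerated on every publish; a short TTL lets content
    # edits go live within minutes.
    return {"Cache-Control": "public, max-age=300"}


print(cache_headers("/static/main.3f9c2e.css"))
print(cache_headers("/index.html"))
```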
It may take years for the general web to catch up with Google’s pioneering push for modern, fast, easy-anywhere web experiences. But if you want to benefit your conversion rate and your brand experience, and enjoy a significant drive to SEO performance thanks to fast critical render performance and positive SERP bounce behavior, then you should have your development team investigate and find an architecture that works for your site today.

Source URL: http://searchengineland.com/load-time-static-site-generators-seo-280785

Wednesday, April 12, 2017

Website Uses Its Name in the URL in Google Search Results

While I was searching for the keyword “seo services in delhi ncr,” I found a result listed in Google search results with a totally unique URL. I was a little surprised to see the listed URL structure. Here is the screenshot of that unique URL structure:


It was the first time I had seen this type of URL in Google search results, so I was excited to find out how it could be done. I started researching it and found that it came from an older Google update, rolled out on April 16, 2015, called “Google Replaces A Site’s URL In Search Results & Uses Its Site Name & Breadcrumb Path.”

I realized that there are many such minor updates that happen in between, and we miss them because we always focus on the major updates like Penguin, Panda, and so on. I am sure that if we keep a close eye on each minor update like this, we can become leaders in the industry. Very few people in the SEO industry watch such minor updates closely, so their clients’ websites never get the benefit of these changes.

And when we spot such a minor update, we should implement it on our own websites to understand how it works in practice. It can be an attraction for users and clients as well.
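From what I have found so far, this breadcrumb display is driven by schema.org BreadcrumbList structured data on the page. Here is a minimal sketch of generating that JSON-LD markup with Python; the example.com URLs and page names are placeholders:

```python
import json

breadcrumbs = {
    "@context": "https://schema.org",
    "@type": "BreadcrumbList",
    "itemListElement": [
        {"@type": "ListItem", "position": 1, "name": "Home",
         "item": "https://www.example.com/"},
        {"@type": "ListItem", "position": 2, "name": "Services",
         "item": "https://www.example.com/services/"},
        {"@type": "ListItem", "position": 3, "name": "SEO",
         "item": "https://www.example.com/services/seo/"},
    ],
}

# Embed this in the page's <head> so Google can show a breadcrumb
# path in place of the raw URL in its search results:
print('<script type="application/ld+json">\n%s\n</script>'
      % json.dumps(breadcrumbs, indent=2))
```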

I think that is enough writing for today. I am still in the implementation stage for this update, and once I see anything different on Google, I will share another blog post.

Thanks for reading my blog. Your comments and questions are appreciated.