Daily Archives: January 13, 2022

DuckDuckGo passes 100 billion searches

By | January 13, 2022


DuckDuckGo, the privacy-focused search engine, announced it has surpassed 100 billion searches all-time. It posted the announcement on Twitter.


A year ago, the search company announced it was seeing over 100 million searches per day. Now its public traffic statistics page shows that all-time searches have passed 100,024,437,307, and that the highest daily query count to date was 110,439,133, a record set this past Monday. DuckDuckGo continues to grow on a daily basis.

Why we care. Pressure over consumer privacy has prompted Apple and Google to block third-party cookies from tracking users across the web. That same focus on privacy has also helped propel DuckDuckGo past 100 billion searches all-time. Its success, while modest, may provide new or existing search engines with a roadmap to chipping away at Google’s dominance or avoiding it altogether by concentrating on an underserved base of users.

It is still miles behind Google, but DuckDuckGo is gradually closing in on Yahoo! and Bing, so a future in which it is as much a part of the conversation as Bing may not be that far off. Nevertheless, it remains highly unlikely that marketers will optimize specifically for any non-Google search engine.



About The Author

Barry Schwartz is a Contributing Editor to Search Engine Land and a member of the programming team for SMX events. He owns RustyBrick, a NY-based web consulting firm. He also runs Search Engine Roundtable, a popular search blog on very advanced SEM topics. Barry’s personal blog is named Cartoon Barry and he can be followed on Twitter here.



Source link: Searchengineland.com

How to build a long-term, search-first marketing strategy

By | January 13, 2022


“There are roughly three and a half billion Google searches made every day,” said Craig Dunham, CEO of enterprise SEO platform Deepcrawl, at our recent MarTech conference. “According to research from Moz, 84% of people use Google at least three times a day, and about half of all product searches start with Google. The way that consumers are engaging with brands is changing, and it’s doing so rapidly.”

He added, “Consumers begin their journey with the tool that many of us use hundreds of times a day. Thus, the connection to revenue becomes clear — it starts with search.”

Source: Craig Dunham and Scott Brinker

The concept of digital transformation has been around for years, but it’s taken a whole new form in the wake of recent societal shifts. New technologies and the 2020 pandemic have led to a “greater focus on the need to drive optimal digital experiences for our customers,” says Dunham.

A brand’s website is often the first, and most lasting, impression customers will have of an organization. Here are some strategic actions Dunham recommends marketers take to ensure their online properties are optimized for the search-first age.

“The website is a shared responsibility and it requires proper strategic leadership,” Dunham said. “The first step is to take some time and educate yourself, your leadership, your board and your organization so they more broadly promote organic KPIs as business-wide objectives.”

“There’s great data out there on the impact of the efficiency of SEO as a low-cost acquisition channel,” he added.

Source: Craig Dunham

Aside from sharing communication from Google on the importance of search from a business perspective, marketers can look for case studies from reputable organizations to encourage search prioritization. This can help higher-ups start seeing organic traffic as a key business metric.

“I was in a meeting recently and I had a digital leader say to me that, you know, website performance should not be an SEO metric — it has to be a business metric,” he said.

Create a cross-functional search ops task force

“Much of the data and insight generated by SEOs and their tools today are rarely utilized to their full potential,” Dunham said. “This is in part due to SEO not being seen as a business priority. As a result, it’s been siloed — pulling in teams from across the organization breaks down those silos.”

The more team members are involved with search processes, the more they’ll see its impact. People from each department will have more opportunities to contribute to growing online visibility using their unique skillsets.

“We know that businesses that are able to implement these organizational-wide search operations systems and practices — connecting a range of perspectives and search activities that are happening — are going to be the ones that will have a competitive advantage,” said Dunham.

Apply SEO testing automation

More and more brands are turning to automation tools to streamline tasks. According to Dunham, these solutions can be used for search-related activities as well.

“Automation can be well-deployed within web development processes,” Dunham said. “Until recently, this technology didn’t exist.”

Brands now have access to a wide variety of automation tools to streamline SEO-related tasks. The key is to pick solutions that align with your organization’s goals and give you full control over their deployment: “There are additional risk mechanisms that can be put in place to ensure you don’t release bad code that will result in large traffic losses, ultimately driving down revenue across your critical web pages,” said Dunham.
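To make the idea concrete, here is a minimal, hypothetical sketch of such a safeguard in Python: a pre-release check that fails the build if critical pages are broken or accidentally de-indexed. The URLs and the checks themselves are illustrative assumptions, not a tool Dunham names.

import requests

# Hypothetical list of business-critical pages to verify before a release.
CRITICAL_URLS = [
    "https://staging.example.com/",
    "https://staging.example.com/products/",
]

def check_page(url):
    """Return a list of SEO problems found on a single page."""
    problems = []
    response = requests.get(url, timeout=10)
    if response.status_code != 200:
        problems.append(f"{url} returned status {response.status_code}")
    html = response.text.lower()
    # Crude check for a noindex directive anywhere in the markup.
    if "noindex" in html:
        problems.append(f"{url} appears to carry a noindex directive")
    if "<title>" not in html:
        problems.append(f"{url} is missing a title tag")
    return problems

all_problems = [p for url in CRITICAL_URLS for p in check_page(url)]
if all_problems:
    # Failing here keeps bad code from reaching production pages.
    raise SystemExit("\n".join(all_problems))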

If brands can optimize their internal process, teams and tools around organic search, they’ll increase their chances of achieving long-term success in the search-first digital landscape.


New on Search Engine Land

About The Author

Corey Patterson is an Editor for MarTech and Search Engine Land. With a background in SEO, content marketing, and journalism, he covers SEO and PPC industry news to help marketers improve their campaigns.



Source link: Searchengineland.com

How To Use IndexNow API With Python For Bulk Indexing

By | January 13, 2022


IndexNow is a protocol developed by Microsoft Bing and adopted by Yandex that enables webmasters and SEO pros to easily notify search engines when a webpage has been updated via an API.

And today, Microsoft announced that it is making the protocol easier to implement by ensuring that submitted URLs are shared between search engines.

Given its positive implications and the promise of a faster indexing experience for publishers, the IndexNow API should be on every SEO professional’s radar.

Using Python to automate URL submission to the IndexNow API, or to make bulk API requests for URL indexing, can make managing IndexNow more efficient for you.

In this tutorial, you’ll learn how to do just that, with step-by-step instructions for using the IndexNow API to submit URLs to Microsoft Bing in bulk with Python.

Note: The IndexNow API is similar to Google’s Indexing API with one difference: the Google Indexing API is only for job postings and for broadcast pages that contain a video object within them.

Google announced that it will test the IndexNow API but hasn’t provided an update since.

Bulk Indexing Using IndexNow API with Python: Getting Started

Below are the prerequisites for understanding and implementing this IndexNow API tutorial.

Here are the Python packages and libraries that will be used; the installation command appears after the list.

  • Advertools (required).
  • Pandas (required).
  • Requests (required).
  • Time (optional).
  • JSON (optional).
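If any of these are missing from your environment, the third-party packages can be installed with pip in your terminal (the “time” and “json” modules ship with Python’s standard library):

pip install advertools pandas requests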

Before getting started, reading the basics can help you understand this IndexNow API and Python tutorial better. We will be using an API Key and a .txt file to provide authentication, along with specific HTTP headers.

IndexNow API usage steps with Python.

1. Import The Python Libraries

To use the necessary Python libraries, we will use the “import” command.

  • Advertools will be used for sitemap URL extraction.
  • Requests will be used for making the GET and POST requests.
  • Pandas will be used for taking the URLs in the sitemap into a list object.
  • The “time” module is to prevent a “Too Many Requests” error with the “sleep()” method.
  • JSON is for possibly modifying the POST JSON object if needed.

Below, you will find all of the necessary import lines for the IndexNow API tutorial.

import advertools as adv
import pandas as pd
import requests
import json
import time

2. Extracting The Sitemap URLs With Python

To extract the URLs from a sitemap file, different web scraping methods and libraries can be used such as Requests or Scrapy.

But to keep things simple and efficient, I will use my favorite Python SEO package – Advertools.

With only a single line of code, all of the URLs within a sitemap can be extracted.

sitemap_urls = adv.sitemap_to_df("https://www.example.com/sitemap_index.xml")

The “sitemap_to_df” method of Advertools can extract all the URLs, along with other sitemap-related tags such as “lastmod” or “priority.”

Below, you can see the output of the “adv.sitemap_to_df” command.

Sitemap URL extraction can be done via Advertools’ “sitemap_to_df” method.

All of the URLs and dates are specified within the “sitemap_urls” variable.

Since sitemaps are useful sources for search engines and SEOs, Advertools’ “sitemap_to_df” method can be used for many different tasks, including a Python sitemap audit.

But that’s a topic for another time.

3. Take The URLs Into A List Object With “to_list()”

Python’s Pandas library has a method, “to_list()”, for converting a data frame column (a data series) into a list object.

Below is an example usage:

sitemap_urls["loc"].to_list()

Below, you can see the result:

Pandas’ “to_list” method can be used with Advertools for listing the URLs.

All URLs within the sitemap are in a Python list object.

4. Understand The URL Syntax Of IndexNow API Of Microsoft Bing

Let’s take a look at the URL syntax of the IndexNow API.

Here’s an example:

https://<searchengine>/indexnow?url=url-changed&key=your-key

The URL syntax expresses the parameters and their relations to each other within the RFC 3986 standard.

  • The <searchengine> represents the name of the search engine that you will use the IndexNow API for.
  • The “?url=” parameter specifies the URL that will be submitted to the search engine via the IndexNow API.
  • The “&key=” parameter is the API Key that will be used with the IndexNow API.
  • The “&keyLocation=” parameter provides proof of authenticity, showing that you are the owner of the website that the IndexNow API will be used for.

The “&keyLocation” will bring us to the API Key and its “.txt” version.
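To make the syntax concrete, here is a minimal Python sketch that builds a valid endpoint; the key and location values are placeholders, and “quote” percent-encodes the submitted URL so that its own query parameters don’t break the request:

from urllib.parse import quote

key = "your-indexnow-api-key"
location = "https://www.example.com/your-indexnow-api-key.txt"
page = "https://www.example.com/some-page/?ref=1"

# Percent-encode the page URL before placing it in the endpoint.
endpoint = f"https://www.bing.com/indexnow?url={quote(page, safe='')}&key={key}&keyLocation={quote(location, safe='')}"
print(endpoint)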

5. Gather The API Key For IndexNow And Upload It To The Root

You’ll need a valid key to use the IndexNow API.

Use this link to generate the Microsoft Bing IndexNow API Key.

There is no limit on generating IndexNow API Keys.

Clicking the “Generate” button creates an IndexNow API Key.

When you click on the download button, it will download the “.txt” version of the IndexNow API Key.

An IndexNow API Key can be generated at the address stated by Microsoft Bing.
The IndexNow API Key is downloaded as a txt file.

The API Key will be both the name of the TXT file and the content within the text file.

The IndexNow API Key in the TXT file should match both the name of the file and the actual API Key value.

The next step is uploading this TXT file to the root of the website’s server.

Since I use FileZilla for my FTP, I have uploaded it easily to my web server’s root.

Putting the .txt file into the web server’s root folder completes the IndexNow API setup.
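Before submitting any URLs, it is worth confirming that the key file is actually reachable and that its content matches the key. A minimal sketch, reusing the example key and domain from this tutorial:

import requests

key = "22bc7c564b334f38b0b1ed90eec8f2c5"
location = f"https://www.example.com/{key}.txt"

# The file should return a 200 status and contain only the key itself.
response = requests.get(location)
print(response.status_code, response.text.strip() == key)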

The next step is writing a simple for loop to submit all of the URLs within the sitemap.

6. Submit The URLs Within The Sitemap With Python To IndexNow API

To submit a single URL to IndexNow, you can use a single “requests.get()” call. But to make it more useful, we will use a for loop.

To submit URLs in bulk to the IndexNow API with Python, follow the steps below:

  1. Create a key variable with the IndexNow API Key value.
  2. Replace the <searchengine> section with the search engine that you want to submit URLs to (Microsoft Bing or Yandex, for now).
  3. Assign all of the URLs from the sitemap to a list variable.
  4. Assign the URL of the “txt” key file within the root of the web server to a “location” variable.
  5. Place the URL, key, and key location within the f-string.
  6. Start your for loop, and use “requests.get()” for all of the URLs within the sitemap.

Below, you can see the implementation:

key = "22bc7c564b334f38b0b1ed90eec8f2c5"
location = "https://www.example.com/22bc7c564b334f38b0b1ed90eec8f2c5.txt"
url = sitemap_urls["loc"].to_list()
for i in url:
     endpoint = f"https://bing.com/indexnow?url={i}&key={key}&keyLocation={location}"
     response = requests.get(endpoint)
     print(i)
     print(endpoint)
     print(response.status_code, response.content)
     #time.sleep(5)

If you’re concerned about sending too many requests to the IndexNow API, you can use the Python time module to make the script wait between every request.
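As a variant of the loop above, the sketch below waits one second between requests and backs off once when the API answers with a 429 Too Many Requests status; the one-second and 30-second delays are arbitrary example values:

import time
import requests

for i in url:
     endpoint = f"https://bing.com/indexnow?url={i}&key={key}&keyLocation={location}"
     response = requests.get(endpoint)
     if response.status_code == 429:
          # Too many requests: back off, then retry this URL once.
          time.sleep(30)
          response = requests.get(endpoint)
     print(i, response.status_code)
     time.sleep(1)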

Here you can see the output of the original submission script:

The empty string in the request’s response body represents success, according to Microsoft Bing’s IndexNow documentation.

The 200 Status Code means that the request was successful.

With the for loop, I have submitted 194 URLs to Microsoft Bing.

According to the IndexNow Documentation, the HTTP 200 Response Code signals that the search engine is aware of the change in the content or the new content. But it doesn’t necessarily guarantee indexing.

For instance, I have used the same script for another website. After 120 seconds, Microsoft Bing says that 31 results are found. And conveniently, it shows four pages.

The only problem is that on the first page there are only two results, and it says that the URLs are blocked by robots.txt, even though the blocking was removed before submission.

This can happen when robots.txt was changed to unblock some URLs shortly before using the IndexNow API: Bing does not appear to re-check robots.txt right away, so it tries to index the submitted URLs while still relying on the previous version of the file.

This is what happens if you use the IndexNow API while blocking Bingbot via robots.txt.

On the second page, there is only one result:

Microsoft Bing might use a different indexation and pagination method than Google. The second page shows only one of the 31 results.

On the third page, there is no result, and Microsoft Bing Translate appears, offering to translate the string within the search bar.

Sometimes, Microsoft Bing infers the “site:” search operator as a part of the query.

When I checked Google Analytics, it showed that Bing still hadn’t crawled or indexed the website. I know this is true because I also checked the log files.

Below, you will see Bing Webmaster Tools’ report for the example website:

Bing Webmaster Tools Report

It says that I submitted 38 URLs.

The next step will involve the bulk request with the POST Method and a JSON object.

7. Perform An HTTP Post Request To The IndexNow API

To perform an HTTP post request to the IndexNow API for a set of URLs, a JSON object should be used with specific properties.

  • The “host” property represents the hostname of your website.
  • The “key” property represents the API Key.
  • The “keyLocation” property represents the location of the API Key’s txt file within the web server.
  • The “urlList” property represents the URL set that will be submitted to the IndexNow API.
  • The headers represent the POST request headers that will be used, which are “Content-type” and “charset.”

Since this is a POST request, the “requests.post” will be used instead of the “requests.get().”

Below, you will find an example of a set of URLs submitted to Microsoft Bing’s IndexNow API.

data = {
  "host": "www.bing.com",
  "key": "22bc7c564b334f38b0b1ed90eec8f2c5",
  "keyLocation": "https://www.example.com/22bc7c564b334f38b0b1ed90eec8f2c5.txt",
  "urlList": [
    'https://www.example.com/technical-seo/http-header/',
    'https://www.example.com/python-seo/nltk/lemmatize',
    'https://www.example.com/pagespeed/broser-hints/preload',
    'https://www.example.com/python-seo/nltk/stemming',
    'https://www.example.com/python-seo/categorize-queries/',
    'https://www.example.com/python-seo/nltk/tokenization',
    'https://www.example.com/review/oncrawl/',
    'https://www.example.com/technical-seo/hreflang/',
    'https://www.example.com/technical-seo/multilingual-seo/'
      ]
}
headers = {"Content-type":"application/json", "charset":"utf-8"}
r = requests.post("https://bing.com/", data=data, headers=headers)
r.status_code, r.content

In the example above, we have performed a POST Request to index a set of URLs.

We serialized the “data” dictionary with “json.dumps” for the “data” parameter of “requests.post,” and used the headers dictionary for the “headers” parameter.

Since we POST a JSON object, the request should have the “Content-type: application/json” header along with “charset: utf-8.”

After I made the POST request, my live log file analysis dashboard started to show immediate hits from Bingbot 135 seconds later.

Bingbot Log File Analysis

8. Create A Custom Function For The IndexNow API To Save Time

Creating a custom function for the IndexNow API is useful for reducing the time spent on code preparation.

Thus, I have created two different custom Python functions to use the IndexNow API for bulk requests and individual requests.

Below, you will find an example for only the bulk requests to the IndexNow API.

The custom function for bulk requests is called “submit_url_set.”

As long as you fill in the parameters, you will be able to use it properly.

def submit_url_set(set_:list, key, location, host="https://www.bing.com", headers={"Content-type":"application/json", "charset":"utf-8"}):
     data = {
     "host": "www.example.com",  # your own website's hostname
     "key": key,
     "keyLocation": location,
     "urlList": set_
     }
     r = requests.post(f"{host}/indexnow", data=json.dumps(data), headers=headers)
     return r.status_code

An explanation of this custom function:

  • The “set_” parameter provides a list of URLs.
  • The “key” parameter provides an IndexNow API Key.
  • The “location” parameter provides the location of the IndexNow API Key’s txt file within the web server.
  • The “host” parameter provides the search engine host address.
  • The “headers” parameter provides the headers that are necessary for the IndexNow API.

I have defined some of the parameters with default values such as “host” for Microsoft Bing. If you want to use it for Yandex, you will need to state it while calling the function.

Below is an example usage:

submit_url_set(set_=sitemap_urls["loc"].to_list(), key="22bc7c564b334f38b0b1ed90eec8f2c5", location="https://www.example.com/22bc7c564b334f38b0b1ed90eec8f2c5.txt")

If you want to extract sitemap URLs with a different method, or if you want to use the IndexNow API for a different URL set, you will need to change the “set_” parameter value.

Below, you will see an example of the Custom Python function for the IndexNow API for only individual requests.

def submit_url(url, location, key = "22bc7c564b334f38b0b1ed90eec8f2c5"):
     endpoint = f"https://bing.com/indexnow?url={url}&key={key}&keyLocation={location}"
     response = requests.get(endpoint)
     print(url)
     print(endpoint)
     print(response.status_code, response.content)

Since this function submits a single URL, you can call it in a loop to submit URLs one by one. The search engine can prioritize these types of requests differently.

Because bulk requests can include unimportant URLs, individual requests might be seen as more reasonable.

If you want to include the sitemap URL extraction within the function, you can build the Advertools extraction into the functions themselves.

Tips For Using The IndexNow API With Python

An Overview of How The IndexNow API Works, Capabilities & Uses

  • The IndexNow API doesn’t guarantee that your website or the URLs that you submitted will be indexed.
  • You should only submit URLs that are new or for which the content has changed.
  • The IndexNow API impacts the crawl budget.
  • Microsoft Bing has a threshold for the URL Content Quality and Calculation of the Crawl Need for a URL. If the submitted URL is not good enough, they may not crawl it.
  • You can submit up to 10,000 URLs.
  • The IndexNow API suggests submitting URLs even if the website is small.
  • Submitting the same pages many times within a day can lead the IndexNow API to stop crawling the redundant URLs or the source site.
  • The IndexNow API is useful for sites where the content changes frequently, like every 10 minutes.
  • IndexNow API is useful for pages that are gone and are returning a 404 response code. It lets the search engine know that the URLs are gone.
  • IndexNow API can be used for notifying of new 301 or 302 redirects.
  • The 200 Status Response Code means that the search engine is aware of the submitted URL.
  • The 429 Status Code means that you made too many requests to the IndexNow API.
  • If you put a “txt” file that contains the IndexNow API Key into a subfolder, the IndexNow API can be used only for that subfolder.
  • If you have two different CMSes, you can use two different IndexNow API Keys for two different site sections.
  • Subdomains need to use a different IndexNow API key.
  • Even if you already use a sitemap, using IndexNow API is useful because it efficiently tells the search engines of website changes and reduces unnecessary bot crawling.
  • All search engines that adopt the IndexNow API (Microsoft Bing and Yandex) share the URLs that are submitted between each other.
Infographic: IndexNow API documentation and usage tips.

In this IndexNow API tutorial and guideline with Python, we have examined a new search engine technology.

Instead of waiting to be crawled, publishers can notify the search engines to crawl when there is a need.

IndexNow reduces the use of search engine data center resources, and now you know how to use Python to make the process more efficient, too.

More resources:

An Introduction To Python & Machine Learning For Technical SEO

How to Use Python to Monitor & Measure Website Performance

Advanced Technical SEO: A Complete Guide


Featured Image: metamorworks/Shutterstock





Source link: Searchenginejournal.com

5 Phase eCommerce Website Migration Checklist: Keep Your Traffic!

By | January 13, 2022


Editor’s note: This article was originally published in 2019. It has been updated for accuracy and modern best practices.

Public service announcement: Migrating your eCommerce site to a new design, platform, or domain means that you have to be ultra-diligent in executing the move.

Otherwise, you risk a hit to SEO that affects your traffic, conversions, and revenue.

In our experience helping dozens of eCommerce stores migrate, we’ve seen a lot of “gotchas.” While these may seem like little errors, left unchecked, they can create a big downswing in your organic search traffic and rankings.

Most eCommerce migration checklists aren’t very helpful either. A vague list of SEO factors — without a real, strategic workflow to follow — does little to solve any migration issues. If you don’t know why you’re fulfilling a checkmark, you leave your site vulnerable to even bigger issues later on.

So, we put together this 5-Phase Migration Checklist, with step-by-step instructions, to fill the gap.

Before launching any new site, we recommend reading through the full guide below and following our checklist to minimize any potential negative impact to your SEO.

Remember: If you want an experienced team to help you through your eCommerce website migration, we’re always here to help.


What to Expect with a Website Migration

Getting your ducks in a row prior to an eCommerce site migration is crucial for ensuring its continued success.

While a downward slope in SEO can usually be expected when migrating a site, by fixing errors prior to launch, you can actually help traffic remain steady. In some cases, this can even help traffic go up by improving the new site’s SEO.

Not all migrations are the same. It’s smart to consider certain specifics for each scenario when you migrate to:

  • A new design
  • A new hosted or self-hosted platform
  • A new domain

Given its inherent risks, migrating your entire site can be a stressful process. But, by adopting an optimized and streamlined SEO migration strategy, you’ll reduce any drops in organic rankings or traffic along the way.

Why Migrate Your Site?

If you’re thinking about a migration, you should always know why — so you don’t perform it superfluously.

It makes sense to migrate in certain situations, including when:

  • You have an old website with an outdated design.
  • Your business changes priorities or service offerings, and you need to add new information or change existing information.
  • You rebrand to a new business name.
  • You want to change to a CMS platform with more robust or different features, or a lower cost.
  • Your existing site just isn’t converting, and you want to start from scratch on a new domain.

Before you make the decision to migrate, check out our guide to eCommerce platforms for SEO purposes. That will ensure you choose the best platform for your business needs, which we won’t cover here.

Instead, we’ll get into the nuts and bolts of migrating successfully — starting with the risks involved.

Types of Migrations

The risk of impact to your SEO performance will vary depending on the type of migration.

Low SEO Risk: Site Redesign/Reskin

When the extent of the migration is limited to a design change or reskin, the risk is relatively low, because no URLs are changing.

Depending on your team’s experience, it may not be necessary to enlist help from an outside agency to protect your site performance.

Note: Certain designs contain JavaScript that cloaks content. Hiding or “cloaking” content is not something Google tends to like, as it’s common on spam sites attempting to keyword-stuff their pages without the text visually displaying.

When redesigning or reskinning your site, watch out for such designs containing JavaScript that could trigger inadvertent SEO impacts.

Medium SEO Risk: URL Changes (With No Platform Change)

You’ll want to monitor things carefully when your domain changes, or when other changes occur that require a new URL structure with abundant redirects. Because any URL changes can impact SEO, there’s a moderate amount of risk with this kind of migration.

Your website’s new URLs will need to be indexed as quickly as possible post-launch.

Whether the actual design changes or not, new URL paths are a fundamental change to a site’s SEO and should be monitored accordingly.

Medium SEO Risk Scenario 2: Change in eCommerce Platform

In general, a platform change means that the site’s URLs will also change, due to the way that various eCommerce platforms structure them.

For example: As a path to products in Shopify’s “collections,” the URL structure has to be domain.com/products/productname or domain.com/collections/collectionname/products/productname.

Taylor Stitch website screenshot with the URL: https://www.taylorstitch.com/collections/mens-shirts/products/the-california-in-olive?

Meanwhile, WordPress and other CMSs have their own URL structures with different needs. 

In many platform migrations, maintaining the original URL isn’t possible — even if you want to.

Changing platforms also presents the risk of losing core functionality that your SEO (or your other digital marketing channels!) are currently benefiting from. Evaluating and minimizing such a risk is key when migrating platforms.

High SEO Risk: Domain Migration

Regardless of any URL changes or design changes, migrating your eCommerce site’s domain presents a lot of risk.

Your domain name has a lot of context and history with Google. Starting from scratch, you’re going to lose some equity.

In this scenario, be prepared to do a bit more work to mitigate the SEO impact and pain of losing traffic.

With such a high risk, why change domains in the first place? Typically, this update is made for a long-term gain, such as a rebranding or name change of the organization.

Inflow’s 5-Phase Migration Checklist

Now that you know which migration scenario yours falls into (and the associated risks), it’s time to create a personalized checklist for your upcoming move. 

Don’t be overwhelmed by the level of detail below. Remember, there are many small, important factors that impact SEO performance. Usually, drops happen as a result of multiple little factors going wrong — or one catastrophic miss.

So, while our checklist is long, there are two important reasons why:

  1. To create the optimal site structure for Google’s indexing
  2. To catch any potential errors prior to launching a new site

Let’s start with Phase 1.

Phase 1: Evaluate Your Live Site & Opportunities for Improvement. Collect Benchmark Data.

Before you even think about starting your migration process, you need to know where your live site stands. You’ll use this site’s analytics data as a benchmark for reference and comparison to the new site in a later step.

This step will help you launch the development site with as few discrepancies as possible when comparing it to the live site.

Step 1: Crawl the Current Domain

Crawling the live domain provides a list of URLs and a wide variety of data about those URLs that can be used later for checks and benchmarks.

To gather this data, we recommend using a crawling tool such as Screaming Frog or DeepCrawl.

Screaming Frog is widely used, and its data output is suitable for this process. It’s especially useful for smaller sites on a budget.

Screaming Frog dashboard screenshot, showing a crawl table alongside property details for microdata and structured data.

If you have a larger site, we recommend using DeepCrawl instead.

DeepCrawl dashboard screenshot, showing four sections: Issues, Pages Breakdown, Changes, and Non-200 Pages.

That said, use the crawler of your choice — but don’t neglect this important first step.
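If you prefer a scriptable alternative, the Python advertools library (used in the IndexNow tutorial elsewhere in this digest) can also crawl a site and save the results for later benchmarking. A minimal sketch, assuming an example domain:

import advertools as adv
import pandas as pd

# Crawl the live site, following internal links; output must be a .jl (JSON lines) file.
adv.crawl("https://www.example.com/", "live_site_crawl.jl", follow_links=True)

# Load the crawl for later comparison against the new site.
crawl_df = pd.read_json("live_site_crawl.jl", lines=True)
print(crawl_df[["url", "status", "title"]].head())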

Step 2: Analyze Google Search Console Data

You’ll also need to review Google’s own data about your site. You can find this information in GSC’s dashboard by exporting the .csv data of your site.

As you export this data, keep these questions in mind:

  • Does any structured data need to be carried over to the new site?
  • Do hreflang tags need to be carried over to the new site?
  • Do any 404 errors need to be addressed pre-launch?

If you identify any data that should be transferred or any errors that should be corrected, mark this checkbox early on in the process.

Step 3: Crawl and Analyze the Current Site

Now that you have your crawl data and Google’s data, check the following off your list (so you can reference this data later in comparison to your new site):

  • Look for meta robots nofollow tags in source code.
  • Document the Google Analytics ID and type of code (Universal, GTM) being used.
  • Look for pages with noindex tags.
  • Review canonical tags.
  • Review title tags.
  • Review meta description tags.
  • Review header (H1, H2, etc.) usage.
  • Collect all known live URLs (via a site crawl, SERP scrape, GSC, GA, etc.) in a file, to crawl and verify working redirects upon go-live.

Often, our team uses this data to identify optimization opportunities for the new site’s SEO. It’s also generally part of our process for running an SEO content audit.

Download our free Content Audit Toolkit today to assist in this step.


Step 4: Consider Remaining Factors

The data collected in the above steps is always necessary, regardless of migration type. To double-check any remaining important data for your relevant migration scenario, don’t forget to run one last analysis.

Remind yourself of your level of SEO risk by identifying the type of migration involved. Then, ask yourself these important questions:

  • Is the domain name changing?
  • Is the URL structure changing?
  • Is the site architecture changing?
  • Have you performed keyword research, created a keyword matrix and/or a keyword gap analysis to ensure a solid keyword strategy is in place?
  • Is conversion optimization a consideration?
  • Do you currently need help selecting a new eCommerce platform, system, software, etc.? (If so, our team can help.)

In Phase 2, you’ll use this information to identify important action steps and fixes to implement prior to migration.

Phase 2: Evaluate Staging Site & Technical Opportunities for Improvement. 

Now, we’ll do the technical work of going through the possible errors (“gotchas”) and taking the needed steps to test and address them.

Revenue is linked to SEO, and a migration presents a risk of impact to it. If there was ever a time to be thorough with a technical checklist, this is it.

Tip: Understand Your Development Timeline

Without the right approach, technical SEO can seem like a never-ending process. If your migration is to be completed by a target date, your team needs to have the scope of work and a deadline to help keep things moving.

Structuring the work according to the below outline should help you to do that.

Step 1: Check for Server Changes

If your server is changing during your migration, ask your developer to host the staging site on the new server to identify potential issues. 

Don’t forget to ask the developer to block search engine bot access to the staging or development sites using the robots.txt file.

Step 2: Establish a 301 Redirect Strategy

To get new URLs indexed (and to maintain existing high-quality backlinks), use a systematic approach to implementing redirects:

  1. Formulate a 301 redirect strategy using the site crawl data.
  2. Document how the redirect strategy will be implemented and who is responsible. Everyone should be on the same page on how redirects will be structured and implemented.
  3. If there are legacy redirects that need updating, eliminate any redirect hops. It’s best to have fewer redirects, if possible.

You’ll also check for the success of these redirects upon launch/migration.
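As an illustration, here is a minimal Python sketch of such a check, assuming a hypothetical mapping of old URLs to the new URLs they should 301 to:

import requests

# Hypothetical mapping of old URLs to their expected destinations.
redirect_map = {
    "https://www.example.com/old-category/": "https://www.example.com/new-category/",
    "https://www.example.com/old-product/": "https://www.example.com/new-product/",
}

for old_url, expected in redirect_map.items():
    response = requests.get(old_url, allow_redirects=True)
    # response.history holds each hop; one 301 straight to the target is ideal.
    first_status = response.history[0].status_code if response.history else response.status_code
    ok = first_status == 301 and response.url == expected and response.status_code == 200
    print(old_url, "->", response.url, f"({len(response.history)} hop(s))", "OK" if ok else "CHECK")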

Step 3: Conduct Technical Dev Site Optimization

Remember the goal here — to get the development site optimized prior to launching it.

Your technical SEO audit should include a few key steps:

  1. Review sitemaps & wireframes for the new site.
  2. Ensure the staging site is being blocked from indexation.
  3. Verify a custom 404 error page exists & properly issues a 404 header response. Page titles for 404s should always read “Page Not Found” (or similar), so errors can be found in Google Analytics.
  4. Crawl the staging site to fix 404 and soft 404s.
  5. Ensure any links or URLs that you don’t want indexed on the dev site are nofollowed. This would be required when implementing new faceted navigation, for example.
  6. Test redirects (if possible).
  7. Browse the staging site with multiple devices.
  8. Allow for faster future indexing by creating an XML sitemap (if necessary) with all of the old URLs. Plan to leave this up after the site has launched.
  9. Test & enhance site speed, as much as is possible for a development site.

When possible, we perform a live site to dev site comparison of site speed. However, staging environments don’t often have caching or other factors that make a site’s actual speed visible. 

Since the load times of your staging site are only an estimate, enlist your development team to help you see where things are lacking and what you can do to improve it.

Step 4: Determine Analytics Migration Strategy

To help Google index your new site faster, you’ll need to prepare the way you share its information with the search engine. 

Follow these steps:

  1. Add the Analytics code to the development site. (Block the IPs of team members accessing this site.)
  2. Compile a list of URLs to update in the Analytics goals when the new site goes live.
  3. Check mobile-friendliness of different types of pages on the site.
  4. Check that parameters function as expected (and that critical tracking parameters aren’t dropped from the URL during a page reload or redirect).
  5. Check the parameters report in GSC for duplicate content issues.
  6. Check Google Analytics to ensure that your website cannot be its own referrer (on the Property level, in Tracking Info > Referral Exclusion List).
  7. Compile a list of URLs with highest social shares. Maintain the old embed code to keep social counts. (If no pages have worthwhile counts, don’t waste development efforts to implement this.)

Now, your development site is further prepared for parity with Google.

Phase 3: Compare Your Live Site vs. Staging Site to Ensure Parity. 

To further evaluate the staging site prior to pushing it live, compare it to the current site. This will help you find a list of prioritized discrepancies to fix.

Step 1: Compare the New Site to the Current Site & Identify Issues to Fix

Here’s what we recommend evaluating on both your staging and live site. Note: Content- and mobile-site-specific tasks will apply to everyone, but some of the items below may not be applicable to your site (for example, if your site has AMP pages or is hosted on WordPress).

Content
  1. Evaluate whether the site is at risk for duplicate content and/or URLs.
  2. Migrate all content to your staging site. Identify how much content is staying the same — and which isn’t.
  3. Ensure title tags and meta descriptions are being carried over and are the same as or better than the current site.
  4. Check page content, canonical tags, internal linking, usage of H1s/H2s, image alt tags, and more.
Mobile Site Usability
  1. Crawl both sites as “mobile googlebot” in Screaming Frog. Validate that page titles, descriptions, headers, content, inlinks, images, directives, etc. are the same between mobile and desktop.
  2. Implement the meta viewport tag in the <head> section of the site: <meta name="viewport" content="width=device-width, initial-scale=1">.
Javascript Site Functionality
  1. Visually audit all major page types (fetch & render in GSC, if possible).
  2. Audit HTML source for missing content. 
  3. Audit using “inspect element” for missing content.
  4. Compare HTML source vs. inspect element for contradictions.
  5. Evaluate content based on user interaction. Are there any pages that are critical for search engine access? Any that aren’t? 
AMP Page Specific Tasks
  1. Each non-AMP page (i.e., desktop, mobile) should have a rel="amphtml" tag pointing to the corresponding AMP URL.
  2. Each AMP page should have a rel="canonical" tag pointing to the corresponding desktop page.
  3. Any AMP page that does not have a corresponding desktop URL should have a self-referring canonical tag.
eCommerce-Specific Tasks
  1. Identify if category pages contain crawlable links to the products.
  2. Check faceted navigation and pagination for best practices.
  3. Check to make sure that all images have a descriptive alt text.
WordPress-Specific Tasks
  1. Redirect URLs in WordPress on Dev site using Simple 301 Redirects plugin.
  2. Install Google Tag Manager plugin on WordPress site/CMS and configure it.
  3. Set up Yoast SEO WordPress & Yoast Analytics plugins.
  4. If you use a CMS like HubSpot, install & configure its CMS plugin.

As a separate task, we also create conversion optimization recommendations based on the staging site to optimize the new site’s conversions.

Step 2: Prioritize Your Checklist for Maximum Impact

Once you have a (potentially long) list of issues to tackle, how do you know where to start? 

Remember that launching a worse site (in terms of SEO) is what is most likely to harm your traffic potential. At a minimum, make sure your new site is “on par” with the previous site to reduce the risk of traffic drops post-migration. Any fixes that help establish parity between the old and new site should be prioritized as a necessity.

If your goal is to actually grow traffic (not just maintain it), ensure that there are clear wins and improvements on your development site, in addition to all other factors being equal.

A Warning About Waking the Beast

If you are a larger, more complex site with any present SEO issues and a long (favorable) history in Google, “on par” is likely not a good enough standard to hold yourself to.

A site migration represents an opportunity for Google to focus on your site in detail. Rest assured that it will. 

Its bots may suddenly start noticing things wrong with your site that you’ve been getting away with for years — except, now, you’re not getting away with them anymore. 

We’ve seen this situation enough times that we’ve given it a nickname: “waking the beast.”

If you have any major issues present on your live site, fix them — and do so prior to migration. Otherwise, you risk stepping off your current rankings pedestal to a lower one and spending the next several months (to years) recovering your existing rankings. 

Phase 4: Launch Checks to Ensure All is Well.

Once you’ve run all of your pre-migration checks, it’s finally time to hit “publish.” But that doesn’t mean you can breathe easily quite yet.

Even the best-planned migrations can have technical issues. Data can hit an undetected technical snag that changes its parameters or prevents it from transferring completely. Your developer may have accidentally included factors on the live site that were part of the staging environment, such as bot-blocking robots.txt code.

When you work with our website migrations team, we always stay on deck when a launch happens to verify that everything is functional and working properly.

That’s because, with all of the moving parts and URLs typical to eCommerce stores, we’ve found that the earlier things get fixed, the better.

Step 1: Run a Site Crawl & Analysis Checks

In this step, you’ll check for matching data between the live site and the staged site. Follow these steps to hit every possible factor:

  1. Verify 301-redirects were all properly implemented by:
    • Running a redirect chain report in Screaming Frog.
    • Identifying redirect chains.
    • Using the crawl file from Phase 1 to verify that all known URLs correctly 301 to the expected, 200-status code pages.
  2. Crawl the new site to identify technical issues and accessibility. 
  3. Ensure the new live site is not blocked from being crawled and indexed by:
    • Checking robots.txt.
    • Looking for noindex tags in site crawl.
  4. Verify that a “nofollow” tag was added to pages you don’t want indexed. (These should have been identified on the development site, such as on faceted navigation links.)
  5. Make sure the “index, follow” meta tag is present on each page, as necessary.
  6. Look for 404 pages that shouldn’t exist.
  7. Check internal links by looking for broken links and links to the staging site.
  8. Verify proper implementation of canonical tags.
  9. Check for canonicals and duplicate URLs (“www” vs non-www, “http” vs “https,” /trailing-slash/ vs. /non-trailing-slash URLs).
  10. Check title tags by matching to current site data.
  11.  Check meta descriptions by matching to current site data.
  12. Check heading usage by matching to current site data.
  13. Check image alt attributes by matching to current site data.
  14. Check word count by matching to current site data. Account for all template text within <body> tags.
  15. Check internal link counts by page by matching to old data. Have any critical pages lost links?

Step 2: Run Google Analytics & Search Console Checks

Review the data that Google has indexed for your newly launched site:

  1. Verify analytics code is on all pages and that the correct code is implemented.
  2. Update Goals in Analytics and verify they are working correctly.
  3. Annotate Analytics with the date the site launched.

Step 3: Run a Site Speed Check

  1. Run a few pages through PageSpeed Insights. Compare the results to the site speed on your current site and staging site. (A scripted approach is sketched after this list.)
  2. Make notes for improvement, if applicable.
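PageSpeed Insights also has an API, which makes it easier to compare the same pages before and after launch. A minimal sketch, assuming a couple of example URLs (for regular use, Google asks you to attach an API key to each request):

import requests

PSI_ENDPOINT = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"

for page in ["https://www.example.com/", "https://www.example.com/top-category/"]:
    result = requests.get(PSI_ENDPOINT, params={"url": page, "strategy": "mobile"}).json()
    # Lighthouse reports the performance category score as a 0-1 value.
    score = result["lighthouseResult"]["categories"]["performance"]["score"]
    print(page, "mobile performance score:", score)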

Step 4. Perform Any Remaining Tasks

  1. If your site changed servers, make sure there are no outstanding server-related issues.
  2. Compare your top landing pages, before and after the migration. This includes details like titles, meta descriptions, headers, page content, etc.
  3. Verify 404 pages return a 404 status.
  4. Verify your old XML sitemap is published on your new site. Resubmit it to Google Search Console & Bing Webmaster Tools.
  5. If the domain is changing, claim the new domain variations in GSC and submit a change of address request.

Phase 5: Monitor Your Site for 1–2 Months.

Now, this is where the benchmark data collected from the old site comes into play. If you identify any organic traffic decreases, use this data to dig into where (and why) they’re happening.

Knowing where the drops are happening clues us into what needs fixing. Your plan of action will be different if traffic drops are widespread, isolated to a few pages, or affecting just one section of the site. 

We recommend monitoring new websites periodically (one week, two weeks, one month, and two months) after migration to ensure that everything is going smoothly — and to correct anything that isn’t.

Sometimes, monitoring should continue even longer. The bigger and more complex the site, the longer you’ll need to monitor its indexation.

1 Week Post-Launch

  1. Monitor Google Search Console (crawl stats, rankings, traffic, indexed pages, etc.).
  2. Check GSC for newly indexed pages (and check for any that haven’t made it yet).
  3. Check your old GSC profile to ensure old pages are getting de-indexed.
  4. Leave old XML sitemap(s) up for Google to re-crawl. 

2-3 Weeks Post-Launch

  1. Monitor Google Search Console (crawl stats, rankings, traffic, indexed pages, etc.).
  2. Remove the old XML sitemap(s). Replace them with the new XML Sitemap(s).
  3. Check the sitemap.xml file. Are the correct URLs in there?  Watch for extraneous URLs, staging site URLs, etc.
  4. Submit the new XML sitemap to Google Search Console & Bing Webmaster Tools.

1 Month Post-Launch

  1. Monitor Google Search Console (crawl stats, rankings, traffic, indexed pages, etc.).
  2. Check analytics for loss of traffic. If so, identify which pages lost traffic and why.

2 Months Post-Launch

  1. Continue to monitor Google Search Console (crawl stats, rankings, traffic, indexed pages, etc.).

Most migrations see fluctuations in traffic (typically, an initial dip and subsequent rise) after launch. That said, every site and migration is different, so the actual impact is hard to predict. 

At Inflow, we work to negate any drops prior to site launch, and the result tends to be steady traffic — or, in some cases, growth.

Your best bet is to follow this process. Doing so ensures that your new site’s data remains indexed by Google and, thus, visible to potential customers.

Getting Started Today

Bottom line: A successful migration is definitely worth the effort for many businesses. But, to keep the website’s value while doing so, it’s important to get the technical aspects of migration right.

This website migration checklist may seem overly comprehensive and technical — but, believe it or not, it’s just the tip of the iceberg. When it comes to eCommerce websites, there are so many unique situations to consider that we’ve barely touched upon here.

We believe anyone knowledgeable in SEO should be able to execute the above process. That said, since we did develop this for our internal team first, it may be too technical for some online businesses to perform on their own.

If that’s your situation, our team is always happy to help you migrate your eCommerce website with a personalized plan of attack. We’ve handled all kinds of situations to make sure our clients’ ducks are in a row before migrating, and we can do the same for you.

Contact us to learn more and get our recommendations for your unique migration scenario.



Source link

IndexNow now officially co-sharing URLs between Microsoft Bing and Yandex

By | January 13, 2022


The Microsoft Bing team announced that the IndexNow protocol is now at a place where participating search engines co-share submitted URLs, meaning that if you use IndexNow to submit URLs to Microsoft Bing, Microsoft will immediately share those URLs with Yandex.

Co-sharing URLs. The promise of IndexNow was to submit a URL to one search engine via this protocol and not only will that search engine immediately discover that URL, but it will also be discovered on all the other participating search engines. Right now, that is just Microsoft Bing and Yandex, but Google is exploring using this protocol.

Microsoft said “the IndexNow protocol ensures that all URLs submitted by webmasters to any IndexNow-enabled search engine immediately get submitted to all other similar search engines. As a result of co-sharing URLs submitted to IndexNow-enabled search engines, webmasters just need to notify one API endpoint. Not only does this save effort and time for webmasters, but it also helps search engines in discovery, thus making the internet more efficient.”

Microsoft said that Bing “has already started sharing URLs from IndexNow with Yandex and vice-versa, with other search engines closely following suit in setting up the required infrastructure.”

When this first launched, the participating search engines had not yet begun co-sharing URLs – but now they are.

IndexNow API. Also, you no longer need to submit URLs to https://www.bing.com/IndexNow?url=url-changed&key=your-key or https://yandex.com/indexnow?url=url-changed&key=your-key. IndexNow.org now directly accepts these submissions at https://api.indexnow.org/indexnow?url=url-changed&key=your-key.

Microsoft Bing updated this help document to make it easier to understand how to set this up at any of those URLs mentioned above.

80,000 sites. Microsoft said that 80,000 websites are now using IndexNow for URL submission. “80k websites have already started publishing and reaping the benefits of faster submission to indexation,” the company said. Last November, the company said 60,000 of those websites were using IndexNow directly through Cloudflare, which added a toggle button to turn on this feature for websites using Cloudflare.

Also, Microsoft Bing recently released a WordPress plugin for IndexNow to make this process easier.

What is IndexNow. IndexNow provides a method for website owners to instantly inform search engines about the latest content changes on their website. IndexNow is a simple ping protocol that lets search engines know when a URL and its content have been added, updated, or deleted, allowing search engines to quickly reflect the change in their search results.

How it works. The protocol is very simple — all you need to do is create a key on your server, and then post a URL to the search engine to notify IndexNow-participating search engines of the change. The steps include the following (a minimal example follows the list):

  1. Generate a key supported by the protocol using the online key generation tool.
  2. Host the key in a text file, named with the value of the key, at the root of your website.
  3. Start submitting URLs when your URLs are added, updated, or deleted. You can submit one URL or a set of URLs per API call.
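As a rough illustration of step 3, a single-URL ping to the shared endpoint can be as simple as the Python sketch below; the key and page values are placeholders:

import requests

key = "your-indexnow-key"
page = "https://www.example.com/updated-page/"

# One GET request notifies every IndexNow-participating search engine.
response = requests.get(f"https://api.indexnow.org/indexnow?url={page}&key={key}")
print(response.status_code)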

Why we care. Like we said before, instant indexing is an SEO’s dream when it comes to giving search engines the most updated content on a site. The protocol is very simple and it requires very little developer effort to add this to your site, so it makes sense to implement this if you care about speedy indexing. Plus if you use Cloudflare, it can be turned on with the flip of a switch.

Now that co-sharing of URLs is enabled, you should see your content flow faster between Microsoft Bing and Yandex; hopefully, other search engines will adopt this protocol going forward.



About The Author

Barry Schwartz is a Contributing Editor to Search Engine Land and a member of the programming team for SMX events. He owns RustyBrick, a NY-based web consulting firm. He also runs Search Engine Roundtable, a popular search blog on very advanced SEM topics. Barry’s personal blog is named Cartoon Barry and he can be followed on Twitter here.



Source link: Searchengineland.com

IndexNow Enables Data Sharing Between Search Engines

By | January 13, 2022


Microsoft is making the IndexNow protocol easier to implement by ensuring submitted URLs are simultaneously shared between search engines.

IndexNow launched in October 2021 as solution to help websites get their content indexed and updated in search engines faster.

The IndexNow protocol ensures that all URLs submitted to any IndexNow-enabled search engine immediately get submitted to all other similar search engines.

That means webmasters just need to notify one API endpoint, as all URLs will be co-shared to every search engine that supports the IndexNow protocol.

In addition to saving time and effort for content publishers, this assists search engines in their content discovery efforts, which makes the web more efficient as a whole.

Microsoft Bing has already started sharing URLs from IndexNow with Yandex and vice-versa, and other search engines will follow suit after setting up the required infrastructure.

All search engines adopting the IndexNow protocol agree that submitted URLs will be automatically shared with all other participating search engines.

In an announcement, the Microsoft Bing Webmaster Tools team states:

“The IndexNow protocol can help the entire search industry get their content indexed faster, while using less resources. Early adoption of IndexNow can help businesses deliver timely information to their users across search engines and reap benefits of staying ahead of the curve.”

IndexNow will continue to improve as it’s implemented by more websites.

The overall goal with IndexNow is to maximize search engine indexing, optimize crawl load management, and deliver the freshest content to searchers.

IndexNow is created by Microsoft and Yandex, but the protocol is open source and available to all search engines.

For more information about IndexNow, including how to implement it on your website, see Microsoft Bing’s announcement and IndexNow.org.

Source: Microsoft Bing


Featured Image: 3DProfi/Shutterstock





Source link: Searchenginejournal.com

Be A Mortgage Broker — Concord — California — 94520

By | January 13, 2022

If you are new to the industry, you may have questions about how to be a mortgage broker. You may be curious about the licensing and education requirements and how to retain business partners. Find the answers to all of your questions by reading this blog.

Rank tracking: why you should monitor your keywords

By | January 13, 2022


Keywords. Optimization. Google rankings. Choose your goal, do the work, hope you win those top SERP positions. That’s why we do SEO in the first place, right? Of course, there’s a lot more to think about than just rankings and optimization. And if you’re getting enough visitors, and those visitors seem happy with your site and its content, do you really need to think about how your pages are ranking? The truth is, it’s not essential to track keyword rankings. But knowing how you measure up can help you make better decisions! Knowledge is power, after all. In this post we’ll explain why we think it’s a good idea to monitor how you perform in the search results, and how Yoast SEO can help you do that.

Keyword tracking: it’s about knowing where you stand

In many ways, SEO is a competition. You don’t have to play the rankings game. It’s still worth optimizing your content for a whole load of other reasons: it helps with user experience, for one, and it can improve other stats like bounce rate and conversion, which apply even if you’re only focused on paid advertising. But if you are putting content online and optimizing it (for whatever reason), it can’t hurt to know how that content is doing in the search results.

Tracking how your keywords are ranking provides you with a snapshot of how Google thinks your content measures up, compared to your competition. And that’s valuable information! It can help you to see if your optimization efforts are actually working. And we’re not just talking about how fast your site is, or how many times you’ve used a keyword on your page. Google pays a lot of attention to the substance of your content. Are you providing the best answer to a particular query? Is it easy to read, and formatted in a user-friendly way? All of that stuff matters. And if you’re not ranking well, it probably means there’s still room for improvement.

Tracking keywords and monitoring your pages’ positions in search isn’t all that easy. Google Analytics and Search Console show some information about search positions and the queries your pages are being found for, but it’s not presented in a way that’s easy to use. If you look up a particular page, you can see its average search position, for instance, but that number is averaged across all the queries the page is found for. It takes a bit of work to find out if your page is ranking for one specific query. The information is there, but it takes a lot of effort to monitor those keywords and pages regularly.
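
To illustrate that legwork, here is a rough sketch that pulls per-page average positions for a single query via the Search Console API. The site URL, credentials file, query, and dates are placeholders, and it assumes a service account that already has access to the verified property:

```python
from google.oauth2 import service_account
from googleapiclient.discovery import build

# Placeholder credentials: a service account that has been granted
# read access to the Search Console property.
creds = service_account.Credentials.from_service_account_file(
    "service-account.json",
    scopes=["https://www.googleapis.com/auth/webmasters.readonly"],
)
service = build("searchconsole", "v1", credentials=creds)

# Ask for per-page average positions for one specific query.
response = service.searchanalytics().query(
    siteUrl="https://www.example.com/",
    body={
        "startDate": "2022-01-01",
        "endDate": "2022-01-13",
        "dimensions": ["page"],
        "dimensionFilterGroups": [{
            "filters": [{
                "dimension": "query",
                "operator": "equals",
                "expression": "rank tracking",
            }]
        }],
    },
).execute()

for row in response.get("rows", []):
    page = row["keys"][0]
    print(f"{page}: avg position {row['position']:.1f}")
```

Multiply that by every keyword and page you care about, and the appeal of a dedicated tool becomes obvious.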

That’s where keyword rank trackers come into play. These tools (which you almost always have to pay for) are designed to take the hard work out of the process. You simply add the keyword you want your page to rank for, and they’ll pull all the information together and show it to you in an easy-to-understand format. A keyword rank tracker makes it much easier to monitor keywords over time, so it’s easy to notice when anything changes. And that means you can make strategic decisions and take action right away.

New: integrated keyword tracking in Yoast SEO, thanks to Wincher

A particularly noteworthy keyword rank tracker is Wincher. Why? Firstly, it keeps things simple and easy to use, so you don’t need to be an analytics wizard to make sense of what you see. Secondly, unlike most of its competitors, Wincher’s keyword tracking is available for a reasonably low price. You can even get started for free! Plus, you’ll get features like daily ranking updates that other SEO tools reserve for upgraded accounts.

To bring rank tracking to Yoast SEO, we’ve worked closely with Wincher to integrate their rank tracking capabilities in our SEO plugins. So now, all Yoast SEO users — both free and Premium — can monitor their keywords straight from the plugin!

That’s right — from Yoast SEO 17.8 onwards, you’ll see the following capabilities added to your plugin:

  • Wincher free + Yoast SEO free: track 1 keyphrase per post, up to 5 keyphrases in total
  • Wincher free + Yoast SEO Premium: track 5 keyphrases per post, up to 5 keyphrases in total
  • Wincher paid plan* + Yoast SEO free: track 1 keyphrase per post, up to 500-10,000 keyphrases in total
  • Wincher paid plan* + Yoast SEO Premium: track 5 keyphrases per post, up to 500-10,000 keyphrases in total

*Depending on your Wincher plan, you can track up to 500-10,000 keywords in total
[Image: How your SEO performance data looks in the Yoast SEO x Wincher keyword tracking tool]

A significant benefit of using a keyword tracking tool is that you can stay on top of your rankings – whether they drop or rise:

  • Get automatic daily updates of your keyword rankings
  • See historical ranking data displayed in a handy graph so you can quickly notice any changes
  • Monitor your site’s top-ranking keywords from the dashboard overview
  • Explore more detailed ranking data in your account at Wincher.com

Not ranking where you want to be?

We said earlier that it can’t hurt to know where your pages stand in the Google rankings. That’s not completely true… when you’ve put your heart and soul into creating the perfect page, it can be painful to see that your page isn’t doing so well! But don’t despair. There are always things you can do to improve the situation:

Adjust your keyword strategy

It’s possible that your chosen keywords are just too competitive. If you’re trying to sell sports shoes and sportswear, for instance, you might be competing with Nike and Adidas. If you have the resources that these mega-brands have, go ahead and keep competing! But chances are that you don’t have those resources, and you don’t have much chance of ranking for those main keywords.

If you’re facing too much competition, your best strategy will be to opt for a long-tail keyword strategy, or find a niche. Or if your site is focused on a specific region, you could benefit from a local SEO strategy, too. These kinds of SEO strategies are the ones you can really win at, because the big brands don’t have time to focus on local targeting and niche customer needs.

Want some ideas for alternative keywords to target? You could take a look at the Related keyphrases tool to see similar keywords. Or you could check which queries your page is already ranking for and focus on those.

Keep improving and climb the rankings

Do you feel like you’re competing in the right arena already, but you’re just not getting the rankings you want? Remember that most sites don’t reach #1 position overnight. Keep improving your pages, and keep an eye on the results; you can climb the rankings bit by bit. If you want to try doing this, it’s a very good idea to do some competitor benchmarking! Then you’ll have a better idea of what you’re up against and what you need to improve.

Keep monitoring — and don’t forget the big picture!

One of the biggest benefits of using a keyword tracking tool is that you can see immediately if your rankings drop. So keep monitoring those keywords! If you do see a change for the worse, you can investigate what has changed. Are there new competitors climbing up the rankings? Have existing competitors improved their sites, or updated their content? By staying on top of what’s going on in the search results, you can be more competitive and (hopefully) maintain your search positions in the long run.

A word of warning, though: don’t obsess over your rankings. Being at the top of the search results isn’t the most important thing in the world. You can still get good results with lower rankings. And don’t underestimate the power of a good social media strategy and a loyal user base! Remember that without delivering a good user experience, top rankings and high traffic are worthless — all you’ll achieve is a lot of unsatisfied visitors. So stick to the golden rule: always put your users first.




Source link : Yoast.com

What Instagram Updates to expect in 2022 | Anicca Digital

By | January 13, 2022


After many years of people pining for the return of Instagram’s chronological feed, it might finally be returning – but that’s not all that’s changing!

Head of Instagram Adam Mosseri has made a series of informal announcements sharing details of what to expect from the platform throughout 2022. In a recent video published on Instagram, Mosseri stated: ‘we’re going to have to rethink what Instagram is.’

Although there are a lot of changes that Instagram is planning to make, there are two critical focuses for the platform which have prompted some of these updates.

Instagram’s two core focuses for 2022 are:

  • Video
  • Giving users more control

What’s happening to video on Instagram?

Video will be the primary format focus in 2022. With the rise of TikTok, it’s fair to say that Instagram is having to up its game to compete for users’ attention. Instagram is aiming to consolidate its efforts towards Reels and building new creative tools. For creators and brands, this means you will have to prioritise video content within your content calendars; otherwise, you could see a drop in organic performance.

How is Instagram giving the user more control?

Instagram has already started implementing features to support this new value, such as sensitive content controls, the ability to hide like counts, and extending hidden words to Direct Messages. Head of Instagram Adam Mosseri has explained that ‘to be able to give users more control, Instagram will need to be more transparent’. This means they will share how Instagram works, giving users an understanding of how to shape the platform into a tool that works for them.

Does this mean they will be more transparent about how the algorithm works? We think so! Announcing updates in advance is a step in the right direction towards transparency. A few years ago, major platform changes would be made without any announcement or explanation. Greater transparency means that users, brands and content creators have more trust in the platform and will hopefully invest more time, effort and, of course, money into Instagram.

Other aspects Instagram is improving in 2022 include:

  • Messaging
  • Creator network
  • Feed

How is Instagram making messaging better?

Recognising that messaging is the primary way that people connect online, Instagram wants to be the best place to connect with family and friends about their interests. In other words – they are making sliding into the DMs a little easier. They haven’t specified the details of what updates they will be making to their messaging function yet.

How are the changes on Instagram helping creators?

By making the platform more transparent, Instagram is hoping to help creators capitalise on the platform. It’s no secret that Instagram has been pretty hush-hush about its algorithm changes over the years. If you’ve worked in organic social, you know that experimenting with content is a key part of trying to grow your account, and that opacity can be a barrier to a lot of creators. Adam Mosseri also mentioned that Instagram will be improving the platform’s monetisation products to help creators make a living.

How is Instagram updating the feed?

This will be the most noticeable update for most users, as the way we see posts from friends, creators and brands will change. Users will have more control over what content appears on their feeds and how. Adam Mosseri explained that they want people to feel good about the time they spend in the app, and giving people a choice in how they consume content on Instagram is one way to achieve that goal.

What are the three different ways to update your feed?

  1. Home: This is the same Instagram experience you know today. Instagram serves content to you based on what it thinks you are interested in, and this feed is generally never-ending.
  2. Favourites: The next feed view is named ‘Favourites’, a list you curate of your favourite Instagram accounts. These could be your favourite brands, your close mates or family. By adding accounts to your Favourites feed, you’ll be unlikely to miss any posts from them.
  3. Following: The feed we have all been waiting for – a chronological list of posts just from accounts you follow. Adam Mosseri explains that Instagram has brought back the chronological feed because it recognises it’s important to have a feed where you can quickly access the latest posts from the accounts you follow.

What do the Instagram changes mean for marketers?

These updates are currently being tested and are expected to roll out in the first quarter of the year. If you are a social media manager, you will need to consider how they affect your plans over the next few months. The first, and potentially easiest, action is to ensure you are prioritising video creation within your content marketing plans. Secondly, you will have to navigate which feed your content is shown in. The home feed will likely show your content to users Instagram thinks will like it; in an ideal world, though, you would be added to users’ Favourites or Following feeds. Throughout this feed testing and introduction period, you are likely to see your organic performance vary considerably.

If you would like to speak to our team about organic or paid social media click to read more about our services and get in contact.



Source link

5 Competitor Analysis Tools You Should Be Using

By | January 13, 2022


It probably feels as if you need dozens of marketing tools to get a complete picture of the competitor landscape. I’ve certainly tried too many of them throughout my career.

But the truth is that just a few of the right tools will cover most of your competitor analysis needs. And they don’t need to break the bank. In fact, some of the most powerful ones are free.

In this article, we’ll go through five competitor analysis tools you should be using:

  1. Ahrefs – SEO, PPC, and content marketing
  2. Brand24 – Brand monitoring
  3. SparkToro – Audience insight
  4. BuiltWith – Tech stack checker
  5. Visualping – Webpage monitoring

Let’s get into them.

1. Ahrefs – for SEO, PPC, and content marketing

Most companies consider organic and paid search traffic to be some of their most important traffic sources. Having reliable and insightful data about what your competitors do in this space is crucial—and that’s where Ahrefs comes into play.

Ahrefs provides data about your competitors’ content, backlinks, keywords, PPC ads, and much more. With its help, you can develop an SEO, PPC, and content marketing strategy to outsmart your competition.

My favorite functionality: Finding content gaps between you and your competitors

Ahrefs’ Site Explorer provides an in-depth look at both organic and paid search traffic, including backlinks of any website. There are lots of use cases for competitor analysis, so I’ll just explain my favorite feature: Ahrefs’ Content Gap tool.

It shows keywords for which your competitors rank, but you don’t. These content gaps can quickly give you many new ideas for your content planning.

In Site Explorer, enter your own domain and then click “Content gap.” Your domain will automatically be prefilled in the “But the following target doesn’t rank for” field. All that’s left now is to list a few domains that are your organic traffic competitors:

[Image: Text field for competitors; below that, a text field for your domain]

Hit “Show keywords”:

[Image: List of keywords and corresponding data (volume, KD, CPC, SERP)]

You’ll get a huge list of keywords. Now it’s all about playing with the provided filters.

To increase relevance, let’s choose the option where at least two competitor websites rank for every keyword. You can do so by selecting it in the intersection filter. Then filter for the best keyword opportunities by setting a minimal search volume and relatively low maximum KD score:

[Image: List of keywords after filters are applied]

I’m certain you’ll find some great keyword opportunities this way.
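
If you prefer working outside the UI, you can reproduce the same filtering on an exported keyword list. Here’s a toy sketch, assuming you’ve exported the Content Gap report to CSV and that it has “Volume” and “KD” columns (real export column names may differ):

```python
import pandas as pd

# Hypothetical CSV export of the Content Gap report; the "Volume" and "KD"
# column names are assumptions and may differ in a real export.
df = pd.read_csv("content-gap-export.csv")

# Keep the best opportunities: decent search volume, relatively low difficulty.
opportunities = df[(df["Volume"] >= 500) & (df["KD"] <= 20)]

# Highest-volume gaps first.
print(opportunities.sort_values("Volume", ascending=False).head(20))
```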

Pricing

Free to see technical SEO recommendations and backlinks to the websites you own using Ahrefs Webmaster Tools.

For competitor analysis tools and features, you’ll need a paid plan starting at $82 a month. There’s also a seven-day trial for $7.

Check out Ahrefs here.

2. Brand24 – for brand monitoring

Using social media as a communication channel is a must for the vast majority of companies. But seeing what your competitors do and how their audience perceives them across a huge variety of platforms isn’t something you can easily do by yourself. But no worries. Media monitoring tools, like Brand24, have got this covered.

Brand24 tracks mentions of keywords that you want to monitor across the whole web with a focus on social media. For competitor analysis, it’s used to identify and analyze online conversations around your competitors’ brands and products.

My favorite functionality: Spying on competitors’ brand mentions

Brand24 revolves around setting up a project with keywords you want to track. These keyword mentions can then be segmented, filtered, and analyzed to gain actionable insights. You’ll definitely want to track your own brand and product mentions. But we’re here to talk about your competitors.

So set up a separate project (or projects) with the name of competitor brands and products. You’ll encounter keywords that also have other meanings. You can either leave them out or apply the required and excluded keyword filters along the way to keep the irrelevant ones out of your reports:

[Image: Project settings to add a main keyword and add/remove other keywords]
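
Conceptually, required and excluded keyword filters behave like this toy sketch (made-up mentions, not Brand24’s actual implementation):

```python
# Toy mentions; in practice these would come from a monitoring tool's export.
mentions = [
    "Acme's new app is great",
    "the acme of perfection",            # a different meaning of "acme"
    "Acme app keeps crashing for me",
]

required = {"app"}         # a mention must contain at least one of these
excluded = {"perfection"}  # ...and none of these

relevant = [
    m for m in mentions
    if any(k in m.lower() for k in required)
    and not any(k in m.lower() for k in excluded)
]
print(relevant)  # ["Acme's new app is great", "Acme app keeps crashing for me"]
```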

This competitor mentions monitoring allows you to:

  • Adjust your communication based on what works best in your industry.
  • Get product insights based on how people react to the development of your competitors’ products.
  • Assess how people perceive your brand and your competitors via sentiment analysis.
  • Benchmark your media reach and share of voice against your competitors.

I’m sure there are even more use cases. Here’s an example of data from a summary dashboard:

[Image: Summary dashboard, with line graphs showing mentions and social media reach]

Pricing

Plans start at $49 a month for tracking three keywords. Tracking your brand and your competitors will require a higher plan for $99 a month that offers seven keywords.

Brand24 also offers a 14-day free trial.

Check out Brand24 here.

3. SparkToro – for audience insights

Doing market research to understand your audience is essential for your marketing success. Chances are, the research also contains competitive analysis data like brand perceptions, estimations of competitors’ marketing funnels, and market shares across segments. Those are all great.

But seeing what your competitors’ audience actually does on the internet is a relatively new capability, and tools like SparkToro deliver it in seconds.

SparkToro is an audience research tool that provides information about what any audience reads, watches, listens to, and follows. Those insights can be retrieved based on keywords, social media accounts, websites, or hashtags. Needless to say, all of those inputs can be used to better understand your competitors’ audience.

My favorite functionality: Discovering where competitors’ audience engages

SparkToro is easy to use and navigate. Let’s do an example analysis on the SparkToro audience itself by plugging in its Twitter profile:

[Image: Text field to add a Twitter profile]

It will return a lot of data regarding the demographics of the audience. But in this case, we’ll focus on social media accounts, websites, podcasts, and YouTube channels the audience follows and pays attention to. Here’s an example of a report after filtering for personal social media accounts with fewer than 50K followers:

[Image: List of social media accounts with corresponding data (percent of audience, SparkScore, social followers)]

With data like this, you can easily spot new advertising and sponsorship opportunities across many different channels. Just put together all the insights by plugging in your competitors’ social profiles, websites, keywords, and any “owned” hashtags.

Pricing

SparkToro is free for five searches a month with limited report capabilities. Paid plans start at $38 a month.

Check out SparkToro here.

4. BuiltWith – for checking tech stacks

Today’s marketing heavily relies on all sorts of tracking codes, pixels, and using the right technologies in the background. Some areas of marketing like SEO are even directly intertwined with website development. Choosing the right tech stack can often be the first step to success.

BuiltWith is a tool that puts together all detectable technologies any website is using, e.g., tracking pixels, payment systems, web servers, and CDNs. You can create a picture of tech stacks your competitors are using. Then either take inspiration from it or choose superior solutions. 

My favorite functionality: Checking advertising technologies

BuiltWith is the easiest and most straightforward tool to use on this list. Just look up your competitors’ websites and take note of any technology that stands out.

For marketing purposes, you can quickly scan all of their detectable marketing technologies. Since most advertising platforms use tracking codes and pixels for remarketing, analytics, and attribution purposes, you’ll see which platforms your competitors are present on.
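
The underlying idea is simple: most platforms load scripts from recognizable URLs. Here’s a minimal sketch of that idea with an illustrative, far-from-exhaustive list of signatures. Note that a naive scan of raw HTML misses tags injected at runtime (e.g., through Google Tag Manager), which is part of why a dedicated service like BuiltWith sees more:

```python
import requests

# Illustrative, non-exhaustive signatures of common tracking scripts.
SIGNATURES = {
    "Google Analytics / gtag": [
        "googletagmanager.com/gtag",
        "google-analytics.com/analytics.js",
    ],
    "Meta Pixel": ["connect.facebook.net/en_US/fbevents.js"],
    "Twitter Ads": ["static.ads-twitter.com/uwt.js"],
    "LinkedIn Insight Tag": ["snap.licdn.com/li.lms-analytics/insight.min.js"],
}

# Placeholder URL: a competitor's homepage.
html = requests.get("https://www.example.com", timeout=10).text

for name, patterns in SIGNATURES.items():
    if any(p in html for p in patterns):
        print(f"Detected: {name}")
```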

Finding out that your competitors use Google, Facebook, and Twitter ads won’t surprise anyone, though. But you may find some niche platforms or display networks that are worth looking into. Here’s an example of what BuiltWith can reveal in its “Analytics and Tracking” section:

[Image: List of detected technologies, each with links to see stats and download a list of sites using it]

Pricing

BuiltWith is free for the use case I depicted above. I’ve never felt the need to consider its paid plans, which start at $246 a month. Those plans seem to go way beyond the basic functionality and may be worth it for businesses that need a deeper dive into what tech stacks their target audience and prospects use.

Check out BuiltWith here.

5. Visualping – for monitoring webpage changes

Website design and copywriting convey tons of information. They’re also constantly in development, testing, and changing. And whether you admit it or not, every website takes a bit (or a lot) of inspiration from competitors and other websites in the industry. You need to know what’s going on in this area.

Visualping is a tool that keeps track of changes on any webpage. You plug in a competitor’s URL, set up alerts, and you’ll be notified of any changes to the site.

My favorite functionality: Getting inspired by UX and CRO tweaks on competitors’ websites

I’ve made a lot of decisions based on website monitoring. Generally, the most common use case for any marketer is getting inspired by how your competitors try to squeeze more out of every visitor to their website.

In other words, we’re looking for user experience (UX) and conversion rate optimization (CRO) tweaks that we can adopt on our website without having to do all the research and A/B testing.

All you need to do is set up tracking for your competitors’ websites, and you’ll get alerted whenever there is a noteworthy change. For this use case, it’s enough to set the checking frequency to daily or even weekly.
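
At its simplest, change detection is a fingerprint comparison between visits. Here’s a toy sketch of the idea; real tools like Visualping diff rendered content and can ignore noise such as timestamps, which a raw hash cannot:

```python
import hashlib
import requests

def page_fingerprint(url: str) -> str:
    """Fetch a page and hash its body: a crude change signal."""
    html = requests.get(url, timeout=10).text
    return hashlib.sha256(html.encode("utf-8")).hexdigest()

# Placeholder: a competitor's pricing page.
url = "https://www.example.com/pricing"
current = page_fingerprint(url)

try:
    with open("last_fingerprint.txt") as f:
        previous = f.read().strip()
except FileNotFoundError:
    previous = None  # first run: nothing to compare against yet

if previous and current != previous:
    print(f"{url} changed since the last check")

with open("last_fingerprint.txt", "w") as f:
    f.write(current)
```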

You can also opt in for “any change” or “tiny changes,” as those tweaks can range from small changes in copy to just changing CTA button positions and colors:

[Image: On the left, the webpage you want to track; on the right, job settings to tailor preferences]

Keep in mind that you shouldn’t blindly copy whatever your competitors do. Those changes may be for the worse. Ideally, the change has to make sense for you, and you should know that the competitor does A/B testing (checking with BuiltWith is a good start).

Pricing

Visualping offers a free plan with up to 65 checks a month, which would probably be enough to cover the homepage, pricing, trial, and other important pages of your competitors on a weekly basis.

If you’re more serious about keeping track of your competitors’ websites, then do try out the paid plans. Those start at $11 a month for 1,200 checks. “Price per page” checks naturally get cheaper as you scale up the monitoring.

Check out Visualping here.

Final thoughts

You may be wondering: “So this is it? Just five tools to get a complete picture of the competitive landscape?”

Well, yes and no.

I only covered the essential tools that everyone can and should use without breaking the bank. I deliberately didn’t list tools that have a niche use (e.g., analyzing YouTube or mobile app landscape). I also avoided “overkill” marketing intelligence platforms like SimilarWeb that provide tons of competitive data but mainly target enterprise and agency customers.

Also, keep in mind the tools I listed here are the ones I have experience with and like the most. Almost every tool has a solid alternative that you may like more.

With this out of the way, I also want to highlight a few more competitive analysis resources (not necessarily tools) that are super helpful and mainly free:

  • Financial reports of public companies to access a goldmine of firsthand information about your competitors
  • Surveys and focus groups to get quantitative and qualitative data about your market and competitors
  • Ghost shopping to get direct customer experiences from your competitors and possibly uncover their sales tactics
  • Review platforms like G2, TrustPilot, Yelp, or Google My Business to check what your competitors’ customers say

So that’s it. If you’re just finding out about these tools and are not sure of the right way to conduct a competitive analysis, we also have a simple guide (including a template).

Got a tool that I should mention here? Or just a question? Ping me on Twitter.





Source link : Ahrefs.com