As the SEO industry evolves, so do the optimal practices and techniques for boosting website visibility. In this article, I will be presenting the most effective SEO strategies for 2023 that will help elevate your website to the top of Google search rankings. Are you ready to learn more? Let’s get started.
Get your site properly crawled and indexed
To ensure that your website is effectively crawled and indexed by search engines, it is important to follow these best practices:
- Submit your sitemap: Submitting your sitemap to Google Search Console and other search engines can help them better understand your website’s structure and content.
- Check for crawl errors: Regularly monitor your website for crawl errors and fix any issues to ensure search engines can access your pages.
- Optimize your robots.txt file: Use a robots.txt file to block search engines from crawling pages that aren’t relevant or necessary for search results.
- Improve page load speed: Optimize your website’s load speed to ensure search engine crawlers can easily access your site and index its content.
By following these practices, you can improve your website’s crawlability and ensure that your content is indexed properly by search engines.
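For example, a robots.txt implementing the blocking advice above might look like this (the disallowed paths are purely illustrative):

```
User-agent: *
Disallow: /cart/
Disallow: /search?
Disallow: /admin/

Sitemap: https://www.example.com/sitemap.xml
```

Blocking low-value paths like internal search results keeps crawl budget focused on the pages you actually want in the index, while the Sitemap line points crawlers straight at your sitemap.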
Speed up indexing with Google Indexing API
One way to speed up the indexing process for your website is to use the Google Indexing API. This API allows you to notify Google when new or updated content is available on your site, which can result in faster indexing and improved search visibility.
To use the Google Indexing API, you’ll need to follow these steps:
- Set up your website with Google Search Console and create an API key.
- Register your API key with the Google Indexing API service.
- Implement the API code on your website so that it sends notifications to Google when new or updated content is available.
- Monitor the API for any errors or issues and troubleshoot as necessary.
By using the Google Indexing API, you can improve your website’s visibility on search engines and ensure that your content is indexed quickly and accurately.
The difference between submitting a sitemap via Search Console and using Indexing API
Submitting a sitemap through Google Search Console is like sending a notification to Google saying, “Hey, I have a new sitemap, please take a look.” The Indexing API takes it a step further: instead of merely telling Google that content exists, it directly requests that specific URLs be crawled and indexed. This requires a script to be implemented on your website.
To begin using the Indexing API, you must first activate it in the Google API Console and create a service account. It’s important to note that you’ll need to grant the new account an owner status to ensure full access to the API.
To use the new service account with Google Search Console, you’ll need to verify ownership by adding the email associated with the service account as a site owner. This will ensure that you have the necessary permissions to access and manage your website’s search data using the new service account.
Back in Google API Console, generate a key for your new account, like this:
Once you have completed the necessary steps, you can use the Indexing API to notify Google about new pages on your site. You can use scripts to add or exclude individual URLs or sets of pages by creating a .csv file. For example, you can use this script to automate the process.
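As a rough sketch of what such a script prepares, the snippet below builds the notification bodies the Indexing API expects; the endpoint is the documented one, while the CSV format and function names are illustrative assumptions:

```python
# Endpoint documented for the Google Indexing API.
ENDPOINT = "https://indexing.googleapis.com/v3/urlNotifications:publish"

def build_notification(url, action="URL_UPDATED"):
    """Build the JSON body of one Indexing API notification.
    URL_UPDATED covers new and changed pages, URL_DELETED covers removals."""
    if action not in ("URL_UPDATED", "URL_DELETED"):
        raise ValueError(f"unsupported notification type: {action}")
    return {"url": url, "type": action}

def notifications_from_csv(rows):
    """Turn hypothetical 'url,action' CSV rows into request bodies;
    rows without an action default to URL_UPDATED."""
    bodies = []
    for row in rows:
        url, _, action = row.partition(",")
        bodies.append(build_notification(url.strip(), (action or "URL_UPDATED").strip()))
    return bodies
```

Actually sending each body to the endpoint requires an OAuth token obtained with your service account key, for example via the google-auth library.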
If your website is built on WordPress, there is a plugin available that simplifies the process of using the Indexing API.
Using the Indexing API does require some coding skills, so you may want to enlist the help of your webmasters. Alternatively, you can generate a script using ChatGPT by providing it with a task as if you were giving it to a human developer.
Note that each service account is limited to processing 200 URLs per day. However, you can increase this limit by creating multiple accounts and connecting them to your website. This allows you to quickly index thousands of pages and avoid waiting for months.
Keep site structure shallow
An optimal site structure is shallow, meaning that all destination pages should be no more than three clicks away from the homepage. If destination pages are located too far from the homepage, they may encounter indexing issues. Google crawlers consider the depth of a page in relation to its importance, so if a page is deep within a site structure, it may be deemed less important and therefore less likely to be recrawled and appear in search engine results pages (SERPs).
Having a shallow site structure also allows for the addition of new pages without disrupting the overall site structure or increasing click depth, thus providing ample space for new content.
Tip. Use the Click Depth column of WebSite Auditor to quickly understand the click depth of your pages.
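If you'd rather check click depth yourself, a breadth-first walk over your internal-link graph gives each page's minimum number of clicks from the homepage; the `{page: [linked pages]}` graph format here is a hypothetical example of what a crawler export might look like:

```python
from collections import deque

def click_depths(links, home="/"):
    """Compute each page's click depth (shortest number of clicks
    from the homepage) over an internal-link graph."""
    depths = {home: 0}
    queue = deque([home])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:  # BFS: the first path found is the shortest
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths
```

Any page whose depth comes out above three is a candidate for relinking closer to the homepage.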
Set up redirects carefully
Issues with redirects can hinder the flow of link juice to your pages and even prevent them from being indexed. To avoid this, ensure that your redirects are properly set up and do not cause confusion for Google or your site visitors.
One important factor to consider is the topical relevance of the pages being redirected. Redirecting a page about tomatoes to a page about cucumbers, for example, would not make sense to Google or site visitors. Such redirects may be treated as “soft 404s” because Google cannot find any connection between the content of the redirected pages. Therefore, it is crucial to ensure that your redirects are relevant and accurately guide both search engines and visitors to the appropriate content.
It is interesting that John Mueller has recently said that redirects from non-relevant pages don’t cause the destination page to be less relevant:
However, this statement appears to contradict Mueller’s earlier comments during a Google Webmaster Central office-hours video hangout. In that video, Mueller stated that a 301 redirect from all pages to the homepage would be viewed by Google as a set of soft 404 errors.
While some may argue that the aforementioned video is outdated and that Google’s policies may have changed since then, there are real-life examples that indicate otherwise. For instance, a website redirected a strong page to a new page that was topically irrelevant and weaker in terms of content quality, with the hope of boosting its rankings. However, the result was the opposite of what they had hoped for, and the new page lost positions instead of gaining them.
The graph clearly illustrates that the drop in traffic coincided with the implementation of the topically irrelevant redirects. However, once the issue was fixed and the redirects were corrected, the clicks on the website began to grow again.
On the other hand, implementing a topically relevant redirect can have a positive impact on your page’s position and traffic, as it can help Google better understand the content of the page and its relevance to the search query.
While some people believe that redirecting all pages to the homepage can boost SEO, it can be considered a soft 404 by Google, as confirmed by Mueller in a Google Webmaster Central office-hours video hangout. To avoid this, use a 301 redirect to inform Google that your page has been moved and avoid long redirect chains, which can make it harder for Google to discover the destination page and drain PageRank.
If a page has been deleted and cannot be replaced and redirected, consider creating a custom 404 response page that features potentially helpful links to prevent users from bouncing. Keep in mind that this page is for users, not for Google, so it should not be indexed.
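One way to audit for long chains before they drain PageRank is to flatten your redirect map and flag multi-hop paths — a minimal sketch, assuming your redirects can be exported as a `{source: target}` mapping:

```python
def resolve_redirects(redirects):
    """Flatten a {source: target} redirect map to final destinations,
    flagging chains longer than one hop and redirect loops."""
    resolved, issues = {}, []
    for source in redirects:
        seen, current, hops = {source}, redirects[source], 1
        while current in redirects:
            if current in seen:  # we came back to a URL we already visited
                issues.append(f"loop starting at {source}")
                break
            seen.add(current)
            current = redirects[current]
            hops += 1
        else:  # reached a final destination without looping
            resolved[source] = current
            if hops > 1:
                issues.append(f"chain of {hops} hops from {source}")
    return resolved, issues
```

Every flagged chain can then be replaced with a single direct 301 to the final destination.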
Link your pages properly
To ensure that all pages on your website can be discovered by Google, it is essential to have proper internal linking. If a page is not linked at all or is hidden within a redirect chain, it will not be crawled or indexed by search engines, rendering it useless.
To enhance the discoverability of your pages and improve their visibility, create a navigation menu that includes all relevant pages. As shown in the following graph, adding a page to the menu significantly increased its visibility:
Internal linking can also be utilized to allocate PageRank and improve the ranking of specific pages. Additionally, it helps Google establish logical connections between pages and topics, and the anchor texts should clearly indicate the primary idea of the page they lead users to.
There are often debates about the usefulness of footer links.
Whether or not to use footer links on a website is a matter of personal preference and depends on the specific circumstances. Footer links are typically added to help users and search engines navigate a website, particularly if some links are not included in the navigation menu for some reason. While footer links are generally harmless, whether or not to include them ultimately comes down to the needs of the website owner.
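A quick way to catch unlinked (orphan) pages is to compare your sitemap against the pages actually reachable from the homepage through internal links — a sketch using the same hypothetical link-graph format as a crawler export:

```python
def find_orphans(sitemap_urls, internal_links, home="/"):
    """Return sitemap pages unreachable via internal links from the
    homepage — pages Google may never discover by crawling."""
    reachable, stack = {home}, [home]
    while stack:  # depth-first traversal of the internal-link graph
        page = stack.pop()
        for target in internal_links.get(page, []):
            if target not in reachable:
                reachable.add(target)
                stack.append(target)
    return sorted(set(sitemap_urls) - reachable)
```

Any URL this returns should either get an internal link (e.g. from the navigation menu) or be reconsidered entirely.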
Improve page load speed
Slow-loading pages lead to lower search engine rankings and high bounce rates. In addition, Google may view slow pages as empty or cloaked, which could result in penalties or even delisting. To address PageSpeed-related issues, make sure your pages load quickly and pass the Core Web Vitals assessment. Tools such as Google’s PageSpeed Insights, Search Console, or WebSite Auditor can check your pages, identify opportunities for improvement, and suggest fixes.
Excessive use of images, videos, and design elements can be one of the primary reasons for PageSpeed-related problems. While visually appealing content is crucial for a website, it’s essential not to go overboard with it.
- Don’t use heavy visual elements within the viewport — they are hard to load and don’t let Google see the content behind them;
- Do not overuse intrusive pop-ups — they prevent Google from quickly loading the page and annoy users;
- Clean up your site code to remove legacy elements — you may not need them anymore, still, they waste loading resources and slow the page down;
- If you use a CMS, be careful with plugins — although they are meant to help you, they also slow the site down. Make sure you only keep the necessary ones or choose a CMS which is SEO-friendly out of the box.
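As an illustration of the first two points, modern HTML lets you lazy-load below-the-fold media and defer non-critical scripts so the viewport renders first (filenames here are placeholders):

```html
<!-- Lazy-load images outside the initial viewport so they don't block rendering -->
<img src="gallery.jpg" alt="Product gallery" loading="lazy" width="800" height="600">

<!-- Defer non-critical scripts so they execute only after the page is parsed -->
<script src="analytics.js" defer></script>
```

Setting explicit width and height also prevents layout shifts, which helps with the Core Web Vitals assessment.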
Optimize for mobile devices
Optimizing your website for mobile devices is essential, as Google now follows mobile-first indexing, and many users prefer accessing information on their smartphones rather than on PCs. To improve your mobile SEO, make sure your site design is responsive and adapts to any device. Avoid viewport-blocking pop-ups and ads, as they degrade the user experience and may even result in a Google penalty. It is also crucial to keep the font size easily readable. If you use a CMS, select a responsive site template with built-in support for all device types. Finally, use the Mobile Usability report in Google Search Console to check for any mobile-related issues.
Set up and connect local versions correctly
To avoid issues with international site versions in terms of localization, it is essential to handle everything properly. One of the challenges faced by international SEOs and site owners is how to strengthen newly created localized versions and link them to the original domain.
Although interconnection can be managed using hreflang settings and specifications, spreading link juice may be more complicated. Thus, it is crucial to carefully consider how to set up local site versions as a separate domain, a subdomain, or a directory. However, the best option depends on various factors such as business peculiarities, legal procedures, and content.
It is important to note that if the primary goal is to pass link juice to new pages, then creating directories is the only option. This is because a strong domain can help boost new pages. However, having international versions as directories may not always be feasible. In such cases, it is necessary to properly interlink the new site versions with the main one to ensure that these links work like backlinks and give PageRank to newly built pages.
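Whichever structure you choose, the local versions are interconnected with hreflang annotations in the head of each page, so Google knows which version to serve to which audience — for example (URLs are placeholders):

```html
<link rel="alternate" hreflang="en" href="https://example.com/en/page/" />
<link rel="alternate" hreflang="es" href="https://example.com/es/page/" />
<link rel="alternate" hreflang="x-default" href="https://example.com/" />
```

Note that hreflang annotations must be reciprocal: every localized version should list all the others, including itself, and `x-default` names the fallback for unmatched locales.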
Build high-quality backlinks
Acquiring high-quality backlinks is a crucial aspect of SEO, despite Google’s claims to the contrary. SEO experts have demonstrated their significance many times, and Google’s numerous updates and link-related regulations confirm it. Here are some ways to obtain high-quality backlinks for your website without incurring a penalty from Google.
Find sites that link to two or more competitors
Finding backlink opportunities from websites that link to your competitors is a great way to acquire authoritative and relevant backlinks. When a website links to several of your competitors, there is a higher possibility that they will link back to your website as well. Additionally, these prospects are likely to be relevant to your niche and trustworthy since they have already established a relationship with your competitors.
A nice way to find relevant prospects is to analyze your competitors via SEO SpyGlass (Domain Comparison > Link Intersection > Prospective Domains).
Note: Pay attention to the Domain InLink Rank column and check if provided links are dofollow, as they are what you’re looking for.
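The same intersection idea is easy to reproduce on exported data: given each competitor's referring domains, keep the domains that appear in two or more profiles. The `{competitor: set of domains}` format below is a hypothetical export shape:

```python
def link_intersection(backlink_profiles, min_overlap=2):
    """Given {competitor: set of referring domains}, return the domains
    linking to at least `min_overlap` competitors — the warmest prospects."""
    counts = {}
    for domains in backlink_profiles.values():
        for domain in domains:
            counts[domain] = counts.get(domain, 0) + 1
    return sorted(d for d, n in counts.items() if n >= min_overlap)
```

Raising `min_overlap` shrinks the list to the most broadly linking — and usually most receptive — prospects.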
Pitch guest posts
One effective way to enhance link building, establish brand awareness, and showcase expertise is by placing a guest post on a reputable website. To begin, search for relevant sources and inquire about their guest post policies. Some websites have a list of preferred topics, while others may require you to suggest topics. It’s important to agree on the number of links allowed per post.
Be sure to provide the necessary information to include in the author’s profile, such as professional achievements, links that showcase expertise, and information about any conferences or events that the author has participated in. These elements demonstrate expertise and contribute to the E-A-T signals that can also affect page strength.
Outreach listicles for inclusion
Listicles are a popular type of content that feature products and services in various business areas. To get your product or service featured in a listicle, research relevant listicles for your niche and reach out to the authors to include your product.
To streamline the process, you can use tools such as LinkAssistant to find listicle inclusion opportunities in bulk. The tool can show you relevant pages based on your preferences, provide a preview of the page, and even alert you if your competitors are mentioned on the page.
Reach out to your business partners
Your potential sources for backlinks might be closer than you realize – take a look at your business partners, clients, and providers, as they may have already mentioned your brand on their websites in reviews, testimonials, or lists of trusted vendors.
If you’re running a local business, it’s worth reaching out to other local businesses to see if they would be willing to include a mention or link to your website. This can help establish connections between your business and the surrounding area, which is an important factor in local SEO.
For example, Dr. Martens recently unveiled new storefront decorations in Chicago and gave credit to local artists and designers on their website by linking to their work. Building relationships with other businesses and individuals in your community can help to boost your online presence and attract more customers.
Use broken link building
The broken link building strategy allows you to identify broken backlinks of your competitors that lead to 404 pages, and then replace those broken links with your working ones. This approach is advantageous for both you and the website owner, as you get a valuable backlink, while they can continue to provide their users with relevant information.
This is how it works. In SEO SpyGlass, create a project for a competitor’s website and check the Backlink Profile > Backlinks section.
To utilize the broken link building tactic, start by identifying the pages that contain dofollow links returning a 404 status code. Then, manually check each page to determine if it is relevant to your website. If so, reach out to the webmaster or author of the post and propose replacing the broken link with a working one from your site.
It’s worth noting that backlinks in the main content of a page typically pass more link juice than those in other locations, but don’t dismiss opportunities from links in footers or navigation blocks. Even a single backlink from a high-authority site like Amazon or Apple can provide a significant boost to your page’s ranking.
Take care of on-page SEO
On-page SEO refers to optimizing individual web pages to rank higher and earn more relevant traffic in search engines. It’s crucial to take care of on-page SEO as it helps search engines understand what your content is about and rank it accordingly.
Here are some tips for on-page optimization:
- Conduct keyword research and use relevant keywords in your page title, headers, and throughout the content
- Optimize your meta tags, including the meta title and description, to provide a clear and concise summary of your page’s content
- Ensure your website is mobile-friendly and has a fast page load speed to improve user experience
- Use internal linking to guide users to related content and help search engines understand your website structure
- Optimize your images by compressing them, adding alt text, and using descriptive filenames to improve accessibility and help search engines understand the context of the image.
Taking care of on-page SEO can improve your website’s visibility in search results and drive more relevant traffic to your pages.
Create descriptive but short meta elements
The meta title and meta description play a crucial role in attracting users to click on a page from the search engine results page (SERP). However, Google tends to rewrite these elements based on the search query. For example, the description shown in Google’s search results for our post “Bring Your Dead Content Back To Life” appears to have been modified:
Whereas the actual meta description looks like this:
Although Google has a habit of rewriting meta titles and descriptions according to the search query, it’s still important to take control and optimize your own meta elements. Keep in mind that shorter meta titles and descriptions, with a maximum of 60 and 150 characters respectively, are less likely to be rewritten by Google. Additionally, placing important keywords towards the beginning of the meta elements can help Google quickly identify and display the most relevant information in the SERPs.
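Those length limits are easy to check in bulk — a minimal sketch, with the 60- and 150-character thresholds taken from the guideline above:

```python
TITLE_LIMIT, DESCRIPTION_LIMIT = 60, 150  # limits suggested above

def audit_meta(title, description):
    """Return warnings for meta elements likely to be truncated or rewritten."""
    warnings = []
    if len(title) > TITLE_LIMIT:
        warnings.append(f"title is {len(title)} chars (limit {TITLE_LIMIT})")
    if len(description) > DESCRIPTION_LIMIT:
        warnings.append(f"description is {len(description)} chars (limit {DESCRIPTION_LIMIT})")
    return warnings
```

Running this across a crawl export quickly surfaces the pages whose meta elements Google is most likely to rewrite.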
To get some inspiration, check how your SERP competitors manage titles and descriptions with the help of the WebSite Auditor’s Content Audit module:
Tip. You can try generating meta titles and descriptions with the help of ChatGPT.
Remember that you can tune your prompt to help the AI create a better version.
The suggestions are quite good, but it is still better to keep your meta description to a single sentence.
Set up structured data
Using structured data on your website can provide Google with a more detailed understanding of your content, which can increase your chances of earning a rich snippet in search results. Rich snippets, which often include images, ratings, and other information, are eye-catching and can result in higher click-through rates. Therefore, incorporating structured data into your website is essential for improving your search engine visibility and attracting more traffic.
Fortunately, there are schemas available for nearly every type of content, whether it’s a recipe or a product page. Simply choose the schema that fits your content and apply it correctly. However, it’s important to note that using an irrelevant or deceptive schema can result in a penalty from Google.
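For example, a product page might carry schema.org Product markup in JSON-LD like this (all values are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Hypothetical Running Shoe",
  "image": "https://example.com/images/shoe.jpg",
  "aggregateRating": {
    "@type": "AggregateRating",
    "ratingValue": "4.6",
    "reviewCount": "214"
  }
}
</script>
```

The rating fields are what feed the star snippet in the SERP — which is also why using ratings your page doesn’t actually display counts as deceptive markup.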
If you don’t feel like digging through nearly a thousand schemas, use Google Structured Data Markup Helper — its 12 schemas cover most basic needs.
Once everything is tagged, test the page via Google Rich Results Test to make sure all works fine.
Schema markup is a way to provide Google with structured data in an easily understandable form. The more information Google has, the better the chances of your entity being featured in a knowledge panel, which occupies a significant portion of the SERP.
If you are using a CMS, it is worth noting that some of them, such as Shopify, come with built-in schema markup by default. All you have to do is fill in the required data in your admin, and the system will handle the rest, granting you a validated rich snippet.
Make use of alt texts
Providing descriptive alt texts for your images serves multiple purposes. Firstly, it helps Google to comprehend the image content, thereby enabling its AI to train more effectively. Consequently, it can increase your image’s visibility, and hence, enhance the chances of getting more traffic. Secondly, even if image optimization is not your primary objective, increased clicks are always beneficial.
A good way of creating better alt texts is to upload your image to Google’s Cloud Vision API and check what Google already sees on the image.
You can then tune the text to make the description more detailed. Alternatively, you are free to change the description completely if Google did not get the content right.
Tip. If you don’t remember if images on your page have alt texts, check the page in WebSite Auditor.
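You can also scan a page for missing alt texts yourself with Python’s built-in HTML parser — a minimal sketch:

```python
from html.parser import HTMLParser

class AltAudit(HTMLParser):
    """Collect <img> tags whose alt attribute is missing or empty."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):  # catches both absent and empty alt
                self.missing.append(attrs.get("src", "(no src)"))

def images_missing_alt(html):
    auditor = AltAudit()
    auditor.feed(html)
    return auditor.missing
```

Feed it a page’s HTML and it returns the src of every image that still needs a description.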
Create useful content
The current trend in content creation is to prioritize the creation of valuable content for people, rather than just optimizing for search engines. In addition, with the increasing prevalence of AI writing tools, having real-life expertise in certain areas has become crucial for creating high-quality content.
Find the right keywords
When creating content for SEO, it is important to ensure that the content targets the relevant keywords and queries that the page is intended to rank for. Keyword research tools can be useful in determining these keywords, however, it is important to note that choosing keywords based solely on their potential traffic may not always be the best approach. Other factors such as relevance, intent, and competition should also be taken into consideration when selecting keywords for your content.
For example, when you search for keywords in Rank Tracker, pay attention to Keyword Difficulty.
If you find a keyword too challenging to target, it’s better to opt for a less competitive one with lower traffic potential that you can rank for. As your website grows stronger, you’ll be more equipped to compete for tougher keywords. However, it’s also important not to waste your resources on keywords that are too easy and offer little traffic potential. Therefore, finding a balance is crucial.
To help you decide which keywords to use, you can run your content through Google’s Cloud Natural Language API to see how Google interprets it. This tool can verify whether Google recognizes the relevant entities in your content. Based on the results, you can add the necessary keywords to help Google understand your intended meaning.
Utilize search intent
Understanding the search intent behind a query is crucial for ranking a page in search results. For example, a keyword with a clear transactional intent such as “buy sneakers” will not match a how-to post or a video. Therefore, it is essential to grasp the search intent of a keyword to produce the right content and rank higher on SERPs.
The best way to comprehend the intent behind a keyword is to examine the search engine results page (SERP) and create content that aligns with it. For instance, if the search results mostly show listicles, then creating a listicle would be the right approach.
Moreover, Google can give you an idea of the preferred content type through its top-ranking feature. Once you conduct a search, the first Google feature you see can guide you in creating the appropriate content.
Shopping usually indicates that SERPs will contain product pages, and Maps may signal that the SERP will feature a local pack and listings like “best X places to eat in Malaga.”
Here’s a helpful tip: You can use the Google SERP Features section in Rank Tracker to check what SERP features a keyword triggers, and determine what types of content to add to your page to get featured there. The gray features indicate the ones you’re not present in yet. This can be a great way to optimize your content and improve your chances of ranking higher in search results.
Prove Experience in E-E-A-T
Google’s content guidelines have long emphasized the importance of E-A-T, which has now evolved into E-E-A-T, with “Experience” as the new addition. As people increasingly use AI tools like ChatGPT for content creation, real-life experience has become even more crucial: however high the quality of AI-generated text, it cannot match the value of hands-on experience in a given field.
It’s important to note that ChatGPT and other AI writing tools have their limitations. For example, ChatGPT’s knowledge base is only up to date as of September 2021, which means it lacks information on any events or developments that have occurred since then. The tool even acknowledges this limitation itself.
(At the time of writing, there are 4 seasons of Doom Patrol — a fact that falls outside ChatGPT’s training data.)
To enhance your content, incorporate personal experience and use cases, as ChatGPT and other AI writing tools lack real-life experience with products. Although new datasets may be learned in the future, describing personal experience and unique use cases adds value to your content that cannot be obtained from product descriptions alone.
For instance, consider the experience of writing a post about Shopify. By creating an account and setting up a shop to sell eco-friendly bath products, one can gain firsthand knowledge of the internal workings of the platform and identify and address any issues that arise. This allows for clear and honest presentation of the platform’s capabilities to readers.
The SEO industry constantly adapts to technological advancements and evolves accordingly. Based on current trends, the tactics above are expected to remain relevant throughout 2023. Should anything change, we will promptly update this information so you can keep optimizing your website with the latest SEO practices.