The Universe's Most Brilliant SEO Audit for Site Traffic


How to Perform a Genius Website Audit, from Basic SEO Audit Level to Advanced SEO Audit Techniques: Professional-Level SEO Audit

A website audit is the first thing you do when you look at a website, and before you publish your own site you should make sure the audit is complete. Without an SEO audit, your site can pick up a bad reputation among SEOs. I learned this the hard way. If you want to work with other SEOs, you shouldn't just put your site online as-is, especially if you haven't added a User-agent and Disallow directive to your robots.txt file to keep crawlers out of unfinished sections. Wait, do you even have a robots.txt file? No? Then you won't get to work on an SEO team. (Do I have one for my new website? Yep.) If you develop websites and mobile or desktop apps too, you'll want to be sure everything functions at its optimal level before publishing even version 1.0.

I'm often told that anyone can do marketing. And the best compliment a math-proof wiz or software-engineering wiz will give you is that SEOs just copy and paste some back-end code, but not really. I always immediately say "not true" and walk away. That's because you can't fix ignorance overnight (or academic and intellectual field bias).

Upselling SEO: You're a Professional, Act Like One!

Upselling SEO and development (if the client pays for it) works like any business, honestly. Think about how McDonald's and Burger King make their money, or a well-built restaurant with a bar: they upsell. As an SEO and app/web developer, you must upsell yourself, and your team if you're on one. To do this, I always say "I build web traffic" first whenever I talk about work, to anyone. I could be at a family gathering, with my significant other's friends, at work, or at home; it doesn't matter. A few people will just agree and nod. Most people in professional fields outside marketing will say, "Anyone can do that." I reply, "I build traffic and do website optimization at a professional, non-amateur level. That brings in traffic with a high chance of converting into customers." Other professionals spend their time on other projects, so their SEO is amateur at best. After that exchange, people all want websites. You'll hear it at birthday parties, BBQs, work, friends' meetups, family gatherings, church, and every social event or gathering that's going well for you. People talk, by nature, at some point, and they'll be impressed by your professionalism: speed, communication, and accuracy. I also dress business casual all the time unless I'm weightlifting or jogging in the morning, and I try to remember a tie; suit jackets are for formal events, to me. As a modern professional, in the era after broadband internet got big around 1996, you need a website with a portfolio and a resume download link. Otherwise you (or your customer) aren't competing at the same level as others in your profession, whether they're weekend-warrior SEOs or professionals putting in 40 to 60 hours a week at speed, with quality and detail.

Screaming Frog and Link Sleuth     

Use Screaming Frog and Xenu's Link Sleuth to audit your website or your customer's website. Screaming Frog's free version covers small to mid-sized websites. For mid-size to enterprise businesses you'll need the paid license, around one hundred euros (multiply by the euro-to-dollar exchange rate, roughly $1.10 as this article is written).

SEO Audit Web Analytics Plan, Strategy, and Steps: Laying Out the Mission

There are five prioritized sections in a professional SEO audit:

  1. Accessibility of the website
  2. Indexability
  3. On-page ranking factors
  4. Off-page ranking factors
  5. Competitor analysis

Check the robots.txt file. It usually lives in the root or main directory of a website (the public_html folder if you manage the server through cPanel).

So if your potential client or prospect customer's site is example.com, type example.com/robots.txt into the browser's address bar.

The file should be there. If not, they need one! That's professional, standard MO (mission operation). Mission = get traffic to the site = get prospective customers = get customers = create web development opportunities, or build a site ASAP if they don't even have one. That's why I'm an SEO: the client needs a professional-SEO-level site in 24 to 72 hours, not 24 to 72 weeks. Seriously (I saw you frowning at me!), seriously = professional SEO. Feel free to syndicate this article by rewriting it in your own words, kind of like college students often do. Mix it with other articles and thoughts as long as you attribute the author, link to the author, write up an SEO outreach template (see the upcoming blog post on that), and turn this into an outline, template, or Google Sheets doc (lol, I almost said PowerPoint; I feel old now :) ).
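If you want to check a robots.txt file programmatically rather than by eye, Python's standard library can parse one and tell you whether a given bot may fetch a given URL. This is a minimal sketch; the robots.txt content and the example.com URLs below are hypothetical stand-ins for the live file you would fetch from the client's site.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; for a real audit, fetch the live file
# from https://example.com/robots.txt instead.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

def is_crawlable(url, agent="Googlebot"):
    """Return True if the given crawler is allowed to fetch the URL."""
    return parser.can_fetch(agent, url)
```

Run this against every important URL on the site: a page you want ranked that comes back False is a problem you can fix before anyone else notices.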

Meta Tags for Web Crawlers: the Robots Meta Tag (alongside the Robots.txt File)


<meta name="robots" content="noindex, nofollow">
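A tag like the one above, left over from development, will silently keep a page out of the index. Here's a small sketch of how you might scan a page's HTML for it using only the standard library; the HTML snippets in the test are hypothetical examples.

```python
from html.parser import HTMLParser

class RobotsMetaFinder(HTMLParser):
    """Collects the content of any <meta name="robots"> tag on a page."""
    def __init__(self):
        super().__init__()
        self.directives = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and (a.get("name") or "").lower() == "robots":
            self.directives.append((a.get("content") or "").lower())

def page_is_indexable(html):
    """False if any robots meta tag on the page contains 'noindex'."""
    finder = RobotsMetaFinder()
    finder.feed(html)
    return not any("noindex" in d for d in finder.directives)
```

Run it over each key template of the site before launch; a stray noindex on the homepage is exactly the kind of thing that costs you traffic for weeks.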


Check HTTP Status Codes

The URLs can't return errors, and redirects should be 301s to channel the most link juice (SEO link authority and relevance).
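The status-code check above is easy to automate once you have a crawl. Below is a minimal sketch over a hypothetical crawl result; the URLs and codes are made up for illustration, and in practice you'd feed in the export from Screaming Frog or your own crawler.

```python
# Hypothetical output of a site crawl: (url, status_code) pairs.
crawl_results = [
    ("https://example.com/", 200),
    ("https://example.com/old-page", 302),
    ("https://example.com/missing", 404),
]

def audit_status_codes(results):
    """Flag client/server errors and any redirect that isn't a 301."""
    issues = []
    for url, status in results:
        if status >= 400:
            issues.append((url, f"{status} error: fix the page or 301-redirect it"))
        elif 300 <= status < 400 and status != 301:
            issues.append((url, f"{status} redirect: use a 301 to pass link equity"))
    return issues
```

Anything the function returns goes straight into your audit report: errors first, then temporary redirects to convert into permanent ones.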

Make Sure There is an XML Sitemap      

A sitemap.xml file should be in the root directory. Test for this by typing example.com/sitemap.xml into the browser, and make sure the file follows the sitemaps.org protocol.

Submit the sitemap.xml file to your search engine webmaster tools accounts

If there are pages in the sitemap that aren't in the search engines, that's bad. These are called "orphaned" pages. Find an excellent location for each one in the website architecture (this is called SEO architecture), then create internal links to the page through the top navigation, menu navigation, section navigation, sidebar navigation, or in-paragraph text links. Do this for Google and Bing at the least, and for other search engines too.
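Finding those orphaned pages is just a set difference: everything in the sitemap minus everything the engine reports as indexed. Here's a sketch with a toy sitemap; the example.com URLs and the "indexed" set are hypothetical, and in a real audit you'd fetch the live sitemap and pull the indexed list from your webmaster tools.

```python
import xml.etree.ElementTree as ET

SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

# A toy sitemap.xml; in a real audit, fetch https://example.com/sitemap.xml.
SITEMAP_XML = b"""<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/services</loc></url>
  <url><loc>https://example.com/contact</loc></url>
</urlset>"""

def sitemap_urls(xml_bytes):
    """All <loc> URLs listed in a sitemap."""
    root = ET.fromstring(xml_bytes)
    return {loc.text for loc in root.iter(SITEMAP_NS + "loc")}

def find_orphans(xml_bytes, indexed_urls):
    """Sitemap pages the search engine has not indexed."""
    return sitemap_urls(xml_bytes) - set(indexed_urls)
```

Each URL the function returns needs a home in the navigation and internal links pointing at it, as described above.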

Please remember that Microsoft reports Bing has about a thirty-three percent market share of search engine visits in the United States.

I lost thirty-three percent of my traffic in a rush to put a site up, because I didn't submit my sitemap.xml to Bing Webmaster Tools. Ouch! If you don't SEO-audit a website while you're building it on your hard drive or cloud drive, and you put it up anyway, an SEO professional might notice and question your professionalism. You can handle that, but it's a waste of your time, your resources, and possibly your Google reviews on your Google business listing. (You should have one, and fill it out!)

That would be a great time to delegate work to freelancers or to people who report to you at your job, perhaps an intern or an SEO specialist if you're an SEO manager.

Website Architecture SEO Audit    

The website architecture, for digital marketing, SEO, and web development, is the overall layout of the site. There is vertical website structure and horizontal website structure. Vertical structure is how deep the website goes; an enterprise or mid-sized business site can have a ton of pages several levels down. Horizontal structure covers the obvious main pages, the ones linked from the site's home page. You'll see the website's menu navigation near the top, just below (or just above) the site's logo, banner, or JavaScript/Flash slider if the site has one.

Flash and JavaScript Navigation

Flash and JavaScript are being crawled by more advanced, more intelligent search engine bots. However, it is still an SEO best practice not to rely on them for navigation. Test this by toggling JavaScript on or off in your browser's settings or options menu. Then make two reports: one of Google's and Bing's indexing of the URLs with JavaScript turned on, and one with it off. If JavaScript is blocking the indexing of certain pages, fix the navigation to those pages. Those pages were SEO-orphaned by a heavily coded JavaScript navigation and need to be fixed ASAP.

Website Performance / Page Loading Speed

There are a few excellent tools to test a website's page speed: YSlow, Google PageSpeed, and Pingdom. These tools check whether content compression is enabled so that images (JPEGs, GIFs, PNG files) and other assets load faster. There are also ways to defer resources that load too slowly, or remove them if you have to; you may need to consult your server-side developer. If your site scores below 80, it's too slow. Users may leave, and often do, if the site is really slow. Search engine bots are also programmed to abandon sites that crawl too slowly, and they prioritize indexing sites that load faster, which can lift keyword rankings across the whole site.

These reports help you spot the web pages and page elements that are pinching the site like a traffic jam, keeping the Google or Bing bots from flowing through the website.
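One of the compression wins those tools look for is easy to see for yourself: text-heavy responses like HTML shrink dramatically under gzip. This is a rough sketch using Python's standard library; the sample HTML is a made-up repetitive snippet, chosen because repetition is exactly what compression exploits.

```python
import gzip

def compression_savings(body: bytes) -> float:
    """Fraction of bytes saved by gzip-compressing a response body."""
    compressed = gzip.compress(body, compresslevel=6)
    return 1 - len(compressed) / len(body)

# Hypothetical page body: repetitive HTML compresses extremely well.
sample_html = b"<div class='card'>Hello, visitor!</div>" * 500
```

If a server isn't sending Content-Encoding: gzip (or brotli) for responses like this, enabling it is usually the cheapest page-speed fix in the whole audit.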

Section 2: Indexability of the Website for Search Engines

Use the "site:" command (or operator, as I call it) to view what Google sees. It gives you good insight into how many pages Google and Bing are indexing. Too big a difference between those results and the pages in your sitemap.xml file, and you may have duplicated content. Copyscape will help pinpoint that content; I'll write more about it later in this post.

Auditing Crucial or Vital Main Pages Under the Magnifying Glass

Next, check the indexability of the web pages in more targeted detail. For time purposes, prioritize the audit to look closely at the highest-level pages on the website.

Search Engine Penalties: Google & Bing

Often "search engine penalties" are just a page that wasn't indexed, an orphaned page, or a 404 missing page that wasn't redirected via the .htaccess file or cPanel settings.

If it wasn't one of those, you'll usually get a message from your webmaster tools, written in negative red typography (#FF0000), so you'd know right away.

Lastly, it could have been a major SEO algorithm update. Be sure to keep up to date with SEO news. The best way: Google "SEO news" if you've never done so before, read the major SEO news sites and their blogs, pick a favorite or two, and subscribe to their email newsletters. That way it's right in your morning inbox.

Penultimate SEO Penalty Task     

Diligently fix the issue at hand. Read our site and blog for professional tips on how to fix SEO website problems; we often troubleshoot mid-range to advanced SEO problems.

Final SEO Step for a Search Engine Penalty

Lastly, submit the website to be re-added to Google or Bing. You only need to do this when you receive an actual penalty, which you should see glaringly in your webmaster tools; otherwise every page of your website will simply stay out of Google or Bing until you fix the underlying issue. Google's and Bing's websites have information on how to go through the resubmission process.

Uniform Resource Locators, commonly known as URLs: Optimal SEO for URLs

URLs should be no longer than 155 characters, so they fit into profile pages and other text boxes with character limits. Keep in mind that Twitter effectively gives you about 120 characters too, so if you share your posts on Twitter, 120 is really the limit, spaces included. Spaces, question marks, exclamation points, and special characters such as <, >, and = all count toward the limit.
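Checking that budget across a whole crawl is a one-liner worth automating. A minimal sketch, assuming the 155-character limit suggested above; the URLs in the test are hypothetical.

```python
MAX_URL_LENGTH = 155  # the character budget suggested in this article

def too_long_urls(urls):
    """Return the URLs exceeding the character budget, longest first."""
    return sorted((u for u in urls if len(u) > MAX_URL_LENGTH),
                  key=len, reverse=True)
```

Feed it the URL column from your Screaming Frog export and anything it returns is a candidate for a shorter slug plus a 301 redirect.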

Domains and Sub-Domains  

If you create a subdomain, search engines usually treat it as a different website. It's better to put your content into a new folder to organize it than to create a subdomain. Sites with subdomains are usually companies that acquire another company and end up with an entirely new or newly created staff to manage that whole subdomain.

URL Parameters:   

Use static (non-changing) URLs. If you must use dynamic URL parameters, register them in your Google and Bing webmaster tools.

SEO URL Hyphens –  

Hyphens are fine, as opposed to underscores; Google and Bing have historically handled underscores inconsistently when indexing.
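You can bake that rule into a small slug generator so that titles always become lowercase, hyphenated URLs with no underscores or stray punctuation. A minimal sketch; the title in the test is a made-up example.

```python
import re

def slugify(title: str) -> str:
    """Turn a post title into a lowercase, hyphenated URL slug."""
    slug = title.lower()
    slug = re.sub(r"[_\s]+", "-", slug)      # spaces and underscores -> hyphens
    slug = re.sub(r"[^a-z0-9-]", "", slug)   # drop any other punctuation
    return slug.strip("-")
```

The same normalization is what WordPress does for you once permalinks are set to use the post title, as described below.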

WordPress Blog:

When you create a post in WordPress, the permalink structure for posts and pages is set to Plain by default, which puts a scramble of random numbers at the end of the URL. Go to Settings in WordPress, click Permalinks, then change the setting to use the post title. Write your WordPress post title with the keywords you want for SEO, and they'll carry through to the URL and the title tag.

Duplicate Content Relating to URLs   

URLs are ubiquitously found guilty of producing duplicate content issues. Google and Bing don't want to index the same content twice and will demote a website in the rankings, or remove it from the results entirely, if there is way too much duplicate content, whether on your own site or at offsite duplicate URLs pointing to it. The crawling program you use, like Screaming Frog, should find the duplicate links, and you should also check for them manually. Google's Panda update is the one that hit websites with too many duplicate URLs hardest.

Content is the Boss on the World Wide Web   

The content, what is actually on your website, sits at the top of the hierarchy for the web. In Google, you can click the drop-down arrow next to the result that displays your search term, then click Cached, to view the page's cache. This shows how Google's crawlers see the web page. Click Current to get back to Google's SERPs.

BrowSEO and SEO Browser are wonderful tools, too, for viewing pages the way search engines see them.

Substantive Content: Every page should have at least 500 words. If not, this lowers the ranking of the page.

Bounce Rate: Bounce rate can be a tricky metric. A visitor might get what they want fast and leave from the same page, which records a bounce in your web analytics program. A lot of bounces with no conversions on a page, though, can indicate a content or content-navigation problem with that particular landing page. Of course, a one-page micro-site for a corporate sales event will have a 100 percent bounce rate.

Time Visitors Spend on the Page: Use your installed web metrics or analytics program to analyse how long visitors spend on a page. A visitor who spends longer on a page may find the content really amazing, or may be keeping the tab open while working on a project for college or work.

Keywords: Make sure the keywords are in your content. Warning: don't overdo it, or it won't read well for humans, which is bad.

Professional Credibility: Use spell check! Don’t post content with a ton of spelling errors in it. 

Flesch Reading Ease: A readability score used to make sure content is easy for web visitors to read.

Fog Index: Another readability score for checking a website's copy.

Accessible Content: When you put your visible text inside an image file, Flash, or convoluted JavaScript, Googlebot and Bingbot can't crawl it very well.
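The Flesch Reading Ease score mentioned above can be approximated in a few lines. This is a rough sketch, not a reference implementation: the syllable counter is a crude vowel-group heuristic, and the sample sentences in the test are invented purely to show that simple prose scores higher than dense prose.

```python
import re

def count_syllables(word: str) -> int:
    """Very rough heuristic: count groups of consecutive vowels."""
    groups = re.findall(r"[aeiouy]+", word.lower())
    return max(1, len(groups))

def flesch_reading_ease(text: str) -> float:
    """Approximate Flesch Reading Ease; higher = easier (60-70 is plain English)."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))
```

A dedicated readability library will count syllables more accurately, but even this sketch is enough to flag pages whose copy is much denser than the rest of the site.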

Analyse Your Content for SEO    

Information Architecture: The blueprint or drawing board of your website is its information architecture. Make sure each page has a sense of purpose: it's themed and does something specific for the visitor. Each page should be optimized for a keyword or keyword phrase; use the Google Keyword Planner and other keyword planning tools.

Multiple Site Pages Targeting the Same Keyword:   

This is called keyword cannibalization. Google and Bing don't like it; it confuses the search engines and visitors. The solution is to merge the pages, or to change the theme or purpose of one of them and target a different keyword.

Duplicate Content: 

This is a negative ranking factor that needs to be fixed. It's when the same content appears at more than one URL, either on your website or off it (on the sites linking to yours).

To fix it, run your site crawl, then determine which version of the content should be designated the original. The duplicate content and pages will need to be repurposed, rewritten, or removed entirely. You may also need .htaccess redirects or cPanel redirects.

Blekko is a great tool for showing the duplicate content onsite and offsite for your SEO campaign. 
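For onsite duplicates, a simple approach is to fingerprint each page's normalized body text and group URLs whose fingerprints collide. A minimal sketch; the pages dictionary in the test is hypothetical, and a real pipeline would hash the extracted body text from your crawl rather than raw strings.

```python
import hashlib
import re

def content_fingerprint(text: str) -> str:
    """Hash of whitespace/case-normalized text; equal hashes = duplicate copy."""
    normalized = re.sub(r"\s+", " ", text.strip().lower())
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

def find_duplicates(pages):
    """pages: {url: body_text}. Returns groups of URLs sharing identical content."""
    by_hash = {}
    for url, body in pages.items():
        by_hash.setdefault(content_fingerprint(body), []).append(url)
    return [urls for urls in by_hash.values() if len(urls) > 1]
```

Exact hashing only catches identical copy; near-duplicates need fuzzier techniques (shingling, MinHash), but exact matches are the ones most likely to trip a Panda-style filter.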

HTML: HTML is a very important ranking factor. Make sure it's validated; run it through a working validator such as the W3C's.

CSS: Make sure this is validated too. 

Title Tags: 

Make sure the title tags aren't over 70 characters; that's roughly the amount that shows up in Google and Bing. Each title should describe the content of its page, with the proposed keyword or keyword phrase near the front. Use Google Webmaster Tools to find duplicate titles, under Optimization > HTML Improvements.

Meta Descriptions: These don't directly impact rankings in Google and Bing, but they affect click-through rate, which is an SEO ranking factor. They can be up to 155 characters long. Make them succinct, and don't overthink or over-analyze them.
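The two length budgets above (70 characters for titles, 155 for descriptions) are easy to enforce in bulk. A minimal sketch, assuming the limits stated in this article; the sample title and description in the test are hypothetical.

```python
TITLE_LIMIT = 70        # characters shown in Google/Bing results, per this article
DESCRIPTION_LIMIT = 155

def check_snippet_lengths(title: str, description: str):
    """Return a list of human-readable problems; empty if both fit."""
    problems = []
    if len(title) > TITLE_LIMIT:
        problems.append(f"title is {len(title) - TITLE_LIMIT} chars over the limit")
    if len(description) > DESCRIPTION_LIMIT:
        problems.append(f"description is {len(description) - DESCRIPTION_LIMIT} chars over")
    return problems
```

Run it over every page's title/description pair from your crawl export and the report section on metadata writes itself.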

Meta Keywords: If you find pages with meta keywords on them, delete the meta keywords; they're treated as a spam signal by Google and Bing.

rel=”canonical” link: These help to avoid duplicate content penalties. Review them on your website. 

Pagination: Pages that use pagination should use rel="next" and rel="prev" link tags so search engines understand the series.

Images: Google and Bing don't read an image's pixels directly; they use the image's "alt" attribute and file name to index it. Make sure the alt text is optimized with keywords, and don't duplicate it across the site.
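Missing alt text is one of the most common findings in an audit, and it's easy to scan for. A small sketch using the standard library's HTML parser; the HTML in the test is a made-up fragment.

```python
from html.parser import HTMLParser

class AltAuditor(HTMLParser):
    """Collects image sources that are missing a non-empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing_alt = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            a = dict(attrs)
            if not (a.get("alt") or "").strip():
                self.missing_alt.append(a.get("src", "(no src)"))

def images_missing_alt(html):
    auditor = AltAuditor()
    auditor.feed(html)
    return auditor.missing_alt
```

Every src it returns is an image invisible to image search until someone writes descriptive, keyword-aware alt text for it.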

Outbound Linking:

Only link out to trustworthy websites; it's a very bad signal to search engines if you link out to a poor-quality site. I use the SEOmoz toolbar to see a page's page authority and domain authority. Link out to content that is related to the page; this helps the page's authority. Broken links need to be fixed: use Link Checker or a Screaming Frog site crawl.

Internal Site Links: Make sure the most internal links, the ones on your own site, point to your most important pages. This way visitors who arrive on other pages will find your conversion pages.

H1 Tags: Each page should have headings. They matter to search engines, and putting carefully planned keywords in them is very important.

Frames & IFrames: The content inside these is attributed to its source URL, not to your page, so it won't count as content on your page.

Ads: Don't have an ad every other word; that looks spammy to search engines. Keep ads to at most a third of the content, maybe even a quarter, honestly.

Section 4: Off-Website Ranking Factors

Being Popular: 

Being popular means your website influences more people and attracts more attention through social media, blogs, and offline chat or banter. It's an indicator, a puzzle piece, for predicting the website's success.

Increasing Traffic?: Is your website gaining traffic? Use your analytics (which should already be installed) to make sure traffic is at least slowly growing. If it isn't, you'll need a new digital marketing plan or strategy.

Site Trust: This is subjective, but you can measure malware and spam. Use DNS-BH or Google's Safe Browsing API; McAfee's is another good one. Don't stuff keywords, hide text, or cloak (returning a different page to Google than the one users see). Make sure you link to trustworthy sites and that your inbound links come from trusted websites. .edu and .gov links are considered very trustworthy.

Link Profile: Sites linking to yours that are high quality, trustworthy, popular, and authoritative catapult your rankings upward. If those backlinks themselves have many high-quality links pointing at them, even better. The internet is a network of computers and websites, the World Wide Web. Getting links from 200 root domains is more valuable than getting 200 links from pages on one quality domain. You want the distribution of links to be topical (if you can) and natural; Google questions links when Googlebot crawls too many with the exact same anchor keywords. Amazing backlink tools include Ahrefs, Blekko, Open Site Explorer, Majestic SEO, and SEMrush (its backlink opportunities tool and backlink audit tool and reports). The links should be socially engaged too, via Facebook, Twitter, Reddit, Instagram, Google+, Snapchat, and more; ShareCount is excellent for analyzing this. If an influential individual follows you, that increases your site's authority.
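The "200 root domains beats 200 links from one domain" point is easy to quantify from a backlink export. A minimal sketch; the URLs in the test are hypothetical, and note the www-stripping is crude (a real audit would fold together subdomains using a public-suffix list).

```python
from urllib.parse import urlsplit

def unique_root_domains(backlink_urls):
    """Count distinct linking hosts from a list of backlink URLs."""
    hosts = set()
    for url in backlink_urls:
        host = urlsplit(url).hostname or ""
        # Crude normalization: treat www.foo.com and foo.com as one domain.
        hosts.add(host.removeprefix("www."))
    return len(hosts)
```

Run it on the backlink export from Ahrefs or SEMrush for your site and for each competitor; the gap in unique linking domains is one of the clearest numbers to put in the report.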

Gauging your Competitors: 

First, know who your competitors are for your target rankings, then perform an SEO audit on their websites. It will take some time, but it's important. If a competitor gets one backlink from a great site in your field, their webmaster may simply know the linker's webmaster. But there's still a link opportunity for you: blog about, or make a page on, the topic the linker covers, link to them, and do link outreach. That means writing up a template, customizing it, and requesting a backlink. This works especially well when more than one linker in your field is linking to your competitor; that pattern means the linker is likely to respond to your post, appreciate well-written content and your site's traffic, care about SEO (not every professional does), and take the time to email you back, link back to you, or at least call and formally thank you.

SEO Report Audit: 

The SEO report is mega vital. I learned this the hard way: I once reported only the granular, technical nitty-gritty, the hard stuff, to a room full of business managers and executives who didn't know technical SEO protocols and didn't care about the technical portion. Summarize your technical SEO audit in parts for the non-technical folks, and gauge what they're looking for; they'll most likely care more about the plan and the feel of the plan. Be sure to put the most critical or pressing issues first in the report and the other important issues later.

Detailed Examples, Fast Action:

Give actionable recommendations last. If it's a big project, list the first step to get the ball rolling. Don't be too general with your SEO action recommendations.

Now, if you weren't before, you're an SEO pro! Be sure to link back to me, or send me an email or call to let me know. That's SEO outreach.

SEO Audit Post: Written by Joseph Paul Fanning. Joseph has been working in SEO & dev since 2001. He has a B.S. from Ramapo College and a certification in CS from Harvard. He works at his site.
