403 Forbidden Crawl report
Hi,
I am getting 403 Forbidden errors in the crawl report for some of my pages, but the pages load fine when I visit them. My web developer told me that reports sometimes show errors when nothing is actually wrong. Will these errors affect SEO/rankings?
Some of the links:
https://www.medistaff24.co.uk/contact-us/
https://www.medistaff24.co.uk/elderly-care-in-evesham-worcestershire/
-
I have a locksmith business website (locksmith Tampa, Florida), and we are facing the same issue on the main page.
A 403 Forbidden error means that the server understood the request but refused to serve the page. This can happen for a few reasons, such as:
- The page requires authentication that the visitor (or crawler) does not have.
- File or directory permissions deny the web server read access.
- A firewall or security plugin is blocking the crawler's user agent or IP address.
- There is a misconfiguration on the server, such as a stray deny or rewrite rule.
If you are getting 403 Forbidden errors on your website, first check whether the pages actually load for users. Visit the pages yourself in a browser (ideally logged out and on a different network), and use the URL Inspection tool in Google Search Console to see how Googlebot fetches them.
If the pages are loading fine for users, then the errors in the crawl report are likely false positives. This can happen if Googlebot encounters a temporary error when crawling your website. In this case, you can ignore the errors and they should eventually go away.
However, if the pages are not loading fine for users, then the errors in the crawl report are likely real. In this case, you need to fix the underlying issue that is causing the 403 Forbidden errors.
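A quick way to run that check yourself is to request a page twice with different User-Agent headers and compare the status codes. This is a minimal sketch using only Python's standard library; `check_status` is an illustrative helper name, not part of any tool mentioned above:

```python
import urllib.error
import urllib.request

def check_status(url, user_agent):
    """Fetch a URL with the given User-Agent and return the HTTP status code."""
    req = urllib.request.Request(url, headers={"User-Agent": user_agent})
    try:
        with urllib.request.urlopen(req, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        # 4xx/5xx responses raise HTTPError; its code is the status we want.
        return e.code

# Compare a browser-like request with a bot-like one, e.g.:
# check_status("https://www.medistaff24.co.uk/contact-us/", "Mozilla/5.0")
# check_status("https://www.medistaff24.co.uk/contact-us/",
#              "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)")
```

A 200 for the browser-like agent but a 403 for the bot-like one is strong evidence that a firewall or security plugin is singling out crawlers.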
Here are some steps you can take to fix 403 Forbidden errors:
- Check the permissions on the files and folders behind the pages returning 403 errors. Make sure the web server's user account can read them (on a typical host, 644 for files and 755 for directories).
- Check the robots.txt file to make sure Googlebot is not disallowed from the affected pages. (A robots.txt block is normally reported as "Blocked by robots.txt" rather than 403, but it is worth ruling out.)
- Check the server configuration, including .htaccess rules, firewall settings, and security plugins, for anything that could be denying the crawler's requests.
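The file-permission step in the list above can be scripted on a Unix-like host. This is a sketch assuming standard Unix permission bits; `world_readable` is a made-up helper name:

```python
import os
import stat

def world_readable(path):
    """Return True if 'other' users (which usually includes the web server
    process on shared hosting) have read permission on the path."""
    mode = os.stat(path).st_mode
    return bool(mode & stat.S_IROTH)

# Typical safe settings on a web host are 644 for files and 755 for
# directories. A file left at 600 by an FTP upload is readable by its
# owner only, and the web server will typically answer 403 for it.
```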
If you have tried all of these steps and you are still getting 403 Forbidden errors, then you may need to contact your web hosting provider for assistance.
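The robots.txt check can also be done programmatically with Python's standard `urllib.robotparser`. The rules string below is hypothetical, just to show the mechanics:

```python
import urllib.robotparser

def googlebot_allowed(robots_txt, url):
    """Parse robots.txt content and report whether Googlebot may fetch url."""
    parser = urllib.robotparser.RobotFileParser()
    parser.parse(robots_txt.splitlines())
    return parser.can_fetch("Googlebot", url)

# Hypothetical rules: Googlebot is kept out of /private/ but nothing else.
rules = "User-agent: Googlebot\nDisallow: /private/\n"
print(googlebot_allowed(rules, "https://www.example.com/contact-us/"))   # True
print(googlebot_allowed(rules, "https://www.example.com/private/page"))  # False
```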
Will 403 errors affect SEO/rankings?
Whether 403 Forbidden errors will affect your SEO/ranking depends on a few factors:
- If the pages returning 403 errors are important pages on your website, the errors could have a negative impact on your SEO and rankings, since Google cannot index content it is denied access to.
- If they are not important pages, the errors are unlikely to have a significant impact.
It is best to fix 403 Forbidden errors as soon as possible. This will help to ensure that Googlebot can access all of the pages on your website and that your website is crawlable and indexable.
Warm Regards
Rahul Gupta
Suvidit Academy