Onsite audit and fix issues
-
I am not sure if I am posting in the correct area here, but I am looking for someone I can hire to do an on-site audit and fix the issues it finds. Thanks.
-
Hi, we would highly recommend using Moz Pro to carry out the SEO audit.
-
@Granitebusters may I ask you to describe the exact problem? You can reply to me and I will help you.
Related Questions
-
Unsolved I have lost SEO ranking after removing www from my domain
I have lost SEO ranking for 4-6 core keywords after switching the domain from www to non-www. Referring domain: https://cashforscrapcarsydney.com.au/ Earlier the domain was in the format https://www.cashforscrapcarsydney.com.au/, but when I checked the search results, search engines had not yet crawled the new format. Let me know whether a server change or an algorithm hit might have caused it. Also, please share feedback on whether removing www from the domain loses keyword rankings. Helpful replies are needed.
On-Page Optimization | velomate
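If the old www URLs were never redirected, a site-wide 301 redirect is the usual first step. A minimal sketch, assuming the site runs Apache with mod_rewrite enabled (an assumption; the poster's server is not stated), placed in the site's .htaccess:

```apache
# Sketch only: assumes Apache with mod_rewrite (the poster's server is not confirmed)
RewriteEngine On
# If the requested host starts with "www.", permanently redirect to the bare domain
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
RewriteRule ^ https://%1%{REQUEST_URI} [L,R=301]
```

A permanent (301) redirect lets search engines consolidate the www and non-www versions onto a single URL rather than treating them as separate pages.
-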
Unsolved I had toxic backlink attacks from some rivals that increased my spam score to 33%. How do I reduce and fix it?
Hi Moz Community and SEO experts, I had toxic backlink attacks from some rivals that increased my spam score to 33%. How can I reduce and fix it? My domain is https://www.safnah.com. I disavowed around 300 toxic URLs but nothing happened. Looking forward to your solutions.
Support | Safnah IT Services
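For reference, Google's disavow file is a plain-text list with one entry per line; a minimal sketch with placeholder domains (the real entries would come from the poster's own backlink export, not from this example):

```text
# Disavow file sketch: the domains below are placeholders, not real data
# Lines starting with "#" are comments and are ignored
domain:spammy-example1.com
domain:spammy-example2.net
http://spammy-example3.org/some-toxic-page.html
```

Note that Spam Score is a Moz metric, so a disavow file submitted to Google would not by itself change the number reported in Moz tools.
-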
Issue: Duplicate Page Content
Hello SEO experts, I'm facing a duplicate page content issue on my website. It is an apartment rental website, and when a client searches for apartment availability it automatically generates the same content under different URLs. I've already blocked these URLs in the robots.txt file but I'm still facing the same issue. Kindly guide me on what I can do. Here are some example links (a canonical-tag sketch follows the list):
http://availability.website.com/booking.php?id=17&bid=220
http://availability.website.com/booking.php?id=17&bid=242
http://availability.website.com/booking.php?id=18&bid=214
http://availability.website.com/booking.php?id=18&bid=215
http://availability.website.com/booking.php?id=18&bid=256
http://availability.website.com/details.php?id=17&bid=220
http://availability.website.com/details.php?id=17&bid=242
http://availability.website.com/details.php?id=17&pid=220&bid=220
http://availability.website.com/details.php?id=17&pid=242&bid=242
http://availability.website.com/details.php?id=18&bid=214
http://availability.website.com/details.php?id=18&bid=215
http://availability.website.com/details.php?id=18&bid=256
http://availability.website.com/details.php?id=18&pid=214&bid=214
http://availability.website.com/details.php?id=18&pid=215&bid=215
http://availability.website.com/details.php?id=18&pid=256&bid=256
http://availability.website.com/details.php?id=3&bid=340
http://availability.website.com/details.php?id=3&pid=340&bid=340
http://availability.website.com/details.php?id=4&bid=363
http://availability.website.com/details.php?id=4&pid=363&bid=363
http://availability.website.com/details.php?id=6&bid=367
http://availability.website.com/details.php?id=6&pid=367&bid=367
http://availability.website.com/details.php?id=8&bid=168
http://availability.website.com/details.php?id=8&pid=168&bid=168
Thanks and waiting for your response.
On-Page Optimization | KLLC
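One common way to handle parameter-generated duplicates like these is a canonical link element. A minimal sketch, assuming the id parameter identifies the apartment while bid and pid only track the booking (an assumption about this site's URL scheme, not something the poster confirmed), added to the head of each variant page:

```html
<!-- Sketch only: assumes ?id= identifies the apartment and bid/pid are booking parameters -->
<!-- Placed in the <head> of every details.php variant for id=18 -->
<link rel="canonical" href="http://availability.website.com/details.php?id=18" />
```

Unlike a robots.txt block, which only stops crawling, a canonical tag lets search engines consolidate the duplicate URLs onto the preferred version.
-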
Is it impossible to get out of Panda? Matt Cutts says if you fix the problem you "pop back", but if so, why are there so few examples?
In this video, http://www.youtube.com/watch?v=8IzUuhTyvJk, at about 15, Matt Cutts says: "once we re-run our data (every few weeks), if we determine your site is of higher quality you would pop back out of being affected." Panda has affected thousands of sites, and a lot of smart people have been working on the problem for about 2 years since the first Panda was launched, but I can only find 1 site that has "popped back" to its original rankings, e.g. http://searchengineland.com/google-panda-two-years-later-losers-still-losing-one-real-recovery-149491 Apart from Motortrend.com, I can't find any sites (of reasonable size) or case studies of sites that have solved the Panda problem and were definitely hit by Panda. Which doesn't feel right; some people have deleted a ton of pages, redesigned their site, improved their content, etc., with no success. Is it therefore a pointless exercise? Is it better to simply give up and start a new site?
On-Page Optimization | julianhearn
-
In my report of my website it was indicated that I had 19 links/locations blocked by meta-robots. What does this mean and how do I fix it? My website is a WordPress website.
On-Page Optimization | cyaindc
-
Can duplicate content issues be solved with a noindex robots meta tag?
Hi all, I have a number of duplicate content issues arising from a recent crawl diagnostics report. Would using a robots meta tag (like the one below) on the pages I don't necessarily mind not being indexed be an effective way to solve the problem? Thanks for any / all replies.
On-Page Optimization | joeprice
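The tag the question refers to did not survive in the page text; a typical robots meta tag of the kind being described (reconstructed here as an assumption, not the poster's exact markup) would sit in the page head:

```html
<!-- Assumed reconstruction of the tag the poster refers to, not their exact markup -->
<meta name="robots" content="noindex, follow" />
```
-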
Will a "no follow" "no index" meta tag resolve duplicate content issue?
I have a duplicate content issue. If the page has already been indexed will a no follow no index tag resolve the issue or do I also need a rel canonical statement?
On-Page Optimization | | McKeeMarketing0 -
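For reference, the rel canonical statement mentioned at the end of the question is a link element in the page head; a placeholder sketch (the href stands in for the poster's preferred URL, which is not given):

```html
<!-- rel="canonical" sketch: the href is a placeholder, not the poster's actual URL -->
<link rel="canonical" href="https://www.example.com/preferred-page/" />
```
-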
Self-Cannibalization issue
Is the keyword "filme online gratis" self-cannibalization on this site filmeonlinenoi.com in the seomoz tool "On-Page Keyword Optimization" it shows that it is a self-cannibalization keyword ... i made some changes (big changes) and its still remaining the same
On-Page Optimization | | Alexsmenaru0