Moz Q&A is closed.
After more than 13 years and tens of thousands of questions, Moz Q&A closed on 12th December 2024. While we're not completely removing the content (many posts will still be viewable), we have locked both new posts and new replies.
GWMT / Search Analytics vs. Open Site Explorer
-
Just had the experience of using OSE data to show what we call "linkrot" to a client -- only to find that GWMT / Search Analytics shows no such thing.
Fortunately the client is an old friend and no face was lost, but it was dicey there for a bit, as I have come to rely on and reference OSE again and again.
OSE showed Domain Authority dropping by about 1/3 in the last 12 months, presumably due to old links getting broken, linking sites changing their architecture, etc.
And of course, ranking is tanking, as you would expect.
But Google shows many more (and much more spammy looking!) backlinks.
Has anyone had any experience benchmarking the 2 data sets of backlinks against each other? Dr Pete?
Does one update more frequently than the other? Do you trust one more than the other? If so, why?
Thanks!
-
I know it's not always the answer people want to hear, but Matt's right - this is basically where we're at. OSE tends to focus on higher-authority links and quality over quantity. Unfortunately, while this works well for tracking the strengths in your link profile, it doesn't always do as well at tracking the weaknesses. We're very much interested in expanding the quantity as well, but it's a balancing act and, in the interest of full transparency, there are many engineering challenges.
People have compared our index to Majestic and Ahrefs around the blogosphere. Since I can't claim to be unbiased, I'd welcome you to read those posts and make your own judgments. In fairness to Majestic and Ahrefs, all three of us are somewhat transparent about our sources and at least our general methodologies. Unfortunately, Google is not very transparent about how they sample links or choose which data to show, so directly comparing any of the major SEO tools to Google Search Console is a lot trickier. We're also not clear on Google's update cycle for that data.
-
I agree with Eric. No one source is going to give you a full picture of your link profile. Generally, OSE is best for measuring the overall strength of a full link profile, as many low-authority sites aren't indexed.
Also, keep in mind that there are a _lot_ of reasons that DA can go down, many of which have nothing at all to do with your specific link profile. That's why we recommend using it to benchmark against competitors rather than as an absolute score. Rand goes into more detail about that here:
DA/PA Fluctuations: How to Interpret, Apply, & Understand These ML-Based Scores
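One way to apply that benchmarking advice in practice: instead of watching your raw DA, track it as a delta against the average of your competitors' DA at each snapshot. A minimal sketch in Python (all scores below are made-up examples, not real data):

```python
def relative_da(site_da, competitor_das):
    """Benchmark a site's DA against its competitors' average,
    rather than reading it as an absolute score."""
    avg = sum(competitor_das) / len(competitor_das)
    return site_da - avg

# Hypothetical snapshots a year apart: the raw score fell from 45 to 30,
# but the site stayed slightly ahead of its competitor set both times.
last_year = relative_da(45, [50, 40, 42])  # 45 - 44.0 = 1.0
this_year = relative_da(30, [33, 27, 29])  # 30 - 29.67, still positive
```

If the relative number holds steady while the absolute score drops, the change is more likely an index-wide recalibration than a problem specific to your link profile.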
-
Whenever we deal with links, even though I really like OSE, we typically have to compile link data from multiple sources. We usually pull from OSE, Majestic, Ahrefs, and Google Search Console, among others, combine all of the links into one spreadsheet, and review them there. Different tools use different crawlers, and no single source is the most accurate on its own.
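The compile-and-deduplicate step above can be sketched in a few lines of Python. This assumes you've already exported each tool's backlink list to plain URLs; the tool names and URLs below are made-up examples, and real exports would need per-tool CSV parsing first:

```python
from urllib.parse import urlparse

def normalize(url):
    """Normalize a URL so the same backlink counts once across tools
    (lowercase host, strip "www." and any trailing slash)."""
    parsed = urlparse(url if "://" in url else "http://" + url)
    host = parsed.netloc.lower()
    if host.startswith("www."):
        host = host[4:]
    return host + parsed.path.rstrip("/")

def merge_link_exports(exports):
    """exports: {tool name: iterable of linking URLs from that tool}.
    Returns {normalized URL: set of tools that reported it}."""
    merged = {}
    for tool, urls in exports.items():
        for url in urls:
            merged.setdefault(normalize(url), set()).add(tool)
    return merged

# Toy data standing in for real exports from each tool.
exports = {
    "OSE": ["http://www.example.com/page/", "http://blog.example.org/post"],
    "Ahrefs": ["example.com/page", "http://spammy.example.net/x"],
    "GSC": ["https://blog.example.org/post"],
}
merged = merge_link_exports(exports)
```

Links reported by only one tool (like the spammy one above) are exactly the discrepancies the original question describes, so the per-link tool set is often more useful than the merged total by itself.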