Recrawl analysis: verifying massive site changes
When helping companies deal with major algorithm updates, site redesigns, CMS migrations, and other SEO disruptions, I find myself crawling a lot of URLs.
A typical client engagement involves a large amount of crawling.
For larger sites, it's not uncommon for crawl data to surface everything from technical SEO problems to content quality issues to user engagement barriers.
Crawling: enterprise versus surgical
When digging into a website, you often start with an enterprise crawl (a larger crawl that covers enough of the site to yield plenty of SEO intelligence) to get a feel for the site overall.
That doesn't mean crawling the entire site.
For example, if a site has one million pages indexed, you might start with a crawl of 200,000 to 300,000 pages.
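To make the page cap concrete, here's a minimal sketch of a capped breadth-first crawler in Python. It assumes the requests and BeautifulSoup libraries and a single start URL; real crawling tools handle robots.txt, rate limiting, and JavaScript rendering, so treat this as an illustration of capping a crawl rather than a production crawler.

```python
from collections import deque
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup


def capped_crawl(start_url, max_pages=200_000):
    """Breadth-first crawl of a single host, stopping at max_pages URLs."""
    host = urlparse(start_url).netloc
    seen = {start_url}
    queue = deque([start_url])
    results = {}  # url -> status code

    while queue and len(results) < max_pages:
        url = queue.popleft()
        try:
            resp = requests.get(url, timeout=10)
        except requests.RequestException:
            continue  # skip unreachable URLs
        results[url] = resp.status_code

        # Only parse HTML responses for further links.
        if "text/html" not in resp.headers.get("Content-Type", ""):
            continue
        soup = BeautifulSoup(resp.text, "html.parser")
        for a in soup.find_all("a", href=True):
            link = urljoin(url, a["href"]).split("#")[0]
            # Stay on the same host and avoid revisiting URLs.
            if urlparse(link).netloc == host and link not in seen:
                seen.add(link)
                queue.append(link)
    return results
```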
Next up: the staging environment
When helping clients, I usually gain access to the staging environment so I can check changes before they hit the production site.
This is a great way to nip problems in the bud.
Unfortunately, sometimes a faulty implementation can lead to more problems.
For example, if a developer misunderstands the requirements and implements the wrong changes, you can end up with more problems than you started with.
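One lightweight way to catch a faulty implementation early is to spot-check the same URLs on staging and production before the release goes live. Here's a hedged Python sketch; the hostnames and paths are hypothetical, and the two checks shown (status code and X-Robots-Tag header) are just examples of signals worth diffing.

```python
import requests


def spot_check(paths, staging_base, production_base):
    """Compare basic SEO signals between staging and production for each path."""
    for path in paths:
        for label, base in (("staging", staging_base), ("production", production_base)):
            resp = requests.get(base + path, timeout=10, allow_redirects=False)
            print(f"{label:10} {resp.status_code} {base + path}")
            # Flag an unexpected noindex directive before it ships.
            robots = resp.headers.get("X-Robots-Tag", "")
            if "noindex" in robots.lower():
                print(f"  warning: X-Robots-Tag noindex on {label}")


# Hypothetical paths and hosts, purely for illustration.
spot_check(
    ["/", "/category/widgets"],
    "https://staging.example.com",
    "https://www.example.com",
)
```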
Recrawl analysis and comparison
Now, you might be saying, "Glenn, that sounds like a lot of work."
Well, yes and no.
Fortunately, some of the top crawling tools enable you to compare crawls.
That can save you a lot of time during recrawl analysis.
Comparing changes in each tool
When you crawl a site with DeepCrawl, it automatically tracks changes between the last crawl and the current one (while providing trending across all crawls).
That's a big help when comparing issues surfaced in previous crawls.
Over time, you'll also see trending for each issue.
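If your tool exports crawl data, you can also run a quick comparison yourself. The sketch below assumes two CSV exports, each with url and issue columns (the filenames and schema are hypothetical; real exports are richer), and diffs them with pandas to list issues fixed and issues introduced between crawls.

```python
import pandas as pd

# Hypothetical exports: one row per (url, issue) pair flagged by the crawler.
old = pd.read_csv("crawl_before_release.csv")
new = pd.read_csv("crawl_after_release.csv")

old_pairs = set(zip(old["url"], old["issue"]))
new_pairs = set(zip(new["url"], new["issue"]))

fixed = old_pairs - new_pairs       # issues gone since the last crawl
introduced = new_pairs - old_pairs  # issues that appeared after the change

print(f"{len(fixed)} issues fixed, {len(introduced)} issues introduced")
for url, issue in sorted(introduced):
    print(f"NEW  {issue}: {url}")
```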
Overcoming Murphy's law
We don’t live in a perfect world.
When changes are pushed, nobody is deliberately trying to sabotage the site.
To put it simply, working on large and complex websites opens the door to small errors that can cause big problems.
The recrawl analysis described above can help you nip those problems in the bud.
And that can save you from prolonged SEO problems.