New Blogspot seeds will have the default scoping rules applied automatically at the seed level when they are added to a collection. To learn more, including how to add default scoping rules to existing seeds, please visit Sites with automated scoping rules.
Blogspot sites use robots.txt to block pagination links between sections (e.g., "Older Posts"). To capture these pages you will need to do one of the following:
Add an Ignore Robots.txt rule to each Blogspot seed
- or -
Add an Ignore Robots.txt rule to your seed's host. Please note that this is different from adding a rule for blogspot.com. Your host will be a full subdomain, such as example.blogspot.com, and it must be entered in its entirety for the rule to take effect.
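To illustrate why an Ignore Robots.txt rule is needed (this sketch is not part of Archive-It itself), the snippet below uses Python's standard urllib.robotparser to show how a robots.txt rule of the kind Blogspot serves blocks pagination URLs. The robots.txt body and the example.blogspot.com URLs are assumed samples for demonstration: Blogspot's "Older Posts" links point under /search, which its robots.txt disallows.

```python
from urllib.robotparser import RobotFileParser

# Assumed sample modeled on the kind of robots.txt Blogspot serves:
# pagination lives under /search, which is disallowed for all agents.
robots_txt = """\
User-agent: *
Disallow: /search
Allow: /
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# A pagination URL of the form "Older Posts" links use (hypothetical example)
paginated = "https://example.blogspot.com/search?updated-max=2020-01-01T00:00:00"
post = "https://example.blogspot.com/2020/01/post.html"

print(rp.can_fetch("*", paginated))  # False: a robots-obeying crawler skips it
print(rp.can_fetch("*", post))       # True: individual posts remain crawlable
```

A crawler that honors these rules will capture individual posts but never follow "Older Posts" into the /search pagination, which is why the Ignore Robots.txt rule is required to archive a blog's full history.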