Robots.txt
On the other hand, the most extreme solution to crawl budget you can use is a robots.txt file.
Problem solved! Actually, no, because there is a compromise here. Technically, sites and pages blocked in robots.txt can still be indexed. You sometimes see sites or pages appearing in the SERPs with a meta description that cannot be displayed because the page is blocked in robots.txt, or similar messages.
Technically they can be indexed, but functionally they don't rank for anything, or at least for anything meaningful. And they don't pass PageRank. When we link to a page like this, we are still passing PageRank into it. But if that page is blocked by robots.txt, the PageRank goes no further.
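For illustration, a minimal robots.txt sketch that blocks crawling of faceted URLs like the ones discussed above (the `/laptops` path and the `color`/`price` parameters are hypothetical examples, not from the original site):

```
# Hypothetical example: stop all crawlers from fetching faceted-navigation URLs.
# Note: this blocks crawling, not indexing, and any PageRank flowing into
# these URLs is trapped there.
User-agent: *
Disallow: /laptops?color=
Disallow: /laptops?price=
```

Keep in mind that rules like these only prevent fetching; a blocked URL can still appear in results if other pages link to it.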
We end up creating a dead end. It is a rather clumsy solution, although it is easy to implement.
Link-level nofollow
By link-level nofollow I mean this: if we have links pointing to these facets on the main laptop category page and we put a nofollow attribute on those internal links, that has pros and cons.
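As a sketch, an internal facet link carrying a link-level nofollow might look like the following (the URLs and anchor text are hypothetical, assuming the laptop category example above):

```
<!-- Hypothetical facet links on a laptop category page.
     rel="nofollow" hints that crawlers should not follow these links. -->
<a href="/laptops?color=silver" rel="nofollow">Silver laptops</a>
<a href="/laptops?screen=15-inch" rel="nofollow">15-inch laptops</a>
```

Note that Google treats nofollow as a hint rather than a directive, so the facet URLs may still occasionally be crawled if they are linked from elsewhere.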
I think the better use case is actually in listings. Imagine if we were running a used car site where we had millions of different individual product listings for used cars. Now, maybe depending on the size of our site, we don't really want Google to waste time on those individual listings.