“Google’s John Mueller warns that pages blocked by robots.txt could still get indexed if links point to them. This can become a problem because Google would then see these pages as having no content, since it is blocked from crawling them.” Read more at Search Engine Journal >>
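As an illustration (not from the article), a minimal robots.txt rule of the kind being discussed — the `/private/` path is hypothetical. Note that this blocks crawling but does not, by itself, keep a URL out of Google's index if external links point to it:

```
# Blocks crawling of /private/ but NOT indexing:
# if other sites link to these URLs, Google can still index them
# without content, since it cannot crawl the pages.
User-agent: *
Disallow: /private/
```

To reliably keep a page out of the index, Google's documented approach is a `noindex` robots meta tag or `X-Robots-Tag` header — which only works if the page remains crawlable, so a robots.txt block like the one above would prevent Google from ever seeing the `noindex`.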