Like Me? Follow Me.
However, when I went to view the cached version of the page, the URL shown in the cache was for the correct page.
This may be a result of Google realising that it's the same page and showing the correct version in the index. The bigger issue, however, is why Google is aware of the URL at all, since it has been told to KEEP OUT of the /demo/ folder altogether.
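To be clear about what the robots.txt rule in question promises, here is a minimal sketch using Python's standard-library robots.txt parser. The rule and URLs are hypothetical stand-ins for the site discussed above, but the logic is exactly what a compliant crawler is expected to apply:

```python
import urllib.robotparser

# Hypothetical robots.txt equivalent to the rule discussed:
# every crawler is disallowed from the /demo/ folder.
rules = """User-agent: *
Disallow: /demo/
"""

rp = urllib.robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# A well-behaved crawler must skip anything under /demo/ ...
print(rp.can_fetch("Googlebot", "http://example.com/demo/page.html"))  # False
# ... but is free to fetch the rest of the site.
print(rp.can_fetch("Googlebot", "http://example.com/index.html"))  # True
```

In other words, there is no ambiguity in the rule itself: any URL under /demo/ should never have been fetched, so it's hard to see how it ended up in the index.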
This isn't the only site where we've noticed the problem of blocked pages appearing in Google's index:
are just two of the sites we found with blocked pages in the index.
So what can we conclude? It looks like Google doesn't entirely honour the robots.txt file, so in future we should consider password-protecting these pages instead.
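Unlike robots.txt, which is only a polite request, HTTP authentication actually refuses the request, so a crawler can't index what it can't fetch. A minimal sketch for Apache, assuming a .htaccess file placed inside the /demo/ folder (the realm name and the path to the password file are illustrative):

```apache
# .htaccess inside /demo/ — requires a valid login for everything in the folder
AuthType Basic
AuthName "Restricted demo area"
AuthUserFile /path/to/.htpasswd
Require valid-user
```

The .htpasswd file itself would be created with the `htpasswd` utility and kept outside the web root.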
And what should Google learn from this? Web design and web development companies consider it evil when you say you won't spider anything blocked by robots.txt and then do it anyway.