A set of pages is marked as noindex and nofollow, both via robots.txt and with an `X-Robots-Tag: noindex, nofollow` HTTP header. When checking with Google Webmaster Tools, the pages are reported as "Denied by robots.txt", which is nice. Also, as mentioned in this answer, disallowed pages may still be indexed even if not technically crawled, because that's how Google rolls.
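For reference, the setup looks roughly like this (the `/profile/` path and the Apache config are illustrative sketches of what I have, not exact copies):

```
# robots.txt — blocks crawling of the profile pages
User-agent: *
Disallow: /profile/
```

```apache
# Apache config sketch, assuming mod_headers is enabled;
# the URL pattern is illustrative
<LocationMatch "^/profile/">
    Header set X-Robots-Tag "noindex, nofollow"
</LocationMatch>
```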
However, two weeks after adding the X-Robots-Tag header, the pages still appear in Google search results.
For example, this test page (http://www.english-attack.com/profile/scott-s-sober) is found when searching for its h1 title "Scott S. Sober": https://www.google.com/search?q=%22Scott+S.+Sober%22
Why is this?