CSRF protection without tokens or hidden form fields

(blog.miguelgrinberg.com)

132 points | by adevilinyc 3 days ago

26 comments

  • owenthejumper 6 hours ago

    Right now the problem is what the author already mentions: the use of Sec-Fetch-Site (FYI, HTTP headers are case-insensitive :) is currently considered defense in depth by OWASP, not a primary protection; a minimal sketch of such a check is below.

    Unfortunately OWASP rules the world. Not because it's the best way to protect your apps, but because the corporate overlords in infosec teams need to check the "Complies with OWASP Top 10" box.
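
    For context, the check being discussed is small. A minimal sketch, assuming Flask; the allowed values and the handling of missing headers are illustrative policy choices, not necessarily what the article does:

        from flask import Flask, request, abort

        app = Flask(__name__)

        # Values of Sec-Fetch-Site accepted for state-changing requests.
        # "none" covers direct navigation (address bar, bookmarks); how to
        # treat requests with no header at all (older clients) is a policy
        # decision -- this sketch lets them through.
        ALLOWED_FETCH_SITES = {"same-origin", "same-site", "none"}

        @app.before_request
        def reject_cross_site_writes():
            if request.method in ("GET", "HEAD", "OPTIONS"):
                return  # only state-changing methods are checked
            sec_fetch_site = request.headers.get("Sec-Fetch-Site")
            if sec_fetch_site and sec_fetch_site not in ALLOWED_FETCH_SITES:
                abort(403)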

    • miguelgrinberg 6 hours ago

      Hi, author here.

      This was actually a mistake. If you look at the OWASP cheat sheet today you will see that Fetch Metadata is a top-level alternative to the traditional token-based protection.

      I'm not sure I understand why, but the cheat sheet page was modified twice. Fetch Metadata was first added to the page with a top-level mention. Then someone slipped in a revision that downgraded it to defense in depth without anyone noticing. It has now been reverted to the original version.

      Some details on what happened are in this other discussion from a couple of days ago: https://news.ycombinator.com/item?id=46347280.

    • tptacek 4 minutes ago

      The OWASP Top 10 is a list of vulnerabilities, not a checklist of things you have to actually "do".

    • nchmy 4 hours ago

      Can you share links to better guidance than OWASP?

  • tmsbrg 6 hours ago

    I'm surprised there's no mention of the SameSite cookie attribute. I'd consider that to be the modern CSRF protection, and it's easy: just a cookie flag (a minimal example follows below):

    https://scotthelme.co.uk/csrf-is-dead/

    But I didn't know about the Sec-Fetch-Site header, good to know.
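
    For example, a sketch of what "just a cookie flag" looks like, assuming Flask (the config keys are Flask's; any framework has an equivalent):

        from flask import Flask

        app = Flask(__name__)

        # Emit the session cookie with the SameSite attribute so the browser
        # refuses to attach it to cross-site requests. Lax still allows
        # top-level link navigations; Strict (discussed below) does not.
        app.config.update(
            SESSION_COOKIE_SAMESITE="Lax",
            SESSION_COOKIE_SECURE=True,
            SESSION_COOKIE_HTTPONLY=True,
        )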

    • nhumrich an hour ago

      This is "not allowing cross site at all" so, technically it's not "request forgery" protection. Yes, this is very semantic, but, CSRF is a vulnerability introduced by enabling CS and CORS. So, technically, same-site cookies are not "protection" against CSRF.

    • tordrt 2 hours ago

      Yep, SameSite=Lax, and just make sure you never perform any actions using GET requests, which you shouldn't anyway.

      • paulryanrogers 33 minutes ago

        Unsubscribe links often need to be GET, or at least start as a GET.

    • miguelgrinberg 5 hours ago

      The OWASP CSRF prevention cheat sheet page does mention SameSite cookies, but they consider it defense in depth: https://cheatsheetseries.owasp.org/cheatsheets/Cross-Site_Re....

      • tptacek a minute ago

        Because of client-side JavaScript CSRF, which is not a common condition.

  • rvnx 4 hours ago

    If you want, "SameSite=Strict" may also be helpful and is supported on "all" browsers, so it is reasonable to use it (but, as you did, adding server-side validation is always a plus).

    https://caniuse.com/mdn-http_headers_set-cookie_samesite_str...

    The browser compares the scheme and site of the requesting context against the cookie's site to decide whether the cookie should be sent or not.

    • Macha 4 hours ago

      Note that SameSite=Strict also counts against cross-site navigations such as followed links, which means your first request will appear unauthenticated. If that request just loads your SPA skeleton, that might be fine, but if you're doing SSR of any sort, that might not be what you want.

    • simonw 4 hours ago

      I find that cookie setting really confusing. It means that cookies will only be sent on requests that originated on the site that set them... and that restriction applies even when you click a link from one site to another.

      So if you follow a link (e.g. from a Google search) to a site that uses SameSite=Strict cookies, you will be treated as logged out on the first page that you see! You won't see your logged-in state until you refresh that page.

      I guess maybe it's for sites that are so SPA-pilled that even the login state isn't displayed until a fetch() request has fired somewhere?

      • ctidd 4 hours ago

        You want Lax for the intuitive behavior on navigation requests from other origins. Because there's no guarantee that navigation GET requests are actually safe, Strict is available as the assumption-free secure option.

      • macNchz an hour ago

        SameSite=Strict is belt-and-suspenders protection in the case where you could have GET requests that have some kind of impact on state, and the extra safety is worth the UX impact (like with an online banking portal).

        Discussions about this often wind up with a lot of people saying "GET requests aren't supposed to change state!!!", which is true, but just because they're not supposed to doesn't mean there aren't some floating around in large applications, or that there aren't clever ways to abuse seemingly innocuous side effects from otherwise-stateless GET requests (maybe just visiting /posts/1337/?shared_by_user=12345 exposes some tiny detail about your account to user #12345, who can then use that as part of a multi-step attack). Setting the strict flag just closes the door on all of those possibilities in one go.

  • est 2 hours ago

    reminds me of something similar

    https://news.ycombinator.com/item?id=46321651

    e.g. serve .svg files only when the "Sec-Fetch-Dest: image" header is present. This will stop scripts.
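
    A rough sketch of that check, assuming Flask (route and directory names are made up; note that clients that send no fetch metadata at all would also be refused):

        from flask import Flask, abort, request, send_from_directory

        app = Flask(__name__)

        @app.get("/images/<path:filename>")
        def serve_image(filename):
            # Browsers send Sec-Fetch-Dest: image when the file is loaded via
            # an <img> tag; a fetch()/XHR from a script or a direct navigation
            # reports a different destination and is rejected.
            if request.headers.get("Sec-Fetch-Dest") != "image":
                abort(403)
            return send_from_directory("images", filename)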

    • amluto 8 minutes ago

      Or sending Content-Security-Policy: script-src 'none' for everything that isn’t intended to be a document. Or both.
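
      For example, a sketch of that idea with Flask (the list of content types treated as non-documents is illustrative):

          from flask import Flask

          app = Flask(__name__)

          NON_DOCUMENT_TYPES = ("image/", "application/json", "text/plain")

          @app.after_request
          def lock_down_non_documents(response):
              # Responses that are not meant to be rendered as pages get a CSP
              # that forbids script execution, so they stay inert even if a
              # browser is tricked into treating one as a document.
              if response.content_type and response.content_type.startswith(NON_DOCUMENT_TYPES):
                  response.headers["Content-Security-Policy"] = "script-src 'none'"
              return response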

  • altmind 4 hours ago

    Are there any approaches to CSRF tokens that don't require storing issued tokens on the server side?

    • maxbond 2 hours ago

      The alternative to storing tokens is to use an AEAD encryption scheme like AES-GCM to protect tokens from forgery or tampering. You will still have to worry about reuse, so you will probably want to restrict use of the token to the user it was generated for and to a lifetime (say, 24 hours). That is a very high-level description; there are details (like nonce generation) that must be done correctly for the system to be secure.
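
      A rough sketch of that idea, assuming the Python "cryptography" package (key management, encoding, and error handling are simplified):

          import json, os, time
          from cryptography.hazmat.primitives.ciphers.aead import AESGCM

          KEY = AESGCM.generate_key(bit_length=128)  # in practice, a persistent server secret
          TOKEN_LIFETIME = 24 * 3600  # seconds

          def issue_token(user_id: str) -> str:
              # Bind the token to a user and an expiry; AES-GCM authenticates
              # the payload, so it cannot be forged or altered without the key.
              payload = json.dumps({"user": user_id, "exp": time.time() + TOKEN_LIFETIME})
              nonce = os.urandom(12)  # must never repeat for the same key
              ciphertext = AESGCM(KEY).encrypt(nonce, payload.encode(), None)
              return (nonce + ciphertext).hex()

          def verify_token(token: str, user_id: str) -> bool:
              try:
                  raw = bytes.fromhex(token)
                  payload = json.loads(AESGCM(KEY).decrypt(raw[:12], raw[12:], None))
              except Exception:  # tampered with, truncated, or not ours
                  return False
              return payload["user"] == user_id and payload["exp"] > time.time()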

    • t-writescode 4 hours ago

      Most of them. You can send the token in a cookie and in a form field and compare the two.

      CSRF is about arbitrary clicks in emails and such that automagically send your logged-in session cookies to the server. If you require an extra field and compare it against the cookie, you're fine.
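
      That is essentially the double-submit cookie pattern. A minimal sketch with Flask (route names and markup are illustrative; a real app would tie this into its form handling):

          import hmac, secrets
          from flask import Flask, abort, request

          app = Flask(__name__)

          @app.get("/form")
          def form():
              token = secrets.token_urlsafe(32)
              # The same value goes into the page and into a cookie. A cross-site
              # page can make the browser send the cookie, but it cannot read it
              # to copy the value into the form field.
              resp = app.make_response(
                  f'<form method="post" action="/submit">'
                  f'<input type="hidden" name="csrf" value="{token}">'
                  f'<button>Go</button></form>'
              )
              resp.set_cookie("csrf", token, secure=True, samesite="Lax")
              return resp

          @app.post("/submit")
          def submit():
              sent = request.form.get("csrf")
              stored = request.cookies.get("csrf")
              if not sent or not stored or not hmac.compare_digest(sent, stored):
                  abort(403)
              return "ok"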

  • shermantanktop 5 hours ago

    Am I missing something? The suggested protection helps with XSS flavors of CSRF, but not crafted payloads that come from scripts that have the freedom to fake all headers. At that point you also need an OAuth/JWT-type cookie passed over a private channel (TLS) to trust the input. Which is true for any sane web app, but still…

    • varenc 5 hours ago

      If an attacker has a user's private authentication token, usually stored in a __Host-prefixed cookie, then it's game over anyway. CSRF protection is about preventing other sites from forcing a user to make a request to a site they're authenticated to, when the malicious site doesn't actually have the cookie/token.

      CSRF is when you don't have the authentication token, but can force a user to make a request of your choosing that includes it. In this context you're using HTML/JS and are limited by the browser in terms of what headers you can control.

      The classic CSRF attack is just a <form> on a random site that posts to "victim.com/some_action". If we were to re-write browser standards today, cross-domain POST requests probably just wouldn't be permitted.

      • naasking 3 hours ago

        > If we were to re-write browser standards today, cross-domain POST requests probably just wouldn't be permitted.

        That would be a terrible idea IMO. The insecurity was fundamentally introduced by cookies, which were always a hack. Those should be omitted, and then authorization methods should be designed to learn the lessons from the 70s and 80s, as CSRF is just the latest incarnation of the Confused Deputy:

        https://en.wikipedia.org/wiki/Confused_deputy_problem

        • varenc 5 minutes ago

          Ah, so true. That's what I mean! Cross-domain requests that pass along the target domain's cookies. As in, probably every cookie would default to the current __Host-* behavior (and then some other way to allow a cookie if you want, plus some way of expressing desired cookie behavior without a silly prefix on its name...).

    • ctidd 4 hours ago

      CSRF exists as a consequence of insecure-by-default browser handling of cookies, whereby the browser sends the host’s cookies on requests initiated by a third-party script to the vulnerable host. If a script can fake all headers, it’s not running in a browser, and so was never exposed to the insecure browser cookie handling to be able to leverage it as a vector. If no prerequisite vector, then no vulnerability to mitigate.

    • t-writescode 4 hours ago

      As I understand it, the moment you're dealing with custom scripts, you've left the realm of a CSRF attack. Those attacks depend on session tokens in cookies being sent automatically by the browser.