makkes 4 hours ago

The code he provides doesn't compile and needs to be changed like so:

  --- main_before.go      2025-10-15 09:56:16.467115934 +0200
  +++ main.go     2025-10-15 09:52:14.798134654 +0200
  @@ -13,8 +13,10 @@
  
          slog.Info("starting server on :4000")
  
  +       csrfProt := http.NewCrossOriginProtection()
  +
          // Wrap the mux with the http.NewCrossOriginProtection middleware.
  -       err := http.ListenAndServe(":4000", http.NewCrossOriginProtection(mux))
  +       err := http.ListenAndServe(":4000", csrfProt.Handler(mux))
          if err != nil {
                  slog.Error(err.Error())
                  os.Exit(1)
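
For reference, a minimal complete version of the fixed program looks something like this (untested sketch; the mux setup is a stand-in for the book's actual handlers):

  package main

  import (
          "log/slog"
          "net/http"
          "os"
  )

  func main() {
          mux := http.NewServeMux()
          mux.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
                  w.Write([]byte("OK"))
          })

          slog.Info("starting server on :4000")

          // NewCrossOriginProtection returns a *http.CrossOriginProtection
          // value rather than acting as middleware directly, so the mux
          // must be wrapped via its Handler method.
          csrfProt := http.NewCrossOriginProtection()

          err := http.ListenAndServe(":4000", csrfProt.Handler(mux))
          if err != nil {
                  slog.Error(err.Error())
                  os.Exit(1)
          }
  }
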
vulk 5 hours ago

On a side note, Alex's books are a breath of fresh air for someone who is learning. They are always updated to the latest version of Go, and if there is something new, the old code base is updated and the new concepts are introduced, while you are notified and sent the new version of the book.

I've never seen that before. All the other learning sources I have are just abandoned; often something breaks and you have to spend a good amount of time figuring out how to fix it, which can just discourage you from going on.

Kudos to Alex, that is how it should be done.

nchmy 3 hours ago

I deeply appreciate this thorough review of CSRF protection via headers. I've been looking into the topic to see if I can get rid of csrf tokens, and it seems like I can now - if I ignore/don't care about the 5% of browsers that don't support the required headers.

It makes me wonder though - most browser APIs top out around 95% coverage on caniuse.com. What are these browsers/who are these people...? The modern web is very capable and can greatly simplify our efforts if we ignore the outliers. I'm inclined to do so. But am also open to counterarguments

  • johannes1234321 2 hours ago

    Those 5% are probably a wild collection of devices: TVs with embedded browsers, old phones, and other computers elsewhere.

    As an example: my late grandfather of 100 years kept records of his stamp collection in Excel. He used the computer for Wikipedia as well, but we didn't upgrade it since he was comfortable; upgrading to a later Windows to run a newer browser would have been too much of a change and would rather have made him stop doing what brought him fun. The router etc. blocked the worst places, and frequent backups allowed restores, so the actual risk was low.

    Anecdote aside: there are tons of those machines all over.

    And then another big one: bots claiming to be something which they aren't.

    • veeti an hour ago

      There are a lot of people happily browsing away on unsupported Apple devices that don't get any more Safari updates. Lots of strange WebKit edge cases to be found that don't exist in any other browser.

      • nchmy 10 minutes ago

        Apparently less than 1.5% of global internet users are on versions of Safari that don't support Sec-Fetch-Site.

  • kijin an hour ago

    If a browser is too old to send either the Sec-Fetch-Site header or the Origin header, it will probably ignore Referrer-Policy and always set the Referer header, which contains the origin.

    So I wonder why the author didn't consider falling back to the Referer header, instead of relying on an unrelated feature like TLS 1.3. Checking the referrer on dangerous (POST) requests was indeed considered one way to block CSRF back in the day. Assuming all your pages are on the same https origin, is there an edge case that affects Referer but not the Origin header?
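
    Concretely, the fallback chain I have in mind looks something like this (untested sketch; assumes "net/http" and "net/url" are imported, and "trusted" is e.g. "https://example.com"):

      // Prefer Sec-Fetch-Site, then Origin, then Referer as the
      // cross-origin signal; "none" means a user-initiated navigation.
      func sameOriginRequest(r *http.Request, trusted string) bool {
          if sfs := r.Header.Get("Sec-Fetch-Site"); sfs != "" {
              return sfs == "same-origin" || sfs == "none"
          }
          if origin := r.Header.Get("Origin"); origin != "" {
              return origin == trusted
          }
          if ref := r.Header.Get("Referer"); ref != "" {
              u, err := url.Parse(ref)
              return err == nil && u.Scheme+"://"+u.Host == trusted
          }
          return false // no signal at all; policy decision for the caller
      }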

  • anal_reactor 3 hours ago

    From a business perspective it makes a lot of sense to just drop that bottom 5%. Actually, many businesses support Chrome only; they don't even support Firefox.

    The technological counterargument, though, is that you should allow people to tinker and do weird shit. Once upon a time tech wasn't about maximizing stock value, it was about getting a Russian game crack to work and making Lara's boobs bigger. Allowing weird shit is a way to respect the roots of modern tech and allow hobbyists to tinker.

    • wongarsu an hour ago

      For a lot of businesses, 5% of revenue is a lot more than the cost of supporting older browsers

      What shifts the discussion a bit is that many of the bottom 5% aren't lost customers. If your website doesn't work on my minority browser, smart TV or PS Vita, I might be willing to just try it in Chrome instead.

    • nchmy 2 hours ago

      Thanks for confirming. I don't know that it has to be framed as a "business perspective" though. I'm a solo dev for a non-profit project, so ignoring the 5% is just a matter of pragmatism.

      I most definitely do not care about tinkerers, and in fact would generally classify them as something akin to attackers. I just want to allow as many people to use the app as possible, while keeping things simple and secure.

    • cryptonym 2 hours ago

      Empathy and accessibility. Does it make sense to have a ramp in front of your shop for < 5% customers?

      • nchmy 2 hours ago

        My question, though, is who are the 5% of users in this case who are using some arcane browsers? Surely that's largely a choice; physical disabilities are not.

        It doesn't seem unreasonable to say to those folks "we're evidently not using the same web".

        • cryptonym an hour ago

          Grandma or poor folk with their old device may not be "largely a choice"

          • todotask2 a minute ago

            It still depends on the target audience. Some websites or apps are single-page applications (SPAs); can older devices handle that? For example, my mum’s Android phone was too slow to even load a page.

            Secondly, users should upgrade their devices to stay safe online, since vulnerable people are often scammed or tricked into downloading apps that contain malware.

          • nchmy 13 minutes ago

            Don't major browsers essentially auto-update? And to the extent that a device is so old that it can't support newer versions, surely it must be VERY old and perhaps somewhat likely to be replaced sooner rather than later.

            I think I'll probably carry on with not supporting browsers that don't have Sec-Fetch-Site. The alternative, CSRF tokens, actually causes me immense issues (they make caching very difficult, if not impossible).

            (and I say all of this as someone who is specifically building something for the poorest folks. I'm extremely aware of and empathetic to their situation).

        • littlestymaar an hour ago

          It's not comparable to a physical disability, but gatekeeping the people who just don't want to be tracked all day by Google doesn't sound right to me though.

          • nchmy 12 minutes ago

            There are plenty of other Chromium browsers - Vivaldi seems to do a good job in this regard.

            Also, Firefox exists, though they don't seem to care about privacy much anymore either.

            And, of course, Safari, which is terrible in most regards.

ale 7 hours ago

Are CSRF attacks that common nowadays though? Even if your app is used by the 5% of browsers that don’t set the Origin header, the chances of that being exploited are even more minuscule. Besides, most webdevs reach for token-based auth libraries before even knowing how to set a cookie header.

  • littlecranky67 3 hours ago

    Curious about that too. In a modern web app I always set HttpOnly cookies to prevent them from being exposed to JavaScript, and SameSite=strict. Especially the latter should prevent CSRF.

    • jeremyscanvic 3 hours ago

      Erratum: What I'm saying here only applies for cookies with the attribute SameSite=None so it's irrelevant here, see the comments below.

      (Former CTF hobbyist here) You might be mixing up XSS and CSRF protections. Cookie protections are useful against XSS vulnerabilities because they make it harder for attackers to get a hold on user sessions (often mediated through cookies). It doesn't really help against CSRF attacks though. Say you visit attacker.com and it contains an auto-submitting form making a POST request to yourwebsite.com/delete-my-account. In that case, your cookies would be sent along and if no CSRF protection is there (origin checks, tokens, ...) your account might end up deleted. I know it doesn't answer the original question but hope it's useful information nonetheless!
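
      If you want to see it in action, the attacker page is trivial to reproduce locally - a throwaway sketch in Go (domain names obviously made up):

        // Stand-in for attacker.com: serves a page with an auto-submitting
        // cross-site POST form. The victim's browser attaches its
        // yourwebsite.com cookies to the resulting request.
        package main

        import "net/http"

        const evil = `<form id="f" method="POST"
          action="https://yourwebsite.com/delete-my-account"></form>
        <script>document.getElementById("f").submit()</script>`

        func main() {
            http.HandleFunc("/", func(w http.ResponseWriter, r *http.Request) {
                w.Write([]byte(evil))
            })
            http.ListenAndServe(":8080", nil)
        }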

      • RagingCactus 2 hours ago

        The SameSite cookie flag is effective against CSRF when you put it on your session cookie, it's one of its main use cases. See https://developer.mozilla.org/en-US/docs/Web/HTTP/Reference/... for more information.

        SameSite=Lax (default for legacy sites in Chrome) will protect you against POST-based CSRF.

        SameSite=Strict will also protect against GET-based CSRF (which shouldn't really exist as GET is not a safe method that should be allowed to trigger state changes, but in practice some applications do it). It does, however, also make it so users clicking a link to your page might not be logged in once they arrive unless you implement other measures.

        In practice, SameSite=Lax is appropriate and just works for most sites. A notable exception are POST-based SAML SSO flows, which might require a SameSite=None cookie just for the login flow.
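
        In Go terms it's just a field on the cookie - a minimal sketch, assuming "net/http" is imported (cookie name and value are placeholders):

          // Session cookie with SameSite=Lax: the browser won't attach it
          // to cross-site POSTs, which is what blocks classic CSRF.
          http.SetCookie(w, &http.Cookie{
              Name:     "session",
              Value:    sessionToken, // placeholder for your real token
              Path:     "/",
              HttpOnly: true,
              Secure:   true,
              SameSite: http.SameSiteLaxMode,
          })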

        • hmry 2 hours ago

          This page has some more information about the drawbacks/weaknesses of SameSite, worth a read: https://developer.mozilla.org/en-US/docs/Web/Security/Attack...

          You usually need another method as well

          • RagingCactus 2 hours ago

            Yes, you're definitely right that there are edge cases and I was simplifying a bit. Notably, it's called SameSite, NOT SameOrigin. Depending on your application that might matter a lot.

            In practice, SameSite=Lax is already very effective in preventing _most_ CSRF attacks. However, I 100% agree with you that adding a second defense mechanism (such as the Sec header, a custom "Protect-Me-From-Csrf: true" header, or if you have a really sensitive use case, cryptographically secure CSRF tokens) is a very good idea.
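
            The custom-header variant is barely more than a line of middleware, roughly (untested sketch; assumes "net/http" is imported, and the header name is the joke one from above):

              // Plain cross-site forms can't set arbitrary request headers,
              // so requiring one on unsafe methods blocks form-based CSRF.
              func requireCSRFHeader(next http.Handler) http.Handler {
                  return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
                      if r.Method == http.MethodPost && r.Header.Get("Protect-Me-From-Csrf") != "true" {
                          http.Error(w, "missing CSRF header", http.StatusForbidden)
                          return
                      }
                      next.ServeHTTP(w, r)
                  })
              }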

        • jeremyscanvic 2 hours ago

          Thanks for correcting me - I see my web sec knowledge is getting rusty!

  • zwnow 5 hours ago

    Also, can't you just spoof the Origin header?

    • masklinn 4 hours ago

      A CSRF is an attack against a logged in user, so has to be mediated via their browser.

      If you can spoof the origin header of a second party when they navigate to a third party, a CSRF is a complete waste of whatever vulnerability you have found.

    • kevinyew 5 hours ago

      You can if you want to deliberately CSRF yourself for some reason - it's there to protect you, but spoofing it doesn't give you any special access you wouldn't otherwise have.

      The point is that arbitrary users' browsers out in the world won't spoof the Origin header, which is protecting them from CSRF attacks.

teiferer 7 hours ago

CSRF: Cross-Site Request Forgery

From https://developer.mozilla.org/en-US/docs/Web/Security/Attack...

In a cross-site request forgery (CSRF) attack, an attacker tricks the user or the browser into making an HTTP request to the target site from a malicious site. The request includes the user's credentials and causes the server to carry out some harmful action, thinking that the user intended it.

cientifico 5 hours ago

Killing all the fun.

Remember when you could trick a colleague into posting in Twitter, Facebook... by just sending a link?

CSRF fixes are great for security - but they've definitely made some of the internet's harmless mischief more boring

nmadden 6 hours ago

Enforcing TLS 1.3 seems like a roundabout way to enforce this. Why not simply block requests that don’t have an Origin/Sec-Fetch-Site header?

  • nchmy 3 hours ago

    I don't understand - the article is literally about origin/Sec-Fetch-Site

    • nmadden 2 hours ago

      The article has a whole section about requiring those headers by forcing the use of TLS 1.3 — the theory being that browsers modern enough to support 1.3 are also modern enough to support the headers. But why not just enforce the headers?

      • kokada 2 hours ago

        If your use case is just supporting browsers and not things like curl, this seems fine. But when the headers are not set, the CSRF protections are "disabled" exactly to support that case, where you may want to make the request using something like curl.
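
        If you did want the stricter behaviour nmadden describes, it's a small middleware - rough, untested sketch (assumes "net/http" is imported):

          // Reject unsafe-method requests that carry neither header,
          // trading curl/API-client compatibility for strictness.
          func requireFetchMetadata(next http.Handler) http.Handler {
              return http.HandlerFunc(func(w http.ResponseWriter, r *http.Request) {
                  switch r.Method {
                  case http.MethodGet, http.MethodHead, http.MethodOptions:
                      next.ServeHTTP(w, r) // safe methods pass through
                      return
                  }
                  if r.Header.Get("Sec-Fetch-Site") == "" && r.Header.Get("Origin") == "" {
                      http.Error(w, "missing fetch metadata", http.StatusForbidden)
                      return
                  }
                  next.ServeHTTP(w, r)
              })
          }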

      • nchmy an hour ago

        I see what you mean. You were asking why TLS in addition to Sec-Fetch-Site. The sibling comment seems to have addressed it.

NewJazz 7 hours ago

Do most languages have good support for TLS 1.3 as the client?

Zababa 5 hours ago

"cop" as an abbreviation for "cross-origin protection" is delightful

dorianmariecom 3 hours ago

rails solved this a while ago ;)

  • nchmy 3 hours ago

    I don't use rails. How did they solve it?

    • brokegrammer 2 hours ago

      >Have we finally reached the point where CSRF attacks can be prevented without relying on a token-based check (like double-submit cookies)?

      Rails uses a token-based check, and this article demonstrates a token-less approach.

      Rails didn't solve CSRF btw, the technique was invented long before Rails came to life.

      • nchmy an hour ago

        Yes, I assumed this is what they were ignorantly pointing towards.

        Indeed, CSRF tokens are an ancient concept. WordPress, for example, introduced nonces a couple of years before Rails. Though it does appear that Rails might have been the first to introduce CSRF protection in a seemingly automated way.

        • brokegrammer 21 minutes ago

          True, it does seem like Rails introduced configuration-free token based CSRF protection, which "solved" CSRF for traditional server rendered apps.

          I believe the new technique is easier to use for SPA architectures because you no longer need to extract the token from a cookie before adding it to request headers.

tankenmate 6 hours ago

I would never rely on headers such as "Sec-Fetch-Site"; having security rely on the client generating correct responses is just poor security modelling (don't trust the client). I'll stick to time-bounded HMAC cookies; then you're not relying on the client properly implementing any headers, and it will work with any browser that supports cookies.

And TLS v1.3 should be a requirement; no HTTPS, no session, no auth, no form (or API), no cookie. And HSTS again should be the default, but with encrypted connections and time-bounded CSRF cookies the threat window is very small.
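
By time-bounded HMAC cookies I mean something along these lines (sketch only; key management, the encoding and the 15-minute expiry window are illustrative choices):

  package main

  import (
          "crypto/hmac"
          "crypto/sha256"
          "encoding/base64"
          "fmt"
          "strconv"
          "strings"
          "time"
  )

  var key = []byte("replace-with-a-real-secret")

  // sign computes an HMAC-SHA256 signature over msg.
  func sign(msg string) string {
          mac := hmac.New(sha256.New, key)
          mac.Write([]byte(msg))
          return base64.RawURLEncoding.EncodeToString(mac.Sum(nil))
  }

  // mint issues a token of the form userID.expiry.signature.
  func mint(userID string) string {
          exp := strconv.FormatInt(time.Now().Add(15*time.Minute).Unix(), 10)
          return userID + "." + exp + "." + sign(userID+"."+exp)
  }

  // verify checks the expiry, then compares signatures in constant time.
  func verify(token string) bool {
          parts := strings.SplitN(token, ".", 3)
          if len(parts) != 3 {
                  return false
          }
          exp, err := strconv.ParseInt(parts[1], 10, 64)
          if err != nil || time.Now().Unix() > exp {
                  return false // malformed or expired
          }
          return hmac.Equal([]byte(sign(parts[0]+"."+parts[1])), []byte(parts[2]))
  }

  func main() {
          tok := mint("alice")
          fmt.Println(tok, verify(tok)) // prints the token and "true"
  }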

  • FiloSottile 4 hours ago

    No, in CSRF the browser is not the adversary, it is a confused deputy, and it’s perfectly reasonable to collaborate with it against the attacker (which is another site).

    You might want to read https://words.filippo.io/csrf.

  • hmry 4 hours ago

    CSRF is about preventing other websites from making requests to your page using the credentials (including cookies) stored in the browser. Cookies can't prevent CSRF, in fact they are the problem to be solved.

    • tankenmate 2 hours ago

      Auth needs to be done somewhere, somehow, at some point. And this is done with cookies (be it CSRF, auth token, JWT, etc). There has to be some form of mechanism for a client to prove that a) it is the client it claims to be, and therefore b) it has the permission to request what it needs from the server.

      And, the server shouldn't trust the client "trust me bro" style.

      So, at the end of the day it doesn't matter whether it is a "rose by another name", i.e. it doesn't matter whether you call it a CSRF token, auth token, JWT, or whatever, it still needs to satisfy the following; a) the communication is secure (preferably encryption), b) the server can recognise the token when it sees it (headers (of which cookies are one type), etc), c) the server doesn't need to trust the client (it's easiest if the server creates the token, but it could also be a trusted OOB protocol like TOTP), and d) it identifies a given role (again it's easiest if it identifies a unique client (like a user or similar)).

      So a name is just a name, but there needs to be a cookie or a cryptographically secure protocol to ensure that an untrusted client is who it says it is. Cookies are typically easier than crypto secure protocols. Frankly it doesn't really matter what you call it, what matters is that it works and is secure.

      • RagingCactus 2 hours ago

        I work as a pentester. CSRF is not a problem of the user proving their identity, but instead a problem of the browser as a confused deputy. CSRF makes it so the browser proves the identity of the user to the application server without the user's consent.

        You do need a rigid authentication and authorization scheme just as you described. However, this is completely orthogonal to CSRF issues. Some authentication schemes (such as bearer tokens in the authorization header) are not susceptible to CSRF, some are (such as cookies). The reason for that is just how they are implemented in browsers.

        I don't mean to be rude, but I urge you to follow the recommendation of the other commenters and read up on what CSRF is and why it is not the same issue as authentication in general.

        Clearly knowledgeable people not knowing about the intricacies of (web) security is actually an issue that comes up a lot in my pentesting when I try to explain issues to customers or their developers. While they often know a lot about programming or technology, they frequently don't know enough about (web) security to conceptualize the attack vector, even after we explain it. Web security is a little special because of lots of little details in browser behavior. You truly need to engage your suspension of disbelief sometimes and just accept how things are to navigate that space. And on top of that, things tend to change a lot over the years.

        • seethishat an hour ago

          It's very complicated and ever evolving. It takes dedicated web app pentesters like you to keep up with it... back in the day, we were all 'generalists'... we knew a little bit about everything, but those days are gone. It's too much and too complicated now to do that.

      • hmry 2 hours ago

        I don't understand what you are getting at. CSRF is not another name for auth. You always need auth, CSRF is a separate problem.

        When the browser sends a request to your server, it includes all the cookies for your domain. Even if that request is coming from a <form> or <img> tag on a different website you don't control. A malicious website could create a form element that sends a request to yourdomain.com/api/delete-my-account and the browser would send along the auth cookie for yourdomain.com.

        A cookie only proves that the browser is authorized to act on behalf of the user, not that the request came from your website. That's why you need some non-cookie way to prove the request came from your origin. That's what Sec-Fetch-Site is.

      • nchmy 2 hours ago

        I don't think this is accurate. As your parent comment said, CSRF defenses (tokens, Origin/Sec-Fetch-Site) serve a different purpose from an auth token/JWT. The latter says that your browser is logged in as a user. The former says "the request actually came from a genuine action on your page, rather than pwned.com disguising a link to site.com/delete-account".

        You're conflating the two types of Auth/Defense.

  • tankenmate 3 hours ago

    All the voting down but not a single comment as to why. The "Sec-Fetch-Site" header primarily protects the browser against Javascript hijacking, but does little to nothing to protect the server.

    This is probably apocryphal, but Willie Sutton was asked why he kept robbing banks; he quipped, "that's where the money is". Sure browser hacking occurs, but it's a means to an end, because the server is where the juicy stuff is.

    So headers that can't be accessed by Javascript are way down the food chain and only provide lesser defence in depth if you have proper CSRF tokens (which you should have anyway to protect the far more valuable server resources which are the primary target).

    • nchmy 3 hours ago

      I must be missing something. What does JavaScript have to do with this? My understanding is that CSRF is about people getting tricked into clicking a link that makes, for example, a POST request to another site/origin that makes an undesired mutation to their account. If the site/origin has some sort of auth (e.g. a session cookie), it'll get sent along with the request. If the auth cookie doesn't exist (the user isn't logged in/isn't a user) the request will fail as well.

      • tankenmate 2 hours ago

        There's server security and there's client security. From what I've seen in these comments people are focused on the client security and are either a) ignoring server security, or b) don't understand server security.

        But the server security is the primary security, because it's the one with the resources (in the money analogy it's the bank).

        So yes, we do want to secure the client, but if the attacker has enough control of your computer to get your cookies, then it's already game over. Like I said, you can have time-bounded CSRF tokens (be they cookies or whatever else, URL-encoded, etc., who cares) to prevent replay attacks. But at the end of the day, if an attacker can get your cookies in real time, you're done for; it's game over already. If they want to do a man-in-the-middle attack (i.e. get you to click a fake "proxy" URL), then having the "secure" flag should be enough. If the server checks the cookie against the client's IP address, time, HMAC, and other auth attributes, that will prevent the attack. If the attacker takes control of your end device, you've already lost.

        • nchmy 2 hours ago

          Sorry, but you seem to be lost.

          I, the article and most comments here quite explicitly talked about server security via Auth and csrf protections.

          None of this has anything to do with browser security, such as stealing csrf tokens (which tend to be stored as hidden fields on elements in the html, not cookies). MOREOVER, Sec-Fetch-Site obviates the need for csrf tokens.

  • jebronie 5 hours ago

    I don't understand why your post is flagged. You are 100% right. The point of CSRF protection is that -you can't trust the client-. This new header can just be set in curl, if I understand correctly. Unlimited form submissions here I come!

    • eptcyka 4 hours ago

      CSRF protection protects the user by not allowing random pages on the web to use resources from a target website without the user being aware of it. It only makes sense when serving people using browsers. It is not a defense against curl or skiddies.

      • nchmy 3 hours ago

        To elaborate/clarify a bit, we defend against curl with normal auth, correct? Be it session cookies or whatever. That plus Origin/Sec-Fetch-Site (and TLS, secure cookies, HSTS) should be reasonably secure, no?

        • tankenmate 2 hours ago

          Indeed, you need some form of CSRF protection, but Sec-Fetch-Site is primarily focused on keeping a browser secure, not the server. Having said that, it's nice defence in depth for the server as well, but not strictly required as far as the server is concerned.

          • nchmy 2 hours ago

            I'm confused. In my mind, you only really need to keep the server secure, as that's where the data is. Auth cookies and csrf protections (eg Sec-Fetch-Site) are both used towards protecting the server from invalid requests (not logged in, or not coming from your actual site).

            What are you referring to when you talk about keeping the browser secure?

            • tankenmate 2 hours ago

              The Sec-Fetch-Site header can't be read / written by JavaScript (or WASM, etc); cookies (or some other tokens), on the other hand, can be. In most circumstances, allowing JavaScript to access these tokens allows for "user friendly" interfaces where a user can log in using XMLHttpRequest / an API rather than using a form on a page. OOB tokens, on a one-off auth basis or continuously (i.e. OAuth, TOTP with every request), are more secure, but obviously require more engineering (and come with their own "usability" / "failure mode" trade-offs).

              • nchmy 2 hours ago

                > The Sec-Fetch-Site header can't be read / written by Javascipt

                Perfect. It's not even meant or needed to be. The server uses it to validate the request came from the expected site.

                As i and others have said in various comments, you seem to be lost. Nothing you're saying has any relevance to the topic at hand. And, in fact, is largely wrong.

    • kokada 2 hours ago

      This is not what this is supposed to protect, and if you are using http.CrossOriginProtection you don't even need to add any header to the request:

      > If neither the Sec-Fetch-Site nor Origin headers are present, then it assumes the request is not coming from web browser and will always allow the request to proceed.