
You guys know I like to bash Go, but... FUCK!

"The Go security team has determined that the root causes of the vulnerabilities cannot be reliably addressed."

OK, your language design has a serious flaw that can't be fixed, so they are basically saying: "Yup, a core library is going to stay vulnerable for a long time."

Also, this has been going on since August 2020, according to the related post. Project Zero moves fast (30 days) to disclose issues in every other project, but on a project from their own company, it took 4 months.

Google surely cares about the well-being of the internet, sure.

Link: mattermost.com/blog/coordinate

@juliobiason no no, you don't get it, it's part of Google's plan to get people to stop using XML.

o b v i o u s l y

@rysiek You're sooooo lucky that I wasn't drinking any coffee when reading that, otherwise you'd owe me a new keyboard. 😜​

@juliobiason thanks to high security standards of the tech industry, somebody else might have in fact already owned your keyboard...

@craigmaloney Shush. Don't spoil the solution I'll point out in my blog post! 😜​

@juliobiason Eh... I think the real issue according to the Mastodon post and most commenters is that SAML does a hash before serialization - which is something we all know is a bad idea now.

Sure - the golang issue isn't great, but everyone has pointed out that the stdlib xml encoding was never meant for cryptographically secure purposes.

I'd still take the golang stdlib any day for all of its usefulness out of the box if this is the first (and honestly... meh) trade-off.

@juliobiason I found the comments on the HN post to be pretty interesting. I learned a lot more about XML specs and how difficult they are to manage and implement.

news.ycombinator.com/item?id=2

@juliobiason You make it sound as if Project Zero is involved in this, but I don't see that. PZ isn't the inbound contact for external security reports (those reside much closer to each project or product, e.g. Go has its own security team) and this vulnerability seems to have been discovered by Mattermost.

From their timeline, MM seems to apply a less strict schedule, which is comfy for the Go devs, but I guess Go would have dealt with a 30-day deadline just as well (given that PZ is in the same company and established that lower bound).

@patrick It shouldn't sound like that.

My annoyance is that both projects, Project Zero and Go, are under the G umbrella, but it seems PZ goes to great lengths to point out issues in several products, yet none in the company's own products.

Also, judging by their reports, PZ seems to have some great resources for finding issues.

This dichotomy of products inside the same company is the annoying part: Why not use PZ resources on G products themselves?

I can even put on my paranoid hat and say that G doesn't do this with their own products on purpose: PZ keeps disclosing others' issues in 30 days, while their own products get more time to adjust.

It is really weird that they could use those resources to improve their own products, but instead decided to use them to point out problems in others'.

@juliobiason (Disclosure: I work at Google, but on Chrome OS firmware - we don't deal with PZ or with Go. I'm also on vacation and I have no idea what the corporate rumor mill is discussing nor any desire to figure it out while I'm unwinding. Opinions my own, etc etc)

"Project Zero works way fast (30 days) to disclose issues on every other project, but on a project from their own company, 4 months." sounds to me as if to imply that PZ was involved in disclosure, which is why I brought it up.

As I understand things, the general idea behind PZ is to get Google security standards established as lower bound in computing for two reasons:
1. Google folks themselves use all kinds of non-Google software and hardware, and ops dislike putting onerous restrictions on them, but keeping them user-managed requires sufficient level of trust in the base system.
2. Google users may benefit from Google security standards (and those are _really_ high under certain circumstances) but that means nothing if the endpoint they use has crap security.

As for using PZ resources for internal purposes, I think there is some cross-pollination, for example I think that https://security.googleblog.com/2016/08/guided-in-process-fuzzing-of-chrome.html benefits from PZ tooling (and some of the things learned there certainly benefitted PZ). The general idea seems to be that Google sec ops is uniform enough that teams can take care of such issues themselves (as in: PZ doesn't bring anything new to the table, and security is usually distinct from regular development, so there's already the additional set of eyeballs.) so PZ is looking for the things not already covered by the company's guidelines.

That said: The way things look, I'm not sure if security folks across the company (who often support projects that use Go) are happy with that "solution" (or with the approach they use to "solve" it, even if their product is not using the XML library. That thing is a huge red flag that something is wrong with security culture, at least to me and after reading only the announcement) and my guess would be that hard questions will be asked now.
That _might_ mean that PZ now takes a look at Go, but I think it's more likely that Go's security team gets some adult supervision to bring them up to company standards (that could even mean PZ folks switching teams, but not PZ itself running it). But see the disclosure/disclaimer at the top of the post: this is pure conjecture for many reasons.

@juliobiason why is it necessarily a language design issue? It's an XML API; surely there is a safe way to do it.
