cross-posted from: https://slrpnk.net/post/36882585

An arms race is on the horizon. An AI project has produced a tool that finds bugs in software, and it has gotten very good at it. The bugs it finds are a treasure trove for criminals (incl. spy agencies) looking for vulns.

Of course the AI project is commercial and the code is proprietary closed-source. The company that made it has kept the software tightly controlled so far, partly for liability reasons (a legitimate fear that if distribution is loose, criminals will get a flood of 0-days and exploit them, after which the AI corp could be liable).

The FOSS world seems to be at a serious disadvantage. This tool will be unavailable to the FOSS community, so devs will be cut off from a tool that finds bugs that criminals and FOSS adversaries will have. Well-funded projects (i.e. mostly non-FOSS ones) can offer bug bounties and/or afford the bug-finding tool.

Is it time to ask govs to get off their ass and protect the commons? Should govs (the UN? NATO?) fund a competing project to find bugs in FOSS and inform the devs? Or mandate that all such commercial projects lend their AI bots to the commons?

  • Gsus4@mander.xyz · 8 days ago

    Isn’t it the opposite? If they are so good, all you have to do is scan for vulnerabilities before you submit code. With closed source, who knows; maybe the vulnerabilities are baked in on purpose…

  • FireXtol@piefed.social · 8 days ago

    Nah. They’ll certainly train it with decompilers. And that probably means it sees the exploits even better than with source code, since an exploit often requires understanding the machine code generated from the source code (at least for humans).