[liberationtech] Communities needed to mitigate heartbleed type bugs
Louis Suárez-Potts
luispo at gmail.com
Fri Apr 25 12:16:37 PDT 2014
On 25 Apr 2014, at 14:21, Jonathan Wilkes <jancsika at yahoo.com> wrote:
> On 04/23/2014 10:04 AM, Louis Suárez-Potts wrote:
>> On 23 Apr 2014, at 08:38, Nick <liberationtech at njw.me.uk> wrote:
>>
>>> I took the liberty of changing the subject line to something that
>>> hopefully somewhat summarises your email.
>>>
>>> Quoth Arnaud Legout:
>>>> As polemical as it may be, the deeply held belief that "I will always
>>>> go for open source code because its security will be much higher than
>>>> any closed-source counterpart" should be seriously reconsidered
>>>> when there is not a strong community of developers working on code
>>>> maintenance.
>>> There is a lot of shitty code around. That has always been the case,
>>> and always will be. Anyone who has used the OpenSSL codebase, or even
>>> looked at it briefly, will have seen years ago that it's shitty, and
>>> probably wasn't too surprised by the recent Heartbleed bug.
>>> Strong code can and does come out of small teams, including those of
>>> one or two people. Rather than judging the quality of a project by
>>> whether there is a "strong community of developers" or by how the
>>> project is financially backed, I would recommend you take a few
>>> minutes to look at the state of the source code. That isn't a deep
>>> audit, of course, but it can give you a sense of the taste and care
>>> of the people behind the code. Needless to say, proprietary code
>>> which forbids such examination should be avoided, for this and other
>>> good reasons.
>> When I was "leading" OpenOffice.org, I proposed that students, mentored by employed experts who would probably be project committers (and who might in fact be instructors at colleges and universities), learn about open source collaboration and also about programming by working on outstanding bugs and other issues brought to their attention by their teachers and by relevant project members. Other large open source projects had people with similar ideas, and some, as we did, acted on them.
>>
>> The idea is not to exploit student labour; and I am aware that a lot of important work actually demands the attention of experts, not students. I am also aware that many professors and teachers are indeed moving to use open source projects' code in their classes. But more could probably be done both to uncover and even fix flawed and hoary code, and to teach students open source collaboration techniques. (I would also mean for this to be a global effort, not particular to any one country or region.) Thus, one element of a solution could well be the promotion of known or suspected problem code and architecture for student investigation. Any proposed bug fixes would have to go through the usual (or even more than usual) protocols before inclusion in the accepted codebases.
>
> It sounds like you want to foster a learning environment that has the added benefit of improving security software. But in reality I think your proposal would create an environment for rationalizing insecurity.
Okay, fair enough, though of course that's hardly what I or anyone else like me would want! Judging from your response, I think I wasn't very clear in my summary and proposal.
>
> The "usual" protocols aren't working very well atm-- if they were then the Openssl source wouldn't look the way that it does.
I tend not to judge a large category that includes a variety of methods by a single obvious failure. (That is, my case in point here is neither limited to nor really about OpenSSL. It is more generally about encouraging the development of a field of open source that would recognize some of what it does as not being immediately of, by, and for the market.)
[OT: I have other reasons to believe that open source software production and maintenance does not always work as people think it does or wish it would. These other reasons relate to the persistent conviction, especially in the US, that open source is somehow a shining type specimen of Libertarianism at play. It isn't. It works well when there is an engaged and critical community, and that community needs funding of one sort or another to keep it engaged and critical. However, most Leviathan tech companies support only those efforts, open or not, which benefit them; and who can blame them for being what they are? In contrast, art, humanities, and science programs, for whose products there may not be an obvious or immediate market or which are deemed to be doing work in the public good, are eligible to receive government grants, albeit much less so today than yesterday (no matter when we claim today is).]
> If you only keep the current barriers to entry for the student coders, then at best you're no better off than you were before. Probably you're worse off, because more people would be submitting code, those people are untrained in the field, and the same number of overworked reviewers is now tasked with yet more work.
Actually, I'd be interested in removing irrational barriers, and I have no desire to inflict student code on experienced project members. That's why I suggested a kind of sandbox (the classroom) where students could experiment (learn) and then, when they're able to pass muster, join the relevant project, not as privileged candidates but as anyone else might: through proof of code and the ability to collaborate. The project would still have to negotiate the influx. But that's really up to the project, and I see something like what I've suggested as part of a dialogue, not as a kind of surprise party.
My experience with students (and it's not insignificant) has been that even talented ones feel intimidated by some open source projects. Call this a failure of education or a sign of neglect. What we've done in the past, where "we" includes OOo, Mozilla, Eclipse, and others working in tandem at Seneca College at York U. in Toronto (and also other post-secondary institutions in India, Brazil, etc., not working directly with me but with others), is two things: teach open source collaboration and teach the relevant code.
>
> If you implement more barriers for the students than for the experts, you immediately create an incentive for both the experts and the students to find and exploit the holes in the development process.
Do you have instances of this you can point to? The experience we've had at Seneca, at least from a few years ago, was not as you suggest, which is why I think I probably didn't explain the situation very well; my apologies.
> Experts would break it because they'd presumably be the ones expending more effort to ensure the students follow the extra protocol cruft; students would break it (perhaps accidentally) because they don't yet have the expertise to understand the reasoning behind the extra work. Welcome to the security line at every U.S. airport.
>
> I'll repeat my suggestion that was previously met with crickets: we should wring the last few drops out of the expertise devs currently have by requiring a video of rubber-duck debugging for major code changes or additions. In the case of OpenSSL, there should have been one first from the reviewer, then one posted by the submitter of the patch.
>
> I don't care what hour of the day it is: if a reviewer has to publish an oral account of what he/she thinks an implementation of a patch _actually_ does, and the submitter then has to do the same, those two brains have a way of spotting inconsistencies that typing one's name and clicking a button has tended to miss.
>
> -Jonathan
-louis