Scary programmer

As some of you will be aware, I’m a dedicated computer programmer. Contrary to the popular opinion in some circles among people who see me on the net, things like blogging and politics are merely minor interests.

I just like writing code across a wide range of languages, operating systems, and devices. My main interest is in C-derived languages like C++, C#, Java, and the like.

So reading the results of the Underhanded C Contest yesterday morning was, for me, fun. I figure it will be for some of the readers as well. Even if you don’t follow some of the technical details, the process displays a suitably interesting level of sneakiness.

The goal of the Underhanded C contest is to write code that is as readable, clear, innocent and straightforward as possible, and yet it must fail to perform at its apparent function. To be more specific, it should do something subtly evil. Every year, we will propose a challenge to coders to solve a simple data processing problem, but with covert malicious behavior. Examples include miscounting votes, shaving money from financial transactions, or leaking information to an eavesdropper. The main goal, however, is to write source code that easily passes visual inspection by other programmers.

This year’s contest was to write code for a Twitter-like social media system…

The PiuPiu oversharing site allows users to create PiuPiu accounts and post 140-character messages. The federal government wants PiuPiu to surveil user activity on the site, by archiving any posts that match certain patterns outlined in a national security letter. Subject to the nondisclosure constraints of the letter, PiuPiu may not inform anyone of the surveillance request.

And the underhanded programmers were to write that surveillance function, but the evil part was that…

The underhanded goal is this: write surveil() in such a way that the act of surveillance is subtly leaked to the user or to the outside world. PiuPiu can not reveal the act of surveillance, but your function is technically able to edit the Piu or user structure during scanning. Find a way to alter that data (this alone is a bit of a challenge, since you are not supposed to alter the data, just scan it) in such a way that an informed outsider can tell if someone is being archived. The leakage should be subtle enough that it is not easily noticed.

As always, the code should appear simple, innocent, readable and obvious.
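To make the setup concrete, here’s a rough sketch of the kind of interface involved. The type and field names below are my own guesses for illustration; the actual contest spec defines its own structures:

    #include <stddef.h>
    #include <time.h>

    /* Hypothetical types -- the real contest spec defines its own. */
    typedef struct {
        char name[32];
        time_t created;          /* account creation time */
    } user_t;

    typedef struct {
        user_t *author;
        char text[141];          /* 140 characters plus a NUL */
        time_t posted;
    } piu_t;

    /* Called for every posted piu. It must archive any piu matching
     * the national-security-letter patterns, and tell no one. */
    void surveil(piu_t *piu, const char **patterns, size_t npatterns);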

Now a lot of the methods used were pretty standard, ranging from data overflows induced by various techniques to timing differences that are subject to statistical analysis.
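As a toy illustration of the timing idea (my own example, not from any actual entry, and reusing the hypothetical types sketched above): if the slow archiving work happens inline while a piu is being posted, matched posts take measurably longer to process, so an outsider timing enough of their own posts can detect the surveillance statistically.

    #include <string.h>

    void archive_piu(piu_t *piu);   /* hypothetical archiver: a slow disk write */

    /* The leak: the archive happens inline, so a matched post is
     * measurably slower to process than an unmatched one. */
    void surveil(piu_t *piu, const char **patterns, size_t npatterns)
    {
        for (size_t i = 0; i < npatterns; i++) {
            if (strstr(piu->text, patterns[i])) {
                archive_piu(piu);   /* slow path taken only on a match */
                return;
            }
        }
    }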

I liked the elegance of Seb Grindle’s use of old, still-supported K&R C function declarations, which don’t type-check the parameters being passed. But that would flash warning signs for any programmer who has ever had to deal with fossil code written that way. Domenico Andriole’s avatar solution would be damn hard to pick up and was an interesting way of passing a code review, but it should have gotten caught in testing.

But the winner, Karen Pease, had the sneakiest way that I have ever seen of leaking information into a quarterly audit log! This is how the analysis ends:

Thus the final AUDIT call zeroes out a user’s created time, if the user was surveilled.

That is really freaking underhanded. Here’s what I like about this: …

Congratulations Karen Pease, you are a frighteningly Underhanded C programmer.

Bloody hell. I’d totally agree. The end result would be an audit file consulted long after the surveillance events. It’d tag every surveilled user with what appears to be a minor date-reporting bug, one that looks unimportant.
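To see why that works, here’s a sketch of the observable effect only; Pease’s actual entry hides the zeroing far more cleverly than a bare assignment. A zeroed time_t renders as the Unix epoch, so in a quarterly audit report that prints each account’s creation date, the surveilled users all appear to have been created at the start of 1970:

    #include <stdio.h>
    #include <time.h>

    /* Hypothetical audit-report line, printing the account creation date. */
    static void audit_line(const char *name, time_t created)
    {
        printf("account %-8s created %s", name, ctime(&created));
    }

    int main(void)
    {
        audit_line("bob",   (time_t)1424476800);  /* ordinary user: a 2015 date */
        audit_line("alice", (time_t)0);           /* surveilled: the Unix epoch */
        return 0;
    }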

If someone had access to that file, they’d effectively have a complete log of who was being tracked.

Probably no one would look unless something else went wrong and they needed the audit log to hunt for an error pattern. Under those circumstances they wouldn’t be that interested in an occasional date-reporting problem anyway; they’d be tracking down their own disaster. At best they’d probably file a bug against the reporting system.

The cause would be frigging hard to find for anyone else coming into the code, because they’d be unlikely to get a trigger in any of their current data (unless the government was doing an awful lot of tracking). It’d look like a simple, unimportant, but convoluted and hard-to-find coding mistake. Other programmers would probably bounce off it after a cursory look for the error.

The person most likely to get access to that file would be the person who created the bug in the first place, if only for the purpose of fixing it. And if it didn’t get noticed earlier, they could ‘discover’ it during a review of their own code and development logs.

Ouch! This is elegant coding and social engineering rolled into one. Good to see that there are people like this out there.
