Richo does DEF CON

This year was the first year that the core team were all in the same place, and it was about as monumental as you’d expect.

I met up with carbon and samurai at the airport on Thursday night and headed straight to the Rio. After not being let into the pool party for not having badges (which we'd missed by scant minutes), we headed back to the bar and met up with some more psych0tik peeps. I've gotta say that when samurai and I cooked up this harebrained scheme all those years ago, the absolute last thing I expected was to be meeting a bunch of people in Vegas as a result of it. I was blown away.

After more than a couple of beers we staggered back to our hotel, talked some shit and crashed for the evening. Friday morning we badged up and checked out some talks. I wasted a ton of time picking up swag too, because Fuck Logic (which I think had pretty much become my catchphrase by the end of the weekend; despite DEF CON being almost entirely populated by people at the top of their game, I saw some downright ridiculous shit go down).

The badges this year were an interesting beast: an 8-core creation from Parallax that runs a bytecode format derived from a strange amalgamation of Python, C and ASM. I had a riot with the badge, and in some ways I was disappointed that I was keen to do other things at the con, as I could quite happily have spent the whole time on the badge puzzle.

On the talk front I only caught a couple of truly inspirational ones, but the highlight for me was definitely Anch and Omega from DCG Dark, who are planning to build a Darknet of Things over 6LoWPAN, with a view to building a massively fault-tolerant sensor network. They also seem interested in making the technology general enough that it can reasonably be bolted onto almost anything.

Given my fondness for bolting things to other things, I bought a couple of their prototype McMotes and scurried away. I'm already concocting plans involving my motes and my Raspberry Pis. It's probably also worth pointing out that their talk was in a smallish room at a slightly awkward time, but they managed not only to fill it but to have people standing in the hall for their Q&A session. To say that they made a splash would be a massive understatement.

I also really enjoyed Rodrigo Branco, Sergey Bratus and James Oakley's talk on exploiting the eh stack of libdwarf's exception handling implementation. It's something I've thought about and heard talked about for a long time, but up until now I hadn't seen a robust proof of concept that really made it look like a usable attack vector. At the time of writing their paper appears to still be unpublished, but I'm looking forward to reading it.

Sadly, this year we didn't make it to the Hofbräuhaus, which is made worse by how awesome I'm told it is. It's definitely on the cards for next year.

The afterparty was a unique beast, not least because I finally gave in to all the people mistaking me for Moxie and ran with it. While I haven't decided whether this constitutes man-in-the-middle or social engineering, I know for sure that it makes for a good time. Thanks to all who bought me beers. From the freak show it was obviously time to not make it into a nightclub, and instead go for a precarious jaunt through what was almost certainly the crypts underpinning a casino. Pants-shittingly terrified doesn't quite cover it.

Enough words though. On with the pictures.


richo is now richo in one more place

My GitHub username has changed, so if anyone is linking to my repos:

1. Tell me, and I can fork the project to richoH to keep the links working for now
2. Fix the link! Just change richoH to richo
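If you've also got a local clone pointing at the old username, repointing it is a one-liner. A minimal sketch, assuming a hypothetical repo called example and the default origin remote:

# "example" is a placeholder; swap in whichever of my projects you've cloned
git remote set-url origin https://github.com/richo/example.git
# sanity check that the remote now points at the new username
git remote -v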


threaded mutt on OSX

Just a warning: brace yourselves. I'm back on OSX as a result of the new job, so a slew of "OSX is dicks" posts is probably on the horizon.

First up: Homebrew is actually not all that bad. It only took me a weekend to get a workable-ish environment up with recent versions of everything I need. One thing that irked me to no end, though, was the lack of support for threading conversations like I used to get on Debian.

After quickly cloning the Debian repo and building it on OSX... no dice. So evidently it's some unique combination of the Debian patchset and the correct compile options. Working all of that out managed to take me three days, by the time I'd figured out the right options, got it to build with clang, and shuffled a patch around to unbreak some inlined functions (no idea how that ever built in the first place, actually).

The short version, though, is that if you want your very own mutt that threads messages, it's as simple as:

brew install --HEAD https://raw.github.com/richo/homebrew/f35164afb848d5fd856233fca69662b55fdf2740/Library/Formula/mutt.rb

That pulls from the psych0tik clone of mutt (I'm slowly moving all of the packages available in the psych0tik apt repo to the psych0tik GH account).

Ninja edit:

It's more than likely that you want the latest version of this brew, which does a lot more to make sure it'll actually work. I suggest fetching the features/upstream_mutt branch from github.com/richo/homebrew.
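If the formula still lives at the same path on that branch (an assumption on my part; nothing here is pinned to a commit), the raw URL trick from above should work just the same:

# assumes Library/Formula/mutt.rb still exists on the features/upstream_mutt branch
brew install --HEAD https://raw.github.com/richo/homebrew/features/upstream_mutt/Library/Formula/mutt.rb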


Using Oauth outside of the webapp domain

Recently at work we had an R&D day, during which Josh Benham and I worked on a CLI interface to GitHub.

We knew immediately that we didn't want to use basic auth, preferring OAuth, which is significantly more secure, but upstream requires a callback URI, which is very impractical when you don't have one available.

The solution in the end wasn’t as complex as I thought it would be. Basically, I wrote a webservice that the client connects to, which gives it a unique URL. We then use the URL we’re given as a redirect URL.

At this stage it’s realistically only sensible to use it as a proof of concept, as it gives you the token in plaintext and doesn’t have SSL.

For version 2 I'd like to push the SSL out to the client and merely relay the encrypted packets. I'd also like to have the whole thing HTTP-encapsulated; for now, synchronicity complaints (and, if we're honest, the fact that I just wanted the damn thing working) meant that I wrote it with a vaguely flawed thread-spawning model and not a lot of protection against DDoS attacks.

If none of this scares you off, though, it's currently running at oauth.psych0tik.net.

The procedure is:

Connect on port 2000
Receive your 128-byte callback ID
Send http://oauth.psych0tik.net/callback/[callback ID] as your callback URL to the OAuth endpoint
Receive your token back over the original connection

Call it a day!
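The client side really is that small. Here's a minimal sketch using nc; the hostname and port are from above, but everything else (including how you actually kick off the OAuth dance) is left to you:

# minimal client sketch; assumes the service is still listening on port 2000
nc oauth.psych0tik.net 2000 | {
    callback_id=$(head -c 128)
    echo "redirect URL: http://oauth.psych0tik.net/callback/${callback_id}" >&2
    # start the OAuth flow with that redirect URL, then wait here;
    # the token arrives over the same connection once the provider hits the callback
    cat
}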

Source is on my GitHub account.


Mac-tacular backups, Apple, and OSX

This post started out as a bit of a rant about backups I needed to make, due to a dying MacBook Pro. Over the course of writing it, I've come across a few other issues and have looped in some old notes I have about working with OSX. I'm not really an Apple guy, but I do have a pair of MacBook Pros that I've used for a couple of years. I've come across a few issues with OSX and had to find solutions, so I figured I'd dump a list of tips/tricks I've discovered in here, along with a few small rants.

The newer of my two MacBook Pros is a 2011 model, which seems to be a very buggy version of the hardware. I had two logic boards completely fail within a few months of getting the device new. After those were replaced and continued to fail, the device was deemed a lemon and replaced outright. The new Mac worked fine for almost a year, but is now having issues with the RAM controller, where only the top slot works.

While dealing with these issues, I've been to the Apple store a few times. One of their more obnoxious policies is that repairs cannot be done without all of the original hardware installed. So if you've upgraded a hard drive, you've got to downgrade it again. In my case, they wanted to RMA the entire device, including a hard drive that contained my personal information.

They do offer a service to migrate your files and applications in the store (probably using the same tool that does it on first boot), but if you tend to encrypt your home directory, even with FileVault, it throws off this process (presumably because it cannot access the configuration files in your homedir needed to enumerate what to copy).

This left me with two options: tell the Apple Geniuses my password (sort of defeats the purpose of encrypting, eh?), or get the drive from them and suss out backing up on my own. Needless to say, I went the latter route.

I happened to have an extra Mac, a SATA-to-USB enclosure, and a 3TB external drive on hand to handle and test the copy. I also pulled in a generic Linux box for fdisk and other tools, to do some of the heavy lifting, as I don't particularly like OSX's disk management tools and I don't want my drive to end up HFS. My plan was to make two different backups: a copy of the homedir and a dd of the full disk (call me paranoid, but I like being sure I didn't forget something).
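Neither half of that plan is fancy. Roughly what I had in mind looks like the following; the username, device name, and mount points are placeholders, so check diskutil list before copying anything:

# homedir copy, run from the Mac with the old drive in the USB enclosure
sudo rsync -a /Volumes/old-macintosh-hd/Users/myuser/ /Volumes/backup/homedir/
# raw image of the whole disk, for the truly paranoid (bs=1m is the BSD dd spelling)
sudo dd if=/dev/rdisk2 of=/Volumes/backup/mbp-full.img bs=1m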

I immediately ran into issues when I found that fdisk and diskutil don't play nicely together. If one partitions the drive, the other can't read or use it (the same goes for mounting), which is probably a result of the EFI setup used by OSX. Additionally, in their infinite wisdom, Apple decided that the only truly cross-platform filesystem they support would be FAT32, which won't support large files (like a 500GB dd image). As I didn't want to leave my Mac tethered to some USB drives for 25 hours, this caused a real headache.
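For what it's worth, the FAT32 file size limit itself can be dodged by splitting the image into sub-4GB chunks as it's written, though that does nothing for the tether time. A rough sketch (device and paths are placeholders again):

# chunk the image so each piece fits under FAT32's 4GiB-per-file limit
sudo dd if=/dev/rdisk2 bs=1m | split -b 4000m - /Volumes/backup/mbp-full.img.
# reassemble later with
cat /Volumes/backup/mbp-full.img.* > mbp-full.img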

Originally, I had a rant here to the effect of "Apple|Microsoft - y u no hav better FS support", but I decided to do a bit of digging to sort that out. From what I've read, OSX doesn't include support for things like ext3/ext4 due to GPL issues and Apple's lack of desire to publish code. As far as I can tell, Microsoft is simply ignoring filesystems they've got no stake in. Neither of these reasons really helps us, but I suppose when running a business you've got to prioritize. In my research efforts, I did come across a particularly good post on adding support via FUSE drivers and the like, over at Lifehacker.

I ended up reformatting the 3TB external drive with diskutil, so that it was usable with my other Mac, and then hooked up the old drive via the USB-to-SATA converter. At this point I started looking at the attached hard drive from the old MacBook Pro, to ensure I could work with everything in the way I expected. Part of this involved dealing with the drive's FileVault-encrypted partition.

While playing with FileVault I discovered two things. The first is that diskutil reports FileVault volume sizes incorrectly; I spent a couple of hours very confused about why my homedir was 200GB larger than the physical disk. The second is a bit concerning: when I connected the old Mac hard drive and mounted the FileVault volume, I was able to decrypt it with the new Mac's sudo password. I'm not sure yet how that works, but I'd assume that the password is stored somewhere root-accessible in a reversible (or decrypted?) format. Somewhat concerning for a tool that you expect to protect your privacy.
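For reference, this is roughly the dance for getting at the encrypted volume from the other Mac. I'm assuming FileVault 2 here, i.e. a CoreStorage volume (a legacy FileVault-encrypted homedir is a different beast), and the UUID is obviously a placeholder you pull from the first command:

# list CoreStorage volumes on the attached drive and note the logical volume's UUID
diskutil corestorage list
# unlock and mount it; note that -passphrase will land in your shell history
diskutil corestorage unlockVolume LV-UUID-GOES-HERE -passphrase 'your passphrase'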

Ultimately, the transfer took so long that I ended up convincing Apple to simply take the trade-in with the promise of giving them the old drive. I'm sure someday I'll get a call about that…

My other MacBook Pro has suffered significantly less, with the only real problem being somewhat dodgy internal fans. I've had both fail to date. After the headache of getting the first fan replaced, when the second fan bailed I decided to skip the hassle and simply pull it out and run the machine without it. That worked "fine" for about a year, minus the box getting a little warm. As it turns out, the fan is important (but apparently not important enough to use a quality part), and months of Texas heat with no internal fan have caused the display to stop working.

Luckily for me, the video output from the laptop continues to work, so I've simply hooked it up to my KVM and now have a shiny new 'desktop.' The only problem I've run into with this setup is that the MBP doesn't seem to have an option to run with the lid closed. The only workaround I've found is to power the machine on and, nearly instantly, close the lid. This seems to work, though as I switch between ports on my KVM the box occasionally loses track of the keyboard.

 


Displaying the current repo in your prompt

I tend to nest repos a lot in my usual workflow (the common elements are generally ~/code/ext/[repo], but with the pull_ext infrastructure I wrote, more than five levels of nesting are not uncommon).

To work around this, I wanted my prompt to highlight the root of the current repository and also tell me what type of repo it is.

vcs_info does this, but I find the configuration quite inflexible, and some of the tests are quite expensive, so I decided to quickly hack something together.

I arrived at a small function which recurses back from the current directory, looks for the metadata of the four VCS systems that I use, and exports a character for my prompt and for the colored PWD. A rough sketch of the idea is below.
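This isn't the original script, just a minimal reconstruction of the approach; the marker characters and the REPO_CHAR/REPO_ROOT variable names are mine:

# walk up from $PWD until we find VCS metadata, then export a marker and the repo root
repo_prompt_info() {
    local dir="$PWD"
    REPO_CHAR="" REPO_ROOT=""
    while [ "$dir" != "/" ]; do
        [ -d "$dir/.git" ] && REPO_CHAR="±" REPO_ROOT="$dir" && return
        [ -d "$dir/.hg" ]  && REPO_CHAR="☿" REPO_ROOT="$dir" && return
        [ -d "$dir/.svn" ] && REPO_CHAR="⚡" REPO_ROOT="$dir" && return
        [ -d "$dir/.bzr" ] && REPO_CHAR="↯" REPO_ROOT="$dir" && return
        dir="${dir%/*}"
        [ -z "$dir" ] && dir="/"
    done
}

# zsh: only re-run it on directory changes
chpwd_functions+=(repo_prompt_info)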

Then I only run this on directory changes to keep things quick (which does mean that `git init .` won't update the prompt until you shuffle about a bit).

The net result is this:

[screenshot: repo aware prompt]
