I wrote a small app in OpenFrameworks, built on a few public addons, that reads the live audio input and includes an accurate BPM tapper. This data is then fed over OSC to my Unity application, where the visuals are rendered. On the Unity side I use two cameras per 'deck', one foreground and one background, which are blended together. There are two decks, with a crossfader to fade between them. I perform live with this system twice a month.
At the moment the pictures on this page are old (!), as are the videos linked below. I'm currently building a new VJ'ing system from scratch, and I'll post screenshots once it's further along in development.