braindump on reversing the epiphan vga2usbLR to get it working on my system (debian stretch, kernel 4.6.xx), which is not supported by the free, but alas not Free, driver provided by Epiphan.
- my first endeavour in libusb
- gstreamer ftw
capturing usb data
i installed the official driver on a debian jessie machine, and worked from there:
usbmon text interface
easy capturing of usb packets, especially in combination with vusb-analyzer, which gives a graphical timeline view of the packets ( http://vusb-analyzer.sourceforge.net )
sudo modprobe usbmon
lsusb # and look which bus your device is assigned to
# eg "Bus 003 Device 003: ID 5555:3340 Epiphan Systems Inc. VGA2USB LR"
cat /sys/kernel/debug/usb/usbmon/3u > capture.mon
# mind the file extension is important here, vusb-analyzer needs this...
the usbmon text interface is limited in its use as it _only provides the first 32 bytes of data_ per usb packet (URB); for the full payloads we'll need usbmon's binary interface, which is luckily supported by wireshark:
this gives full inspection of each packet, but it's easy to get lost in the sheer amount of dumped data without the overview of vusb-analyzer.
for the binary dumps of data i exported to PDML (an xml format that contains the dissected packets including the full payloads) and wrote a small hackish python parser (thank you for xml.etree.ElementTree's XPath support, my dear! xml got a lot more bearable with this).
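a minimal sketch of such a parser, assuming the payload sits in wireshark's `usb.capdata` field with the hex string in the `value` attribute (the field name is how my tshark PDML export looked; yours may differ):

```python
import xml.etree.ElementTree as ET

def extract_payloads(pdml_path):
    """Collect the raw payload bytes of every USB packet in a PDML export."""
    tree = ET.parse(pdml_path)
    payloads = []
    # XPath: every <field name="usb.capdata"> anywhere below the root;
    # its 'value' attribute holds the payload as one long hex string
    for field in tree.getroot().iterfind(".//field[@name='usb.capdata']"):
        payloads.append(bytes.fromhex(field.get("value")))
    return payloads
```

the `[@name='...']` predicate is the part that makes ElementTree's limited XPath bearable here: no manual walking of `<proto>`/`<field>` nesting needed.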
went through some iterations of working with synchronous communication (control transfers) and then moved to asynchronous bulk transfers to get things running smoothly.
a no-nonsense library, was excellent to work with.
the toolbox of choice for all media transcoding / streaming. I recently played around with snowmix and noticed its use of shared memory (shmsrc) to transfer frame buffers from one process to another, so I got into setting this up, only to find out shared memory is only one side of the coin: it also needs a unix socket to send synchronous updates between the processes (arrival of a new frame, offset/buffersize). shared memory itself is easy enough with mmap, and well documented. I didn't feel like adding the socket handling after finding this out, so I solved it with a hack, using socat...
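the mmap half really is the easy part; a sketch of the writer side, mapping a file under /dev/shm that any reader process can map as well (the path and frame size are made-up placeholders, and the "new frame ready" signalling still has to happen over the unix socket as described above):

```python
import mmap
import os

FRAME_SIZE = 1024 * 768 * 3  # hypothetical RGB frame size, not the device's real geometry

def open_frame_buffer(path="/dev/shm/vga2usb_frame"):
    """Create (or reuse) a plain file and map it into memory.

    Any other process mapping the same file sees the writes immediately;
    mmap on Unix maps shared by default, so no extra flags are needed.
    """
    fd = os.open(path, os.O_CREAT | os.O_RDWR, 0o600)
    os.ftruncate(fd, FRAME_SIZE)        # reserve room for one frame
    buf = mmap.mmap(fd, FRAME_SIZE)     # shared, read/write mapping
    os.close(fd)                        # the mapping keeps the file usable
    return buf

# writer side: blit a captured frame into the shared buffer
# buf = open_frame_buffer()
# buf[:len(frame)] = frame
```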
i've got no real clue, basically: i grabbed the data and replayed it using libusb. i got some things working by playing with offsets (capturing one frame requires 4 consecutive calls, each transferring 1028x204 pixels at a different offset).
so there's still work to do...
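my working guess is that the four offset calls each deliver one band of the frame; a sketch of stitching the replayed buffers back together (the 1028x204 geometry is what the captures show, but the bytes-per-pixel and the simple top-to-bottom stacking are guesses on my part):

```python
CHUNK_W, CHUNK_H = 1028, 204  # per-call geometry seen in the captures
BPP = 2                       # bytes per pixel: a guess, depends on the pixel format

def assemble_frame(chunks):
    """Stitch the 4 consecutive transfer buffers into one frame.

    Assumes each call returns one horizontal band of CHUNK_H lines and
    that the bands stack top to bottom; the real offsets in the capture
    may well interleave differently.
    """
    band = CHUNK_W * CHUNK_H * BPP
    frame = bytearray(band * len(chunks))
    for i, chunk in enumerate(chunks):
        frame[i * band:(i + 1) * band] = chunk[:band]
    return bytes(frame)
```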
here's my 'git archive --format tar.gz HEAD > vga2usb_libusb_hack.tar.gz': File:Vga2usb libusb hack.tar.gz
the 'initialize' script transfers a bunch of 'a0' request messages, uploading some configuration; after this the device resets the connection and shows its true identity;
the ./capture script dumps a lot of data to the device (some fpga bitstream? -- i've got no real clue; check the 'b9' requests in b9dump.inc) and then repeatedly dumps raw frame buffers to 'dumpXXX' files. these files can be processed with the ./img_convert script, which calls imagemagick's convert to kick them into shape.
more interesting is the ./capture_shm script. it sets up the unix-domain socket (using socat) and proceeds to call an adapted version of the above to write the captured binary data to shared memory. this shared memory can then be read by gstreamer's shmsrc (some gstreamer-1.0 examples are provided in ./gst_imgshow)