Live Capture Plus and QuickTime for Java

One of the particular challenges of video preservation is how to handle and preserve the content on digital tape formats from the latter days of magnetic A/V media: Digital Betacam, DVCam, DVCPro, etc. Caught in the nebulous time of transition between analog and digital signals (the medium itself, magnetic tape, is basically the same as in previous videotape formats like VHS or Betacam – but the information stored on it was encoded digitally), these formats were particularly popular in production environments, though there were plenty of prolific consumer-grade efforts as well (MiniDV, Digital8). In some ways, this makes transferring content easier than handling analog formats: there is no “digitization” involved, no philosophical-archival conundrum of how best to approximate an analog signal as a digital one. One simply needs to pull the digital content off the magnetic tape intact and get it onto a modern storage medium (hard disk, solid-state, or maybe LTO, which yes I know is still magnetic tape but pay no attention to the man behind the curtain).

https://twitter.com/dericed/status/981965351482249216

However, even if you still have the proper playback deck, and the right cables and adapters to hook up to a contemporary computer, there’s the issue of software – do you have a capture application that can communicate properly with the deck *and* pull the digital video stream off the tape as-is?

That last bit is getting especially tricky. As DV-encoded formats in particular have lost popularity in broadcast/production environments, the number of applications that can import and capture DV video without transcoding (that is, changing the digital video stream in the process of capture), while staying compatible with contemporary, secure operating systems/environments, has dwindled. That’s created a real conundrum for a lot of archivists. Apple’s Final Cut Pro application, for instance, infamously dropped the ability to capture native DV when it “upgraded” from Final Cut Pro 7 to Final Cut Pro X (you can still hook up and capture tapes, but Final Cut Pro X will automatically transcode the video to ProRes). Adobe Premiere will still capture DV and HDV codecs natively, but will re-package the stream into a .mov QuickTime wrapper (you can extract the raw DV back out, though, so this is still a solid option for many – though for just as many more, an Adobe CC subscription is beyond their means).
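
(For reference, pulling the DV stream back out of one of those Premiere-made QuickTime wrappers is a one-liner with ffmpeg – this is just a sketch with placeholder filenames, but since DV audio travels embedded inside the DV video stream itself, a straight stream copy into a bare .dv file should lose nothing:)

$ ffmpeg -i premiere_capture.mov -f rawvideo -c:v copy premiere_capture.dv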

One of the best options for DV capture is (was?) a Mac application called Live Capture Plus, made by Square Box Systems as part of its CatDV media suite. It has great options for error handling (e.g. automatically re-reading a problem area of a tape multiple times if there’s dropout), generating DV files based on clips or scenes or timecode rather than the whole tape, remote tape deck control over the FireWire/Thunderbolt connection, etc. – a bunch of ingest-oriented features that appeal more to an archivist than anything in an application primarily meant for editing, like Adobe Premiere. It also talks to you, which is fun but also terrifying.

Failed to power up

However, Square Box removed Live Capture Plus from its product list some years back, and as far as I’m aware has refused all pleas to either open-source the legacy code or even continue to sell new licenses to those in the know.

Let’s say you *are* lucky enough to still have an old Live Capture Plus license on hand, however. The Live Capture Plus GUI is built on Java – but an older, legacy version of Java – so when you first try to run the app on a contemporary OS (~10.10 and up), you’ll see this:

Luckily, for at least the moment, Apple still offers/maintains a download of this deprecated version of Java – just clicking on “More Info…” in that window will take you there, or you can search for “Java for OSX” to find the Apple Support page.

OK, so you’ve downloaded and installed the legacy Java for OSX. Yet this time, when you try to run Live Capture Plus, you run into this fun error message instead:


All right. What’s going on here?

When I first encountered this error, even though I didn’t know Java, the message provided two clues: 1) “NoClassDefFoundError” – so Java *can’t find* some piece that it needs to run the application correctly; and 2) “quicktime”/”QTHandleRef” – so it specifically can’t find some piece that relates to QuickTime. That was enough to go on a search engine deep dive, where I eventually found this page, where researchers at the University of Wisconsin-Madison’s zoology/molecular biology lab apparently encountered and solved a similar issue with a piece of legacy software related to, near as I can figure from that site, taking images of tiny tiny worms. (I desperately want to get in touch and propose some sort of panel with these people about working with legacy software, but am not even sure what venue/conference would be appropriate.)

The only way my ’90s-kid brain can understand what’s happening

So basically, recent versions of Mac OS have not included key files for a plugin called “QuickTime for Java” – a deprecated software library that allowed applications built in/on Java (like Live Capture Plus) to provide multimedia functionality (playback, editing, capture) by piggybacking on the QuickTime application’s support for a pretty wide range of media formats and codecs (including DV). Is this fixable? If you can get those key files, yes!

For now, both the downloads and instructions for what to do with these three files are available on that Hardin Lab page, but I’m offering them here as well. The fix is pretty quick:

 

I would only note a couple of things: first, that I’ve successfully installed these on macOS Sierra (10.12) without needing to mess with the System Integrity Protection settings, by just installing for the local user (i.e. putting the files into ~/Library rather than /System/Library – see the command-line sketch below); and second, that if you want to do this via Finder rather than the command line, here is how to get to the “View Options” box that reveals the Library folder in Finder, as mentioned in Step 1 above (a useful trick in general, really, if you’re up to digipres shenanigans):

Once these files are in place, Live Capture Plus should open correctly and be able to start communicating with any DV/FireWire-capable deck that you’ve got connected to your Mac – again, provided that you’ve got a registration code to enter at this point.
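
(For the command-line inclined, the local-user route is just copying the downloaded files into the user-level equivalent of wherever the Hardin Lab instructions point under /System/Library – a sketch only, with the filenames here as stand-ins for whatever the download actually contains, and ~/Library/Java/Extensions as my assumption for the destination folder:)

$ mkdir -p ~/Library/Java/Extensions
$ cp ~/Downloads/QTJava.zip ~/Downloads/libQTJNative.jnilib ~/Library/Java/Extensions/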

A final word of warning, however. Live Capture Plus comes from the era of 32-bit applications, and we’re now firmly in the era of 64-bit operating systems. Exactly what all that means is probably the subject of another post, but basically it’s just to say that legacy 32-bit apps weren’t made to take advantage of modern hardware, and may run slower on contemporary computers than they did on their original, legacy hardware. Not really an issue when you’re in video/digital preservation and your entire life is work-arounds, but recently Mac OS has taken to complaining about 32-bit apps:

Despite 32-bit apps posing, near as I can tell, no actual security or compatibility concerns for 64-bit OSes (they just can’t take advantage of more than 4 GB of RAM), this is a pretty heavy indication that Apple will likely cut off support for 32-bit apps entirely sometime in the not-so-distant future. And that will go not just for Live Capture Plus, but for other legacy apps capable of native DV transfer (Final Cut 7, the DVHSCap utility from the FireWire SDK, etc.).

So go get those DV tapes transferred!!!!

Upgrading Video Digitization Stations

In the primary MIAP lab we have four Mac Pro stations set up mainly for video digitization and capture. They get most heavily used during our two Video Preservation courses: Video Preservation I, which focuses on the technical principles and practice of digitization from analog video sources, and Video Preservation II, which focuses more on vendor relations and guiding outsourced mass digitization projects, but by necessity involves a fair amount of digital video quality control/quality assurance as well. They also get used for assorted projects in Collections Management, the “Talking Tech” workshops I’ve started leading myself, and the Cinema Studies department’s archive.

Over the course of 2016, the hardware on these four stations was really starting to show its age. These machines were originally bought and set up in 2012 – putting them in the last generation of the older “tower”-style silver Mac Pro desktops, before Apple radically shifted its hardware design to the “trash bin”-style Mac Pros that you can buy today. The operating system hadn’t been updated in a while either: they were still running Mac OSX 10.10 (Yosemite), whose last full update came in August 2015 (with a few security updates still following, at least).

maxresdefault
This guy isn’t allowed in anymore, for instance.

These stations were stable – at least in the sense that all the software we needed definitely worked, and they would get the job done of digitizing/capturing analog video. But the limitations on how quickly and efficiently they could do this work were more and more apparent. The amount of time it took to, say, create a bag out of 200 GB of uncompressed video, transcode derivative copies, run an rsync script to back video packages up to a local NAS unit, or move the files to/from external drives (a frequent case, as both Video Preservation classes usually partner with other cultural organizations in New York City, who come to pick up their newly-digitized material via hard drive) was getting excruciating relative to newer systems, wasting class time and requiring a lot of coordination/planning of resources as ffmpeg or rsync chugged along for hours, or even overnight.
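
To give a concrete (if simplified) sense of what that chugging looks like, a typical end-of-capture session boils down to commands along these lines – the paths and filenames here are made up, and the exact flags vary by project:

# package ~200 GB of captures into a bag (checksums everything, slowly)
$ bagit.py /Volumes/Station_Storage/VHS_Batch_01
# transcode an access derivative from an uncompressed master
$ ffmpeg -i vhs_tape_01.mov -c:v libx264 -pix_fmt yuv420p -c:a aac vhs_tape_01_access.mp4
# back the whole package up to the NAS
$ rsync -rtv --progress /Volumes/Station_Storage/VHS_Batch_01 /Volumes/MIAP_NAS/backups/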

So, I knew it was time to upgrade our stations. But how to go about it? There were two basic options:

1. Purchase brand-new “trash bin” Mac Pros to replace the older stations

pratttrashcan_macpro
http://rudypospisil.com/wordpress/wp-content/uploads/2013/10/prattTrashCan_macPro.jpg

2. Open up the innards of the old Mac Pros and swap in updated, more powerful components

Buying brand-new Windows stations was basically out, just given the way our classes have been taught, the software we work with, and my own personal knowledge/preference/ability to maintain hardware. And I was lucky that #1 was even an option at all – the considerable resources available at NYU allow for choices that I would not have in many other places. But MIAP also has a lot of equipment needs, and I’d generally rather put larger portions of our budget toward harder-to-get analog video equipment and refurbishment than jump for splashy new hardware that we don’t actually need. So I drew up some thoughts on what I actually wanted to accomplish:

  • improved data transfer rate between desktops and external drives (the fastest connection available, at best, was the mid-2012 Mac Pro’s native FireWire 800 ports; and many times we were limited to USB 2.0)
  • improved application multi-tasking (allow for, say, a Blackmagic Media Express capture to run at the same time as the ffmpeg transcode of a previous capture)
  • improved single-application processing power (speed up transcoding, bag creation and validation, rsync transfer if possible)
  • update operating system to OSX 10.11 (El Capitan, a more secure and up-to-date release than Yosemite and MUCH more stable than the new 10.12 Sierra)
  • maintain software functionality with a few older programs, especially Final Cut 7 or equivalent native-DV capture software

After consulting with adjunct faculty, a few friends, and the good old internet, it became clear that a quick upgrade by way of just purchasing new Mac Pros would pose several issues. First, the Blackmagic Decklink Studio 2 capture cards we used for analog video digitization would not be compatible, requiring additional purchases of stand-alone Blackmagic analog-to-digital converter boxes on top of the new desktops to maintain current workflows. It is also more difficult to cheaply upgrade or replace the storage inside the newer Mac Pros, again likely requiring the eventual purchase of stand-alone RAID storage units to keep up with the amount of uncompressed video being pumped out; whereas the old Mac Pro towers have four internal drive slots whose drives can be swapped in and out within minutes, with minimal expertise, and easily arranged into various internal RAID configurations.

In other words, I decided it was much cheaper and more efficient to keep the existing Mac Pro stations, which are extremely flexible and easy to upgrade, and via new components bring them more or less up to speed with what completely new Mac Pros could offer anyway. In addition to the four swappable storage slots, the old Mac Pro towers feature easy-to-replace RAM modules, and PCI expansion slots on the back that offer the option to add extra data buses (i.e. more USB, eSATA, or Thunderbolt ports). You can also update the CPU itself – but while adding a processor with more cores would in theory (if I understand the theory, which is also far from a 100% proposition) be the single biggest boost to improving/speeding up processing, the Intel Quad-Core processors already in the old towers are no slouch (the default new models of the Mac Pro still ship with quad-core processors), and a CPU swap would be more expensive and difficult than replacing those other pieces. Again, it seemed more efficient, and safer given my limited history with building computer hardware, to incrementally upgrade all the other parts, see what we’re working with, and someday in the future step up the CPU if we really, desperately need to breathe more life into these machines.

So, for each of the four stations, here were the upgrades made (listed as the general upgrade, followed by the specific model/pricing I found; for any of these you could pursue other brands/models/sellers as well):

  • (1) 120 GB solid-state drive (for operating system and applications)

OWC Mercury Extreme Pro 6G SSD: $77/unit
OWC Mount Pro Drive Sled (necessary to mount SSDs in old Mac Pros): $17/unit

  • (1) 1 TB hard drive (for general data storage – more on this later)

Western Digital Caviar Blue 1 TB Internal HDD: $50/unit

  • (1) PCI Express Expansion Card, w/ eSATA, USB 3.0 and USB 3.1 capability

CalDigit FASTA-6GU3 Plus: $161/unit

  • (4) 8 GB RAM modules, for a total of 32 GB

OWC 32.0 GB Upgrade Kit: $139/unit

IMG_2839.JPG
Swaaaaaaaaaaag

Summed up, that’s less than $500 per computer and less than $2000 for the whole lab, which is a pretty good price for (hopefully) speeding up our digitization workflow and keeping our Video Preservation courses functional for at least a couple more years.
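
(Itemized, that’s $77 + $17 + $50 + $161 + $139 = $444 in parts per station, or $1,776 across all four.)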

The thinking: with all that RAM, multi-tasking applications shouldn’t be an issue, even with higher-resource applications like Final Cut 7, Blackmagic Media Express, ffmpeg, etc. With the OSX El Capitan operating system and all applications hosted on solid-state memory (the 120 GB SSD) rather than a hard drive, single applications should run much faster (no more waiting for a platter to literally spin around to find application or system data). And by buying a new 1 TB hard drive for each computer, the three non-OS drive slots on each computer are now all filled with 1 TB hard drives. I could then have two of those configured in a RAID 0 stripe arrangement, to increase the read and write speed of user data (i.e. video captures) – the third drive can serve as general backup or as storage for non-video digitization projects, as needed.

IMG_2843.JPG
RAM for days
IMG_2854.JPG
*Oh what fun it is to ride in a one-120-GB-solid-state-drive open sled*

IMG_2855.JPG

IMG_2856.JPG

The expansion cards will now allow eSATA- or USB 3.0-speed transfers to compatible external drives. The USB 3.1 function on the specific CalDigit cards I got won’t work unless I upgrade the operating system to 10.12 Sierra, which I don’t want to do just yet. That’s basically the one downside compared to the all-new Mac Pros, which would’ve offered Thunderbolt transfer speeds faster than USB 3.0 – but for now, USB 3.0 is A) still a drastic improvement over what we had before, B) probably the most common connection on the consumer external drives we see anyway, and C) a placeholder, since an inevitable operating system upgrade will “unlock” the USB 3.1 capability to keep up as USB 3.1 connections become more common on external drives.

IMG_2849.JPG
Uninstalled…
IMG_2852.JPG
…installed! Top row – followed by two slots for the Blackmagic Decklink Studio 2 input/output and the AMD Radeon graphics card input/output at the bottom.

Installing all these components was a breeze. Seriously! Even if you don’t know your way around the inside of a computer at all, the old Mac Pro towers were basically designed to be super customizable and easy to swap parts in and out of, and there are tons of clear, well-illustrated instructional videos available to follow.

[vimeo 139648427 w=640 h=360]

As I mentioned in a previous post about opening up computers, the main issue was grounding. Static discharge damaging the internal parts of your computer is always a risk when you go rooting around and touching components, and especially since the MIAP lab is carpeted, I was a bit worried about accidentally frying a CPU with my shaky, novice hands. So I also picked up a $20 computer repair kit that included an anti-static wristband, which I wore while removing the desktops from their station mounts, cleaning them out with compressed air, and swapping in the mounted SSDs, new HDDs, expansion cards, and RAM modules.

IMG_2841.JPG

With the hardware upgrades completed, it was time to move on to software and RAID configuration. Using a free program called DiskMaker X6, I created a bootable El Capitan install disk on a USB stick (to save the time of having to download the installer to each of the four stations separately). Booting straight into this installer program (by plugging in the USB stick and holding down the Option key when turning on a Mac), I was able to quickly go through the process of installing OSX El Capitan onto the SSDs. For now that meant I could theoretically start up the desktop from either El Capitan (on the SSD) or Yosemite (still hosted on one of the HDDs) – but I wanted to wipe all the storage and start from scratch here.
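
(If you’d rather not use a third-party app for the install disk, the El Capitan installer itself ships with Apple’s createinstallmedia tool, which does roughly the same thing from the Terminal – a sketch, assuming the installer app is sitting in /Applications and the USB stick is mounted as “Untitled”:)

$ sudo /Applications/Install\ OS\ X\ El\ Capitan.app/Contents/Resources/createinstallmedia --volume /Volumes/Untitled --applicationpath /Applications/Install\ OS\ X\ El\ Capitan.app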

I accomplished that clean start using Disk Utility, the built-in program for drive management included with OSX. Once I had backed up all important user data from all the hard drives, I completely reformatted all of them (sticking with the default Mac OS Extended (Journaled) formatting), including the brand-new 1 TB drives. So now each station had an operating-system SSD running El Capitan and three blank 1 TB hard drives to play with. As mentioned earlier, I wanted to put two of those in a RAID 0 data stripe arrangement – a way of turning two separate drives into one logical “volume”. RAID 0 is a mildly dangerous arrangement in that the failure of either one of those drives means total data loss; but it brings a significant performance boost in read/write speed (hopefully decreasing the likelihood of dropped frames during capture, improving time spent on fixity checks and bagging, etc.), while maintaining a total of 2 TB of storage on the drives (most RAID arrangements, focused more on data security and redundancy than on performance, will result in a total amount of storage on the volume less than the capacity of the physical disks). Besides, files are not meant to be stored long-term on these stations: they are either returned to the original institution, backed up to the more secure, RAID 6-arranged NAS, or backed up to our department’s fleet of external drives – if not some combination of those options.

So it was at this point that I discovered that in the upgrade from Yosemite to El Capitan, Apple actually removed functionality from the Disk Utility application. The graphic interface for Disk Utility in Yosemite and earlier versions of OSX featured an option to easily customize RAID arrangements with your drives. In El Capitan (and, notably, El Capitan only – the feature has returned in Sierra), you’re only allowed to erase, reformat and partition drives.

jboddiskutilityyosemite-577aac645f9b58587592afb8

screen-shot-2017-03-02-at-11-07-37-am
Cool. Cool cool cool.

Which means to the Terminal we go. The command-line version of Disk Utility (invoked with the “diskutil” command) can still quickly create and format a RAID volume. First, I have to run a quick

$ diskutil list

…in order to see the file paths/names for the two physical disks that I wanted to combine to create one volume (previously named MIAP_Class_Projects and Station_X_Backup):

screen-shot-2017-03-02-at-11-08-24-am

In this case, I was working with /dev/disk1 and /dev/disk3. Once I had the correct disks identified, I could use the following command:

$ diskutil appleRAID create stripe RAID_Volume JHFS+ disk1 disk3

Let’s break this down:

diskutil – command used to invoke the Disk Utility application

appleRAID – option to invoke the underlying function of Disk Utility that creates RAIDs – it’s still there, they just removed it from the graphical version of Disk Utility in El Capitan for some reason ¯\_(ツ)_/¯

create stripe – tells Disk Utility that I want to create a RAID 0 (striped) volume

RAID_Volume – a working name for the new striped set (the command expects a name here; this one is just a placeholder, and as you’ll see below I renamed the volume afterwards anyway)

JHFS+ – tells Disk Utility I want the striped volume to be formatted using the journaled HFS+ file system (the default Mac OS Extended (Journaled) formatting)

disk1 disk3 – lists the two drives, with the names taken from the previous command above, that I want to combine for this striped volume

Note: Be careful! When working with Disk Utility, especially in the command line, be sure you have all the data you want to keep properly backed up. You can see how easily you could wipe/reformat the wrong disks just by putting the wrong disk number into the command.
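
Once the command runs (or any time afterwards), you can ask diskutil to report back on the set it created and confirm everything looks right:

$ diskutil appleRAID list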

End result: two physical disks combined to form a 2 TB volume, renamed to MIAP_Projects_RAID:

Screen Shot 2017-03-02 at 11.15.37 AM.png
The 2 TB RAID volume, visible in the GUI of Disk Utility – note that the two physical drives are still listed in the “Internal” menu on the left, but without nested logical volumes beneath them, unlike the SSD with its El Capitan OS volume, or the WDC hard drive with the “CS_Archive_Projects” volume.

Hooray! That’s about it. I did all of this with one station first, which allowed me the chance to reinstall all the software, both graphical and CLI, that we generally use in our courses, and to test our usual video capture workflows. As mentioned before, my primary concern was that older native-DV capture software like Final Cut 7 or Live Capture Plus would break, given that official OS support for those programs ended a long time ago, but near as I can tell they still work in El Capitan. That’s no guarantee, but I’ll troubleshoot more when I get there (and keep around a bootable USB stick with OSX 10.9 Mavericks on it, just in case we have to revert to an older operating system to capture DV).

Screen Shot 2017-03-02 at 11.13.17 AM.png
In order to not eat up space on the 120 GB SSD operating system drive, I figured this was advisable.

I wish that I had thought to actually run some timed tests before I made these upgrades, so that I would have some hard evidence of the improvement in processing power and in time spent on transcoding, checksumming, etc. But I can say that hosting the operating system and applications on solid-state memory, and the USB 3.0 transfer speeds to external drives, have certainly made a difference even to the unscientific eye. It’s basically like we’ve had brand-new desktops installed – for a fraction of the cost. So if you’re running a video digitization station, I highly recommend learning your way around the inside of a computer and its different components – whether you’re on PC or still working with the old Mac Pro towers, just swapping in some fresh innards could make a big difference and save the trouble and expense of all-new machines. I can’t speak to working with the new Mac Pros, of course, but would be very interested to hear from anyone using those for digitization as to their flexibility – for instance, if I didn’t already have the old Mac Pros to work with, and had been completely starting from scratch, what would you recommend? Buying the new Pros, or hunting down some of the older desktop stations, for the greater ability to customize them?
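
And for anyone who does want the hard evidence I neglected to gather, even something this blunt, run before and after an upgrade, would give you comparable numbers (the file and volume names here are placeholders, obviously):

# time a fixity check on a single large capture file
$ time md5 /Volumes/Station_Storage/big_capture.mov
# time a copy of the same file out to an external drive
$ time cp /Volumes/Station_Storage/big_capture.mov /Volumes/External_Drive/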

The A/V Cable Bible

Once upon a time, the cables in the MIAP lab were organized. I have a spreadsheet and everything to prove it. Video cables were neatly separated from audio-only cables, while patch cables and computer cables inhabited their own particular niches. Order and stability reigned.

Was it a long, slow descent into chaos over the years, as various staff, faculty and students picked one cable off from its carefully arranged post and then flung it back, willy-nilly, into the fray? Was I trolled by the Joker?

cables

Who can say. But for a while now, XLR cables have fraternized with 9-pin remote data controllers, USB 2.0 co-mingling with S-Video. Madness and miscegenation, I say.

It didn’t take long to wrangle our peg board cable storage units back into something resembling structure. Start from scratch, sort by type (video, audio, patch, data), make sure every single cable is at least adequately coiled and has its own tie, and we’ve made it so that at least I will be able to find, say, a 1/4″ TRS jack-to-male RCA cable if I so need one. The addition of labels to the peg board is nominally so that other people, too, might be able to find what they need without too much of a head-scratching search – but the truth is it’s just as much to scare people into realizing that there is, in fact, SOME sort of system in place.

IMG_2207

 

IMG_2208
Progress, of a sort

Why am I talking about a minor organization project that, really, only took about an hour of work? Well, this bout of spring cleaning inspired a sort of The More You Know moment – one of my attempts at self-education that comes (constantly) with the territory. For about 90% of the cables on the board, I was already familiar with their purpose and nomenclature – for instance, that a BNC-to-video-patch cable could be used to hook up the composite video output from one of our auxiliary decks (DVCam, S-VHS, Betamax, various formats that we don’t use enough to merit being mounted into our more permanent racks) into our digitization workflow. There were others that I had never used, though, and I was somewhat flummoxed as to what equipment they were used with or even what kind of signal they were intended to carry. This one in particular was an old-school stumper:

IMG_2232
And also weighs about as much as a tree stump, by the way

So I started to write up a guide, originally intended for my own use. A Cable Bible, if you will, detailing all (or at least the most common) connections that I would have to make and some context for what kinds of cables tend to appear where. Because once I started looking up some of these things, I realized what a rabbit hole A/V cabling is. See, every cable you use in media production and preservation is actually a jumbled design of wire material, signal types, encodings and interfaces, physical connectors, etc. etc. You can say you have a hard drive you need to connect to a computer and you need a USB cable, but what does that really mean? Is the interface USB 2.0, 3.0 or 3.1? What kind of connecting port does the computer have – Type A? Type B? Type C? Or the hard drive itself – is its port Type Micro-A? Micro-B? Micro-B SuperSpeed?

usbc-connector-explained-150310b
I mean this isn’t even comprehensive!

There’s also the fact that the standard for discussing cable connections is to differentiate between connectors with pins as “male” and connectors with ports as “female.” For instance, to recently set up a DVCam deck for digitization, I needed a female RCA-to-male XLR adapter in order to get a male-to-male RCA cable to hook up with a female XLR port. Perhaps I’m just being phenomenally PC about this, but I’ve never cared for the way I sound vaguely like a pervy teenager when I’m just trying to make a video deck play. And I can’t help but think that only a historically male-dominated profession would decide to make such a common part of its work a giant dick joke.*

41-ngs9jfnl-_sx300_
I know archives are sexy but come on guys.

When all’s said and done, I hope the Cable Bible will be a shareable document, easily navigable by categories that consider the broad purpose of the connection before burrowing down into the specific protocols and connectors necessary for particular equipment (with examples listed). Do you need to make a video or audio connection? If video, does it need to carry an analog or digital signal? If analog, is the signal composite or component? If component, how are the luminance and chrominance portions of the signal divided? Into two channels, Y and C? Then you’re looking for an S-Video cable, available with mini-DIN 4-pin connectors, or perhaps a SCART connector if you’re dealing with European equipment. Is it in three channels, Y, Pb and Pr? Then you’re looking for “component” cables, the more familiar three-pronged cable(s) traditionally available with red, green and blue RCA connectors, though professional or broadcast equipment might require BNC instead (or, again in Europe, SCART).

300px-scart_20050724_002
When it came to analog video, Europe was crazypants.

Or maybe the Cable Bible will let you work backwards instead: let’s take, for instance, that mystery monster of a cable I found while reorganizing the lab. What we have at one end is an 8-pin monitor connector, and at the other, a 5-pin DIN and two UHF connectors. UHF was a WWII-era connection originally developed for conveying radio frequency signals (including video) within a certain range; DIN 5-pin had some applications in professional audio. The 8-pin connector was a very specific protocol intended to carry analog video output and input over the same connection. So what we have here is likely a cable intended to carry both the analog composite video and the analog, unbalanced audio signal back and forth between a very old video monitor and a very old video deck (at a guess, possibly 1/2″ open reel).

IMG_2231
An 8-pin monitor connection on the back of a Sony 3/4″ U-matic deck – I don’t have any equipment that would take the UHF/DIN end of the cable.

Even as I work on this blog post, the Cable Bible is spiraling slightly out of control in the Google Doc where I started it, as I add pictures and pinouts. It’s possible that this project would work better as some sort of Wiki, where one could more easily navigate the different layers that cables operate on and branch out to explore from there – others would also potentially be able to contribute and document more obscure or unique types of connections and protocols. Much of this information already exists on the internet, even on Wikipedia itself – but it’s a matter of consolidation and making it at least somewhat specific to archival context. I will update everyone if anything public develops!

vector-shapes-audio-video-connectors
Something like this is just the start.

 

*Upon writing this I went back and remembered that I opened this very post with a fairly unnecessary, though (I hope) innocuous miscegenation joke. I’m keeping it in, as this blog is meant to be an honest reflection of my thoughts, and this little contradictory exchange with myself is evidence that even self-reflection and good intentions don’t change the fact that I am still a straight cis white man in tech, and that these are the sort of weird little obstacles that still mine the field for others.