Dual-Boot a Windows Machine

It is an inconvenient truth that the MIAP program is spread across two separate buildings along Broadway. They’re only about five minutes apart, and the vast majority of the time this presents no problems for students or staff, but it does mean that my office and one of our primary lab spaces are in geographically separate locations. Good disaster planning, troublesome for day-to-day operations.

The Digital Forensics Lab (alternately referred to as the Old Media Lab or the Dead Media Lab, largely depending on my current level of frustration or endearment towards the equipment contained within it) is where we house our computing equipment for the excavation and exploration of born-digital archival content: A/V files created and contained on hard drive, CD, floppy disk, zip disk, etc. We have both contemporary and legacy systems to cover decades of potential media, primarily Apple hardware (stretching back to a Macintosh SE running OS 7), but also a couple of powerful modern Windows machines set up with virtual machines and emulators to handle Microsoft operating systems back to Windows 3.1 and MS-DOS.

Having to schedule visits from my office over to the main Tisch building in order to test, update, or otherwise work with any of this equipment is mildly irksome. That’s why my office Mac is chock full of emulators and other forensic software that I hardly use on any regular basis – when I get a request from a class for a new tool to be installed in the Digital Forensics Lab, it’s much easier to familiarize myself with the setup process right where I am before working with the legacy equipment; and I’m just point-blank unlikely to trek over to the other building for no reason other than to test out new software that I’ve just read about or otherwise think might be useful for our courses.

[Image: sleepy office worker at desk with multiple coffees]
#ProtestantWorkEthic

This is a long-winded way of justifying why the department purchased, at my request, a new Windows machine that I will be able to use as a testing ground for Windows-based software and workflows (I had previously installed a Windows 7 virtual machine on my Mac to try to get around some of this, but the sluggish performance of a VM on a desktop not explicitly set up for that purpose was vaguely intolerable). The first thing I was quite excited to do with this new hardware was to set up a dual-boot configuration: that is, make it so that on starting up the computer I would have the choice of using either Windows 7 or Windows 10. That is the main thing I’m going to talk about today.

[Photo]
Swag

Pretty much all of our Windows computers in the archive and MIAP program still run Windows 7 Pro, for a variety of reasons – Windows 8 was geared so heavily towards improved communication with and features for mobile devices that it was hardly worth the cost of upgrading an entire department, and Windows 10 is still not even a year old, which gives me pause in terms of the stability and compatibility of software that we rely on from Windows 7. So I needed Windows 7 in order to test how new programs work with our current systems. However, as it increases in market share and developers begin to migrate over, I’m increasingly intrigued by Windows 10, to the point that I also wanted access to it in order to test out the direction our department might go in the future. In particular I very much wanted to try out the new Windows Subsystem for Linux, available in the Windows 10 Anniversary Update coming this summer – a feature that will in theory make Linux utilities and local files accessible to the Windows user via a Bash shell (the command-line interface already seen on Mac and Ubuntu setups). Depending how extensive the compatibility gets, that could smooth over some of the kinks we have getting all our students (on different operating systems) on the same page in our Digital Literacy and Digital Preservation courses. But that is a more complicated topic for another day.

When my new Windows machine arrived, there was a warning right on the box: even though the computer came pre-installed with Windows 7, along with licenses and installation discs for both 7 and 10,

You may only use one version of the Windows software at a time. Switching versions will require you to uninstall one version and install the other version.


This statement is only true if you have no notion of partitioning, a process by which you can essentially divide your hard drive into distinct, discrete sections. The computer can treat separate partitions as separate drives, allowing you to format them with entirely different file systems or, as we will see here, install completely different operating systems.

Now, as it happens, it also turned out to be semi-true for my specific setup, but only temporarily and because of some kinks specific to the manufacturer who provided this desktop (hi, HP!). I’ll explain more in a minute, but right now would be a good point to note that I was working with a totally clean machine, and therefore endangering no personal files in this whole partitioning/installation process. If you also want to set up some kind of dual-boot partition, please please please make sure all of your files are backed up elsewhere first. You never know when you will, in fact, have to perform a clean install and completely wipe your hard drive just to get back to square one.

[Image]
“Arnim Zola sez: back up your files, kids!”

So, as the label said, booting up the computer right out of the box, I got a clean Windows 7 setup. The first step was to make a new blank partition on the hard drive, onto which I could install the Windows 10 operating system files. In order to do this, we run the Windows Disk Management utility (you can find it by hitting the Windows Start button and typing “disk management” into the search bar):

[Screenshot: Start menu search]

Once the Disk Management window popped up, I could see the 1 TB hard drive installed inside the computer (labelled “Disk 0”), as well as all the partitions (also called “volumes”) already on that drive. There were some small partitions containing system and recovery files (from which the computer could boot into at least some very basic functionality even if the Windows operating system were to become corrupted or fail), but most of the drive (~900 GB) was dedicated to the main C: volume, which contains all the Windows 7 operating system files, program files, personal files if there were any, etc. By right-clicking on this main partition and selecting “Shrink Volume,” I could set aside some of that space for a new partition, onto which we will install the Windows 10 OS. (Note: all illustrative screenshots were gathered after the fact, so some numbers won’t line up exactly here, but the process is the same.)
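(If you’d rather sanity-check the drive from the command line before touching anything, a few lines of Python with the third-party psutil library – my own aside here, not part of the Disk Management workflow – will report the same basic layout and free space. Note that psutil only sees volumes with drive letters, so those little system/recovery partitions won’t show up.)

```python
# Optional cross-check of what Disk Management shows.
# Requires the third-party psutil library: pip install psutil
import psutil

for part in psutil.disk_partitions():
    try:
        usage = psutil.disk_usage(part.mountpoint)
    except OSError:
        continue  # e.g. an empty CD/DVD drive
    print(f"{part.device:8} {part.fstype:6} "
          f"total={usage.total / 1e9:7.1f} GB  free={usage.free / 1e9:7.1f} GB")
```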

[Screenshot: Disk Management]

If you wanted to dual-boot two operating systems that use completely incompatible file systems – for instance, Mac and Windows – you would have to set aside space not only for the operating system’s files, but also for all of the storage you would want to dedicate to software, files, etc. However, Windows 7 and 10 both use the NTFS file system – meaning Windows 10 can easily read and work with files that have been created on or are stored in a Windows 7 environment. So in setting up this new partition I technically only had to create space for the Windows 10 operating system files, which run about 25 GB total. In practice I wanted to leave some extra space, just in case some software comes along that can only be installed on the Windows 10 partition, so I went ahead and doubled that number to 50 GB (since Disk Management works in MB, we enter “50000” into the amount of space to shrink from the C: volume).
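(The arithmetic behind that “50000” is nothing fancy, but one quirk is worth flagging: Windows counts in binary units even though it labels them MB/GB, so the new volume will likely show up as a hair under 49 GB rather than a round 50. A back-of-the-envelope sketch, using the rough 25 GB figure from above:)

```python
# Back-of-the-envelope sizing for the new partition (figures from the post).
win10_os_gb = 25                 # rough footprint of the Windows 10 system files
target_gb = win10_os_gb * 2      # doubled, to leave room for 10-only software

entered_mb = 50000               # the value typed into the Shrink Volume dialog
# Windows' "MB" are binary units, so the new volume should display as:
print(f"{target_gb} GB target -> enter {entered_mb} MB")
print(f"shown by Windows as roughly {entered_mb / 1024:.1f} GB")
```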

[Screenshot: Shrink Volume dialog]

Disk Management runs for a minute and then a new blank partition appears on Disk 0. Perfect! I pop in the Windows 10 installation disc that came with the computer and restart. In my case, the hardware automatically knew to boot from the installation disc (rather than the Windows 7 OS on the hard drive), but others might have to reset the boot order to start from the CD/DVD drive first, rather than the installed hard drive (this involves the computer’s BIOS or UEFI firmware interface – more on that in a minute – but for now, if it gives you problems, there are plenty of guides out there on the Googles).

Following the instructions for the first few parts of the Windows 10 installer is straightforward (entering a user name and password, a name for the computer, and suchlike), but I ran into a problem when finally given the option to select the partition onto which I wanted to install Windows 10. I could see the blank, unformatted 50 GB partition I had created, right there, but when I tried to select it, I was given this warning message:

Windows cannot be installed to this disk. The selected disk is of the GPT partition style.

Humph. In fact I could not select ANY of the partitions on the disk, so even if I had wanted to do a clean install of Windows 10 on to the main partition where Windows 7 now lived, I couldn’t have done that either. What gives, internet?

So for many many many years (in computer terms, anyway – computer years are probably at least equivalent to dog years), PCs came installed with a firmware interface called the BIOS – Basic Input/Output System. In order to install or reinstall operating system software, you need a way to send very basic commands to the hard drive. The BIOS was able to do this because it lived on the PC’s motherboard, rather than on the hard drive – as long as your BIOS was intact, your computer would have at least some very basic functionality, even if your operating system became corrupted or your hard drive had a mechanical failure. With the BIOS you could reformat your hard drive, select whether you booted the operating system from the hard drive or an external source (e.g. a floppy drive or CD drive), etc.

[Image]
Or rule a dystopian underwater society! …wait

In the few seconds after you first powered on a PC, the BIOS would look to the very first section of the hard drive, which (if already formatted) would contain something called a Master Boot Record: a table with information about the partitions present on that drive – how many partitions there are, how large each of them is, what file system is present on each, which one(s) contain bootable operating system software, and which partition to boot from first (if multiple partitions have a bootable OS).
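(If you’re curious what that table physically looks like, here’s a quick Python sketch that parses the classic MBR layout out of a drive’s first 512-byte sector – “mbr.img” is a hypothetical dump of that sector, not anything from my actual setup:)

```python
# Rough sketch: parse the classic MBR from the first 512 bytes of a disk image.
# "mbr.img" is a hypothetical dump of a drive's first sector.
import struct

SECTOR = 512

with open("mbr.img", "rb") as f:
    mbr = f.read(SECTOR)

assert mbr[510:512] == b"\x55\xaa", "no MBR boot signature found"

# The partition table is 4 entries of 16 bytes each, starting at offset 446 --
# hence the hard four-partition limit of the MBR scheme.
for i in range(4):
    entry = mbr[446 + i * 16 : 446 + (i + 1) * 16]
    boot_flag, ptype = entry[0], entry[4]
    lba_start, num_sectors = struct.unpack("<II", entry[8:16])
    if ptype == 0:
        continue  # unused slot
    print(f"partition {i + 1}: type=0x{ptype:02x} "
          f"bootable={'yes' if boot_flag == 0x80 else 'no'} "
          f"start={lba_start} sectors={num_sectors} "
          f"size~{num_sectors * SECTOR / 1e9:.1f} GB")
```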

[Screenshot]
You probably saw something like this screen by accident once when your cat walked across your keyboard right as you started up the computer.

Here’s the thing: because of the limitations of the time, the BIOS and the MBR partition style can only handle a total of four partitions on any one drive, and can only boot from a partition if it is less than about 2.2 TB in size. For a long time, that was plenty of space and functionality to work with, but with rapid advancements in the storage size of hard drives and the processing power of motherboards, the BIOS and MBR partitioning became increasingly severe and arbitrary roadblocks. So from the late ’90s through the mid-’00s, an international consortium developed a more advanced firmware interface called UEFI (Unified Extensible Firmware Interface), which employs a new partition style, GPT (GUID Partition Table). With GPT, there’s theoretically no limit to the number of partitions on a drive, and UEFI can boot from partitions as large as 9.4 ZB (yes, that’s zettabytes). For comparison’s sake, 1 ZB is roughly equivalent to 36,000 years of 1080p high-definition video. So we’re probably set for motherboard firmware and partition styles for a while.
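(Those two ceilings aren’t arbitrary, by the way – they fall straight out of how many bits each scheme uses to count 512-byte sectors. A quick sanity check of the arithmetic:)

```python
# Where the 2.2 TB and 9.4 ZB figures come from (assuming 512-byte sectors):
SECTOR = 512
mbr_max = (2**32) * SECTOR   # MBR stores sector counts in 32-bit fields
gpt_max = (2**64) * SECTOR   # GPT uses 64-bit LBA addressing

print(f"MBR ceiling: {mbr_max / 1e12:.1f} TB")   # ~2.2 TB
print(f"GPT ceiling: {gpt_max / 1e21:.1f} ZB")   # ~9.4 ZB
```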

[Image]
We’re expected to hit about 40 zettabytes of known data in 2020. Like, total. In the world. Our UEFI motherboards are good for now.

UEFI cannot read MBR partitions as-is, though it has a legacy mode that can be enabled to restrict its functionality to that of the BIOS, and thereby read MBR. If the UEFI motherboard is set to boot only from the legacy BIOS, it cannot understand or work with GPT partitions. Follow?

So GETTING BACK TO WHAT WE WERE ACTUALLY DOING… the reason I could not install a new, Windows 10-bootable partition onto my drive was that the UEFI motherboard in my computer had booted in legacy BIOS mode – for some reason.

[Image]
Me.

Honestly, I’m not sure why this was. Obviously this was not a blank hard drive when I received it – someone at HP had already installed Windows 7 onto this GPT-partitioned hard drive, which would’ve required the motherboard to be in UEFI boot mode. So why did it arrive with legacy BIOS boot mode not only enabled, but set first in the preferential boot order? My only guess is that after installing Windows 7, they went back in and set the firmware to legacy BIOS boot mode in order to improve compatibility with the Windows 7 OS – which was developed and released back in the days when the BIOS was still the default for new equipment.

This was a quick fix – restart the computer, follow the brief on-screen instructions to enter the firmware settings (usually by pressing the ESC key, though it can vary with your setup), and navigate through them to re-enable UEFI boot mode. (I also left legacy BIOS boot enabled, though lower in the boot order, for the above-stated reasoning about compatibility with Windows 7 – so now, theoretically, my computer can start up from either MBR or GPT drives/disks with no problem.)

Phew. Are you still with me after all this? As a reward, here’s a Vine of LeBron James blocking Andre Iguodala to seal an NBA championship, because that is now you, owning computer history and functionality.

https://vine.co/v/5BuzmV0Xw5b

From this point on, we can just pop the Windows 10 installation disc back in and follow the instructions like we did before. I can now select the unformatted 50 GB partition on which to install Windows 10 – and the installation wizard basically runs itself. After a lot of practical username-and-password setup nonsense, when I start up my computer I now get this screen:

[Screenshot: the boot selection screen]

And I can just choose whether to enter the Windows 7 or 10 OS. Simple as that. I’ll go more into some of what this setup allows me to do (particularly the Windows Subsystem for Linux) another day, as this post has gone on waaaayy too long. Happy summer, everyone!

Balancing Your Audio

If anyone out there is reading this blog but happens not to follow me on Twitter, first off, let me tell you you’re missing some gems.

But you will hopefully also be pleased to learn that due to an unexpectedly quick and overwhelmingly positive response to my last post, The Cable Bible has passed from a half-rhymed idea to a concrete project in record time. The bare bones of it are in place on the AMIA Open Source Committee’s Github page, and I will be trying to expand on it throughout the summer, adding more interface and protocol descriptions, images of connectors and pinouts, etc. If it’s a project you’re interested in, please contribute! Moving the guide from my own private Google doc to a Github repository was intended to encourage collaboration (I can’t catalog every single computer cable myself, dig?) – and if using Github overwhelms you (it IS intended mainly for software developers, and using it for documentation has its advantages but also non-intuitive kinks), may I direct you to Ashley Blewer’s terrific introduction to the interface?

But before The Cable Bible becomes a one-stop shop for all this information, I’ve already started to field a couple questions about cabling based on what I’ve learned over the past month or so. And one of the main ones, which I’d like to talk briefly about today, is the question of balanced versus unbalanced audio cables, and what the hell that means.

[Image: skater]
Nope not that kind of balance

Balanced and unbalanced audio were terms that came up occasionally during my coursework at NYU, primarily in relation to setting up workflows for the digitization of media on magnetic tape. But unlike, say, the difference between composite and component video signals, the difference between balanced and unbalanced audio was never really satisfactorily explained to us. That’s sometimes how it is when you attend a Moving Image Archiving program – due to the constraints of time in an already over-crowded curriculum, some aspects of audio preservation can get short shrift, as if audio were a necessary evil in our quest to appease our visual appetites (we fully recognize this is something to improve in our program and the field at large – recently the Library of Congress hosted a conference regarding the work of its Radio Preservation Task Force, and the 10th Orphan Film Symposium in April celebrated how the history of film and the moving image is intertwined with the history of sound recording).

https://www.facebook.com/streible/videos/10154045836516738/

But balanced and unbalanced audio turn out to be relatively simple concepts, and worth understanding, because from the outside the binary nature of the terms is slightly misleading: it implies “good” and “bad” audio cables, when really either is perfectly acceptable in the right context.

Audio cables in general are susceptible to “noise.” No duh, right. But I’m not talking about noise in the general sense of “sound.” In audio cabling (whether a production, playback or preservation environment), “noise” usually refers to unwanted interference with the audio signal as it passes along a cable: an audible hum introduced by other electronic equipment or lights in the vicinity, minuscule snatches of radio and television transmissions in the air, etc.

Unbalanced cables actually contain two wires (and connectors with two corresponding conductors each). One wire carries the audio signal, while the other is referred to as “ground” (and typically surrounds the signal wire at the center of the cable). The primary purpose of the ground wire is to shield the signal wire from outside interference, and in many cases it does a fine job of noise reduction. At other times, though, the ground wire can act as an antenna and start picking up unwanted noise itself. Typically, the longer the cable, the more susceptible an unbalanced cable will be to noise.

[Diagram: unbalanced cable]

Balanced cables were introduced as a method of increased noise reduction. Instead of two wires, balanced audio cables contain three: ground, “hot” (the desired audio signal) and “cold” (a “negative” copy of the signal). The signal on the “cold” wire is exactly 180° out of phase with the audio signal – meaning if you were to play these two signals at the same time, they would theoretically cancel each other out, leaving you with silence.

[Diagram: phase inversion]
On the right, a “cold” signal with positive and negative peaks exactly the opposite of the original.

That seems undesirable. But equipment designed for a balanced signal flips the signal on the “cold” wire back into phase at the point of input. This has two advantages: first, you’ve now theoretically doubled the strength of your audio signal, since the signals from the two wires are now perfectly in phase. But also keep in mind that any noise picked up as the signal traveled along the cable will be the same on both wires (not 180° out of phase with itself). So when the “cold” wire’s signal is inverted at input, that noise is now out of phase with the noise on the “hot” wire, and the two cancel each other out. You’ve eliminated the noise by the time the audio is played back.
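(If it helps to see that cancellation with actual numbers, here’s a toy simulation in Python/NumPy – a sketch of the principle rather than of any real circuit. The same hum lands on both wires, so re-inverting the “cold” leg and summing leaves a doubled signal and essentially zero noise:)

```python
# Toy model of balanced-line noise rejection (a sketch of the principle,
# not of any real circuit). Requires numpy.
import numpy as np

t = np.linspace(0, 1, 1000)
signal = np.sin(2 * np.pi * 440 * t)        # the audio we actually want
noise = 0.3 * np.sin(2 * np.pi * 60 * t)    # e.g. 60 Hz hum picked up en route

hot = signal + noise      # "hot" wire: signal plus whatever the cable picks up
cold = -signal + noise    # "cold" wire: inverted copy, same noise

received = hot - cold     # the input stage re-inverts "cold" and sums
# received == 2 * signal: the hum is identical on both wires and cancels out
print("residual noise:", np.max(np.abs(received - 2 * signal)))  # ~0.0
```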

[Diagram: balanced wiring]
Image credit for these past three images: avion.com

 

Certain types of connectors make it easy to spot a balanced or unbalanced cable. XLR connectors are most common for balanced signals (you can easily see the three pins in an XLR connection, acting as the conductors for the ground, hot and cold wires), while RCA connectors will always carry an unbalanced signal, as they only have two conductors (the main pin conducts the audio signal, while the outer, round shield of an RCA connector is the contact point for the ground wire).

[Image: male XLR and male RCA connectors]
More acronyms please I don’t think we have enough of those yet in media preservation

1/4″ and 1/8″ jacks, however, come in different flavors (TS, “tip-sleeve,” or TRS, “tip-ring-sleeve”), and are also frequently used to carry stereo signals (all the discussion above relates to mono signals, where one cable conveys one channel of audio; stereo cables carry two channels, which adds another wrinkle to the balanced/unbalanced issue). So you have to watch out with 1/4″ and 1/8″ jack inputs and cables, to be sure you’re conveying the signal you want. (There is no harm per se in using a balanced audio cable with equipment designed for an unbalanced audio signal, or vice versa; the sound will travel from point A to point B either way, but you won’t get the noise-reduction benefit of balanced cables unless they’re used with balanced equipment.)

[Image: 1/4″ stereo-to-RCA adapter]
So many diagrams, so little time

Balanced cables are usually associated with professional setups, while unbalanced audio is frequently found on consumer-grade equipment – DVD and CD players, for example. But really, using either is just a matter of how much you’re concerned about noise interference. For instance, in our department’s digital audio workstation, our former cassette deck (bereft of life, it rests in peace) only had RCA, unbalanced audio output. That might seem less than ideal for an archival digitization workflow, but given our setup the audio cable was barely traveling a foot to a patch bay; plus, both the cassette deck and the Wavelab software we were using for capture had built-in noise reduction capabilities. If we were more concerned with preserving the quality of those cassette recordings than with their content, I might hesitate to rely on those features, but for that specific collection and that specific equipment, noise was not considered a particular issue and an unbalanced signal was perfectly acceptable.

For more lessons in audio cabling, keep an eye on The Cable Bible – and maybe someday we’ll talk about optical cables, which are a whole different, flashy, Tron-like ballgame.

Totally the same thing.

The A/V Cable Bible

Once upon a time, the cables in the MIAP lab were organized. I have a spreadsheet and everything to prove it. Video cables were neatly separated from audio-only cables, while patch cables and computer cables inhabited their own particular niches. Order and stability reigned.

Was it a long, slow descent into chaos over the years, as various staff, faculty and students picked one cable off from its carefully arranged post and then flung it back, willy-nilly, into the fray? Was I trolled by the Joker?

[Photo: cables]

Who can say. But for a while now, XLR cables have fraternized with 9-pin remote data controllers, USB 2.0 co-mingling with S-Video. Madness and miscegenation, I say.

It didn’t take long to wrangle our peg board cable storage units back into something resembling structure. Start from scratch, sort by type (video, audio, patch, data), make sure every single cable is at least adequately coiled and has its own tie, and we’ve made it so that at least I will be able to find, say, a 1/4″ TRS jack-to-male RCA cable if I so need one. The addition of labels to the peg board is nominally so that other people, too, might be able to find what they need without too much of a head-scratching search – but the truth is it’s just as much to scare people into realizing that there is, in fact, SOME sort of system in place.

[Photo]

[Photo]
Progress, of a sort

Why am I talking about a minor organization project that, really, only took about an hour of work? Well, this bout of spring cleaning inspired a sort of The More You Know moment – one of those attempts at self-education that comes (constantly) with the territory. I was already familiar with the purpose and nomenclature of about 90% of the cables on the board – for instance, that a BNC-to-video-patch cable could be used to hook up the composite video output from one of our auxiliary decks (DVCam, S-VHS, Betamax, various formats that we don’t use enough to merit being mounted into our more permanent racks) into our digitization workflow. But there were others that I had never used, and I was somewhat flummoxed as to what equipment they were used with, or even what kind of signal they were intended to carry. This one in particular was an old-school stumper:

[Photo]
And also weighs about as much as a tree stump, by the way

So I started to write up a guide, originally intended for my own use. A Cable Bible, if you will, detailing all (or at least the most common) connections that I would have to make and some context for what kinds of cables tend to appear where. Because once I started looking up some of these things, I realized what a rabbit hole A/V cabling is. See, every cable you use in media production and preservation is actually a jumbled design of wire material, signal types, encodings and interfaces, physical connectors, etc. etc. You can say you have a hard drive you need to connect to a computer and you need a USB cable, but what does that really mean? Is the interface USB 2.0, 3.0 or 3.1? What kind of connecting port does the computer have – Type A? Type B? Type C? Or the hard drive itself – is its port Type Micro-A? Micro-B? Micro-B SuperSpeed?

[Image: USB connector explainer]
I mean this isn’t even comprehensive!

There’s also the fact that the standard for discussing cable connections is to differentiate between connectors with pins as “male” and connectors with ports as “female.” For instance, in order to set up a DVCam deck for digitization recently, I needed a female-RCA-to-male-XLR adapter to get a male-to-male RCA cable to hook up with a female XLR port. Perhaps I’m just being phenomenally PC about this, but I’ve never cared for the way I sound vaguely like a pervy teenager when I’m just trying to make a video deck play. And I can’t help but think that only a historically male-dominated profession would decide to make such a common part of its work a giant dick joke.*

[Image]
I know archives are sexy but come on guys.

When all’s said and done, I hope the Cable Bible will be a shareable document, easily navigable by categories that consider the broad purpose of a connection before burrowing down into the specific protocols and connectors necessary for particular equipment (with examples listed). Do you need to make a video or audio connection? If video, does it need to carry an analog or digital signal? If analog, is the signal composite or component? If component, how are the luminance and chrominance portions of the signal divided? Into two channels, Y and C? Then you’re looking for an S-Video cable, available with mini-DIN 4-pin connectors, or perhaps a SCART connector if you’re dealing with European equipment. Is it in three channels, Y, Pb and Pr? Then you’re looking for “component” cables, the more familiar three-pronged cable(s) traditionally available with red, green and blue RCA connectors, though professional or broadcast equipment might require BNC instead (or, again in Europe, SCART).
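(To give a sense of how that navigation might be structured, here’s a toy sketch of just the analog-video branch in Python – purely hypothetical as far as the real Cable Bible goes; it simply encodes the questions from the paragraph above as nested lookups:)

```python
# Hypothetical sketch of the analog-video branch of a cable decision tree,
# encoding just the questions from the paragraph above.
ANALOG_VIDEO = {
    "composite": ["RCA", "BNC", "SCART (Europe)"],
    "component": {
        "Y/C (two channels)": ["S-Video mini-DIN 4-pin", "SCART (Europe)"],
        "Y/Pb/Pr (three channels)": ["3x RCA", "3x BNC (professional)", "SCART (Europe)"],
    },
}

def suggest(signal, split=None):
    """Return candidate connectors for an analog video signal."""
    options = ANALOG_VIDEO[signal]
    return options if split is None else options[split]

print(suggest("component", "Y/C (two channels)"))
# ['S-Video mini-DIN 4-pin', 'SCART (Europe)']
```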

[Image: SCART connector]
When it came to analog video, Europe was crazypants.

Or maybe the Cable Bible will let you work backwards instead: take, for instance, that mystery monster of a cable I found while reorganizing the lab. What we have at one end is an 8-pin monitor connector, and at the other, a 5-pin DIN and two UHF connectors. UHF was a WWII-era connection originally developed for conveying radio-frequency signals (including video) within a certain range; 5-pin DIN had some applications in professional audio. The 8-pin monitor connector was a very specific protocol intended to carry analog video output and input over the same connection. So what we have here is likely a cable intended to carry both analog composite video and analog, unbalanced audio back and forth between a very old video monitor and a very old video deck (at a guess, possibly 1/2″ open reel).

[Photo]
An 8-pin monitor connection on the back of a Sony 3/4″ U-matic deck – I don’t have any equipment that would take the UHF/DIN end of the cable.

Even as I work on this blog post, the Cable Bible is spiraling slightly out of control in the Google Doc where I started it, as I add pictures and pinouts. It’s possible that this project would work better as some sort of Wiki, where one could more easily navigate the different layers that cables operate on and branch out to explore from there – others would also potentially be able to contribute and document more obscure or unique types of connections and protocols. Much of this information already exists on the internet, even on Wikipedia itself – but it’s a matter of consolidation and making it at least somewhat specific to archival context. I will update everyone if anything public develops!

[Image: audio/video connector shapes]
Something like this is just the start.

 

*Upon writing this I went back and remembered that I opened this very post with a fairly unnecessary, though (I hope) innocuous miscegenation joke. I’m keeping it in, as this blog is meant to be an honest reflection of my thoughts, and this little contradictory exchange with myself is evidence that even self-reflection and good intentions don’t change the fact that I am still a straight cis white man in tech, and these are the sort of weird little obstacles that still mine the field for others.