Live Capture Plus and QuickTime for Java

One of the particular challenges of video preservation is how to handle and preserve the content on digital tape formats from the latter days of magnetic A/V media: Digital Betacam, DVCam, DVCPro, etc. Caught in the nebulous time of transition between analog and digital signals (the medium itself, magnetic tape, is basically the same as previous videotape formats like VHS or Betacam – but the information stored on it was encoded digitally), these formats were particularly popular in production environments, though there were plenty of prolific consumer-grade formats as well (MiniDV, Digital8). In some ways, this makes transferring content easier than handling analog formats: there is no “digitization” involved, no philosophical-archival conundrum of how best to approximate an analog signal as a digital one. One simply needs to pull the digital content off the magnetic tape intact and get it to a modern storage medium (hard disk, solid-state, or maybe LTO, which yes I know is still magnetic tape but pay no attention to the man behind the curtain).

https://twitter.com/dericed/status/981965351482249216

However, even if you still have the proper playback deck, and the right cables and adapters to hook up to a contemporary computer, there’s the issue of software – do you have a capture application that can communicate properly with the deck *and* pull the digital video stream off the tape as-is?

That last bit is getting especially tricky. As DV-encoded formats in particular have lost popularity in broadcast/production environments, the number of applications that can import and capture DV video without transcoding (that is, changing the digital video stream in the process of capture), while staying compatible with contemporary, secure operating systems/environments, has dwindled. That’s created a real conundrum for a lot of archivists. Apple’s Final Cut Pro application, for instance, infamously dropped the ability to capture native DV when it “upgraded” from Final Cut Pro 7 to Final Cut X (you can hook up and capture tapes, but Final Cut X will automatically transcode the video to ProRes). Adobe Premiere will still capture DV and HDV codecs natively, but will re-package the stream into a .mov QuickTime wrapper (you can extract the raw DV back out, though, so this is still a solid option for many – though of course, for just as many more, an Adobe CC subscription is beyond their means).
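If you do go the Premiere route, one common way to pull the DV stream back out of the QuickTime wrapper is FFmpeg’s stream copy. A minimal sketch – the filenames are placeholders, and I’d spot-check the output against the original capture before trusting it:

$ ffmpeg -i capture.mov -f rawvideo -c:v copy capture.dv

Because DV frames carry their audio muxed inside the video essence, copying just the video stream like this should hand back the same DV data that came off the tape, with no re-encoding.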

One of the best options for DV capture is (was?) a Mac application called Live Capture Plus, made by Square Box Systems as part of its CatDV media suite. It has great options for error handling (e.g. automatically re-reading a problem area of a tape multiple times if there’s dropout), for generating DV files based on clips or scenes or timecode rather than the whole tape, for remote tape deck control over the FireWire/Thunderbolt connection, etc. – in short, a bunch of ingest-focused features that are more appealing to an archivist than an application primarily meant for editing, like Adobe Premiere. It also talks to you, which is fun but also terrifying.

[Audio: “Failed to power up”]

However, Square Box removed Live Capture Plus from its product list some years back, and as far as I’m aware has refused all pleas to either open-source the legacy code or even continue to sell new licenses to those in the know.

Let’s say you *are* lucky enough to still have an old Live Capture Plus license on hand, however. The Live Capture Plus GUI is built on Java – but an older, legacy version of Java – so when you first try to run the app on a contemporary OS (~10.10 and up), you’ll first see a dialog prompting you to install a legacy Java runtime:

Luckily, for at least the moment, Apple still offers/maintains a download of this deprecated version of Java – just clicking on “More Info…” in that window will take you there, or you can search for “Java for OSX” to find the Apple Support page.

OK, so you’ve downloaded and installed the legacy Java for OSX. Yet this time, when you try again to run Live Capture Plus, you run into this fun error message instead:


All right. What’s going on here?

When I first encountered this error, even though I don’t know Java, the message provided two clues: 1) “NoClassDefFoundError” – so Java *can’t find* some piece that it needs to run the application correctly; and 2) “quicktime”/”QTHandleRef” – so it specifically can’t find some piece that relates to QuickTime. That was enough to go on a search engine deep dive, where I eventually found this page, where researchers at the University of Wisconsin-Madison’s zoology/molecular biology lab apparently encountered and solved a similar issue with a piece of legacy software related to, near as I can figure from that site, taking images of tiny tiny worms. (I desperately want to get in touch and propose some sort of panel with these people about working with legacy software, but am not even sure what venue/conference would be appropriate.)

[Image caption: The only way my ’90s-kid brain can understand what’s happening]

So basically, recent versions of Mac OS have not included key files for a plugin called “QuickTime for Java” – a deprecated software library that allowed applications built in/on Java (like Live Capture Plus) to provide multimedia functionality (playback, editing, capture) by piggybacking on the QuickTime framework’s support for a pretty wide range of media formats and codecs (including DV). Is this fixable? If you can get those key files, yes!

For now, both the downloads and instructions for what to do with these three files are available on that Hardin Lab page, but I’m offering them here as well. The fix is pretty quick:

 

I would only note a couple of things. First, if you want to do this via Finder rather than the command line, here is how to get to the “View Options” box that reveals the hidden Library folder in Finder, as mentioned in Step 1 above (a useful trick in general if you’re up to digipres shenanigans). Second, I’ve successfully installed these files on macOS Sierra (10.12) without needing to mess with the System Integrity Protection settings, by installing for the local user only (i.e. putting the files into ~/Library rather than /System/Library).
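For the command-line-inclined, that local-user install boils down to putting the QuickTime for Java pieces into your user-level Java Extensions folder. A rough sketch – the filenames here are from memory and should be treated as assumptions (defer to the Hardin Lab page for the definitive list):

$ mkdir -p ~/Library/Java/Extensions
$ cp QTJava.zip libQTJNative.jnilib ~/Library/Java/Extensions/

Then relaunch Live Capture Plus and see if the NoClassDefFoundError is gone.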

Once these files are in place, Live Capture Plus should open correctly and be able to start communicating with any DV/FireWire-capable deck that you’ve got connected to your Mac – again, provided that you’ve got a registration code to enter at this point.

A final word of warning, however. Live Capture Plus comes from the era of 32-bit applications, and we’re now firmly in the era of 64-bit operating systems. Exactly what all that means is probably the subject of another post, but basically it’s just to say that legacy 32-bit apps weren’t made to take advantage of modern hardware, and may run slower on contemporary computers than they did on their original, legacy hardware. Not really an issue when you’re in video/digital preservation and your entire life is work-arounds, but recently Mac OS has taken to complaining about 32-bit apps:

Despite 32-bit apps posing, near as I can tell, no actual security or compatibility concerns for 64-bit OSes (32-bit apps just can’t take advantage of more than 4GB of RAM), this is a pretty heavy indication that Apple will likely cut off support for 32-bit apps entirely sometime in the not-so-distant future. And that will go not just for Live Capture Plus, but for other legacy apps capable of native DV transfer (Final Cut 7, the DVHSCap utility from the FireWire SDK, etc.).

So go get those DV tapes transferred!!!!

Getting Started with BagIt in 2018

Take two!

In December, I hastily wrote an update to an old post about BagIt, the Library of Congress’ open-source specification for hierarchical packaging of files to support safe data storage and transfer. The primary motivation for the update was some issues that the Video Preservation course I work with  encountered with my instructions for installing the bagit-python command-line tool, so I wanted to double-check my process there and make sure I was guiding readers correctly. I also figured that it had been a couple years and I could write about new implementations while I was at it. A cursory search turned up a BagIt-for-Ruby library, so I threw that in there, posted, *then* opened up a call for anything I’d missed.

Uhhhh –  I missed a lot.

It was at this point, as I sifted through the various scripts, apps, tools and libraries that create bags in some way, that I realized I had lost the thread of what I was even trying to summarize or explain.

Every piece of software using the BagIt spec ever? That, happily, is a fool’s errand – the whole point of the spec is that it’s super easy and flexible to implement, no matter how short the script. So…there are a lot of implementations.

Every programming language with an available port/module for creating bags according to the BagIt spec? Mildly interesting for hybrid archivist/developers, but probably of less practical use for preservation students, or the average user/creator just trying to take care of their own files, or archivists that are less programming-inclined. A Ruby module for BagIt is objectively cool and useful – for those working/writing apps and scripts in Ruby. Given that setting up a Ruby dev environment requires some other command-line setup that I didn’t even get into, someone’s likely not heading straight to that module right out of the gate.

“Using BagIt” was/is the wrong framework. Too broad, too undefined, and as Ed Summers pointed out, antithetical to the spirit in which a simple, open source specification is made in the first place: to allow anyone to use it, anywhere, however they can – not according to one of four or five methods prescribed in a blog post.

So I am rewriting this post from the mindset, not of “here’s all the forms and tools in which BagIt exists”, but rather, “ok, so I’m learning what a bag is and why’s it useful – how can I make one to get started?”

Because the contents of a specification are terrific and informative, but in my experience nothing reinforces understanding of a spec like a concrete example. And not only that, but one step further – *making* an example. Technical concepts without hands-on labwork or activities to solidify them get lost – and budding digital preservationists told to use the BagIt spec need somewhere to start.

So whether you’re just trying to securely back up your personal files to a cloud service, or trying to get a GLAM institution’s digital repository to be OAIS-compliant, validation and fixity start at square one. Let me start there as well.

What’s a bag?

Just for refresher’s sake, I’m going to re-post here what I wrote back in 2016 – so that this post can stand alone as a primer:

One of the big challenges in digital archiving is file fixity – a fancy term for checking that the contents of a file have not been changed or altered (that the file has remained “fixed”). There’s all sorts of reasons to regularly verify file fixity, even if a file has done nothing but sit on a computer or server or external hard drive: to make sure that a file hasn’t corrupted over time, that its metadata (file name, technical specs, etc.) hasn’t been accidentally changed by software or an operating system, etc.

But one of the biggest threats to file fixity is when you move a file – from a computer to a hard drive, or over a server. Think of it kind of like putting something in the mail: there are a lot of points in the mailing process where a computer or USPS employee has to read the labeling and sort your mail into the proper bin or truck or plane so that it ends up getting to the correct destination. And there’s a LOT of opportunity for external forces to batter and jostle and otherwise get in your mail’s personal space. If you just slap a stamp on that beautiful glass vase you bought for your mother’s birthday and shove it in the mailbox, it’s not going to get to your mom in one piece.

So a “bag” is a kind of special digital container – a way of packaging files together to make sure what we get on the receiving end of a transfer is the same thing that started the journey (like putting that nice glass vase in a heavily padded box with “fragile” stamped all over it).

Sounds great! How do I make a bag?

At its core, all you need to make a bag out of a digital file or group of files is an editor capable of making plain text files (.txt) and an ability to generate MD5 checksums. An MD5 generator takes *any* string of digital information – including an entire file – and encodes it into a 128-bit fingerprint; that is, a 32-character string of seemingly “random” letters and numbers. Running an MD5 generator on the same file will always produce the same 32-character string. If the file changes in some way (even some change or edit invisible to the user), the MD5 string will change as well. So this process of generating and checking strings allows you to know whether a file is exactly the same on the receiving end of a transfer as it was at the beginning.
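You don’t need any special archival tools to see this in action – macOS and most Linux distributions ship with an MD5 utility (the filename below is just a stand-in):

$ md5 mydocument.pdf
$ md5sum mydocument.pdf

(the first command is the macOS version, the second the Linux one). Run it twice on an untouched file and you’ll get the same 32-character string both times; change even a single byte of the file and the string will be completely different.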

BagIt bags facilitate this process via a “payload manifest” – a text file listing all the digital files contained in the bag (the “data” in question) alongside their corresponding MD5 checksums. Packaged together (along with some meta information on the BagIt spec and the bag itself, plus a “tag manifest” covering those metadata files), this all allows for convenient fixity checking.
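To make that concrete, here’s a sketch of what a small, typical bag looks like on disk (the payload filenames are invented for the example):

    my_first_bag/
        bagit.txt
        bag-info.txt
        manifest-md5.txt
        tagmanifest-md5.txt
        data/
            interview.mov
            transcript.txt

Each line of manifest-md5.txt is just a checksum followed by the path of the file it covers (e.g. “<md5 checksum>  data/transcript.txt”), so checking fixity later is simply a matter of regenerating the checksums and comparing them against the manifest.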

Convenient, though, in the sense of easing automation. While you *can* put together a bag by hand – generating checksums for each file, copying them into text files to create the manifests, structuring the data and manifests together in BagIt’s dictated hierarchy – that is a copy/paste nightmare, and not exactly going to encourage the computer-shy into healthier digipres practice.

This is why simple scripts and tools and apps are handy. Down the line, when you’re creating your own archival workflow, you may want to find or tweak or make your own process for creating bags – but for your first bag, there’s no need to reinvent the wheel.

I’m going to cover intro tools here, for either the command line or GUI user.

Command Line Tools

  1. this Bash script

    A simple shell script by Ed that requires just two arguments: the directory you want to bag, and an output directory (in which to put the bag). Hit the green “Download” button in the corner of the GitHub page, select the ZIP file, then unzip the result. Move the “bagit.sh” file inside to a convenient/accessible location on your computer. Once in Terminal, you can run this bash script by navigating to wherever you put it, then executing it with:
    $ ./bagit.sh /path/to/directory /path/to/bag

    or

    $ bash bagit.sh /path/to/directory /path/to/bag

    (the “./” and “bash” invocations do the same thing – they both tell the Bash terminal to execute the bagit.sh script)

    The “/path/to/directory” should be a folder containing all the files you want to be in the bag, while “/path/to/bag” specifies the output path for the resulting bag. Both paths can be filled in by dragging and dropping folders from the Finder.

  2. bagit-python

    Bagit-python is the Library of Congress’s officially-supported command-line utility for making and working with bags. It requires a working Python interpreter on your computer, plus Python’s package manager, “pip”. By default, macOS comes with a Python interpreter (2.7.10), but not pip. So we go to the popular command-line Mac package manager Homebrew to put this all together.

    Sigh. OK. So one of the reasons this post didn’t come out last week is that, literally in that same time frame, Homebrew went through…something with regards to its Python packages and how they behaved with Python 2.x vs. Python 3.x vs. the Python installation that comes with your Mac. (They’ve locked/deleted a lot of the conversations and issues now, but it was really the dark side of FOSS projects in there for a bit.) I kept trying to check that my instructions were correct, and meanwhile, every “$ brew update” was sending my Python installs haywire. Things seem to have finally settled, but I’d still generally recommend giving this page a once-over before working with Python-via-Homebrew.

But to summarize: if you want to work with Python 3.x, you install a *package* called “python” and then invoke it with python3 and pip3 commands. If you want to use Python 2.x, you install a package called “python@2” and then invoke with either python and pip or python2 and pip2 commands.

…got it?

For the purposes of just using the bagit-python command-line tool, at least, it doesn’t matter whether you choose Python 2.x or 3.x. It’ll work with both. But stick with one or the other through this installation process. So either:

$ brew install python

+

$ sudo pip3 install bagit

or:

$ brew install python@2

+

$ sudo pip install bagit

That’s it! It’s just a matter of making sure you have a version of Python installed through Homebrew, then using the Python package/module installer “pip” to install the bagit-python tool. I highly recommend installing globally with admin privileges (“sudo”) to avoid some weird permissions issues that may otherwise arise when trying to run Python scripts and tools like bagit-python.

Once installed, look over the help page with

$ bagit.py --help

to see the command syntax – and all the features on offer! Including using different hash algorithms (rather than MD5), adding metadata, validating existing bags rather than creating new ones, etc.
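For example, a first run might look something like this (the path and contact name are obviously placeholders) – keep in mind that bagit-python bags “in place,” converting the directory you point it at into a bag:

$ bagit.py --md5 --contact-name 'A. Archivist' ~/Desktop/files_to_bag
$ bagit.py --validate ~/Desktop/files_to_bag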

*** a note about bagit-java***
If you are using Homebrew and just run

$ brew install bagit

it will install the bagit-java 4.12.3 library and command-line tool. The LOC no longer supports this tool and doesn’t recommend it for command-line use, and the --help instructions that come with it don’t even actually reflect the command syntax you have to use to make it work. So! This isn’t a recommendation, just a note for Homebrew users who might get confused about what’s happening here.

GUIs

1. Bagger

Again, the LOC’s official graphical utility for creating and validating bags. Following the instructions from their GitHub repository linked above, you download a release and then run it on macOS by finding and clicking on the “bagger.jar” file (you’ll need a working Java install as well).

Inside Bagger, once you choose the “Create a Bag” option, Bagger will ask you to choose a “profile” – these just refer to the metadata fields available for inserting more administrative information about your bag and the files therein, within the bag itself. Profiles are really useful for keeping metadata consistent if you’re creating a whole bunch of bags, but choosing “<no profile>” is also totally acceptable to get started (you can always re-open bags and insert more metadata later!).

“Create Bag in Place” is also a useful option if you don’t want to keep two copies of your files (the original + the copy inside the “data” folder in your bag), or if digital storage limitations even *prevent* it. Rather than copying the files and creating the bag in a new directory elsewhere, it’ll just move/checksum/restructure the files according to the BagIt spec within the original directory.

2. Exactly

A GUI developed by AVP and the University of Kentucky that combines the bagging process with file transfer – which is the presumed end goal of bagging in any case. To that end, Exactly doesn’t “bag in place” – you always have to pick a source file/directory (or sources – Exactly will bundle them all together into one bag) and a destination for the resulting bag. Like Bagger, you can also add metadata via custom-designed fields or imported templates/profiles. Added support for FTP or SFTP transfers to remote servers (in addition to locally-attached network storage units like a Samba or Windows share) makes it a simple starter option for file delivery.

***************************

If you’re getting started with the BagIt spec, these are the places I’d begin. But as to what implementation *you* can come up with from there, based on your personal/institutional needs…that’s up to you!

Classroom Access to Interactive DVDs

Normally my focus as MIAP Technician is on classroom support for courses in the MIAP  M.A. curriculum – but, as a staff member of the wider NYU Cinema Studies department, there are occasionally cases where I can assist non-MIAP Cinema Studies courses with a need for archival or legacy equipment.

That was the case recently with a Fall 2017 course called “Interactive Cinema & New Media”, which challenged the skills I learned in MIAP regarding disk imaging, emulation, and legacy computing, and provides, I think, an interesting case study regarding ongoing access to multimedia software-based works from the ’90s and early 2000s.

In this project I worked closely with Marina Hassapopoulou, the Visiting Assistant Professor teaching the course; Ina Cajulis, recently hired as the department’s Special Events/Study Center Coordinator (also a Cinema Studies M.A. graduate who took several MIAP classes, including Handling Complex Media, the course most focused on interactive moving image works); and Cathy Holter, Cinema Studies Technical Coordinator.

Last fall when Marina was teaching “Interactive Cinema”, I worked briefly with her request to give students access to a multimedia work by Toni Dove, called “Sally or the Bubble Burst”. “Sally” is an interactive DVD-ROM in which users can navigate various menus, watch videos, and interact (sometimes via the keyboard, sometimes using audio input and speech recognition software) with a number of characters, primarily Sally Rand, a burlesque dancer from the mid-20th century. Because it was created/released in 2003, “Sally” has some unique technical requirements: namely, a PowerPC Mac running either OS 9.1-9.2 or OSX 10.2-10.6. At the time, we had to move quickly to make the DVD available for the class – after testing the disc on a couple of legacy OSX laptops from the Old Media Lab, we decided to temporarily keep an old PowerPC iBook running OSX 10.5 in the department’s Study Center lounge, where students from the “Interactive Cinema” course could book time to view “Sally”. This overall worked fine, although there was some amount of lag (some futzy and not-great sounds coming from the laptop’s internal disc drive made me prefer to run the disc off of a USB external drive – better for the disc’s physical safety, worse for its data rate), and the disc’s speech recognition components were not responsive, likely an issue with the laptop’s sound card.

Fast-forward to August 2017. Submitting her screening list for the semester, Marina let us know that not only would she be needing students to have access to “Sally or the Bubble Burst” again, but she was also expanding the course syllabus to include a number of similar interactive software-based works (by which I mean CD- or DVD-ROMs with moving image material that require specific computer hardware or software components; not just interactive DVDs that will still play back in any common DVD player, which Marina also includes in her course but which provide much less of a technical challenge). With more time to plan, I was interested both in more extended testing, to make sure “Sally” and all these works ran as intended, and in having a discussion with Marina, Cathy, and Ina so we could strategize longer-term plans for access to these works. Quite simply, we are lucky that the department has (largely, I think, thanks to the presence of the MIAP program) maintained over the years a varied collection of legacy computers that can still run/test these works – we may not continue to be so lucky as the years wear on.

The alternative is pretty straightforward: migrate the content on these DVD-ROMs to file-based disk images, and run them through emulators or virtual machines on contemporary computer hardware rather than worn-down, glitchy, eventually-going-to-break legacy machines. But the questions with these kinds of access projects are always: A) has the content really been properly migrated/recreated?, and B) does the experience of using the work on contemporary hardware acceptably recreate the experience of the work on its originally-intended hardware? The latter in particular was a question I could not answer on my own – without having seen, interacted with or studied these works in any detail, I did not consider myself in a position to judge whether emulated versions of these works were running as intended, in a manner acceptable for intense classroom study. Marina and Ina, as scholars of interactive cinema and digital humanities, were in a better position to make an informed decision.

So, my initial goals were:

  • prepare a demo of emulated/virtualized works
  • match each interactive DVD with a legacy computer on which it ran best – for comparison’s sake, or, if the emulation route failed, for providing access to students

I set aside “Sally or the Bubble Burst”, as its processor/OS requirements put it squarely in the awkward PowerPC + Mac OSX zone that has proven difficult for emulation software and plagued my nightmares in the past. That left three discs to work with, listed here along with the technical requirements outlined in their documentation:

I wasn’t looking to perform bit-for-bit preservation/migration with this project. We still have the discs and their long-term shelf life will be a concern for another day – today I wanted acceptable emulation of the media contained on them. So by Occam’s Razor, I considered Mac’s Disk Utility app to be the quickest and best solution in this case to make disk images for demo and testing.

After selecting a disc in Disk Utility’s side menu, I browsed to Disk Utility’s File menu, selected “New Image” and then “Image from [name_of_disc]”.

[Screenshot: Disk Utility’s File > New Image menu]

I selected the “CD/DVD master” option with no encryption, which, after a few minutes, created a .cdr file. I repeated this three times, once for each disc.

[Screenshot: Disk Utility’s image format options – “CD/DVD master”, no encryption]
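If you’d rather script this than click through Disk Utility, hdiutil can do the same job from the command line. A sketch, where /dev/disk2 stands in for whatever identifier diskutil list reports for your disc:

$ hdiutil create -srcdevice /dev/disk2 -format UDTO ~/Desktop/immemory

which should leave you with immemory.cdr, the same “DVD/CD master” format Disk Utility produces.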

With a .cdr disk image ready for each work, now it was time to set up an emulated legacy OS environment to test them in. I decided to start with Mac OS 9 – an environment I was already familiar with and which matched at least the OS requirements of all three works.

For emulating Mac OS 8.0 through 9.0.4, I’ve had a lot of success with a program called SheepShaver. Going through all the steps to set up SheepShaver is its own walk-through – so I’m not even going to attempt to recreate it here, and instead just direct you to the thorough guide on the Emaculation forums, which is what I use anyway. (the only question generally is, where to get installation discs or disk images for legacy operating systems – we have a number still floating around the department, but I also have WinWorld bookmarked for all my abandoned software needs).

Once I got a working Mac OS 9 computer running in SheepShaver, I could go into SheepShaver’s preferences and mount the disk images I made earlier of “Immemory”, “Bleeding Through” and “Artintact” as Volumes, so that on rebooting SheepShaver, these discs would appear on the emulated desktop, just as if we had inserted the original physical discs into an OS 9 desktop.

[Screenshot: the disc images mounted as volumes in the emulated Mac OS 9 environment]
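Under the hood, the Volumes tab is just writing entries to SheepShaver’s plain-text preferences file (~/.sheepshaver_prefs on Mac/Unix builds, as far as I know) – treat both the location and the exact keyword here as assumptions to double-check against the Emaculation guide, but the entries look something like:

    disk /path/to/Immemory.cdr
    disk /path/to/BleedingThrough.cdr
    disk /path/to/Artintact.cdr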

First off I tried “Immemory”, the oldest work which also only required QuickTime v. 4.0 – which is the default version that comes packaged with Mac OS 9. I couldn’t be sure it was running exactly as planned, but the sound and moving images on the menu played smoothly, and I could navigate through the program with ease (well, relative ease – spotting the location of your cursor is often difficult in “Immemory”, but from reading through the instructions that seemed likely to be part of the point).

[Screenshots: “Immemory” running in the emulated Mac OS 9 environment]

The next challenge was that “Bleeding Through” and “Artintact” required higher versions of QuickTime than 4.0. How do you update an obsolete piece of software in a virtual machine? First, scour the googles and the duckduckgos some more until you find another site offering abandonware (WinWorld, unfortunately, only offered up the QuickTime 4.0 installer). Yes, you need to be careful about this – plenty of trolls and far more malevolent actors are out there offering “useful” downloads that turn out to be malware. Generally I’m going to be a little more trusting of a site offering me QuickTime 5.0 than one offering QuickTime X – ancient software that only runs on obsolete or emulated equipment isn’t exactly a very tempting lure, if you’re out phishing. But, still something to watch out for. Intriguingly, I found a site called OldApps.com, similar to WinWorld, in that it has a stable, robust interface, a very active community board, and at least offers checksum information for (semi-)secure downloads. Lo and behold, a (I’m pretty sure) safe QuickTime 6.0.4 installer!

[Screenshot: the QuickTime 6.0.4 installer listing on OldApps.com]
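Since the site publishes checksums, it’s worth the extra ten seconds to verify the download before running anything – e.g. (the filename is just whatever the installer arrived as):

$ md5 ~/Downloads/QuickTime604.dmg
$ shasum -a 1 ~/Downloads/QuickTime604.dmg

If the resulting string matches whichever checksum the site lists (MD5, SHA-1, etc.), you can be reasonably confident the installer wasn’t tampered with in transit.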

With the installer downloaded, I now had to get it into the virtual Mac OS 9 environment. Luckily, SheepShaver offers up some simple instructions for creating a “Shared” folder to shuttle files back and forth between your emulated desktop and your real one.

[Screenshots: moving the QuickTime 6 installer into the emulated environment via SheepShaver’s shared folder]

With the QuickTime 6 installer moved into my virtual environment, I could run it and ta-da: now the SheepShaver VM has QuickTime 6 in Mac OS 9!

[Screenshot: QuickTime 6 installed in the Mac OS 9 virtual machine]

This is the point where I admit that everything had gone so swimmingly that I got a bit cocky. With the tech requirements fulfilled and the OS 9 environment set up, I went into the demo session with Marina, Ina and Cathy without having fully tested all three discs myself beforehand on the hardware they were going to run on. And the results were…unideal. The color scheme on the menu for “The Complete Artintact”, supposed to be rendered in bright primary colors, was clearly off:

[Screenshot: “The Complete Artintact” menu with incorrect colors]

Audio on “Bleeding Through” played correctly, but there was no video, and the resolution on the menus was all off and difficult to control:

[Screenshots: “Bleeding Through” with missing video and incorrect resolution]

And even “Immemory”, which had run so smoothly at the start, now had clear interruptions in the audio, broken videos, and clunky, stuttering transitions between slides/pages.

Though Marina came away impressed with the virtual OS 9 environment and the general idea of using emulators rather than the original media to provide access, the specific results were clearly not acceptable for scrutinous class use. Running some more tests and troubleshooting, I came to two conclusions: first, the iMac we were trying to install SheepShaver on in the Study Center was several years old, and probably not funneling enough processing power to the emulated computer to run everything smoothly. But also, I suspect that the OS 9 virtual machine was missing some system components or plugins for the later works (“The Complete Artintact” and “Bleeding Through”), and that the competing requirements (different versions of QuickTime in particular) were causing confusion when crammed together in one virtual environment – in other words, QuickTime 6 was actually *too advanced* to run “Immemory”, which was designed for QuickTime 4.

So, solutions:

  • keep “Immemory” isolated in its own SheepShaver/OS 9 virtual machine with QuickTime 4
  • test “The Complete Artintact” and “Bleeding Through” in a virtual Windows machine, for comparison against different default OS components
  • install everything on a brand-new, more souped-up iMac

Success! Kept alone in its own virtual Mac OS 9 machine with QuickTime 4, “Immemory” went back to running smoothly. Using a different piece of emulation/virtualization software called VirtualBox (maintained by Oracle, and designed primarily to run Windows and Linux VMs), and going back to WinWorld and OldApps for legacy installers, I created a Windows 2000 virtual machine running QuickTime 6 for Windows for “The Complete Artintact” and “Bleeding Through” (settings in screenshot):

[Screenshot: VirtualBox settings for the Windows 2000 virtual machine]

Installed on new, powerful hardware (a 2016 iMac running macOS 10.12) that could correctly/quickly funnel plenty of CPU power and RAM to the virtual machines, the works now looked “right” to me, and a second demo with Marina and Ina confirmed it:

[Screenshots: the works running correctly in the new virtual machines]

(The one last hitch: in “The Complete Artintact”, which is really an anthology collection of a number of interactive software works, some of the pieces had glitchy audio. Luckily, this was solved using VirtualBox’s sound settings, switching to a different virtualized audio controller, from “ICH AC97” to “Soundblaster 16”):

[Screenshot: VirtualBox audio settings, switched to “Soundblaster 16”]
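For what it’s worth, the same switch can be made from the command line with VBoxManage while the VM is powered off – the VM name here is just whatever you called yours:

$ VBoxManage modifyvm "Windows 2000" --audiocontroller sb16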

There were a few more setup steps to make accessing the works easier for the students: creating desktop shortcuts for the virtual machines on the iMac desktop AND for the disk images inside the virtual machines (so that students could just click through straight to the work, rather than navigating file systems on older, unfamiliar operating systems); adding an extra virtual optical drive to the Windows 2000 VM so that the VM could be booted up with both “Artintact” and “Bleeding Through” loaded at the same time; and creating a set of instructions and tips for the students to follow regarding navigating these emulators and legacy operating systems (for troubleshooting purposes).

[Screenshot]
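That extra optical drive, incidentally, can also be added with VBoxManage rather than through the GUI. A sketch, with the VM and controller names as assumptions (VBoxManage showvminfo will tell you the real controller name), and noting that VirtualBox expects .iso images, so the Disk Utility .cdr files may need to be renamed first:

$ VBoxManage storageattach "Windows 2000" --storagectl "IDE" --port 1 --device 0 --type dvddrive --medium ~/DiscImages/BleedingThrough.iso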

That left legacy testing for backup, as well as the question of “Sally or the Bubble Burst”. At this point time was running short, and emulating “Sally” seemed likely to be a more difficult and prolonged process. Luckily, we had an iMac running Mac OSX 10.6 (Snow Leopard), which still includes Apple’s Rosetta software for running PowerPC applications (like “Sally”) on Intel machines. A disk image of Toni Dove’s work runs smoothly on that machine, including speech recognition input via the iMac’s built-in mic.

I did also run “Immemory”, “Bleeding Through” and “The Complete Artintact” on an Apple G4 desktop running OS 9 and QuickTime 6 – for whatever reason, running this combination of discs and software on the original hardware, as opposed to in the SheepShaver VM, did work acceptably. So though at this point we’ve accepted the emulation solution for class access, if anything goes wrong we can move that G4 from the Old Media Lab to the Study Center and run all the discs (or disk images) on that legacy machine, rather than relying on the squiffy laptop solution that we used for “Sally or the Bubble Burst” a year ago.

So, there is the saga of “Interactive Cinema”. Lingering over all of this is the concern that the disk images I made for this process don’t really constitute bit-for-bit preservation, and though Marina thought the works were all running as intended, these are incredibly broad works, and exploring and testing every detail manually was basically impossible. Ultimately, we may want to create forensic disk images of the CD- and DVD-ROMs to make sure that we’re really capturing all the data and can ensure access to them in the future. But for now….it’s time for me to take a break!