MCN2017 and the Museum <-> Museum Gap
As a first-time attendee of MCN last week, I didn’t know entirely what to expect, other than all things digital plus museums. The program was quite diverse, ranging from relatively technical discussions (sometimes dry; sometimes hilarious) to more meta-professional sessions that really did function as a kind of group therapy. The latter sessions were clearly necessary for many attendees. Rachel Ropeik starkly describes how this year’s MCN teemed with stories of disillusionment, marked by a distressing number of people talking about leaving the field.1 This post is an attempt to set Ropeik’s trenchant observations next to some overarching questions from the pre-conference Digital Provenance Symposium hosted by the Carnegie Museum of Art, and to assess how MCN and the musetech community are doing at actually building shared digital practices.
2017 Digital Provenance Symposium seeing @cmoa’s Northbrook tools in the wild #MCN2017 #digiprov pic.twitter.com/AF5Mfly8Of— Karina Wratschko (@karinanw) November 6, 2017
The pre-conference gathering was supported by CMOA’s ArtTracks project to develop a suite of open source software tools, as well as data models and vocabularies, for structuring data about provenance. Part of ArtTracks’ stated mission is to develop outcomes that are “useful (and usable) across multiple institutions.” What were we doing, asked keynote speaker Jo Ellen Parker (president of the Carnegie Museums of Pittsburgh), to stem the tide of these proliferating, bespoke ways of talking about shared problems? Most speakers only proved how badly our community needed to hear that point. Each discussed how they are handling the creation and sharing of their own provenance data - most of which entailed creating custom data models and services to run them. Everyone, though, acknowledged the problem Parker pointed out: what does it mean to collect data for your research while also helping it have a life (and users) outside that particular project? And how will we deal - both technically and socially - with the perils of OPP: Other People’s Provenance?
Sketchnotes from the afternoon session on #digiprov, pre-#MCN2017, with @matthewdlincoln @workergnome & many more. /cc @caw_ pic.twitter.com/tp10QBlOrg— Jason Alderman (@justsomeguy) November 7, 2017
Did the sessions at MCN proper fill the gaps in this conversation? Short answer: no.
I did not see sustained discussions about museum-to-museum functions and processes at this conference. When it came to bits-on-the-ground technical work, there was lots of “here’s what we do in our museum.” But there was vanishingly little of “a group of us think this is the way to go and we’re trying to get others to join, too.”
Now, there were some really important exceptions to this: workshops on IIIF, on database-backed publishing, and a session on what it means for museums to open-source. And I’m sad to say I had to leave before making the Friday session on collaborative digital projects, which looks like it touched on some of these questions more explicitly. But my overall impression solidified while I was listening to a panel on funders’ perspectives. Here, program officers from the Kress Foundation, the Knight Foundation, and the Pew Center for Arts & Heritage gathered to discuss what they look for in a grant application for digital and technology initiatives in museums. One member of the audience asked the $64,000 question: given museums’ predilection to build their own bespoke services, how are funders shaping their calls to discourage “reinventing the wheel” and one-off projects, and instead encourage the building of reusable systems and infrastructure?
The response from the panel was (and I paraphrase): “When the community looks like it really wants that, we’ll be there to support it.”
While one could grumble that this sets up a catch-22, it’s hard to disagree with the assessment that MCN and the musetech community still don’t seem ready to dive into the weeds of growing a community of practice for building digital services and infrastructure that cross institutional boundaries. There were many presentations on lovely systems being constructed to serve institutional needs (both staff-facing and visitor-facing). But there was little discussion of which components of those systems could be re-deployed by sibling institutions for their own uses.
This was a fascinating inversion of the drumbeat of the conference, which was “find out what your audience/visitors/users actually want.” If it’s crucial to acknowledge that “you are not your visitors,” it’s likewise important to recognize that visitors are not the only (or even the predominant) users of your digital systems. What should my services look like when my own museum and staff are the users?2 What should my services look like when other museums are my users? How do I support visitors and users who don’t view my museum as the extent of the known universe, but want to aggregate my collections with those of other institutions?
If MCN2017 wasn’t yet taking strides towards addressing these questions head-on, you could see them being asked between the lines in sessions like the one on the challenges of open sourcing.
Opening your data, or your source code, may ensure someone is able to access it… but it does not necessarily help them use it.
For example, how do I implement the Barnes’ particular flavor of computer vision for collections browsing, when it’s been baked into the same repository as their custom web interface, rather than split out as its own service component?
If I’m a visitor who wants to compare collections, how do I know whether the Williams College Museum of Art collections data uses the “culture” tag in the same way that the Carnegie Museum of Art collections data does?
If I’m a curatorial assistant, do I really have to copy labels out of a PDF generated by a loaning museum’s TMS into my TMS by hand? (Raise your hand if you or someone you know has to do this regularly 👋)
If I want to fix some of these problems by working with nascent standards, how can I start to implement the linked.art model being established by the American Art Collaborative and the Getty, when it’s just little old me and I have fifty other priorities on my to-do list?
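To make that last point concrete: linked.art publishes collection records as JSON-LD built on CIDOC-CRM, which is exactly the kind of thing a lone “digital” staffer has to pick up on the side. Below is a minimal, hypothetical sketch of what such a record might look like. The object URI and label are invented, and the context URL, class name, and AAT term are assumptions that should be checked against the current linked.art documentation rather than taken as canonical.

```python
import json

# Hypothetical sketch of a linked.art-style JSON-LD record.
# All identifiers here are illustrative, not authoritative.
record = {
    "@context": "https://linked.art/ns/v1/linked-art.json",  # assumed context URL
    "id": "https://example.org/object/1",                    # invented local URI
    "type": "HumanMadeObject",                               # assumed class name
    "_label": "Example Painting",
    "classified_as": [
        {
            "id": "http://vocab.getty.edu/aat/300033618",    # AAT concept for paintings (assumed)
            "type": "Type",
            "_label": "Painting",
        }
    ],
}

# Serialize the record as it might be served from a collections API.
print(json.dumps(record, indent=2))
```

Even a toy record like this shows why shared vocabularies matter: two museums that both point their classifications at the same Getty AAT URI can be aggregated without anyone reverse-engineering a “culture” column.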
At the end of the day, looming over this need for museum-to-museum standards are the realities of understaffing and mixed-up priorities that Ropeik outlined. In other words: “Help build a standard and implement it? In this economy!?”
Sketchnotes from the #mcn2017-F27 session on collaborative digital projects, linked data and ... operationalizing? /cc @ddegler pic.twitter.com/AfoeR3j5Q2— Jason Alderman (@justsomeguy) November 10, 2017
But paradoxically, figuring out how to foster the kinds of communities that develop practical digital standards and tools could actually help the musetech community dig itself out of the institutional traps that are plainly causing so much pain. Rather than acceding to requests to build fancy one-off solutions (the Getty, by the way, is massively guilty of this!), we can start pointing to the work of growing communities that are making it practically possible to advocate for tool and standards reuse. That, in turn, can ensure that the major investments of staff time and brainpower that places like MoMA or the Getty can afford actually benefit much smaller institutions relying on a single “digital” staff member.
I’m not yet sure if I’m coming to MCN2018. But I don’t plan on fleeing the community quite yet. I’m hopeful it can turn towards building a community of shared digital practice that matches the incredible and aspirational community of social practice that it has fostered so well in the past few years.
In the meantime, come chime in on the growing linked.art community :)
H/T to my UMD compatriot Nicole Riesenberger for pointing me to Ropeik’s post. ↩