Question

Handling sensitive content in digital repositories

  • 21 October 2021


Hi all:

I’m currently a temp at a state university, preparing content for an upcoming digital repository that will hold faculty and staff scholarship as well as some materials important to the history of the university. I am writing because I’m researching how academic digital repositories handle sensitive content. Is that something anyone here has experience with? Or do you have a guide you might suggest? I’m sure it will come up from time to time, but at the moment I’m archiving all of the university’s yearbooks back to the 1920s. Unfortunately, the yearbook editors at times included distasteful content that would now be considered sexist and racist. I’m trying to put together a process by which we can publish and maintain digital copies of items that are important to the university, while also handling their content sensitively and adding some sort of warning to material that might upset a reader or viewer. I also want to put together some sort of reporting mechanism that allows the public to flag content like this, since we can’t read through every page of every item we publish.

Input/feedback/opinions/process overviews welcome! I hope to put together a process proposal so we can continue to archive this material for future academic research while also handling it with care, so that any potential users understand they may be viewing offensive content.

Best,

kate


2 replies

Hi,

This is a really great question, and I am looking forward to finding out how other archivists are handling this issue.

Kind regards,

Michaela


Right now, there are four options I’m looking to build into my school’s standard operating policy:

 

  1. Statement (webpage)
    An overview document explaining that the digital repository may contain sensitive content, including textual and graphic instances of violence, racism, sexism, and other unpalatable material; why the repository contains such content; how it is flagged; and how to report it.
    Examples
    1. Drexel University
    2. Temple University
       
  2. Metadata
    1. A generic statement added to the metadata of all potentially sensitive content, alerting the user that the object may contain offensive or sensitive text or images. (This option uses a blanket statement and does not require a staff member to research and describe the specific instance of potentially harmful content.) OR
    2. A specific statement added to the metadata of a particular potentially sensitive item, alerting the user to the nature of the potentially offensive or sensitive text or images in that item. (This option requires a more detailed description written by a staff member for that particular item.) A rough sketch of how either statement could be attached to a record follows this list.
       
  3. Cover sheet
    A visual barrier that alerts a user to potentially sensitive or offensive content before they review page 1 of the item. This is in addition to the metadata, which may be easily overlooked or ignored by a user. (A rough sketch of prepending a standard warning page to a scanned item follows this list.)
     
  4. Reporting tool
    A web tool that allows users to report potentially offensive or harmful content. Purpose: to alert staff that an object contains sensitive content so that a staff member can add the appropriate metadata and/or cover sheet to warn users before they reach the content. (A rough sketch of a simple reporting endpoint follows this list.)
    Examples
    1. Tufts
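
To make the metadata option more concrete, here is a rough sketch of how either statement could be attached to a record before ingest. This is only an illustration in Python against a plain dictionary record; the field name content_warning, the sample record, and the sample note are placeholders, since the real field will depend on whatever metadata schema and repository platform we end up using.

GENERIC_WARNING = (
    "This item may contain language or imagery that is offensive or harmful. "
    "It is presented as part of the historical record and does not reflect "
    "the views of the university."
)

def add_content_warning(record, specific_note=None):
    """Attach a warning statement to a metadata record.

    With no specific_note, the blanket statement (option 2.1) is used;
    a staff-written note (option 2.2) overrides it for that item.
    """
    record = dict(record)  # leave the original record untouched
    record["content_warning"] = specific_note or GENERIC_WARNING
    return record

# Hypothetical records, for illustration only
yearbook_1923 = {"title": "University Yearbook, 1923", "type": "text"}
generic = add_content_warning(yearbook_1923)
specific = add_content_warning(
    yearbook_1923,
    specific_note="Pages 12-13 include a racist caricature in a student skit.",
)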

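For the cover sheet option, one approach is to merge a staff-prepared warning page in front of the scanned item before upload. Below is a minimal sketch using the pypdf library, assuming the items are delivered as PDFs and that a standard warning page already exists as its own PDF; the file names are placeholders.

from pypdf import PdfReader, PdfWriter

def prepend_cover_sheet(cover_pdf, item_pdf, out_pdf):
    """Write a new PDF with the warning page(s) ahead of page 1 of the item."""
    writer = PdfWriter()
    for page in PdfReader(cover_pdf).pages:  # staff-prepared warning page(s)
        writer.add_page(page)
    for page in PdfReader(item_pdf).pages:   # the original scan, unchanged
        writer.add_page(page)
    with open(out_pdf, "wb") as handle:
        writer.write(handle)

# Example call (placeholder file names):
# prepend_cover_sheet("warning_cover.pdf", "yearbook_1923.pdf", "yearbook_1923_public.pdf")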
 
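For the reporting tool, a simple starting point would be a web endpoint that records the item identifier and the reporter's note for staff review. Here is a minimal sketch using Flask, assuming reports are appended to a CSV file for now; the route name, field names, and file name are placeholders.

import csv

from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/report-content", methods=["POST"])
def report_content():
    """Accept a report and queue it for staff review."""
    item_id = request.form.get("item_id", "").strip()
    reason = request.form.get("reason", "").strip()
    if not item_id:
        return jsonify({"error": "item_id is required"}), 400
    with open("content_reports.csv", "a", newline="", encoding="utf-8") as log:
        csv.writer(log).writerow([item_id, reason])
    return jsonify({"status": "received"}), 201

In practice this would presumably live inside whatever repository platform we choose, or at least notify staff by email, but the idea is just to capture enough information for a staff member to review the item and add the appropriate metadata or cover sheet.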

Kate
