
(Roadblock, Ada Gonzalez CC-BY 2008)
I wouldn’t say my experience in the business world was entirely “move fast and break things.” But working in high tech certainly included an understanding that it’s sometimes better to ask forgiveness than permission. The situation couldn’t be more different in higher education.
As part of the OER advocacy I’m planning on my campus this fall, I’ve always assumed I’d do a couple of campus-wide surveys: one of faculty and one of the students affected by high textbook costs. The idea was both to locally replicate the results of student surveys like the famous Florida study, and to signal to all the students and faculty on campus that something is about to begin.
I was informed last spring that in order to survey the faculty, I would need to get permission from the Interfaculty Organization (IFO), our union. I sent an email to the President of the BSU Faculty Association with a link to the 25-question Qualtrics survey I was planning to use. He said he’d put it on the agenda of the first Faculty Senate meeting in September, but he didn’t think there would be any resistance. So, by the middle of September I’ll probably be sending out my faculty survey via the official mailing list. But he also suggested I contact the Institutional Review Board (IRB), an organization I hadn’t heard of previously.
This is where it gets a bit sticky. The IRB, it turns out, is also known as the Human Subject Committee. It was apparently formed in response to a Federal regulation (45 CFR 46.102(f)) that requires review and approval to do research that “deals with human subjects” in a way “designed to develop or contribute to generalizable knowledge.” The only survey activities that seem to be exempt are student and faculty evaluations and “information collected for program improvement, evaluation, and accreditation.”
I exchanged a couple of emails with the Director of Graduate Studies at BSU, who oversees the IRB. He verified that if I planned on making the data public in any way (conferences, website, publications, etc.) I would need to get IRB approval. If the information was solely for my own course development and not for public distribution, I would not be required to get approval.
What was not clear was what I would need to do to get approval. I studied the IRB website, and it seemed that in addition to filling out a number of forms, I would need a certificate from another organization, the Collaborative Institutional Training Initiative (CITI), showing that I had completed a training course of some type. The course was not specified, and the provided link took me to CITI’s homepage, which was no help. At this point, I have no idea how much time I would need to put in simply to reach the point where I could submit a proposal to get my survey approved.
This is a major institutional impediment to my getting the data I was hoping to collect on BSU students and faculty to guide my campaign. While I appreciate the sensitivity of using data collected from people and the need to understand issues of privacy and when a line of questioning might be inappropriate, this vague, poorly-defined requirement seems like an unnecessarily obnoxious roadblock. This IRB requirement acts as a sort of unfunded mandate, requiring me to invest an undefined amount of time not only meeting its requirements but figuring out what they are. This is the sort of bureaucratic black hole that seems like it could have been designed expressly to prevent innovation rather than to protect “human subjects”. Or is it? There seems to be a loophole, both in the published official guidelines and in the Graduate Program Director’s communication. I may be able to run my surveys on campus if I direct them only at improving my program (increasing OER acceptance and adoption on campus) and if I don’t publicize the data I collect.
It would be unfortunate if I were unable to discuss the data I collected from student and faculty surveys at the OE Global conference in the fall, or if I were unable to create charts and marketing materials documenting the significance of students’ attitudes toward excessive textbook costs. But it wouldn’t be the end of the world. We already have published studies that document these facts. Even without IRB approval, maybe I could still conduct surveys and use the data to plan my campaign, communicate with the administration and other stakeholders about the project (“information collected for program improvement, evaluation, and accreditation”), and track changes over time as I implement the program.
Maybe in the future I’ll be able to find a collaborator who either has or is interested in getting all the certifications and permissions needed to run a survey whose results I could publicize. I might make this a goal of the second-year survey, after we (hopefully) have some change to report. People at the system office and also at the IFO have expressed interest in my surveys and their results. So maybe I could involve them in some way in this future “publication” collaboration, after providing my first-year results under the limited, non-public “improvement, evaluation, and accreditation” guidelines.
In the meantime, I think I’ll try to move forward (which is the goal, after all) in the best way I can, and not let this roadblock stop me in my tracks. I’ll survey students and faculty, but with the express understanding that I will not publicize the results. Or, in other words, that the results will be expressed in what I do about the data, not what I say about it.