Unattainable Goals

One of the things that jumped out at me yesterday, during a day full of meetings related to the beginning of the fall semester, was a guest speaker who opened our meeting in the College of Arts, Education, and Humanities. John Eggers is a Bemidji Pioneer columnist and an advocate for 100% high school graduation. John argues that we should set a goal of getting 100% of Bemidji’s high school students to graduate, and he claims this goal could be achieved not in five years or three, but in one, if we really put our minds to it.

(Goals by Nick Youngson CC BY-SA 3.0 Alpha Stock Images)

I’m not going to argue whether John’s goal is attainable. The thing that struck me about it is that it’s obviously the right goal. How could one justify setting a goal that aspired to less than 100%? “Yeah, we want to leave 2% or 3% or 5% behind each year” doesn’t cut it. That may be a reality, but it’s not a vision. Whether or not you believe it can be achieved in a year, three, five, or maybe never at all… how could you argue that we should aim for less?

I thought this was a useful idea for OER, especially in the context of the currently popular idea of Z-Degrees. The Minnesota Legislature has mandated that three Associate Degrees be created with zero textbook costs – valuable not only for the money that will be saved by the students who go through those particular tracks, but for all the students around them, who will get the benefit of taking the Z-courses created, even if their entire program isn’t free of textbook costs. And of course the focus on creating Z-courses will inspire other changes, and the benefits will snowball.

Similarly, as I’ve mentioned before, students at 4-year schools like Bemidji State University would benefit from substantial decreases in their textbook costs even if we can never eliminate them entirely. When John was talking about 100% graduation, I wrote in my notes, “an unattainable goal isn’t necessarily a bad thing.” Then I immediately started thinking of ways it can be a bad thing – like if you’ve pegged your compensation to a goal you can’t achieve. But then I ultimately decided that, on the whole, I like the idea of refusing to compromise on a vision and then celebrating getting as close to it as possible.


Institutional roadblocks

(Roadblock, Ada Gonzalez CC-BY 2008)


I wouldn’t say my experience in the business world was entirely “move fast and break things.” But working in high tech certainly included an understanding that it’s sometimes better to ask forgiveness than permission. The situation couldn’t be more different in higher education.

As part of the OER advocacy I’m planning on my campus this fall, I’ve always assumed I’d do a couple of campus-wide surveys: one of faculty and one of the students affected by high textbook costs. The idea was both to locally replicate the results of student surveys like the famous Florida study, and to signal to all the students and faculty on campus that something is about to begin.

I was informed last spring that in order to survey the faculty, I would need to get permission from the Interfaculty Organization (IFO), our union. I sent an email to the President of the BSU Faculty Association with a link to the 25-question Qualtrics survey I was planning to use. He said he’d put it on the agenda of the first Faculty Senate meeting in September, but he didn’t think there would be any resistance. So, by the middle of September I’ll probably be sending out my faculty survey via the official mailing list. But he also suggested I contact the Institutional Review Board (IRB), an organization I hadn’t heard of previously.

This is where it gets a bit sticky. The IRB, it turns out, is also known as the Human Subject Committee. It was apparently formed in response to a federal regulation (45 CFR 46.102(f)) that requires review and approval for research that “deals with human subjects” in a way “designed to develop or contribute to generalizable knowledge.” The only survey activities that seem to be exempt are student and faculty evaluations and “information collected for program improvement, evaluation, and accreditation.”

I exchanged a couple of emails with the Director of Graduate Studies at BSU, who oversees the IRB. He verified that if I planned on making the data public in any way (conferences, website, publications, etc.) I would need to get IRB approval. If the information was solely for my own course development and not for public distribution, I would not be required to get approval.

What was not clear was what I would need to do to get approval. I studied the IRB website, and it seemed that in addition to filling out a number of forms, I would need a certificate from another organization, the Collaborative Institutional Training Initiative (CITI), showing that I had completed a training course of some type. The course was not specified, and the provided link took me to CITI’s homepage, which was no help. At this point, I have no idea how much time I would need to put in simply to get to the point where I could submit a proposal to have my survey approved.

This is a major institutional impediment to getting the data I was hoping to collect on BSU students and faculty to guide my campaign. While I appreciate the sensitivity of using data collected from people, and the need to understand issues of privacy and when a line of questioning might be inappropriate, this vague, poorly defined requirement seems like an unnecessarily obnoxious roadblock. The IRB requirement acts as a sort of unfunded mandate, requiring me to invest an undefined amount of time not only meeting its requirements but figuring out what they are. This is the sort of bureaucratic black hole that seems like it could have been designed expressly to prevent innovation rather than to protect “human subjects.” Or is it? There seems to be a loophole, both in the published official guidelines and in the Director of Graduate Studies’ communication. I may be able to run my surveys on campus if I direct them only at improving my program (increasing OER acceptance and adoption on campus) and if I don’t publicize the data I collect.

It would be unfortunate if I were unable to discuss the data I collected from student and faculty surveys at the OE Global conference in the fall, or if I were unable to create charts and marketing materials documenting the significance of students’ attitudes toward excessive textbook costs. But it wouldn’t be the end of the world. We already have published studies that document these facts. Even without IRB approval, maybe I could still conduct surveys and use the data to plan my campaign, communicate with the administration and other stakeholders about the project (“information collected for program improvement, evaluation, and accreditation”), and track changes over time as I implement the program.

Maybe in the future I’ll be able to find a collaborator who either has, or is interested in getting, all the certifications and permissions needed to run a survey whose results I could publicize. I might make this a goal of the second-year survey, after we (hopefully) have some change to report. People at the system office and at the IFO have expressed interest in my surveys and their results, so maybe I could involve them in some way in this future “publication” collaboration, after providing my first-year results under the limited, non-public “improvement, evaluation, and accreditation” guidelines.

In the meantime, I think I’ll try to move forward (which is the goal, after all) in the best way I can, and not let this roadblock stop me in my tracks. I’ll survey students and faculty, but with the express understanding that I will not publicize the results. Or, in other words, that the results will be expressed in what I do about the data, not what I say about it.

Adding to OER ebooks

One of the useful aspects of Pressbooks is that authors can edit a title and add content whenever they need to. This allows errors to be corrected and materials to remain up to date as new information becomes available. How often have you discovered a problem in a textbook you’re using, and hoped it would be caught and corrected in the next edition, a few years down the road?

Keeping up with research isn’t an issue only in the sciences, though. New information becomes available in all fields as researchers continue discovering new facts or refining their interpretations. For example, I recently discovered another historical source for my volume of primary readings relating to the Ranney brothers and their migrations across the continent in the nineteenth century.

The source wasn’t exactly new: it was a volume called the Compendium of history and biography of Hillsdale County, Michigan, written by Elon G. Reynolds in 1903. Reynolds’ work was typical of the genre, including about 80 pages of general history of the county and then over 450 pages of short biographical sketches of Hillsdale’s leading men and institutions. On pages 302 and 303 there is a sketch of Henry Ranney’s younger brother, Lemuel Sears Ranney.

(Screenshot of the biographical sketch of Lemuel Sears Ranney, Compendium of history and biography of Hillsdale County, Michigan, pages 302–303)

The passage adds some details about Lemuel’s life that I was not aware of, provides validation of some of the events Lemuel and his brothers describe in their letters, and gives us a look at the elements of Lemuel’s story that seemed noteworthy to the editors of this 1903 volume, and presumably to its readers. It also shows the degree of respect Michigan residents seem to have had for Ranney, who was still alive when the book was published.

This was all interesting enough to me that I wrote an extra short “chapter” about it and added it to the end of my ebook. Readers viewing it online will find it automatically appended after what was previously the final chapter, which covered Henry Ranney’s obituary. Folks who have downloaded the ebook or pdf versions to their own devices can return to the book’s Pressbooks homepage and download an updated copy. I probably won’t be adding a lot more to this volume, but if I come across any new material, it’s nice to be able to!