Jeffrey Selingo has published an interesting article on The Chronicle of Higher Education’s site about the bachelor’s degree and a proposed makeover. The general point of the article is that institutions of higher education should offer bachelor’s degrees that meet both students’ aspirations and their needs. As the economy shifts and changes (or remains stagnant), the article proposes that the bachelor’s degree be reworked. One interesting example given is from Stanford University.
Dubbed the “open loop” university, this plan would admit students at 18 but give them six years of access to residential learning opportunities, to use anytime in their life. Such a path through college could shift our deep-rooted cultural belief that college is something young people do, and would make alternative pathways, such as gap years and low-residency colleges, more acceptable to those students who wouldn’t benefit from the typical campus experience.
We always need to look for ways to be ahead of the curve in higher education. Whether this relates to distance learning, study abroad, assessment technology and techniques, or general changes to degrees and programs, it is essential for all higher education professionals to look for the writing on the wall and make changes when necessary. I agree that an examination of the bachelor’s degree is required. Figuring out where and what to reorganize is where the true work begins.
Business Insider recently published an article that discusses the increasing number of hires at Google who have not earned college degrees. The majority of employees have graduated from college, but this number is decreasing. Google has found that high GPAs and strong college transcripts do not guarantee success in the workplace. The ability to creatively consider problems and arrive at novel solutions is considered most important. It is noted, however, that a college degree is still important for acquiring advanced skills and earning better pay.
Administrators in higher education are continuously looking for better ways to gauge how well a student will perform in a particular program. A student’s SAT scores, GPA, and high school transcript are just a few of the factors that are taken into account. The Assessment office in the College of Pharmacy at St. John’s University routinely collects and analyzes data to achieve a better understanding of how these elements impact student learning. There is a pressing concern that some pieces are not being examined when considering a student’s eligibility for admission. It may be necessary to begin examining factors like non-traditional data or program-specific criteria. It seems that the only way to determine what works best is to look at the current population and determine whether new factors need to be taken into account. We are currently exploring these different approaches on our end, and other assessment professionals should be considering them as well.
The College recently purchased several Nexus 7 tablets for the Office of Assessment to use for survey and assessment purposes in a newly created Mobile Survey Lab. We rolled out the tablets this week for two separate surveys: a satisfaction survey for the ‘Finals Grab & Go Breakfast’ and a survey on graduating seniors’ utilization of professional development funds, administered during cap and gown distribution. The tablets are set up to loop back to the start of a new survey once a student completes one, allowing students to take the survey on a rolling basis. During the breakfast, students stopped at our Mobile Survey Lab table as they exited to complete the satisfaction questionnaire:
During the funding survey administered at cap and gown distribution, students passed the tablets around as they waited to be called up. This survey method delivered real-time results and is already proving to be a useful tool for greatly increasing our survey response rates. We achieved an 83% response rate on the funding survey, as 276 of 332 graduating seniors completed it on the tablets in person. We are looking forward to using these tablets in new and innovative ways for surveys and other assessment-related tasks. This is just another way that assessment is allowing us to push into new areas and effect positive change. Look for us at the ‘Finals Grab & Go Breakfast’ for the rest of finals week, as we continue to collect data and give our students an opportunity to provide valuable feedback.
A heat map is a visual representation of data, utilizing colors to represent certain values. The assessment team decided a few years ago that a heat map would be a good way to take the complex set of PCOA data provided by NABP and turn it into a quick visual overview. We found this colorful visualization to be a useful way to compare our own PCOA results against those of other schools in the nation. You can read more about “information aesthetics” (defined by Lev Manovich) here: Infosthetics.com.
The idea here is that by giving a visual representation of data, we can engage our stakeholders while helping them understand precisely what the data means. As assessment professionals, we often find ourselves mired in the same types of data reports, frequently absorbed in personality-less information. Our hope is that by ‘showing’ data, rather than just providing it, we can give our faculty and students important information in a more engaging way.
The College administered the PCOA (Pharmacy Curriculum Outcomes Assessment) examination to P1-P4 pharmacy students in January for the third consecutive year. As stated on NABP’s web site, “The PCOA is a comprehensive tool for colleges and schools of pharmacy to use as they assess student performance in the curricula.” Every year, after this examination is administered at SJU, the assessment team generates a heat map to visualize how our students have fared against the national reference group. This heat map takes advantage of the conditional formatting/color scales in Excel, and allows us to see the overall visual difference between percent correct scores for SJU and the national reference. Ideally, this heat map should give a visualization of strengths and weaknesses across our curriculum. More specifically, the heat map provides the difference between the college number and the national number for the mean percent correct scores for each topic and subtopic within the exam. For example, our P1 students achieved a ‘2’ in microbiology this year, meaning that their microbiology mean percent correct score was 2 points above the national number. The conditional formatting tool in Excel takes whatever selection you choose and conditionally formats it based on the existing numbers. For our purposes, we conditionally format each section separately, so that we can look at the ‘heat’ of individual areas. Each blocked group has been conditionally formatted separately, which explains why one zero may be yellow, while another zero is green.
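For readers curious about the mechanics, the section-by-section color logic described above can be sketched in a few lines of Python. This is a minimal illustration with made-up difference scores (the real map is built in Excel from NABP score report data, and the topic names and numbers below are purely hypothetical): each blocked section is scaled against its own minimum and maximum, which is why the same value can land on different colors in different sections.

```python
# Sketch of a three-color conditional-formatting scale, applied per section.
# Values are hypothetical (school mean % correct) minus (national mean % correct).

def color_scale(values):
    """Mimic a 3-color scale within one blocked section: the section's own
    minimum maps to red, the midpoint band to yellow, the maximum to green."""
    lo, hi = min(values), max(values)

    def bucket(v):
        if hi == lo:
            return "yellow"              # flat section: everything is midpoint
        frac = (v - lo) / (hi - lo)      # 0.0 at section min, 1.0 at section max
        if frac < 1 / 3:
            return "red"
        if frac < 2 / 3:
            return "yellow"
        return "green"

    return [bucket(v) for v in values]

# Hypothetical P1 section: difference scores per subtopic.
p1_diffs = {"Microbiology": 2, "Biochemistry": 0, "Anatomy": -3}
p1_colors = dict(zip(p1_diffs, color_scale(list(p1_diffs.values()))))

# A different section with its own scale: the zero here is this section's
# maximum, so it colors green, while the zero above colors yellow.
p2_diffs = {"Pharmacology": -4, "Kinetics": -1, "Law": 0}
p2_colors = dict(zip(p2_diffs, color_scale(list(p2_diffs.values()))))

print(p1_colors)  # {'Microbiology': 'green', 'Biochemistry': 'yellow', 'Anatomy': 'red'}
print(p2_colors)  # {'Pharmacology': 'red', 'Kinetics': 'green', 'Law': 'green'}
```

Because each section is normalized independently, a zero in one section can read yellow while a zero in another reads green, exactly as noted above.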
The PCOA heat map is one way for us to look at our PCOA scores, and it gives us something to compare against nationally. As ACPE considers whether to include the PCOA in the new standards, it is important for us to be prepared and to utilize the PCOA as a validated assessment tool for comparing our curriculum to those across the country. I have attached a Sample School Heatmap, drawn from the data provided in NABP’s published ‘Sample Score Report’. This should give a good idea of what we are looking at in our own heat map.
The College of Pharmacy and Health Sciences’ Committee on Assessment and Outcomes utilizes a built-in, cyclical assessment process throughout the year. This process allows each program’s achievement of goals and outcomes to be analysed at some point in the academic year by the committee. Each month when the committee meets, a different program presents its ‘assessment matrix’. This matrix is a functional table that organizes and describes the program’s goals, outcomes, measures, targets, findings, and associated action plans. We have found this process to be useful, and it has become an organizational fundamental at our institution. We view this systematic approach to the measurement of goals and outcomes as a type of internal program review. The process itself forces us to continuously look at ourselves and think about ways we can improve how and what we assess within our programs.
By taking a big project (analysing each program’s goals and outcomes) and breaking it into smaller pieces, one month at a time, we have given the committee time to look at other things. This allows us to avoid a rushed, large-scale review of all programs in a month or two at the end of the academic year. This ongoing programmatic review process has simply become another part of what we do, on a rolling basis, and has enhanced the ‘continuous’ part of our continuous quality improvement efforts.