The Office of Assessment in the College of Pharmacy and Health Sciences at St. John’s University is offering an assessment workshop on Wednesday, June 20, at the Dr. Andrew J. Bartilucci Center.
Impactful assessment begins with an established infrastructure that collects and analyzes data. Decisions based on cyclical goal and outcome assessment are evidence-driven and allow for analysis of interventions in the following cycle. Typically described as “closing the loop,” this cycle requires an efficient assessment framework and robust data collection, analysis, and reporting. Whether you are establishing a new program or reworking an existing one, this structure will determine the success of your efforts.
We describe typical assessment structures consisting of a stakeholder-driven committee and assessment and administrative support. This framework is the foundation for all assessment activities that follow.
Efficient assessment structures are both data-focused and flexible. Assessment data is drawn from institutional and seedling sources. Examples of both types of data will be provided, as well as best practices for data management.
Course modules include:
· Instituting an efficient assessment framework
· Utilizing diverse assessment data
· Developing an Assessment Plan and Matrix
For more information, visit stjohns.edu/CPHSAssessment.
As the Fall semester begins, the College of Pharmacy and Health Sciences and its program leaders, faculty, and administrators aim to provide students with the information and resources needed to succeed. On August 31, 2015, the incoming Fall 2015 didactic cohort of Physician Assistant (PA) students joined Dr. S. William Zito and the Office of Assessment for the annual Assessment Workshop, led by Dr. Marc Gillespie.
Prior to the workshop, this cohort of students was invited to participate in a survey regarding their experience with and knowledge of assessment. The Pre-Assessment Workshop Survey received a 46% response rate, and a summary of responses made it evident that students were seeking more information on assessment. Following the workshop, the Post-Assessment Survey was administered and received a 58% response rate.
Some highlights of the survey:
· 94% of students are now aware of the assessment process in the College, compared to 37% before the workshop.
· 84% of students are now willing to participate in assessment, compared to only 66% of respondents beforehand.
· 96% of students now strongly agree or agree that they are involved in improving the culture of assessment, compared to 28% before the workshop.
· A majority of respondents now know that the College of Pharmacy and Health Sciences has an Assessment Committee and an Office of Assessment; previously, less than 30% were aware of their roles. In addition, 89% now agree that the results of assessment initiatives are used effectively.
Results of the Post-Assessment Workshop Survey demonstrate students’ interest in and willingness to participate in the College’s assessment process, as well as the effectiveness of the Assessment Workshop. The Assessment Team encourages all students to share their feedback by responding to any survey invitations they receive. We look forward to next year’s PA Assessment Workshop and hope to hear from all students throughout the year.
“This seems like a lot of work.”
“What do I have to do?”
“I’m not comfortable with computer-based testing.”
These are all valid concerns raised by new ExamSoft users, and the types of concerns that need to be considered when rolling out a new initiative. As computer-based testing was recently introduced to the College of Pharmacy and Health Sciences, the Office of Assessment was named its Key ExamSoft Administrator. A great deal of information was compiled during our recent pilot phase, which ultimately led to the creation of an instructional user manual tailored to the needs of our specific programs.
Information was collected by tracking issues experienced during exam administrations, disseminating student surveys, and collaborating with faculty. Observing new users as they were introduced to ExamSoft during training sessions, and taking note of their concerns, alerted us to the elements crucial to helping new users understand the software. All of this accumulated into a wealth of knowledge, and the next logical step was to share it with fellow users.
The manual provides a complete look at delivering computer-based assessments. It is intended to be the first resource new users consult when building an exam. It promotes a ‘help yourself’ approach while other lines of support remain in place: the Key Administrators are available to any new or current users, and the resources already offered by ExamSoft are still actively promoted. In designing the manual, we opted to provide certain details so users can better visualize how the software’s many functions are connected, as well as gain insight into an exam taker’s experience.
As specific plans for computer-based testing in upcoming semesters’ courses are finalized, roles and responsibilities within each course will be established. The manual will be distributed to faculty whose courses are ultimately authorized to use the software. We will continue to monitor all issues experienced by students and users so we can keep evolving our computer-based testing practices to best track learning outcomes and student success.
The Middle States Commission on Higher Education (MSCHE) is holding its annual meeting in Washington, DC, and we are here. As we listened to the plenary talks and met other folks in higher ed, I wondered whether it was as apparent to others as it is to us why our assessment team is here.
As an assessment group, we spend much of our time thinking of ways to link the fantastic work of our faculty and students to measurable outcomes. Defining what is to be measured, how it will be measured, and how results will be fed back into improving the educational environment, or back to the student, is far more taxing than the actual measuring once a plan is in place. As a group, we work with all of our stakeholders to determine what plan is best for our students and faculty. While we do that, we look at how we can feed these results back into the loop to assure our accrediting body that we are doing what we set out to achieve.
At the institutional level, MSCHE is that accrediting body, and like many higher education accrediting bodies, it is made up of higher education faculty and administrators. In short: it is us. Middle States is rolling out new standards, and unsurprisingly those standards reflect many of its stakeholders’ aspirations and challenges for higher education. For a faculty member working in the trenches, splitting time between research, teaching, and scholarship, it is a challenge to take the time to measure how that effort contributes to the good that higher education provides. It is difficult to see what society expects from higher education institutions and how it is reacting to our work. Taking the time to listen to the accrediting body that accredits us as a degree-granting institution, and meeting with other stakeholders, provides a moment for faculty to pause and reflect on where higher education is today and where it may be going.
As I listened to the talks and the crowd, I thought about how important it is for faculty to reflect not only on the mission of their institution but also on their own goals as teachers, scholars, and educators. If we are to have a meaningful impact, we must take time to think carefully about what we can reasonably achieve and what society wants and expects of us.
So why should faculty attend an annual accreditor’s meeting?
As I listened to one of the talks, a speaker paused for a second and noted that faculty should be able to describe what the standards are. The standards? “Those esoteric rules and chains that we are held to,” I thought. And then it struck me that, yes, we really should be able to. In fact, if the standards are any good, they should aim to quantify the very characteristics that drew us to higher education in the first place. The standards serve as a description of the good that a degree can provide to a student and the good that an institution can provide to society.
The College of Pharmacy and Health Sciences recently approved a new policy on student travel funding. This policy applies to any registered students of the college who plan to attend a professional meeting or conference to support and promote their professional development.
Summary of the funding levels:
· $1,000 for students who are podium speakers or national competitors
· $750 for students who are first authors of a research poster
· $500 for students who are leaders in a recognized student organization
· $250 for any student who attends a professional conference
The College recently completed publication of its annual Fall fact document, the Fall 2013 Fast Facts. This document contains pertinent demographics and other useful information about the College. It is also published on the College’s web site: http://www.stjohns.edu/academics/undergraduate/pharmacy/about/fast_facts_about_the_college.stj
With all of the eLearning hype out there, the Wall Street Journal has been a voice of calm, often publishing balanced articles covering the different aspects of eLearning. It continued this trend here with an early take on how MOOCs are doing. The nice thing about this article is the embedded graphics, which provide a snapshot of what happens to the students who register for these courses. Much like the eLearning assessment article we described previously, this article examines what happens to students as they register and try out a course, as well as ultimate passing rates.