Though it has been some time in the making, the federal College Scorecard site is up and running.
The site draws on a combination of federal data sources, including loan and tax return data. The upside of this is that the documentation that goes with the site, here, contains a wealth of information on how the analysis was done, what data sources were used, and instructions on how to tap into (or download) the data using a fairly straightforward API.
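As a minimal sketch of what querying that API looks like, the snippet below builds a request URL for the Scorecard schools endpoint using only the Python standard library. The endpoint path and the `school.name` filter follow the published API conventions, but treat the specific field names here as illustrative assumptions and check the official documentation for the current schema; a (free) api.data.gov key is required for real requests.

```python
# Sketch: constructing a College Scorecard API query URL.
# Field names and the demo key are illustrative; consult the official
# College Scorecard API documentation for the current schema.
from urllib.parse import urlencode

BASE = "https://api.data.gov/ed/collegescorecard/v1/schools"

def scorecard_url(api_key, **filters):
    """Return a query URL for the schools endpoint with the given filters."""
    params = {"api_key": api_key}
    params.update(filters)
    return BASE + "?" + urlencode(params)

url = scorecard_url("DEMO_KEY", **{"school.name": "St. John's University"})
```

From there, the URL can be fetched with any HTTP client and the JSON response parsed like any other dataset.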
There is much discussion concerning the pros and cons of using the data this way, and each position is worth a read. You can find a sample of the concerns in the New York Times, the Wall Street Journal, and the Chronicle of Higher Education. The issues raised include fears that this represents another step toward treating higher education institutions as commodities to be comparison-shopped. While there is a lot of data present here, there is also an underlying worry that the data is incomplete, or provides a skewed view of the institutions.
As with any new data source, it is worth a look, especially given the open nature and full descriptions of the data used.
Site – College Scorecard
SJU Information – St. John’s University
There are also a number of nice US Department of Education Blog posts available on their HomeRoom Blog. For College Scorecard specific posts, have a look here.
This type of data is definitely the direction the new knowledge-based society is taking us in. It provides a snapshot of how students fare after school, and it thoroughly documents the data sources that were used. Incomplete or not, there is only going to be more data like this released and used.
As the Fall semester begins, the College of Pharmacy and Health Sciences, its program leaders, faculty, and administrators aim to provide students with the information and resources needed to succeed. On August 31st, 2015, the incoming Fall 2015 didactic cohort of Physician Assistant (PA) students joined Dr. S. William Zito and the Office of Assessment for its annual Assessment Workshop led by Dr. Marc Gillespie.
Prior to the workshop, this cohort of students was invited to participate in a survey regarding their experience with and knowledge of assessment. The Pre-Assessment Workshop Survey received a 46% response rate, and based on a summary of responses it was evident that students were seeking more information on assessment. Following the workshop, the Post-Assessment Survey was administered and received a 58% response rate.
Some highlights of the survey: 94% of students are now aware of the assessment process in the college, compared to 37% before the workshop. 84% of students are now willing to participate in assessment, up from 66% of respondents beforehand. 96% of students now strongly agree or agree that they are involved in improving the culture of assessment, compared to 28% before the workshop.
A majority of respondents now know that the College of Pharmacy and Health Sciences has an Assessment Committee and an Office of Assessment, whereas previously fewer than 30% were aware of their roles. In addition, 89% now agree that the results of assessment initiatives are used effectively.
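The pre/post figures quoted above can be summarized as percentage-point gains; the short snippet below does that arithmetic using only the numbers reported in this post (the dictionary labels are paraphrases, not survey wording).

```python
# Pre- and post-workshop percentages from the survey summary above.
pre_post = {
    "aware of assessment process": (37, 94),
    "willing to participate": (66, 84),
    "involved in assessment culture": (28, 96),
}

# Percentage-point gain for each item.
gains = {item: post - pre for item, (pre, post) in pre_post.items()}
# e.g. awareness rose 57 points, willingness 18, involvement 68
```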
Results of the Post-Assessment Workshop Survey demonstrate students’ interest in and willingness to participate in the College’s assessment process, as well as the effectiveness of the Assessment Workshop. The Assessment Team encourages all students to share their feedback by participating in any survey invitations they may receive. We look forward to next year’s PA Assessment Workshop and hope to hear from all students throughout the year.
“This seems like a lot of work.”
“What do I have to do?”
“I’m not comfortable with computer-based testing.”
These are all valid concerns raised by new Examsoft users, and the types of concerns that need to be considered when rolling out a new initiative. When computer-based testing was recently introduced to the College of Pharmacy & Health Sciences, the Office of Assessment was named the Key Examsoft Administrator. A great deal of information was compiled during our recent pilot phase, which ultimately led to the creation of an instructional user manual tailored to the needs of our specific programs.
Information was collected by tracking the issues experienced during exam administrations, disseminating student surveys, and collaborating with faculty. Observing new users as they were introduced to Examsoft during training sessions, and taking note of their concerns, alerted us to the elements crucial to helping new users understand the software. All of this accumulated into a wealth of knowledge, and the next logical step was to share it with fellow users.
The manual provides a complete look at delivering computer-based assessments. It is intended to be the first resource new users turn to when building an exam. It promotes a ‘help yourself’ approach while keeping other lines of support in place: the Key Administrators are available to any new or current users, and the resources already offered by Examsoft are still actively promoted. In designing the manual, we opted to provide certain details so users can better visualize how the software’s many functions are connected, as well as gain insight into an exam taker’s experience.
As specific plans for computer-based testing in upcoming semesters’ courses are finalized, roles and responsibilities within each course will be established. The manual will be distributed to faculty users whose courses are ultimately authorized to use the software. We will continue to monitor all issues experienced by students and users, so we may keep evolving our computer-based testing practices to best track learning outcomes and student success.
The implementation of computer-based testing is on the rise in the education sector as institutions continue to integrate technology with pedagogical approaches when delivering the curriculum. The demand for computer-based testing aligns with the institution’s continued efforts to gain a greater understanding of learning outcome achievement and student performance. The College of Pharmacy and Health Sciences implemented a trial period to test the functionality of Examsoft, a program that “combines assessment creation, administration, scoring and analysis into a single platform that provides teachers with direct evidence of student learning and faculty with greater insight for metric-based decision-making.”
There are many advantages to implementing computer-based testing. Students receive prompt feedback on their exam performance and are given access to electronic exam reviews. Students receive detailed reports showing their strengths and weaknesses following each exam, allowing for self-directed learning. Faculty are able to create, deliver, and score assessments much more efficiently. The reports generated by Examsoft provide analyses of exam takers’ achievement of learning goals, a powerful data-driven resource to assist in student evaluation, accreditation, and program assessment.
The article Computer-Assisted Assessment: Impact on Higher Education Institutions (Bull, 1999) outlines the many factors surrounding computer-based testing. One hurdle to overcome is the cultural shift for some stakeholders. Students are accustomed to taking pen-and-paper exams, so there may be some apprehension during this transition. To counteract any hesitation, we have provided students with opportunities to sit through mock exams and offered informative resources regarding the new platform. Faculty manage the design, implementation, and maintenance of computer-based testing protocols, assisted at each step by the Assessment Office staff. These organizational concerns are alleviated by properly training all users, collaborating during these initial stages of development, and establishing a clear picture of all responsibilities and services following the trial period.
As the trial period comes to a close, the organizational structure of this platform will need to be finalized. The experiences of all stakeholders will be taken into consideration to build a thorough understanding of the trial’s successes and any difficulties we need to address. With proper collaboration and careful thought to ensure a strong protocol is put into practice, the advantages of implementing computer-based testing outweigh the perceived difficulties.
Student Leaders Met at the D’Angelo Center
We recently held a student forum to gain feedback on the college’s PharmD self-study process. The forum brought together student leaders from a number of different pharmacy organizations, who were asked to review ACPE standards in light of our current preparation for accreditation. These essential stakeholders were broken into small groups mirroring the college’s committee structure. Each group reviewed the standards, discussed their experiences, provided suggestions, and offered their unique perspectives on the PharmD program.
Student responses are funneled back into the committee structure via the Office of Assessment, providing committees with the students’ valuable input. Committees are currently reviewing and integrating this material and will respond to all PharmD students with their preliminary ideas on how to address these important concerns.
This process is an example of how all stakeholders can work together to improve the learning experience and academic self-study process.
The Office of Assessment recently attended the 2014 AACP Annual Meeting in Grapevine, Texas. There were many interesting sessions related to assessment and its place in pharmacy education. One of the sessions focused on the role of dashboards in organizing data and making it visually appealing.
Tableau Desktop was one of the dashboard tools discussed at the meeting. This software seems to be an effective means of understanding data and identifying trends, especially with large data sets. Raw data from surveys or research projects can be loaded from sources such as Excel spreadsheets before analysis takes place. The Tableau Desktop dashboard displays several views of the data at once. Data can be filtered to highlight specific information depending on the task at hand, fields can be combined into groups, and subsets of data can be separated for a variety of purposes. Statistical analyses like trend analyses, regressions, and correlations can also be computed.
Dashboards are a good way to help stakeholders become involved in data analysis while presenting the data in a clear format. The Office of Assessment is beginning to discuss the pros and cons of purchasing or creating a dashboard, and more research needs to be done to determine what software would be most appropriate for our needs.
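The filtering, grouping, and trend analysis described above can be sketched in a few lines of code. The snippet below uses entirely made-up survey rows (the years, programs, and scores are hypothetical, not our data) to show the kind of operations a dashboard like Tableau performs interactively: filter to a subset, aggregate by a field, and fit a trend line.

```python
# Sketch of dashboard-style analysis on hypothetical survey records,
# using only the standard library. A tool like Tableau Desktop does the
# same filtering, grouping, and trend-fitting interactively.
from collections import defaultdict

rows = [  # (year, program, score) -- illustrative data only
    (2012, "PharmD", 72), (2012, "PA", 68),
    (2013, "PharmD", 75), (2013, "PA", 71),
    (2014, "PharmD", 79), (2014, "PA", 74),
]

# Filter to one program, then group mean scores by year.
pharmd = [(year, score) for year, program, score in rows if program == "PharmD"]
by_year = defaultdict(list)
for year, score in pharmd:
    by_year[year].append(score)
means = {year: sum(scores) / len(scores) for year, scores in by_year.items()}

# Least-squares slope: the trend line a dashboard would overlay.
xs, ys = zip(*sorted(means.items()))
n = len(xs)
mx, my = sum(xs) / n, sum(ys) / n
slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / sum((x - mx) ** 2 for x in xs)
```

The point is not the code itself but that each of these steps maps onto a dashboard gesture (a filter control, a group-by shelf, a trend-line toggle), which is what makes such tools accessible to stakeholders who do not program.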