Capital Community College – Assessment Team Website

Academic Year 2014-2015

Team Members

Marie Basche, Leonel Carmona, Becky DeVito (Co-Chair), Bonnie Edelen, Seth Freeman (Co-Chair), Ira Hessmer, Daniela Ragusa, Suzanne Rocco-Foertsch, Minati Roychoudhuri, Katie Schackner, Angela Simpson, Jenny Wang

Summary of Activities

In the Fall semester the Assessment Team approved a new assessment process to increase faculty engagement in our assessment work and ensure faculty-driven actionable recommendations are generated as a result of the assessment process. The new process includes new activities such as instructor scoring of artifacts, instructor feedback forms and faculty focus groups. The new assessment process is detailed in the Assessment Process Diagram.

In support of this new process, team members developed and approved a new Instructor Feedback Form that instructors participating in assessment work will complete. The feedback form solicits observations and recommendations from faculty participating in the assessment, and yields rich qualitative data – in addition to quantitative data on student artifact scores – to inform the assessment process. The Assessment Team will later use responses from the Assessment Worksheet and Instructor Feedback Forms to develop an interview guide for faculty focus groups. The new process also includes collecting feedback from tutors in the Academic Success Center (ASC) through tutor surveys and Tutor Focus Groups.

Members of the assessment team were split into two subgroups – one focusing on the Quantitative Reasoning (QR) assessment and one focusing on the Written Communication (WC) assessment.

Members in the QR subgroup continued to work with their assigned faculty teaching QR-designated courses to conduct the QR assessment. Members collected completed Assessment Worksheets and Instructor Feedback Forms from some participating faculty, and worked with other faculty to prepare them to conduct the assessment in the Spring semester. Members in the WC subgroup continued developing the WC rubric begun the previous Spring semester. Alongside completing the rubric, team members met with English faculty to ensure they would be ready to participate in the assessment the following semester. In the Spring semester, members of the WC subgroup began assessing all sections of ENG 101, ENG 101P, and IDS 250. These sections were selected because ENG 101 was recently revamped, and the assessment will help measure the effect of the new curriculum.

Academic Year 2013-2014

Team Members

Marie Basche, Femi Bogle-Assegai, Leonel Carmona, Becky DeVito (Co-Chair), Bonnie Edelen, Seth Freeman (Co-Chair), Ira Hessmer, Jason McCormick, Angela Simpson, Jennifer Thomassen, Jenny Wang

Summary of Activities

The Assessment Team developed the Quantitative Reasoning (QR) rubric during the Fall 2013 semester. Members first reviewed quantitative reasoning rubrics developed by other colleges and universities as well as the AAC&U VALUE rubrics. Consideration was given to developing a rubric for quantitative reasoning that could be used across any academic discipline and to score student assignments of different types, including multiple-choice exams.

Alongside the development of the QR rubric, team members also identified and created sample assignments within each academic department that could be evaluated using the new draft rubric. Sample assignments came from courses such as PSY 111, MAT 137, MAT 167, CSA 135, and PHL 111. For some of these courses the identified assignments measured student achievement of all the QR outcomes, while others measured only a subset of the outcomes.

In December 2013, members completed the draft Quantitative Reasoning Rubric.

At the beginning of Spring 2014 the Assessment Team created the QR Assessment Cycle documents to forecast how multiple competency areas could be assessed simultaneously. The Cycle documents also specified that multiple rounds of assessment must be planned in advance for a given competency area, so that curricular changes can be made after the first round of results and a second round of assessment can follow up to ascertain whether the changes were effective.

Members continued to evaluate the new QR rubric by applying it to new and existing QR assignments. Members also began identifying courses and corresponding faculty within their departments willing to participate in the 2014-2015 QR assessment. Team members met with faculty across academic departments and worked with them to tailor assignments to measure all or a subset of the QR outcomes. The focus of this work was to ensure faculty were prepared to conduct the assessment in the Fall 2014 semester.

Assessment Team members also began development of the new Written Communication (WC) rubric. Members researched and evaluated existing WC rubrics used by other institutions as well as the Communicate Effectively rubric previously developed by CCC. Team members developed the draft WC rubric concurrently with drafts of sample assignments in various subject areas, applying the rubric components to the draft assignments in an iterative process.

Academic Year 2012-2013

Team Members

Femi Bogle-Assegai, Becky DeVito (Co-Chair), Bonnie Edelen, Seth Freeman (Co-Chair), Ira Hessmer, Bujar Konjusha, Daniel Tauber, Jennifer Thomassen, Jessica Vanderhoff, Jenny Wang

Summary of Activities

Under state law Public Act No. 12-31, a new Transfer and Articulation Policy (TAP) governs student learning at the 12 community colleges, 3 state universities, and Charter Oak State College, which together comprise the ConnSCU institutions. Statewide committees developed new core competencies (TAP Framework). Each of the ConnSCU institutions voted on whether to ratify this new TAP Framework and, if so, identified two competencies that would be required for all transfer students under Section B of the Framework. At Capital, the Campus-level Core Curriculum Design and Assessment Committee (C-DAC) facilitated discussions in the various academic divisions across the school (Departmental Votes on TAP), leading up to a vote of C-DAC members, which was then formulated as a recommendation to CAP and the Senate (C-DAC Recommendations). CAP and the Senate voted in line with C-DAC’s recommendations, so the TAP Framework was ratified at Capital in December of 2012, and the core competencies chosen for Section B of the Framework were Social Phenomena and Aesthetic Appreciation.

The new core competencies identified in the TAP Framework have now replaced the General Education Goals at Capital, so the Assessment Team will be developing rubrics and identifying appropriate student artifacts to collect and assess in the coming years. To prepare the groundwork for these changes, both the Assessment Team and C-DAC began assisting the departments in mapping courses to the core competencies. The Assessment Team created guidelines and worksheets (Course Competency Matching, Course Competency Matching Department Summary, Curriculum Mapping Guidelines).

A workshop was developed to sensitize Assessment Team and C-DAC members to issues involved in revising the learning outcomes of the Standardized Course Outlines (CUK Memo, CUK Syllabus, CUK Syllabus Key).

A specific guide, Recommendations for Embedding the Continuing Learning/Information Literacy Competency into Your Revised Course Outline, was also created for the Continuing Learning/Information Literacy Competency Area.

In the Spring 2013 semester C-DAC began reviewing Standardized Course Outlines of courses that newly matriculating freshmen are eligible to take across the school, agreeing on the following criteria to guide their observations and comments to faculty (C-DAC Standards for Reviewing). Course outlines approved by C-DAC were then sent to CAP and the Senate for approval in the usual manner. C-DAC review of additional course outlines, followed by approval through CAP and the Senate, will be an ongoing activity in the coming semesters.

Meeting Minutes/Agendas

 

Academic Year 2011-2012

Team Members

Becky DeVito, Bonnie Edelen, Seth Freeman (Co-Chair), Ira Hessmer, Bujar Konjusha, Catherine Schackner, Daniel Tauber (Co-Chair), Jennifer Thomassen, Jessica Vanderhoff, Jenny Wang

Summary of Activities

The Assessment Team conducted the Critical Thinking general education assessment in Fall 2011 and Spring 2012. The Assessment Team worked with faculty across all departments to collect student artifacts from selected courses in which faculty mapped assignments to the Critical Thinking rubric. The Critical Thinking Rubric used was originally developed by Valencia Community College in 2005. X different assignments throughout the college were part of the assessment. Due to the complexity of using one rubric across so many different assignments, faculty teaching the courses created customized Critical Thinking Rubrics that mapped from the rubric to the specific assignment/project.

The Assessment Team used the Digication online assessment platform to store the artifacts, rubrics, and artifact scores. Digication Training Sessions were held to prepare participating faculty and staff.

Critical Thinking Scoring Sessions were held in May 2012. The Scoring Sessions were organized by department, with faculty within each department scoring artifacts generated within their department. Some departments input the artifact scores into Digication, while others gave their scores to the Assessment Team to complete this step. The artifact scores were aggregated and summarized into the Critical Thinking Assessment Results.

In Spring 2012, the Assessment Team began planning to assess the Global Perspective general education outcome. The Assessment Team evaluated the current Global Perspective rubric, and identified ways to improve the rubric. The team also began working with Program Coordinators to identify courses to participate in the next Global Perspective Assessment.

Documents
  • Critical Thinking Rubric (modified from Valencia Community College)
  • Critical Thinking Participating Courses/Assignments
  • Sample Modified Critical Thinking Rubric
  • Critical Thinking Assessment Results

Academic Year 2010-2011

Team Members

Winchester Brown, Karen DeLoatch, Becky DeVito, Bonnie Edelen, Evelyn Farbman, Seth Freeman, Bardh Hoxha, Daniel Ragusa, Kristen Swider (Co-Chair), Daniel Tauber (Co-Chair), Jennifer Thomassen, Jenny Wang

Summary of Activities

The Assessment Team laid the groundwork for the Critical Thinking general education assessment. The team first evaluated the Critical Thinking rubric previously created by CCC faculty and reviewed the Critical Thinking assessment conducted during the X-Y assessment cycle. The team decided to use authentic course artifacts in the assessment, and chose to use a new Critical Thinking rubric developed by Valencia Community College. This new rubric was shared with faculty and staff across the college. Team members met with Program Coordinators to review previously created degree program curriculum maps, which list beginning and terminal courses within the degree programs that align with Critical Thinking outcomes. Program Coordinators were asked to refine the curriculum maps by identifying and/or developing specific critical thinking activities (assignments, projects, etc.) within the courses, and detailing where and how the activities align with specific “Think Indicators” in the Valencia Community College rubric. This work was conducted throughout the academic year to prepare for the actual Critical Thinking assessment the following year.

To follow up on the Communicate Effectively assessment the previous year, the Assessment Team administered the Survey on Faculty Writing to faculty across the college. The survey was used to determine the nature and make-up of writing assignments assigned to students, the types of feedback given to students, faculty perceptions on teaching writing, and more. 83 faculty across the college participated in the survey. A Detailed Item Analysis was compiled.

In Spring 2011, the Assessment Team coordinated the first “Assessment Day”. This was a college-wide event to showcase the hard work done across the college measuring student learning outcomes and using results to inform curriculum and services. The event was modeled after Assessment Day held at Keene State College. Assessment measures at the course level, program level, and general education level were highlighted. X faculty presented workshops, and Y faculty prepared posters.

Meeting Minutes/Agenda
  • May 2, 2011
  • Apr 4, 2011
  • Mar 21, 2011
  • Feb 28, 2011
  • Feb 7, 2011
  • Dec 1, 2010
  • Nov 17, 2010
  • Nov 3, 2010
  • Oct 10, 2010
  • Oct 6, 2010
  • Sep 22, 2010
Documents
  • Critical Thinking Rubric (modified from Valencia Community College)
  • Survey on Faculty Writing
  • Survey on Faculty Writing Detailed Item Analysis
  • Assessment Day Agenda
  • Assessment Day Video

Academic Year 2009-2010

Team Members

Leonel Carmona, Seth Freeman (Co-Chair), Evelyn Farbman, Kathy Herron, Bujar Konjusha, Art Kureczka, Daniel Tauber, Jennifer Thomassen (Co-Chair), Jenny Wang, Michelle White

Summary of Activities

The Assessment Team assessed the Written Communication general education outcome. The assessment strategy consisted of collecting authentic course artifacts from courses across the curriculum. This was a new approach taken to increase faculty participation and engagement in assessment activities. The strategy also ensured that students would fully apply themselves to the assignments being assessed. This approach, however, offered new challenges, most significantly the need to measure different types of assignments with the same rubric.

Early in Fall 2009, the Assessment Team distributed the Memo to Participating Instructors and the Memo for Participating Students to inform faculty and students participating in the assessment, and specifically to make students aware of the rubric that would be used in evaluating their writing.

The Assessment Team evaluated online assessment tools, and decided to use the Digication online assessment platform to store student artifacts, rubrics and artifact scores. To facilitate this and prepare faculty participating in the assessment to use Digication, Digication Training Sessions were held.

Artifacts were collected throughout the Fall 2009 and Spring 2010 semesters. Artifacts were collected from X courses, with Y students participating in the assessment. Artifacts were collected from courses identified as “beginning” and “terminal”. “Beginning” artifacts are writing samples created earlier in students’ programs of study, while “terminal” artifacts are collected from courses taken in the students’ last semester prior to graduation.

Three Scoring Sessions were held in Summer 2010 to score the writing artifacts collected during the academic year. A norming session was held at the beginning of each scoring session. All the student scores and artifacts were uploaded to the Digication online assessment platform. The assessment results were later analyzed and summarized into the …

Meeting Minutes/Agenda
  • May 3, 2010
  • Apr 19, 2010
  • Mar 22, 2010
  • Feb 17, 2010
  • Dec 16, 2009
  • Nov 18, 2009
  • Oct 21, 2009
  • Oct 7, 2009
  • Sep 23, 2009
  • Sep 9, 2009
  • Sep 2, 2009
Documents