ACADEMIC YEAR 2018-2019

TEAM MEMBERS

Bharat Bhushan, Jennifer Briggs, Becky DeVito (co-chair), Mary-Joan Forstbauer, Seth Freeman (co-chair), Ira Hessmer, Ricardo Martinez, Suzanne Rocco-Foertsch, Jenny Wang

CRITICAL ANALYSIS and LOGICAL THINKING

The Assessment Team developed the Critical Analysis and Logical Thinking Rubric for faculty to assess student achievement of the Critical Analysis (CA) and Logical Thinking competency.

Call to Participate

The ability to think critically is a necessary skill for all of our students to succeed in their current and future studies and careers. All courses throughout the College teach critical thinking in various ways.

The Assessment Team invites faculty across all disciplines to participate in the CA assessment during the 2018-2019 academic year. Representatives from the Assessment Team are eager to collaborate with faculty during both the Fall and Spring semesters and will provide support to develop and/or modify assignments to align with the CA competency. Stipends are available for all PT faculty who participate.

Please begin by reviewing the Critical Analysis and Logical Thinking Rubric.

If you are interested in participating in the CA assessment, please email Assessment Team co-chair Seth Freeman at [email protected].


Academic Year 2017-2018

Team Members

Bharat Bhushan, Jennifer Briggs, Becky DeVito (co-chair), Mary-Joan Forstbauer, Seth Freeman (co-chair), Ira Hessmer, Ricardo Martinez, Suzanne Rocco-Foertsch, Jenny Wang

In 2017-2018, Assessment Team members worked in different phases of assessment on the following General Education Competency Areas:

  • Written Communication
  • Scientific Reasoning
  • Critical Analysis and Logical Thinking

WRITTEN COMMUNICATION

In Fall 2017, Assessment Team members compiled results from the Written Communication Instructor Focus Groups into the WC Instructor Assessment Findings document.


Academic Year 2016-2017

Team Members

Bharat Bhushan, Becky DeVito (co-chair), Bonnie Edelen, Mary-Joan Forstbauer, Seth Freeman (co-chair), Ira Hessmer, Ricardo Martinez, Suzanne Rocco-Foertsch, Angela Simpson, Jenny Wang

In early Fall 2016, Assessment Team Co-Chairs began compiling the results from the WC Tutor and QR Faculty Focus Groups into Summary Reports to share with the College community.

Written Communication

Team members compiled results from the Written Communication Tutor Focus Groups into the WC Tutor Assessment Findings document, as well as a Summary Presentation.

The following planning and working documents were developed throughout the various qualitative data analysis stages:

  • Overview/Planning Documents & Master Code Keys
  • Analytic Memos
  • Focus Groups

Quantitative Reasoning

Team members compiled results from the Quantitative Reasoning Assessment into the Quantitative Reasoning Assessment – Executive Summary and Quantitative Reasoning Assessment – Detailed Findings.

Scientific Reasoning

Members of the Scientific Reasoning (SR) subgroup finalized the Scientific Reasoning Rubric, to be used by faculty throughout the College in evaluating student work.


Academic Year 2015-2016

Team Members

Marie Basche, Bharat Bhushan, Becky DeVito (co-chair), Bonnie Edelen, Mary-Joan Forstbauer, Seth Freeman (co-chair), Ira Hessmer, Ricardo Martinez, Daniela Ragusa, Suzanne Rocco-Foertsch, Minati Roychoudhuri, Katie Schackner, Angela Simpson, Jenny Wang

In Fall 2015, Assessment Team members continued to collect Written Communication data from faculty teaching courses mapped to the Written Communication (WC) competency area. This included collecting completed Assessment Worksheets and Instructor Feedback Forms from faculty.

In the Fall 2015 semester, team members also began a qualitative analysis of the Written Communication data provided by ASC tutors and the Quantitative Reasoning data provided by faculty in the previous academic year. As planned in the overall research design, members used qualitative analytic strategies to support a Grounded Theory approach to the new assessment process. The Grounded Theory approach was chosen for its ability to surface unanticipated insights and solutions to improve student learning in the core competencies. The techniques of qualitative analysis to be used by the entire team were introduced in an Overview of Qualitative Research Methods document and consisted of numerous phases, detailed in the Qualitative Data Analysis Schedule – Fall 2015 – Spring 2016. Extensive training materials describing the function and purpose of each step were created, and each meeting included a workshop explaining the key elements of each procedure, so that team members could engage skillfully in every step of the process.

Team members continued to work within two subgroups – the QR Instructor subgroup and the WC Tutor subgroup. Throughout the Fall 2015 semester, each subgroup completed the following activities:

  • Applying emic and etic codes to the written data from the Feedback Form/Tutor Survey to create a coded transcript.
  • Collecting all codes from the coded transcript into an Individual Code Key.
  • Sharing the Individual Code Keys created by separate team members to build a combined, comprehensive Master Code Key that establishes a common coding language for use among multiple coders.
  • (Re)coding initial and additional transcripts using the Master Code Key.
  • Developing Analytic Questions and identifying groups of similar codes that apply to the topic of each question. Using the (re)coded transcripts, searching for the relevant codes and collecting all matching data into a new document, called a Data Dump. Interpreting and reorganizing the data within each Data Dump under subheadings to create an Analytic Memo that identifies themes pertinent to the Analytic Question. Each Analytic Memo included the Analytic Question, the list of codes used in the search, the data collected during the Data Dump stage organized into meaningful themes, and an Overview at a Glance section identifying key themes and data to follow up on in the next stage of data collection: the focus group sessions. (A minimal sketch of this code-and-search workflow appears after this list.)
  • Using the Analytic Memos as input, developing Focus Group Questions to further probe and confirm topics and possible themes that arose from the written data.
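
To make the coding-and-search steps above concrete, here is a minimal, hypothetical sketch in Python. The team's actual analysis was carried out in shared documents, not code; the excerpts, code labels, and data layout below are invented purely for illustration.

    # Minimal sketch of the coding-and-search workflow described above.
    # All names, excerpts, and code labels are hypothetical.

    # A coded transcript: each excerpt from a Feedback Form / Tutor Survey
    # is tagged with one or more emic (participant-language) or etic
    # (researcher-framework) codes.
    coded_transcript = [
        {"excerpt": "Students struggle to cite sources correctly.",
         "codes": ["citation", "mechanics"]},
        {"excerpt": "Assignment prompts were unclear to tutees.",
         "codes": ["assignment-design"]},
    ]

    def build_code_key(transcripts):
        """Collect every code used across coded transcripts (an Individual
        Code Key); merging keys from several coders yields a Master Code Key."""
        key = set()
        for transcript in transcripts:
            for entry in transcript:
                key.update(entry["codes"])
        return sorted(key)

    def data_dump(transcript, search_codes):
        """Gather every excerpt tagged with a code relevant to an Analytic
        Question -- the raw material for an Analytic Memo."""
        relevant = set(search_codes)
        return [e["excerpt"] for e in transcript if relevant & set(e["codes"])]

    print(build_code_key([coded_transcript]))
    # ['assignment-design', 'citation', 'mechanics']
    print(data_dump(coded_transcript, ["citation"]))
    # ['Students struggle to cite sources correctly.']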

In Spring 2016, Team Co-Chairs developed Focus Group Interview Guides based on the Analytic Memos and Focus Group Questions developed by team members. The purpose of the Focus Groups was to engage additional members of the College in the assessment work, validate concerns identified by earlier participants, and prioritize and further identify recommended actions to improve student learning in the designated competency areas. (See QR Focus Group Guide and WC Focus Group Guide.) Focus Group Sessions were scheduled in March, April, and May 2016 for both tutors (Written Communication) and faculty (Quantitative Reasoning).

In Spring 2016, the Assessment Team also initiated the assessment process for the Scientific Reasoning (SR) competency area. Members of the Assessment Team formed the SR subgroup to focus on this competency area. As with WC and QR in previous years, members of the SR subgroup initially reviewed the SR competency area and identified/evaluated existing rubrics the College could use to evaluate student work. Noting similarities between the outcomes in Quantitative Reasoning and Scientific Reasoning, team members incorporated some of the language from the QR rubric into a new SR rubric. Team members created a draft SR rubric and shared the draft rubric with faculty teaching courses mapped to SR. Members of the subgroup also worked with participating faculty to identify assignments, lab activities and projects capable of measuring student learning of SR.

As data collection from additional WC Instructors was completed by the end of the Fall 2015 semester, the WC subgroup applied the stages of analysis described above to the WC Instructor Feedback Forms, from initial coding through the stage of brainstorming possible Focus Group questions based on the analytic memos.


Academic Year 2014-2015

Team Members

Marie Basche, Leonel Carmona, Becky DeVito (Co-Chair), Bonnie Edelen, Seth Freeman (Co-Chair), Ira Hessmer, Daniela Ragusa, Suzanne Rocco-Foertsch, Minati Roychoudhuri, Katie Schackner, Angela Simpson, Jenny Wang

Summary of Activities

In the Fall semester, the Assessment Team approved a new assessment process to increase faculty engagement in our assessment work and to ensure that faculty-driven, actionable recommendations are generated as a result of the assessment process. The new process includes activities such as instructor scoring of artifacts, Instructor Feedback Forms, and faculty focus groups, and is detailed in the Assessment Process Diagram.

In support of this new process, team members developed and approved a new Instructor Feedback Form for instructors participating in assessment work to complete. The form solicits observations and recommendations from participating faculty, yielding rich qualitative data – in addition to quantitative data on student artifact scores – to inform the assessment process. The Assessment Team would later use responses from the Assessment Worksheets and Instructor Feedback Forms to develop an interview guide for faculty focus groups. The new process also included collecting feedback from tutors in the Academic Success Center (ASC) through a tutor survey and Tutor Focus Groups.

Members of the Assessment Team were split into two subgroups – one focusing on the Quantitative Reasoning (QR) assessment and one focusing on the Written Communication (WC) assessment.

Members of the QR subgroup continued to work with their assigned faculty teaching QR-designated courses to conduct the QR assessment. Members collected completed Assessment Worksheets and Instructor Feedback Forms from some participating faculty, and worked with other faculty to prepare them to conduct the assessment in the Spring semester. Members of the WC subgroup continued developing the WC rubric started the previous Spring semester. Alongside completing the rubric, team members met with English faculty to ensure they were ready to participate in the assessment the following semester. In the Spring semester, members of the WC subgroup began assessing all sections of ENG 101, ENG 101P, and IDS 250. All sections of these courses were selected for participation because ENG 101 had recently been revamped, and the assessment would help measure the effect of the new curriculum.


Academic Year 2013-2014

Team Members

Marie Basche, Femi Bogle-Assegai, Leonel Carmona, Becky DeVito (Co-Chair), Bonnie Edelen, Seth Freeman (Co-Chair), Ira Hessmer, Jason McCormick, Angela Simpson, Jennifer Thomassen, Jenny Wang

Summary of Activities

The Assessment Team developed the Quantitative Reasoning (QR) rubric during the Fall 2013 semester. Members first reviewed quantitative reasoning rubrics developed by other colleges and universities as well as the AAC&U VALUE rubrics. Consideration was given to developing a rubric that could be used across any academic discipline and could score student assignments of different types, including multiple-choice exams.

Alongside the development of the QR rubric, team members also identified and created sample assignments within each academic department that could be evaluated using the new draft rubric. Sample assignments came from courses such as PSY 111, MAT 137, MAT 167, CSA 135, and PHL 111. Some of these identified assignments measured student achievement of all the QR outcomes, while others measured only a subset of outcomes.

In December 2013, members completed the draft Quantitative Reasoning Rubric.

At the beginning of Spring 2014, the Assessment Team created the QR Assessment Cycle documents to forecast how multiple competency areas could be assessed simultaneously. The Cycle documents also specified how multiple rounds of assessment must be planned in advance for a given competency area, allowing curricular changes to be made after the first round of results and a follow-up round of assessment to ascertain whether the changes were effective.

Members continued to evaluate the new QR rubric by applying it to new and existing QR assignments. Members also began identifying courses and corresponding faculty within their departments willing to participate in the 2014 – 2015 QR assessment. Team members met with faculty across academic departments and worked with them to tailor assignments to measure all or a subset of the QR outcomes. The focus of this work was to ensure faculty were prepared to conduct the assessment in the Fall 2014 semester.

Assessment Team members also began development of the new Written Communication (WC) rubric. Members researched and evaluated existing WC rubrics used by other institutions, as well as the Communicate Effectively rubric previously developed by CCC. Team members developed the draft WC rubric concurrently with drafts of sample assignments in various subject areas, applying the rubric components to the draft assignments in an iterative process.


Academic Year 2012-2013

Team Members

Femi Bogle-Assegai, Becky DeVito (Co-Chair), Bonnie Edelen, Seth Freeman (Co-Chair), Ira Hessmer, Bujar Konjusha, Daniel Tauber, Jennifer Thomassen, Jessica Vanderhoff, Jenny Wang

Summary of Activities

Under state law Public Act No. 12-31, a new Transfer and Articulation Policy (TAP) will govern student learning at the 12 community colleges, 3 state universities, and Charter Oak State College, which together comprise the ConnSCU institutions. Statewide committees developed new core competencies (TAP Framework). Each ConnSCU institution voted on whether to ratify the new TAP Framework and, if so, identified two competencies that would be required for all transfer students under Section B of the Framework. At Capital, the campus-level Core Curriculum Design and Assessment Committee (C-DAC) facilitated discussions in the various academic divisions across the school (Departmental Votes on TAP), leading up to a vote of C-DAC members, which was then formulated as a recommendation to CAP and the Senate (C-DAC Recommendations). CAP and the Senate voted in line with C-DAC’s recommendations, so the TAP Framework was ratified at Capital in December 2012, and the core competencies chosen for Section B of the Framework were Social Phenomena and Aesthetic Appreciation.

The new core competencies identified in the TAP Framework have now replaced the General Education Goals at Capital, so the Assessment Team will be developing rubrics and identifying appropriate student artifacts to collect and assess in the coming years. To lay the groundwork for these changes, both the Assessment Team and C-DAC began assisting the departments in mapping courses to the core competencies. The Assessment Team created guidelines and worksheets (Course Competency Matching, Course Competency Matching Department Summary, Curriculum Mapping Guidelines).

A workshop was developed to sensitize Assessment Team and C-DAC members to issues involved in revising the learning outcomes of the Standardized Course Outlines (CUK Memo, CUK Syllabus, CUK Syllabus Key).

A specific guide was also created, Recommendations for Embedding the Continuing Learning/ Information Literacy Competency into Your Revised Course Outline, for the Continuing Learning/Information Literacy Competency Area.

In the Spring 2013 semester, C-DAC began reviewing Standardized Course Outlines of courses that newly matriculating freshmen are eligible to take across the school, agreeing on a set of criteria to guide their observations and comments to faculty (C-DAC Standards for Reviewing). Course outlines approved by C-DAC were then sent to CAP and the Senate for approval in the usual manner. C-DAC review of additional course outlines, followed by approval through CAP and the Senate, will be an ongoing activity in the coming semesters.


Academic Year 2011-2012

Team Members

Becky DeVito, Bonnie Edelen, Seth Freeman (Co-Chair), Ira Hessmer, Bujar Konjusha, Catherine Schackner, Daniel Tauber (Co-Chair), Jennifer Thomassen, Jessica Vanderhoff, Jenny Wang

Summary of Activities

The Assessment Team conducted the Critical Thinking general education assessment in Fall 2011 and Spring 2012. The Assessment Team worked with faculty across all departments to collect student artifacts from selected courses in which faculty mapped assignments to the Critical Thinking rubric. The Critical Thinking Rubric used was originally developed by Valencia Community College in 2005. X different assignments throughout the college were part of the assessment. Due to the complexity of applying one rubric to so many different assignments, faculty teaching the courses created customized Critical Thinking Rubrics that mapped from the rubric to the specific assignment/project.

The Assessment Team used the Digication online assessment platform to store the artifacts, rubrics, and artifact scores. Digication Training Sessions were held to prepare participating faculty and staff.

Critical Thinking Scoring Sessions were held in May 2012. The Scoring Sessions were organized by department: faculty within each department scored artifacts generated within their department. Some departments input the artifact scores into Digication, while others gave their scores to the Assessment Team to complete this step. The artifact scores were aggregated and summarized into the Critical Thinking Assessment Results.
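
As a rough illustration of the aggregation step, the sketch below averages rubric scores by department and criterion. It is a hypothetical Python example only; the record layout, field names, and 4-point scale are assumptions, not details of the actual Digication export.

    # Hypothetical sketch of aggregating artifact scores into summary
    # results; the record layout and 4-point scale are assumptions.
    from statistics import mean

    # One record per scored artifact: department, rubric criterion, score.
    scores = [
        {"dept": "English", "criterion": "Analysis", "score": 3},
        {"dept": "English", "criterion": "Analysis", "score": 2},
        {"dept": "Science", "criterion": "Evidence", "score": 4},
    ]

    def summarize(records):
        """Average the scores for each (department, criterion) pair."""
        groups = {}
        for r in records:
            groups.setdefault((r["dept"], r["criterion"]), []).append(r["score"])
        return {pair: round(mean(vals), 2) for pair, vals in groups.items()}

    print(summarize(scores))
    # {('English', 'Analysis'): 2.5, ('Science', 'Evidence'): 4}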

In Spring 2012, the Assessment Team began planning to assess the Global Perspective general education outcome. The Assessment Team evaluated the current Global Perspective rubric, and identified ways to improve the rubric. The team also began working with Program Coordinators to identify courses to participate in the next Global Perspective Assessment.

Documents
  • Critical Thinking Rubric (modified from Valencia Community College)
  • Critical Thinking Participating Courses/Assignments
  • Sample Modified Critical Thinking Rubric
  • Critical Thinking Assessment Results

Academic Year 2010-2011

Team Members

Winchester Brown, Karen DeLoatch, Becky DeVito, Bonnie Edelen, Evelyn Farbman, Seth Freeman, Bardh Hoxha, Daniel Ragusa, Kristen Swider (Co-Chair), Daniel Tauber (Co-Chair), Jennifer Thomassen, Jenny Wang

Summary of Activities

The Assessment Team laid the groundwork for the Critical Thinking general education assessment. The team first evaluated the Critical Thinking rubric previously created by CCC faculty and reviewed the Critical Thinking assessment conducted during the X-Y assessment cycle. The team decided to use authentic course artifacts in the assessment and chose a new Critical Thinking rubric developed by Valencia Community College. This new rubric was shared with faculty and staff across the college. Team members met with Program Coordinators to review previously created degree program curriculum maps, which list the beginning and terminal courses within the degree programs that align with Critical Thinking outcomes. Program Coordinators were asked to refine the curriculum maps by identifying and/or developing specific critical thinking activities (assignments, projects, etc.) within the courses, and detailing where and how the activities align with specific “Think Indicators” in the Valencia Community College rubric. This work was conducted throughout the academic year to prepare for the actual Critical Thinking assessment the following year.

To follow up on the Communicate Effectively assessment of the previous year, the Assessment Team administered the Survey on Faculty Writing to faculty across the college. The survey was used to determine the nature and make-up of writing assignments assigned to students, the types of feedback given to students, faculty perceptions of teaching writing, and more. 83 faculty across the college participated in the survey, and a Detailed Item Analysis was compiled.
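
For readers unfamiliar with the term, a detailed item analysis typically tabulates how respondents answered each survey item. The Python sketch below is purely illustrative; the item wording and response options are invented and do not come from the actual Survey on Faculty Writing.

    # Hypothetical per-item frequency tabulation, the core of a detailed
    # item analysis; the items and responses below are invented examples.
    from collections import Counter

    responses = {
        "Q1: How often do you assign formal writing?":
            ["Weekly", "Monthly", "Weekly", "Never"],
        "Q2: Do you give written feedback on drafts?":
            ["Yes", "Yes", "No", "Yes"],
    }

    for item, answers in responses.items():
        counts = Counter(answers)
        n = len(answers)
        breakdown = ", ".join(f"{opt}: {c}/{n} ({c / n:.0%})"
                              for opt, c in counts.most_common())
        print(f"{item}\n  {breakdown}")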

In Spring 2011, the Assessment Team coordinated the first “Assessment Day,” a college-wide event to showcase the hard work done across the college to measure student learning outcomes and use the results to inform curriculum and services. The event was modeled after the Assessment Day held at Keene State College. Assessment measures at the course, program, and general education levels were highlighted. X faculty presented workshops, and Y faculty prepared posters.

Meeting Minutes/Agenda
  • May 2, 2011
  • Apr 4, 2011
  • Mar 21, 2011
  • Feb 28, 2011
  • Feb 7, 2011
  • Dec 1, 2010
  • Nov 17, 2010
  • Nov 3, 2010
  • Oct 10, 2010
  • Oct 6, 2010
  • Sep 22, 2010
Documents
  • Critical Thinking Rubric (modified from Valencia Community College)
  • Survey on Faculty Writing
  • Survey on Faculty Writing Detailed Item Analysis
  • Assessment Day Agenda
  • Assessment Day Video

Academic Year 2009-2010

Team Members

Leonel Carmona, Seth Freeman (Co-Chair), Evelyn Farbman, Kathy Herron, Bujar Konjusha, Art Kureczka, Daniel Tauber, Jennifer Thomassen (Co-Chair), Jenny Wang, Michelle White

Summary of Activities

The Assessment Team assessed the Written Communication general education outcome. The assessment strategy consisted of collecting authentic course artifacts from courses across the curriculum. This was a new approach taken to increase faculty participation and engagement in assessment activities … The strategy also ensured students would fully apply themselves … This approach, however, offered new challenges, most significantly the need to measure different types of assignments with the same rubric.

Early in Fall 2009, the Assessment Team distributed the Memo to Participating Instructors and the Memo for Participating Students to inform faculty and students participating in the assessment, and specifically to make students aware of the rubric that would be used in evaluating their writing. …

The Assessment Team evaluated online assessment tools and decided to use the Digication online assessment platform to store student artifacts, rubrics, and artifact scores. Digication Training Sessions were held to prepare faculty participating in the assessment to use the platform.

Artifacts were collected throughout the Fall 2009 and Spring 2010 semesters from X courses, with Y students participating in the assessment. Artifacts were collected from courses identified as “beginning” and “terminal”: “beginning” artifacts are writing samples created earlier in students’ programs of study, while “terminal” artifacts are collected from courses taken in students’ last semester prior to graduation.

Three Scoring Sessions were held in Summer 2010 to score the writing artifacts collected during the academic year. A norming session was held at the beginning of each scoring session. All the student scores and artifacts were uploaded to the Digication online assessment platform. The assessment results were later analyzed and summarized into the …

Meeting Minutes/Agenda
  • May 3, 2010
  • Apr 19, 2010
  • Mar 22, 2010
  • Feb 17, 2010
  • Dec 16, 2009
  • Nov 18, 2009
  • Oct 21, 2009
  • Oct 7, 2009
  • Sep 23, 2009
  • Sep 9, 2009
  • Sep 2, 2009