Qualitative vs. Quantitative Data

Since there is some confusion as to the difference between qualitative and quantitative data in assessing student learning, I thought I would ask an expert: ChatGPT. Full disclosure: everything below comes from what the AI told me. Actually, I think ChatGPT did a pretty good job, but if you have any questions, feel free to reach out to me.

Qualitative vs. Quantitative Data according to ChatGPT: Retrieved 1/28/2023

“Qualitative data is descriptive and deals with non-numerical information, such as opinions, observations, or open-ended responses. It is often used to understand attitudes, behaviors, and motivations. Examples include interview transcripts, written comments, and video recordings.

Quantitative data is numerical and deals with measurement and counts. It is often used to test hypotheses and make statistical inferences. Examples include numerical survey responses, test scores, and counts of things such as number of customers in a store.

When measuring student learning outcomes, qualitative and quantitative data can provide different types of information.

Quantitative data, such as test scores and grades, can provide a numerical measure of a student’s knowledge or proficiency in a specific subject. It can be used to compare students to one another, as well as to track progress over time. It is particularly useful for measuring knowledge in subjects that have clear and objective answers, such as math and science.

Qualitative data, such as observations, interviews, and open-ended questions, can provide a deeper understanding of a student’s thought process, motivation, and engagement. It can be used to understand student’s learning process and evaluate the effectiveness of teaching methods. It is particularly useful for measuring skills that are not easily quantifiable, such as critical thinking, problem-solving and creativity.

Both types of data can be used together to get a more complete picture of student learning outcomes. Quantitative data provides a snapshot of student performance, while qualitative data provides a more nuanced understanding of how and why students are performing a certain way.

”

ARTS Annual Assessment Plan

ARTS requires schools to provide an annual assessment plan highlighting student learning outcomes.  This can sound like a daunting task, but it’s actually common sense. Let me break this down into 7 simple steps.

  1. Who is responsible?  The first thing your assessment plan should have is a statement of the person or persons at your institution who get together to discuss how well your students are meeting your learning goals.  Your plan should list those people, how they are organized, how often they meet, and where meeting minutes are kept.
  2. Do you have clearly stated learning outcomes?  All schools have program goals and learning outcomes, but when was the last time you reviewed and updated them?  Understand the difference between general goals and specific learning outcomes.  Goals are big vision statements.  E.g., “Our goal is to equip men to be pastors.”  But there are many dimensions to “equipping.”  If I were to ask you to show me an “equipped man,” you should be able to list specific things that this person would be able to know, value, or do – those specific items are your learning outcomes.
  3. What are the tools you use to measure your learning outcomes?  The one tool that always comes to mind is the student course evaluation form, but there are other evaluation methods.  Measuring student completion and dropout rates is a form of assessment.  Peer reviews can be helpful.  Asking other faculty members to review completed assignments is another tool.  Reviewing the rubrics used to evaluate student sermons, papers, or portfolios is yet another.  There are several types of tools you can use to assess student learning.
  4. How do you analyze your assessment data?  While all seminaries collect course evaluations, not everyone takes the time to thoughtfully analyze the results.  A quick glance at what students say is not sufficient for an assessment plan.  Take the time to analyze, summarize, distribute and discuss the data.
  5. Do you conduct an annual review of your assessment tools?  When was the last time you reviewed your assessment processes and tools?  Are you still using the same assignments, rubrics, surveys, and course evaluation forms that you were 5 years ago?  Things change.  Have you changed to keep up?
  6. Are you improving based on your assessment results?  In assessment lingo, this is called “closing the loop”.  Your plan should explain how the data you’ve collected in your assessments is helping you to improve your teaching and learning results.  Each year you should list specific things you are doing better because of your assessment results.
  7. Are you telling the world about your learning results?  Where on your school’s web page is a link that says something like “Educational quality” or “Measuring student learning”?  That page should have the data you collect annually regarding graduation rates, results of student surveys, testimonials, etc.

When you answer the above seven questions, you will have created an ARTS annual learning assessment plan.  The first time you create a plan, it will be relatively simple.  As assessment becomes part of your regular academic cycle, you will find that the process becomes much easier and the results more valuable.

In a future posting I’ll explore details of how to create meaningful learning outcomes.

A School’s response to a Site Visit Report

This posting talks about what is required of a member school after they receive a site-visit report.

First, some background.  Our Policies and Procedures (and other documents) address specific processes and timeframes for site visits.  While the process is documented in our Bylaws and Policies and Procedures, we are taking steps (such as our new Blog – https://artseminaries.org/blog/ ) to further explain the process in plain language.

After a site visit is concluded, the report is finalized by the site-visit team and sent to the school for its review and comments. If the site-visit team marks a standard as “partially met” or “not met,” the school must respond to those items with a plan for remediation.  That plan is then reviewed by the ARTS Executive Director and forwarded to the COA for approval.  For most items, schools will tend to agree with the findings of the site-visit team and say something to the effect of: “Yes, we recognize that this is a problem, and here is how we will fix it by <date>.” Schools have 12 months to remedy notations.

The key is that the ARTS-COA requires compliance, not just a response to items which partially meet, or do not meet, ARTS standards.

However, what happens when a school does not agree with the COA’s findings?  The first step is to seek more information to understand the school’s perspective.  Usually, a short discussion clarifies the issue, and the institution has time to come into compliance.  The submission of the next annual report is a logical “touchpoint” to monitor areas of improvement.  However, if an institution does not demonstrate correction of the notations and compliance with the standards within 12 months, the COA will take action as delineated in our Policies and Procedures.

The COA also takes into consideration the nature of the standard that has been partially met or not met at all.  Not all standards, or lapses in compliance, are of the same importance or carry the same weight.  In other words, there are mortal and venial sins.

The complexity of standard non-compliance is also a factor.  Take Standard 8.7, for instance: “The institution shall include on its website, social media, and any other appropriate publication an explanation of accreditation and the dangers of diploma and accreditation mills.” If such an explanation does not exist, it is a simple thing to add one in a relatively short period of time.  However, if a school does not currently have a chief executive officer, it might take months, or longer, to find a suitable candidate.

As noted above, there are gradations in the process.  Some areas of non-compliance are clear cut.  E.g., Standard 8.5: “…does the institution publish and make publicly available an academic catalog…?”  If a catalog does not exist, it is a clear failure on the school’s part.  Our new training for site-visit teams will help teams understand and apply rubrics for those areas in a standard that are less clear in interpretation.  Again, to take 8.5 as our example, the site-visit team may feel that the academic catalog’s structure or language does not fully explain to prospective students the precise nature of the programs.  The site-visit team has the obligation to make comments and suggestions with the goal of helping the institution improve.

Finally, having said all of that, there may be items in a site-visit report on which members of the ARTS-COA hold differing opinions.  Given that such items exist, and given the changing and evolving state of theological education, the COA must keep an open mind to potential revisions of the standards and/or further elucidation of our tenets and practices.  The COA needs to practice continual self-evaluation and ensure that we are applying ARTS standards consistently.

We still have some details to work out in this process to make sure that we are applying our metrics and rubrics for compliance equitably across all our institutions.  In addition, we want to make sure that we are pursuing any actions in a timely fashion.

The bottom line in all of this is the recognition that we are all involved in a process of continual self-improvement.

Welcome to the ARTS Blog

Introducing a new channel for news and information.

Welcome to the brand-new ARTS Blog. In this space you will see announcements of ARTS activities, discussion of accreditation standards, and other information that will be helpful to our member seminaries and to the public. If you have any questions, please contact us at arts@artseminaries.org and/or use the contact form at: https://artseminaries.org/contact/
