Conference Theme: “Teaching in the 21st Century”

The Courage to Teach: Using a Faculty Learning Community to Reframe the Role of the Professor

Julia Metzker, Rajni Shankar-Brown, and Michael Eskenazi, Stetson University
[Prezi] [Planning Doc]

Abstract: Many colleges have embraced “engaged learning” as an antidote to disruptive forces in higher education. Engaged learning environments transform the classroom from a space where the professor delivers knowledge into one where the professor facilitates learning, offering authentic opportunities to engage deeply with content and the learning process. Faculty involved in engaged learning need adequate preparation, resources, and support. Three faculty members will report on their experiences as participants in a year-long learning community that uses Parker Palmer’s classic text, The Courage to Teach, as a guide to reflect on and develop impactful educational practices.

 

Enhancing Metacognition, Grit, and Growth Mindset for Student Success

Peter Arthur, University of British Columbia Okanagan
ArthurP-SSTL17-Handouts

Abstract: Research studies indicate a positive relationship between a student’s metacognition, grit, growth mindset and academic success. These traits can all be taught and enhanced through experience. Further, these traits all assist students with being successful lifelong learners. This session focuses on evidence-based strategies teachers may embed in their learning environments. Participants will then be able to evaluate multiple ways these strategies can be integrated into one’s teaching.

Notes: This is the exam-wrapper SoTL work Peter Felten mentioned in the first SoTL workshop we had in January. Metacognition, grit, and growth mindset correlate with academic success (persistence and graduation). His premise is that all three of these characteristics can be taught and scaffolded. He does this by adding “wrappers” to exams and assignments that ask students to anticipate (before) and to reflect and plan (after) as metacognitive practice for exams. Most of the session focused on metacognition. I didn’t really get what he did to foster grit and growth mindset with students.

Resources

  • Carol Dweck’s work on growth mindset
  • Angela Duckworth’s work on grit
  • Butler, D. L., & Winne, P. H. (1995). Feedback and self-regulated learning: A theoretical synthesis. Review of Educational Research, 65(3), 245-281.
  • Zimmerman, B. J., & Schunk, D. H. (2001). Self-regulated learning and academic achievement: Theoretical perspectives. Lawrence Erlbaum.
  • Research based study tips (in handout)

The Marketing Mindset: Tools to Increase Participation in Professional Development Programs

Diane Boyd, Lindsay Doukopoulos, and Marisa Rodriguez, Auburn University
MarketingMindset-AuburnBiggioCenterHandouts

Abstract: This session will feature data supporting the benefits of cultivating a marketing mindset in professional development. Presenters will share a set of effective strategies for driving engagement in development programs, with particular emphasis on a faculty interview campaign that resulted in a persona report. Since we began using this report to customize communications, participation in Biggio programs has increased by over 172%. Participants will use case studies and have access to our methodology to assist them in identifying their own possibilities for increasing professional development “reach,” even without a dedicated marketing position in the organizational chart.

Notes: This session showed how the Auburn Biggio Center used a marketing hire to build an audience for their center. An interesting overview of marketing principles was presented, with case studies and a very neat planning diagram.

Are Your Students Effectively Marketing Their Research Experiences?

Rosalie Richards, Stetson University

A student’s ability to successfully map her experiences to the competencies desired by employers and graduate schools sets her apart from other candidates. However, there is often a chasm between what is taught and what students must demonstrate. Undergraduate research as a pedagogy of engagement offers ample opportunity for mentors to evaluate a student’s level of academic preparation. But are faculty well-equipped to help students effectively articulate the competencies acquired from participation in research? During this interactive workshop, participants will work together to help students, including marginalized students, situate themselves in a position of competitive advantage in a global marketplace.

Notes: This is an extension of the session Rosalie did at CUR, thinking through how faculty can help students articulate their educational and work experiences into transferable skills.  We rewrote a cover letter.

Transparent Assignments That Promote Equitable Opportunities for Your Students’ Success

Mary-Ann Winkelmes, Transparency in Learning and Teaching in Higher Education Project, University of Nevada, Las Vegas

Transparent teaching/learning practices make learning processes explicit while offering opportunities to foster students’ metacognition, confidence, and their sense of belonging in college in an effort to promote student success equitably. A 2016 publication identifies transparent assignment design as a replicable teaching intervention that significantly enhances students’ success, with greater gains for historically underserved students (Winkelmes et al., Peer Review, Spring 2016). We’ll review the findings as well as educational research behind the concept of transparent teaching/learning in this session.

Notes: Mary-Ann showed the data from her work on transparent assignments at UNLV. She also showed the basics of a transparent assignment. We looked at some assignments and improved them to be more transparent.

The Unwritten Rules of College: Transparency and Its Impact on Learning

Mary-Ann Winkelmes, Transparency in Learning and Teaching in Higher Education Project, University of Nevada, Las Vegas

Data from a 2014-2015 AAC&U study of students’ learning at seven Minority-Serving Institutions indicates that transparency in assignments boosts students’ success (and especially underserved students’ success) significantly in three important areas: academic confidence, sense of belonging, and mastery of the skills employers value most when hiring (Winkelmes et al., Peer Review, Spring 2016). In this session, we’ll review the findings about how transparent assignment design promotes students’ success equitably, as well as educational research behind the concept of transparent teaching/learning. Then we’ll apply that research to the design of class activities and course assignments. Participants will leave with a draft assignment or activity for one of their courses, and a concise set of strategies for designing transparent assignments that promote students’ learning.

Notes: This was a very cool workshop where we worked with someone not in our discipline to make an assignment transparent.  We had to explain the assignment to someone and they had to tell us what they would do as a student in that class.  I’ve asked her to send her handouts, which are kept in a shared google drive.  We will do this at the next BIF.

A Five-Step Model for Deep Understanding: Facilitating Student Learning

Esther Zirbel, Consultant

How can an instructor motivate students to reflect deeply and think critically, expand their world views, and inspire them to think beyond a problem? This five-step model utilizes constructivist learning theories and supplements them with insights gained from cognitive neuroscience while focusing on several leading questions, including: What are the students’ challenges as they move through all steps? What is likely to happen at each step on a neural level in the students’ brains as they deepen their understanding? What can an instructor do to support students as they move through each step?

Hello, My Name Is Sigmund Freud: Using Role Play Discussions to Facilitate Learning

Jessica Waesche, University of Central Florida

We are always looking for new ways to increase engagement and facilitate student learning, particularly in the online learning environment. I will discuss how, using Mark Carnes’ Minds on Fire, I was inspired to create and implement a role-play discussion assignment in my online Abnormal Psychology class. I will discuss how I created the assignment, how students reacted to it, and the lessons I have learned.

Diversity and Motivation: The Visual Arts Administration Course

Ilenia Colon Mendoza, University of Central Florida

The 21st-century teaching strategies for my Visual Arts Administration course center on meeting the four conditions of the motivational framework presented and discussed in M.B. Ginsberg and R.J. Wlodkowski, Diversity and Motivation, Second Edition (John Wiley & Sons, 2009). I focus on providing Inclusion, Attitude, Meaning, and Competence to maximize effective learning and culturally responsive teaching. Active participation, choice, real-world experiences, and authentic performance tasks allow students to enjoy first-hand interactions with art that foster success in the classroom and create long-term impact on their real lives and future careers.

Translating Midterm Student Feedback into Improved Student Evaluations and Learning

Douglas Holton, Hajara Mahmood, and Kathryn Cunningham, Embry-Riddle Aeronautical University

A Midterm Student Feedback survey involves an instructor or outsider asking students for comments and suggestions for improving a course in the middle of the semester rather than at the end. Research studies have shown how to conduct this service such that it may improve both student ratings and student learning. In this workshop, participants will learn how to design and implement their own Midterm Student Feedback survey. Then we’ll share and discuss the most common student survey comments and how to translate each of them into specific strategies for improving student evaluations and learning.

Experiential Learning: An Integrative Process to Foster Appreciation of Nursing Research

Carrie Hall, Christy Skelly, Carrie Risher, and Beverley Brown, Florida Southern College

Students often identify the value of research within practice but find it difficult to conceptualize. In an effort to bridge this gap, nursing faculty at FSC actively worked with senior community health nursing students in the development and delivery of a community educational event to increase female undergraduate college students’ knowledge of women’s reproductive health. The event included a student-developed and student-delivered poster presentation, coupled with an IRB-approved descriptive research study to evaluate pre/post-test knowledge. Students and faculty worked together in all aspects of the study implementation, including recruitment, consenting, delivery of the intervention, and testing.

Exploring Cohort-Based Delivery of Educational Programs in the State of Florida

Lou Sabina, Patrick Coggins, Chris Colwell, Glen Epley, Rajni Shankar-Brown, and Debra Touchton, Stetson University

This 15-minute presentation will discuss the model currently employed by Stetson University for cohort-based delivery of its Master’s Degree in Educational Leadership. Participating faculty will discuss the strengths, weaknesses, challenges, and opportunities for growth in county-based delivery of educational leadership courses. While our subject area is educational leadership, our intention is to provide information regarding the model, which could easily be customized across other dimensions of education, including teacher preparation, curriculum and instruction, educational technology, ESOL, or reading and literacy education.

Allen, C. (n.d.). 20-Bloom-Question-Cues-Chart [PDF]. Retrieved from http://www.asainstitute.org/conference2013/handouts/20-Bloom-Question-Cues-Chart.pdf
Transparency in Teaching: Faculty Share Data and Improve Students’ Learning. (2013, April 22). Liberal Education. Retrieved February 16, 2017, from http://www.aacu.org/publications-research/periodicals/transparency-teaching-faculty-share-data-and-improve-students
Transparency and Problem-Centered Learning. (2014, August 5). AAC&U. Retrieved February 16, 2017, from http://www.aacu.org/problemcenteredlearning
nicolasimmons. (2016, June 24). Motivation. Retrieved from https://researchsotl.wordpress.com/2016/06/24/motivation/
Winkelmes, M., et al. (2016). A Teaching Intervention that Increases Underserved College Students’ Success. Peer Review, Winter/Spring 2016. Retrieved February 16, 2017, from http://www.aacu.org/peerreview/2016/winter-spring/Winkelmes

Executive Summary

  • Start with a very brief, to-the-point description of the plan’s focus and strategy, linked to the mission.
  • Highlight the potential impact and the inclusive process early on
  • Clearly articulate goals and expected outcomes
  • Provide a brief overview of the strategy for achieving outcomes
  • Highlight timeline

Background

  • Provide institutional context.
    • Liberal arts
    • Strong volunteerism
    • Need for more intentional assessment of impact on student learning
    • Learning beyond the classroom
  • Describe mission revision – focus on engaged citizenship . . .
  • Pull pieces from QEP 101
  • Bring in strategic plan and findings from QEP task force reports.

Plan

  • Include table that shows how learners and builders will progress

Appearance & style

  • Plain language that avoids jargon
  • Bold colors consistently applied throughout the document
  • Simple graphics that illustrate meaning

Heartbreaking.  Indeed, academia is broken.

This morning, my professor handed me back a paper (a literature review) in front of my entire class and exclaimed “this is not your language.” On the top of the page they wrote in blue ink: “Please go back and indicate where you cut and paste.” The period was included. They assumed that the work I turned in was not my own. My professor did not ask me if it was my language, instead they immediately blamed me in front of peers. On the second page the professor circled the word “hence” and wrote in between the […]

Source: Academia, Love Me Back – TIFFANY MARTÍNEZ

[Status: DRAFT]

Dr. Steven Sheeley, Vice President, Southern Association of Colleges and Schools Commission on Colleges, Decatur, GA

ABSTRACT: This session will discuss components of an acceptable QEP as described in Core Requirement 2.12 and Comprehensive Standard 3.3.2 in the Principles of Accreditation.

Resources

  1. Presentation (pdf) (prezi)

Summary

This presentation was by far the most useful regarding the QEP, and you know I love a prezi. Starting with the good news: just about everything Dr. Sheeley advised, we have already done!

HOWEVER, he insisted that having some baseline measurement for your QEP outcomes is an absolute MUST! Not having baseline data will likely lead to a recommendation from your evaluators. He specifically gave an example of knowing how many QEP courses/experiences you have before you start AND knowing something about where you are on student learning before you start. This is a compelling reason to launch our inventory this spring. That way we’d be able to say to the review committee that even though we don’t have a baseline right now, we will have one by the time we implement. I am also going to trawl through the program/core assessments in Compliance Assist and see if there are any program- or course-level measurements that will help us establish a baseline.

Some highlights I took away from his talk …

  • The QEP is ACTION RESEARCH.  Propose your plan as a research question.  It is OK not to know how it will turn out.
  • Don’t use NSSE as a direct measure of learning – at best it is an affective measure of how students feel about their experience at the moment they complete the survey.
  • The plan should demonstrate a commitment to quality AND student learning.
  • The plan should have a clear thesis early in the document
  • The plan should clearly define success
  • The plan should address what will be enhanced AND that should thread throughout the document.
  • Don’t overdo it on the literature review.  Use only literature that supports your thesis.  You do NOT have to prove your thesis is unique.
  • The goals and assessment MUST be strongly aligned with the thesis
  • You should have both baseline data and defined targets for your assessment
  • The data you gather should be authentic and appropriate.
  • The plan should include a mechanism for ensuring the data will be used formatively to adjust the plan based on what we learn.
  • The scope of the plan should be relatively narrow – resist the temptation to solve all the university’s ills with this plan.
  • The budget should demonstrate adequate human, financial and physical resources for success
  • Establish a mechanism for disseminating successes to the campus and community as the plan progresses – don’t do the work in a vacuum!
  • Look for existing aligned programs that can be enhanced (e.g., experiential transcript)
  • Think about advice you would like from the on-site team when they visit – look forward!

I’ve taken a stab at addressing some of these issues below:

ENHANCEMENT: We are enhancing student engagement in learning AND GC’s relationship with the community.

THESIS: We will improve student engagement by developing structured, assessable, community-based learning experiences in both academic and co-curricular settings.

SUCCESS: Success of the QEP is defined by …

  • demonstrated gains in SLOs, measured by direct and indirect measures of learning.
  • demonstrated gains in outputs (numbers of C-bEL courses and co-curricular experiences, volunteer hours connected to these experiences, increases in reported activity on the NSSE, numbers of programs and faculty engaged in C-bEL).

FORMATIVE PLAN:  Include a management plan for committee review of assessment results and recommendations to the director.

 

 

Dr. M. David Miller, Director, QEP and Professor

Dr. Timothy Brophy, Director of Institutional Assessment

University of Florida, Gainesville, FL

ABSTRACT: This session will discuss the steps in developing a QEP assessment plan. The University of Florida’s QEP provides an example of a completed assessment plan that conducted an extensive review of existing instruments and developed instruments to measure the student learning outcomes (SLOs). The assessment plan includes direct assessments, indirect assessments, and outputs. We will discuss the process and criteria used to review existing instruments and to develop new instruments. Finally, we will discuss the overall assessment plan and show how we established the links between the assessments, the campus initiatives, and the SLOs.

Resources

  1. Handout (pdf)
  2. Presentation (html) – as of yet, the presentation has not been posted.
  3. The U. of Florida QEP (html)

Phew!  This session was fast and furious.  They went through a ton of information in a short time.  Much of it is *VERY* relevant to our QEP.  This presentation essentially comprises Chapter 6 of their QEP.  They divided their assessment plan into three assessment categories:

  1. Direct assessments of student learning (rubrics)
  2. Indirect assessments of student learning (surveys)
  3. Outputs (measures of attendance, participation, etc.)

Summary

[Image: Instrument evaluation rubric]

The UF QEP topic is global citizenship and intercultural communication. UF has a system where all programs develop and assess SLOs in content, critical thinking, and communication – so they aligned their QEP with the existing model. They strongly emphasized the need for reliable direct measures of learning – this has certainly been a theme for the sessions I am attending. Measuring outputs and indirect measures is all well and good, but if you don’t have direct measures of student learning, eyebrows will be raised. Their learning outcomes and assessments went through several iterations, reviewed by experts in international education and in assessment, but they consistently emphasized the point not to lose the alignment. The measures MUST align with the stated outcomes. It is also important to note that they employ a course-embedded assessment approach. For the measures, they provided the following advice:

  1. Make sure the measures are aligned with the SLOs
  2. Test the measures for reliability and validity
  3. Think about feasibility for your situation (e.g., UF has a LOT of students)

Direct Measures

They developed rubrics for the outcomes but ended up throwing one out because they didn’t feel it provided useful information. I wasn’t quite clear on the details. The rubrics were developed by adapting the AAC&U VALUE rubrics to their purpose – another theme of this conference. The adaptations they made were to (1) use only 3 levels and (2) change the mastery language to 3 = consistently, 2 = usually, 1 = rarely, 0 = never. These rubrics were reviewed by committee several times and piloted. It is important to note that the rubric is applied to a range of assignments/activities, which are designed and selected by the faculty/departments responsible for the course. The university has some kind of approval process for these activities, but they didn’t really go into detail about how that works. They have a faculty lottery for assessment – an interesting idea.

Indirect Measures

UF gives the SERU to their students, so they added items to that survey to avoid adding another layer to their university indirect assessments. They had graduate students trawl the literature for available items that could go on a survey to assess international education and intercultural competencies. This produced over 70 items. They then sent these items through an expert (content and assessment) review process to whittle the number down. They then piloted the remaining items and did a statistical item analysis to determine the most reliable ones. They concluded that, given their sample size, 12-14 items are sufficient to get reliable results. The handout contains some specifics on this analysis; a sketch of what such an analysis might look like follows.
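I don’t have UF’s actual method in front of me, so the following is only an illustration of one common item-analysis approach: compute each item’s corrected item-total correlation and the scale’s Cronbach’s alpha, then drop items that weaken reliability. The data and column names here are entirely hypothetical.

```python
import numpy as np
import pandas as pd

def cronbach_alpha(items: pd.DataFrame) -> float:
    """Cronbach's alpha: k/(k-1) * (1 - sum of item variances / variance of total score)."""
    k = items.shape[1]
    item_variances = items.var(axis=0, ddof=1).sum()
    total_variance = items.sum(axis=1).var(ddof=1)
    return (k / (k - 1)) * (1 - item_variances / total_variance)

# Hypothetical pilot data: 200 respondents answering 20 Likert items (1-5).
rng = np.random.default_rng(seed=42)
pilot = pd.DataFrame(rng.integers(1, 6, size=(200, 20)),
                     columns=[f"item_{i:02d}" for i in range(1, 21)])

# Corrected item-total correlation: each item against the sum of the other items.
# Items with low correlations are candidates for removal.
for col in pilot.columns:
    rest_score = pilot.drop(columns=col).sum(axis=1)
    print(f"{col}: item-total r = {pilot[col].corr(rest_score):.2f}")

print(f"Cronbach's alpha for the full scale: {cronbach_alpha(pilot):.2f}")
```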

Outputs

Not much about this was shared.  However, they had some very nice flow charts that we should look at once the presentation goes up on the website.

During the question period, they recommended an annual review of assessment (SLO) data to make adjustments to the plan.

Materials

  1. Copy of presentation (pdf)
  2. Real-time eportfolio blog (html)
  3. Handout (html)
  4. ePortfolio wiki (html)

[Image: ePortfolio quadrant analysis]

Summary

This session was about eportfolios, defined as a collection of artifacts judged by experts. The eportfolio is both an end product AND a learning process (formative/summative). The presenters are assessment professionals at Duke’s Trinity College, its college of liberal arts and sciences. The session went through several potential systems that ranged from student-centered and unstructured (e.g., WordPress) to very assessment-centered and highly structured (Taskstream). The image above provides a quadrant analysis of potential systems. Interestingly, Duke created a home-grown system 15 years ago, which they recently had to abandon because they couldn’t keep it current with developments in social media.

The message I took home from this session was two-fold:

  1. The primary criterion for your system should be that it promotes the student’s ability to showcase themselves and their learning in an authentic way.
  2. The “sweet spot” for an eportfolio system is where it hits the middle of the quadrant diagram – a criterion that most available systems don’t meet.

The speakers offered some useful guiding principles for an eportfolio system …

Guiding principles

  • enables reflection among both students and facilitators.
  • provides information and insights that can’t be obtained elsewhere.
  • is more than the sum of its parts.  The interaction between pieces of evidence is essential.
  • creating the ePortfolio generates unique knowledge about learning (metacognition).

The ePortfolio

  • must be adaptable and allow creative change.
  • should allow comparisons without standardization.
  • must be assessed iteratively to capture shifts in students’ thinking.

The presenters reviewed a few systems implemented at Duke and on other campuses, which represent the range of the spectrum: Chalk & Wire, Sakai, WordPress blogs, wikis, Pinterest, and Popplet (mind map).

What I took away …

There were several things I found useful and/or of interest for our QEP:

  • The campus doesn’t have to have a unified system – the trade-off is added complexity in data collection and analysis.
  • Using the available software products marketed for eportfolios typically reduces creativity and flexibility on the student end.
  • Using WordPress would be a cheap solution. The speakers suggested developing a tagging scheme for the blog postings that would give reviewers an “easyish” way of identifying evidence. For example, you could have students tag each post with particular learning outcomes; see the sketch below.
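To make the tagging idea concrete: if each student post carries a learning-outcome tag, a reviewer (or a script) could pull all of the evidence for one outcome through the standard WordPress REST API (built into self-hosted WordPress 4.7+). This is only a sketch of the idea, not a worked-out system; the site URL and tag slug are hypothetical, and pagination is omitted for brevity.

```python
import requests

SITE = "https://studentportfolio.example.edu"  # hypothetical student blog

def posts_for_outcome(tag_slug: str) -> list:
    """Return (date, link, title) for every post tagged with a learning-outcome slug."""
    # Look up the numeric tag ID from its slug.
    tags = requests.get(f"{SITE}/wp-json/wp/v2/tags", params={"slug": tag_slug}).json()
    if not tags:
        return []
    # Fetch the posts that carry that tag.
    posts = requests.get(f"{SITE}/wp-json/wp/v2/posts",
                         params={"tags": tags[0]["id"]}).json()
    return [(p["date"], p["link"], p["title"]["rendered"]) for p in posts]

# A reviewer collecting evidence for one outcome:
for date, link, title in posts_for_outcome("slo-civic-engagement"):
    print(date, title, link)
```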

 

In no particular order, these are things that need to go into the QEP.  I will add to this as I think of things, so as not to forget when writing …

  • Apprentice workshop application and review rubric, in development – Julia Metzker
  • Journeyman Minigrant application and proposal review rubric, in development – Ryan Brown
  • Master and Union level RFP and proposal review rubric, in revision – Cynthia Orms
  • Timeline for implementation, in development – I&B, contact Tom Miles
  • Budget, in development – Susan Allen & Jen Russell
  • Organizational Chart, in revision – Susan Allen & Jen Russell
  • Student assessment rubrics aligned to learning outcomes, in development – Julia Metzker
  • Indirect measure survey, in development – Steven Jones
  • Literature review, in revision – Steven Jones
  • History & context for topic selection, in revision – Jason Huffman/Steven Jones
  • Project Tier description, in revision – Julia Metzker
  • Assessment plan, in development – PDA, contact Cara Meade/Julia Metzker
  • Implementation plan, in development – I&B, contact Tom Miles
  • Data collection and review plan, in development – Julia Metzker/James Carlisle
  • SAS training plan, complete – Ryan Brown
  • Portfolio proposal/resources, in development – James Carlisle/Julia Metzker
  • Master/Union:  submit professional proposal and initial curriculum map (outcome map), in development – Julia Metzker
  • Course/co-curricular experience review and approval rubric (and procedures), in development – Steve Jones
  • Menu of potential workshops (curriculum mapping, rubric development, data review, best practices in community-based learning, building community partnership)
  • Professional development plan, in development – Julia Metzker
  • Mentor team, in development – Julia Metzker
    • Mentor develop the workshop series over summer with stipends.
  • Direct measures, in development – PDA
  • Indirect measures, in development – PDA
  • Output measures, in development – PDA

The outline for the document is in development, currently at workflowy.

Kathryne Drezek McConnell, Office of Assessment & Evaluation, Virginia Tech

Kimberly Filer, Office of Institutional Effectiveness and Assessment, Roanoke College

Steve and I attended this all-day rubric workshop. The workshop was very informative. We didn’t have time to make progress on our own rubric at the workshop, but I am trying to steal time to draft the rubric throughout the meeting. You can watch my progress by looking at QEP-OutcomesRubric_DRAFT.docx in the PDA subfolder of our QEP dropbox. Below, I have listed the take-home points from the workshop. I think these will be very useful for our rubric work. I asked Kathryne if she would be interested in helping us conduct some train-the-trainers workshops. She is willing and can recommend people if scheduling is an issue. I found her a motivating and engaging facilitator, sensitive to the faculty perspective.

  • They highly recommend using the AAC&U VALUE rubrics to inform your rubric development; however, it was patently clear that modification of the rubrics is necessary to make them understandable and usable by students. She shared several iterations of the rubrics her campus developed for the Virginia Tech QEP.
  • Virginia Tech has implemented a system where the rubrics are universal across disciplines but the artifacts and assignments are developed by the faculty or person designing the experience. This model seems to be the new norm, and there seems to be little consistency in how (or whether) the assignments are reviewed and approved.
  • Rubric development is iterative and should cycle regularly through the design, align, and norm process. The iterative process should rely on evidence to inform revisions of the rubric.
  • Give individuals control over how best to implement the rubric in their learning experience. The important requirement is that the assignment be aligned closely with the rubric.
  • Rubric analysis is labor intensive, and the people doing the work should …
    • see how the work is consistent with their values and concerns
    • have the opportunity to develop expertise in rubric design and analysis in a way that doesn’t make inordinate demands on their time
    • see rewards for their participation (examples include stipends, travel grants, implementation grants, and course reductions)
    • see an obvious institutional commitment to provide on-going support for the rubric analysis
  • Neither of these institutions collects student work as evidence of the learning – they only collect the faculty reviews.
  • Training is crucial.  Rubric users need to be trained in two capacities: (1) defining the rubric for their situation (2) norming the scoring with other users.
    • After attending this workshop, I think I would be able to conduct this training.
  • The rubric MUST be directly aligned with the student learning outcome – this is one reason why you are likely to have to modify the AAC&U rubrics for your situation
  • If you are implementing a rubric assessment process (or any other innovation), the presenters strongly recommend using the Concerns-Based Adoption Model (CBAM) from Change in Schools: Facilitating the Process (Hall & Hord, 1987), or some other model that ensures you regularly survey users so concerns can be identified and addressed soon after they arise. I think it would be good to include this type of monitoring in our plan; the book contains surveys that you can use.
  • When reporting rubric results, avoid the temptation to report an average. Report the distribution – it is more meaningful and can highlight rater issues (an average of 2.0 could mean everyone scored a 2, or that raters split between 0s and 4s).

The speakers recommended the following resources:

  • Change in Schools: Facilitating the Process (Hall & Hord, 1987)
  • Learning to Think: Disciplinary Perspectives by Janet Gail Donald (2002)
  • Dr. Mary Allen, Professor Emeritus of Psychology, former Director of the Institute for Teaching and Learning, California State University, Long Beach, CA, and author of Assessing General Education Programs
  • The University of Hawaii at Manoa assessment website.

An exciting milestone! The Engaged Student Learning Outcomes were endorsed by the University Senate on Friday, December 6! In addition, I provided an update on the QEP. I am working on a written update, which will become part of the official record of the meeting. No questions were asked. I don’t know if that is good news or bad, but I think the presentation went well.

[View the presentation]

Welcome to my blog. I am an assistant professor of chemistry at Georgia College, Georgia’s public liberal arts university. I am very familiar with blogs and use them heavily for my courses, so I decided to repurpose for this course a blog on our department server that I haven’t used in some time. I look forward to seeing how this course unfolds.