Tuesday, April 12, 2016

Report of the Senate Task Force on Academic Support and Retention

Faculty and Professional Staff Senate
Task Force on Academic Support and Retention Report

Overview
Bristol Community College’s Faculty and Professional Staff Senate is composed of full-time and part-time faculty and staff. In the 2015-2016 academic year, the Senate heard the College’s call, prompted by a decline in student enrollment, to examine student retention more thoroughly. Given the nature of the Senate and our unique lens as individuals working in the trenches directly with students, senators voted to examine academic supports specifically in relation to student retention. A Task Force was formed in the fall of 2015 to take on this work.

This Task Force comprises five active members: Jean-Paul Nadeau, Deborah Palumbo, Debra St. George, Ron Weisberger, and Julie Jodoin-Krauzyk (Chair), all of whom currently provide or have previously provided academic support at Bristol. We met regularly during the spring 2016 semester to consider ways the College could better understand the impact of academic support on retention. Our charge was to gather data, both qualitative and quantitative, on the allocation of resources for academic support across campuses and the impact of such allocation on student retention. We were limited by our work coinciding with work-to-rule and by the loss of two members from the Task Force.

Our Process
Retention, of course, is affected by every facet of the College; academic support is just one. We at times struggled to maintain boundaries, since a discussion of academic support is a subset of academics more generally.

Defining “Academic Support”
To clarify said boundaries, we began our work by defining what we meant by the term “academic support.” We take this term to refer to all resources provided by the College that are intended to support students’ classroom learning. These include such areas/services as the Learning Commons, Advisement (including the Advisement Center, Connections, and Guided Pathways), the Office of Disability Services, and the QUEST program—but also faculty office hours, library/technical literacy support, supplemental instruction, the College Success Seminar (CSS) course, learning communities, the ESL program, support for students taking online courses through the CITE Lab, and Smarthinking online tutoring.

In addition to fostering student learning relative to course content, reading, writing, and thinking--and what successful students and learners do--academic support can also serve a retention function because it connects students with others on campus with whom they can resolve problems, share successes, and vent frustration. We attempted to examine these forms of support as they exist uniquely at Bristol across our individual campuses and satellites in order to identify opportunities and inequities. We identified our campuses as including Fall River, New Bedford, Attleboro, Taunton, and eLearning.

Defining “Student Retention”
To conform with definitions commonly used by the College and in research, we defined “Student Retention” as the rate at which students remain actively enrolled from one Fall semester to the next. Similarly, we defined “Student Persistence” as the rate at which students enrolled in a Fall semester return to matriculate in the subsequent Spring semester.

Our Research Questions
The next step was to focus on the first part of our charge: to consider how academic support resources are allocated across campuses. These allocations could be measured in terms of funding, space allocation, and staffing. Among the questions we had were: On what basis are monies allocated across campuses for academic support services? How equitable are the services we offer students across campuses? How does the culture of each campus shape its particular needs?

We then turned toward assessment of those services and resource allotments. It is important, we thought, to ask some key questions on a regular basis:

1. What are we currently doing?
2. What is working?
3. What do we need to do better?
4. How do we know?
5. And how do we share this information?

Before any assessment of benefit can take place, we thought it prudent to ask for information about what we provide in terms of academic support across campuses. We wanted next to affirm what we already do well. By this, we don’t mean to suggest we’re asking which programs and services are working, exactly. The “what” could pertain to individual aspects of a program’s operations. To offer a hypothetical example, it might be that the system for addressing “no show” appointments at the Learning Commons works to prevent future such occurrences. Question three functions similarly. For instance, hypothetically, it might be that group tutoring for history is a great service but that communication of that service to history faculty and students could improve. Answering such questions would bring us closer to an understanding of the impact of academic support and help guide the development of those supports. The “how” questions aim squarely at the need to consider how decisions are made in order to ensure they are consistently based on current, relevant, quantitative and qualitative data.

A Review of Prior Efforts
We begin by recognizing that there have been a number of attempts by different parties to study and analyze academic support at the College. In fall 2008 an Academic Support Task Force was created, consisting of 19 faculty, professional staff, and administrators. Despite meeting all year, it was unable to agree on a final report that would define and analyze the effectiveness of academic support on campus. Subsequently, in the fall of 2009, a scaled-down committee consisting of TASC, the Writing Center, Advising, Testing, and ODS was created. Although it met for nearly two years, this committee also failed to issue a report. We thus appreciate that, although academic support is a crucial factor in retention, the College has a history of difficulty in both defining and measuring it. Nevertheless, the Senate believes this work needs to be done, and this document is a start toward accomplishing this important task, given the unique positioning of Senate constituents as primarily those who directly deliver many academic support services.

Our Data Collection
While a literature review confirms a correlation between student engagement and retention, we wanted also to consider how the College could illustrate that relationship with data, both quantitative and qualitative. We thought it critical to examine not just usage of academic support (e.g., the number of tutoring sessions) but also short- and long-term qualitative measures of the impact of that usage. Receiving immediate, post-session reflections from a student is one step toward collecting such qualitative data. Upon reflection weeks, months, or semesters after receiving support, though, students may very well have a different perception of its efficacy. Numbers are easier to collect and graph, but we suspect they do not allow for a true look at students’ experiences.

Given that we had identified over ten sources of academic support that may or may not be delivered across all five campuses, and in the spirit of expediency and efficiency, we went directly to Administration with questions about budget allocations and current college-wide data collection related to the effectiveness of sources of academic support as it pertains to student retention. We were encouraged to meet with Executive Vice President David Feeney, who was managing similar, simultaneous data collection efforts. Julie, Chair of this Task Force, met with Feeney; Vice President of Administration and Finance Steve Kenyon; and Vice President of Institutional Research, Planning, and Assessment Rhonda Gabovitch on February 25, 2016, in response to our request for data specifying funding allocations for specific academic supports and current findings around correlations with student retention.

During this meeting, there was consensus that data has historically not been systematically collected to assess the availability of academic resources across campuses, the usage and equity of the services provided by faculty/staff, or reasonable data-driven outcomes that could potentially measure the effectiveness and quality of services. Administration is currently working on identifying these data points and collecting such information to aid in future funding allocations for support services. However, there was some disparity in how academic support services were being defined and measured by Administration and by the Task Force. In other words, the services identified at this meeting did not include all the sources of academic support our Task Force has highlighted, and service quality was not yet being measured systematically.

Based on this request, we were provided a number of documents from several sources. These included:
1. a Smarthinking (online tutoring service) summary for 2012-2013 (Appendix A) and four-year comparison from 2011-2015 (Appendix B), as well as a study of that same service conducted by the Writing Center (Appendix C);
2. a Learning Commons report from fall 2014/spring 2015 (Appendix D);
3. the recent Community College Survey of Student Engagement (CCSSE) survey (Appendix E) along with a list of BCC-specific questions (Appendix F);
4. Technology Planning and Policy Committee (TPPC) survey results designed to “understand the rapidly changing technology needs and literacy of our BCC students” (Appendix G);
5. slides presented at a recent professional meeting titled “Retention and Student Success” (Appendix H);
6. a survey on Equity of Support Services by Administration (forthcoming) (Appendix I); and
7. a survey on Availability by Administration (forthcoming) (Appendix J).

Though we asked for information on the resources allocated to academic support in terms of funding, space, and personnel, we have that data only in terms of funding, and only for item number one. Anecdotally, we understand that areas of academic support do formally and informally assess their services and report usage and effectiveness in different ways, including grant reports, monthly reports, and licensing or certification reviews. Yet this information is not readily available campus-wide and is not yet being used to inform funding, staffing, or space allocation decisions. This information seems critical, as decisions about hours of operation, staffing, and space allocation are regularly made, with or without transparency.

Our Results and Findings: What Are We Doing? What Is Working? What Isn’t? How Do We Know? And How Do We Share This Information?
The surveys and reports we were provided contained some information that was helpful in our effort to address the charge of the Task Force. Here we highlight those portions that seemed most relevant in our quest to uncover what we are doing, as well as what is and isn’t working, relative to academic support.

Numerous Smarthinking reports were provided by the Dean of eLearning, April Bellafiore, aimed, it seems, at investigating what is and isn’t working relative to tutoring services (Appendix A). One document, titled “Smarthinking Summary -- AY 2012-13,” provided a series of tables with data from fall 2012 and spring 2013, though the data was missing a record of TASC tutoring appointments in spring 2013:

Outcomes of Tutoring -- AY 12-13

Term        | Group        | Total # Students | % Who Used Tutoring | Avg. Term GPA | % Change in Term GPA | Avg. Overall GPA | % Change in Overall GPA | % Persistence
Fall 2012   | No Tutoring  | 7845             | --                  | 2.50          | --                   | 2.64             | --                      | 72%
Fall 2012   | TASC         | 463              | 6%                  | 2.57          | +2.8%                | 2.83             | +7.19%                  | 81%
Fall 2012   | Smarthinking | 204              | 3%                  | 2.93          | +17%                 | 3.01             | +14%                    | 80%
Spring 2013 | No Tutoring  | 8829             | --                  | 2.55          | --                   | 2.73             | --                      | 43%
Spring 2013 | Smarthinking | 171              | 2%                  | 3.12          | +22.36%              | 3.07             | +12.45%                 | 59%

(“Smarthinking Summary -- AY 2012-13” Table 1, Appendix A)
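
To clarify how the percentage-change columns in this table appear to have been derived, the sketch below offers our own illustration (it is not part of the original report); we assume each percentage compares the tutored group’s average GPA to the No Tutoring group’s average for the same term, which reproduces the reported figures within rounding.

```python
# Our reading (an assumption, not stated in the report): percentage change compares a tutored
# group's average GPA to the No Tutoring group's average GPA for the same term.

def pct_change(tutored_gpa, baseline_gpa):
    """Percent difference between a tutored group's average GPA and the no-tutoring baseline."""
    return (tutored_gpa - baseline_gpa) / baseline_gpa * 100

# Fall 2012 term GPA: Smarthinking users (2.93) vs. No Tutoring (2.50)
print(f"{pct_change(2.93, 2.50):+.1f}%")   # +17.2%, reported as +17%

# Spring 2013 term GPA: Smarthinking users (3.12) vs. No Tutoring (2.55)
print(f"{pct_change(3.12, 2.55):+.2f}%")   # +22.35%, reported as +22.36%
```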

Conclusions were drawn in this report regarding the correlation of TASC and Smarthinking tutoring usage with GPA and persistence. One conclusion was that students who used Smarthinking had higher GPAs than students who did not use the service. Another was that students who utilized TASC tutoring services were more likely to persist. There is no assessment within the report of any measure of learning occurring through students’ use of these services. For the group of students using Smarthinking, the report seems to attribute their higher GPA (solely) to their use of this service. The other Smarthinking report offers a comparison across four academic years: AY 2011-2012, 2012-2013, 2013-2014, and 2014-2015 (Appendix B). The data presented is purely in terms of usage, so qualitative analysis is not possible:

Smarthinking 4-Year Usage Comparison

Academic Year | # Students | # Sessions | # Hours
2011-2012     | 425        | 1,364      | 824
2012-2013     | 445        | 1,261      | 836
2013-2014     | 468        | 1,188      | 805
2014-2015     | 420        | 1,109      | 734

(“Smarthinking Usage 4 Year” Table 1, Appendix B)

Dean Bellafiore also provided some information on current expenditures for this academic support service, explaining that projections for the 2015-2016 academic year show Bristol is “on track to use approximately 800 hours this year. Given the current and historical usage, we are requesting 750 hours at $29 per hour. This works out to $21,750 + $1450 annual fee = $23,200” (Bellafiore). This information helps us begin to answer “what we are doing” relative to online tutoring support.

Another study of the Smarthinking service, conducted by the Writing Center, adds some qualitative information to the quantitative data presented in the aforementioned reports (Appendix C). We noted that the study participants were two professionals who work at the Writing Center (Karl Schnapp and Genie Giaimo) rather than students. The question was also raised as to whether this particular service conflicts with the mission and philosophy of the physical Writing Centers that exist on each campus.

One of the more promising sources of data was the Learning Commons report examining student persistence from fall 2014 to spring 2015 (Appendix D). What seems to be the key data highlighted in the report is found on page two: “The Persistence rate from fall 2014 to spring 2015 for students who received TASC or WC services was 83.04%. The data indicates that during this period, students who engaged in either TASC or WC services persist at a higher rate (83.04%) than students who do not (69.92%).” This data was presented in the following table:

Tutoring Persistence Data

Population                                              | N    | Percent
Fall 2014 Registered TASC-WC Students                   | 672  | --
TASC/WC Fall ’14 to Spring ’15 Persistence              | 558  | 83.04%
Fall 2014 Registered Students minus TASC-WC Population  | 8517 | --
Non-TASC-WC Fall ’14 to Spring ’15 Persistence          | 5955 | 69.92%

(“Learning Commons Report,” Table 1, Appendix D)
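
As a check on how the persistence percentages above are derived, the following minimal sketch (our own illustration, not part of the Learning Commons report) divides the number of students who persisted by the corresponding fall 2014 population; it reproduces the 83.04% and 69.92% figures.

```python
# Persistence rate = students who returned in spring 2015 / students registered in fall 2014.
# Counts come from the Learning Commons report table above (Appendix D).

def persistence_rate(returned, registered):
    return returned / registered * 100

print(f"TASC/WC users:      {persistence_rate(558, 672):.2f}%")   # 83.04%
print(f"All other students: {persistence_rate(5955, 8517):.2f}%")  # 69.92%
```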

The influence of these services on student persistence, however, remains unknown; the Learning Commons report documents TASC and Writing Center usage but cannot establish causation. It is possible that students who actively seek out such academic supports tend to be more successful than those who do not. It would be interesting to know persistence rates of users delineated by student GPA. Of further interest would be whether the students who persist are accessing other campus resources (academic or otherwise).

The CCSSE survey offers a statement that promises to identify what is and isn’t working with Bristol’s academic supports from students’ perspective: statement #13, specifically parts a, d, e, and k, which relate to our definition of academic support (Appendix E). There students reveal how frequently they use certain supports, rank their satisfaction, and state the “importance” of each support. The only supports identified are “academic advising/planning,” “peer or other tutoring,” “skill labs (writing, math, etc.),” and “services to students with disabilities.” Admittedly, this offers only a broad look at students’ perceptions and use of these supports, assuming their reading of the categories offered matches our own.

In addition, the CCSSE survey includes a few statements that refer to activities that could possibly be related to students’ use of faculty office hours, specifically #4, parts l and m, which refer to discussing grades, assignments, and career plans, but it is not clear whether those discussions actually occurred within that context. Also, all that is asked for is the frequency of these phenomena. Interestingly, #4k asks students to reflect on whether they e-mail faculty, but there is no similar statement specific to office hour visits. The one statement on the CCSSE survey that best targets office hours as academic support is #4n, which asks students to identify the frequency with which they “Discussed ideas from [their] readings or classes with instructors outside of class.” Even this question could involve interaction outside of office hours, however, such as talking briefly at the end of class. The additional, BCC-specific questions do not address academic support to a great extent (Appendix F). Question #6 asks, “While attending this college, how did you most often use academic advising?”, followed by a list of options naming only the source of that advice (e.g., “Advisement Center,” “Fellow Students”). The only other question that touches on academic support is #7, which reads, “During the past year, how often did you receive academic advice from instructors?” The information requested gets at frequency only and does not allow a determination of whether that advice came in an out-of-classroom setting.

The TPPC survey revealed students’ access to and uses of technology, but few questions drilled down to students’ (academic) technical literacy skills relative to that technology or their corresponding need for support in that regard (Appendix G). In other words, the report did not allow us to answer the “what are we doing” and “what is/isn’t working” questions we had. Toward the end of the survey, students were asked how easy they thought it was to work with online course systems, though there were no specific follow-up questions. Such data would be relevant to the work of the Task Force insofar as it affects students’ ability to perform course-related tasks, complete coursework, and return for further studies in subsequent semesters.

We thought that we might gather some answers to our research questions at a recent professional staff meeting through the agenda item titled “Retention and Student Success” (Appendix H). A review of the slides presented at that meeting revealed more of a focus on enrollment dips and their causes than on retention, and the presentation did not specifically emphasize the role of academic support in that enterprise (Maslow’s hierarchy of needs was mentioned). There was a slide, however, graphing “retention of entering students,” presumably from fall to fall, from 2010 to 2014 for both part- and full-time students:
[Graph of retention of entering students, 2010-2014; see Slide 5, Appendix H]

The retention rate for full-time students did not fluctuate much, hovering at about 60%, but there was more volatility in the part-time retention rate, which ranged from a low of 40.4% to a high of 49.7%. These rates were not benchmarked against those of other institutions.

The Task Force received limited information, and we suspect this was a function of both time and availability. Information was present for some supports--more accurately, services--and not others. It seemed as though the information was collected, assembled, and accessible primarily by those overseeing the particular support service. We would also argue that a systematic approach to data such as the one we propose below would make timely information available to a wider audience.

One particularly notable area of academic support for which data was not provided was the usage and benefits of faculty office hours. The major issue is that most courses are taught by part-time faculty who are not required, or paid, to hold office hours. Though we do not have current figures on the percentage of courses taught by full- versus part-time faculty, we do have data from the Massachusetts Community College Council (MCCC) indicating that in 2008 there were 96 full-time faculty and 381 adjunct faculty; in other words, 80% of the faculty was part-time. We have no reason to believe this ratio has changed significantly. This situation complicates efforts to examine the importance of that support mechanism. A related issue is that faculty without office hours have less opportunity to connect with students and direct them to other academic supports.

We also noticed that the data provided did not help us draw any conclusions as to equity across campuses. What we hoped to discover was whether we were making intentional decisions relative to the allocation of resources toward academic support at our Attleboro, New Bedford, Taunton, Fall River, and online sites. Unfortunately, we are unable to make that determination. However, it is clear, based on Julie’s meeting with Administration, that “equity” across campuses has been identified as a critical concern and is in the initial stages of being measured by surveying stakeholders in Fall River, New Bedford, Attleboro, and Taunton (not including eLearning) (see Appendix J for a copy of the survey form being used). Only six sources of student support are being analyzed by Administration at this time: Advising, Career Counseling, Transfer Counseling, Disability Services, Financial Aid, and Tutoring. Additionally, the primary measure of equity is the total number of full- and part-time staff hours available on each campus for each service area, per full-time equivalent (FTE) student, per semester (specifically Fall 2015). Data collection has not yet broken down equity by other factors that we would have expected to be included, such as availability of day, evening, and weekend hours; length of appointments; space allocation; ease of appointment-making; type of service provider (e.g., faculty, staff, student); and types of services available per campus within one department (e.g., type of advising session or subject matter covered by a tutor).
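
To make Administration’s equity measure more concrete, here is a minimal, hypothetical illustration; the staffing and enrollment figures below are invented for the sake of example and are not BCC data.

```python
# Hypothetical illustration of the equity measure described above: total full- and part-time
# staff hours for a service area on a campus, per full-time equivalent (FTE) student, per semester.
# All numbers are invented for illustration; they are not BCC figures.

def staff_hours_per_fte(full_time_hours, part_time_hours, fte_students):
    return (full_time_hours + part_time_hours) / fte_students

# e.g., a campus offering 900 full-time and 300 part-time tutoring staff hours in a semester
# that enrolls 2,000 FTE students:
print(f"{staff_hours_per_fte(900, 300, 2000):.2f} staff hours per FTE student")  # 0.60
```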

Additionally, we found that Administration is currently working to identify common, measurable outcomes that could potentially examine the effectiveness of support services. Various services are being asked, through a survey, to list outcomes that could be correlated with support service use (see the survey tool in Appendix J). These support services include Advising, Career Counseling, Transfer Counseling, Disability Services, Financial Aid, Student Activities, Student Services, Tutoring, Admissions, Bookstore, Mental Health Counseling, and the Testing Center. Possible outcomes include graduation rate, GPA, persistence, student retention, student employment, number of transfers per headcount, student debt level, percentage of students who receive financial aid, and enrollment yield. To date, data is not available that can clearly correlate use of any of these student support services (including those designated as academic support services) with a measurable outcome. Administration is looking for such correlations to guide future funding allocations for support services, working within a “best return on investment” model. Unfortunately, many of the sources of academic support identified by this Task Force are not currently being asked to establish consistent outcomes or being measured for effectiveness in this manner. Those missing include QUEST, faculty office hours, library/technical literacy support, supplemental instruction, learning communities, the ESL program, support for students taking online courses through the CITE Lab, and Smarthinking online tutoring. We also share Administration’s concern about the feasibility of finding a significant relationship (whether correlational or causal) between service use and outcome. This may also not be a fair way to measure support service effectiveness given the multiple factors that affect individual students’ success, retention, and persistence. Nonetheless, alternate measures are not currently available.

Another of our findings was that there did not seem to be standardized terminology used to refer to certain academic support services across campuses. This could lead to confusion as students and faculty/staff transition among campuses. For example, in Fall River, tutoring support could traditionally be found at TASC (Tutoring and Academic Support Center). In New Bedford, tutoring was available in the Costa Academic Support Center, which also offered more intrusive advising and coaching similar to the Connections Center in Fall River. In Attleboro, similar academic supports and tutoring were available, but not within one center and from staff who had other titles. One example is the renaming of Tutoring and Academic Support (TASC) to the Learning Commons (LC). This created some confusion, particularly at the New Bedford campus, where LC services are housed within the Costa Academic Support Center (ASC) rather than in an independent location as at other BCC campuses. The ASC supports students who are utilizing the room for non-tutoring-related reasons, such as printing, studying, and working on course assignments, as well as students seeking tutoring services from the LC. Correspondence was being sent to faculty, staff, and students referring them to the “Learning Commons” for academic support and tutoring. As a result, the distinction between the ASC and LC became unclear.

Our discussions also touched upon anecdotal evidence of situations where procedural, staffing, or space allocation changes to academic support services were not based on transparent data. As a result, we thought it necessary to develop a system that would make such changes, as well as the rationale behind them, transparent. For that to happen, sufficient data is needed and must be accessible to key stakeholders. Once changes have been made, we also thought it critical that the results be assessed and monitored.

During our informal inquiries, we also learned of faculty and staff concerns over misadvisement and unclear information on the College’s website. The Search feature on our website does not search material listed in our College Catalog, which can hinder new applicants and current students from finding valuable information about programs and course listings. Additionally, registration-related information, such as details about Learning Communities or Supplemental Instruction, is not clearly noted on various AccessBCC registration screens. Often notes from the main webpage differ from those in AccessBCC or DegreeWorks links. Additionally, while there may be data available on the use of DegreeWorks to which we are not yet privy, advisement happens informally every day on our campuses and is most likely not recorded or documented in any routine fashion. Students may be advised by each other as often as they are advised by faculty/staff. Occasionally, this information is incorrect or misleading.

Furthermore, we found little evidence of the effective use of literature-based and nationally recommended models of holistic academic support. While there are developing meta-majors and Guided Pathways in General Studies, as well as smaller programs/majors that are cohort-driven, a similar college-wide approach to academic support is not yet available. Such an approach would include prolific use of a Learning Community in a student’s first semester connected to his or her College Success Seminar (CSS 101) course. In fact, we have anecdotally learned that Learning Communities are currently not clearly identified on registration screens or in advisement materials and are therefore consistently under-enrolled and have often been cancelled in recent semesters.

We found the available data too sparse and inconclusive to allow us to determine whether there is relative parity of services across campuses or, especially, to examine the impact of those services on student retention.

We also found an emphasis on quantitative data in the materials provided, and an underutilization of student case studies or focus groups that would help us deepen our understanding of students’ use of academic support.


Task Force Recommendations

1. Create a systematic, holistic approach to data collection, both quantitative and qualitative.

To collect useful data, we need a systematic approach, one that regularly, intentionally, and consistently gathers information on usage of academic supports as well as on the benefits students perceive, which may in turn be correlated with student retention. This data should include the campus (i.e., Attleboro, New Bedford, Taunton, Fall River, or eLearning) where the support is provided. There needs to be buy-in from all stakeholders for consistent participation in this systematic approach to data collection. The potential benefits of this approach are many. Such data could be used to help students understand the importance of extending their time on campus beyond class time. Stakeholders could also take a proactive approach to solving issues that arise and make informed decisions about whether to maintain or modify programs and services. Data collected through such a systematic approach could finally begin to answer questions of availability and equity of academic support services across campuses and of the quality and effectiveness of such services, and could help establish potentially significant correlations between services and student retention.

2. Implement a systematic, holistic approach toward measuring the relationship between academic support and student retention utilizing pre-existing tools.

We considered two ways to implement a more systematic process of compiling and sharing data regarding academic support. The first was to use the monthly reporting system to gather information regularly on student usage of academic support services. Ideally, reports would examine the same time period across Departments and would routinely identify staffing, hours of operation, space allocation, length of appointments, and types of services delivered.

Beyond usage data, we recommend that Departments gather feedback, both short- and long-term, on the efficacy of the academic support services utilized and include it in ongoing monthly reports, as sketched below. This type of reporting would more regularly and more transparently outline and continually update answers to the What Are We Doing, What Is Working, What Is Not Working, How Do We Know, and How Do We Share questions identified above. Of particular use would be data on what students learn during their engagement with academic supports and what transfers from those moments to future academic tasks.
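
To make the proposed monthly reporting more concrete, the following is a minimal, hypothetical sketch of the kind of record each Department might submit; the field names and example values are our own invention and do not describe any existing BCC system.

```python
# Hypothetical sketch of a monthly academic-support report record; all fields are our own invention.
from dataclasses import dataclass, field
from typing import List

@dataclass
class MonthlySupportReport:
    campus: str                      # Fall River, New Bedford, Attleboro, Taunton, or eLearning
    service: str                     # e.g., Learning Commons tutoring, Writing Center, Advisement
    month: str                       # reporting period, e.g., "2016-03"
    staff_hours_full_time: float     # staffing
    staff_hours_part_time: float
    hours_of_operation: str          # e.g., "M-F 8-6, Sat 9-1"
    space_allocation_sq_ft: float    # space allocation
    avg_appointment_minutes: float   # length of appointments
    sessions_delivered: int          # usage
    unique_students_served: int
    student_feedback: List[str] = field(default_factory=list)  # short- and long-term qualitative feedback

# Example record (values invented for illustration):
example = MonthlySupportReport(
    campus="New Bedford", service="Learning Commons tutoring", month="2016-03",
    staff_hours_full_time=320, staff_hours_part_time=140,
    hours_of_operation="M-F 8-6", space_allocation_sq_ft=1200,
    avg_appointment_minutes=45, sessions_delivered=410, unique_students_served=185,
    student_feedback=["The session helped me revise my history paper."],
)
```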

The other approach discussed was to utilize MapWorks to document academic support usage for individual students identified as at-risk based on self-disclosure, helping us gather information beyond the aggregate. In doing so, MapWorks trainings should not only review the program but also explain how its data will be used, in the hope of increasing faculty/staff referrals and encouraging students to complete the survey. This information would be shared across the network of academic supports. It might even be used to identify student focus group participants or for intervention purposes.

3. Create an academic support advisory board.

We also thought it prudent to establish an advisory board of academic support personnel (at both the Director and direct-service-provider levels) to oversee and coordinate this data collection as well as assessment methods. This board could reach out to Divisions and Student Services Departments for assistance with promotion and development of programs and services.

4. Institute Learning Communities for first semester students connected to the College Success Seminar course or equivalent.

Literature-based and nationally recommended models of holistic academic support identify Learning Communities as a key tool in aiding retention. While there are developing meta-majors and Guided Pathways in General Studies, as well as smaller programs/majors that are cohort-driven, a similar college-wide approach to academic support is not yet available. This would require Learning Communities to be clearly identified on registration screens and within advisement materials to ensure these courses enroll sufficient students.

5. Consider mechanisms for examining student usage of faculty office hours, and consider offering part-time faculty incentives for providing office hours.

Students have direct access to full-time faculty during contractually mandated office hours and to part-time faculty during their optional office hours or individualized appointments. While no data is yet available describing how frequently students attend faculty office hours or the range of ways in which faculty use this time to assist students, this is a potentially underused opportunity for students to connect with individuals devoted to their success, retention, and persistence. In office hours, faculty can listen to students’ needs; re-explain material at an individualized pace and in an individualized manner; make referrals to other academic and student support services; and continually advise students. Only a minimal fiscal investment would be needed to more effectively tap into this valuable, personal resource for a larger number of faculty.

6. The College should continue to discuss next steps relative to these recommendations at the upcoming campus-wide Retention Summit.

Task Force members are willing to continue to assist Administration with its continued efforts around this valuable work. After the Task Force had been formed and had begun its work, we learned of a “Retention Council” consisting of members of the Administration, primarily from Student and Enrollment Services. Involving faculty and staff who provide these resources and work directly with the students we are trying to retain is critical, and we are eager to be part of this effort once work-to-rule is over.

Our Task Force Conclusions
Bristol’s many academic support mechanisms aim to help students make learning gains and overcome obstacles that stand in the way of course, program, and degree completion. This Task Force hopes its work aids the College in reaching that goal.


Works Cited

Bellafiore, April. “Re: eTutoring Usage.” Message to Jean-Paul Nadeau. 29 Mar. 2016. E-mail.

Massachusetts Community College Council (MCCC). “Research Vital to the MCCC Mission.” MCCC News, vol. 6, no. 2, Feb. 2008, p. 4. Print.





Appendices


Copies of all reports are provided within the report folder in Google Drive: https://drive.google.com/open?id=0BwzE7OR0QNQoaEJXME1GR1lSZFU