Competency 9

Evaluate individuals, families, groups, organizations, and communities.

In social work practice, assessment is essential. Evaluation guides our work by showing how any intervention is performing in terms of helping others: which aspects are working well, which require improvement, and what the possible next steps are. It allows us to examine our practice with an eye to the future, analyze prior interventions so they can direct our next ones, and adjust our strategies in light of our own development. I demonstrate my commitment to this competency by highlighting its significance in my work and by conducting regular assessments of how my practice is organized and functioning. To engage with this competency, I will also consult the relevant literature on evaluating social work practice.

Evidence

9.1 Select evidence-based evaluation strategies with specific client systems.

Course Evidence: New Organization

As part of an assignment to create a new organizational proposal focused on reentry services, I developed a comprehensive staff evaluation plan designed to assess job performance, promote professional growth, and align individual development with the organization’s mission. This plan demonstrates my ability to select and implement evidence-based evaluation strategies with a specific client system—namely, the staff team working directly with individuals transitioning from incarceration.

Grounded in social work knowledge of supervision, performance evaluation, and trauma-informed care, this evaluation plan incorporates annual reviews, mid-year check-ins, and continuous feedback mechanisms. Drawing on literature such as Burke’s (1998) strategies for improving staff performance and Smith & Beno’s (1993) guide to staff development evaluation, the design reflects established best practices in fostering reflective, constructive performance environments.

Core social work values, including the importance of human relationships, dignity and worth of the person, and the commitment to competence, are embedded in the plan. Staff are encouraged to reflect on their strengths and challenges through self-assessments, while supervisors are guided to provide compassionate, constructive feedback that promotes a culture of accountability and growth.

I utilized professional skills in policy drafting, evaluation design, and collaborative planning to create clear metrics, goal-setting templates, and development plans. The plan also includes optional 360-degree feedback for leadership roles, emphasizing inclusivity and diverse perspectives in assessing effectiveness.

The theoretical foundation of this plan is grounded in empowerment theory and systems theory. By involving staff in their own development and feedback processes, the evaluation promotes shared responsibility and systemic improvement. It acknowledges that staff success is deeply interconnected with client outcomes and overall organizational effectiveness.

Engaging both cognitive and affective domains, I reflected on how evaluation can serve as both a motivator and a stressor for staff, especially in high-stakes, emotionally demanding roles like reentry work. I aimed to design a system that fosters psychological safety, ongoing dialogue, and a clear path for growth, qualities that are critical to sustainable social work practice.

This project demonstrates my commitment to ethical, evidence-informed evaluation strategies that support the wellbeing of both staff and the clients they serve. Through this plan, I contributed to building a workplace culture where professional excellence and human-centered care are mutually reinforcing.

Field Evidence: DHSI Evaluation Report 

During my field placement under the Title V STEM Success Grant at Southern Adventist University, our grant team collaborated with external evaluators to analyze program outcomes and contribute to the first-year external evaluation. My involvement helped assess whether the grant’s objectives, such as expanding mentoring efforts and supporting underserved students, were being met. I supported data analysis related to student retention, financial aid access, and internship participation. For example, I contributed to collecting and organizing qualitative data from mentoring groups, which showed 100% positive feedback and helped inform future planning. I applied evidence-based frameworks to interpret outcome metrics, such as the 14.4% decrease in student loan debt and the increase in four-year graduation rates from 33.9% to 38.1%. This process deepened my understanding of impact evaluation and the use of procedural knowledge to support institutional improvements that benefit individuals, families, and communities.

9.2 Evaluate the efficiency and effectiveness of practice outcomes.

Field Evidence: Needs Assessment and Program Proposal 

As part of my Advanced Administrative Practice course, I worked with a team to develop a comprehensive teen pregnancy prevention and parenting support program housed within the Real Life Pregnancy Center in Marshall County, AL. A key component of our project was designing an evaluation plan to assess both the efficiency and effectiveness of our proposed services, which included support groups, targeted workshops, and a parent mentor initiative.

We utilized a logic model and stakeholder feedback to define clear outputs, benchmarks, and desired outcomes for each intervention. For instance, we anticipated that 70% of support group participants would attend regularly and report increased use of coping strategies. We also created measurable goals for workshop attendance, parenting knowledge, and mentoring outcomes, demonstrating our ability to plan for program evaluation using both quantitative and qualitative indicators.

Drawing on knowledge of program development and public health evaluation theory, I helped develop SMART goals, survey tools, and a mixed-methods data analysis plan. We incorporated social work values by designing inclusive, culturally informed strategies and ensuring community voice guided all stages of planning. My work required professional skills in research, logic modeling, and stakeholder communication, and deepened my cognitive and affective understanding of how structural inequities impact health outcomes in marginalized communities.

This experience strengthened my ability to evaluate practice outcomes through data-driven, equity-focused methods aligned with social work’s commitment to effectiveness and social justice.

Field Evidence: Barbershop LifeGroup Evaluation, Winter 2025 

Evaluating the Barbershop LifeGroup allowed me to measure the effectiveness of a strengths-based, culturally affirming support group designed for at-risk male students. Using structured feedback tools administered mid-semester and post-semester, I assessed participant engagement, leadership effectiveness, and the perceived benefit of the group format. This aligns with the social work values of service, integrity, and the importance of human relationships, ensuring that participant input directly informs how future groups will be adapted. My analysis drew on empowerment theory and systems theory, recognizing how personal development, institutional belonging, and academic persistence are interconnected. The cognitive process of data analysis helped identify patterns in engagement and satisfaction, while affective reflection allowed me to process the vulnerability shared in open-ended feedback. By closing the feedback loop through adaptive group planning, I demonstrated the critical role of ongoing evaluation in improving group-based interventions and ensuring their cultural and contextual relevance.