Summer wrap-up for the 2023 Summer Fellows

This summer flew by, as it always does. The 2023 Summer Fellows first came to Chicago for their orientation in mid-June to kick-start their research projects (read our blog entry on the Summer Fellows’ orientation). Last week, The Learning Partnership staff, CPS teachers and administrators, and research partners gathered at DePaul University’s Jarvis College of Computing and Digital Media to hear the fellows present their research findings in a symposium, “Measuring Computer Science in Chicago Public Schools & Milwaukee Public Schools.” The fellows felt both enthusiasm and nerves before their presentations, and relief afterward at how welcoming the audience was. In this blog, the researchers reflect on what they took away from the experience, and each expressed an interest in continuing to grow through the in-depth mentorship process.
Willow Kelleigh is an undergraduate student at Mount Holyoke College. This was her second summer at The Learning Partnership and the symposium was both a culmination of her work thus far and an official announcement that she will join TLP full-time next year after her graduation. Her research examined how to connect students with CS. Here is what she had to say about her experience this summer:
How did you feel about the Symposium overall?
I really enjoyed the symposium; getting to see everyone’s research together, in context and in conversation with both researchers and practitioners, was really fantastic. With my own research, I was excited to find that the scaffolding practices I was studying were leading students to value CS more; hopefully these practices can be expanded to more CPS students in the future.
This summer I learned about multiple imputation and structural equation modeling, and it was great being able to learn new methods. After this summer, I will return to Mount Holyoke College for my senior year and complete my undergraduate degree in Statistics, after which I am happy to be returning to The Learning Partnership full-time as a data analyst. I look forward to supporting the researchers collaboratively and to working on more projects.
Chungsoo Na researched how to provide diagnostic information from the ECS assessment. Chungsoo was surprised that so many people attended, especially public school teachers who specifically teach computer science. This was further proof to him that the work he and the other fellows were conducting was relevant, a point reinforced by the presentations of two students who shared their experiences taking CS courses. Here is what he had to say:
How could students benefit from continuing education CS courses?
I think providing tailored assessment information is so critical. Cognitive Diagnostic Modeling (CDM) provides more personalized assessment information, such as what students have mastered or have not mastered yet. From this information, teachers can provide more adaptive instruction to students based on their area of growth. Furthermore, it is important to help teachers integrate this diagnostic information into their teaching practices and refine their instruction.
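To make the idea of diagnostic feedback concrete, here is a toy sketch in the spirit of what a CDM reports. Real CDMs (such as the DINA model) estimate mastery probabilities statistically; this sketch uses a much simpler proxy, the proportion correct on items requiring each skill, and all item and skill names are hypothetical:

```python
# Toy illustration of the kind of per-skill feedback Cognitive
# Diagnostic Modeling (CDM) provides. This is NOT a real CDM; it
# approximates "mastery" as a student's proportion correct on the
# items that require each skill. Items and skills are invented.

# Q-matrix: which skills each assessment item requires.
Q_MATRIX = {
    "item1": ["loops"],
    "item2": ["loops", "conditionals"],
    "item3": ["conditionals"],
    "item4": ["variables"],
}

def skill_report(responses):
    """Return per-skill proportion correct for one student.

    responses: dict mapping item -> 1 (correct) or 0 (incorrect).
    """
    totals, correct = {}, {}
    for item, skills in Q_MATRIX.items():
        for skill in skills:
            totals[skill] = totals.get(skill, 0) + 1
            correct[skill] = correct.get(skill, 0) + responses.get(item, 0)
    return {s: correct[s] / totals[s] for s in totals}

report = skill_report({"item1": 1, "item2": 0, "item3": 0, "item4": 1})
print(report)
# "loops": items 1 and 2 -> 1/2; "conditionals": items 2 and 3 -> 0/2;
# "variables": item 4 -> 1/1.
```

A report like this is what lets a teacher target instruction: this hypothetical student has mastered variables, is partway on loops, and needs support with conditionals.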
What surprised you the most in your research? Were the results different from what you expected?
Yes. Black and Hispanic students showed a substantially lower mastery probability of CS knowledge when they entered the ECS curriculum, but they benefited more from the ECS curriculum than any other ethnic group. I think this provides evidence that the ECS curriculum contributes to reducing disparities in CS education.
Tony Kirkosian focused on how the elements of effective teaching, as measured by the Tripod 7C+, could predict students’ expectancy, value, and cost as they relate to CS courses. Now back at Washington State University to finish his doctoral studies, he hopes to have the opportunity to complete another internship like the one at TLP. Although his dissertation focuses on math education and on how math attitudes have changed over the years by gender and ethnicity, the takeaways from his research at The Learning Partnership transfer well and give him a larger scope of practice. Here is what Tony thought of the symposium:
How did you feel about the Symposium overall?
I was excited and nervous, but once things got started, I felt more at ease. I hadn’t had a chance to speak with practitioners at this level before, so this is something I’d like to do again.
What surprised you the most in your research? Were the results different from what you expected?
For the most part, the results were what I expected. However, some elements of effective teaching turned out not to be significant, since they didn’t contribute much to predicting expectancy, value, and cost.
One of the high school teachers I spoke with helped me make sense of the results that were a little surprising, within the context of a tightly controlled classroom environment. Too much control could adversely affect expectancy, value, and cost. I’d need to look further to confirm this, but it’s another possibility I hadn’t considered.
Would you change anything about the way you analyzed the data?
I would look more into missing data, for example, what percent of missing data is acceptable and what to expect when preparing the data for analysis. In essence, I’d try to find ways to keep as much of the data as possible despite the missingness.
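The screening step Tony describes can be sketched in a few lines: compute the percent missing per variable and flag anything above a chosen cutoff. The 30% threshold, variable names, and data below are hypothetical; in practice the acceptable level depends on the analysis and on the imputation method used (for example, multiple imputation):

```python
# Minimal sketch of missing-data screening. None marks a missing
# value. The threshold and the sample data are hypothetical.

def missingness_report(rows, threshold=0.30):
    """rows: list of dicts (one per student record).
    Returns ({variable: fraction_missing}, [variables over threshold])."""
    counts, missing = {}, {}
    for row in rows:
        for var, value in row.items():
            counts[var] = counts.get(var, 0) + 1
            if value is None:
                missing[var] = missing.get(var, 0) + 1
    fractions = {v: missing.get(v, 0) / counts[v] for v in counts}
    flagged = [v for v, frac in fractions.items() if frac > threshold]
    return fractions, flagged

data = [
    {"attendance": 0.95, "gpa": 3.1},
    {"attendance": None, "gpa": 2.7},
    {"attendance": 0.88, "gpa": None},
    {"attendance": None, "gpa": 3.4},
]
fractions, flagged = missingness_report(data)
print(fractions)  # attendance: 2/4 missing, gpa: 1/4 missing
print(flagged)    # only "attendance" exceeds the 30% threshold
```

Variables under the threshold could then be retained and imputed rather than dropped, keeping as much of the data as possible.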
Rachel Zhou presented her work using learning analytics and log data from code.org to predict students’ assessment scores. Utilizing machine learning, she examined students’ patterns in regular learning activities to predict their final scores. Following her collaboration with The Learning Partnership, she expressed interest in continuing to pursue data mining and analysis. A highlight of her experience was working with large datasets in a real-world context and the opportunity to connect with educators, practitioners, and various stakeholders from Chicago Public Schools. This is what she had to say:
What surprised you the most in your research? Were the results different from what you expected?
Initially, I hypothesized that students nailing questions on their first attempt (likely those with a strong knowledge base) would achieve higher assessment scores. However, the data painted a different picture: it was the productively persistent students, those who faced challenges but made gradual improvements, who ultimately excelled. This suggests that how students learn matters more than what they already know.
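As a hypothetical sketch of this kind of analysis, the snippet below extracts two process features from puzzle-attempt logs: solving on the first try versus persisting to a solution. The field names and sample log are invented; a real pipeline would feed features like these into a machine-learning model to predict assessment scores:

```python
# Hypothetical sketch of learning-process features from attempt logs,
# in the spirit of analyzing code.org log data. Data are invented.

def process_features(attempts):
    """attempts: list of (puzzle_id, attempt_number, passed) tuples,
    in chronological order. Returns simple persistence features."""
    by_puzzle = {}
    for puzzle, n, passed in attempts:
        by_puzzle.setdefault(puzzle, []).append((n, passed))
    first_try_wins = sum(1 for tries in by_puzzle.values() if tries[0][1])
    eventually_solved = sum(
        1 for tries in by_puzzle.values() if any(p for _, p in tries)
    )
    return {
        "first_attempt_rate": first_try_wins / len(by_puzzle),
        # "productive persistence": solved, but not on the first try
        "persistence_rate": (eventually_solved - first_try_wins) / len(by_puzzle),
    }

log = [
    ("p1", 1, True),                                       # solved at once
    ("p2", 1, False), ("p2", 2, False), ("p2", 3, True),   # persisted
    ("p3", 1, False), ("p3", 2, False),                    # never solved
]
print(process_features(log))
# first_attempt_rate = 1/3, persistence_rate = 1/3
```

Under Rachel’s finding, a model would weight the persistence feature heavily: gradual improvement, not first-attempt success, was the stronger predictor.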
In what ways did your research connect to the other fellows’?
My approach leaned more towards exploratory research, delving deep into fine-grained process data to uncover intriguing patterns. While I shed light on “this is what students do” and identified certain patterns, my findings alone only offer a piece of the puzzle. When combined with the insights of other fellows, such as the survey evaluations from Willow Kelleigh, we can jointly provide a more comprehensive understanding of students’ learning behaviors.
Lavare Henry spoke at the symposium about factors that correlate with student failure in the Exploring Computer Science (ECS) course. He particularly enjoyed the symposium; the friendly and engaged audience allowed him to relax and not worry about his presentation. After his time at The Learning Partnership, he will return to his home country, Jamaica, to continue his work as an Assistant Principal at a public high school. He shares what surprised him most below:
What surprised you the most in your research? Were the results different from what you expected?
My study ended up being a follow-up to one done in 2018; that study looked at the factors correlating with failure prior to the implementation of the CS graduation requirement. My data covered 2016 to 2022, after the graduation requirement took effect.
The previous study found these factors significant in predicting failure: attendance rate, cumulative GPA, and gender (females were more likely to pass, males more likely to fail). Students who took ECS in their freshman year were more likely to pass, while special education, ELL, and Hispanic students had a higher likelihood of failure. In addition, a teacher’s years of experience and attendance at a PD session correlated with a lower likelihood of failure.
In my study, some factors that were significant pre-policy were no longer significant. Cumulative GPA, attendance, taking ECS in freshman year, and receiving special education services remained significant; being Hispanic, ELL status, years of prior teaching, and PD session attendance were not predictors in the main model used in the evaluation.
The number of years teaching and attendance at PD not being significant was surprising, because intuitively you would expect a teacher’s experience with the course to predict student outcomes, as it did before the policy, but this factor was not significant in my study.
A key takeaway is to look at the ECS PDs to see how they can improve; perhaps more could be done to reach students who are at risk of failing. In both studies, failure rates for males and females remained relatively the same, and boys continued to struggle post-policy implementation. Since students are more likely to drop out if they are failing a core class, and more boys are failing ECS as a core class, this requires some attention.