Research: Experiments

Make it. Break it. Test it. Show it.

Creating and conducting experiments forms the core of the work of any academic worth their salt. We design spaces to capture information, we create circumstances to test theory, and we establish evidence to support our claims—it’s elementary.

Over the centuries, the evolution of the experimental method has been tumultuous. For a time, scientists only observed objective qualities and dismissed everything else as noise—or worse, as unscientific. Technology has, to some extent, expanded what is observable. Brave pioneers have championed human-centric and socially derived methodologies in intellectual revolt. But twentieth-century lab culture carries on, favoring highly controlled and unrealistic experiments.

The prevailing attitude is that anything less than such a setup undermines your academic status and makes your results suspect. The problem is that this mindset excludes the living social domains that cannot be captured in a lab setting. Most concerning, findings in a lab don’t always transfer to a natural world riddled with complexity.

This is not to say that such controlled and limited methods are worthless, but that academia needs to build proficiency in methods that better describe real and complex phenomena.

Purpose of my work

My research know-how sits between the worlds of old science and new-age practice. I have research projects that occur in controlled settings and focus on studying objective data, and others that trade strict constraints for realistic samples and genuine activity.

My heart lives for complexity, so I am building up my toolset in qualitative methods. I want to study active systems, not lab conditions. I am improving my ability to isolate theory in dynamic settings, helping to connect academic inquiry with professional practice.

Outline of projects

My research began in controlled laboratory experiments and has opened up into dynamic university classrooms. My goal is to improve skills in participatory research, getting closer to understanding complex worlds and leading design initiatives.


Dimensional nameability in category learning

Seen below as: Language and learning

We developed a learning task involving a special type of problem—Physical Bongard Problems—and examined the relationship between language and the difficulty of these problems. We ran three experiments with participants recruited from Amazon Mechanical Turk to tease out the relationships between language and difficulty.

Instructor perceptions of active learning

Seen below as: A space for active learning

Active learning is a buzzword among university faculty, but what does it mean and what does it look like? We worked with a multi-department active learning group at UW-Madison to study course instructors, analyzing their survey responses to examine the relationships among tools, strategies, and experiences in active learning courses.

Student perceptions of active learning

Seen below as: Perceptions of active learning

A continuation of the previous study. We focused on understanding students in these environments—more akin to a user study. We developed a number of surveys, conducted interviews with TAs, and organized a student focus group. We wanted to connect guiding principles of active learning to what students were actually doing.

Language and learning

Below are excerpts from my paper with added commentary. Each addresses a main theme and presents a partial view of the whole discussion. Check out the full text for an in-depth reading.

A language problem

Commentary: The focus of this research was language. We wanted to see whether describing something with more language translated to increased difficulty. We needed a type of problem whose solutions did not drastically change in nature from easy to hard—that is, a harder answer would have a similar composition to a simple answer, but be longer. We settled on a problem type that maintained similar language from easy to hard but modulated the required length and relationships of that language.

Paper: The nameability of a dimension can be exemplified with physical Bongard problems (PBP; Weitnauer, 2013; see image below). PBPs have multiple dimensions that a learner can pay attention to; however, a single rule separates the scenes and creates a categorical boundary.

To solve the PBP, a learner must find the rule that distinguishes the left and right scenes. We test whether the degree to which the problem-relevant dimensions are nameable mediates the difficulty of the PBP (i.e., less nameable dimensions are more difficult to solve).

PBPs involve complex problem solving, where forming a relevant hypothesis is non-obvious but critical to solving the task. This makes PBPs a good testbed for studying how nameability interacts with active and passive category learning.

Two Bongard problems. Left: rolls vs doesn't roll. Right: squares vs circles
Establishing a baseline

Commentary: With the problem type selected, we created a pilot experiment to record the difficulty and generate response patterns from participants. This allowed us to establish a baseline for our future experiments and assess the validity of the problems.

Paper: Participants’ descriptions were compared against the normative solutions in the PBP documentation and coded for correctness. The mean performance on PBP descriptions is visualized in the image below. We calculated the within-subject standard errors using the Loftus and Masson (1994) method.

Mean accuracy on the PBP descriptions ranged from 12% to 81% across the 11 items. Response times ranged from 23 seconds to 66 seconds. Decreasing accuracy on PBPs correlated with longer response times, r(259) = -0.344, p < .001.
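The Loftus and Masson (1994) within-subject standard errors mentioned above are often approximated by removing between-subject variability before computing the error bars. A minimal Python sketch of that normalization idea (the column names and toy data are my assumptions, not the paper's actual analysis code, which may have used the original ANOVA-based formula):

```python
import pandas as pd

def within_subject_se(df, subject="subject", condition="condition", value="accuracy"):
    """Approximate within-subject SEs in the spirit of Loftus & Masson (1994):
    center each subject's scores on the grand mean so stable between-subject
    differences drop out, then take the per-condition standard error of the
    adjusted scores."""
    grand_mean = df[value].mean()
    subject_means = df.groupby(subject)[value].transform("mean")
    adjusted = df[value] - subject_means + grand_mean
    return df.assign(_adj=adjusted).groupby(condition)["_adj"].sem()

# Toy data: two subjects, two conditions ("a" harder than "b" for both).
toy = pd.DataFrame({
    "subject":   [1, 1, 2, 2],
    "condition": ["a", "b", "a", "b"],
    "accuracy":  [0.2, 0.4, 0.6, 0.8],
})
print(within_subject_se(toy))
```

Because the condition effect here is identical for both subjects, the adjusted scores have zero spread and both within-subject SEs come out as 0, even though the raw scores differ a lot between subjects—exactly the variability this method is meant to discount.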

Mean accuracy of participant PBP descriptions across the 11 PBP.
Connecting language to difficulty

Commentary: We have baseline data. Now we need a measure that represents the level of language involved in a given problem. The baseline data and the measure can then be assessed with predictive models to establish a relationship.

Paper: In order to tap into how easily participants can verbalize problem-relevant dimensions, we need to create independent problems that isolate the dimensions relevant to the solution as equally as possible. We then test how easily these dimensions are named.

We can then use this nameability measure to predict performance from Experiment 1 on the original PBP task. A relationship would provide evidence that the nameability of the dimensions is related to how solvable the problem is.

In this experiment, participants examined specially designed stimuli that isolated the dimensions relevant to solving the original PBPs and were prompted to verbalize these dimensions.

Does dimensionality predict performance?

Commentary: The study itself was a pilot to examine this research area. We did establish a relationship between language and difficulty, green-lighting additional research. We also identified concerns in our methodology that would require more attention in future work.

Paper: The difficulty of the PBP used in this study can be understood from multiple perspectives. One is that PBPs vary in the complexity of the physical laws, the dimensions of various properties, and the rarity with which one encounters certain properties. Another way to think about our results is that harder problems may be more verbally complex due to the increasing complexity of non-linguistic properties, and thus one needs more words to describe the problem.

Our results are not able to disambiguate between these two possibilities. However, the fact that verbal response complexity in a more simplified version of the PBPs relates to accuracy on the original problems is consistent with the hypothesis that language has a causal role in finding solutions to PBPs.

This account predicts that the verbalizability of dimensions should affect participants’ performance differently depending on the extent to which the learning task promotes hypothesis formation.

Correlation between dimensional nameability and performance on PBP.
Active versus passive engagement

Commentary: Our previous experiments established a relationship between language and difficulty in PBPs. We created a final experiment to play with different methods of engagement. In the previous experiments, participants only focused on describing and solving the PBPs. Here we designed an interface and experience to see if the language effect was still present and if we could mediate learning with our design.

Paper: Web-based category learning tasks were constructed to display the PBP scenes in the center of the screen (see image below). Participants were asked to sort individual images into two categories by learning what rule distinguished them. For every participant, PBP scenes were randomly positioned in the center of the screen.

Sorting boxes were positioned to the left and right sides of the PBP scenes. When a single PBP scene was dragged and dropped into one of the dark gray sorting boxes, a red “incorrect” or green “correct” message provided user feedback. If participants placed the PBP scene incorrectly, it would move back to the center of the screen.

A correctly placed PBP scene would be locked in place and could not be dragged again. In the active sorting task, participants could move any scene to the left or right and receive feedback. In the passive sorting task, participants were only allowed to move a single randomly determined scene. The single scene was highlighted with a yellow border.

Participants could move any scene to the left or right and receive feedback.

After sorting the PBP scenes, participants advanced to the rule generalization phase. In this testing phase, participants sorted 8 additional PBP scenes into right and left positioned dark gray sorting boxes. The scenes from the previous sorting task were left up during the testing phase as a reminder of category membership on the right and left sides (see image below).

Participants moved the center scene to the left or right boxes with no feedback given when dropped.

In Experiment 2, PBP performance was determined by whether participants named the rule after the sorting and testing tasks. We confirmed the relationship between dimensional nameability and PBP performance by fitting a logistic mixed-effects model predicting PBP performance from the dimensional nameability measure of Experiment 2, b = -2.07, t(184) = -870.6, p < .001.

The PBPs were added as a random effect to the model. Although there were quantified relationships between PBP difficulty and our dimensional nameability measure, active and passive learning showed no interaction with nameability in solving PBPs.
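A logistic mixed-effects model with a random intercept for each problem, as described above, can be sketched in Python with statsmodels' variational-Bayes mixed GLM. This is an illustrative stand-in, not the paper's analysis: the column names and the simulated data (where solving gets less likely as the nameability measure increases, mirroring the negative coefficient) are my assumptions.

```python
import numpy as np
import pandas as pd
from statsmodels.genmod.bayes_mixed_glm import BinomialBayesMixedGLM

# Simulated stand-in data: 11 PBPs, a per-trial nameability score,
# and a binary "solved" outcome.
rng = np.random.default_rng(1)
n = 400
df = pd.DataFrame({
    "pbp": rng.integers(0, 11, n),
    "nameability": rng.normal(size=n),
})
p = 1.0 / (1.0 + np.exp(-(0.5 - 2.0 * df["nameability"])))
df["solved"] = (rng.random(n) < p).astype(int)

# Fixed effect of nameability; random intercept for each PBP.
model = BinomialBayesMixedGLM.from_formula(
    "solved ~ nameability", {"pbp": "0 + C(pbp)"}, df)
result = model.fit_vb()  # variational Bayes fit
print(result.summary())
```

The random intercepts absorb problem-to-problem differences in baseline difficulty, so the fixed-effect slope for nameability reflects the within-problem relationship rather than differences between PBPs.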

A space for active learning

Below are excerpts from my paper with added commentary. Each addresses a main theme and presents a partial view of the whole discussion. Check out the full text for an in-depth reading.

What is active learning?

Commentary: A small crack team of ambitious grads networked with a group on campus and set up a research contract. Initial research would be exploratory, helping us get oriented to the literature and research environment.

Paper: Generally, active learning involves small group and interactive activities to foster student engagement, which facilitates the learning process [3,9]. The goal of this educational stance is not to redefine learning, but to transform a passive lecture into an active learning experience and engage students on a deeper level.

The literature on active learning is far from mature; scholars still argue over what constitutes active learning [2,10]. Experts agree on certain strategies that are integral to active learning, namely cooperation and problem-based learning.

The strategy of cooperation reduces the focus on individual learning and places a strong emphasis on students learning together through interaction. In cooperation, students have individual accountability, mutual interdependence, face-to-face interaction, interpersonal skill development, and an understanding of team functioning [9].

In problem-based learning (PBL), focus is placed on student activity, small groups are common, teachers take on roles as guides and facilitators, learning leans towards being self-directed, and the work is to solve meaningful problems [10].

What kind of tools are in active learning?

Commentary: A goal of our research was to inform the group’s director about relationships and activities occurring in the space. Our report would be used to improve their workflow and help establish guidelines for instructors teaching in that environment.

Paper: Greater understanding of how instructors use tools and perceive instructional strategies will provide insight into the theories underlying active learning and into ways to better support instructors. To explore this relationship, we chose a selection of instructional strategies and tools that we believe are representative of the wide range of tools and technologies currently in use (active and traditional).

For the instructional tools, we chose technologies that we thought span the spectrum between tools mainly associated with traditional lecture-style classrooms and those generally associated with active learning classrooms. One such tool is a document camera, a real-time image capture device for displaying objects to a large audience using a camera and projector.

Similar to the overhead projector, it is often used to present content to a large group whose focus is directed to one place. The document camera can scale from small room use to large lecture halls and is fairly common in most traditional lecture-style classrooms.

On the other end of the spectrum is the use of monitor displays, which are generally associated with active learning classrooms. These large monitors are located around classrooms and can display content from personal or class computers. These are used by groups and individuals to work on different material, share resources, and display information.

The last tool we included was fixed whiteboards (mounted on walls near tables and desks). The whiteboards represent a middle area on the tool-use spectrum, as they are easily associated with both types of classrooms.

Such a tool is most assuredly equipped with markers and erasers, and its placement could be around the room or near the instructor podium (similar to monitor displays). We believe these instructional tools are representative of the spectrum between passive and active style classrooms.

What kind of strategies do instructors use?

Paper: For instructional strategies, we selected strategies that are fairly established in the active learning literature [3]:

  • Whole class discussion: all students discuss questions or share ideas with the whole class. Open discussion can help facilitate problem solving approaches from different student perspectives. A large group discussion may be a gathering of preexisting small groups.
  • Assigned student groups: students collaboratively generate questions, address problems, or develop solutions. Groups could be encouraged to share with other groups and ask other students, teaching assistants, or instructors for feedback and insight. Groups include pairs or 3-6 students per group.
  • Student(s)-to-instructor questioning: students ask instructors and teaching assistants about the learning material. Students can identify misconceptions and receive corrective feedback.
  • Online instant feedback: Students receive instant feedback on the course material from online quizzes and exercises. Computerized feedback can help students check learning progression and correct misconceptions.
Research question

Paper: How does the use of particular instructional tools (e.g., whiteboards) relate to the perceived effectiveness of instructional strategies (e.g., small group learning, instructor scaffolding)?

The method

Commentary: We had uncovered some pretty interesting findings. We used qualitative data to support and understand the quantitative relationships.

Paper: To investigate the perceived effectiveness of instructional tools and instructional strategies, we examined survey data collected from WisCEL instructors regarding their course design. We conducted a correlational analysis using the Spearman method and pairwise deletion. We also examined the response data for trends to help explain the relationships uncovered by the correlational analysis.
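Spearman correlation with pairwise deletion, as described above, is what pandas does by default when computing rank correlations over columns that contain missing responses. A small sketch—the column names and ratings here are hypothetical, not the actual survey data:

```python
import pandas as pd

# Hypothetical instructor ratings; None marks a skipped survey item.
responses = pd.DataFrame({
    "document_camera":        [1, 3, None, 4, 2],
    "whole_class_discussion": [2, 4, 5, None, 1],
    "online_feedback":        [5, 2, 1, 3, None],
})

# method="spearman" ranks each column; pandas drops missing values
# pair by pair (pairwise deletion), so each correlation uses all
# rows where both columns are observed.
rho = responses.corr(method="spearman")
print(rho.round(2))
```

Pairwise deletion keeps more of a sparse survey dataset than listwise deletion would, at the cost that each cell of the correlation matrix may be computed from a slightly different subset of respondents.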

Data
Instructor responses for instructional tools and instructional strategies.
Correlation between instructional tools most frequently used and most effective instructional strategies.

The investigation of WisCEL provided a deeper insight into the interactions occurring between the instructional tools and strategies. From our data, many instructors used the monitor displays (42.5%) and fixed whiteboards (32.5%). Fewer instructors reported using the document camera (20%).

Many instructors reported student-to-instructor questioning (50%) and online instant feedback as most effective (55%), while fewer chose whole class discussion (22.5%) and assigned student groups (20%) as most effective. This is surprising, as discussions and group work are fairly typical in active learning spaces and are commonly mentioned in the active learning literature [3,7,9].

A possible explanation is that instructors were less aware of the effectiveness of class discussions and group work, where their role and interactions were reduced. While this explanation makes sense in terms of the instructor’s high involvement with student-to-instructor questioning, it conflicts with the instructor’s reduced involvement in online instant feedback.

More work would need to be done in order to uncover why more instructors identified online instant feedback as highly effective as compared to the number who chose class discussions and group work.

Understanding nuances in the space

Commentary: Our analysis would help direct future work in this space. By uncovering relationships, researchers can design experiments around them to answer specific questions.

Paper: Our main goal for studying this space was to understand and document the interactions between the instructional tools and strategies. The most significant interactions involved the usage of the document camera and fixed whiteboards. Use of the document camera was strongly associated with the effectiveness of whole class discussion (see Table above).

The usage of the document camera (20%) and perceived effectiveness of the whole class discussion (20%) were the lowest when compared to all other active learning techniques, but a strong relationship between these suggests that they were often used together. Using the document camera may be a common way to support whole class discussions and engage students in learning.

However, document cameras were negatively correlated with online instant feedback. Instructors who found online instant feedback effective were unlikely to be those who used document cameras. We think that document cameras did not align well with implementations of online instant feedback.

Unexpected results

Commentary: Some relationships suggest that certain tools are underused or not used as intended. This may be related to a lack of training, curriculum design, or floor plan design. Findings like this can be the most helpful for the director, as correcting them could have a significant impact on learning outcomes.

Paper: Monitor displays showed significant, but weaker correlations (see Table above). However, the trends in usage of the monitor displays were perplexing. Use of the displays was negatively correlated with assigned student groups.

This finding is confusing, as the monitors are set up to support group work (this conflicts with our first hypothesis). A possible explanation is that instructors used the monitor displays and students looked to them as a class, rather than as groups (as one of the WisCEL instructors commented).

Qualitative substantiation

Commentary: Quantitative relationships can surface anomalies—such as the monitor usage. From the collected qualitative data, we could weave a story to explain those anomalies and generate a meaningful understanding. With a balance of both methods, we can build consistent and rich interpretations.

Paper: Instructors who used the monitors for presentation were often sharing fixed and unchanging information, as suggested by comments that “the computers and monitors throughout the room were most effective for communication and learning” and “the Monitor displays throughout the room were helpful during exams if we wanted to convey any information to them.”

Instructors who used the traditional whiteboard as their means of communication used it to convey changing and unfolding information, as suggested by comments that “the whiteboards allowed slowly unpacking complex content” and “the whiteboards ... were most effective because ... instructors used them to clarify material or introduce activities”.

The comments around the effectiveness of the document camera were that “the document camera allowed us to keep a large class all on the same page by displaying the group work problems.” Other instructors used the document camera to “display sketches and diagrams of problems and solutions.”

These comments show how instructors shared changing and fixed types of information through their use of the three instructional tools. Additionally, document cameras and whiteboards served a separate purpose for group work.

In this case, the document cameras were used to display information to the groups, as one instructor stated: “We could display a problem set on the monitors and students in each group ... would solve the problems. As instructors, it was useful to get around and catch these mistakes.”

Meanwhile, the whiteboard was used by groups for collaborative work, as stated in the comment: “The whiteboards were particularly helpful in getting students in a group of six focused on the one problem.” From the qualitative data, instructors are using instructional tools at different times in alignment with different instructional strategies.

Perceptions of active learning

Below are excerpts from my paper with added commentary. Each addresses a main theme and presents a partial view of the whole discussion. Check out the full text for an in-depth reading.

Getting to know the students

Commentary: As a follow-up study, we collected and analyzed qualitative data from the students—important stakeholders in the active learning courses. Research in this area would provide support for which strategies and tools were working for students, and how the space could improve student experiences. We administered multiple surveys over the semester, conducted a focus group, and interviewed a TA.

Paper: The survey asked questions about (1) whether students used the learning strategies, (2) how often they used the strategies, and (3) why students find a given strategy useful.

Focus groups asked about how students used or did not use the strategies.

An interview with a teaching assistant (TA) asked about the ECE course design, how the TA interacted with students, and how the TA perceived students’ strategy use.

The domain of interest

Commentary: The active learning space had a mission to instill specific learning strategies in the students. Our task was to assess student strategy adoption, determine how strategies were useful, and identify pain points in course experiences.

Paper: Learning strategies were introduced through a handout and with repeated reminders throughout the semester. They were also introduced in a lecture video in ECE 330. Students reported using learning strategies during class and outside of class time.

For example, before class, students wrote down key ideas while watching lecture videos. During class, students wrote solutions to practice problems. After class and in preparation for exams, students would re-do problems.

In the survey, students reported which strategies they used often and why strategies were helpful. In the focus groups, students reported how some strategies were difficult to adopt on a regular basis, but were useful for particular purposes (e.g., exam prep). The table below summarizes student responses for each of the instructors’ suggested strategies.

Strategy 1: Write out key ideas from each lecture video

Paper: Students are using their notes from the video lectures for a number of purposes: exam prep, in-class exercise help, and paying attention during the videos. Students mention that writing out the key ideas helps them understand the process of what they are learning as well as reinforcing important concepts.

Students in the focus groups mentioned that they took notes during the video lecture, most saying that they preferred to use the powerpoint slides (instead of those notes) during in-class exercises. Students mentioned they enjoyed the quality of the videos, and many attributed their successful use of the videos to the quality.

What students said (survey)

  • “I can refer to it on class exercise”
  • “I can review the material before exam”
  • “Taking notes on the lecture videos. It helps me pay more attention and actually process the videos.”
  • “It helps me fix things in my head, and when I try to derive the same things that I watched, I get to see if I can do it on my own or if I hadn't understood it yet.”

What students said (focus group)

  • “If I don’t take notes during the video, I tend to zone out. Taking notes helps me stay on track during the video. Then I can reread my notes later.”
  • “The video length is fine. They're like 15 minutes. Sometimes they are 20 minutes, and there will be 3 of them. It’s the worst when all of them are 20 minutes. Nobody wants to be sitting there for an hour watching a video.”
  • “For me, english is my second language. It is difficult for me to understand the videos. Can’t understand it, so we can only know it from the exercises.”
Strategy 2: Write out solutions for classroom/homework problems

Paper: Students used this strategy to learn the process of problems and understand fundamental ways to reach the correct answers. Having the worked solutions saved also proved beneficial for later exam prep. However, writing the solutions out takes a great deal of time and effort.

The focus groups revealed that a lot of stress was associated with these parts of exercises and homeworks. Students expressed interest in having more time to perform this strategy and having more examples to see and learn from.

What students said (survey)

  • “I like having documentation of difficult problems so that I can go back and review that problem before the assessment.”
  • “Making notes along the way of why I'm doing what I'm doing. The notes help me understand where I might go wrong if I do the problem again and it helps clear up confusion.”

What students said (focus group)

  • “It’s like a ticking time bomb. You know you're going to take more time than most people. But your time ends when the class ends. You're stressing out about the limited time given to write out solutions.”
  • “There are so many terms to carry out, and I know what I am doing. At the beginning I tried, but there is not a ton of room to have your notebook out. The laptops are huge, and you don’t want to be inconvenient, it’s a pain to write stuff, and it’s a lot to write out.”
  • “The class time is limited. So if you feel like you can do it, you just write the answer.”
Strategy 3: Re-do problems without looking at the answers

Paper: Students primarily used this strategy during exam preparation. In the additional comment sections of our survey, students expressed interest in having more ungraded practice problems and exercises available.

What students said (survey)

  • “Reinforces the concepts, and make sure I've learned from my mistakes that I may have made earlier.”
  • “Reworking problems without looking at the solutions has helped in identifying common errors I tend to make, as well as assisting memorization of formulas.”

What students said (focus group)

  • “I do it for review for quizzes and exams, but don’t do it during other times. I think it would help me stay on track, but I don’t do it.”
  • “If I feel like I am good at this kind of problem, then I will go to a new problem. There’s also a comprehensive review, which is followed by TA help sessions. You are given all these chances to figure it out on your way to the exam.”
Strategy 4: Jump between topics or types of problems

Paper: Students did not have much to say about this strategy. We would be interested in evaluating the structure of the course and organization of the learning material to understand how students are interacting with the interleaving strategy.

What students said (survey)

  • “I jump around and do different problems instead of the same type repetitively. This helps me learn a more broad spectrum of the problems that will be on the exams.”

What students said (focus group)

  • “There’s also a comprehensive review, which is followed by TA help sessions. You are given all these chances to figure it out on your way to the exam.”
Strategy 5: Study regularly instead of cramming

Paper: From the data, it is hard to draw a concrete conclusion about how students interacted with this strategy. The information from the focus group led us to believe that the course structure supports learning in a way that made cramming less of a problem—students knew what was required and did not need to cram.

What students said (survey)

  • Note: Overall, this question had high variability in student responses. Many students did not submit a coherent response. This question should be redesigned to better assess its intended purpose.

What students said (focus group)

  • “There is no way to get through the course without having a basic understanding. So there is not a lot left to understand as it is covered in class and the homework. You have a good understanding of most of the concepts. So I review a day before only, it’s not that hard”
Strategy 6: Come to office hours for help with any course topics

Paper: While quite a few students reported not using office hours, those that did found it helpful. During the focus group, the weekly email reminders were said to act as a motivator to actually go and ask questions.

What students said (survey)

  • “Gave me a lot of insight into how I should be approaching homework/assessments as well as teaching me tricks and lessons that were valuable towards understanding the material.”

What students said (focus group)

  • “Every week she would email us office hours to remind us that she exists and we could come and ask questions. Even though it was the exact same time, every week she would do it. And when you read those emails, you say, "maybe I should go." It was helpful and a good reminder”
Strategy 7: Review the educational objectives

Paper: The strategy is valuable for keeping track of what is expected to be learned. Students also used the educational objectives to set goals and monitor progress. However, for some students the amount of effort and coordination reduced their use of this strategy. During the focus group, a student expressed interest in having weekly print-outs to use.

What students said (survey)

  • “The PDF that includes the educational objectives gives a clear indication of what is going to be on an assessment. This allows me to determine what I know and what I need to review.”

What students said (focus group)

  • “I did it at first, but gave up because it was too much work. But when I did, I would go over the objectives and answer them as best I could to see if I knew the concept.”
  • “I use to tick off each objective as I understood what it meant. Eventually, there’s a lot of stuff to do, and you forget to keep that up. I guess I will try it again next semester.”
  • “The list was uploaded in the beginning of the semester. Its like 6-7 pages. It had demarcations according to the exams and midterms. I stopped printing it out when I had large time commitments and didn’t pick it up again.”