This chapter is about making it real: launch, momentum, and supportive dialogue. You’ll learn a straightforward kickoff script that explains purpose, time cost (10–15 minutes per week), and what participants get back. We start with one or two easy actions that fit ongoing work, so people experience success immediately. The sampling craft is practical: write right after doing, three to eight sentences answering up to three clear reflection prompts, then pick 1–4 tags and a simple feeling score. Study leaders respond, preferably within 48 hours, mirroring what they read, asking one curious follow-up question, and celebrating micro-wins to sustain energy. Tone matters: respectful, concise, and bias-aware. We cover logistics: weekly cadence, gentle reminders, calendar protection for reflection, and basic privacy routines. Finally, we address hard moments: criticism in reflections, low participation, and comment bottlenecks.
The invitation to let others take action and reflect afterwards is, for very practical reasons, tightly connected to the digital support tool tailored to DAS: an app called Loopme. Imagine inviting 50 participants to complete 10 actions each, letting them all reflect and pick 1–25 tags on each action, and then, as a study leader, commenting back on each of the 500 resulting reflections without IT support. An administrative nightmare! Therefore, a DAS study is in practice deployed by inviting all participants to log into a group in the Loopme research instrument, where they will find the 5–15 action tasks waiting for them. Some study leaders have tried other digital tools, such as Google Forms, but engagement levels have so far been disappointing. DAS relies to a large extent on a digitally mediated, trustful social relation between study leader and participants, in stark contrast to traditional surveys; hence the term Scientific Social Media.
Participants get from half an hour up to a year to complete and reflect upon some or all of their assigned action tasks, depending on study design. For each reflection, participants are also asked to grade their experience emotionally. They do this by choosing one of five emojis on a five-point Likert scale ranging from negative (−2) to positive (+2) emotion. Reminders play a crucial role in maintaining engagement and achieving a high response rate. The only known way to reach a 100% response rate is to make action-based reflection a mandatory part of student assessment in a course or programme. With employees as participants, we have never seen a 100% response rate; it varies from 15% to 85% depending on the perceived purpose and meaningfulness of the action tasks, management support (or even a requirement to participate), the reminder approach, and the swiftness and engagement style of the study leaders’ commenting.
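As a small illustration, the five-point emotion scale and the response-rate bookkeeping described above could be encoded as follows. The label names and function names are hypothetical, chosen for this sketch; Loopme’s actual emoji labels and export format may differ:

```python
# Hypothetical encoding of DAS's five-point emotion scale: each reflection
# carries one of five emojis, mapped to a Likert score from -2 (negative)
# to +2 (positive). Label names here are illustrative, not Loopme's own.
EMOTION_SCALE = {
    "very_negative": -2,
    "negative": -1,
    "neutral": 0,
    "positive": 1,
    "very_positive": 2,
}

def emotion_score(label: str) -> int:
    """Translate a chosen emoji label into its Likert score."""
    if label not in EMOTION_SCALE:
        raise ValueError(f"unknown emotion label: {label!r}")
    return EMOTION_SCALE[label]

def response_rate(reflections_received: int, reflections_expected: int) -> float:
    """Share of expected reflections actually submitted, as a percentage."""
    return 100 * reflections_received / reflections_expected
```

For example, a study expecting 500 reflections (50 participants, 10 tasks each) that receives 425 would land at an 85% response rate, at the upper end of the range reported above for employee studies.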
Having study leaders who comment on each received reflection during the data collection phase is crucial in DAS. In addition to motivating participants, who appreciate the personal feedback, it builds relational trust, invites dialogue and deep reflection, and allows for follow-up questions on particularly interesting reflections. Study leaders who show that they genuinely care about participants’ inner thoughts are rewarded with trust, honesty, improved reflective depth and thus higher overall data quality. The comments themselves are also data, containing interesting participant insights and serving as a kind of researcher field notes.
Examples of data collection
First we look at three student studies, and then two employee studies. In economics education, 60 students each year are invited to use themselves as experiential cases (Westerberg, 2022). The 500 reflections received over four months on eight different action tasks show that such micro-challenges, and the feedback provided, help students tap into their own thoughts and feelings triggered by taking action outside their comfort zone. This allows for powerful experiential learning. DAS also helps these students establish habits of deep reflection upon action. In an example from younger students, around 200 upper secondary schools across Sweden have adopted DAS, involving around 2,000 teachers as study leaders and 20,000 students reflecting through DAS, submitting around one million reflections per year. Many teachers at these schools have taken DAS to their hearts. It helps them assess and support students on work placements in a structured way, and thus significantly increases the overall quality of the experiential component in vocational education (Lackéus and Sävetun, 2021). The most detailed study on business students is my own work in entrepreneurship education. Each year since 2015, I have asked my 45 students to complete 10 of 35 possible action tasks of their choice, upon which they reflect afterwards (see Lackéus, 2024). My colleagues also give them various tasks around teamwork, identity, ethics, work environment and programme evaluation. Over nine months from September to May, these students submit around 1,000 reflections, a unique material that helps us discern how and why they become more entrepreneurial.
Turning to data collection with employees, we first go to an example from social work. 156 reflections were collected bi-weekly from 12 participants over six months and were complemented with longitudinal interviews (Tjulin and Klockmo, 2023). This helped uncover causal mechanisms, made participants reflect more deeply on their practice, and strengthened their ability to analyse their own practice (Tjulin et al., 2023). For larger DAS studies with employees as participants, we turn to school developers in primary schools. Most of these studies are unpublished, since the study leaders are teachers, school developers and principals who lack time to write publications. However, one of the largest initiatives, in Åstorp in southern Sweden, has been documented. In a study on embedding computer programming in education, 116 teachers reflected upon using various programming methods in 116 different lessons. Each teacher tried out one of eight methods, thus allowing for a comparison of outcomes depending on which method was used (Viebke, 2020). In a study on language development, 204 teachers in seven different primary schools around Åstorp tried out six different language development strategies for one year, and reflected on how these worked in their classrooms (Brandt and Viebke, 2023). Effects were probed quantitatively through 16 different tags designed to capture effects on both teachers and students.
Common challenges in the data collection phase
Many study leaders struggle with participant engagement, response rates and sustained use of DAS after the initial novelty wears off (Lackéus, 2025). Newcomers often ask how to explain the purpose and process of a DAS study to prospective participants, how to comment well on incoming reflections, and how to secure management buy-in and active support. After their first DAS study is completed, they often conclude that technical challenges around the Loopme app were fewer than they had feared, but that commenting was more time-consuming than expected. New routines for weekly commenting often need to be established and scheduled into busy calendars during an ongoing study. Many DAS study leaders are also practitioners and thus struggle on a more personal level with imposter syndrome – “who am I to do research on my colleagues?”. As they grow more experienced, they instead often express pride in being an action researcher of their own practice.
The end result of the action phase
If many of the invited participants take action and then reflect as planned, we get a highly structured dataset consisting of around 100–1,000 qualitative reflections. Each reflection is connected to a designed action task, an emotion rating, a comment thread and a few tags – those the participant felt were aligned with their experience of carrying out that particular action. This dataset can be downloaded from Loopme and analysed in Excel or in more advanced qualitative data analysis (QDA) software.
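A minimal sketch of what a first quantitative pass over such a dataset might look like outside Excel, assuming a hypothetical export structure in which each reflection carries a task name, an emotion score and a list of tags (the real Loopme export format may differ):

```python
from collections import Counter
from statistics import mean

# Hypothetical rows mimicking a Loopme export: each reflection links an
# action task, an emotion score (-2..+2) and a few participant-chosen tags.
# Task names and tags below are invented for illustration only.
reflections = [
    {"task": "Pitch an idea", "emotion": 1, "tags": ["courage", "learning"]},
    {"task": "Pitch an idea", "emotion": -1, "tags": ["discomfort"]},
    {"task": "Interview a stranger", "emotion": 2, "tags": ["learning", "joy"]},
]

# Mean emotion per action task: which tasks feel positive, which feel hard?
scores_by_task: dict[str, list[int]] = {}
for r in reflections:
    scores_by_task.setdefault(r["task"], []).append(r["emotion"])
mean_emotion = {task: mean(scores) for task, scores in scores_by_task.items()}

# Tag frequencies across all reflections: which experiences dominate?
tag_counts = Counter(tag for r in reflections for tag in r["tags"])
```

Such simple aggregates complement, but do not replace, the qualitative reading of the reflections themselves in QDA software.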