Storytelling With Data: Let's Practice (PDF)
…it is not unusual for interviewers to adopt a series of techniques in order to reach ‘saturation’. While such methods may prove successful, they may also encourage participants to behave in ways that are not representative of their typical responses. An interviewer who senses that an interview is going nowhere may resort to a technique intended to elicit a response. That technique then becomes a ‘stick’: all of the participant’s subsequent responses are squeezed into the desired answer, and the interview fails to yield any truly ‘fresh’ data.
For example, in a qualitative interview, saturation would be judged by whether the participant’s responses, while not contradicting earlier data, still offered new insights or information. If a participant provided no new information, then at some point the interview would have yielded enough data to be considered ‘saturated’.
This criticism was borne out in our own research, when we applied these principles to data collected in our recent study of the phenomenon of data saturation (Hill et al. 2014). We took a different approach, assessing the effect of data saturation on the efficacy of a particular interview technique across seven pilot interviews. We found that saturation had no significant effect on the interview technique, neither in the amount of data collected (p = 0.40) nor in the quality of the data collected (p = 0.60) (Hill et al. 2014).
Legard et al. (2003) have also attempted to measure data saturation by calculating the average number of interview techniques a participant uses, given that a particular technique is successful. This calculation is based on the premise that: ‘If a technique is successful, it must be repeated, in order to demonstrate that the technique is effective. In such cases, the interview will continue until the technique is no longer effective, or the technique is either switched to a different technique or abandoned altogether’ (Legard et al. 2003: p. 3). However, Legard et al. (2003) recognise that this calculation may not be a reliable measure of data saturation, a concern echoed by critics such as Kitzinger and Brooker (2006).
We adopted a new format to tell this story, which we called the Spotify Data Story, and we hope it will strike a chord with those who have been part of the data-storytelling movement. We want to give back and share our insights so that the next generation of data storytellers can build on our work.
These data stories are a vital part of the Spotify customer experience, and we wanted to include them in every offer we made to our users. Our core goal was to provide a seamless experience that responded to the listener’s actions and identified trends in their activity. We wanted to help Spotify users discover their closest connections — their Spotify friends — and to provide personalized recommendations based on what they had already listened to.