What do children think about mobile apps that allow their parents to monitor children’s technology use? How would children redesign such apps? Below, I share findings and recommendations from a paper I co-wrote with colleagues from UMD’s Human-Computer Interaction Lab (HCIL). Today, lead author Brenna McNally presents this paper at the 2018 ACM Conference on Human Factors in Computing Systems (CHI).
What did we do? Children use mobile devices every day, and mobile monitoring applications enable parents to monitor or restrict their children’s mobile use in various ways. We explored two questions: To what extent do children consider different mobile monitoring activities appropriate? And what other mobile monitoring solutions do they envision?
How did we do it? We held two co-design sessions with Kidsteam, a team of children ages 7–11 and adults who meet regularly at the University of Maryland to design new technologies. At both sessions, children filled out a survey about their opinions on various features of a commercially available mobile monitoring app, noting whether they felt parents should or should not be able to control each one. They then drew mock-ups that redesigned the features they felt parents should not be able to control. The second session included a design activity in which our child design partners created a mobile interface to help children handle two common mobile risk scenarios: a content threat, in which a child accidentally saw inappropriate material, and a contact threat, in which a child experienced cyberbullying via instant messaging.
What did we find? Most children were comfortable with monitoring features such as letting parents see a child’s location or contacts. Comfort with parents seeing a child’s search/browsing history, social media posts, and text messages varied, with some children noting that this information could be taken out of context. With regard to restriction features, most felt comfortable with parents seeing what apps are downloaded on a device, but fewer wanted parents to be able to restrict a child’s internet access or camera use. Child partners redesigned these features to support more active mediation, for example, creating “Ask Child” or “Consult Kid” buttons to prompt conversations between parents and children before a parent unilaterally restricts or blocks something on a child’s device.
In the activity on helping children handle common mobile risk scenarios, children’s designs emphasized automatic technology interventions, such as filters that would block “bad” text messages or contacts who used bad language. Their designs also included features that provided children with immediate assistance when they encountered a concerning situation. These focused on helping the child feel better, for example, by showing cat videos or suggesting that the child play with a sibling.
What are the implications of this work? By incorporating children’s perspectives into the design process, this work suggests how mobile monitoring applications meant for parents to oversee their children’s technology use can be designed in ways that children find acceptable and beneficial. The children we worked with understood and even welcomed certain types of parental oversight, especially those related to their physical safety. However, they questioned other types of parental monitoring and restriction of their mobile activities, redesigning these features so that the tools supported children through automated means or by helping children develop their own strategies for handling negative situations. Most mobile monitoring technologies emphasize parental control, but this study suggests that children are eager for tools that help them learn about online risks and develop skills to navigate and cope with risky situations as they experience them.
Read the CHI 2018 paper for more details!
Citation: Brenna McNally, Priya Kumar, Chelsea Hordatt, Matthew Louis Mauriello, Shalmali Naik, Leyla Norooz, Alazandra Shorter, Evan Golub, and Allison Druin. 2018. Co-designing Mobile Online Safety Applications with Children. In Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems (CHI ’18). ACM, New York, NY, USA, Paper 523, 9 pages. DOI: https://doi.org/10.1145/3173574.3174097