Intelligent assistants need to be wary of the risks that arise when children use them. This likely requires an understanding of who the assistant is talking to and how it should adjust. For example, a parent most likely does not want their child to have the ability to make purchases. There are also more nuanced situations, such as protecting a child from sensitive or inappropriate information. Amazon's Alexa addresses this with parental controls that require manually entered passwords, but this seems like more of a stopgap than a true solution.
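One way to picture that stopgap is as a simple permission gate: sensitive intents are blocked for child speakers unless a parent-entered PIN is supplied. This is a minimal, hypothetical sketch; the intent names, profile fields, and PIN flow are illustrative assumptions, not any real assistant's API.

```python
# Hypothetical permission gate for a voice assistant (illustrative only).

SENSITIVE_INTENTS = {"make_purchase", "play_mature_content"}

def handle_intent(intent, speaker_profile, pin_entered=None, parent_pin="0000"):
    """Allow ordinary intents; gate sensitive ones behind a parent PIN."""
    if intent not in SENSITIVE_INTENTS:
        return "allowed"
    # A recognized adult voice proceeds without extra friction.
    if speaker_profile.get("role") == "adult":
        return "allowed"
    # Otherwise fall back to the manually entered PIN (the stopgap).
    if pin_entered == parent_pin:
        return "allowed"
    return "blocked"

child = {"role": "child"}
print(handle_intent("play_music", child))              # allowed
print(handle_intent("make_purchase", child))           # blocked
print(handle_intent("make_purchase", child, "0000"))   # allowed
```

The gate's weakness mirrors the critique above: a PIN verifies knowledge of a secret, not who is actually speaking, which is why it feels like a stopgap rather than a true solution.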

If designing specifically for children, it's important to rigorously explore what the implications might be. A child's cognitive development is complex, and there is no regulatory body that will tell you whether your app may be harmful. For example, perhaps the distinction between human and computer speech should be more stark for young children.

Beyond cognitive effects, it's important to understand how a tool will affect the child's relationships. Will this product augment their relationship with their parents? Or is it simply creating a path of least resistance? Will this facilitate beneficial interactions with other children? Or will it be isolating?

While these are important concerns, it's also important to remember that these products can positively impact people if they're designed in a thoughtful and empathetic manner.
