Carl Stillner (9:50)
Yeah, happy to. So Bright Canary is an application that currently exists for iOS. We don't have an Android version yet; it's on the roadmap. But it's a service that helps parents connect with their kids' digital lives. The way it works right now is that kids are spending upwards of five to six hours a day online outside of school, so the majority of their waking hours outside of school are spent online. And for whatever reason, kids do not share what they're doing online with their parents, so it becomes a black hole. Parents have no insight into what the kids are doing, what kind of content they're consuming, who they're interacting with. And speaking as a parent, I'm a parent of two kids, it's very hard to be an active, engaged parent if you don't know what the kids are doing for the majority of their time outside of school. So what we're trying to do is shift some of the gravitational center back toward parents so that they actually have some understanding of what their kids are doing. We provide insights into the activity: what kind of YouTube videos kids are watching, what they're searching on, who they're communicating with in terms of text messages or DMs, what they're typing into gaming platforms. And we run that all through AI layers, which do a very good job of a couple of things. One is content moderation: AI does a very good job of detecting things like subtle bullying. Machine learning models in the past were not good at that. They were really good at detecting things like profanity and pornography, but not good at detecting things like subtle bullying. So we run this all through an AI layer to alert parents as to what's going on and when things are going off the rails. You can imagine getting a text message or a push notification to your phone saying, hey, there's been some suicidal ideation, or there have been some references to drug use, in some of your kid's text messages.
And the parent can go in and actually look at those particular interactions or digital messages and see for themselves if the AI is doing a good job of alerting or not. It's not perfect, but it's getting better very quickly. And then we provide parents with guides and coaching on how to actually handle some of these difficult conversations. So what do you do if your kid's actually looking at pornography? Parents aren't well equipped to have that first conversation. So we provide AI recommendations. It's not a replacement for a therapist or psychologist, but it does a really good job of providing a foundational, conversational layer for that first and second conversation with the child. And so the parent is able to be connected with their kid not just on things that are concerning, which of course is a big focus, but also on things that are positive in the kid's life. If the kid's interested in astronomy, as a parent, I want to know that my kid is looking at YouTube videos about astronomy. That's a point of interaction I can have with my kid, especially as they get into the adolescent ages, where I can indulge that interest and feel more connected to the kid if I have that information at my disposal. As a platform, we provide a graduated approach to how much you want to monitor your kids. For your 10-year-old, you probably want to see everything they're doing. I have an 11-year-old; he has no notion of privacy. He doesn't care if I'm looking at all his text messages. In fact, he kind of likes it. He's heard the Internet's a scary place, and he likes the monitoring as a comfort blanket. I also have a 14-year-old who does have a notion of privacy, and so I don't want to read every one of my 14-year-old's text messages. I have no interest in that, and I don't think that forges a great relationship with him either. But I do want to know if things are going off the rails.
And so I can rely on the AI to provide me that without having to go and read every one of the exchanges he's having with his friends. So we're trying to build a platform that recognizes that an 8-year-old is vastly different from a 16-year-old in terms of maturity and the needs associated with that, and AI allows us to do that quite readily. It's a subscription app that's available in the App Store, and we charge on a monthly and annual basis. We provide that kind of qualitative understanding of what your kids are doing. There are plenty of tools that provide quantitative restrictions, including Apple Screen Time for iOS, which allows you to block certain times of day, et cetera, and we think that's an important part of it. But we're really focused on the qualitative side of things. And then to the second part of your question, in terms of the concerns kids are facing right now: there's a litany, unfortunately. There's a lot we could talk about on this point. We've seen a huge mental health crisis affecting kids. It really started around 2013, which is when devices and social media started propagating to younger ages. This is not something that is questionable at this point; it's a fact. The data suggest that kids are facing increased incidence of depression, anxiety, loneliness, and body image issues, in particular with girls. And this is all being driven by the algorithmic nature of these platforms. They're highly addictive to kids, who don't have a developed prefrontal cortex and so don't have developed executive function. We don't let kids under 18 drink or smoke or gamble for this exact reason, but we're giving them a highly addictive tool and device and saying, good luck with it. That addictive nature is ultimately at the center of all this.
But it drives lots of things, like sleep disruption from having a device in a kid's bedroom at night. There have been studies on how much that impairs the quantity and quality of sleep, which is hugely damaging at an age when there's so much cognitive development happening. There's cyberbullying, which happens in early adolescence and is very common on digital platforms. It's a lot easier to cyberbully someone when you're not speaking to their face; it kind of depersonalizes it. So we're seeing a much higher incidence of cyberbullying versus traditional bullying in the real world, because again, it's somewhat anonymized and it's easier. There's social comparison and insecurity, predominantly in the middle school years, when kids are going through a lot of change and are highly susceptible to social comparison and the feeling of being left out. With older kids, we're seeing a lot around dangerous content and drug use. There are lots of drug dealers using Snapchat as a distribution platform. There are plenty of high-profile cases you can read about, unfortunately, where kids have been contacted by drug dealers on Snapchat and bought things they thought were one drug but were laced with fentanyl, with all kinds of tragic outcomes. That's more commonly associated with high school or late middle school. And something I'm particularly concerned about is the crowding out of interests. I don't think this is talked about enough, but if a kid's spending four hours a day on social media, that's four hours a day they aren't spending playing sports, playing an instrument, engaging in the debate team, or a variety of other things they could be doing. It really does cannibalize their time.
And additionally, it's been shown that there's a lot of attention fragmentation going on. You'll see kids who aren't able to read a book anymore, who aren't able to even watch a full-length movie anymore, because they've been conditioned to want short-form videos, and that has an effect whose ramifications we don't know. I don't know what that's going to look like five or ten years from now, when these people are entering the workforce. Maybe it won't be relevant, maybe it will be, but it's certainly a huge change that has happened very quickly.
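[Editor's note: the graduated, age-based monitoring described earlier, full visibility for a younger child, alert-only visibility for a teen, could be configured along these lines. The tier names and the age threshold are purely illustrative assumptions, not Bright Canary's actual settings.]

```python
# Illustrative sketch of graduated, age-based monitoring tiers.
def monitoring_tier(age: int) -> str:
    # Hypothetical cutoff: full visibility below 13, alerts-only above.
    if age < 13:
        return "full"        # parent can review all activity
    return "alerts_only"     # parent sees AI alerts, not every message

def parent_can_read(age: int, flagged: bool) -> bool:
    """Whether a given message is surfaced to the parent."""
    return monitoring_tier(age) == "full" or flagged

print(parent_can_read(11, flagged=False))  # → True  (younger child: see everything)
print(parent_can_read(14, flagged=False))  # → False (teen: privacy preserved)
print(parent_can_read(14, flagged=True))   # → True  (teen: only flagged content)
```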