Steve Gibson (53:34)
So, okay. Three days ago, the BBC carried some news about the arrest of a pair of teens who were members of the Scattered Spider hacking collective, which we've been talking about so much recently. It's worth not losing sight of the fact that hackers are being caught and held responsible. You know, I don't say that often enough. I see the stories go by, these are those people, they got nabbed and everything, but it doesn't often make the podcast. So I thought, let's just pause here for a second to make sure people understand that these kid hackers are not getting away with this forever. Although it is weird what time delay there is; I'll explain. So, the BBC reported on this incident three days ago. They wrote: two teenagers have appeared in court facing computer hacking charges in connection with last year's cyber attack on Transport for London (TfL). The 18 and 19 year olds were charged with conspiring to commit unauthorized acts under the Computer Misuse Act, rather broad. They appeared at a hearing at Southwark Crown Court on Friday and spoke only to confirm their names. Judge Tony Baumgartner scheduled a further hearing for the 21st of November, with a trial date set for June 8th of 2026. The cyber attack caused three months of disruption to Transport for London last year and affected live Tube information, online journey history and payments on the Oyster app. I don't know what any of that is, but I guess if you're in London, you do. The teenagers were recently arrested by the National Crime Agency. So, recently arrested, meaning a lot of time went by during which they thought they'd gotten away with this. They were arrested by the National Crime Agency and City of London Police on September 16th, so, you know, a few weeks ago, and were charged two days later.
The NCA said it believed that the hack, which began on August 31st last year, was carried out by members of the cyber criminal group Scattered Spider. TfL said the hack cost it £39 million in damage and disruption. Following the hack, TfL wrote to around 5,000 customers to say there may have been unauthorized access to their personal information, such as bank account numbers, emails and home addresses. So again, 18 and 19 years old, and now they'll have an adult computer crime record for the rest of their lives. They presumably have some software skills and enjoy computing technology. But in an environment where software skills are not scarce, who in their right mind would hire either of them to do anything computer related? You know, flip burgers, fine, but stay away from our point-of-sale terminals, because you guys are computer criminals, and they always will be. So, boy, sad that they messed up by doing that. Last Thursday, the European Union found that the Facebook, Instagram and TikTok apps were, and are, in violation of the terms of the EU's DSA, which is the Digital Services Act. The act has some teeth in it for this breach, since Meta and TikTok could be fined an attention-grabbing up to 6% of their total global revenue, which is some cash, and that'll get their attention. The EU's press release explained what's going on. They wrote: Today, the European Commission preliminarily found both TikTok and Meta in breach of their obligation to grant researchers adequate access to public data under the Digital Services Act. Again, the DSA. The Commission also preliminarily found Meta, for both Instagram and Facebook, in breach of its obligations to provide their users simple mechanisms to notify of illegal content, as well as to allow them to effectively challenge content moderation decisions. Right.
There should be an easy way to do that as a user of the platform, both to notify Meta and to challenge a decision that Meta has made. The Commission's preliminary findings show that Facebook, Instagram and TikTok may have put in place burdensome procedures and tools for researchers to request access to public data. Right. We wouldn't want that, because researchers might, you know, get up to some research. This often leaves researchers with partial or unreliable data, impacting their ability to conduct research, such as into whether users, including minors, are exposed to illegal or harmful content. Allowing researchers access to platforms' data is an essential transparency obligation under the DSA, as it provides public scrutiny into the potential impact of platforms on our physical and mental health. When it comes to Meta, and this is still the European Commission speaking, their opinion after lots of research into this, neither Facebook nor Instagram appear to provide a user-friendly and easily accessible notice and action mechanism for users to flag illegal content such as child sexual abuse material and terrorism content. The mechanisms that Meta currently applies seem to impose several unnecessary steps and additional demands on users. In addition, both Facebook and Instagram appear to use so-called dark patterns, or deceptive interface designs, when it comes to the notice and action mechanisms. And of course, anybody who was trying to resist the upgrade from Windows 7 to Windows 10 a few years ago knows all about dark patterns: Would you like to update now or later, as opposed to never? Such practices, they wrote, can be confusing and dissuading. Meta's mechanisms to flag and remove illegal content may therefore be ineffective under the DSA.
Notice and action mechanisms are key to allowing EU users and trusted flaggers to inform online platforms that certain content does not comply with EU or national laws. Online platforms do not benefit from the DSA's liability exemption in cases where they have not acted expeditiously after being made aware of the presence of illegal content on their services. Okay, so on one hand you can kind of see why a platform would like to put up some resistance, a little bit of back pressure, the same way insurance companies do by denying your first claim. Then you've got to fight them a little, and then they go, okay, fine, we'll honor that, because that reduces the influx, the flood. At the same time, if they could be shown not to be responding in a timely fashion, that opens them to action under the DSA, and they lose their liability protection. So they're walking a thin line here. The EU wrote: The DSA also gives users in the EU the right to challenge content moderation decisions when platforms remove their content or suspend their accounts. At this stage, the decision appeal mechanisms of both Facebook and Instagram do not appear to allow users to provide explanations or supporting evidence to substantiate their appeals. This makes it difficult for users in the EU to further explain why they disagree with Meta's content decision, you know, arguing for its restoration, limiting the effectiveness of the appeals mechanism. Essentially, Facebook and Instagram don't want to spin up a big mechanism for doing what the DSA requires them to do. It's not going to be easy to do this. They'd rather just kind of push back a lot. The Commission writes: The Commission's views related to Meta's reporting tool, dark patterns and complaint mechanism are based on an in-depth investigation.
These are preliminary findings which do not prejudge the outcome of the investigation. Facebook, Instagram and TikTok now have the possibility to examine the documents in the Commission's investigation files and reply in writing to the Commission's preliminary findings. The platforms can take measures to remedy the breaches. In parallel, the European Board for Digital Services will be consulted. If the Commission's views are ultimately confirmed, the Commission may issue a non-compliance decision, which can trigger a fine of up to 6% of the total worldwide annual revenue of the provider. The Commission can also impose periodic penalty payments to compel a platform to comply. New possibilities for researchers will open up on October 29th of 2025, tomorrow, as the Delegated Act on Data Access comes into force. That's the next part of the DSA. This act will grant access to non-public data from very large online platforms and search engines, aiming to enhance their accountability and identify potential risks arising from their activities. Okay, so my takeaway from this is that, details aside, what all of this amounts to is more evidence of a significant changing tide for the entire online tech industry. The next 10 years are not going to look like the last 10 years. Up to this point, the online world has been an anything-goes free-for-all. This state of affairs has existed since the world began to discover an alternative to using their telephone modems to dial into AOL; it's called the Internet. In retrospect, it has taken a surprisingly long time, right? I mean, we've had decades of this, for the political class to recognize that it's able to create and then enforce regulations on the behavior of these global online behemoths. And it's probably the fault of the tech companies, who have for so long thumbed their noses at polite governmental requests for online app behavioral changes. We've been covering that throughout the life of this podcast.
The legislatures finally grew tired of asking for voluntary changes and decided to enact some laws with teeth. I expect we're going to be seeing the government compliance departments of these large companies becoming much larger, and there's going to be a need for a culture change, a change in thinking about what we get to do with tech companies online. Somewhere along the road to success and world domination, when an app's reach becomes sufficiently influential, that service begins to more closely resemble a public utility, and its influential behavior is going to be regulated. Every week we cover various aspects of this struggle because they're in the news, they're what's happening, and they are determining the shape of our future. Until now, Big Tech has had total freedom to do as it pleases in a lawless and unregulated playground. I think it should be clear to everyone by now that this status quo is changing, Leo.