B (49:49)
I mean, it's like, oh, the more data we have, who knows what we'll be able to do with this in the future, right? So why delete what might be of some value to us? Yikes.

Okay, so I suppose it was only a matter of time before Microsoft would decide to move its GitHub property over to its own Azure cloud infrastructure. But the details behind the move will likely be of interest to many of our listeners. The publication The New Stack provided the background for this move. They wrote: After acquiring GitHub in 2018, Microsoft mostly let the developer platform run autonomously. But in recent months that's changed. With GitHub CEO Thomas Dohmke leaving the company this August, and GitHub being folded more deeply into Microsoft's organizational structure, GitHub lost that independence. Now, according to internal GitHub documents The New Stack has seen, the next step of this deeper integration into Microsoft's structure is moving all of GitHub's infrastructure to Azure, even at the cost of delaying work on new features.

In a message to GitHub staff, CTO Vladimir Fedorov notes that GitHub is constrained on capacity in its Virginia data center. He writes, quote, it's existential for us to keep up with the demands of AI and Copilot, which are changing how people use GitHub. Well, of course, Leo. If you're going to have AI and Copilot, you've got to move to a facility that's big enough for that. The plan, Fedorov writes, is for GitHub to completely move out of its own data centers in 24 months. Quote, this means we have 18 months to execute with a six-month buffer. In his memo, Fedorov acknowledges that, since any migration of this scope will have to run in parallel on both the new and old infrastructure for at least six months, the team realistically needs to get this work done in the next 12 months, so during 2026. To do so, he's asking GitHub's teams to focus on moving to Azure over virtually everything else. Fedorov wrote, quote, we will be asking teams to delay feature work to focus on moving GitHub. We have a small opportunity window where we can delay feature work to focus, and we need to make that window as short as possible.

While GitHub had previously started work on migrating parts of its service to Azure, they write, our understanding is that these migrations have been halting and sometimes failed. There are some projects, like its data residency initiative, internally referred to as Project Proxima, which allows GitHub Enterprise users to store all of their code in Europe, that already solely use Azure's local cloud regions. Fedorov writes: We have to do this. It's existential for GitHub to have the ability to scale to meet the demands of AI and Copilot, and Azure is our path forward. We have been incrementally using more Azure capacity in places like Actions, Search, Edge Sites, and Proxima, but the time has come to go all in on this move and finish it. Unquote.

The New Stack said GitHub has recently seen more outages, in part because its central data center in Virginia is resource constrained and running into scaling issues. AI agents are part of the problem. But it's our understanding that some GitHub employees are concerned about this migration, because GitHub's MySQL clusters, which form the backbone of the service and run on bare metal servers, won't easily make the move to Azure, and could lead to even more outages going forward.
In a statement, a GitHub spokesperson confirmed our reporting and told us, quote: GitHub is migrating to Azure over the next 24 months because we believe it's the right move for our community and our teams. We need to scale faster to meet the explosive growth in developer activity and AI-powered workflows, and our current infrastructure is hitting its limits. We're prioritizing this work now because it unlocks everything else for us. Availability is job one, and this migration ensures GitHub remains the fast, reliable platform developers depend on, while positioning us to build more, ship more, and scale without limits. This is about ensuring GitHub can grow with its community at the speed and scale the future demands for open source developers. Unquote.

The New Stack added that having GitHub linked even closer to Microsoft and Azure may also be a problem, though for the most part, the recent outages and rate limits developers have been facing have been the bigger issue for the service. Microsoft has long been a good steward of GitHub's fortunes, but in the end no good service can escape the internal politics of a giant machine like Microsoft, where executives will always want to increase the size of their fiefdoms.

So to me it makes sense for Microsoft, but it also sounds as though it's going to be easier said than done. And the thing about unforeseen consequences is that they're unforeseen. Outages for any service such as GitHub, upon which so much depends, are going to be a big problem, but no one sees another way. So I have the feeling that some future Security Now podcasts will be reporting on the consequences of this move. And boy, if moving to Azure gets screwed up from a security standpoint, that's going to be a big nightmare for GitHub, because that's not something anyone wants to see happen.

So, I've been noting that the proper place for consumers to specify how they would like the Internet to treat them is their browser. That was what I loved so much about the original DNT, the Do Not Track beacon: with just the flip of a switch, just once, a user could configure their web browser to always append a DNT header to every Internet resource request. And had anyone ever cared to honor that request, which of course was the big problem, that would have been their one-and-done prohibition against tracking. Because this broad concept has merit, DNT has a newer incarnation: GPC, the Global Privacy Control. And remember, globalprivacycontrol.org is a site you can go to, and right up at the top of the page you are notified whether your browser is broadcasting the GPC signal saying "no thank you." But even though the GPC signal has been around for quite a while, today only the Brave, DuckDuckGo, and Tor browsers broadcast that signal by default. Firefox has supported GPC since release 95, but it needs to be turned on. And of course I went over to globalprivacycontrol.org with Firefox, and yep, right up at the top I got a little green light saying my browser is GPC-enabled. Sadly, and perhaps not surprisingly, there's no support for GPC from the various other Chromium-based browsers: Chrome, Edge, Vivaldi, and Opera. Anyone wishing to emit the GPC signal from any Chromium-based browser other than Brave will need to install an add-on. And there's also been no sign of GPC from Safari, which I find kind of surprising.
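For anyone curious how a page like globalprivacycontrol.org can tell, here's a minimal sketch, in TypeScript, of the client-side check the GPC proposal enables: it exposes a navigator.globalPrivacyControl boolean, and browsers that don't support GPC simply leave that property undefined.

```typescript
// Minimal client-side check for the Global Privacy Control signal.
// The GPC proposal exposes a boolean on the Navigator object; browsers
// without GPC support simply leave the property undefined.
const gpc = (navigator as Navigator & { globalPrivacyControl?: boolean })
  .globalPrivacyControl;

if (gpc === true) {
  console.log("This browser is broadcasting GPC: do not sell or share my data.");
} else {
  console.log("No GPC signal is being broadcast by this browser.");
}
```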
All of that makes news because of California's new legislation, which Governor Gavin Newsom signed last Wednesday. The Record reported: California Governor Gavin Newsom on Wednesday signed a bill which requires, that's right, requires web browsers to make it easier for Californians to opt out of allowing third parties to sell their data. The California Consumer Privacy Act, signed in 2018, gave Californians the right to send opt-out signals, but major browsers have not had to make opt-outs simple to use. The bill signed Wednesday requires browsers to set up an easy-to-find mechanism that lets Californians opt out with the push of a button, instead of having to do so repeatedly when visiting individual websites. Privacy and consumer rights advocates have been nervously waiting for Newsom to sign the bill, which passed the California legislature on September 11th. This is the first law of its kind in the country. The governor vetoed a similar but broader bill last year which also applied to mobile operating systems. Matt Schwartz, a policy analyst at Consumer Reports, said, quote, these signals are going to be available to millions more people and it's going to be much easier for them to opt out, unquote. Until now, Schwartz said, individuals who want to use a universal opt-out have had to download third-party browser extensions or use a privacy-protective browser, you know, meaning Brave, DuckDuckGo, or Tor, or Firefox if you flip the switch on.

Other bills signed by Newsom on Wednesday also give Californians important data privacy rights. One of them requires social media companies to make it easy to cancel accounts, and mandates that cancellation lead to full deletion of consumers' data. A second bolsters the state's data broker registration law by giving consumers more information about what personal data is collected by data brokers and who can obtain it.

So I did some additional research and found that this was measure AB 566, relating to opt-out preference signals. Unfortunately, it appears that we're not going to be getting it for another 14 months, since the new law doesn't take effect until January 1st, 2027. But at that time, all web browsers will need to include functionality for Californians to send an opt-out preference signal, through the browser, to businesses they visit online. The law follows the California Privacy Protection Agency's (CPPA) announcement of a joint investigative sweep with privacy enforcers in Colorado and Connecticut to investigate potential non-compliance with Global Privacy Control, the GPC signal. So at least we have some progress. Chromium browsers will need to get with the GPC plan, which basically means that the Chromium core is going to have to support it, as will Apple's Safari browser. And once we have GPC available, privacy enforcers will then be able to start investigating who is and who is not honoring the clear preference setting that will be sent by all browsers. As we saw, Do Not Track never got enforcement, and just having a GPC signal means nothing if there's no penalty for ignoring it. And while we're at it, how about we also allow our browsers to send a cookie acceptance preference signal, so that we can also dispense with all of those ridiculous cookie permission pop-ups? That would be a step forward for the world's user interfaces. Anyway, it's progress. We have the technology, and now we'll have the legislation to require its use and that it actually be honored.
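On the receiving end, detecting the signal is mechanically trivial, since GPC-enabled browsers attach a "Sec-GPC: 1" request header, just as the old DNT beacon attached "DNT: 1". Here's a minimal sketch, assuming a Node.js Express server; the gpcOptOut flag is just an illustrative name for whatever a site would consult before handing data to third parties.

```typescript
import express from "express";

const app = express();

// Browsers broadcasting Global Privacy Control attach "Sec-GPC: 1" to
// every request, just as the old Do Not Track beacon attached "DNT: 1".
app.use((req, res, next) => {
  // gpcOptOut is a hypothetical flag; downstream handlers would check it
  // before selling or sharing any data about this visitor.
  res.locals.gpcOptOut = req.header("Sec-GPC") === "1";
  next();
});

app.get("/", (req, res) => {
  res.send(
    res.locals.gpcOptOut
      ? "GPC received: this visitor has opted out of data sale and sharing."
      : "No GPC signal on this request."
  );
});

app.listen(3000);
```

Of course, as with DNT, detecting the header was never the hard part. What AB 566 and the CPPA's enforcement sweep add is the legal obligation to actually act on it.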
So, you know, it takes years, but we're getting there.

Last Tuesday, OpenAI posted a piece titled "Disrupting Malicious Uses of AI." Their little blurb pointed to a detailed and lengthy 37-page report about their efforts to block many different abuses of their technology. Among those cited, OpenAI moved to disrupt PRC, you know, People's Republic of China, espionage operations. Their security team banned ChatGPT accounts used by Chinese state-sponsored hackers to write spear phishing emails. The accounts were allegedly used by groups tracked by the infosec industry as UNK_DropPitch and UTA0388. The emails targeted Taiwan's semiconductor industry, US academia, US think tanks, and organizations representing Chinese minorities. OpenAI says threat actors primarily abuse its service to improve phishing messages rather than write malware, which is also what the threat intel company Intel 471 has observed.

Okay, now, this is certainly a good thing for them to do. We've noted how phishing email is no longer obviously grammar-impaired, which basically removes the first obvious sign that you should just hit delete rather than bothering to even read the nonsense it's spewing. But I suppose I'm not hugely impressed, because OpenAI was likely being used only because it was among the lowest hanging of the myriad available fruit. There are so many other sources of the same or similar generative AI assistance; the threat actors could even spin up their own, as we know. This feels like a battle that will always be lost. It seems to me that there's just no way AI is not now going to always be used to improve the quality of phishing emails. Regardless of what barriers any of the commercial providers put up, there will always be alternatives available. So I guess I'm not hugely impressed.

It would be difficult to find a better example of the need to continue supporting long-past-its-prime website code than is evidenced by the fact that Microsoft continues, believe it or not, to need to offer the option to reload very old web pages under its creaky old IE mode. It's true. And in fact, this email went out yesterday afternoon, and I have already heard from one of our listeners who said he just the other day encountered an instance where some government websites would not run under Edge; he had to switch to IE mode, and then the page rendered. So they're still out there. What's also true, unfortunately, is that IE's old Chakra JavaScript interpreter contains known exploitable flaws that bad guys want access to. We're talking about this ancient history today because it's apparently less ancient than we might hope. IE mode is still being exploited, to the point that Microsoft's most recent iteration of Edge has removed all of the easy-to-click buttons from the browser's UI. An unknown threat actor has been tricking Microsoft Edge users into enabling Internet Explorer mode in Edge to run their malicious code in the user's browser in order to take over their machine. These mysterious attacks have been conducted since at least August of this year, according to the Microsoft Edge security team. IE's legacy mode, or IE mode, is a separate website execution environment within Edge. It works by reloading a web page, but running the reloaded page and its code inside the old Internet Explorer engine.
And as we know, Microsoft included IE mode in Edge when it retired its official IE predecessor. I guess IE 11 was the last version of IE, but there was still some code that was dependent upon it. So to access a site under IE mode, previously, users would have to press a button or use a menu option to reload the page from Edge into the old IE execution environment. Microsoft has said that it has received credible reports that hackers were using clones of legitimate websites to instruct users to reload the cloned pages in IE mode.
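For what it's worth, a page can tell which engine it has landed in. Here's a minimal sketch of one well-known check: the legacy IE engine defines a numeric document.documentMode property, which modern engines, including Chromium-based Edge, leave undefined.

```typescript
// Sketch: detect whether this page is rendering under the legacy IE
// engine, as it would be after being reloaded into Edge's IE mode.
// document.documentMode is defined only by IE/Trident; modern engines
// leave it undefined, so its mere presence identifies the old engine.
const docMode = (document as Document & { documentMode?: number }).documentMode;

if (typeof docMode === "number") {
  console.warn(`Legacy IE engine active (document mode ${docMode}).`);
} else {
  console.log("Modern engine; IE mode is not active for this page.");
}
```

And that's exactly why the bad guys want victims reloading pages into IE mode: the page's script suddenly runs under that old engine, with all of Chakra's known exploitable flaws, rather than under a modern, hardened one.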