B (56:37)
That is fingerprinting. That's the fancy way of saying, yeah, you know, deleting your cookies and then getting your browser re-identified anyway. Script blocking, they said, is a feature that will block scripts engaging in known prevalent techniques for browser re-identification in third-party contexts. These techniques typically involve the misuse of existing browser APIs, meaning, you know, JavaScript stuff that we've talked about, like battery level and canvas drawing, where subtle changes in what the pixels end up being set to can be read back, they said, to extract additional information about the user's browser or device characteristics. In other words, a fingerprint. They said this feature uses a list-based approach, where only domains marked as impacted by script blocking on the Masked Domain List, the MDL (we'll explain all this in a minute), in a third-party context will be impacted. In other words, blocked; they don't want to say that for some reason. When the feature is enabled, Chrome will check network requests against the blocklist. They say Google will use Chromium's subresource filter component, which is responsible for tagging and filtering subresource requests, meaning third-party requests, based on page-level activation signals, and a ruleset is used to match URLs for filtering. So this is a little inside baseball, you know, developer jargon. They said the enterprise policy name is Privacy Sandbox Fingerprinting Protection Enabled. Okay, so the section headlined Motivation says: Browser re-identification techniques have been extensively studied by the academic community, highlighting their associated privacy risks. We want to improve user privacy in Incognito mode (but not otherwise) by blocking such scripts from loading.
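To make the mechanism concrete, here's a minimal sketch of what such a re-identification script does conceptually: it gathers stable device characteristics and hashes them into a fingerprint that survives cookie deletion. Every trait value below is a mocked stand-in for what a real script would read from navigator, screen, or canvas pixels, and the hash function is an arbitrary illustrative choice.

```javascript
// FNV-1a: a tiny deterministic hash, fine for illustration only.
function fnv1a(str) {
  let h = 0x811c9dc5;
  for (let i = 0; i < str.length; i++) {
    h ^= str.charCodeAt(i);
    h = Math.imul(h, 0x01000193) >>> 0;
  }
  return h.toString(16);
}

function fingerprint(traits) {
  // Sort the keys so the same device always yields the same hash.
  const canonical = Object.keys(traits).sort()
    .map((k) => `${k}=${traits[k]}`).join(";");
  return fnv1a(canonical);
}

// Mocked traits; a real tracker would harvest dozens of these.
const traits = {
  userAgent: "Mozilla/5.0 ...",
  screen: "2560x1440x24",
  timezone: "America/Los_Angeles",
  canvasHash: "a3f9c2",   // stands in for subtle pixel-rendering differences
  batteryLevel: "0.87",
};
console.log(fingerprint(traits)); // same traits, same ID, cookies or not
```

The point is that none of these signals involve cookies at all, which is why Chrome's answer is to block the script outright rather than to clear state.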
Okay, so just to be clear, this is not at all what, for example, Safari (to your point, Leo) or the Brave browser is doing. Brave is deliberately fuzzing the results of various fingerprintable modern browser techniques to prevent any and all known and unknown first- and third-party fingerprint tracking against a user's wishes. What Chrome is doing is better than nothing, but it's a far cry from what Brave is doing. In the first place, Chrome is only doing anything for users who are in Incognito mode. And when in Incognito mode, based upon Google's description, Chrome will cross-reference the domain names of any third-party resource fetches against what they're calling their MDL, their Masked Domain List. And if a match is found, then they will proactively block the execution of any scripting by any resource returned from a fetch from any of those domains. So on the one hand, it's better than Brave in that all potentially troublesome scripting is completely blocked; scripting just doesn't work, rather than being allowed to run and be fuzzed. But on the other hand, it only applies while the user is viewing websites in Incognito mode, and it only blocks previously known, blacklisted, troublesome domains. So, you know, it's better than nothing. And it also makes sense that Chrome would do this, since Chrome's MDL is already being used to deliberately obscure the user's IP, which is an extremely cool and useful feature for which I don't think Google and the Chromium developers have received enough credit. I previously noted that despite any other measures we users might take, our IP addresses are likely still providing the strongest possible tracking signal of all, since they so very rarely change.
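The cross-referencing described above can be sketched as follows. The list entries and the suffix matching here are simplified assumptions for illustration, not Chromium's actual subresource-filter rule format.

```javascript
// A stand-in blocklist; the real MDL is a large, published list of
// known tracking domains maintained on GitHub.
const maskedDomainList = new Set([
  "tracker.example",
  "fingerprint-cdn.example",
]);

function isBlocked(resourceUrl, topLevelSite) {
  const host = new URL(resourceUrl).hostname;
  if (host === topLevelSite || host.endsWith("." + topLevelSite)) {
    return false; // first-party: script blocking only targets third parties
  }
  // Walk up the labels so sub.tracker.example matches tracker.example.
  const labels = host.split(".");
  for (let i = 0; i < labels.length - 1; i++) {
    if (maskedDomainList.has(labels.slice(i).join("."))) return true;
  }
  return false;
}

console.log(isBlocked("https://cdn.tracker.example/fp.js", "news.example"));  // true
console.log(isBlocked("https://static.news.example/app.js", "news.example")); // false
```

Note that a first-party script from the site you're actually visiting always passes; the blocking only ever applies in third-party contexts, exactly as Google's description says.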
Given that, it's reasonable to ask, what's the point of jumping through all those other hoops with anti-fingerprinting and cookie erasing and all, if all of our browser fetches to third-party trackers will be made from the same IP? The Chromium developers clearly understood this. The MDL, the Masked Domain List, is a list of domains for which the Internet IP address of someone using Incognito mode will be masked. In other words, Google actually takes it upon themselves to proxy any requests a user in Incognito mode might make to any third-party domain on the MDL. Meaning that Chrome doesn't request that domain directly; it requests it through Google, so that the domain sees Google making the request, not the user. That MDL is a public, GitHub-hosted list of domains that Chrome treats as higher risk for cross-site tracking. So when one of those domains loads in a third-party context in Incognito, Chrome provides extra identity protection by routing the request through privacy proxies, so that what untrusted third-party sites see are requests arriving from what Google calls a masked IP rather than the user's actual Internet address, which is extremely cool. As for the MDL, Google defines the inclusion criteria for participating in the list, and Disconnect.me evaluates and maintains the list for the Chromium project following the criteria that Google laid down; it's published and maintained publicly on GitHub. That naughty list contains domains that commonly run as a third party across multiple sites, basically trackers, which either participate in ads and marketing data flows (you know, serving, targeting, and measuring ads, or collecting user data) or which appear to collect device and user data that might be useful for cross-context re-identification. So, to the degree that they are, they're working hard to shut that down in third-party contexts.
And additionally, Chrome also independently detects widely used JavaScript fingerprinting patterns, which can also get a domain listed. The IP proxying has been in place for most of this year, but someone must have noticed that it would still be possible to run a powerful fingerprinting script through a proxy, which would only be obscuring the user's IP address. In other words, sure, the proxy is good for masking the IP, but if you're still allowing fingerprinting through the proxy, then you're still allowing some means of tracking. So what's being added in two weeks to Chrome 140 is that Incognito mode will be blocking third-party scripting in addition to the existing IP proxying. So props to Google and the Chromium team; these are useful, good additions. And then we're getting a couple of other things in two weeks from Chrome 140. Anyone who's ever been annoyed, as I have been, by the need to explicitly write JavaScript to encode text or binary data into URL-safe base64 ASCII text, and also to go the other direction, decoding base64 back into its original form, essentially by hand in JavaScript, will be happy to see this. Google writes: base64 is a common way to represent arbitrary binary data as ASCII. JavaScript has Uint8Arrays to work with binary data, but no built-in mechanism to encode that data as base64, nor to take base64 data and produce a corresponding Uint8Array. They said this is a proposal to fix that. It also adds methods for converting between hex strings and Uint8Arrays. So that's a handy new feature coming to JavaScript in two weeks in Chrome. And, you know, it's part of the W3C standard. The W3C just keeps throwing all this stuff out there, and the various browsers are moving forward at whatever pace they are to incorporate the standard as we go.
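For anyone who hasn't had to do it, here's the "by hand" dance being replaced. The helper names below are illustrative, not from any library; the new built-in methods noted in the comments are the ones the proposal adds.

```javascript
// The old way: round-tripping a Uint8Array through base64 by hand, using
// the string-based btoa()/atob() that browsers have always had.
function bytesToBase64(bytes) {
  let binary = "";
  for (const b of bytes) binary += String.fromCharCode(b);
  return btoa(binary);
}

function base64ToBytes(b64) {
  return Uint8Array.from(atob(b64), (c) => c.charCodeAt(0));
}

const data = new Uint8Array([72, 101, 108, 108, 111]); // the bytes of "Hello"
const encoded = bytesToBase64(data);
console.log(encoded); // "SGVsbG8="

// With the new built-ins arriving in Chrome 140, this collapses to
// data.toBase64() and Uint8Array.fromBase64(encoded), plus data.toHex()
// and Uint8Array.fromHex("48656c6c6f") for hex strings.
```

The hand-rolled version also has sharp edges the built-ins avoid, like btoa() throwing on characters outside Latin-1, which is exactly why a native mechanism was overdue.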
Which is why there are always tables of which browser versions support which features; everybody's always playing a game of catch-up, because the W3C never stops throwing new stuff out there. And here's a second biggie, regarding something we've just been talking about recently: a web browser directly accessing the network of its own hosting machine, right through localhost. Google writes: Chrome 140 restricts the ability to make requests to the user's local network, not just localhost but its local network, requiring a permission prompt. They wrote: A local network request is any request from a public website to a local IP address or loopback, or from a local website, such as an intranet, to loopback. Gating the ability for websites to perform these requests behind a permission mitigates the risk of cross-site request forgery attacks against local network devices such as routers. It also reduces the ability of sites to use these requests to fingerprint the user's local network. This permission is restricted to secure contexts. If granted, the permission also relaxes mixed-content blocking for local network requests, since many local devices cannot obtain publicly trusted TLS certificates for various reasons. So all of this is great. This means that IPs within the same network as the browser's host machine will require an affirmative granting of permission before Chrome 140 and later will fetch anything from that local IP. For example, I currently access my cable modem at 192.168.100.1, my pfSense firewall is at 192.168.0.1, and our ASUS router is at 192.168.1.1. So in two weeks, any attempt to access those devices through my browser, which is the way we access them, through browser UIs, should produce some sort of "are you sure?" permission request, telling me what my browser is trying to do and saying something like, this is on your own network, do you want to go there? You know, is this what you're intending?
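The classification behind that permission gate comes down to recognizing loopback and the RFC 1918 private ranges. Here's an illustrative IPv4-only check, which is an assumption about the general idea, not Chromium's actual implementation:

```javascript
// Is this IPv4 address loopback or in a private (RFC 1918) range?
// These are the targets Chrome 140 gates behind a permission prompt.
function isLocalNetworkAddress(ip) {
  const o = ip.split(".").map(Number);
  if (o.length !== 4 || o.some((n) => Number.isNaN(n) || n < 0 || n > 255)) {
    return false; // not a dotted-quad IPv4 address; IPv6 is ignored here
  }
  if (o[0] === 127) return true;                             // loopback
  if (o[0] === 10) return true;                              // 10.0.0.0/8
  if (o[0] === 172 && o[1] >= 16 && o[1] <= 31) return true; // 172.16.0.0/12
  if (o[0] === 192 && o[1] === 168) return true;             // 192.168.0.0/16
  return false;
}

// The cable modem, firewall, and router addresses above all trip the gate:
console.log(isLocalNetworkAddress("192.168.100.1")); // true
console.log(isLocalNetworkAddress("8.8.8.8"));       // false
```

A public website fetching from any address where this returns true is what Google's text calls a local network request, and that's what now requires the user's affirmative permission.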
So, given how infrequently we need to do that from our browser, this seems minimally intrusive and definitely worthwhile. One of the things that the testers of the DNS Benchmark, you know, the one I'm working on, have noticed, since the benchmark has always tested remote DNS resolvers to see whether they would block or resolve private IPs (none should), is that the once-common prevention of what's known as DNS rebinding attacks has apparently fallen by the wayside on the public Internet. A rebinding attack is something we actually talked about a few months ago, when a public domain name was returning the IP 127.0.0.1. That can be used as a type of black hole to kill traffic, but doing that is not safe, and that was a malicious domain we were talking about at the time; returning 0.0.0.0 is a much better solution for null-routing a domain name. If a public domain were to return, for example, 192.168.1.1, then asking a browser page to connect to a public-appearing domain name would cause it to connect to the network's local ASUS router, in my case, which is almost certainly not what you would expect or want some JavaScript running in your browser to be doing. The abuse of this is known, as I said, as a DNS rebinding attack. And there is no clear reason for resolvers of public DNS domains to return non-routable IPs which have been reserved for use within private networks. But unfortunately, nearly all of the resolvers out on the public Internet are now doing that, with just a very, very few exceptions. So I'm glad that Chrome is now taking proactive measures, and hopefully Firefox and the other browsers will follow. Because there was an attack we talked about a few years ago involving other protocols which routers were involving themselves in.
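The rebinding protection that resolvers used to apply can be sketched as a simple answer filter: drop any DNS answer that maps a public domain name onto a private or loopback address. The range checks below are simplified assumptions for illustration; note that 0.0.0.0 deliberately passes through, since it's the safe null-route value.

```javascript
// Is this IPv4 answer a loopback or RFC 1918 private address?
function isPrivateIPv4(ip) {
  const o = ip.split(".").map(Number);
  return o[0] === 127 ||
         o[0] === 10 ||
         (o[0] === 172 && o[1] >= 16 && o[1] <= 31) ||
         (o[0] === 192 && o[1] === 168);
}

// What a rebinding-protecting resolver used to do: strip private answers
// before handing the response to the client.
function filterRebinding(answers) {
  return answers.filter((ip) => !isPrivateIPv4(ip));
}

// A malicious domain answering with the victim's router address is filtered;
// the 0.0.0.0 null route and ordinary public addresses pass through.
console.log(filterRebinding(["192.168.1.1", "0.0.0.0", "93.184.216.34"]).join(","));
// → 0.0.0.0,93.184.216.34
```

With nearly all public resolvers having abandoned this filtering, Chrome's new in-browser permission gate becomes the backstop against the same attack.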
Essentially, routers were proxying some other protocols, and if you could determine the address of the router, that is, the user's gateway on their local network, then you would be able to use other ports on that gateway and create some security vulnerabilities, which, you know, put all this on the map. And people were saying, okay, browsers should not be poking around behind their users' backs on their own local networks. And look how long it's taken for anything to happen to begin to fix that. The Markup's headline was "We caught companies making it harder to delete your personal data online." And this is not surprising, but the number of companies is somewhat surprising. Now, I suppose we shouldn't be surprised, but I thought it was interesting. The article's tease said: Dozens of companies are hiding how you can delete your personal data, The Markup and CalMatters found. After our reporters reached out for comment, multiple companies stopped the practice. So this is why it's good to have people poking at things, looking at things, reporting on things, and basically embarrassing companies into changing their practices. Unfortunately, unless we have that, companies will do it until they're found out. The Markup wrote, explaining what they found: Data brokers are required by California law to provide ways for consumers to request that their data be deleted. But good luck finding them. Yep, and wait till you hear the number of them, Leo. They wrote: More than 30 of the companies which collect and sell consumers' personal information hid their deletion instructions from Google, according to a review by The Markup and CalMatters of hundreds of broker websites. This creates one more obstacle for consumers who want to delete their data.
Many of the pages containing the instructions listed in an official state registry used code to tell search engines to remove the page entirely from search results. Not something that can happen by mistake. Popular tools like Google and Bing respect the code by excluding those pages when responding to users. Okay, so upon reading that, I was tempted to suggest that users ask Perplexity. But anyway. Get this: data brokers nationwide must register in California under the state's Consumer Privacy Act, which allows Californians, like you and me, Leo, to request that their information be removed.