Transcript
A (0:04)
Welcome to the Techmeme Ride Home for Tuesday, January 6, 2026. I'm Brian McCullough. Today, Nvidia launches its next chip platform, and AMD says, hey, us too. Also, did you know Nvidia is building self-driving car platforms? Did you know Dell is bringing the XPS brand back? And did you know that the most interesting product launch I've seen so far has come from Lego? Here's what you missed today in the world of tech.

You may have noticed that your customers love webinar and video content. But if you've ever put together a webinar or a video, then you know that it can eat up a lot of your time and budget. Now, thankfully, there's a singular tool that can streamline your team's video and webinar workflows: Wistia. Wistia can scale your content output with AI-powered tools that help you create, edit, and repurpose videos and webinars quickly. And speaking of webinars, you can host engaging, easy-to-set-up webinars in Wistia too, complete with built-in analytics. With Wistia, you don't have to pay for multiple video tools, hop between platforms, or constantly re-upload files. Create, edit, collaborate, and publish all in one place. Head to wistia.com/brew to learn more. That's W-I-S-T-I-A dot com slash brew. With Wistia, you can expect less work and more plays.

Well, Nvidia comms did not rate me as being cool enough to attend the keynote in person, I guess. But yesterday here in Vegas, Jensen Huang and Nvidia launched the Vera Rubin platform, saying it will offer dramatic reductions in inference and training costs compared to Blackwell across six new chips. Good to see somebody is still treating CES as the platform for launching big things. Quoting the Verge: Nvidia is kicking off 2026 with the early launch of its new Vera Rubin computing platform, following a record-breaking year for the Rubin GPU's predecessor, Blackwell, fueled by the AI boom, or bubble.
During a press briefing ahead of today's keynote, Dion Harris, Nvidia's senior director of HPC and AI infrastructure solutions, described Vera Rubin as six chips that make one AI supercomputer. Those six chips include the Vera CPU, Rubin GPU, NVLink 6th-gen switch, ConnectX-9 NIC, BlueField-4 DPU, and Spectrum-X 102.4T CPO. The platform will support third-generation confidential computing and, according to Nvidia, will be the first rack-scale trusted computing platform. Nvidia claims the Rubin GPU is capable of delivering five times as much AI training compute as Blackwell. The Vera Rubin architecture as a whole can train a large mixture-of-experts AI model in the same amount of time as Blackwell while using a quarter of the GPUs and at one-seventh the token cost. The Rubin launch was originally expected for late this year. Its arrival today comes just a couple of months after Nvidia reported record-high data center revenue, up 66% over the prior year. That growth was driven by demand for Blackwell and Blackwell Ultra GPUs, which have set a high bar for Rubin's success and served as a bellwether for the AI bubble. Products and services running on Rubin will be available from Nvidia's partners starting in the second half of 2026. Jensen also said yesterday that the Vera Rubin chips are in full production already and that Rubin can train some LLMs with roughly one-fourth the chips Blackwell needs.

Oh yeah, says AMD. They yesterday teased their next-generation CDNA 6-based MI500 AI chips, built on a 2-nanometer node and claiming 1,000x performance gains over predecessors. So yeah, take that, Nvidia. Quoting Reuters: Advanced Micro Devices CEO Lisa Su showed off a number of the company's AI chips on Monday at the CES trade show in Las Vegas, including its advanced MI455 AI processors, which are components in the data center server racks that the company is selling to firms like ChatGPT maker OpenAI.
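Quick aside from me: here's a back-of-the-envelope sketch of what those Vera Rubin training claims imply. The quarter-the-GPUs and one-seventh-token-cost ratios are Nvidia's framing; the cluster size and dollar figure below are hypothetical numbers I'm plugging in just to make the math concrete.

```python
# Nvidia's claim: train the same mixture-of-experts model in the same
# wall-clock time using a quarter of the GPUs, at 1/7 the token cost.
# The cluster size and dollar cost here are hypothetical illustrations.
blackwell_gpus = 1024                  # hypothetical baseline cluster
rubin_gpus = blackwell_gpus // 4       # same training time, a quarter of the chips

# Same job, same time, a quarter of the GPUs: each Rubin GPU is doing
# roughly 4x the effective work of a Blackwell GPU on this workload.
per_gpu_speedup = blackwell_gpus / rubin_gpus

# The token-cost claim, applied to a hypothetical baseline price.
blackwell_cost_per_m_tokens = 7.00     # hypothetical dollars per million tokens
rubin_cost_per_m_tokens = blackwell_cost_per_m_tokens / 7

print(per_gpu_speedup)          # 4.0
print(rubin_cost_per_m_tokens)  # 1.0
```

Anyway, back to the Reuters piece on AMD.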
Su also unveiled the MI440X, a version of the MI400 series chip designed for on-premise use at businesses. The so-called enterprise version is designed to fit into infrastructure that is not specifically designed for AI clusters. The MI440X is a version of an earlier chip that the US plans to use in a supercomputer. AMD is one of Nvidia's strongest rivals, but has struggled to have as much success. In October, AMD signed a deal with OpenAI that, in addition to the financial upside, was a major vote of confidence in AMD's AI chips and software. But it is unlikely to dent Nvidia's dominance as the market leader continues to sell every AI chip it can make, analysts said. At the Monday event, OpenAI President Greg Brockman joined Su onstage and said chip advancements were critical to OpenAI's vast computing needs. Looking to the future needs of companies like OpenAI, Su previewed the MI500 and said it offered 1,000 times the performance of an older version of the processor. The company said the chips would launch in 2027, end quote.

But back to Nvidia for another second. They have their hand in basically everything now, right? Including robotics and such. But did you know that they've built a bunch of autonomous driving platforms too? The Verge says Tesla should be worried. Quote: The vehicle is using Nvidia's new point-to-point Level 2, or L2, driver-assist system that is getting ready to roll out to more automakers in 2026. This is the chipmaker's big bet on driving automation, one it thinks can help grow its tiny automotive business into something more substantial and more profitable. Think of it as Nvidia's answer to Tesla's Full Self-Driving. For roughly 40 minutes, we navigate a typically chaotic day in San Francisco, passing delivery trucks, cyclists, pedestrians, and even the occasional Waymo robotaxi.
The Mercedes, under guidance from Nvidia's AI-powered system as well as its own built-in cameras and radar, handles itself confidently: traffic signals, four-way stops, double-parked cars, and even the occasional unprotected left. At one point it makes a wide right turn to avoid a truck that's blocking an intersection, but not before allowing a few slowly moving pedestrians to cross in front. Tesla fans would likely scoff at Nvidia's demonstration, arguing that Full Self-Driving is orders of magnitude more capable. Nvidia hasn't been working on this problem as long as Elon Musk's company, but what they showed me absolutely would go toe to toe with FSD under the most complex circumstances. And thanks to the redundancy provided by Mercedes' radar, some could argue it's safer and more robust than the camera-only FSD.

The invitation to test out Nvidia's new system came as a bit of a surprise. After all, the company isn't exactly known as a self-driving leader, and while Nvidia has long supplied major automakers with chips and software for driver-assist systems, its automotive business is still relatively tiny compared to the billions it rakes in on AI. Its third quarter revenues were $51.2 billion, but its automotive division only made $592 million, or 1.2% of the total haul. That could change soon as Nvidia seeks to challenge Tesla and Waymo in the race to Level 4 autonomy, cars that can fully drive themselves under specific conditions. Nvidia has invested billions of dollars over more than a decade to build a full-stack solution, says Xin Zhao Wu, the head of the company's automotive division. This includes system-on-a-chip hardware along with operating systems, software, and silicon. And Wu says that Nvidia is keeping safety at the forefront, claiming to be one of the few companies that meets high automotive safety requirements at both the silicon and the software levels.
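Host aside: that revenue split checks out if you run the numbers, which I did, because I'm that kind of nerd. The figures are the ones from the quote, expressed in billions of dollars.

```python
# Nvidia Q3 figures as quoted: $51.2B total revenue, $592M automotive.
total_revenue = 51.2          # billions of dollars
automotive_revenue = 0.592    # billions of dollars ($592 million)

automotive_share_pct = automotive_revenue / total_revenue * 100
print(round(automotive_share_pct, 1))  # 1.2 -- matches "1.2% of the total haul"
```

OK, back to the Verge piece and Nvidia's safety claims.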
That includes the company's Drive AGX system on a chip, similar to Tesla's Full Self-Driving chip or Intel's Mobileye. The SoC runs on the safety-certified Drive OS operating system, built on the Blackwell GPU architecture, and is capable of delivering 1,000 trillion operations per second of high-performance compute, the company says. Jensen always says the mission for me and my team is to really make everything that moves autonomous, Wu says. Wu outlines a roadmap in which Nvidia will release Level 2 highway and urban driving capabilities, including automated lane changes and stop sign and traffic signal recognition, in the first half of 2026. This includes an L2 system in which the vehicle will be able to navigate point to point autonomously under driver supervision. In the second half of the year, urban capabilities will expand to include autonomous parking, and by the end of the year, Nvidia's L2 system will encompass the entirety of the United States, Wu said. For L2 and L3 vehicles, Nvidia plans on using its Drive AGX Orin-based SoC. For fully autonomous L4 vehicles, the company will transition to the new Thor generation software. Redundancy becomes critical at this level, so the architecture will use two electronic control units, a main ECU and a separate redundant ECU. A small-scale Level 4 trial similar to Waymo's robotaxis is planned for 2026, followed by partner-based robotaxi deployments in 2027, Wu says. And by 2028, Nvidia predicts its self-driving tech will be in personally owned autonomous vehicles. Also in 2028, Nvidia plans on supplying systems that can enable Level 3 highway driving, in which drivers can take their hands off the wheel and eyes off the road under certain conditions. Safety experts are highly skeptical about L3 systems, by the way. What makes this particularly notable is how quickly progress has been made. Tesla took roughly eight years to enable urban driving with FSD, whereas Nvidia is expected to do the same within about a year.
No other passenger car system besides Tesla's has achieved this, Wu boasts. We're coming fast, he says, as the Mercedes slows itself down at another intersection. I'd say we're very close to FSD, end quote. Maybe Jensen isn't just blowing smoke when he says he envisions AI going into everything, and thus Nvidia chips going into everything as well. Maybe into every last car, at least.

It's the holidays, which means you're probably trying to figure out what to get the people in your life who live in back-to-back meetings. This isn't some sci-fi concept. It's Plaud, P-L-A-U-D. It snaps onto the back of your phone and records phone calls, meetings, and conversations. This isn't just note taking, though. It can summarize meetings, generate to-do lists, draft emails, extract insights, analyze perspectives, and help you make better decisions, all with full contextual awareness across your past conversations and meetings. Black Friday is coming, and Plaud is giving Techmeme Ride Home listeners 20% off. Search P-L-A-U-D on Google or Amazon and get 20% off.

I always try to report on Dell and Lenovo news for you when pertinent, because I figure a lot of you use those brands as your daily driver computers. So, news that Dell has revived the XPS brand with new XPS 14- and 16-inch laptops offering new designs, Intel Core Ultra Series 3 chips, and tandem OLED screens, but no dedicated GPUs. Quoting Gizmodo: At last year's CES, Dell made the eyebrow-raising decision to axe all its legacy laptop brand names and instead opt for Apple-like conventions. Instead of XPS, we were forced to comprehend the differences between a Dell, a Dell Pro, a Dell Premium, and a Dell Pro Max. Now Dell is admitting it made a mistake. Whether or not you're happy with your Dell Pro Premium Plus Max Crunchwrap Supreme, the XPS line is back and now includes 14- and 16-inch models sporting up to an Intel Core Ultra X7 or X9 chip for more GPU capabilities.
The revised XPS also includes the brand name stenciled on the laptop lid. Otherwise, the notebook is getting back to basics. The 14-inch model starts at 3 pounds in weight and is barely more than half an inch thick, plus it offers a 2.8K OLED screen as an option. The new versions of the XPS laptops no longer have the glowing touch function row keys of recent editions that proved terrible for the sake of accessibility. Dell has maintained the seamless trackpad and squared keyboard keys that help make the XPS laptops visually unique. If you're not a big fan of keyboards with no space between keys, you probably won't enjoy the XPS's big reunion tour. Whether or not this is Dell playing 5D chess and creating demand for XPS by taking it away, the next Dell XPS 14 starts at $2,050, while the 16-inch model will demand $2,200, starting January 6th. If you think that's high, that's because Dell revised its price points from $1,650 and $1,850, respectively. Just as Dell hinted to Gizmodo back in December, it's taking targeted price adjustments in response to the ongoing chip shortage. A new XPS 13 will come later this year, end quote.

Let's close today out by hitting some actual CES stuff. Chinese robot vacuum maker Roborock unveiled the Saros Rover, a concept robot vacuum with two wheel-legs that allow it to climb steps, as robotics is dominating CES. More on that in a second. Quoting Bloomberg: It's the first robot vacuum cleaner with two wheel-legs, according to the company, which is formally known as Beijing Roborock Technology. Those legs can be raised and lowered independently of each other, the firm said in a statement Tuesday, allowing it to climb steps and navigate other uneven surfaces while making sudden stops and small turns along the way. Roborock was a surprise hit at last year's CES when it unveiled a different robotic vacuum, the Saros Z70, which had a mechanical arm that could pick up stray socks.
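One more quick aside: here's the size of those Dell XPS price hikes, run through Python. The prices are the ones from the Gizmodo quote; the percentage math is mine.

```python
# Dell XPS starting prices as quoted: old vs. revised, in dollars.
old_prices = {"XPS 14": 1650, "XPS 16": 1850}
new_prices = {"XPS 14": 2050, "XPS 16": 2200}

for model, old in old_prices.items():
    pct_increase = (new_prices[model] - old) / old * 100
    print(model, round(pct_increase, 1))
# XPS 14 24.2
# XPS 16 18.9
```

So roughly a 19 to 24 percent hike, depending on the model. Back to the Bloomberg piece on Roborock.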
While the company dazzled onlookers at the show with a tightly choreographed demo, the device was met with a lukewarm reaction from tech reviewers when it went on sale a few months later for $2,599 in the U.S. Aside from the high price, a common complaint was that the Z70 could only recognize a small handful of items, such as tissues, shoes, paper, and slippers, but not, say, kids' or pet toys. Following that ill-fated launch, Roborock is taking a different approach with the two-legged Rover, which does not have a confirmed launch date, according to the company. The Rover navigates using a combination of artificial intelligence, several motion sensors, and 3D spatial information. In a demonstration for media ahead of Tuesday's announcement, it successfully climbed several steps, rolled down a ramp, and pulled off a small jump, a maneuver it might use to go downstairs or bypass certain obstacles, a company spokesperson told Bloomberg. It was not immediately clear from the carefully staged presentation what happens if the Rover falls and whether it can reorient itself without the help of a human. In the event of an accident, the robot will try to get up by itself, the spokesperson said.

Now, one keynote I did attend yesterday was Lego's. Yes, Lego. Because listen to this: Lego has unveiled what it calls the most significant change to its iconic building system in nearly 50 years. It's called the Smart Brick, a fully functional computer that fits inside a traditional 2x4 Lego brick. This new component is at the heart of LEGO's upcoming Smart Play platform, which aims to blend the classic tactile experience of LEGO with dynamic interactive responses, all without screens. The Smart Brick uses built-in sensors, a speaker, and wireless technology to bring physical creations to life. Essentially, it responds to special NFC-enabled smart tags embedded in tiles and minifigures, detects motion and gestures, and can communicate with other Smart Bricks via a Bluetooth mesh.
This lets sets create effects like lightsaber hums, roaring engines, blaster sounds, or even music when characters are positioned in particular ways. The bricks are wirelessly rechargeable and designed to hold their charge even after long periods of inactivity, a big shift from past LEGO tech that my kids have had, which relied on replaceable batteries that you basically have to replace all the time. Unlike previous electronic LEGO products such as LEGO Mario (see my previous complaints about reliability there), the Smart Brick doesn't use a camera or AI, and its onboard microphone isn't for recording. It's just another sensor that reacts to simple sounds like claps or blowing. The first Smart Brick sets will launch in March, starting with three LEGO Star Wars models that incorporate these interactive elements. They're slightly smaller than traditional minifigure-scale Star Wars ships, in part due to the added cost of the technology. LEGO says Smart Play is just the beginning of a broader vision to make brick play even more imaginative, social, and responsive, potentially extending into future themes beyond Star Wars. More on this from the Verge, quote: In a 30-minute interview, Julia Goldin, the LEGO Group's top executive in charge of product and marketing, hints again and again that Smart Play, in which bricks have light, sound, and the ability to detect movement, isn't an experiment but rather a tremendous opportunity for the future. And that it isn't just for kids. Goldin presided over the LEGO Group's massively successful expansion into sets for adults, including $50 flower bouquets and $ Millennium Falcons. And she says she has no doubt in my mind these bricks will eventually make their way into adult sets as well. And when they come to adult sets, she hints that the Smart Play initiative might bring a different meaning to them. She says the company sees this as an optional part of the future, something that LEGO fans can engage with or not.
The smart bricks and tiles will be a multipurpose tool in a LEGO builder's tool chest. Today, a bunch of Lego hot dog pieces can be turned into flowers when you attach some petals. Tomorrow, a medical scanner smart tile can turn into an alarm when you attach it to your Lego city's police jail and a minifigure tries to escape. Goldin says there may be all sorts of intriguing combinations in the future as Lego produces more interactive tiles and minifigures, too. She asks me to imagine what happens when smart LEGO Star Wars minifigures meet smart LEGO Marvel minifigures, a crossover event. Perhaps LEGO partner Disney controls both universes, so it seems quite plausible, end quote.

All right, I was really running late today. Too many things to do. I am finally heading out to the convention floor, so check my X account, BrianMcC, because I'll just post a whole bunch of pictures of all the cool things I see today, especially the robots. I'm going to find the robots first. I would cross-post to Bluesky as well, but that's probably too much work. So, BrianMcC on Twitter. Talk to you tomorrow.
