News
How Motorsports Teams Use Big Data To Drive Innovation On The Racetrack
Discover how the best motorsports teams in the world use the vast volumes of data they generate to achieve an edge over the competition.
Motorsports — some may not view them as real sports, but nowhere else can you see man and machine working together in perfect harmony, pushing to the absolute limit of performance. While the best racing drivers in the world are battling it out on track, there’s another race going on behind the scenes: a battle of minds, with some of the brightest engineers in the world working to extract every ounce of performance from their machinery. Motorsports are as much a competition for the engineers and crew as they are for the drivers themselves.
At their very core, motorsports are all about finding an advantage over your competitors, however large or small, because every little bit counts. And the best way to gain a competitive edge over your rivals is to use data — tons and tons of it.
Using Data To Unlock On-Track Performance
Racing teams generate and analyze huge volumes of data per race; we’re talking tens of terabytes measuring every single aspect — even the most minute — of not only the vehicle’s performance but also the driver’s.
There are many different categories and classes of motorsports, ranging from road cars to purpose-built racing cars like those in Formula One, or bikes in the case of MotoGP. These two series have the most popular championships in the world, but for simplicity’s sake, we’re going to stick with Formula One (F1), often described as the very pinnacle of motorsport.
Teams collect data for three main reasons: to measure the vehicle’s performance on track, to measure the driver’s performance, and to help the engineers identify and understand key areas of improvement on the car.
F1 cars have thousands of sensors monitoring parameters such as tire and brake temperatures, engine performance, component wear, and so on in real time (known as telemetry data). These teams can also use the data gathered, along with feedback they receive from the drivers, to make minor real-time adjustments to the car during the race, such as engine power settings. This telemetry, along with the weather information the teams gather, can also enable them to devise effective race strategies to decide exactly when to pit and change tires and what compound of tires to switch to, especially when weather conditions are unpredictable.
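At its core, the telemetry described above is a stream of timestamped readings from named sensor channels. As a purely illustrative sketch (the channel names and data structure here are assumptions, not any team's real schema), a stream of samples might be reduced to the latest reading per channel like this:

```python
from dataclasses import dataclass

@dataclass
class TelemetrySample:
    """One timestamped reading from a single car sensor."""
    timestamp_ms: int  # milliseconds since session start
    channel: str       # hypothetical channel name, e.g. "tyre_temp_FL"
    value: float

def latest_by_channel(samples):
    """Reduce a stream of samples to the most recent value per channel."""
    latest = {}
    for s in sorted(samples, key=lambda s: s.timestamp_ms):
        latest[s.channel] = s.value  # later timestamps overwrite earlier ones
    return latest

samples = [
    TelemetrySample(100, "tyre_temp_FL", 92.5),
    TelemetrySample(150, "brake_temp_R", 410.0),
    TelemetrySample(200, "tyre_temp_FL", 94.1),
]
snapshot = latest_by_channel(samples)
print(snapshot)  # the pit wall would see the freshest value per channel
```

Real systems stream millions of such samples per lap into time-series stores; this toy version only shows the shape of the problem.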
If this wasn’t impressive enough, the race engineers can also view the driver’s exact inputs: when they’re braking, accelerating, and turning into a corner, alongside a host of other information like heart rate and other biometric data. The engineers can then give them feedback on what is working and what isn’t, enabling the driver to adjust their approach to extract even more performance out of themselves and the car. It’s safe to say that in modern F1, even the cars are data-driven.
Data-Driven Development In The Factory
The terabytes of data gathered by racing teams on the track are then analyzed after each race to determine which areas of the car need improvement. Since F1 greatly restricts on-track testing, teams are forced to rely on incredibly complex simulations to develop the car, and the more accurate the data, the more accurate these simulations become.
This data is also used by the team to develop the F1 car simulators used by the drivers. These simulator rigs are far more accurate, complex, and, unsurprisingly, expensive than consumer simulator rigs. Simulator testing plays a major role not only in helping the engineers understand the characteristics of the car without performing on-track testing, but also in helping them set up the car for a race. Each track is different, and the car setup varies depending on the track and the weather conditions during the race weekend.
Data Is King
In motorsports, every little advantage can make a difference. And with F1’s recently introduced budget cap, teams can no longer throw huge amounts of money at fixing any issues with their cars, meaning data is now the most valuable currency in F1.
Big data analytics will only continue to play an increasingly prominent role in motorsports, as it has since the early 1980s. The most competitive teams are those that know how to effectively use the vast amounts of data at their disposal to drive innovation on the racetrack.
NVIDIA Puts GPT-5.5 Codex In Hands Of 10,000 Staff
The chipmaker has significantly expanded OpenAI’s latest model across teams from engineering to HR under tight internal controls.
NVIDIA has started rolling out OpenAI’s GPT-5.5 model through the Codex coding agent to more than 10,000 employees, extending the tool well beyond software teams and into core business functions.
The deployment covers engineering, product, legal, marketing, finance, sales, HR, operations and developer programs. Staff are using Codex for coding, internal research and routine knowledge work as companies test whether AI agents can move from demos to daily use.
GPT-5.5 is running on NVIDIA’s GB200 NVL72 rack-scale systems, linking OpenAI’s newest model directly to the chipmaker’s latest infrastructure push. NVIDIA said the systems cut the cost per million tokens 35-fold and raise token output per second per megawatt 50-fold versus earlier generations.

Inside the company, NVIDIA says, the effects are immediate. Debugging work that once took days is being finished in hours, and experiments across large codebases that used to stretch over weeks are now handled overnight. Teams are also building features from natural-language prompts with fewer failed runs.
In a company-wide note urging staff to adopt the tool, CEO Jensen Huang wrote: “Let’s jump to lightspeed. Welcome to the age of AI.”
Security remains central to the rollout. Codex can connect through Secure Shell to approved cloud virtual machines, allowing agents to work with company data without moving it outside approved environments. NVIDIA said it assigned cloud VMs to employees so agents run in isolated sandboxes with full audit trails.
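NVIDIA hasn't published its implementation, but the pattern described, agents running in isolated environments with a full audit trail, can be sketched minimally: every command an agent executes is first recorded in an append-only log. Everything in this sketch (the wrapper name, the log structure) is hypothetical, not NVIDIA's actual tooling.

```python
import shlex
import subprocess
from datetime import datetime, timezone

# Hypothetical append-only audit trail; in practice this would live in
# tamper-evident storage outside the agent's sandbox.
AUDIT_LOG = []

def run_audited(command: str) -> str:
    """Record a command in the audit trail, then run it in the sandbox."""
    AUDIT_LOG.append({
        "time": datetime.now(timezone.utc).isoformat(),
        "command": command,
    })
    result = subprocess.run(
        shlex.split(command),
        capture_output=True,
        text=True,
    )
    return result.stdout

out = run_audited("echo hello-sandbox")
```

The key design point is that logging happens before execution, so even a command that crashes the sandbox still leaves a trace for reviewers.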
The company added that the setup uses a zero-data-retention policy. Access to production systems is read-only through command-line tools and internal automation layers.
The move also highlights NVIDIA’s long relationship with OpenAI. NVIDIA said the partnership began in 2016, when Huang personally delivered the first DGX-1 AI supercomputer to OpenAI’s San Francisco office.
The two companies have since worked across hardware and model deployment. NVIDIA also said OpenAI plans to deploy more than 10 gigawatts of NVIDIA systems for future AI infrastructure.
For Gulf markets pouring money into sovereign AI and enterprise automation, the signal is clear: internal AI agents are moving from pilot phase to standard tooling.
