Intel's 'continual compute' tech could turn normal laptops into gaming powerhouses

Intel continual compute comparison. Image: Intel

Intel has begun showing off a “continual compute” demonstration that essentially takes the concept of an eGPU — an external graphics card connected to a laptop via Thunderbolt — and abstracts it further over your network to a remote PC.

Intel presented the concept at the Realtime Conference on Monday, where, like virtually every company within the past few weeks, it latched on to the concept of “the metaverse” as an opportunity to sell more products. Raja Koduri, senior vice president and general manager of the Accelerated Computing Systems and Graphics Group, presented continual compute as a solution to the problems posed by the metaverse: capturing real-world objects in real time, translating that environment into the virtual space, and vice versa.


Essentially, though, continual compute has a more immediate impact on PC gaming. That’s how Intel chose to show the technology off in a blog post on Tuesday. Using the game Hitman 3, Intel showed off how a thin-and-light laptop could connect to a local gaming PC and use its GPU to offload some of the graphics workload. (Intel hasn’t posted the embedded video to its YouTube channel, so you’ll have to watch it on its site.)

The video shows the game running on the laptop and taxing the CPU and GPU to their limits. After remotely connecting the external gaming PC, the same game ran at higher frame rates and visual quality. However, Intel didn’t enable a framerate counter, and it’s not quite clear what visual settings were used during the demo. The video did appear to run more smoothly and at a better quality.

Clarification: The video voiceover does note that the “local experience” is being run at low resolution and low quality settings, though without stating the actual resolution. The enhanced experience is presented at “high resolution and full HD settings,” without further clarification.


Intel’s secret sauce here isn’t hardware, but software. It’s an abstraction layer that can detect what Intel calls “ambient computing,” or the presence of a more powerful PC that can be applied to the task at hand. “This is precisely what Intel’s infrastructure layer delivers,” the video’s narrator says. “It sends us additional compute resources available within the network and intelligently allocates it to me, delivering the best user experience possible.”

“As the game launches at runtime, the infrastructure layer determines that a better experience is possible using ambient computing resources from my gaming rig,” the narrator continues. The improvements are part of what Intel calls “system resource abstraction,” where the game’s file system is being abstracted and delivered over the network.

To be fair, we’ve seen some of this before. In 2015, Microsoft enabled game streaming from Xbox Ones to PCs running Windows 10, even over Wi-Fi. In that case, the console was handling the bulk of the work. Intel’s blog post implies that its technology could be a bit more collaborative. Services like Steam Remote Play and Moonlight have offered similar experiences, though. Cloud gaming on Windows, of course, simply abstracts these shared resources into the cloud.

Still, it’s good to have top chip vendors supporting an (optimistic) future where we simply have underutilized GPUs lying about and waiting to be taken advantage of for gaming. We already have network-attached storage, could we eventually see network-attached GPUs too?

This story was updated at 4:45 PM with additional clarification.

As PCWorld’s senior editor, Mark focuses on Microsoft news and chip technology, among other beats. He has formerly written for PCMag, BYTE, Slashdot, eWEEK, and ReadWrite.

