How Intel Core i9 Processors Enable Megatasking

HIGHLIGHTS

Raw Compute Power of New Intel® Core™ i9 Processor-based Systems Enables Extreme Megatasking

The earliest computers were often pushed to the limit performing even a single task, between hammering the hard drive, swapping memory frantically, and crunching through computations. With Microsoft Windows* 3.1 and then Windows 95, multi-tasking began to take shape, as systems were finally able to handle more than one program at a time. Now, with the advent of double-digit core counts in a single CPU, the concept of "megatasking" is gaining traction. The latest entries for the enthusiast are in the Intel® Core™ X-series processor family, ranging from 4 to 18 cores. These Intel® Core™ i9 processors can simultaneously handle tasks that previously required multiple complete systems: enter extreme megatasking.

Consider the challenge of simultaneously playing, recording, and streaming a Virtual Reality (VR) game. Game studios rely on video trailers to spark interest in new VR titles, but showing off the experience of a 3D game in a 2D video has always been a challenge, as a simple recording of what the player sees tells only part of the story. One way to solve this – mixed reality – captures the player against a green screen, and then blends the perspectives into a third-person view of the player immersed in that world. (For more information about this technique, refer to this article.) This often requires one PC to play and capture the game, and another PC to acquire the camera feed of the gamer. Add the idea of streaming that complete session live to a global audience of expectant fans, and you could be looking at a third system for encoding the output into a high-quality uploadable format. But an Intel team recently demonstrated that production crews can now complete all of these CPU-intensive tasks on a single Intel® Core™ i9 processor-based system, with each engaged core chugging merrily along.

Moore's Law and System Specs

When originally expressed by Intel co-founder Gordon Moore in 1965, "Moore's Law" predicted that the number of transistors packed into an integrated circuit would repeatedly double approximately every two years (Figure 1). While transistor counts and frequencies have increased, raw compute power is now often measured in the number of cores available. Each core acts as a CPU and can be put to work on a different task, enabling better multi-tasking. But simple multi-tasking becomes extreme megatasking with simultaneous, compute-intensive, multi-threaded workloads aligned in purpose.

Figure 1. Moore's Law expresses the accelerating rate of change for technology (source: time.com)

The calculation originally used to measure supercomputer performance now applies to desktop gaming PCs: FLOPS, or FLoating point Operations Per Second. FLOPS counts arithmetic operations on numbers with decimal points, which are more demanding than operations on integers. The equation is:

FLOPS = (sockets) x (cores per socket) x (cycles per second) x (FLOPS per cycle)

Picture a single-socket CPU with six cores, running at 3.46 GHz, executing either 8 FLOPS per cycle (double precision) or 16 FLOPS per cycle (single precision). The result would be roughly 166 gigaflops (double precision) or 332 gigaflops (single precision). By comparison, in 1976, the Cray-1 supercomputer performed just 160 megaflops. The new Intel® Core™ i9-7980XE Extreme Edition Processor runs at about 4.3 GHz (faster if overclocked), which, across its 18 cores at 16 FLOPS per cycle, works out to roughly 1.3 teraflops. For perspective, the world's fastest supercomputer runs 10.65 million cores, performing at 124.5 petaflops. In 1961, a single gigaflops of computing power cost approximately USD 19 billion in hardware (around USD 145 billion today). By 2017, that cost had fallen to less than a dollar.
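As a quick sanity check, the formula above can be expressed in a few lines of Python. The per-cycle FLOPS values and the 18-core clock figure are taken from the text above and are illustrative assumptions, not official specifications:

```python
def peak_gflops(sockets, cores_per_socket, ghz, flops_per_cycle):
    """Peak throughput in gigaflops: sockets x cores x clock (GHz) x FLOPS per cycle."""
    return sockets * cores_per_socket * ghz * flops_per_cycle

# Six-core example from the text: one socket, 3.46 GHz
print(peak_gflops(1, 6, 3.46, 8))    # ~166 GFLOPS at 8 FLOPS/cycle (double precision)
print(peak_gflops(1, 6, 3.46, 16))   # ~332 GFLOPS at 16 FLOPS/cycle (single precision)

# Rough estimate for an 18-core Core i9-7980XE at an assumed ~4.3 GHz clock
print(peak_gflops(1, 18, 4.3, 16) / 1000)  # ~1.2-1.3 teraflops
```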

To achieve that raw compute power, the Intel® Core™ i9-7980XE Extreme Edition Processor uses several technology upgrades. With up to 68 PCIe* 3.0 lanes on the platform, gamers have the ability to expand their systems with fast Intel® Solid State Drives (Intel® SSDs), up to four discrete GFX cards, and ultrafast Thunderbolt™ 3 technology solutions. Updated Intel® Turbo Boost Max Technology 3.0 improves core performance. Intel® Smart Cache has a new power-saving feature that dynamically flushes memory based on demand. The Intel Core X-series processor family is also unlocked to provide additional headroom for overclockers. New features include the ability to overclock each core individually, Intel® Advanced Vector Extensions 512 (Intel® AVX-512) ratio controls for more stability, and VccU voltage control for extreme scenarios. Combined with tools like Intel® Extreme Tuning Utility (Intel® XTU) and Intel® Extreme Memory Profile (Intel® XMP), you have a powerful kit for maximizing performance.

Intel reports that content creators can expect up to 20 percent better performance for VR content creation, and up to 30 percent faster 4K video editing, over the previous generation of Intel® processors (see Figure 2). This means less time waiting, and more time designing new worlds and experiences. Gamers and enthusiasts will experience up to 30 percent faster extreme megatasking for gaming, over the previous generation.

Gregory Bryant, senior vice president and general manager of the Client Computing Group at Intel Corporation, told the 2017 Computex Taipei crowd that the new line of processors will unleash creative possibilities throughout the ecosystem. "Content creators can have fast image-rendering, video encoding, audio production, and real-time preview, all running in parallel seamlessly, so they spend less time waiting, and more time creating. Gamers can play their favorite game while they also stream, record and encode their gameplay, and share on social media, all while surrounded by multiple screens for a 12K experience with up to four discrete graphics cards."

Figure 2. Intel® Core™ X-series processor family partial specifications.

Another way to measure system performance is through CPU utilization, which you can view on your own Microsoft Windows PC through Task Manager > Resource Monitor. Josh Bancroft, an Intel Developer Relations Content Specialist working with the gaming and VR communities, was part of the Intel® Core™ Extreme Processors rollout at Computex Taipei in early 2017, and helped coin the term "extreme megatasking" while showing off CPU utilization. Bancroft used one of the new Intel Core i9 X-series processor-based PCs to run a green-screen VR mixed-reality demo: simultaneously playing a VR title at 90 fps, recording the gameplay, compositing the player into the scene from a separate camera feed, syncing the images precisely, and streaming the result live to Twitch*.
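For a rough, scriptable equivalent of what Resource Monitor displays, per-core utilization can be sampled with the third-party psutil library. This is a minimal sketch for illustration only, not part of Bancroft's demo setup:

```python
import psutil  # third-party: pip install psutil

def sample_utilization(seconds=5):
    """Print per-logical-core CPU utilization once per second."""
    for _ in range(seconds):
        per_core = psutil.cpu_percent(interval=1, percpu=True)
        busy = sum(1 for pct in per_core if pct > 90)
        print(f"{busy}/{len(per_core)} logical cores above 90% | " +
              " ".join(f"{pct:5.1f}%" for pct in per_core))

if __name__ == "__main__":
    sample_utilization()
```

Under an extreme megatasking load like the demo described above, most or all of those per-core figures would sit near the top of the scale.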

Later, Bancroft was part of the first Intel® Core™ i9 Extreme Processor rollout at E3 in Los Angeles, where he showed the same demo on a system with 18 cores. He still recalls that event fondly: "It was really exciting to do the world's first public demo on an 18-core i9-based system. The case was gigantic, with two water loops with this blue, opaque fluid, and really cool-looking."

The demo, hosted by Gregory Bryant, went off smoothly, but not without tension. "When you stack those 4 or 5 extreme tasks together, you can overload a system and bring it to its knees," Bancroft explained. But the 18 cores performed flawlessly, with the CPU utilization graphs showing what was going on under the hood. "When we turned on the recording, when we turned on the streaming, when we did everything that cranked it up, you saw those 36 graphs jump up to 90-plus percent utilization. You could see all of those threads were working really hard."

The demo illustrated Intel's commitment to VR, PC gaming, and multi-core processing power in one neat package. Because VR requires enormous resources to run smoothly, it's an ideal environment in which to demo new systems. Bancroft's mixed-reality technique allows developers, streamers, and content creators to make trailers and show people a VR experience without actually having to put them in a headset. Best of all, one new system can replace the multiple machines previously required to pull it off.

Trailers are one of the most important tools in an indie developer's marketing toolkit, and creating a compelling, enticing trailer is vital for indies launching their own VR titles. However, the 3D experience of VR doesn't translate well to a 2D trailer, which is where the mixed-reality technique comes in. Mixed-reality VR was pioneered by Vancouver, BC-based Northway Games*, run by husband-and-wife team Sarah and Colin Northway, who added enabling code to their Unity-based game Fantastic Contraption* (Figure 3). Recording both what the gamer sees as they play and how they look in a third-person view greatly helps market VR titles by communicating the experience. In addition, the Northways showed how entertaining their game was by including shots of onlookers watching and laughing from a sofa.

Figure 3. Creating and streaming a mixed-reality trailer, like this one for Fantastic Contraption*, is now possible on a single PC.

Not Invented Here, Just Enhanced

Bancroft is quick to share the credit for his mixed-reality, single-machine demos, which he learned in a cramped studio, complete with scaffolding, lighting, a green screen, and multiple cameras. The Northways wrote a blog post that offered a step-by-step walkthrough of the tasks involved, and Bancroft relied on it heavily to get started. From there, he and his team came up with some additional tweaks, all developed and shared openly.

Many of the software programs involved require immense power; just playing a VR title for Oculus Rift* or HTC VIVE* at 90 fps is quite a task. At a lower frame rate, players can experience dizziness, nausea, and other physical reactions, so a machine has to start with the power to play a game properly before taking on any more load.
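To put that 90 fps requirement in perspective, each frame has a budget of roughly 11 milliseconds. The arithmetic below is purely illustrative:

```python
# Frame-time budgets at common VR refresh rates (illustrative arithmetic only).
for fps in (60, 90, 120):
    budget_ms = 1000.0 / fps
    print(f"{fps:3d} fps -> {budget_ms:5.1f} ms per frame")

# At 90 fps the render loop has about 11.1 ms per frame; any recording,
# compositing, or encoding work sharing the CPU has to fit around that
# budget to avoid dropped frames and the discomfort they cause.
```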

For mixing and compositing, Bancroft is fond of MixCast*, a growing VR broadcast and presentation tool that simplifies the process of creating mixed-reality videos. Created by Blueprint Studios*, a Vancouver, BC-based leader in the interactive technology space, the tool enables dragging and dropping the MixCast VR SDK into Unity projects, so end users can showcase their experience in real time.

In addition, Bancroft uses Open Broadcaster Software (OBS), a free and open source program known to most streamers for compositing, recording, and live streaming. It offers high-performance, real-time audio and video capture and mixing; video filters for image masking, color correction, and chroma keying; and support for streaming platforms such as Twitch*, Facebook*, and YouTube*.
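For readers curious about what the chroma-key step in that stack actually does, the sketch below composites a green-screen camera frame over a game frame using OpenCV and NumPy. It is a simplified illustration of the concept, not how OBS or MixCast implement it; the file names and threshold values are placeholder assumptions, and both images are assumed to be the same size:

```python
import cv2
import numpy as np

# Simplified green-screen composite: keep player pixels, replace green pixels
# with the rendered game frame. File names are placeholders for illustration.
player = cv2.imread("camera_frame.png")   # green-screen camera frame
game   = cv2.imread("game_frame.png")     # rendered game frame, same dimensions

# Build a mask of "green enough" pixels in HSV space; these thresholds are
# typical starting values and would be tuned for the studio's lighting.
hsv = cv2.cvtColor(player, cv2.COLOR_BGR2HSV)
green_mask = cv2.inRange(hsv, (35, 60, 60), (85, 255, 255))

# Where the mask is green, take the game pixel; elsewhere, keep the player.
composite = np.where(green_mask[:, :, None] > 0, game, player)
cv2.imwrite("mixed_reality_frame.png", composite)
```

In a live pipeline this keying and blending happens per frame, in real time, alongside the game itself, the camera capture, and the stream encoder, which is exactly the kind of stacked, compute-intensive workload the article calls extreme megatasking.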

Of course, there are multiple tools to create the same end result, but that's the current software stack. A full description of Bancroft's efforts can be found at <link to Mega-tasking step-by-step article>.

Jerry Makare is the Intel® Software TV video producer, and works closely with Josh Bancroft to create videos that test the raw-compute boundaries of extreme megatasking. He sees important benefits to using a single, powerful system for VR. "Being able to split our tasks into multiple places, especially rendering, is a big deal," he said. "Once you start rendering, generally you end up killing your machine. There's almost nothing else you can do. The ability for us to split these large, compute-intensive tasks like rendering and compositing into multiple buckets is a major time-saver."

Makare is particularly eager to task an Intel® Core™ i9 processor-based system with building out a very large-scale room in a 3D modeling program, to get a baseline for how much time it saves. He also looks forward to putting the new system to work on some real-world applications that his team can learn from.

Eye to the Future

With so much raw computing power now available, it's exciting to think of the different ways these new systems could be used. Gamers can anticipate more vivid, immersive, and realistic experiences. Creating and editing video from raw 4K footage used to be a complex, processing-intensive chore, but now professionals and novices alike can edit in native 4K, create stunning visual effects, and compose music with more depth and nuance. The reach of VR extends beyond gaming into virtual walkthroughs, construction planning, city modeling, and countless simulation scenarios. Scientists in fields such as biology, geology, chemistry, medicine, and astronomy may unlock even more secrets, thanks to the raw computing power behind extreme megatasking.

Join the Intel® Game Dev program for free tools, resources, and opportunities to help you bring the best game experience to the biggest worldwide audience.

Source: https://software.intel.com/en-us/articles/raw-compute-power-of-new-intel-core-i9-processor-based-systems-enables-extreme-megatasking

This is a sponsored promotional post, written by Digit's custom content team.
