Items tagged with Intel Xe

Since the early 2000s, the DirectX API has been the primary go-between for game engines and the hardware that drives them on the Windows platform. With a new DirectX update announced earlier today, Microsoft has seemingly outed feature-level support for the RDNA2-based line of GPUs from AMD, Xe-HPG discrete GPUs from Intel, and future-generation Snapdragon mobile platform GPU engines from Qualcomm. Feature-level support is a bit of a confusing thing, so let's break it down before we dive in too deep here. With DirectX, there are standard feature sets and feature levels, which help streamline coding for games and give developers an idea of what a particular... Read more...
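For a sense of how feature levels surface to developers in practice, here is a minimal sketch of a Direct3D 12 application asking the driver which feature level a GPU supports; it uses the standard d3d12.h entry points, though the exact set of levels available depends on the SDK and driver in use:

```cpp
// Minimal sketch: query the highest Direct3D 12 feature level the default GPU supports.
// Assumes a recent Windows SDK (d3d12.h) and linking against d3d12.lib.
#include <windows.h>
#include <d3d12.h>
#include <wrl/client.h>
#include <cstdio>

using Microsoft::WRL::ComPtr;

int main()
{
    // Create a device on the default adapter at the lowest level we are willing to accept.
    ComPtr<ID3D12Device> device;
    if (FAILED(D3D12CreateDevice(nullptr, D3D_FEATURE_LEVEL_11_0,
                                 IID_PPV_ARGS(&device))))
    {
        std::printf("No D3D12-capable adapter found.\n");
        return 1;
    }

    // Ask the driver which of these candidate feature levels it actually supports.
    const D3D_FEATURE_LEVEL levels[] = {
        D3D_FEATURE_LEVEL_11_0, D3D_FEATURE_LEVEL_11_1,
        D3D_FEATURE_LEVEL_12_0, D3D_FEATURE_LEVEL_12_1,
    };
    D3D12_FEATURE_DATA_FEATURE_LEVELS query = {};
    query.NumFeatureLevels        = _countof(levels);
    query.pFeatureLevelsRequested = levels;

    if (SUCCEEDED(device->CheckFeatureSupport(D3D12_FEATURE_FEATURE_LEVELS,
                                              &query, sizeof(query))))
    {
        std::printf("Max supported feature level: 0x%x\n",
                    query.MaxSupportedFeatureLevel);
    }
    return 0;
}
```

A game engine typically runs a check like this once at startup and then gates optional rendering paths (ray tracing, mesh shaders, and so on) on the result, which is why feature-level support for new GPU families matters before the hardware even ships.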
We first heard rumblings about Intel's Ponte Vecchio last week, and now Intel has officially announced its first Xe-based GPU aimed squarely at the high performance computing (HPC) and artificial intelligence (AI) markets. As we had all previously suspected, Ponte Vecchio is based on Intel's 7nm Xe architecture, which will underpin products across a wide range of computing sectors, from discrete cards for desktops and integrated notebook solutions to the HPC market and the data center. Intel says that Ponte Vecchio, the first discrete GPU instantiation of Xe, will be positioned to tackle HPC modeling and simulation workloads along with AI training, with high double-precision Floating... Read more...
The hardware may still be in development, but Intel just landed a major design win nonetheless, one that will reportedly culminate in the world's first exascale supercomputer powered partially by Intel's upcoming Xe graphics technology. It was just announced that Intel and the U.S. Department of Energy (DOE) will deliver the first supercomputer capable of one exaFLOP of compute performance, or a "quintillion" floating point operations per second. The system, named "Aurora", is being developed at the DOE's Argonne National Laboratory near Chicago. It is being designed for high-performance computing and artificial intelligence applications and will be used... Read more...