For an unfathomable price. Which also means we're getting independent performance reviews of the discrete Arc A380 Alchemist GPU. And it doesn't look good.
Honestly, I'm still trying to figure out exactly why Intel took this approach. But whatever the reasoning behind its release as the vanguard of Intel's push into discrete graphics cards, the Arc A380 looks like a GPU that can't even compete with Nvidia's last-gen GTX 1650.
Unless you count synthetic 3DMark tests, that is, where it surprisingly outperforms Nvidia's old entry-level card, as well as AMD's RDNA 2-based RX 6400 and RX 6500 XT. That's in both the Time Spy and ray tracing Port Royal benchmarks.
That would be an impressive result were it not for the fact that the actual game benchmarks shown by Shenmedoungce on Bilibili (via Videocardz) put the Intel GPU behind all three of those rival cards in every test it ran. They're not obscure games either, with League of Legends, GTA V, PUBG, Shadow of the Tomb Raider, Forza Horizon 5, and Red Dead Redemption 2 all given a run out on the Arc GPU.
I mean, it can run them all, which is grand. But when your new entry-level graphics card can't compete with an entry-level GeForce GPU released over three years ago, well, we've got a bit of an issue.
The best result is in the Vulkan-based run of RDR2, where the A380 is only a bit behind the RX 6400 and GTX 1650, hitting 59 fps at 1080p where the others manage 67 and 70 fps respectively. Still, a tough pill to swallow.
The testing machine makes sense, too. It's an Intel Core i5 12400-based system, on a B660 motherboard, with 16GB DDR4-3200 RAM, and Windows 11 Pro. The comparative cards are from MSI for the Nvidia RTX 3050 and GTX 1650, and from Yeston for the RX 6400 and RX 6500 XT.
Maybe the issue is in drivers, which might explain why Intel has been so behind with the launch of its first discrete cards. And also why it's chosen to let the low-end A380 limp onto the market in China alone, mostly so it can still claim to have launched Arc in the first half of the year.
I'm really hoping it is drivers, because the disparity between the 3DMark and gaming numbers is so stark as to be baffling. It's as if Intel designed the GPU purely to perform in 3DMark and is now struggling to match that performance in the actual game engines developers use.
I guess this is why Intel is so resolutely holding back its higher-end versions of the Alchemist architecture, because if it can claw back the in-game performance to more closely match the levels on offer in the synthetic 3DMark tests, then it's got a far better chance of success. If Intel had followed the release cadence of other GPU manufacturers, and launched its flagship, best-performing cards first, they would have been slaughtered.
As it is, any positivity towards Intel's GPU venture is rapidly evaporating and we're all waiting to see how bad it gets when the top-end A700-series cards do finally get launched. The slow drip-feed of Arc releases, with the laptop GPUs still tough to get hold of, and the desktop ones region-locked and limited, is making this whole new graphics card venture from Intel hard to watch.
Hopefully Intel can do something about the gaming performance between now and that far-off release date, because the industry could really do with a third player in the GPU market.