i rarely do this "blogger BS" but i decided to post this thing regardless.
(for the uninitiated - the following is PC-nerd garbage - feel free to ignore it if you are not interested)
well, tldr is: "Intel Arc not that bad" (that is what i can say about it after ~3 days of tinkering with it - i might change my mind later on).
that said: i didn't run "extensive benchmarks" or anything - i don't have anything meaningful to compare it against anyway - "sample size of one" etc. - all i can say is: "it's noticeably faster than my gtx 970" - which isn't saying much, to be honest.
also: currently i'm running it bottlenecked: no resizable BAR, and PCIe 3.0 only - but it's all part of my "ship of Theseus" strategy - as i plan to move from Intel Haswell (Intel 4th gen) to AMD AM4 (of some sort - probably "b450 + 1st gen Ryzen" to keep options open for an even later upgrade) next year.
for some context why:
i'm not really a "hard-core gamer": my monitor is currently 1440p@60Hz, and for that: the arc a770 will do.
Also: i was limited by budget thrice:
for money (rtx 3060/rx 6650xt/ arc a770 cards were on the table),
for power (550W PSU),
for size (2 slot card only).
I didn't want to buy used, even though i usually do buy used PC components - but this time, due to the mining craze of the past ~5 years or whatnot... tldr: mining can kill VRAM and it's not easily repairable - i can neither solder BGA components myself, nor do i have the equipment for it. And even though mining wasn't really that profitable in Poland... i decided that used is a "no go".
Also: i approached it with a "more-VRAM-the-better" idea in mind - and while for gaming that is kind of pointless, for "creative" stuff it can make a difference - and it is the cheapest new GPU with 16GB of VRAM (at least where i live).
I've wanted to give Blender a shot for a while - so that might be something to push me to actually do something with it - but if i'm honest: i don't work in Blender.
but really: i also wanted 16GB for "IT shite" - Intel released openVINO 2.0 which translates CUDA to their crap [well: 'kind of' - 'as far as i know' - 'i'll need to look into it' - 'don't quote me on that'] - so i'll want to try it out - i did some stuff with PyTorch, so this is a more likely use case for me than '3d art in Blender'.
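since i mentioned PyTorch: here's a minimal sketch of how i'd check whether the Arc even shows up as a compute device. this assumes a PyTorch build with Intel's XPU backend available (e.g. via the intel_extension_for_pytorch package - 'xpu' is the device-type name Intel uses; don't take this as gospel, it's just the shape of the check):

```python
# sketch: pick an Intel GPU ("xpu") if the backend is present, else fall back.
# assumption: torch.xpu exists only on builds with Intel GPU support.
try:
    import torch

    if hasattr(torch, "xpu") and torch.xpu.is_available():
        device_name = "xpu"   # Arc card visible to PyTorch
    else:
        device_name = "cpu"   # no Intel GPU backend - plain CPU fallback
except ImportError:
    device_name = "torch not installed"

print(device_name)
```

the nice part is that the rest of a training script can just do `tensor.to(device_name)` and not care which branch was taken.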
and finally: i'm a tinkerer - i don't mind diagnosing PCs, digging into logs to check why it black-screened, etc. - so even if it turns out to be a disaster: "oh well..."
ok so what were the worst things that happened? some DirectX 9 games refused to launch outright on Windows, asking for "more modern drivers" ("Dishonored" and "Fallout: New Vegas" for example)
BUT on Linux they ran fine with Lutris and DXVK - lolitly lol, memety meme - good job Microsoft and Intel! - open source neckbeards beat you, at your own game, for free, and without full access to closed source crap you withhold from everyone...
to be fair - those problems went away when i updated drivers - but the *drivers which support DirectX 9* came out a week ago: so like ~2 months after release - still a bit of a cock-up...
my read of PC-media before purchase was:
most went crazy with clickbait headlines about "[hardware] scheduler issues, which are unfixable via software" - which (as someone who works in IT) is flat-out irresponsible journalism if you ask me - it very well CAN be an unfixable hardware issue - but it CAN be hundreds of other things! between drivers, Windows and DirectX: i wouldn't even hazard a guess!
more level-headed reviewers (whom i trust a bit more) said: "as of now: the Arc a770 is wildly unpredictable - usually you'll get around rtx 3060 performance - but on occasion it can go as high as rtx 3070, or as low as rtx 3050 (more towards the latter than the former). And do expect some teething issues."
also: people acting outraged: "HOW INTEL COULD LAUNCH IT IN THIS STATE?!" - but i'm like: did everyone forget how, early on, Amazon's MMO was blowing up rtx 3090?
or when Threadripper from AMD came out: it was faster to run "Windows, in a VM, on Linux, and lie to Windows about the topology of the system" than to run Windows bare metal - because the Windows scheduler saw the CCX groups as separate CPUs, and it would do weird things (tldr: on multi-CPU PCs there is a penalty when CPU1 reads/writes RAM attached to CPU0, and the operating system will try to allocate things accordingly to minimize that - but on AMD Threadrippers/Epycs those groups are on the same package, so it doesn't really matter [that much] - so results were bad, and it was better to pretend that all of those chiplets were on the same die).
issues with new GPUs (or "cutting edge stuff" in general) are kind of a thing - and here it's not only a 1st gen product, it's also from a newcomer... that is a recipe for trouble.
so yeah: i'm not endorsing Intel Arc too much - if you're after gaming, it's probably wiser to go with AMD or Nvidia - Intel is a bit of a gamble right now
(and sales are *probably* coming soon - so the value might be even worse than it is now - and right now it's "ok-ish", but AMD will still beat Intel on "fps/dollar" - at least where i live).
lastly: completely irrelevant: i think it looks nice (bar the obligatory RGB vomit) - i'm not a fan of the garish, edgy gamer aesthetic. (even if it's apparently a hassle to disassemble - but i've worked on many laptops, so i'll manage) also: it sits in a 'Fractal Design Node 304' - a computer case without a window - so it doesn't matter anyway...
so: i had my reasons - Intel Arc might not fit your needs - actually: it probably won't - but it seemed ok for my use case.
but i won't lie: it's my 1st "brand new GPU"... ever... really - so i'm excited enough to share it with anyone at large.
previously i had a 2nd hand 'gtx 660 Ti' and, until recently, also the 2nd hand, aforementioned 'gtx 970' - which btw: sells now for more-or-less the same price i bought it for ~5 years ago, right before the bitcoin craze began - granted, it is the "mini-ITX version by Asus", and those hold their value a bit better - but that is still insane...
Ukrainian war:
still a thing 🙁
good luck - and best wishes to Ukrainians!
The Arcs are perfectly acceptable for gaming as long as you know what you're getting into. From everything I've seen, all the issues are on DX11 or earlier: Intel is translating the older APIs to DX12 instructions. Otherwise? Really good value.
Also the 970 is a beast. My friend ran his into the ground until it needed an actual replacement. I gave mine to his girlfriend. My brother had one and only replaced it because it wasn't good for 4k. That and the 1660 Super are probably the best bang-for-the-buck cards you could have bought at launch.
Edit: Another fun thing: The RGB on the ARC is actually incredibly over-designed. If you take it apart, they made huge custom PCBs *just* for them: https://www.youtube.com/watch?v=N37.....A&t=23m20s
*a bit of a late response - but i was sick*
well: i had a feeling that might be the case (to be honest, i'm rolling the dice a bit on "getting improvements with newer drivers") - but i've noticed a lot of sensationalism about it "being dead on arrival" - that is the main reason why i even decided to share [me buying it] in the first place - 'cause that seems exaggerated.
yeah - i was really happy with my 970 - for a long time 4GB of VRAM was perfectly fine - because of consoles (more like 3.5GB - there is a story behind that... - no matter). only now have games started to demand more. And i recently upgraded my monitor from 21"/1080p to 27"/1440p, and after that upgrade the gtx 970 started to show its age
but i'm a human being who unironically owns a "q6600 + gtx 660 Ti" PC in 2022 (as an HTPC) - and the only major issue with it is the lack of 1Gbps LAN - as the q6600 platform doesn't have any "high-speed I/O", so: no gigabit LAN or USB 3.0 - but otherwise: it's actually still a usable netflix/youtube machine. (it can't do much else though - and any re-sale value would be either "not worth the trouble" or "scamming ill-informed grandparents" - so i'll probably just keep it)
Yeah - i've seen this exact teardown - and as i've said: "it's a hassle to disassemble" - but the asrock version was only available in 8GB, and for reasons explained before i wanted the 16GB one - so: "meh? - what can you do?"
in my experience, please don't get 1st gen Ryzen - I've got a Threadripper 1950X right now and it kinda sucks: the IPC is through the floor (although it does have 16 cores), and the memory latency is over the top (70ns latency on DDR4-3600 CL16?) - if you want to get Ryzen, please at least start with Zen+ or Zen 2