Code 43 error with video card.

Did you buy from Amazon directly, or from a third-party seller on Amazon?

That's Amazon Europe for you - fake or broken electronics. I can confirm it happened to me twice: the first time, shame on them; the second, shame on me. I don't buy electronics from Amazon any more; there are numerous other online shops that have never tried to scam me.

That aside, he doesn't live in Europe.
 
Joined
Oct 20, 2006
Messages
7,758
Location
usa - no longer boston
I've been building my own systems since around '94. Since then I've had 1 DVD player and 2 or 3 hard disks die (not catastrophically, but with a rapid increase in dead sectors).
-
That's it, but I noticed during the bad-capacitor years that a couple of motherboards were close to failing, and one PSU might have died - not sure.
-
No motherboard actually failed; no GPU; no CPU. I tend to replace my gaming system once every 5 years and my servers once every 8 to 10 years - but I did a quick replacement when dual-core came out, as it had some significant benefits.


You've been really lucky! My 6800 and 9800 both blew up on me. The worst part is that they were both replaced with inferior cards, which just wasted more money in an attempt to save it.

The 6800 got replaced with an 8600 GTS, which was cheap and absolutely rubbish - so much worse. Then the 9800 got replaced with a 560 Ti for a while, but I swapped that for a 680 pretty quickly when I upgraded to a quad-core CPU. The 680 was fantastic.

I remember thinking that if only I'd gone for the 570 I could have kept the card going another year, but the 560 just didn't have enough RAM. Always go for the 70 or 80 variant and don't be tempted by the 60!
 
Joined
Oct 20, 2006
Messages
7,758
Location
usa - no longer boston
You've been really lucky! My 6800 and 9800 both blew up on me. The worst part is that they were both replaced with inferior cards, which just wasted more money in an attempt to save it.

I'm also lucky that my card still had a few months left on its warranty. I'm wondering if they'll replace it with the same card, though, since I doubt they still make that model. It's a factory-overclocked GTX 980 Ti.
 
Joined
Oct 21, 2006
Messages
39,322
Location
Florida, US
Out of curiosity, I decided to see what games I could run using just the integrated graphics on my i7 4790K.

First I tried Vampire: TM-B as that's one of the older 3D games I have installed right now. It's completely smooth @1920x1200 on highest settings. I honestly wasn't expecting that.

Then I tried Knights of the Old Republic. Completely smooth except when heavy smoke effects are onscreen.

Tried The Witcher next. It was perfectly playable from a framerate standpoint, but textures and shadows constantly flickered.

Then I tried Mars: War Logs and King's Bounty: The Dark Side. Both ran fine on highest settings. Granted, they're not the most graphically intensive games.

Finally, I played Grim Dawn for a while. I had to turn off a few of the more intensive options, like anti-aliasing and ambient occlusion, to make it smooth, but it ran fine.

I gotta say I was pretty surprised. I wasn't expecting to get playable framerates in most of those games. I was tempted to try some newer titles, like ME:A, just to see what would happen, but I know games like that obviously won't be playable.
 
Joined
Oct 21, 2006
Messages
39,322
Location
Florida, US
Try The Last of Us on a PS3 emulator, JDR. You may blow up your CPU, or it may lead you to a wonderful experience. :)
 
Joined
Jun 5, 2015
Messages
3,898
Location
Croatia
I think it's a shame they haven't done anything yet with the potential to use the power of mismatched GPUs. There's useful horsepower sitting untapped in a lot of decent integrated GPUs.
 
Joined
Nov 8, 2014
Messages
12,085
Out of curiosity, I decided to see what games I could run using just the integrated graphics on my i7 4790K.

I was tempted to try some newer titles, like ME:A, just to see what would happen, but I know games like that obviously won't be playable.
ME1 should run with no hiccups; it runs smoothly on an i5's integrated GPU.
ME2 and onward - sorry, but no can do.
I think it's a shame they haven't done anything yet with the potential to use the power of mismatched GPUs. There's useful horsepower sitting untapped in a lot of decent integrated GPUs.
But that untapped power is the reason for Vulkan and DX12. Sadly, not many developers support either of them. Yet.
 
Joined
Apr 12, 2009
Messages
23,459
Yes, unfortunately, although Vulkan and DX12 make it possible, they don't implement it for you. It's a challenging task for developers, and we'll probably need to see the big engines ship a flexible implementation before it catches on. Sadly, efficient PC performance is often not a priority for the big studios that could afford to do it.
 
Joined
Nov 8, 2014
Messages
12,085
But they do: people who do video transcoding have software that uses the Intel GPU...
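For instance, Intel's iGPU video engine (Quick Sync) can take over the encode while the discrete card does other work. A minimal sketch - assuming an ffmpeg build with Quick Sync support, and with placeholder file names:

```shell
# Decode in software, then encode H.264 on the Intel iGPU via Quick Sync.
# Requires an ffmpeg build with QSV enabled; input.mp4/output.mp4 are placeholders.
ffmpeg -i input.mp4 -c:v h264_qsv -b:v 5M -c:a copy output.mp4
```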


I think it's a shame they haven't done anything yet with the potential to use the power of mismatched GPUs. There's useful horsepower sitting untapped in a lot of decent integrated GPUs.
 
Joined
Oct 20, 2006
Messages
7,758
Location
usa - no longer boston
Yeah, I was thinking about gaming. For compute tasks it's much easier - you essentially chop the job up however you like and hand each chunk to any of your cores or GPUs.
 
Joined
Nov 8, 2014
Messages
12,085
They could use the discrete GPU for graphics and the integrated GPU for 'AI'. For that, Intel GPUs probably need better OpenCL support (CUDA is Nvidia-only, so it isn't an option).
 
Joined
Jun 5, 2009
Messages
1,502
They could, but should they? AMD was clear with the Ryzen series - it's not an APU or a SoC; a PC, according to them, needs a standalone GPU, and all-in-one chips are meant for the rubbish gadgetry that pushes market-hurting exclusivity (phones and consoles).
Remember, the idea behind DX12 and Vulkan was never to delegate part of the work to a low-efficiency built-in GPU. Both were supposed to replace SLI and CrossFire, which don't belong on PC.
One might ask why and how. It's an anti-exclusivity measure: you cannot pair cards from two different manufacturers in one PC without DX12 or Vulkan. Sony/Nintendo behavior on PC simply cannot be tolerated. The reason most of the PC audience never accepted SLI or CrossFire is that they refused to be held hostage. What, you thought otherwise? You thought prices and such were the reason? No. Don't be Sony and don't be Nintendo. They never understood what PC stands for.
 
Joined
Apr 12, 2009
Messages
23,459
It doesn't make much sense to me either; AI was the only example I could think of that doesn't require much interaction with graphics. However, it's probably more cost-effective to just make the discrete GPU slightly more powerful.

AI is the thing that gets me most excited these days, but that's another discussion. Unfortunately, nobody is doing anything worthwhile so far...
 
Joined
Jun 5, 2009
Messages
1,502
Perhaps the CPU price difference will make more sense to you: GPU inside = expensive, no GPU inside = cheap.

AI? There is no such thing. All current "AI" solutions are animal-instinct based, and that isn't intelligence. Good for Tamagotchi 2.0 - but also for videogames, I have to admit.
No current "AI" can lie, and it most definitely cannot filter what to say and how to behave so as not to risk insulting, or physically hurting, its correspondent. I doubt any of us here will be alive when real AI appears.

What excites me is the Star Trek universal translator. I'm pretty sure we'll see one in about a decade. Google/Babelfish silly machine solutions that can't even translate "murder of crows" correctly need to die already.
 
Joined
Apr 12, 2009
Messages
23,459
This is not generally true - it's hogwash on a timeline. Btw, you can buy a $50 (or cheaper) processor with a built-in GPU many times faster than what was available 10 years ago. Yeah, 10 years is a long time, but the point is that eventually the GPU in the CPU will be plenty fast for typical gamers, and a lot cheaper than a dedicated solution, for many reasons. Today, for the non-gaming masses, a CPU with a built-in GPU is a lot cheaper than a CPU without one plus a separate GPU.
--
Hmm. As for AI - in most games it's pretty crappy, but there are some solid AI solutions out there. It takes real experts to develop, though, and very few game programmers are 'experts' beyond learning narrow APIs. Actually, that statement is a bit broad - a lot of game AI is poorly implemented, but there are probably a few very good implementers scattered among the thousands of bad ones. Don't confuse game design with code design.

----
You might (but probably won't) find this an interesting read:

https://gizmodo.com/amds-newest-processors-are-so-good-you-can-skip-the-gra-1822920100



 
Joined
Oct 20, 2006
Messages
7,758
Location
usa - no longer boston
Today, for the non-gaming masses, a CPU with a built-in GPU is a lot cheaper than a CPU without one plus a separate GPU.
You mean Snapdragon? It's so weak that there's little chance it'll ever escape from phones and tablets into PCs.
Why should we care about the Candy Crush Saga masses here?
 
Joined
Apr 12, 2009
Messages
23,459
I'm a person of many flaws; being confused is not one of them.
About my knowledge, and my lack of it, I cannot be objective - not because it's impossible, but because it depends on who I'm talking to: while a Bieber fan is easy prey for me, a music composer can destroy me any day.

The topic is not my flaws, however. It's hardware flaws - flaws you believe are top-notch technology. I bet that, given the opportunity, you'd be building a supercomputer from a crate of PlayStation 3s, as seen in Person of Interest:

It can be done, I'm not saying it can't. But with the other technology available out there... fail.
 