Wow...what a bummer

So, I decided to install Divinity: Original Sin today to see what all the fuss was about. I was excited to play it in full 1080p at ultra settings and all that good stuff.

When I get to the video settings, it says my display is Intel HD Graphics 4000...Okay, I figure I need to set the game to utilize my 7970m GPU in the AMD control panel. No big deal, I've had to do this before.

Only that didn't work. The game still doesn't recognize my 7970m. I tried restarting a bunch of times, switching around video settings in D:OS, nothing worked. I'm stuck with the integrated chip on my laptop and I'm VERY disappointed by this.

I'm not going to play the game if I can't play it on Ultra and utilize my GPU. Does anyone have any ideas what could be causing this? I'm using Windows 7 64-bit. Latest drivers from AMD, etc. etc.
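For anyone hitting the same wall, a first sanity check is whether Windows itself even lists both adapters. Here's a minimal sketch in Python; the sample names match the hardware described in this thread, and the "anything non-Intel is discrete" heuristic is my assumption, not something from the thread. On Windows, the real names could be pulled from `wmic path win32_VideoController get name`.

```python
# Sketch: pick the discrete GPU out of the adapter names Windows reports.
# On a Windows machine the names could come from:
#   wmic path win32_VideoController get name
# The sample names below match the laptop described in this thread;
# the selection heuristic (treat anything non-Intel as discrete) is
# an assumption for illustration.

def discrete_gpus(adapter_names):
    """Return adapter names that are not Intel integrated graphics."""
    return [name for name in adapter_names
            if "intel" not in name.lower()]

adapters = ["Intel HD Graphics 4000", "AMD Radeon HD 7970M"]
print(discrete_gpus(adapters))  # expected: ['AMD Radeon HD 7970M']
```

If the discrete card doesn't show up in the list at all, the problem is at the driver level rather than in the game's settings.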
 
Why is this integrated card activated at all?
 
Why is this integrated card activated at all?

I keep it activated so I have a choice of when to use the good GPU and when to save it. I dunno, I had it deactivated a while back, but when I updated my BIOS I think it got reactivated, and I've just stuck with it. I heard that deactivating it won't help this issue, though.

To the BIOS with you!

What do I have to do? I read some responses but I don't recall reading anything about the BIOS...
 
No, my laptop is set to high performance mode and is a desktop replacement. I never use the battery.
 
Hmm, does the card work okay with other games? I have a similar setup in my desktop (Intel HD Graphics 4000/AMD Radeon HD 7650A card) and I had to tinker with some settings to have games recognize the card, but that was two years ago so I don't remember off hand. I'll check to see if I can remember what I did and post any findings later.
 
Hmm, does the card work okay with other games? I have a similar setup in my desktop (Intel HD Graphics 4000/AMD Radeon HD 7650A card) and I had to tinker with some settings to have games recognize the card, but that was two years ago so I don't remember off hand. I'll check to see if I can remember what I did and post any findings later.

Yeah, my card works fine with other games.

Usually, if a game defaults to my integrated Intel card, I just have to force "high performance" mode in the AMD Control Center. That always fixes the problem, except this time.

So I'm not sure what to do. For now, it's Wasteland 2. Hoping that works flawlessly on ultra settings :D
 
I just checked my Divinity: Original Sin setup and it displays the Intel HD Graphics 4000 as active instead of the AMD one. Is your problem that the game gives you a warning about the video card not supporting high settings? I get that, but I just ignore it and put the settings where I want; the AMD card does its thing in the background. I have been playing D:OS on Ultra since it came out in July without any problems.
 
My previous laptop, an HP Envy, had a dual-GPU solution: an AMD something-or-other plus the integrated Intel HD Graphics (I don't remember the exact version). Keeping the Intel chip active was useful when using the laptop on battery for purposes other than gaming; it got a few more hours out of it.

I can't say I had problems with games selecting the Intel GPU, but I did have problems updating the graphics drivers: the AMD drivers refused to install because of the dual configuration, so I had to download drivers from HP, which weren't updated that often (read: never). I don't know if this is a problem with all dual-graphics configurations, or whether there is a workaround. Disabling the integrated graphics did not help.

So I was very careful when buying my recent gaming laptop; this one comes with only the NVIDIA card (it may very well also have an Intel chipset, but if so, it's not visible in the BIOS).

pibbur who …
 
I've had similar problems and I've never found a good solution. I really can't stand those built-in Intel GPUs. They're bloody annoying, as they tend to confuse certain applications and games by presenting themselves as the primary GPU (which must be the case, since the computer wants to swap back and forth depending on activity). The most annoying aspect is when that swapping happens during gameplay: some older games demand so little power from the GPU that they start out on the Intel one and then swap during more intense scenes, causing Windows to temporarily close the game.

Like I said, I don't know of a good way to solve this. As pibbur pointed out, even disabling it hasn't been able to solve it for me in the past. Perhaps the game itself can be specifically configured to ignore the Intel GPU, through an ini file or some such thing.
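If a game did expose its adapter choice in a plain config file, the tweak might look something like the sketch below. To be clear, the file name `graphics.ini` and the `Renderer`/`AdapterIndex` keys are entirely hypothetical; nothing in this thread documents Divinity: Original Sin's real settings format.

```python
# Purely illustrative: force a hypothetical game config file to use
# a given adapter index (e.g. 1 for the discrete GPU). The file name
# "graphics.ini" and the section/key names "Renderer"/"AdapterIndex"
# are assumptions for the sake of the example, not D:OS's real settings.
import configparser

def force_adapter(path, index):
    """Write AdapterIndex under [Renderer] in an ini file, creating it if needed."""
    config = configparser.ConfigParser()
    config.read(path)  # silently skips a missing file
    if "Renderer" not in config:
        config["Renderer"] = {}
    config["Renderer"]["AdapterIndex"] = str(index)
    with open(path, "w") as f:
        config.write(f)

force_adapter("graphics.ini", 1)  # creates or updates graphics.ini in place
```

Whether a given game actually honors such a setting depends entirely on the engine; many only respect whatever the switchable-graphics driver hands them.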
 
Found this on the internets, Fluent. You might check your laptop manufacturer's site or just look up switchable graphics and your model of laptop.

The issue apparently was that the laptop didn't even come with drivers for the AMD card installed. I've heard that if you get drivers from the manufacturer, you run the risk of the Switchable Graphics option not working at all (at least with NVIDIA cards).

What I did was go to the Dell support site, browse to drivers and downloads for the Alienware M17X R4, and instead of looking at BIOS updates, look down at the Video updates. Right there, below the updates for the integrated graphics, was a driver download for the AMD card. I downloaded it, ran it, rebooted, and voila! Device Manager recognizes the card, the Switchable Graphics option is now available when I right-click on the Desktop, and after setting an application (a game, for instance) to run in high performance mode, it uses the card whenever that application is running and stops when it isn't. Pretty nifty.

For the sake of completeness, in case anybody else finds this topic, I still see "ATI GFX" in my BIOS for Discrete Graphics, and I still have no graphics or video options in my BIOS Advanced tab. Thankfully, after installing the drivers from the Dell website, I didn't have to fiddle with BIOS at all. Just had to enable the program in the Switchable Graphics menu.
 
I just checked my Divinity: Original Sin setup and it displays the Intel HD Graphics 4000 as active instead of the AMD one. Is your problem that the game gives you a warning about the video card not supporting high settings? I get that, but I just ignore it and put the settings where I want; the AMD card does its thing in the background. I have been playing D:OS on Ultra since it came out in July without any problems.

Yep, this is exactly what happened. It kept saying Intel HD 4000, but now that I tried to actually play the game on Ultra, it works fine. It's using my GPU now even though it says differently in the options. Weird.

I don't understand why you're trying to "save" it. Is this because you're worried about battery drain? It's not as if you're going to wear out the GPU. :)

Eh, I just figured that the less stress on the GPU, the better. Why else would they include an integrated GPU? :)
 