RPGWatch Forums » General Forums » Tech Help » nVidia 400 series

nVidia 400 series

June 8th, 2010, 09:53
It's Fermi time! I just got a GeForce GTX 470.

It's the second time I've upgraded my video card in the last 6 weeks. I had purchased a Radeon HD 4890 in April to replace my Geforce 8800GTS. That was supposed to be a temporary upgrade to last me 4-6 months, but I just didn't like it.

This was a bit of an impulse purchase, hopefully I don't regret it.
Attached Images
File Type: jpg DSC02725.JPG (146.7 KB, 83 views)
File Type: jpg DSC02723.JPG (177.3 KB, 83 views)
JDR13 · SasqWatch · #1 · Join Date: Oct 2006 · Location: Florida, US · Posts: 17,878

June 8th, 2010, 21:06
What was wrong with the 4890? Curious since I have an ATI FirePro M7740 GPU.
Thrasher · Wheeee! · RPGWatch Donor · #2 · Join Date: Aug 2008 · Location: Studio City, CA · Posts: 10,002

June 8th, 2010, 23:28
Originally Posted by Thrasher View Post
What was wrong with the 4890? Curious since I have an ATI FirePro M7740 GPU.
Nothing really wrong with it, I just wasn't impressed. The filtering quality seemed slightly inferior to what I was used to, as did the anti-aliasing. I hear those things have been improved in their 5xxx cards.

I also got a pretty good deal on that GTX 470, I paid around $50 less than the current average retail price.
JDR13 · SasqWatch · #3 · Join Date: Oct 2006 · Location: Florida, US · Posts: 17,878

June 9th, 2010, 00:02
Visually noticeable quality differences in filtering and anti-aliasing or performance/FPS?
Thrasher · Wheeee! · RPGWatch Donor · #4 · Join Date: Aug 2008 · Location: Studio City, CA · Posts: 10,002

June 9th, 2010, 01:51
Visual quality, but nothing major. It also only allowed multisampling AA (MSAA), while nVidia cards do both MSAA and supersampling (SSAA), which looks better in certain games imo. ATI reintroduced SSAA in the 5xxx series.
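
A back-of-the-envelope way to see why SSAA can look better but costs more: MSAA runs the pixel shader once per pixel and only stores extra coverage samples, while SSAA shades every sample. The sketch below is an illustrative cost model, not either vendor's actual pipeline:

```python
# Rough AA cost model (illustrative, not vendor-specific):
# NxMSAA shades once per pixel but stores N samples per pixel,
# while NxSSAA effectively shades N times the pixel count.

def aa_cost(width, height, samples, mode):
    """Return (shader_invocations, framebuffer_samples) for one frame."""
    pixels = width * height
    if mode == "MSAA":
        return pixels, pixels * samples            # shade once, store N samples
    elif mode == "SSAA":
        return pixels * samples, pixels * samples  # shade and store every sample
    raise ValueError(mode)

# 4x AA at the 1920x1200 resolution mentioned in this thread:
msaa = aa_cost(1920, 1200, 4, "MSAA")
ssaa = aa_cost(1920, 1200, 4, "SSAA")
print(msaa)  # (2304000, 9216000)
print(ssaa)  # (9216000, 9216000)
```

Same memory footprint, but 4x the shading work for SSAA, which is why it hits the framerate so much harder.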
JDR13 · SasqWatch · #5 · Join Date: Oct 2006 · Location: Florida, US · Posts: 17,878

June 9th, 2010, 02:25
Noticeable, though, huh?

At one point in time Nvidia had the super-sampling transparency AA and ATI did not. Don't know if this is still true.

But quite frankly I rarely have enough GPU oomph to turn on much AA beyond 2x at 1920x1200, except on older games.
Thrasher · Wheeee! · RPGWatch Donor · #6 · Join Date: Aug 2008 · Location: Studio City, CA · Posts: 10,002

June 10th, 2010, 00:24
Originally Posted by Thrasher View Post
Noticeable, though, huh?

At one point in time Nvidia had the super-sampling transparency AA and ATI did not. Don't know if this is still true.

But quite frankly I rarely have enough GPU umph to turn on much AA beyond 2x at 1920x1200 except on older games.
The current consensus, as far as I can tell, is that ATI (HD5xxx) does better AA and nVidia (GTX 4xx) does better AF. On the whole it seems to even out pretty much. I've read a good number of GTX 470/480 reviews recently since I was interested in a new nVidia card (not anymore due to the known issues of "Thermi" ). I can't remember a single review claiming that one manufacturer had noticeably superior image quality compared to the other, so it must be about equal overall.
Moriendor · Spielkind · #7 · Join Date: Oct 2006 · Location: Schland · Posts: 1,916

June 10th, 2010, 01:02
Originally Posted by Moriendor View Post
I've read a good number of GTX 470/480 reviews recently since I was interested in a new nVidia card (not anymore due to the known issues of "Thermi" ).
I'd be curious to hear about these "known issues". If you're talking about temperature, yes the new GTX 400 series runs at a higher temp than previous generations, but they've also been built to withstand higher temps. I've had no issues whatsoever with mine so far.



BTW: Can a moderator please move this topic, starting with #107, to a new thread?
JDR13 · SasqWatch · #8 · Join Date: Oct 2006 · Location: Florida, US · Posts: 17,878

June 10th, 2010, 01:07
FWIW, I had thermal issues with my laptop NVIDIA GPU.

Had to swap it out, and that's how I ended up with the FirePro.
Thrasher · Wheeee! · RPGWatch Donor · #9 · Join Date: Aug 2008 · Location: Studio City, CA · Posts: 10,002

June 10th, 2010, 01:19
Afaik there is no mobile GPU that uses the Fermi core yet, so that had to be from a different series.
JDR13 · SasqWatch · #10 · Join Date: Oct 2006 · Location: Florida, US · Posts: 17,878

June 10th, 2010, 01:53
Yep, the previous one (or 2) I believe.
Hopefully they've fixed the thermal issues.

Quadro FX 3700M - G92M
GTX 470 - GF100
Thrasher · Wheeee! · RPGWatch Donor · #11 · Join Date: Aug 2008 · Location: Studio City, CA · Posts: 10,002

June 10th, 2010, 11:34
Originally Posted by JDR13 View Post
I'd be curious to hear about these "known issues". If you're talking about temperature, yes the new GTX 400 series runs at a higher temp than previous generations, but they've also been built to withstand higher temps. I've had no issues whatsoever with mine so far.
Yep. That's what nVidia marketing (and fanboys ) want us to believe, when in reality their older chips were built to withstand temps of up to 120°C and Fermi is designed for a max temp of "only" 105°C. Besides, unless nVidia has somehow managed to defy or change the laws of physics, high temps don't go well with copper, silicon, and all the other materials that conductors and transistors are made of. The lifetime of the material will suffer from high temps. No marketing speak is going to change the facts of physics.
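
The physics point here can be put in rough numbers. Reliability engineering commonly models temperature-driven aging of electronics with an Arrhenius acceleration factor; the 0.7 eV activation energy below is a generic illustrative value, not anything from nVidia's datasheets:

```python
import math

# Illustrative Arrhenius acceleration model for electronics aging.
# AF = exp((Ea/k) * (1/T_ref - 1/T_op)), temperatures in kelvin.
# Ea = 0.7 eV is a commonly assumed activation energy for silicon
# failure mechanisms; treat every number here as a sketch.

K_BOLTZMANN_EV = 8.617e-5  # Boltzmann constant in eV/K

def acceleration_factor(t_ref_c, t_op_c, ea_ev=0.7):
    t_ref = t_ref_c + 273.15
    t_op = t_op_c + 273.15
    return math.exp((ea_ev / K_BOLTZMANN_EV) * (1.0 / t_ref - 1.0 / t_op))

# Sustained operation at 95C instead of 70C:
print(round(acceleration_factor(70, 95), 1))  # roughly 5x faster aging
```

Under this (simplified) model, every sustained 25°C of extra heat multiplies the wear rate severalfold, which is why running near the max spec is a legitimate longevity concern even if the card never throttles.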

Anyway, the GTX 470 under load gets pretty close to the max temps at normal room temperature, so that's one of the known issues that should be a concern.
The second known issue is the noise emitted by the fan. It may not bother hardcore gamers who play with a headset most of the time, but for people like me who play with speakers and like it silent, it must be a nightmare.
The third issue is the price and the poor performance-per-watt and price/performance ratios.

Before anyone lumps me in with the ATI fanboys… well, don't . I have never owned an ATI card. Always had nVidia because I preferred them for their better drivers and better compatibility. I was really kinda depressed that Fermi turned out that bad. My only hope is that the prices for the 470 are going to come down to the EUR 250 region (where it belongs IMHO instead of the current EUR 300 - 320) and that the custom designs with dual fans are going to bring down the temps and especially the noise.
I have no problem with the high power consumption, but the temps (possible lack of longevity) and noise (nerve-racking) need to come down big time before I will even consider buying a 470.
Until then my trusty "old" 260 is going to have to do. Or maybe I will finally take a closer look at ATI. Their next gen HD6xxx is supposed to come out in fall. If it is as good as the HD5xxx series then I might eventually switch camps.
Moriendor · Spielkind · #12 · Join Date: Oct 2006 · Location: Schland · Posts: 1,916

June 10th, 2010, 13:13
Trying to decide if I should go with Crossfire 5970 or SLI 480 GTX at the moment.

Mankind must put an end to war or war will put an end to mankind. - John F Kennedy
An eye for an eye, and soon the whole world is blind. - Mahatma Gandhi
The world is my country. To do good is my religion. My mind is my own church. This simple creed is all we need to enjoy peace on earth. - Thomas Paine
JemyM · Okay, now roll sanity. · #13 · Join Date: Oct 2006 · Posts: 6,028

June 10th, 2010, 15:38
Originally Posted by Moriendor View Post
Yep. That's what nVidia marketing (and fanboys ) want us to believe, when in reality their older chips were built to withstand temps of up to 120°C and Fermi is designed for a max temp of "only" 105°C. Besides, unless nVidia has somehow managed to defy or change the laws of physics, high temps don't go well with copper, silicon, and all the other materials that conductors and transistors are made of. The lifetime of the material will suffer from high temps. No marketing speak is going to change the facts of physics.
I was trying to search some specs on older models, but I couldn't find anything. Perhaps you can point me in the right direction. I do know that the previous-generation GeForce GTX 285 was also 105C.

Anyways, some of these "issues" are being greatly exaggerated, at least from my experience, which is "hands on", as opposed to simply reading off the web. A threshold of 105C is not going to shorten the lifespan of a video card unless it's due to owner incompetence. I've been torturing my EVGA 470 for the last 2 days, and it has yet to exceed 85C with the fan at 75%. At idle, it doesn't exceed 50C with the fan at 40%. I don't have any kind of special cooling in my case.


Originally Posted by Moriendor View Post
The second known issue is the noise emitted by the fan. It may not bother hardcore gamers who play with a headset most of the time, but for people like me who play with speakers and like it silent, it must be a nightmare.
The third issue is the price and the poor performance-per-watt and price/performance ratios.
These are the typical myths I keep seeing posted in some forums, allow me to do a little debunking.

Fan noise- completely bogus, the 400 series is nearly identical to the Radeon 5xxx in terms of sound. I think my MSI Radeon 4890 might even have been slightly louder than my new card. Another comparison here. Unless you can tell the difference between 43db and 45db (I can't), the difference is negligible.
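
For context on those numbers: decibels are logarithmic, so a 2 dB gap is a fairly small difference in sound power (a ~3 dB change is the figure commonly cited as clearly audible). A quick sketch of the arithmetic:

```python
# Decibels are logarithmic: a difference of d dB corresponds to a
# sound-power ratio of 10**(d/10). Perceived loudness roughly
# doubles per 10 dB (a common rule of thumb, not an exact law).

def power_ratio(db_difference):
    return 10 ** (db_difference / 10)

print(round(power_ratio(45 - 43), 2))  # 2 dB -> about 1.58x the power
print(power_ratio(10))                 # 10 dB -> 10x the power
```

So 43 dB vs 45 dB is about a 58% power difference, which most listeners would barely register, whereas a 10 dB gap would sound roughly twice as loud.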

Price/performance- not great right now due to limited availability, but no worse than ATI's top end cards. I only paid about $40 more than what a Radeon 5850 is currently going for, and my card is faster. I do think that the 470 and 5850 have a much better P/P ratio than the 480 and 5870.

Power consumption- can't argue with this one, it's definitely power hungry, but it's not an issue to me. I suppose it's a valid complaint for gamers who are "sensitive" to that.
JDR13 · SasqWatch · #14 · Join Date: Oct 2006 · Location: Florida, US · Posts: 17,878

June 10th, 2010, 18:29
The only card I've ever had overheat was an ATI 800XL. Not buying the 400 nvidia series though. Waiting for a revision. And I have zero urge to spend anything on hardware right now, even if I have the money.

“I've learned that people will forget what you said, people will forget what you did, but people will never forget how you made them feel.” - Maya Angelou
"Those who don't read history are destined to repeat it." – Edmund Burke
zakhal · HomelyRebel · #15 · Join Date: Dec 2006 · Location: Europa Universalis · Posts: 2,990

June 11th, 2010, 02:32
Originally Posted by JDR13 View Post
I was trying to search some specs on older models, but I couldn't find anything. Perhaps you can point me in the right direction. I do know that the previous-generation GeForce GTX 285 was also 105C.
Hehe I had to do some serious research into the depths of my aging memory but thanks to the aid of Google I now think I know where I got the impression that specs used to be higher and where that 120°C number is coming from.

It's because nVidia in their older drivers actually allowed card manufacturers to include settings for thresholds for when to throttle or shut down the GPU (users could then fiddle with these settings at their own risk). Depending on modifications by the card manufacturer (the custom drivers that you can download from a manufacturer's website), these thresholds were set between 115°C and 145°C. I seem to remember that nVidia recommended 120°C, and if you do a Google search for "120C GPU max temp" you'll find a lot of people quoting that exact number as the threshold for GPUs, so there must be something to it.

Anyway, the original point was that it is not at all surprising that nVidia lowered their specs (to 105°C) over time and removed the temperature features from their reference drivers. Graphics cards keep getting more and more complex, with more and more transistors (billions of mini-circuits) crammed into a tiny space. As we know, it is a great challenge for the chip makers (CPU and GPU alike) to keep the leakage currents and heat dissipation in check. It is kind of weird of nVidia to pretend that Fermi being able to withstand up to 105°C is an achievement when in reality they themselves used to allow temps of up to 120°C before throttling a GPU via their own drivers.

Also, I think we all know that usually you want to stay as far away from the max spec as possible. We all know that heat is not good for electronic components, and all of a sudden nVidia is trying to make us believe that they have beaten the laws of physics by means of "design"? If that is so, then why do they still recommend specially ventilated systems for Thermi? Come on. Who are they trying to kid?
You know, if I were them and wanted to sell a product that didn't quite turn out as originally intended, I'd actually do the exact same thing and claim stuff like they do, but really now….

Originally Posted by JDR13 View Post
Anyways, some of these "issues" are being greatly exaggerated, at least from my experience, which is "hands on", as opposed to simply reading off the web. A threshold of 105C is not going to shorten the lifespan of a video card unless it's due to owner incompetence. I've been torturing my EVGA 470 for the last 2 days, and it has yet to exceed 85C with the fan at 75%. At idle, it doesn't exceed 50C with the fan at 40%. I don't have any kind of special cooling in my case.
Well, I was talking about the "longevity" of the cards if they run at high temps for prolonged periods of time. I wouldn't exactly call "two days" a valid "longevity" test unless we are talking about Pluto or Uranus and not Earth days .
Seriously though I hope we can easily agree that it is well known and established that running electronic equipment at high temps or close to the maximum specs is going to lead to a higher failure rate. Of course there's going to be those cards that will still work in ten years but on average you will have a higher failure rate over a given period of time if you operate electronic equipment near its thermal limits. That is unless nVidia has not only managed to defy the laws of physics but those of statistics as well .

And regarding the fan speeds, it is of course difficult to judge given only a % number, but assuming similar total RPM between the fan on my GTX 260 and the fan on the GTX 470, I'm sure that 75% would lead to a completely unacceptable noise level for me.
My GTX 260 (Gainward) runs at 40% fan speed at default and I lowered that to 30% (fixed… so it stays at 30% in 3D under load, too) because I felt that 40% was a little on the noisy side already (I had a rather silent Gainward 8800GTS 640MB before that).

Originally Posted by JDR13 View Post
These are the typical myths I keep seeing posted in some forums, allow me to do a little debunking.
That is just the question, isn't it? Who is posting the myths and who is not? Let's just say I'm not quite willing (yet) to spend EUR 300+ to find out .

Originally Posted by JDR13 View Post
Fan noise- completely bogus, the 400 series is nearly identical to the Radeon 5xxx in terms of sound. I think my MSI Radeon 4890 might even have been slightly louder than my new card. Another comparison here. Unless you can tell the difference between 43db and 45db (I can't), the difference is negligible.
Well, they're comparing to the HD5870 reference design, which is over six months old and barely being used anymore. Most ATI manufacturers offer much better performing, much more efficient, and much more silent custom cooler designs for the HD5xxx series.
As I said, that is what I'm counting on for the GTX 470 as well. My hope lies with the Gainward GTX 470 Golden Sample with the dual-fan design. Unfortunately, reviews of the similarly built Palit dual-fan design haven't turned out great so far (the noise level was only barely reduced under load, and the card was actually louder in idle mode… WTF?), but if Gainward uses a different BIOS with different thresholds for the fan settings etc., then there might be a silent-PC-compatible GTX 470 on the horizon (hopefully).

Originally Posted by JDR13 View Post
Price/performance- not great right now due to limited availability, but no worse than ATI's top end cards. I only paid about $40 more than what a Radeon 5850 is currently going for, and my card is faster. I do think that the 470 and 5850 have a much better P/P ratio than the 480 and 5870.
True. It's pretty sad if I think about my GTX 260 which cost me ~EUR 190 (and which later even dropped to prices as low as EUR 169). That was some great value. I'd consider the GTX 470 great value at ~EUR 250 and my plan is to wait until either nVidia or ATI adjust their prices or bring out a new product in that price range (the GTX 465 is not interesting IMHO since it's not even that much faster than my 260 in most games/scenarios).

Originally Posted by JDR13 View Post
Power consumption- can't argue with this one, it's definitely power hungry, but it's not an issue to me. I suppose it's a valid complaint for gamers who are "sensitive" to that.
Doesn't bother me either as I said. I mostly care about the noise and the price and those two are currently disqualifying factors. But as I said I haven't quite given up hope yet (go Gainward go! )
Moriendor · Spielkind · #16 · Join Date: Oct 2006 · Location: Schland · Posts: 1,916

June 11th, 2010, 05:08
Originally Posted by Moriendor View Post
Well, I was talking about the "longevity" of the cards if they run at high temps for prolonged periods of time. I wouldn't exactly call "two days" a valid "longevity" test unless we are talking about Pluto or Uranus and not Earth days .
Seriously though I hope we can easily agree that it is well known and established that running electronic equipment at high temps or close to the maximum specs is going to lead to a higher failure rate. Of course there's going to be those cards that will still work in ten years but on average you will have a higher failure rate over a given period of time if you operate electronic equipment near its thermal limits. That is unless nVidia has not only managed to defy the laws of physics but those of statistics as well
Well we can speculate about card lifespan all we want, but the fact that these vendors are including at least a 3 year warranty (my EVGA 470 is lifetime) makes that a moot point for most gamers.

Of course it's a fact that running something close to the maximum specs can shorten its overall lifespan. The misconception is that the 400 series cards are running that hot at default speeds, when in fact they're not.

Originally Posted by Moriendor View Post
And regarding the fan speeds it is of course difficult to judge given only a % number but assuming similar total RPM between the fan on my GTX 260 and the fan on the GTX 470 I'm sure that 75% would lead to a completely unacceptable noise level for me.
Oh, I completely understand where you're coming from there, I'm pretty anal when it comes to fan noise myself. The problem is that *all* modern high-end cards are fairly loud with a generic fan, regardless of whether it's an nVidia or ATI chipset.


Originally Posted by Moriendor View Post
Well, they're comparing to the HD5870 reference design which is over six months old and barely being used anymore. Most ATI manufacturers offer much better performing, much more efficient and much more silent custom cooler designs for the HD5xxx series.
Well, considering that most of those manufacturers are also producing the 4xx series now, I'm pretty sure they'll do the same for those cards as well.


Originally Posted by Moriendor View Post
True. It's pretty sad if I think about my GTX 260 which cost me ~EUR 190 (and which later even dropped to prices as low as EUR 169). That was some great value. I'd consider the GTX 470 great value at ~EUR 250 and my plan is to wait until either nVidia or ATI adjust their prices or bring out a new product in that price range (the GTX 465 is not interesting IMHO since it's not even that much faster than my 260 in most games/scenarios).
Yeah, I think you'll see a gradual price drop over the next few months as the cards become more widely available. It sounds like they're a little more expensive in Europe right now.
JDR13 · SasqWatch · #17 · Join Date: Oct 2006 · Location: Florida, US · Posts: 17,878

June 29th, 2010, 18:38
I've just acquired a GTX480. So far I only notice the fan noise and high temp if I push the GPU with GPGPU apps, where it's flying at max throttle. The temps can go to about 95C after sustained use, and the fan is then bloody loud (even at 75%). If you up the fan speed, though, you can reduce the temp quite a bit. (They've been a little conservative with the fan settings, I guess to reduce noise complaints… fried gfx card or noisy fan? Hmm, tough choice.)

If I'm not fiddling with CUDA then the temps very seldom get above the mid 70s and the fan noise is very tolerable. I have yet to play a game that causes heating anywhere near what heavy GPU compute induces. I suspect the card may not last past 3 years, but I don't care: a) it's under warranty for that period and b) I usually upgrade on a 2/3 yr cycle. I'm also aware that this is the 1st gen of a very new architecture for GPUs and as such there will be teething and engineering problems. I took something of a risk buying this card - if you want to be dramatic - but given a and b above it's a risk I was happy to take, given that my research is now moving into massive data parallelism and GPGPU solutions are all the rage. I'm sure the next iteration of Fermi will iron out these kinks - on paper at least the card is just a very good match for my work, so I wasn't prepared to wait.
booboo · Keeper of the Watch · #18 · Join Date: Aug 2007 · Location: Cape Town, South Africa · Posts: 957

July 6th, 2010, 01:32
I've been doing a lot of work in CUDA lately for my thesis and as I understand it Fermi isn't actually released yet. The architecture is there but the software package is not.

I am hoping Fermi is not so radically proprietary (is that redundant?) that nVidia makes OpenCL useless. As it is, GPGPU programming has not been standardized for anything but research, and the computing power that commercial software could be harnessing is getting a huge disservice.

With 512 cores you can scarcely begin to realize the power your little machine potentially has - not just for graphics, but general-purpose power!
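
The appeal described above is data parallelism: one "kernel" applied independently to every element, so the work spreads across however many cores the chip has. Here's a plain-Python sketch of the pattern, purely illustrative and not the actual CUDA or OpenCL API:

```python
# The GPGPU pattern in miniature: a kernel runs independently on every
# element, so throughput scales with core count. A real GPU schedules
# blocks of threads across its cores; here we just chunk the data to
# show the decomposition. This is illustrative Python, not CUDA/OpenCL.

def kernel(x):
    return x * x + 1.0  # some per-element computation

def launch(kernel, data, n_cores=512):
    chunk = max(1, len(data) // n_cores)  # one chunk per (virtual) core
    out = []
    for i in range(0, len(data), chunk):
        out.extend(kernel(x) for x in data[i:i + chunk])
    return out

print(launch(kernel, [0.0, 1.0, 2.0], n_cores=4))  # [1.0, 2.0, 5.0]
```

Since no element depends on any other, the chunks could run in any order or fully in parallel, which is exactly the property that lets 512 cores help.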

Developer of The Wizard's Grave Android game. Discussion Thread:
http://www.rpgwatch.com/forums/showthread.php?t=22520
Lucky Day · Daywatch · #19 · Join Date: Oct 2006 · Location: The Uncanny Valley · Posts: 3,198

July 6th, 2010, 16:32
Originally Posted by JDR13 View Post
Oh I completely understand where you're coming from there, I'm pretty anal when it comes to fan noise myself. The problem is that *all* modern high-end cards are fairly loud with a generic fan, regardless if it's an nVidia or ATI chipset.
x2 - don't even get me started… My last GPU purchase was made roughly one year ago, picked up on impulse when an acquaintance asked me if I wanted to buy his 4890 for $100 (he upgraded to a GTX295). Coming from an 8800GTS G92 I was floored by both the airflow and noise difference: it moves a ton of air, but at the cost of sounding like a jet engine… Oh, and since I game(d) with my GTS G92 @ 800mhz core and 2000mhz, the performance delta between the two cards was typically meager (this excludes games that benefited from the extra vram and bandwidth).

-Tangent/Rambling thought- AMD is clearly in the lead now, no question (if I needed to upgrade I'd be all over a 5870). But it's worth mentioning nvidia's G80 (unified shader arch) was incredibly potent. IMO, if GT200 had achieved the clocks nvidia set out for, they wouldn't be nearly as bad off as they are now. Yes, AMD would still have wrestled away the performance and price/performance crowns with the dawn of their Evergreen lineup (some say nvidia has taken back performance with Fermi - at least in dx11/GPGPU/tessellation). However, moving forward, I believe AMD's R600 arch is finally tapped out, much like G80 was (or close to it - just look at the 4870x2 or 4870 CF). GPU scaling is never 100% (some claim ~90% scaling for 2 GPUs), yet the slower-clocked 4870x2 or CFd 4870s are faster than a single 5870 in games/benchmarks more often than not. Thus the addition of more stream processors will gradually add less and less performance while bumping up both die size and transistor count. Which is probably why HD6000 (codename Southern Islands) is rumored to be an amalgam of Evergreen's stream processors and Northern Islands' (the true next-gen GPU arch to replace R600) uncore.
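
That ~90% scaling figure can be made concrete with a simple model: if each added GPU contributes a fraction e of a full card's throughput, n GPUs deliver roughly 1 + e*(n-1) times a single card. A sketch (the 0.90 efficiency below is the community estimate quoted above, not a measurement):

```python
# Illustrative multi-GPU scaling model: with per-added-GPU efficiency e,
# n GPUs deliver roughly 1 + e*(n-1) times a single card's throughput.
# The default 0.90 is the rough community estimate, not measured data.

def multi_gpu_speedup(n_gpus, efficiency=0.90):
    return 1 + efficiency * (n_gpus - 1)

print(round(multi_gpu_speedup(2), 2))  # 1.9x with two cards
print(round(multi_gpu_speedup(4), 2))  # 3.7x -- diminishing returns per card
```

Under this model, two 4870s at 90% scaling need only be within ~1.9x of a 5870's single-GPU throughput to beat it, which matches the benchmark pattern described above.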

Originally Posted by JDR13 View Post
Yeah, I think you'll see a gradual price drop over the next few months as the cards become more widely available. It sounds like they're a little more expensive in Europe right now.
This I'm not too sure of: TSMC is still capacity-constrained as they continue to struggle with the 40nm node and with meeting demand from both camps (nv/amd).

The original article from DigiTimes:

AMD's chip supply from Taiwan Semiconductor Manufacturing Company (TSMC) may face shortages in the second half as Nvidia has already placed a large amount of orders to TSMC in March and April and may squeeze AMD's order out of the already fully-loaded capacity, according to sources from graphics players.

Since Nvidia is set to launch its new GeForce GTX 460 GPU later in July and GF106 and GF108 in August and September, respectively, the company has already pre-booked a large volume of capacity from TSMC.

Since TSMC is likely to give its major clients, Nvidia and Qualcomm, supply priority, the sources believe AMD may not be able to share much of TSMC's capacity in the second half of 2010.

Nvidia plans to launch two versions of the GeForce GTX 460 graphics card with prices ranging between US$230-250 and targeting the ATI Radeon HD 5830. In August, Nvidia will launch the GeForce GTS 455/450 priced at US$129-179 and in September, the GF100-based GPU will be launched with a price below US$100 to replace the GeForce GT 240 and targeting AMD's ATI Radeon HD 5600/5500 series GPUs.
Just look at the original MSRPs for Evergreen cards and where the actual retail/etail prices wound up… IIRC MSRP for the 5870 was $380 and the 5850 $260. I don't think you'll see a dip below those price points until AMD is set to clear out its stock of cards as they go EOL. However, due to the aforementioned TSMC capacity issues, that window will also be incredibly short-lived, unlike say G92 and GT200 cards where there is/was a huge stockpile. Sometimes supply/demand is a pain in the butt.
MasterKromm · Sentinel · #20 · Join Date: Feb 2010 · Posts: 376