I've played around with The Witcher 3's settings. It's just not a very demanding game.
Hairworks off = 135 FPS
Hairworks on = 100 FPS
A 35 FPS drop is a big deal; most console games only run at 30 FPS total.
What happens when a newer game needs more GPU time just from higher polygon counts and the other usual reasons? If you're only getting 60 FPS in a future game on a 1070, turning hair simulation on still takes a serious bite out of that. (To be fair, the cost doesn't subtract as a flat 35 FPS: it adds a fixed chunk of frame time, so a 60 FPS baseline would land closer to 50 FPS than 25. Still a big hit.)
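The frame-time arithmetic can be sketched in a few lines. The 135/100 FPS figures are the measurements above; the 60 FPS baseline on a 1070 is the hypothetical future game, and this assumes Hairworks costs roughly the same milliseconds per frame in both games:

```python
def frame_time_ms(fps):
    # Convert frames per second to milliseconds per frame.
    return 1000.0 / fps

def projected_fps(baseline_fps, cost_ms):
    # Add a fixed per-frame cost (in ms) to a baseline frame time
    # and convert back to FPS.
    return 1000.0 / (frame_time_ms(baseline_fps) + cost_ms)

# Measured: 135 FPS with Hairworks off, 100 FPS with it on.
cost = frame_time_ms(100) - frame_time_ms(135)

print(round(cost, 2))                     # ~2.59 ms per frame
print(round(projected_fps(60, cost), 1))  # ~51.9 FPS, not 25
```

The point is that an effect's cost is a fixed slice of frame time, so the FPS hit looks bigger at high frame rates than it would at 60 FPS.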
So, do we want better overall scene fidelity, or do we keep scenes the same and give everyone simulated hair strands? Almost no single effect is worth a performance hit that bad. Optimisation work busts its ass to claw back 1 FPS here and there; giving up 35 FPS to one effect is a joke. At least it's off by default.
And, yes, I know you're just trolling.