
How impossible is shaders?

Discussion in 'General FTB chat' started by BIG mac, Aug 14, 2014.

  1. BIG mac

    BIG mac New Member

    I was wondering exactly how resource intensive shaders are and if there is a trick to using them.

    I have a laptop built for gaming with an i7 processor, an Nvidia GTX 770, and 16 gigs of RAM.

    I can usually play Monster with my 128x Sphax texture pack and get around 80 fps flying around and loading chunks in creative mode. Sometimes even over 100.

    When I try to use the Unbelievable Shaders in vanilla with the same texture pack, I get around 12-17 fps, even with the lite version. If I turn off shaders, I get around 100 fps. I just don't think something should drop my fps by 90, even if it is really resource intensive.
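    To put that drop in perspective, here's a quick back-of-the-envelope calculation in Python. The fps numbers are the ones from this post, and treating the whole frame-time difference as shader overhead is a simplification:

```python
def frame_time_ms(fps):
    """Convert frames per second to milliseconds spent on each frame."""
    return 1000.0 / fps

no_shaders = frame_time_ms(100)   # ~10 ms per frame without shaders
shaders = frame_time_ms(15)       # ~67 ms per frame (midpoint of 12-17 fps)

# The shader pass is eating roughly 57 ms of every frame -- several times
# the cost of rendering the entire scene without it.
overhead = shaders - no_shaders
print(f"{no_shaders:.1f} ms -> {shaders:.1f} ms "
      f"({overhead:.1f} ms of shader overhead)")
```

    So a drop from 100 fps to 15 fps doesn't mean the shader is "a bit" expensive; per frame, it costs several times more than everything else combined.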

    Since I have never had fps problems with minecraft on this computer before, I tried installing optifine. It didn't make things better or worse. Everything just stayed the same no matter what settings I messed with.

    So in short, I was just wondering if there is any way to make shaders run at a decent fps, because let's face it: they look freaking amazing. I would love to have shaders in a build world with all kinds of aesthetic mods.
  2. midi_sec

    midi_sec New Member

    I'd say it's probably something in your Laptop. Neutered video card perhaps. They do that sometimes for mobile.

    Your CPU outclasses mine by far (Core 2 Quad), you have twice the RAM, and our GPUs are roughly evenly matched on paper, yours just being a generation newer and mobile. I can run shaders in the 30-50 fps range depending on the shader.
  3. ratchet freak

    ratchet freak Well-Known Member

    It also depends on how current your graphics drivers are; a newer driver will have more optimizations that it can apply to the shader.
  4. zemerick

    zemerick New Member

    Mobile GPUs are ALWAYS pathetic compared to desktop. The ones with dedicated memory do considerably better than those with shared, but it still puts them more than 2 full generations behind. Speed-wise, it's 2-4 times slower. I really do hate how they use the same numbering system. So many people see something like GTX 770M (not saying the OP did, just people in general) and think it's the same as a desktop's GTX 770. It just isn't even remotely close. Laptops are only a last resort for gaming, as you'll pay twice the amount of money for half the performance at best.

    Also, don't rule out your CPU so quickly. Intel made an i7 mobile chip that ran at a whopping 1.06 GHz, with a turbo of only 2.133 GHz, and it's just dual core (http://ark.intel.com/products/43562). At this point, "i7" doesn't say anything at all. I would hope the OP's laptop has a better CPU than that with a 770M, though. Even so, mobilitis applies to CPUs a bit as well. They aren't quite as full featured and powerful as a desktop chip.

    In fact, everything about a laptop is that way. They have special RAM that runs slower, special motherboards that are slower, etc.

    All of that being said, I would absolutely expect a laptop sporting a 770 to handle things a bit better. Perhaps it's an incompatibility between the higher resolution texture pack and the shader. Try running stock textures and compare the performance. Also check with everything stock to cover all of the bases.
  5. ratchet freak

    ratchet freak Well-Known Member

    Laptops are also nerfed in terms of cooling; a desktop case has a lot more room for airflow or liquid cooling than a slim book that makes your credit card look fat.

    So if you want performance, don't shy away from the clunkier design.
  6. midi_sec

    midi_sec New Member

    I realize all of this, that's why I turned my suspicion on his laptop itself, and then proceeded to spout my specs.

    ...by the way, let's really study his configuration for a moment. There is no way any computer manufacturer would put a cpu such as the one you linked (which was launched in 2010, btw) into a laptop with a current gen gpu. It just does not happen.

    Thanks for your input though.
  7. Quesenek

    Quesenek New Member

    I would narrow it down to the raw power of your CPU and the poor optimization of the shaders mod being at fault here.
    From my own research on laptops, the only fault they have is the CPU; they're just a bit too weak in that department because you have to take heat and battery drain into consideration.
    When you take that bit of info and work Minecraft into the equation, you have a weak CPU, and Minecraft is a CPU eater.
  8. BIG mac

    BIG mac New Member

    I have an i7-4800MQ running at 2.70 GHz. As far as optimization goes, that could be a major problem because I'm not very tech savvy in that field.
  9. midi_sec

    midi_sec New Member

    Yeah, it's not your CPU. Your CPU far outclasses mine.


    My money is still on that neutered GPU. By that, I mean it has no balls. Neutered cards have fewer shader units and are slower to process "complex" renderings in an unoptimized game like Minecraft. I listed my specs above (sort of: Q9400, 8 GB RAM, 550 Ti GPU); yours far outclass mine, at least double the power in everything except the GPU.
  10. Hambeau

    Hambeau Over-Achiever

    Most, if not all mobile computer chips will be "nerfed" (to use a well-known term) compared to their "equivalent" desktop counterparts in the interest of power consumption, heat generation, weight and battery life.

    For example, you may have "identical" i7s in a desktop and a laptop, both using (a fake value to make the math easy to describe) a 1 GHz clock, but because of the above considerations the mobile chip may only have a 2x internal multiplier while the desktop chip has a 4x multiplier; the same goes for the GPU. Also, the mobile chipset may only support 1666 MHz RAM while the desktop supports 2300 MHz RAM. Thus, while specs may seem the same on paper, you have to do a more in-depth comparison if you are trying to decide between a laptop and a desktop purchase.
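    The made-up numbers in that example work out like this (a sketch using the same hypothetical values, not real chip specs):

```python
def core_clock_ghz(base_clock_ghz, multiplier):
    """Effective core clock = external base clock x internal multiplier."""
    return base_clock_ghz * multiplier

# Hypothetical chips from the example above: the same 1 GHz base clock,
# but the mobile part gets a smaller multiplier to save power and heat.
mobile = core_clock_ghz(1.0, 2)    # 2.0 GHz effective
desktop = core_clock_ghz(1.0, 4)   # 4.0 GHz effective

# "Identical" i7s on the spec sheet, yet the desktop runs twice as fast.
print(f"mobile: {mobile} GHz, desktop: {desktop} GHz")
```

    The spec-sheet clock alone can hide a 2x difference, which is why the in-depth comparison matters.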

    This holds true for AMD chips as well.
  11. midi_sec

    midi_sec New Member

    This is all true, but the two CPUs in this comparison are so different that those examples don't even come into play. The Q9400 bus runs at 1333 MHz, and it's a 6-year-old CPU. Even at its worst, his CPU should run laps around mine... and it does.

    His weak link has to be his gpu. :\

    edit: Another thing to factor in: I've mildly abused this processor in the 6 years I've owned it. It's not the spring chicken it once was, and I'm sure it performs worse than what was benchmarked in that link. At one time I had it overclocked a fair amount, enough to need more vcore. So... yeah, degraded performance :p
  12. ratchet freak

    ratchet freak Well-Known Member

    Actually, the laptop will be using a 0.8 GHz clock while the desktop will be using the full factory-spec 1 GHz; the easiest way to prevent heat generation is underclocking.
  13. Zaflis

    Zaflis New Member

    It depends on the exact shader code. In modern OpenGL, all graphics are rendered with shaders, even if that means a driver built-in shader; vertex and fragment shaders are what everything goes through anyway. Once you start using different/new shader features, you start losing compatibility with old graphics cards and getting crashes, white textures, or other anomalies.
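    To illustrate why shader cost adds up so fast: a fragment shader is a small function the GPU runs once for every pixel, every frame. A rough sketch of the arithmetic in Python (purely illustrative; real shaders are GLSL running on the GPU, and the sample count below is a made-up number):

```python
# A fragment shader runs once per pixel, so any extra per-pixel work is
# multiplied by the full resolution, every single frame.
WIDTH, HEIGHT, FPS = 1920, 1080, 60

pixels_per_frame = WIDTH * HEIGHT              # ~2.07 million invocations
invocations_per_second = pixels_per_frame * FPS

# Suppose a fancy shader pack adds 8 extra texture/shadow samples per pixel
# (hypothetical figure): that's the extra work per second the GPU absorbs.
EXTRA_SAMPLES = 8
extra_samples_per_second = invocations_per_second * EXTRA_SAMPLES

print(f"{pixels_per_frame:,} pixels/frame, "
      f"{extra_samples_per_second:,} extra samples/sec")
```

    Even a small amount of added per-pixel work turns into hundreds of millions of extra operations per second, which is why a "lite" shader can still halve (or worse) your framerate.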
  14. ratchet freak

    ratchet freak Well-Known Member

    If the shader is too complicated, the driver may fall back to software rendering, which is much, much slower than GPU rendering.
  15. Celestialphoenix

    Celestialphoenix Too Much Free Time

    Might be vanilla MC itself.
    With Optifine + a 128x texture pack + shaders [bump mapping and shadows],
    I think I was getting around 30-50 fps (a somewhat better average than I get now without shaders).
    Intel Core i5, Nvidia GT 420M, 4 GB of RAM

  16. BIG mac

    BIG mac New Member

    I'm not sure if this is the problem, but I was playing Monster today, and my screen went black for a second and said my Intel HD graphics had stopped working and recovered. This makes me think my computer is not using the Nvidia graphics card with the shaders at all. Does anyone know of a fix for this?

    I could be wrong about that though.

    edit: when I go into the nvidia control panel, it says java should be using my nvidia card.
    Last edited: Aug 16, 2014
  17. Celestialphoenix

    Celestialphoenix Too Much Free Time

    Is your minecraft/ftb launcher defaulted to run via the nvidia card?
    (or try right clicking -> run with graphics processor -> nvidia processor)
  18. BIG mac

    BIG mac New Member

    Yes it is.
  19. Padfoote

    Padfoote Brick Thrower Team Member Forum Moderator

    I had this issue as well. I had to set my computer to use the high performance card all the time in order for it to work. Make sure to restart for this to take effect if you do change it.
  20. Cptqrk

    Cptqrk Popular Member

    I keep seeing this "right click on the FTB launcher and select run with graphics processor" tip, but that option is not in my right-click menu.
    I have gone into my NVIDIA control panel and added both FTB and Java to its list of programs to use the video card, but when I hit F3, it doesn't show that I'm using it.

    (using a GeForce GTX 470 with driver v337.88)
