How impossible are shaders?

BIG mac

New Member
Jul 29, 2019
183
0
0
I was wondering exactly how resource-intensive shaders are and if there is a trick to using them.

I have a laptop built for gaming with an i7 processor, an Nvidia GTX 770, and 16 gigs of RAM.

I can usually play Monster with my 128x Sphax texture pack and get around 80 fps flying around and loading chunks in creative mode. Sometimes even over 100.

When I try to use the Unbelievable Shaders in vanilla with the same texture pack, I get around 12-17 fps, even with the lite version. If I turn off shaders, I get around 100 fps. I just don't think something should drop my fps by 90, even if it is really resource intensive.

Since I have never had fps problems with Minecraft on this computer before, I tried installing OptiFine. It didn't make things better or worse. Everything just stayed the same no matter what settings I messed with.

So in short, I was just wondering if there is any way to make shaders run at a decent fps, because let's face it: they look freaking amazing. I would love to have shaders in a build world with all kinds of aesthetic mods.
 

midi_sec

New Member
Jul 29, 2019
1,053
0
0
I'd say it's probably something in your laptop. A neutered video card, perhaps. They do that sometimes for mobile.

Your CPU outclasses mine by far (Core 2 Quad), you have twice the RAM, and our GPUs are roughly evenly matched on paper, except yours is a generation newer and mobile. I can run shaders in the 30-50 fps range depending on the shader.
 

ratchet freak

Well-Known Member
Nov 11, 2012
1,198
243
79
It also depends on how current your graphics drivers are; a newer driver will have more optimizations that it can apply to the shader.
 

zemerick

New Member
Jul 29, 2019
667
0
1
Mobile GPUs are ALWAYS pathetic compared to desktop. The ones with dedicated memory do considerably better than those with shared memory, but that still puts them more than 2 full generations behind. Speed-wise, they're 2-4 times slower. I really do hate how they use the same numbering system. So many people see something like a GTX 770M (not saying the OP did, just people in general) and think it's the same as a desktop GTX 770. It just isn't even remotely close. Laptops are only a last resort for gaming, as you'll pay twice the money for half the performance at best.

Also, don't rule out your CPU so quickly. They made an Intel i7 mobile chip that ran at a whopping 1.06 GHz, with a turbo of only 2.133 GHz, and it's just a dual core. ( http://ark.intel.com/products/43562 ) At this point, "i7" doesn't say anything at all. I would hope that with a 770M the OP's laptop has a better CPU than that, though. Even so, mobilitis applies to CPUs a bit as well. They aren't quite as full-featured and powerful as a desktop chip.

In fact, everything about a laptop is that way. They have special RAM that runs slower, special motherboards that are slower, etc.

All of that being said, I would absolutely expect a laptop sporting a 770 to handle things a bit better. Perhaps it's an incompatibility between the higher-resolution texture pack and the shader. Try running stock textures and compare the performance. Also check with everything stock to cover all the bases.
 

ratchet freak

Well-Known Member
Nov 11, 2012
1,198
243
79
Laptops are also nerfed in terms of cooling; a desktop case has a lot more room for airflow or liquid cooling than a slim notebook that makes your credit card look fat.

So if you want performance, don't shy away from the clunkier design.
 

midi_sec

New Member
Jul 29, 2019
1,053
0
0
I realize all of this; that's why I turned my suspicion on his laptop itself and then proceeded to spout my specs.

...By the way, let's really study his configuration for a moment. There is no way any computer manufacturer would put a CPU such as the one you linked (which was launched in 2010, by the way) into a laptop with a current-gen GPU. It just does not happen.

Thanks for your input though.
 

Quesenek

New Member
Jul 29, 2019
396
0
0
I would narrow it down to the raw power of your CPU and the poor optimization of the shaders mod being at fault here.
From my own research on laptops, the one real weak point they have is the CPU; they're just a bit too weak in that department because you have to take the heat and the battery drain into consideration.
When you take that bit of info and work Minecraft into the equation, you have a weak CPU, and Minecraft is a CPU eater.
 

BIG mac

New Member
Jul 29, 2019
183
0
0
I have an i7-4800MQ running at 2.70 GHz. As far as optimization goes, that could be a major problem, because I'm not very tech-savvy in that field.
 

midi_sec

New Member
Jul 29, 2019
1,053
0
0
Yeah, it's not your CPU. Your CPU far outclasses mine.

http://www.cpu-world.com/Compare/413/Intel_Core_2_Quad_Q9400_vs_Intel_Core_i7_Mobile_i7-4800MQ.html

My money is still on that neutered GPU. By that, I mean it has no balls. Neutered cards have fewer shaders and are slower to process "complex" renderings in an unoptimized game like Minecraft. I listed my specs above (sort of: Q9400, 8 GB RAM, 550 Ti GPU); yours far outclass mine, at least double the power in everything except the GPU.
 

Hambeau

Over-Achiever
Jul 24, 2013
2,598
1,531
213
Most, if not all, mobile computer chips will be "nerfed" (to use a well-known term) compared to their "equivalent" desktop counterparts in the interest of power consumption, heat generation, weight, and battery life.

For example, you may have "identical" i7s in a desktop and a laptop, both using (a fake value to make the math easy to describe) a 1 GHz clock, but because of the above considerations the mobile chip may only have a 2x internal multiplier while the desktop chip may have a 4x multiplier; the same goes for the GPU. Also, the mobile chipset may only support 1666 MHz RAM while the desktop supports 2300 MHz RAM. Thus, while the specs may seem the same on paper, you have to do a more in-depth comparison if you are trying to decide between a laptop and a desktop purchase.

This holds true for AMD chips as well.
 

midi_sec

New Member
Jul 29, 2019
1,053
0
0
This is all true, but the two CPUs in this comparison are so different that those examples don't even come into play. The Q9400 bus runs at 1333 MHz, and it's a 6-year-old CPU. Even at its worst, his CPU should run laps around mine... and it does.

His weak link has to be his GPU. :\

Edit: Another thing to factor in: I've mildly abused this processor in the 6 years I've owned it. It's not the spring chicken it once was, and I'm sure it performs worse than what was benchmarked in that link. At one time I had it overclocked a fair amount, enough to need more vcore. So... yeah, degraded performance :p
 

ratchet freak

Well-Known Member
Nov 11, 2012
1,198
243
79
Actually, in that example the laptop will be using a 0.8 GHz clock while the desktop will be using the full factory-spec 1 GHz; the easiest way to prevent heat generation is underclocking.
 

Zaflis

New Member
Jul 29, 2019
184
0
0
It depends entirely on the shader code. In modern OpenGL, all graphics are rendered with shaders, even if that means a driver's built-in shader; vertex and fragment shaders are what everything goes through anyway. Once you start using newer shader features, you start losing compatibility with old graphics cards and getting crashes, white textures, or other anomalies.
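
To give a rough idea of what that compatibility boundary looks like in practice, here is a minimal sketch using the LWJGL 2 bindings that Minecraft versions of that era ship with. The shader source and class name are made up for illustration, and it only works once an OpenGL context exists:

    import org.lwjgl.opengl.GL11;
    import org.lwjgl.opengl.GL20;

    public class ShaderCompileCheck {
        // Trivial stand-in fragment shader; real shader packs are far more demanding.
        private static final String FRAG_SOURCE =
            "#version 120\n" +
            "void main() { gl_FragColor = vec4(1.0, 0.0, 0.0, 1.0); }\n";

        /** Compiles the shader and returns its id, or 0 if the driver rejected it. */
        public static int compileFragmentShader() {
            int id = GL20.glCreateShader(GL20.GL_FRAGMENT_SHADER);
            GL20.glShaderSource(id, FRAG_SOURCE);
            GL20.glCompileShader(id);
            if (GL20.glGetShaderi(id, GL20.GL_COMPILE_STATUS) == GL11.GL_FALSE) {
                // Older cards/drivers reject unsupported features here,
                // which is where the crashes and white textures come from.
                System.err.println(GL20.glGetShaderInfoLog(id, 4096));
                GL20.glDeleteShader(id);
                return 0;
            }
            return id;
        }
    }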
 

ratchet freak

Well-Known Member
Nov 11, 2012
1,198
243
79
If the shader is too complicated, the driver may fall back to software rendering, which is much, much slower than GPU rendering.
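
One rough way to spot that situation is to look at the renderer string OpenGL reports. This is only a sketch (again assuming LWJGL 2 and an active OpenGL context), and the strings checked are just the common software renderer names, not an exhaustive list:

    import org.lwjgl.opengl.GL11;

    public class SoftwareRenderCheck {
        /** Heuristic: returns true if the reported renderer looks like a software fallback. */
        public static boolean looksLikeSoftwareRendering() {
            String renderer = GL11.glGetString(GL11.GL_RENDERER).toLowerCase();
            // "GDI Generic" is Windows' built-in software path; llvmpipe/softpipe are Mesa's.
            return renderer.contains("gdi generic")
                || renderer.contains("llvmpipe")
                || renderer.contains("softpipe");
        }
    }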
 

Celestialphoenix

Too Much Free Time
Nov 9, 2012
3,741
3,204
333
Tartarus.. I mean at work. Same thing really.
Might be vanilla MC.
OptiFine + 128x texture pack + shaders [bump mapping and shadows].
I think I was getting around 30-50 fps (somewhat better than the average I get now without shaders).
Intel Core i5, Nvidia GT 420M, 4 GB of RAM.

[Attached screenshot: 2012-05-21_230135-1.png]
 

BIG mac

New Member
Jul 29, 2019
183
0
0
I'm not sure if this is the problem, but I was playing Monster today, and my screen went black for a second and it said my Intel HD Graphics driver had stopped working and recovered. This makes me think my computer is not using the Nvidia graphics card with the shaders at all. Does anyone know of a fix for this?

I could be wrong about that, though.

Edit: When I go into the Nvidia control panel, it says Java should be using my Nvidia card.
 

Padfoote

Brick Thrower
Forum Moderator
Dec 11, 2013
5,140
5,898
563
Yes, it is.

I had this issue as well. I had to set my computer to use the high-performance card all the time in order for it to work. Make sure to restart for this to take effect if you do change it.
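
If you want to double-check which card the game actually ended up on, a quick sketch (assuming LWJGL 2 and that it runs after the display is created; the F3 screen shows the same renderer string):

    import org.lwjgl.opengl.GL11;

    public class GpuReport {
        /** Prints which vendor/device/driver OpenGL was given. */
        public static void printActiveGpu() {
            System.out.println("Vendor:   " + GL11.glGetString(GL11.GL_VENDOR));   // e.g. Intel vs. NVIDIA Corporation
            System.out.println("Renderer: " + GL11.glGetString(GL11.GL_RENDERER)); // the actual GPU in use
            System.out.println("Version:  " + GL11.glGetString(GL11.GL_VERSION));  // GL version plus driver build
        }
    }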
 

Cptqrk

Popular Member
Aug 24, 2013
1,420
646
138
I keep seeing this "right-click on the FTB launcher and select 'Run with graphics processor'" tip, but that option is not in my right-click menu.
I have gone into my Nvidia control panel and added both FTB and Java to its list of programs to use the video card, but when I hit F3, it doesn't show that I'm using it.

(Using a GeForce GTX 470 with driver v337.88.)