Tekkit is a joke but my FPS was Higher

Status
Not open for further replies.

Meara

New Member
Jul 29, 2019
30
0
0
Considering your monitor is most likely only 60 Hz, there is practically no difference between 150 and 200 fps, as your monitor can only display 60 fps at most anyway. I really doubt you can tell the difference between those two framerates.
I get something like 10-70 fps, and that's bad. Not because I think that fps is unplayable; it's because the fps fluctuates a lot. Why?
I think I stated this before: you can't really see a difference past a certain point, but you can feel one. I think the jumping up and down is what causes the most problems. Is it unplayable? Well, if it's bouncing up and down like an ADHD child on a bag of pure sugar, probably. Hence I think people with large frame-rate swings like that should get Optifine and turn on Smooth FPS. That will lower the framerate, yes, but it will stabilize it. It can be caused by having too many... how do I say it, "ticks"? Or at least "do something on tick" handlers. It's like trying to read while having bad hiccups.
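A rough sketch of that stabilizing idea (my own illustration, not Optifine's actual code): a "Smooth FPS"-style option trades peak framerate for steadier pacing by padding every frame that finishes early up to a fixed time budget.

```python
# Illustration only -- not Optifine's real implementation. The idea: trade
# peak fps for steadier pacing by padding fast frames up to a fixed budget.

def pace_frames(frame_times_ms, budget_ms=25.0):
    """Pad each frame up to a 25 ms budget (i.e. cap at 40 fps)."""
    return [max(t, budget_ms) for t in frame_times_ms]

raw = [5, 6, 40, 5, 35, 6]           # fluctuating frame times in ms
paced = pace_frames(raw)             # [25, 25, 40, 25, 35, 25]

to_fps = lambda ts: [round(1000.0 / t) for t in ts]
print(to_fps(raw))    # swings between ~25 and 200 fps: feels like stutter
print(to_fps(paced))  # mostly 40 fps with shallow dips: lower, but stable
```

The framerate number goes down, but the frame-to-frame variation, which is what you actually feel, shrinks a lot more.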
 

Meldiron

New Member
Jul 29, 2019
641
0
0
I notice framerate drops from 150 to 100 easily, and I use a 60 Hz monitor. I can feel the difference in most games I play. Trust me, when I record and it drops to around 100 it feels a bit laggy; this is very noticeable in GFX-intense games like BF3 and Crysis 2.
Any chance you're using SLI/Crossfire? If so, isn't it really 150 and 100 proper fps due to microstuttering?
 

slay_mithos

New Member
Jul 29, 2019
1,288
0
0
Let us cut the "fps is noticeable or not" debate; above 60 it tends to be a case-by-case thing.

The only thing that really benefits from a crazy-high fps is the controls, as they are far tighter when input is checked every 50 ms instead of every 200 ms, and anyone trying to do fast inputs will notice.

That said, Minecraft is not designed to be a fast-paced game, and it sometimes doesn't take fast inputs well, resulting in stuck keys and other double-click problems.
It matters a lot, however, in games like first-person shooters, platformers (Meat Boy?) or action games, and basically any game where precision and fast reactions are rewarded.

The problem we have here is a low and/or fluctuating fps that makes the game a little hard to play, because timings and delays are no longer constant (skeletons shooting arrows, furnace operations, vegetable growth, etc.).
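To put a number on "no longer constant" (a hand-rolled illustration, not from any game's code): two traces with the same average fps can feel completely different, because what you feel is the longest frames, not the average.

```python
# Hand-rolled illustration: identical *average* fps, very different feel.
from statistics import mean

steady_fps = [40] * 8          # a constant 40 fps
bouncy_fps = [10, 70] * 4      # the "10-70 fps" case from this thread

print(mean(steady_fps), mean(bouncy_fps))   # both average 40 fps

# Smoothness tracks the worst frame time, not the average framerate:
worst_ms = lambda trace: max(1000.0 / f for f in trace)
print(worst_ms(steady_fps))   # 25 ms per frame, every frame
print(worst_ms(bouncy_fps))   # 100 ms hitches: the part that hurts gameplay
```

So "average 40 fps" tells you almost nothing; the 100 ms hitches are what break your timing against skeletons.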

For low-end computers, the fact that Minecraft now also runs an internal server will hurt performance, as it needs to do more computation, and even the mighty Optifine won't help with that.
The only thing you can do is stick with the default texture pack and install Optifine; it will usually boost your fps considerably.
 

heavy1metal

New Member
Jul 29, 2019
104
0
0
Tekkit vs FTB: I've never used Tekkit but have played many flavors of mods, and I host a vanilla Bukkit server (since beta 1.8). I can tell you the FPS and TPS from 1.2.5 to 1.3 dropped horribly. My FPS in 1.2.5 was over 200, and in 1.4.5 it hovers around 80 (both using 128x texture packs). My TPS went from a solid 20 to 5-10 (45-player server). All this is vanilla.
Mods in 1.2.5 using Optifine - I was over 100 all day long.
Mods in 1.4.2 w/o Optifine (no texture pack) - hovers around 50-60
Mods in 1.4.2 w/ Optifine - hovers around 80-90
Mods in 1.4.2 w/ Optifine + texture pack (128x) - stays at 80

Optifine does not play well with ATI drivers. It also does not like driver settings that spread the card's OpenGL work across multiple threads.
The option "Video Settings -> Chunk Loading" switches between Standard, Smooth and Multi-Core chunk loading. When using "Chunk Loading: Multi-Core" make sure to Disable "Threaded Optimization" / "OpenGL Threading" in the graphics card control panel (example). For best results disable it globally, not only for java.exe or minecraft.exe.
http://www.minecraftforum.net/topic...d4-fps-boost-hd-textures-aa-af-and-much-more/

Summary: YMMV with Optifine. I have to use "Smooth" as opposed to "Multi-Core", which used to work fine for me in 1.2.5. Multi-Core crashes the game every 5 minutes, but during that time I'm around 150 fps.


Human eye: who cares what the exact reason is for enjoying 60 fps? I can tell you that once I dip below 30 fps I notice tearing and jumping. Why? Uh... probably because it's MISSING FRAMES. You can argue the science all day long, but the reality is that anything below 25 fps looks awful. Go download Handbrake, convert a movie limited to 20 fps, and tell me how much you enjoy it.

Look at the 10fps difference. Crap is crap.
 

GreenWolf13

New Member
Jul 29, 2019
188
0
0
Considering that most movies are in 24 fps, you're wrong. There is literally no (noticeable) difference between 30 and 60 fps. At least, not one you can see.
 

heavy1metal

New Member
Jul 29, 2019
104
0
0
Pretty sure I said anything below 25, and nothing about the difference between 30 and 60. We can agree here: there isn't a difference between 30 and 60. The only reason you want 60 fps in a game is to keep a buffer for when graphics-intensive events cause a dip in fps. There is, however, a noticeable difference between 25 and 30 in video games. I think it's safe to say video games have faster action sequences than movies, and maintaining clear motion requires a higher fps. 15 fps (above your suggested 12 fps) is horrible; the motion is so jerky that I'm personally unable to enjoy whatever it is I'm watching.

Well, considering more of what I watch is TV in the US rather than film (meaning movie theaters), you're in fact wrong. Everything here is broadcast at ~30 fps. At this point I'm beginning to wonder if you're just trolling; if so, well done, sir. Entirely derailed the thread, lol.


60i is an interlaced format and is the standard video field rate per second for NTSC television (e.g. in the US), whether from a broadcast signal, DVD, or home camcorder. This interlaced field rate was developed separately by Farnsworth and Zworykin in 1934,[12] and was part of the NTSC television standards mandated by the FCC in 1941. When NTSC color was introduced in 1953, the older rate of 60 fields per second was reduced by a factor of 1000/1001 to avoid interference between the chroma subcarrier and the broadcast sound carrier. (Hence the usual designation "29.97 fps" = 30 frames (60 fields)/1.001)
http://en.wikipedia.org/wiki/Frame_rate


Early silent films had a frame rate from 14 to 24 FPS which was enough for the sense of motion, but it was perceived as jerky motion.
http://en.wikipedia.org/wiki/Frame_rate

Thomas Edison said that 46 frames per second was the minimum: "anything less will strain the eye."[4][5] In the mid- to late-1920s, the frame rate for silent films increased to about 20 to 26 FPS.[4]
Page 284 from the book: "Early Cinema: Space, Frame, Narrative"


Television in the U.S., for example, is based on the NTSC format, which displays 30 interlaced frames per second (60 fields per second).
http://www.webopedia.com/TERM/F/fps.html
 

GreenWolf13

New Member
Jul 29, 2019
188
0
0
If I'm trolling, it's possibly the greatest troll I've ever done, and it's completely unintentional. I'm pretty sure I'm just wrong/mistaken. Thank you for enlightening me and correcting my mistake.
 

vineet

New Member
Jul 29, 2019
190
0
0
I have Optifine, yet it still fluctuates. I'll try manually putting the mods into vanilla Minecraft.
 

GuntherDW

New Member
Jul 29, 2019
12
0
0
Considering that most movies are in 24 fps, you're wrong. There is literally no (noticeable) difference between 30 and 60 fps. At least, not one you can see.

There's just one crucial difference: movies have something called "motion blur". They add it so that even though it's 24 fps, it doesn't feel like 24 fps.
Try watching a movie without the motion blur effect at 30 fps and 60 fps.
 

slay_mithos

New Member
Jul 29, 2019
1,288
0
0
{redacted for name calling}

Both sides (my post included) are purely irrelevant to the topic at hand.
You can see the difference, great; you can't, good for you.

The fact remains that in games you will feel the difference through the controls, and the matter at hand is all about the framerate hurting the gameplay, not whether the user can see the difference between 30 and 60 frames per second with the naked eye.

I know that pseudo-moderation is not appreciated, but the off-topic has been going on for quite a while now, and little is being said about the actual topic and possible solutions.
 

Jadedcat

New Member
Jul 29, 2019
2,615
3
0
The original question seems to be about Optifine. Optifine cannot be distributed through the launcher because the Optifine author does not give out permission to anyone. I have used Optifine with all of the FTB packs without an issue. We cannot fix a problem that is not related to a mod we have put in the pack. I am assuming you don't just want HD textures. I personally get my best fps with no Optifine: I use a 64x texture pack without it, because Forge supports 64x and lower. That, with my settings set to Max FPS, gives me about 180 fps on a decently populated server, and about 75 fps when streaming.

Basically we cannot include optifine. No one can as optifine's author will not give permission. It works for most people. And we cannot troubleshoot a mod we are not distributing.

Please stay on topic or risk this thread being locked.
 

heavy1metal

New Member
Jul 29, 2019
104
0
0
Just to retouch: the only issue I've seen with Optifine is using the Multi-Core setting, which causes the client to crash. A fix for you nvidia users out there:
The option "Video Settings -> Chunk Loading" switches between Standard, Smooth and Multi-Core chunk loading. When using "Chunk Loading: Multi-Core" make sure to Disable "Threaded Optimization" / "OpenGL Threading" in the graphics card control panel (example). For best results disable it globally, not only for java.exe or minecraft.exe. (source - http://goo.gl/T385z)
 

Oakley999

New Member
Jul 29, 2019
1
0
0
Apart from Optifine to boost fps, I recommend downloading the latest 64-bit Java and removing every older Java version.
http://www.java.com/en/download/manual.jsp
Using this link, download the Windows Offline (64-bit) installer, then remove the rest using Control Panel - Programs - Uninstall.
Then, after doing this, use Task Manager to change the Java process priority to High. It might not work for some people, but I can now play with Normal render distance at a boosted fps, compared to rubbish fps on Small.
 

BigDaveNz

New Member
Jul 29, 2019
48
0
0
A few posts ago people were talking about fps. Between 23 and 24 fps is what the average human eye "sees", meaning what the brain decodes as images. However, you perceive a lot more than that, especially if you work with PCs or are a gamer: you have a higher chance of noticing high framerates. A few people can notice if the fps drops below 120. I don't mean that they "see" every frame; they can just tell something is not right. Also, 90% of monitors max out at 60 Hz, and some older ones at 50 Hz; a few gaming monitors do go up to 120 Hz. Much video is shot at a higher framerate and rendered down to 30 or 25 fps depending on whether it's NTSC or PAL.

How is this related to Minecraft?
People say "it's okay if your fps is only 30". That may be the case, but as soon as there is any sort of lag spike you notice it straight away. At higher framerates you don't notice the lag spikes as much.

Also, FTB has a lot more mods than Tekkit/Technic, and the mods have updated and increased the amount of RAM/CPU they use. Chances are that if you are struggling to play when you load a new world, you are going to have a really tough time with any sort of automation. When I am in the wilderness I get approximately 300-400 fps with Extreme render distance and a 512x pack; inside a workshop-type area, though, I can drop to below 60. I have a friend whose PC is similar to the OP's, and his fps gets down to 1 fps occasionally because of how demanding FTB is. Even vanilla Minecraft has gotten a lot more resource-intensive in the last 5-6 updates, which sucks for a lot of people who can't afford a decent computer.

The suggestions in this thread so far are pretty much all that you can do, and if you still can't play, chances are you need a less resource-intensive modpack. 64-bit Java, Optifine, and changing all the settings in Optifine (especially particles) can make a huge difference. Also, if you have a lot of IC2 machines in an area and it's causing lag, disable IC2 sounds in the IC2 config; that can take a PC from 1 fps to 30, depending on how many machines are in the area.
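For the IC2 sounds tip, the edit looks something like this. This is a hypothetical sketch: the exact file location and key name vary between IC2 versions, so look for the sound-related entry in your own config rather than copying this verbatim.

```
# config/IC2.cfg -- hypothetical sketch; key names differ between IC2 versions
general {
    # turning machine sounds off stops the constant per-machine sound handling
    B:enableSounds=false
}
```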
 

Bigglesworth

New Member
Jul 29, 2019
1,072
0
1
I think I stated this before: you can't really see a difference past a certain point, but you can feel one. I think the jumping up and down is what causes the most problems. Is it unplayable? Well, if it's bouncing up and down like an ADHD child on a bag of pure sugar, probably. Hence I think people with large frame-rate swings like that should get Optifine and turn on Smooth FPS. That will lower the framerate, yes, but it will stabilize it. It can be caused by having too many... how do I say it, "ticks"? Or at least "do something on tick" handlers. It's like trying to read while having bad hiccups.

No, there is no difference. MC works on 20 ticks per second when interacting with things. You're experiencing a placebo effect.

How is this related to Minecraft?
People say "it's okay if your fps is only 30". That may be the case, but as soon as there is any sort of lag spike you notice it straight away. At higher framerates you don't notice the lag spikes as much.

You're not making sense here. If a system is capable of running at 200 fps but you limit it to 60 for the monitor refresh, it isn't magically going to spike lower than if you hadn't limited it to 60. A lag spike that lowers fps to 30 is going to lower it to 30 regardless of whether you capped the high end at 60 or 600! Without the cap the swings just look *larger*. Meaning: if a system is capable of 200 fps and you limit it to a 60 Hz refresh rate, you will see FAR fewer apparent lag spikes and a FAR more stable framerate. Any lag spike that lowers a system's framerate to 10-20 is going to lower it to that whether you limit the high end or not.
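That cap argument can be shown with a toy trace (my own illustration, not measured data): a 60 fps cap only clips the highs and does nothing to the lows, so a lag spike is equally deep either way.

```python
# Toy illustration: a 60 fps cap clips the highs, leaves the lows untouched.

def cap_fps(trace, cap=60):
    return [min(f, cap) for f in trace]

uncapped = [200, 210, 30, 195, 200]   # hypothetical trace with one lag spike
capped = cap_fps(uncapped)            # [60, 60, 30, 60, 60]

print(min(uncapped), min(capped))   # the spike bottoms out at 30 in both cases
print(max(capped) - min(capped))    # but the capped trace swings far less
```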
 

nallar

New Member
Jul 29, 2019
270
0
0
No, there is no difference. MC works on 20 ticks per second when interacting with things. You're experiencing a placebo effect.



You're not making sense here. If a system is capable of running at 200 fps but you limit it to 60 for the monitor refresh, it isn't magically going to spike lower than if you hadn't limited it to 60. A lag spike that lowers fps to 30 is going to lower it to 30 regardless of whether you capped the high end at 60 or 600! Without the cap the swings just look *larger*. Meaning: if a system is capable of 200 fps and you limit it to a 60 Hz refresh rate, you will see FAR fewer apparent lag spikes and a FAR more stable framerate. Any lag spike that lowers a system's framerate to 10-20 is going to lower it to that whether you limit the high end or not.
That would be correct, except Minecraft is bad: setting the framerate to anything other than Performance will make it slower, even though those options are just supposed to be framerate caps. Capping the framerate with another program, or with Optifine's vsync, may work correctly.
 

Meara

New Member
Jul 29, 2019
30
0
0
No, there is no difference. MC works on 20 ticks per second when interacting with things. You're experiencing a placebo effect.
Since when does the "tick" effect only apply when interacting with things? I believe, for example, that wheat works on a tick process, meaning it calculates its growth per tick. Likewise, a logger will check a block to see if there are logs in its area every tick, for as long as that machine is loaded; that uses some resources (maybe not a lot, but something). What I WAS talking about is that those "checks per tick" begin to add up. If I didn't state it well enough, read this.
Lastly, I wasn't just referring to Minecraft alone. What I said before is that I used to play games at .5 fps and now run easily at 100-200, and I can see and feel the difference in the responsiveness of my machine.
I don't see how it could possibly be called a placebo effect. If that were so, we'd all still have Pentium 4s.
 

LazDude2012

New Member
Jul 29, 2019
169
0
0
Wheat is on world ticks. World ticks are random, but when they do happen, they happen during a game tick.
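A sketch of how that works, as I understand vanilla's behavior (the constant 3 is vanilla's default random tick speed; check the wiki for your version): each game tick, a handful of random positions in every 16x16x16 chunk section receive a random tick, and a wheat block only advances a growth stage when it happens to be picked.

```python
# Hedged sketch of Minecraft-style random block ticks, not decompiled code.
import random

SECTION_BLOCKS = 16 * 16 * 16    # blocks in one chunk section
RANDOM_TICK_SPEED = 3            # vanilla default; a gamerule in later versions

def random_ticks(rng):
    """Positions picked for random ticks in one section, for one game tick."""
    return [rng.randrange(SECTION_BLOCKS) for _ in range(RANDOM_TICK_SPEED)]

# How long until one particular wheat block gets picked? It varies wildly,
# which is why crop growth looks random even though game ticks are regular.
rng = random.Random(0)
wheat = 1234
waited = 0
while wheat not in random_ticks(rng):
    waited += 1
print(waited, "game ticks before this wheat block got a random tick")
```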
 

Meara

New Member
Jul 29, 2019
30
0
0
Wheat is on world ticks. World ticks are random, but when they do happen, they happen during a game tick.
...and? That still doesn't change the point.
Edit: Anyway, I think I'll refrain from saying any more on this topic, since we are no longer discussing WHY the fps changed between Tekkit/Technic and FTB. I think that question was answered a long time ago, regardless of the Optifine bit.

We shouldn't keep poking this dead cat with a stick.
 