So you're delusional? Because there is no (noticeable) difference between HD and "normal" TV.

Hehe, I'm one of those people who can tell a big difference between HD and normal TV.
I don't know if you've ever played a game on a normal TV vs. an HD one, but there's so much graininess on a normal TV.
GreenWolf13 said: Source: Wikipedia (Frame Rate) - Background: The human eye and its brain interface, the human visual system, can process 10 to 12 separate images per second, perceiving them individually,[1] however the threshold of perception is more complex, with different stimuli having different thresholds: the average shortest noticeable dark period, such as the flicker of a cathode ray tube monitor or fluorescent lamp, is 16 milliseconds,[2] while a single-millisecond visual stimulus may have a perceived duration between 100 ms and 400 ms due to persistence of vision in the visual cortex. This may cause images perceived in this duration to appear as one stimulus, such as a 10 ms green flash of light immediately followed by a 10 ms red flash of light being perceived as a single yellow flash of light.[3] Persistence of vision may also create an illusion of continuity, allowing a sequence of still images to give the impression of motion. Early silent films had a frame rate from 14 to 24 FPS, which was enough for the sense of motion, but it was perceived as jerky motion. By using projectors with dual- and triple-blade shutters the rate was multiplied two or three times as seen by the audience. Thomas Edison said that 46 frames per second was the minimum: "anything less will strain the eye."[4][5] In the mid- to late-1920s, the frame rate for silent films increased to about 20 to 26 FPS.[4]
Didn't read this. Read the title. Read OptiFine, more than once.
Optifine's developer has not given FTB permission for it to be used in this pack (from my understanding).
This pack is still in its beta stages.
Actually it's causing a whole pile of problems.

True, but they were stating that they manually added Optifine. I don't think Optifine's developer has a problem with people manually installing Optifine onto their Feed the Beast installs.
Actually it's causing a whole pile of problems.
Waiting to use Optifine until the pack is complete (and the configs are done being changed, all mods are added, and all files are in) is the safest bet if you're having problems with it.
Otherwise, go ahead and use it. It seems to work for most people; personally, I'm OK without it.
(Also, since you're picking up my post from over 2 weeks ago: I've since learned more and have a modified viewpoint.)
Not a problem. I'm not a genius with all of this, but personally, I'd rather drive a car that's been completely put together, rather than one that's still a WIP, if you know what I mean

Sorry, I probably should've been a little clearer about what I meant. I meant that the creators of Optifine shouldn't have a problem with people installing Optifine onto their FTB installs, since it's not much different from installing it onto vanilla Minecraft.
(Sorry about the old quoting.)
Whoa whoa whoa... Who said 18 fps is what the human eye can register? Last I heard it was 24... still a load of BS if you ask me.
FPS isn't only a visual thing; it also has to do with how "fluid" the game plays. Meaning the FEEL... how well it responds to input.
However, I do think people whining about having Minecraft at 30 fps and saying it's Unplayable... <Sarcastic> really gives them a leg to stand on. </Sarcastic> I have played more intensive games at a lower frame rate on a Sony VAIO laptop from (geez... what was it... 8 years ago?). 0.5 FPS is unplayable!
My current specs are 8 GB DDR3, a Phenom II X4 Black Edition, an ATI Radeon 6700, and an old WD 7200 RPM 200 GB HDD (well... that's the master). I'm going no lower than 41 fps, sitting in my warehouse where my logistics minion does 99.9% of my work, and guess what: NO OPTIFINE! With 64x textures to boot, and I'm perfectly OK with it.
So, in conclusion, the ON-TOPIC thing I will state is a return to my original post a few pages back: there are too many variables to determine WHY this is occurring, meaning scientific analysis cannot happen without further evidence.
You could go down the list of all the usuals. Are your drivers up to date? Is Java up to date? Have you cleaned out your temp files? Defragged in the last year? Have you thought about a nice once-a-year reformat? (Not trolling, just stating that "clean computers go faster.") Virus free and registry in top condition?
@Greenwolf
Really, if you can't see a difference... not to be totally mean here, but you might wanna get your eyes checked. Just sayin'...
'Cause let's break down the facts here.
240p is antenna quality? (I can't even look it up to verify)
360p (I think this is TV... like old cable TV from before they had boxes, meaning it's definitely not digital quality)
480p VHS (the utmost best quality VHS can produce) Oh, and this is SD
720p DVD (low HD)
1080p Blu-ray (true HD)
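Quick pixel math for anyone who wants to check the numbers themselves. This is just a rough sketch: the widths below assume a 16:9 frame, and the exact figures for old analog formats vary, so don't treat it as a broadcast spec.

```java
// Back-of-the-envelope pixel counts for the resolutions being argued about.
// Widths assume a 16:9 frame; older SD content is really 4:3, so these are
// rough numbers, not official specs.
public class PixelMath {
    public static void main(String[] args) {
        int[][] modes = {
            {426, 240},   // "240p"
            {640, 480},   // 480p SD
            {1280, 720},  // 720p HD
            {1920, 1080}, // 1080p "Full HD"
        };
        for (int[] m : modes) {
            System.out.printf("%dx%d = %,d pixels%n", m[0], m[1], m[0] * m[1]);
        }
        // 1920*1080 / (640*480) is about 6.75, so 1080p pushes roughly 6-7x
        // the pixels of 480p -- which is why the jump is easy to see.
    }
}
```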
So you're telling me there's no difference between VHS and Blu-ray? No difference between a PS3 and a Super Nintendo? If you said yes to either of those questions, then either u-b-trollin' or you need to go back to line one after your name. OH! If you're basing your knowledge off of games you have played: only PC and PS3 play HD games, and PC will lower your resolution from the top setting if you Auto-Detect.
@CoderJ
Actually, current video is at 29.97 (or 30) FPS. So if 30 fps is unplayable, then movies must be un-watchable. However, I will admit... 60-120 fps makes me feel like I have the 1337est computer in the world... like driving an Audi (fantasizes).
Not a problem. I'm not a genius with all of this, but personally, I'd rather drive a car that's been completely put together, rather than one that's still a WIP, if you know what I mean
Also, <3 the avatar you've got there
No way! I like you, and just so you know, Audis are amazing; however, I totaled my mom's... into a guard rail, sideways... just saying. Audi A4 2.8L Quattro... no more.
Kinda what I was getting at. I'm assuming the mixing of single-player and Minecraft's somewhat unreliable multi-player, and some of the mods that were pretty much exclusively single-player going multi-player, makes really too many problems. It just adds too many Xs, Ys, and Zs to this equation.

As a matter of fact, Optifine gave me loads of problems in my custom modpack build for personal use for MC 1.4.5: lots of spam crap in the MultiMC console every time I was inside a Mystcraft age, and console spam = FPS drop... removed Optifine and all fixed.
My computer is quite old (Core 2 Duo 3 GHz / Nvidia GTX 465), but without Optifine I have 75~125 FPS, so if you can avoid using Optifine, don't use it, IMO.
Since 1.3, Minecraft runs an internal server to play single-player, so Tekkit (1.2.5) is a whole other story right now.
No way!
I hope you're OK, 'cause I bet your mom went off the deep end!
Kinda what I was getting at. I'm assuming the mixing of single-player and Minecraft's somewhat unreliable multi-player, and some of the mods that were pretty much exclusively single-player going multi-player, makes really too many problems. It just adds too many Xs, Ys, and Zs to this equation.
Also, if I understand the relationship between server and client correctly...
The server is the one running all the code and data. The client just receives a message saying, "Display this stuff and 'den 'dat." So it's less resource-intensive client side. I mean, most of what Minecraft is heavy on is RAM. Which is: generation (RAM and CPU, server side), calculations (which are handled by the server's CPU), and graphics rendering (client GPU)... ::shrug::
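Just to illustrate the split I mean, here's a toy sketch. This is not Minecraft's actual netcode or packet format (the class names here are made up), just the general "server simulates, client displays" idea:

```java
// Toy sketch of a server-authoritative loop -- NOT Minecraft's real protocol.
// The server does the simulation; the client only draws what it's told changed.
import java.util.ArrayList;
import java.util.List;

class BlockUpdate {
    final int x, y, z, blockId;
    BlockUpdate(int x, int y, int z, int blockId) {
        this.x = x; this.y = y; this.z = z; this.blockId = blockId;
    }
}

class ToyServer {
    // Server side: world generation + game logic (CPU/RAM heavy).
    List<BlockUpdate> tick() {
        List<BlockUpdate> updates = new ArrayList<>();
        updates.add(new BlockUpdate(10, 64, 10, 1)); // pretend something changed
        return updates;
    }
}

class ToyClient {
    // Client side: just render whatever the server said changed (GPU heavy).
    void render(List<BlockUpdate> updates) {
        for (BlockUpdate u : updates) {
            System.out.printf("draw block %d at (%d,%d,%d)%n", u.blockId, u.x, u.y, u.z);
        }
    }
}

public class ToyLoop {
    public static void main(String[] args) {
        ToyServer server = new ToyServer();
        ToyClient client = new ToyClient();
        for (int tick = 0; tick < 3; tick++) {
            client.render(server.tick()); // "display this stuff and 'den 'dat"
        }
    }
}
```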
----- In General ----
Anyway, kinda just thought of another thing: people could try making sure the Smooth FPS setting is on? The only other time I could think FPS at 20-40 would be really unplayable is if it's jumping a lot.
Considering your monitor is most likely only 60Hz, there is literally no difference between 200 and 150 fps, as your monitor can only display 60 fps at most anyway. I really doubt you can tell the difference between those two framerates.

There is a huge difference. I have a 1080p laptop and my old 768 one (I believe) hurts my eyes. Go on YouTube: go to 240p (240 pixels) and go up to 1080p. More pixels = better picture and sharpness. So you are delusional. And I know for a fact I can tell when my fps drops from 200 to 150; I feel like I'm lagging. I can notice... trust me. And yeah, redwood forests drop me down to 170 from my usual 200 on Fancy. I have a treehouse on Beastcraft in the forests.
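For what it's worth, here's the frame-time arithmetic everyone is arguing about. It's just the math; it doesn't settle what anyone can or can't perceive:

```java
// Frame times for the rates in this thread. A 60Hz monitor refreshes every
// ~16.7 ms, so 150 fps vs 200 fps is only a ~1.7 ms difference per frame --
// small, but not zero, which is where the "feel"/responsiveness argument
// from earlier in the thread comes in.
public class FrameTimes {
    public static void main(String[] args) {
        double[] rates = {30, 60, 150, 200};
        for (double fps : rates) {
            System.out.printf("%.0f fps -> %.2f ms per frame%n", fps, 1000.0 / fps);
        }
        System.out.printf("monitor refresh at 60Hz -> %.2f ms%n", 1000.0 / 60);
    }
}
```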