All I can say is this: When it comes to computer or TV screens, you really can't tell the difference between, say, 120 and 240 Hz unless there's something with a lot of contrast moving very quickly across the screen. (At 120 Hz each frame is on screen for roughly 8 ms, versus roughly 4 ms at 240 Hz, so the extra frames only matter during fast motion.) In video games, such as Minecraft, this can happen if you flick your mouse quickly. In TV shows and movies, it's rarely visible at all.
When my folks got our current TV a couple of years ago, we did compare the different models the store had on display, some of which ran at 120 Hz and others at 240. The only time we could see any difference at all was when they showed clips of football games (that is, American football, for you foreign soccer-lovers): a whole bunch of nearly vertical white field lines on a dark green background, all zipping horizontally across the screen as the camera panned from one end of the stadium to the other.
We don't watch football (or much of any sports at all, for that matter), so we opted for the less expensive 120 Hz model.
Disclaimer: This is coming from a guy who, for the past few years, has consistently run Minecraft at 30 Hz. I run a laptop (a decently powerful one, at that), and while it can handle higher frame rates just fine, pushing them makes the GPU hot enough that the aluminum casing between the keys burns my fingers if I miss a key. I can certainly tell the difference between 30 Hz and 60 Hz, but 30 works well enough and doesn't burn my fingers.