Eloraam has not disappeared. (random half-related topics)


Hydra

New Member
Jul 29, 2019
1,869
0
0
That said, I generally have to agree that the language choice seems odd. Worse is that the engine apparently does some derpy, heavy-handed things like the full texture-map push every tick we've heard about. But the use of a JVM, which cannot take full advantage of hardware virtualization, added another layer to what is still viewed as more of an interpreted language (a la Python vs. C/C++), and there's the lack of hardware acceleration for the same...

Complete and utter nonsense. Minecraft uses OpenGL, so it's by definition hardware accelerated. The reason it's quite heavy is simply that simulating an entire world is a lot of work for your CPU (not GPU). Have a look at Dwarf Fortress; that game is even more CPU-bound thanks to all the pathfinding and liquid simulations.

Aside from that, Java uses JIT compiling. Porting it to C++ would hardly have an impact.

It would be nice if people would not talk about stuff they know so little about. I have over 15 years of experience programming in Java and also wrote my own Minecraft viewer. You can't compare MC to typical shooters that look very pretty but don't have to cope with simulating a fully destructible world.
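For what it's worth, you can actually watch HotSpot turn hot methods into native code while a program runs. A tiny sketch, assuming the stock HotSpot JVM (the class and numbers here are made up purely for illustration):

```java
// Run with: java -XX:+PrintCompilation JitDemo
// Each line HotSpot prints is a method being JIT-compiled to native code.
public class JitDemo {
    static long sum(int n) {
        long s = 0;
        for (int i = 0; i < n; i++) {
            s += i * 31L;
        }
        return s;
    }

    public static void main(String[] args) {
        long total = 0;
        // Call the method enough times that the JIT considers it "hot".
        for (int i = 0; i < 100_000; i++) {
            total += sum(1_000);
        }
        System.out.println(total);
    }
}
```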
 

Dravarden

New Member
Jul 29, 2019
1,693
0
0
And render the six faces of each block, and animations, and mobs, and entities, and chunk loading... most sandbox games are resource intensive, just because of the way they are.
 

whizzball1

New Member
Jul 29, 2019
2,502
0
0
Well, to state something from earlier in a different thread: Minecraft sends every single texture in the game to the graphics card.
Every tick. 20 ticks a second.
Sixty seconds a minute.
Sixty minutes an hour.
24 hours a day.
7 days a week.
52 1/7 weeks a year.
Etc., etc.

That is the worst, most brute-force process I have ever heard of in my entire life.
Especially with about a billion mods. No wonder I get lag.
Better tactic: send only the textures that are currently visible to the graphics card each tick, and spread that out a bit in case the player moves really fast within one tick. That would be much, much less laggy. However, we all know that Mojang is dedicated to creating processes that cause massive lag. And also never fixes them.
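Just to put a rough number on that, here's some back-of-the-envelope arithmetic; the atlas size is an assumption, since the real one depends on the texture pack and mods:

```java
// Back-of-the-envelope: cost of re-uploading a whole texture atlas every tick.
// The atlas size below is an assumption for illustration, not Minecraft's actual value.
public class AtlasUploadEstimate {
    public static void main(String[] args) {
        int width = 4096, height = 4096;   // assumed atlas dimensions with a big modpack
        int bytesPerPixel = 4;             // RGBA, 8 bits per channel
        int ticksPerSecond = 20;

        long bytesPerUpload = (long) width * height * bytesPerPixel;
        long bytesPerSecond = bytesPerUpload * ticksPerSecond;

        System.out.printf("One upload: %.1f MiB%n", bytesPerUpload / (1024.0 * 1024.0));
        System.out.printf("At 20 ticks/s: %.1f MiB/s pushed over the bus%n",
                bytesPerSecond / (1024.0 * 1024.0));
    }
}
```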
 

SteveTech

New Member
Jul 29, 2019
144
0
0
Well, to state something from earlier in a different thread: Minecraft sends every single texture in the game to the graphics card.
Every tick. 20 ticks a second.
Sixty seconds a minute.
Sixty minutes an hour.
24 hours a day.
7 days a week.
52 1/7 weeks a year.
Etc., etc.

That is the worst, most brute-force process I have ever heard of in my entire life.
Especially with about a billion mods. No wonder I get lag.
Better tactic: send only the textures that are currently visible to the graphics card each tick, and spread that out a bit in case the player moves really fast within one tick. That would be much, much less laggy. However, we all know that Mojang is dedicated to creating processes that cause massive lag. And also never fixes them.

Whoa whoa whoa! Hold up!
I'm no expert, I'm but a hobbyist really, but one should not have to send all the textures each cycle. There is a reason graphics cards have memory (yes, they need it to do stuff, but it is also used to store textures, vertices, maybe shaders, and it is far faster than the data bus used to communicate with the card). You should be able to load the textures into that memory and promptly stop worrying about them (unless your graphics card is severely lacking in the memory department, in which case you should lower your screen resolution or texture resolution, or buy a better card).

Textures are one of those things you should be able to preload and forget about, freeing that precious data bus for important things, like which of the millions of vertices has moved and to where.
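A minimal sketch of that preload-and-forget approach, assuming an OpenGL context already exists and using LWJGL-style GL11 calls (the class and method names are illustrative, not Minecraft's own):

```java
import java.nio.ByteBuffer;
import java.nio.IntBuffer;
import org.lwjgl.BufferUtils;
import static org.lwjgl.opengl.GL11.*;

// Sketch: upload a texture to GPU memory once, then just bind it each frame.
// Assumes a current OpenGL context (e.g. created elsewhere via LWJGL).
public class TextureCache {
    private int textureId;

    // Called once at load time: the pixel data crosses the bus here and never again.
    public void upload(ByteBuffer rgbaPixels, int width, int height) {
        IntBuffer ids = BufferUtils.createIntBuffer(1);
        glGenTextures(ids);
        textureId = ids.get(0);

        glBindTexture(GL_TEXTURE_2D, textureId);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, width, height, 0,
                GL_RGBA, GL_UNSIGNED_BYTE, rgbaPixels);
    }

    // Called every frame: only a handle is passed, no pixel data moves over the bus.
    public void bind() {
        glBindTexture(GL_TEXTURE_2D, textureId);
    }
}
```

After upload(), the per-frame cost is just the bind; the pixels stay resident in video memory.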
 

RavynousHunter

New Member
Jul 29, 2019
2,784
-3
1
Yeaaaaaaah, hearing that little bit from one of DW20's videos about the texture thing seriously made me WTF. Just...why? Might make sense for animated textures, I guess, but the regular, static textures that make up, like...99% of Minecraft's graphics? ...Whatchu talkin bout, Notch?

[ETA]

Complete and utter nonsense. Minecraft uses OpenGL, so it's by definition hardware accelerated. The reason it's quite heavy is simply that simulating an entire world is a lot of work for your CPU (not GPU). Have a look at Dwarf Fortress; that game is even more CPU-bound thanks to all the pathfinding and liquid simulations.

Aside from that, Java uses JIT compiling. Porting it to C++ would hardly have an impact.

It would be nice if people would not talk about stuff they know so little about. I have over 15 years of experience programming in Java and also wrote my own Minecraft viewer. You can't compare MC to typical shooters that look very pretty but don't have to cope with simulating a fully destructible world.

I have over a decade of general programming experience myself, and I know you can't compare Minecraft to, say, Call of Duty. I never did, because I know it's stupid to do so. Yes, there's a lot of shit to track in Minecraft; again, I never said there wasn't. My problem, primarily, is with the Java Virtual Machine itself. It, for me, has never been as efficient as the .Net VM, and neither is comparable to C++, because no matter what, you will always have that extra layer of abstraction with an interpreted language vs. a compiled one, and that means extra overhead.

Again, I'm not saying they should drop everything and go to C++, I'm just wondering why they didn't, seeing as that's what damn near every game is made with nowadays, because it's faster and offers finer control than higher-level languages like Java and .Net. Of course, it also comes down to the skill of the programmers in question, and doing something as pointless and resource-intensive as flushing the entire texture map to the video card every tick... kinda makes me wonder exactly where they are in that regard. Beyond me, certainly, but even I know such a thing is nothing more than a pointless, wasteful kludge.
 

steelblueskies

New Member
Jul 29, 2019
141
0
0
Complete and utter nonsense. Minecraft uses OpenGL, so it's by definition hardware accelerated. The reason it's quite heavy is simply that simulating an entire world is a lot of work for your CPU (not GPU). Have a look at Dwarf Fortress; that game is even more CPU-bound thanks to all the pathfinding and liquid simulations.

Aside from that, Java uses JIT compiling. Porting it to C++ would hardly have an impact.

It would be nice if people would not talk about stuff they know so little about. I have over 15 years of experience programming in Java and also wrote my own Minecraft viewer. You can't compare MC to typical shooters that look very pretty but don't have to cope with simulating a fully destructible world.

Nods. Caveats noted and ignored, apparently. My examples ranged from comparison to DATABASE SOFTWARE, not shooters, which have the advantage of essentially precomputing things and having had human optimization passes done. I had already noted the CPU-bound nature (and the lack of multicore development, where the core load could be parallelized well) while highlighting the irrational additional bus stress of constantly pushing large volumes of data needlessly, along with ponderings about actual Java hardware acceleration bandied about both in IEEE articles and by Sun and the OpenJDK teams themselves. Oh yeah, and in the segment you quoted directly, what part of "mostly viewed as", in the realm of caveats added to a statement, slipped by unnoticed? I pointedly never stated the common perception was correct. If I had, we'd have been going down the "everything in ASM 100% of the time, because performance" logical herpaderp that crops up from time to time as well. Java is just fine as a language, and when wielded well it can be at near parity with most others. Minecraft, as an example of a product made with Java, began as a small indie title, so from that perspective it works rather well.

OpenGL acceleration of Minecraft is not hardware virtualization or a dedicated bit of hardware for interpreting a language, mind you, and as you kindly reinforced, even if we were talking about the same thing (I said virtualization, not acceleration; acceleration was mentioned with respect to the "common view" only), the title is not exactly GPU-bound in any case.

Perhaps my taking for granted the JVM's options to utilize hardware features like the various SSE* extensions, and the distinction between a full hardware interpreter/compiler and a software layer that uses some hardware functions, was assuming too much, as was specifically differentiating between Minecraft and the language it was made with. When I say hardware accelerated, I mean it has dedicated equipment: if I have an audio device that uses CPU cycles to function and compare it to a dedicated DSP-chip-based audio device, only the latter is actually hardware-accelerated audio (setting aside the strange middle ground of the oddball FPGA-based devices that reconfigure on the fly to service modem and sound card with the same hardware, mostly extinct nowadays). I must remind myself from time to time, or be kicked into so doing, that audience experience can sometimes be quite a long mile different.

Simply because some things are "common" beliefs, lack of experience or knowledge will turn even tongue-in-cheek references to them into a hazard. Which isn't even to say those beliefs are wrong, but frankly, in relative terms, the differences in code execution time between Java and C/C++ written with equivalent expertise will be negligible for most things of any reasonable complexity. The reactor planner is the sort of thing I could get an intro-to-VB student to churn out by the end of a semester if he wasn't slumming; I wouldn't exactly consider a rewrite that improves its performance a huge leap. You could probably double its performance metrics as an application even by rewriting it in Java. But hey, it works, isn't terribly buggy, and has been done, so let us say that reinventing the wheel in that instance is more of a mild nuisance, and that performance optimizations there probably aren't as critical given its use case.




The last Java-based title I went boring through was in fact Terraria, and I pondered some of the same items there initially, although with only two dimensions to work with, the data sizes were commensurately smaller. There, a system of multidimensional arrays was used, which I suspect will be the case here as well if I actually spend the time looking. The chunk class and its overloads seem to handle things a bit strangely: the independent regioning array for entities along a 16x16 Y axis, the use of raw integers for each x, y, z, then tagging on metadata and block info, with lighting as yet another layer, looks odd compared to, say, a four-dimensional array containing instances of the populating class information. It might actually be doing functionally exactly that, and seeing it as odd might just be my lack of specific Java language experience and nuance, or the five-minute perusal (the nibble segment? The heck? And just what the Sam Hill is actually the driving logic for the view frustum culling here? The frustum calls seem... abbreviated...).

I cannot even definitively say Dravarden is flat wrong, as logically, populated or not, if a coordinate exists in the loaded boundary region it must be initialized, at the very least for the other passes.
So in spite of the Nether only generating a 128-high Y-axis region, the remaining 128 must be instantiated as air, receive a lighting pass, etc., from what I'm seeing. Air is a block type. Observationally, outside the code: given that one can build there, that the engine considers the top region above the Nether a valid spawn region, and that mobs can spawn there, it exists and is tracked.

Thus the note on the number of blocks to track per chunk stands at the full volume, barring a function somewhere to explicitly exclude regions from handling that I simply haven't laid eyes on yet. That's a lot of data to churn through volumetrically, and each of those millions of tracked block locations isn't a simple bit being flipped on or off; it's quite a bit more. All said and done, while the texture-map push every tick is a needless strain on the system bus and memory bandwidth, it seems a more than fair wager that it still pales next to the raw data throughput for the block data being tracked in any given loaded region.
Further, given what looks to be happening with block updates informing every adjacent block, the work done is nearly multiplicative.
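For a rough sense of the volume being tracked, some plain arithmetic; the render distance is an assumption, and each location carries a block ID plus metadata and light nibbles rather than a single bit:

```java
// Rough block-count arithmetic for the loaded region around a player.
// The render distance is an assumption for illustration.
public class LoadedBlockCount {
    public static void main(String[] args) {
        int chunkX = 16, chunkZ = 16, worldHeight = 256;   // blocks per chunk column
        long blocksPerChunk = (long) chunkX * chunkZ * worldHeight;

        int renderDistance = 8;                            // chunks in each direction (assumed)
        int side = 2 * renderDistance + 1;                 // 17x17 chunk square around the player
        long loadedChunks = (long) side * side;
        long loadedBlocks = loadedChunks * blocksPerChunk;

        System.out.println("Blocks per chunk: " + blocksPerChunk);  // 65,536
        System.out.println("Loaded chunks:    " + loadedChunks);    // 289
        System.out.println("Loaded blocks:    " + loadedBlocks);    // ~18.9 million
    }
}
```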

But then again, sandbox. While one could simplify a CFD sim to work around the complexities of concurrent interactions, this sort of thing requires a somewhat different tack.

A text-mode roguelike (c'mon, you know you've at least fiddled with trying to write one at some point) can be complicated enough. Ever seen NetHack bring a modern machine to its knees? A floor with multiple mobs that summon other mobs, and the extinction of many species that aren't summoner types. Hurrah, exponentiating workloads.




As to background, however, I agree my experience with Java is somewhat limited to some conjectural reading about, and fiddling with, a couple of games using it. Were we instead discussing anything from BASIC, VB, ASM, ladder logic, Python, C/C++, and a smattering of others, I've taught them to junior-level undergraduates. The further irony is a background in electrical engineering, not "pure" computer science. I'll also be the first to (abashedly) admit that much of my upper-tier knowledge has gone somewhat to rot, given overexposure to beginner-level language instruction and underexposure to much beyond the same.

In any event, if I ruffled your sensibilities in there, my apologies. I thought the "modders are humans" bit at the end would make the tone of the post clear as somewhat conversational, with a bent towards tongue-in-cheek humor, not raging iron-man absolutism.

As a final note, I'm absolutely certain I'll have phrased something, or said something, that will a) be taken wrongly and b) be considered wrong, abrasive, or some such, by at least someone somewhere. I'd even wager on someone somewhere going through this, seeing that I've taught at a collegiate level, and having a fit over something or other. Only human, sorry.

I'm all for being wrong on occasion, so long as it's not fatal and not the same thing twice. It engenders not only learning, but also discussion and elucidation of related issues.
 

PhilHibbs

Forum Addict
Trusted User
Jan 15, 2013
3,174
1,128
183
Birmingham, United Kingdom
Yeaaaaaaah, hearing that little bit from one of DW20's videos about the texture thing seriously made me WTF. Just...why? Might make sense for animated textures, I guess, but the regular, static textures that make up, like...99% of Minecraft's graphics? ...Whatchu talkin bout, Notch?
I think it's a simplicity thing - rather than having one system for rendering fixed textures and a procedural system for animations, everything is a potential animation and so there's only one rendering system.
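If the motivation really is "everything might be an animation", the usual middle ground is to keep the atlas resident on the GPU and overwrite only the animated tiles each frame. A minimal LWJGL-style sketch, assuming a current GL context and an already-uploaded atlas (all names here are illustrative):

```java
import java.nio.ByteBuffer;
import static org.lwjgl.opengl.GL11.*;

// Sketch: update only an animated tile inside an already-uploaded atlas,
// instead of re-uploading the whole atlas. Assumes a current GL context.
public class AnimatedTileUpdater {
    // atlasTextureId: texture already created with glTexImage2D at load time.
    // tileX/tileY: pixel offset of the animated tile inside the atlas.
    // framePixels: RGBA data for this frame of the animation (tileSize x tileSize).
    public static void updateTile(int atlasTextureId, int tileX, int tileY,
                                  int tileSize, ByteBuffer framePixels) {
        glBindTexture(GL_TEXTURE_2D, atlasTextureId);
        // Only tileSize * tileSize * 4 bytes cross the bus, not the whole atlas.
        glTexSubImage2D(GL_TEXTURE_2D, 0, tileX, tileY, tileSize, tileSize,
                GL_RGBA, GL_UNSIGNED_BYTE, framePixels);
    }
}
```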
 

Lambert2191

New Member
Jul 29, 2019
3,265
0
0
New direction for the thread: What you're most happy about being able to keep in your world now that RP2 has a breath of life back in it.
For me: sickles and red alloy wire, I can't play without those now, soooooooooooo useful!
 
  • Like
Reactions: EternalDensity

Bigglesworth

New Member
Jul 29, 2019
1,072
0
1
I'm not the only one that thinks this?! My god, has someone checked the temperature in hell? I think Satan might need a jacket.

Part of the lag, I'm sure, is down to a very... broad design, meant to target as many machines as possible, something which Java is, admittedly, not too bad at. However, that incurs overhead, and the more general you get, the more overhead you incur; it's a balancing act that you've got to be very careful with, lest you release a graphically simple game that runs all its graphics on the CPU. Like it's the 90s. Besides that, what extra is earned by targeting, say, Linux machines? 1-2% of the market? Even if half the Linux users on the planet bought and played Minecraft, that's still going to be a very minute percentage (again, likely 1-2%) of your overall user base. They tried to reach as many computers as humanly possible, and in doing so sacrificed performance and, by extension, gameplay to a degree.

First off, I agree that Java may not be the best language for this game (now), as it has evolved into something more than Notch originally foresaw and is outgrowing its adolescent clothing. However...

Minecraft has roughly 25 million registered users; I would generously say 5 million more unregistered.
Linux users (desktop use, not server) number roughly 30 million+.

So no, it's not 'again, likely 1-2%', it's more like 50%. I understand that statement was just there to illustrate the point that Java isn't really needed, as Linux makes up a tiny portion of the market and a large chunk of performance is sacrificed to reach it. I agree with that; let's just not make up numbers, shall we?

The other part is Java itself, or rather, its virtual machine. Time and again, I've found that Java programs take longer to start and consume more resources than something similar made in, say, .Net. I'll give you an example: on my laptop (which is good enough to run Guild Wars 2 on minimum graphics with reasonable FPS), the standalone IC2 Nuclear Reactor Planner (v3) takes anywhere from 3-5 seconds to start up and, according to Task Manager, consumes around 60 meg of memory. Too fucking long, and too fucking much, in other words. Yeah, that program might have some pretty complex math behind it, I'm not saying that it doesn't, but... something like that should really have at most -half- that much memory usage, and start almost instantly. It's bulky, it's cumbersome, and worst of all, it's inefficient. That inefficiency means greater resource demand, and that means lag.

I honestly wonder -why- Mojang even made Minecraft in Java in the first place. Damn near every game nowadays is made with C++, and extended by scripts in something like Lua, Python, or some homebrew language. Why? Because C++ compiles directly into machine code, no interpretation layer necessary, it. Just. Runs. Want to target multiple platforms? C++ can do that. Yeah, it'd take a little extra work, but if you ask me, that work is worth it if it spells a better experience for your end users. If they'd -insist- on going with an interpreted language...why not go with one of the .Net languages? All you need for cross-platform capabilities there is Mono, and in my experience, .Net programs run a fair deal faster than what I've experienced with Java.

I'm almost tempted to make a clone of the IC2 Reactor Planner in C# just to prove my point.


Mojang didn't make Minecraft. Notch did. He couldn't have known what a success it would be, or that it would be added to like this (even without the mods that basically evolved around it, even hacking in a user-made API!). If he had known, he might have considered a different approach, at LEAST in how he initially coded it, and possibly switching languages altogether. However, Notch (and Jeb) aren't egotistical dudes. They didn't think this shit would happen like it did. Minecraft used to get 400+ FPS on 6+ year old GPUs. That SEEMED OK. It would be foolish for any of us to think they are not considering it now, but consider that a huge amount of manpower and time has gone into the Java development, and the vast majority of people are still getting OK performance.

I hope they do consider shifting to a lower-level language in the future, as it's ridiculous that a 128x128+ texture pack should have any impact on performance. I think they are doing some sort of recode of the graphics engine, so we will have to wait and see how that goes.
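On the 60 meg figure quoted above: Task Manager reports the whole JVM process (committed heap, JIT code cache, class metadata), not just the live objects. A quick, hedged way to see the gap from inside a Java program (exact numbers vary by JVM version and flags):

```java
import java.lang.management.ManagementFactory;
import java.lang.management.MemoryUsage;

// Shows the difference between the heap actually used by objects and the heap
// the JVM has committed (which is closer to what the OS reports for the process).
public class HeapReport {
    public static void main(String[] args) {
        MemoryUsage heap = ManagementFactory.getMemoryMXBean().getHeapMemoryUsage();
        long mib = 1024L * 1024L;
        System.out.println("Heap used:      " + heap.getUsed() / mib + " MiB");      // live-ish objects
        System.out.println("Heap committed: " + heap.getCommitted() / mib + " MiB"); // reserved from the OS
        System.out.println("Heap max:       " + heap.getMax() / mib + " MiB");       // -Xmx ceiling
    }
}
```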
 
  • Like
Reactions: southernfriedbb

KirinDave

New Member
Jul 29, 2019
3,086
0
0
First off, I agree that Java isn't the best language for this game (now), as it has evolved into something more than Notch originally foresaw and is outgrowing its adolescent clothing. However...

There is nothing fundamentally wrong with using Java for a video game environment. Nothing. Java is one of the most sophisticated, consistent and performant interpreted runtimes on the planet.

Anyone who says otherwise is wrong. Provably so.
 
  • Like
Reactions: southernfriedbb

Mash

New Member
Jul 29, 2019
892
0
0
Are there examples of other games with similarly huge player bases, in a similar genre, that use Java?

Or just any games in general, really. I can definitely agree with the assertion that Notch never saw this game becoming this big. A lot of people forget that Minecraft started with just one dude developing an indie game by himself.
 