> Removing all uses of undefined behaviour is probably a fool's errand, as it would require significant changes throughout the code that would take time and come with a performance impact, all for no immediate practical benefit. So I just hunted down the undefined behaviour cases that actually broke determinism. This was easy enough, as we have plenty of tools we use to test determinism. By comparing the game state CRC for every tick of every test (we have 2,417 tests) between x86 and ARM, I believe I got a pretty good coverage of potential issues.
Holy hell.
> "This is the first time we had to make sure the game is deterministic between ARM and x86. We should be fine, C++ is portable, right? Just don't use undefined behavior. Turns out we use quite a lot of undefined behavior..."
Hah
This has been my experience dealing with portability and C++ as well. In theory, very portable. In practice, "portable" as long as you don't care about the details of how features work and haven't used any undefined behavior that just happened to work on the first architecture but is implemented totally differently on the new one (and C++, of course, embraces undefined behavior instead of keeping the developer away from it).
Things as simple as "believing you know the byte-width of a primitive data type" are dangerous.
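For example (my own sketch, not anything from the thread), pinning those assumptions down with `<cstdint>` and `static_assert` is cheap insurance:

```cpp
#include <cstdint>

// `long` is 8 bytes on Linux/macOS x86-64 but 4 bytes on 64-bit Windows, so
// serializing or hashing one directly is not portable.
struct SavedEntityId {
    std::uint32_t chunk;    // always 4 bytes
    std::uint64_t serial;   // always 8 bytes
};

// Fail the build, not the game, if an assumption about layout breaks.
static_assert(sizeof(SavedEntityId) == 16, "unexpected padding or type sizes");
```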
Would it have occurred had they used UBSan on debug builds?
No doubt. For starters, UBsan does not check for any invalid memory access.
In my C++ course, I require that the code runs correctly under 5 different compilers (GCC, LLVM Clang, Apple Clang, MSVC, Intel C++ Compiler Classic) on 3 OS (Ubuntu, Windows, macOS) in Release, Debug+Valgrind, Debug+Sanitizers modes, and students still get UB quite often.
Extra reading: http://evan.nemerson.com/2021/05/04/portability-is-reliabili... and discussion at https://news.ycombinator.com/item?id=27044419
Why even write C++ at that point? Wouldn’t changing over to Rust save you at least a couple of those steps?
Tooling, libraries, OS vendor support, IDEs, work force availability,...
Many of us write C and C++, because we have to, not because we want to.
Rust will eventually reach there, however many of us want to deliver a product, not build an ecosystem from scratch.
It won't find threading issues, or "implementation-defined behavior" (which isn't the same thing as undefined behavior).
For things that UBSan checks for, yes, but there's a ridiculous number of things that are surprisingly UB in C(++).
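A few of the usual suspects, as my own illustration rather than anything from Factorio's code:

```cpp
#include <cstdint>
#include <limits>

int surprising_cases(std::int32_t x, unsigned shift) {
    // Signed overflow is UB: the compiler may assume it never happens and
    // delete an overflow check written like this one.
    // if (x + 1 < x) { /* handle overflow */ }   // UB when x == INT32_MAX

    // Shifting by >= the bit width is UB, not "obviously zero".
    // std::int32_t y = x << shift;               // UB when shift >= 32

    // Implementation-defined rather than UB, but it still bites when porting:
    // plain `char` is signed on typical x86 ABIs and unsigned on typical ARM
    // ABIs, so code comparing chars against negative values diverges.
    bool char_is_signed = std::numeric_limits<char>::is_signed;

    return static_cast<int>(char_is_signed) + static_cast<int>(x != 0)
         + static_cast<int>(shift != 0);
}
```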
If the Switch proves anything, it's that you don't have to pay $1600 for a GPU to enjoy amazing games. Or the other way round: if you pay $1600 for a GPU, it does not mean that you'll be able to play any good games.
I think the technical limitations of the Nintendo Switch makes for more interesting games in general. Most of the games on my Xbox and PC tend to be FPS/Action games with very boring, brown and green "realistic" environments whereas games on my Switch tend to be bright and colorful with lots of 2D and pixel graphics and games in the vein of simulators, adventures, roguelikes etc.
Tangent: It's amazing how popular XP to "level up" and HP to bring to zero have become. This was a new concept to me way back when Final Fantasy 3 (3 in the US, I think 6 in Japan) came out. It felt very niche and reserved for a specific type of game. Now those ideas are everywhere.
What's a common new game trend since then? I can think of these: randomized microtransaction packs, online multiplayer PvP and co-op.
Are there any established trends that I'm missing out on, or new concepts emerging that old gamers may not have come across?
The Battle Royale genre (Fortnite, PUBG, etc.) is the biggest new genre I can think of. The 100-player last-man-standing game mode probably wouldn't have been possible without the advancements in networking speed and infrastructure in the last 15ish years.
From personal observation and experience, I believe the BR game mode got some traction after the first Hunger Games movie released. After that movie released, I remember all sorts of "Hunger Games" Minecraft maps appearing online. I downloaded, played and hosted a handful of these Hunger Games "last man standing" maps for family and friends to play.
After a few weeks of that, the game format grew stale, and I stopped playing HG maps. Then a few years later a whole bunch of BR games hit the scene and all I could think of was how similar they were to the Hunger Games Minecraft maps.
That's just my guess from personal experience.
The modern Battle Royale mode evolved from multiplayer mods for Minecraft and then ARMA, made shortly after the 2012 movie The Hunger Games, which provided the essential ideas all together (elimination, lots of players, must explore a huge map for items).
I think Minecraft can't be overrated as a vehicle for experimentation in multiplayer modes. Not only did the game have a huge modding and multiplayer community and have support for large player counts and maps, but it's unusually easy to create for and the game itself has no built-in structured competitive multiplayer gametypes, which means there's tons of demand for people to make gametypes from scratch and there's no built-in code for how matches work that needs to be worked around for new ideas to be attempted. Many other online games have specific ideas of how matches work built-in that reduce the demand for brand new gametypes and could make implementing a gametype where matches work differently daunting.
Wasn't PUBG originally an Arma3 mod?
- Early access. Back in the day the dev would pay you with a free copy of the game for helping with the beta. Now you pay the dev for the privilege of helping with the beta.
- Cosmetics. Tons of games have stuff you can buy (equipment, skins, etc.) to make your character look cool. Extra stuff that can be developed easily by an artist, doesn't affect game balance, and sometimes even costs real money.
- Battle pass. I've never bought one but I think you get stuff on two tracks for progression, a free track and a paid track. Progression resets and there are new tracks like 4 times a year or something. Makes people want to open their wallets for the paid track due to FOMO and "...but I already earned it!"
I think if you switch from free to paid you immediately get all the stuff on the paid track you've already progressed past. Also I think they want to charge you $10 four times a year, every time the new tracks come out.
- itch.io -- Sort of like Steam / Epic / GOG for indie developers too small to be on those sites yet.
Crafting is absolutely everywhere, as is some kind of level up system (attributes/skills/perks/traits/etc...).
Procedural generation was supposed to be big and exciting.
If done incorrectly, though, procedurally generated quests or dungeons feel boring and soulless.
Oh yeah, and terrain deformation! Cool concepts. I'm not sure which ones do it well. The Diablo series seemed cool, but now it just kinda seems like "yeah, it's more and more and more of the same", even though some of it is procedurally generated.
Most of those are hardly Switch exclusive.
I like the Switch, but it definitely chugs, and I suspect had we not had the supply chain meltdown over the pandemic we would've gotten a more significant hardware refresh.
As far as simulators go, I'm playing Two Point Campus, and it definitely chugs and crashes occasionally.
A lot of the best third-party Switch games are ports from PC games or games available on other consoles: Spelunky, Shovel Knight, Cuphead.
I think the technical limitations can definitely hamstring their games, even the first-party ones. Breath of the Wild comes to mind; it frequently dips below 30 fps, which breaks the immersion.
Play different games. Ori, Tunic, etc. are bright & colourful and yet somehow manage to be heart-wrenching at the same time.
Yes, you don't need PC hardware to play those, but non-console gaming environments are also about no lock-in, customisability, etc.
You're right though - you don't need a fancy GPU to have a fun game, but what you DO need is permission from Nintendo to publish on their platform.
But that's your choice. Almost all of my PC games are 4X and turn based RPGs. All of them make heavy use of a mouse and keyboard. And in addition, you don't need stupid 4 slot cards to play them.
It would be true if Nintendo still had support of a beast like Rare. Alas, there are very few if any companies that are willing to work with these limitations and create games specifically for Switch.
And the main reason the games are colorful on Switch is that Nintendo still, to this day, has a reputation as a kids-only console. And the Switch also pushed it towards "casual one-off gaming", which invites bright colors (think Candy Crush).
I mean that's always been Nintendo's thing, hasn't it? Cheaper, less powerful consoles, but focus on iconic gameplay
I'd say that only really started in the 7th console generation (Wii, PS3, Xbox 360)
Gamecube and prior was roughly comparable to the competition, but after that the other companies were trying to outperform each other in graphical fidelity while Nintendo released basically another Gamecube but with motion controls. And it was a hit!
Certainly the N64 (immediately prior to Gamecube) was supposed to be super high tech. They actually stressed the partnership with SGI in the marketing, and literally named the console after the word size of its CPU. In retrospect the former was pretty weird, since the public would not have known that name or anything.
Folks can quibble about the Gamecube if they like, but step back one generation and they were definitely trying to push the technology.
It was NEC that started the bit wars with the PC Engine in Japan. They heavily marketed its graphics as 16-bits, to have a quick way to explain the quality improvements over the Famicom.
> Gamecube and prior was roughly comparable to the competition
It gets forgotten about because it didn’t sell all that well and because the tiny little 1.8GB discs kept most of the multi-platform blockbusters off it, but the GameCube was significantly more powerful than the PS2 (though not as powerful as the Xbox).
There was an amazing effort post back when the G4TV forums existed explaining why the GC was on par with or better than the Xbox, using examples like Rogue Squadron and technical details. The main issue was that third-party games didn't exist or were low-effort ports to the GC.
I wish I knew how to find the text of that post. The G4TV forums seem erased from history.
> I'd say that only really started in the 7th console generation (Wii, PS3, Xbox 360)
I think that's when they really started leaning into it again, but Gunpei Yokoi (Game Boy inventor) was a big proponent of the idea, which he called "Lateral Thinking with Withered Technology". So it had been a part of Nintendo's design process for a while before that.
It was fairly true for the NES and SNES generations as well. Each used a CPU five years old by then and a RAM size that was common in computers eight years prior. Really, for only the N64 console generation did Nintendo ever try to outdo its competitors on hardware specs.
There were also the Atari consoles. The 7800 was contemporary with the NES with slightly better specs, there was also the XEGS, and the Panther was in development at the same time as the SNES before it was cancelled in favor of the Jaguar. They all flopped in the market, of course, but Nintendo didn't know that would happen while developing theirs; even so, they kept their hardware simple rather than ultimately overreaching like the Jaguar.
It doesn't matter how the NES compared to computers, because computers cost much more. Also, less RAM is required if the program is on a cartridge.
I'd argue that first time Nintendo exhibited that mentality was with GameBoy.
I think their Game and Watch systems show it too.
Not always; the N64 was more powerful and more expensive (game carts) than the competition. So was the GameCube (it was more powerful than the PS2).
Maybe the CPU was more powerful (honestly idk), but the low memory of the N64 limited the textures and video features in its games (I never saw any video in an N64 game; see how flat the textures in Mario 64 are).
Iconic meaning no new IP, just nostalgia hacking.
Sadly, pretty much all of the games I want to enjoy run terribly on the Switch, to the point where I had to go back to my $800 GPU to play them.
Pretty much every strategy / tactics game (outside a single one: Fire Emblem) runs terribly, below 30 fps, looks very blurry and... the killing blow... has crazy long loading times. You can go grab a coffee while Civ 6 is loading and it still won't be done by the time you get back.
Quality control of games from Nintendo is very very poor. There are several games that came out which were unfinishable or unplayable and they just happily certified them.
Interestingly Civ runs fine for me and looks good handheld, but it all falls apart when docked.
I only use Nintendo hardware for Nintendo software and I'm never disappointed!
Depends entirely on what you play. Cyberpunk 2077 in 4K with DLSS and raytracing is exquisite...
... as long as you ignore the places where CDPR didn't actually finish debugging the game and errors abound, from innocent things like a pathing error in a pre-rendered character animation causing them to steamroll through movable objects in a scene, to arguable game-breakers like a mission being neither succeedable nor failable because some key internal flag is in the wrong state, with the only solution being to roll back to a save before the error occurred.
I've beaten the game and NEVER had any issues like that, other than right after launch. You should consider leaving the faction of people who are irrationally bitter at that game because it is a position that is becoming more comical over time. The game is really quite good now and has been for some time.
> actually finish debugging the game and errors abound
Literally EVERY software developer is guilty of this. I cannot count how many games have come out in this state and there wasn't a ridiculous backlash; only CP2077 got one.
NOTHING comes out bug free nowadays, because we as an industry have this idiotic policy of shipping things ASAP
You can't be very far in the game if you are stuck on that quest. Why not just start over?
Consider SQLite as a counterpoint before using superlatives. There are others.
Edit: well, of course, a game bug, even if game-breaking, isn't safety critical. You have to plan what to spend time on.
I looked up what this meant, and I think it means this: https://www.youtube.com/watch?v=Xk8Id06dcAQ
The level of detail is impressive
Cyberpunk 2077's main issues are the very uneven main story and the extremely boring, repetitive side quests (also the driving gameplay, but that's fixable). Those aren't things graphics can save, sadly.
Switch software is incredibly expensive. If anything, PC gaming proves that much more easily. My $300 laptop can run Factorio better than a Switch can.
You don't have to pay $1600 for a GPU on PC to enjoy Switch-like indie games. Even integrated graphics will let you enjoy thousands of non-AAA games these days. What is your point?
Btw, Factorio ran on any kind of PC well before it was on the Switch.
> Now, the final startup time when the game is installed on internal storage is 70 seconds, but let's not get ahead of ourselves.
Maybe I don't notice this with other games but that seems like a really long time.
Animal Crossing has some pretty long load times on startup or coming back from islands, but I'd still expect it's under a minute.
On the other hand, Factorio is a load-once-and-play-for-hours type of game, compared to something like Mario, which incurs loads on each level.
Civ VI can also take a while to load. Especially if you’re loading a game that’s already in progress. I haven’t timed it but I wouldn’t be surprised if it was over a minute.
That jumped out to me too. I was hoping "let's not get ahead of ourselves" meant he'd explain why the startup time is so long but it's never mentioned again.
I just timed Breath of the Wild (cartridge version), and from the time I launched the game from the Switch menu to seeing the game menu was about 16 seconds, and the time from selecting a save file to loading the game world was about 18 seconds. What exactly about this game makes it take so long to load?
BotW got some after-launch patches that significantly reduced the launch time.
They did this by overclocking the Switch during load. I didn't see any mention of boost/overclock in the Factorio writeup, though.
Interesting, I wasn't aware (or forgot)! I looked into it though, and it seems like it wasn't anywhere near the minute+ mentioned in the article.
I found a GameFaqs forum thread[1] from shortly after the game launched that claims it was about 20 seconds. And a US Gamer article[2] that says the update that came a couple years later only shaved off about 5 seconds (seems within the ballpark of what I measured)
[1] https://gamefaqs.gamespot.com/boards/189707-the-legend-of-ze...
[2] https://www.usgamer.net/articles/breath-of-the-wild-load-tim...
Do you know if this technique is available for all games or just Nintendo's own? It seems like it would be easy to cause issues if games were allowed to mess with the clocking willy-nilly, so I assumed that it was only done on Zelda because it's a first-party title (and one of the biggest on the console).
Interesting, curious if Factorio is using it then.
This article[1] claims Crash Team Racing takes advantage of boost mode, so seemingly it's available to third-party developers.
Perhaps they don't get to mess with the clock "willy nilly" and it's more like an API that enables a higher - but fixed - clock speed?
[1] https://www.shacknews.com/article/112895/crash-team-racing-n...
This is available for any game since 2019, here’s an article talking about it: https://www.eurogamer.net/digitalfoundry-2019-nintendo-switc...
Why manual and not a dynamic allocation of power budget like gaming laptops use? The driver decides which of the CPU or GPU is "starved" and gives the other more power.
One theory: If a game is GPU limited, which most are, it will be at 100% utilization no matter how much power it steals. However, the CPU can't be power limited too much. Games have a physics loop that has to run at a constant rate, independent of rendering. If the CPU is at 100%, any disturbance might cause the physics step to not finish in time, and the game crashes...
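For reference, a minimal sketch of the classic fixed-timestep pattern that comment is describing (not Factorio's actual main loop); the inner `while` is exactly the part that has no slack if the CPU is starved:

```cpp
#include <chrono>

void update_simulation() { /* advance the game state by exactly one tick */ }
void render_frame()      { /* draw whatever state we currently have */ }

void run_game_loop(const bool& running) {
    using clock = std::chrono::steady_clock;
    constexpr std::chrono::duration<double> tick{1.0 / 60.0};  // 60 Hz simulation

    auto previous = clock::now();
    std::chrono::duration<double> accumulator{0.0};

    while (running) {
        const auto now = clock::now();
        accumulator += now - previous;
        previous = now;

        while (accumulator >= tick) {   // catch up if we fell behind
            update_simulation();        // must fit inside the ~16.7 ms budget
            accumulator -= tick;
        }
        render_frame();                 // rendering may drop frames freely
    }
}
```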
Gonna have to wait for someone from the community to come along and profile the game to find that 10 MB JSON file of in-app purchase items being parsed in O(n^2).
I love how this still comes up all these years later
I presume the GP is referring to the GTA V fix, which was only 18 months ago [0], having been ever-present in a game that had been out for over 6 years!
I think it'll be referred to for years to come as an example of not only how seemingly simple decisions can come back to haunt you, but also how patient users can be if they really want to play something..
0: https://www.thegamer.com/gta-online-fix-loading-times-offici...
I’m kind of surprised (and bummed) that they ported it to the Switch and not the iPad. The Switch is a bit slow (can’t imagine big bases performing well) and I’m not convinced using a game pad will work well. By contrast, modern iPads are smoking fast and I bet an iPad touch interface would suit the game fairly well. Ah well, maybe some day.
I think you could say this about a lot of games. IMO, Apple doesn't make it easy to port a game from PC to iPad (app store guidelines in particular), and I doubt there are as many iPad gamers as there are other platforms.
This is true but Factorio is not Madden; for example they have a sizable (for Factorio) built-in audience of everyone making iOS apps. The pencil would work well for input, I'd gladly buy it again.
The Switch got a touch screen as well?
A teeny tiny one in comparison.
I think I only use it when entering my character's name in an RPG after slowly typing a few letters with the controller then remembering I can press the screen.
For anyone who wants a Factorio-like on iOS, there's a great game called Builderment. It's not as complex as Factorio but it does get rather involved. Works well with touch and even works well on the smaller screens of iPhones.
I was going to check it out, but IAPs are full of gem packs, which is not very encouraging. How is the freemium aspect?
Sorry for the late reply. It works fine for free. There are no impediments to the actual gameplay. Gems aren't mandatory for anything. It's been a while since I last played, but when I did, gems weren't needed for anything other than random decorative items.
How does a game like Factorio take so long to load? The game assets themselves are like megabytes. What is it doing?
A comment from the developer on Reddit: https://old.reddit.com/r/factorio/comments/xlufvr/friday_fac...
> The majority of time spent starting the game is preparing the images from disk into a format that the GPU can use. The images on disk being compressed and or not the exact format that GPUs want. Now, what that time on switch is spent on… I don’t know for sure but I would guess something similar.
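Roughly the kind of work being described, as an illustrative SDL2 sketch (Factorio does use SDL per comments further down, but SDL_image and this exact flow are my assumption, not its actual loader):

```cpp
#include <SDL.h>
#include <SDL_image.h>

// Decode a compressed file, convert the pixels to a layout the GPU accepts,
// then upload. For thousands of large sprite sheets, the decode/convert step
// on the CPU can dominate startup time.
SDL_Texture* load_sprite(SDL_Renderer* renderer, const char* path) {
    SDL_Surface* decoded = IMG_Load(path);                 // PNG -> raw pixels (CPU work)
    if (!decoded) return nullptr;

    SDL_Surface* converted =                               // swizzle to a GPU-friendly format
        SDL_ConvertSurfaceFormat(decoded, SDL_PIXELFORMAT_RGBA8888, 0);
    SDL_FreeSurface(decoded);
    if (!converted) return nullptr;

    SDL_Texture* texture =                                 // upload to VRAM
        SDL_CreateTextureFromSurface(renderer, converted);
    SDL_FreeSurface(converted);
    return texture;
}
```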
Some games, like ACNH, have a long load time.
Which is a little disappointing, since one of the things I loved most about the GameCube game was the instant load times. I also enjoyed taking the disc out of the slot after the game was loaded.
Animal Crossing is an interesting game to me because of how little the series' code has changed over time. The GameCube release seems to have almost literally been the Nintendo 64 release recompiled against a different SDK for the GC platform (weighing in at only 27MB total for the GC release! An entire 1.4GB DVD for 27MB!), while the Wii and 3DS games, and seemingly the Switch one, are variants of the DS rewrite (seemingly the last time the codebase was scrapped and restarted).
I'd be really curious to know how similar the Switch code is. On the surface, the game seems to have been extended quite a bit, but I feel like a lot of it is just uncovering functionality that already existed in the codebase. Take the ability to place arbitrary items on the outside grid: I'm fairly sure the "engine" already supported this, and it was the way outside objects were implemented; there was just no interface for adding/removing them.
Hmmm, I wonder! My experience with this comes from doing some minor reverse engineering work on the Wii game back in my teens, in order to pass all the dialogue through Google translate, for great amusement [0].
My observation was the data structures/file formats were basically slight modifications of the ones used in other first party Nintendo games, primarily Mario Kart and Mario Party. So it seems like it's always drawn heavily from some internal SDK/engine, it's just not clear how much, or how much of that engine is saved between consoles.
Anecdotally, the DS and Wii (and ... maybe the GC? I don't remember now) used the same file format for the dialogue scripts, although the DS version was little-endian, and broke all my (terribly brittle) "tooling". The 3DS was totally different, and I ... Think the Switch was a variant of the 3DS format? It's been a hot minute since I've looked at any of this stuff :-)
Pretty much all of the games on Switch have very long loading times. Some stretch even into minutes (XCom and Civ6 come to mind as being pretty bad).
I haven’t measured, but Subnautica (on Switch) takes forever to load a save when it starts. I would not be surprised if it takes more than one minute.
Playing modded Rimworld (even on reasonable hardware) is a total ball ache for loading times.
I need to be absolutely sure i want to play the game five minutes before i do.
Related:
Factorio is coming to Nintendo Switch - https://news.ycombinator.com/item?id=32825543 - Sept 2022 (359 comments)
Will the controller designs end up in the pc version? Would love to play this on my Steam Deck in console mode!
The article linked says this:
> Even after the launch, there is much to do. Next to my screen there's a stack of post-it notes with future improvements, possible features and technical debt I need to solve. As mentioned in the announcement last week, after the launch I will also work on controller support for PC and Steam Deck.
The Nintendo Switch has an ARM64 CPU, which means the biggest part of porting to that CPU is now done. Does it mean we can expect an Apple Silicon mac build some time soon(ish)?
I emailed them last week about Apple Silicon and they told me to watch for today's blog post. I interpret that to mean that, yes - this means they can more easily do that.
It runs fine on the M1. iPad support would be the real winner.
No. It's a completely different graphics stack.
If you target PC/Console and don't have an abstract GPU interface, then you should rethink your life choices
And all the graphics stacks nowadays are similar; Vulkan, Metal, and DX12 all share the same ideas.
Also there are countless opensource cross platform GPU libs, bgfx [1] or sokol [2] for example
Yeah theoretically it’s that easy but irl it’s not…
There's already a macOS version, so they've presumably already handled that side of things.
Like most macOS games, it most likely is an x86-64 game that Apple's x86-64 emulator does an admirable job of emulating. As far as I can tell, very few Mac games, even new ones, are proper ARM builds currently.
IIRC, Factorio uses SDL - which afaict already targets macOS.
> It allowed me to make many Nintendo Switch specific optimizations, and even some optimizations for the PC version
Are there significant CPU-specific optimizations that can be made for the Switch / ARMv8 that wouldn't apply to x86-64? I've never really dug into things at that level, I wouldn't know where to begin except for like vector instructions.
My understanding is ARM has less strict memory (ordering) guarantees, as well as the inability to explicitly trigger cache line flushes from user-space. That being said, I imagine most of the optimizations would come from the particular aspects of the GPU/graphics pipeline, which I imagine is substantially different from the standard PC structure.
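An illustration of the memory-ordering point (my example, not anything from the port): code with a missing release/acquire pair often appears to work on x86's strong ordering but can legitimately break on ARM:

```cpp
#include <atomic>

int data = 0;
std::atomic<bool> ready{false};

void producer() {
    data = 42;
    ready.store(true, std::memory_order_relaxed);     // bug: should be memory_order_release
}

int consumer() {
    while (!ready.load(std::memory_order_relaxed)) {} // bug: should be memory_order_acquire
    return data;  // may legally read 0 on ARM; tends to "work" on x86 by accident
}
```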
Yeah you can always cut corners with the GPU if you have to, but Factorio has the large problem of making a CPU-intensive deterministic simulation run at 60 Hz.
Thinking about it some more, there are also probably some tweaks you could use for situations with limited resources which wouldn't apply to most PCs, but would affect the Switch.
(Disclaimer: I haven't written assembly for a while so this may not be true any more)
One cool thing about ARM is that vector instructions run in parallel with the regular CPU pipeline. You can do some neat optimizations where you interleave the sequential parts of an algorithm with the SIMD instructions while they are executing. However, if you do this yourself the code is going to be super non-portable. In general, knowing that something is always going to be running on a Switch lets you dig deeper into architecture-specific optimizations, since x86 has so much variance it would take a ton of effort to really dial in performance across the whole menagerie of Intel and AMD CPUs.
CPU-wise, although you can optimize specially for ARM, and it's great to make sure your math is using NEON and that you don't have some pathological slowdown because your code assumes more cores, you'll still find the lion's share of your CPU optimizations benefiting both architectures.
For fun, I'll mention that there is a hardware hash function that you can use. The public A57 documentation has a performance guide available that is applicable to the Switch.
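Presumably that refers to the ARMv8 CRC32 instructions, which the Cortex-A57 implements. A sketch of using them via the ACLE intrinsics in `<arm_acle.h>` (compile with e.g. `-march=armv8-a+crc`); this helper is just my illustration, not anything from the Factorio code:

```cpp
#include <arm_acle.h>
#include <cstdint>
#include <cstring>

std::uint32_t crc32c_buffer(const std::uint8_t* data, std::size_t len) {
    std::uint32_t crc = 0xFFFFFFFFu;
    while (len >= 8) {                      // one instruction per 8 bytes
        std::uint64_t chunk;
        std::memcpy(&chunk, data, 8);
        crc = __crc32cd(crc, chunk);
        data += 8;
        len  -= 8;
    }
    while (len--) crc = __crc32cb(crc, *data++);  // byte-at-a-time tail
    return crc ^ 0xFFFFFFFFu;
}
```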
Most likely the optimizations are higher level than that (for the most part). With the Switch they will be able to target specific core counts and use a less generic thread/memory synchronization API.
There may be some cases where the fixed ARMv8 instruction set allows them to use instructions that have equivalents on recent x64 processors but not back to their minspec PC.
I have no idea. But if there are, the factorio devs will find them and write an extremely detailed blog post about the entire process.
I was so excited when I saw the Allegro library in the 90s! I lost track of it later and believed it was dead (I believe the author went to Microsoft to work on DirectX). But lo and behold, I've just checked what libraries the Factorio guys used - and what a pleasant surprise!
Yup, allegro is alive and well as version 5. But it's pretty much entirely different.
Am I the only one who, after seeing the title, thought they were building a Nintendo Switch inside Factorio? Which makes me wonder: could it at least run something like an Atari 2600 at a decent speed?
A Minecraft-based Atari 2600 emulator was running about 1 frame per second last I knew: https://www.youtube.com/watch?v=mq7T5_xH24M
And that's heavily based on internal scripting with real code.
The recent "Minecraft"-in-Minecraft emulation was done with pure Redstone internal to the game, but it has to be run on a specialized server that accelerates Redstone operations by several thousand times to even be fast enough to record time lapses at a reasonable speed.
In-game simulations of logic gates that are generally running at fractions of a Hz are a pretty steep obstacle to overcome for any "real" computing like that. Just to calibrate your expectations.
Lots of similar things are already in progress: https://old.reddit.com/r/factorio/comments/93cd1t/building_a...
I’m not a game developer. What I would like to know is how does porting the code that draws the UIs and perhaps the game work? For example on PC one might use WinForms and DirectX so is this just a matter of writing wrappers to all your DirectX calls to the Switch equivalents?
Factorio uses SDL. Factorio has been cross-platform (Linux, Windows, Mac) from the beginning, so it makes sense that they're not tied down to Microsoft's proprietary nonsense. I'd imagine this choice made it a lot easier to port to the Switch as well.
It is a bit more involved than that, but yeah, that is kind of the basics.
The Switch supports OpenGL 4.6, Vulkan, and their own API, NVN, which many engines actually make use of. Most likely Factorio is using one of the former anyway, so that problem is kind of out of the way, other than having to deal with the extensions and hardware/driver-specific behaviors.
Then, as they mention in the blog, having to adjust the way the GUI works from a PC setup to a handheld.
This came as a total surprise... I'm impressed they managed to get the game running well on the Switch!
Will I actually buy it again, just to have it on the Switch?
...yes. Yes, I will.
I would absolutely love an iPad version. Much more than the Switch version (which I am going to get anyway).
It would certainly blow the Switch version out of the water performance-wise.
It would... on the other hand, iPads also cost a lot more.
It’s more a case of playing it on an iPad I already have. Otherwise yes, getting an iPad specifically for Factorio is some serious commitment.
That sounds great to me. And the iPad has a better screen and still supports PS/Xbox/Switch controllers.
But there would have to be a touch only interface and that may be the killer.
Well if they port it to Apple Silicon macs making an iOS build shouldn’t be crazy difficult right? (Famous last words)
Still waiting on the Apple Silicon port. The Mac build of the game is still x86-only.
Might be the only thing that can utilize all the power of an M1 Pro!
The devs have talked about performance bottlenecks a lot - the real benefit of Factorio on the M1 would be the RAM interconnect, as most of the performance issues are related to RAM latency. Upgrading your PC from (e.g.) DDR3 to DDR4 gives you a whole lot more FPS per $ than upgrading your graphics card or CPU.
Seems like RAM latency is huge for Factorio. I know the UPS skyrocketed on the 5800X3D with its 96 MB of L3.
https://www.reddit.com/r/factorio/comments/f2nab9/ram_speed_...
Sounds like the M1 is tailor-made for this, then! The memory bandwidth is insane and latency is great.
Factorio is the best game I've ever played. It's so good.
Factorio's official requirement for x86 is a dual-core 3.0 GHz processor, and yet it's able to run on the Switch, which has a very weak 1 GHz quad-core Cortex-A57 ARM CPU.
How is this possible? I would imagine CPU requirements to be more or less same across any resolution / operating system.
This is what makes the Nintendo Switch interesting to me. Its CPU is even less powerful than a Raspberry Pi 4's. Even midrange Android smartphones today are much more efficient and powerful than the Switch, and yet the Switch can run Doom / Skyrim just fine.
Because the Switch is running Factorio on top of a bit of code optimized for running games. A PC is running Factorio on top of a full-featured general-purpose OS, and most PC users are going to be running other things concurrently; web browsers, Electron messaging apps, malware, etc.
Of course, being able to target a single hardware platform with very minor differences like the Switch also means the ability to make optimizations to that specific platform without having to care about possibly breaking compatibility with others. It'd be like being able to optimize your game for people with Nvidia cards and completely ignore those with ATI, Intel, or any other brands on the PC.
The Switch can't run Doom "just fine"; it frequently drops the resolution down to 360p and chugs between 20-30 fps. It's barely playable. Mind you, Doom is one of the better-optimized games of this gen. And running it at sub-60 fps with all settings turned down to ultra low is a sin.
Factorio is pretty well optimized too, however the factory sizes the Switch can handle are on the "can launch the rocket" scale. That is but a small fraction of the factory sizes people end up building.
If it does, I did not notice Doom downscaling in the first 3 hours of gameplay.
Part of it might just be that they have lower expectations for Switch in terms of the CPU being able to keep up with a large megabase. They expect players to be able to launch a rocket without any noticeable lag, but one-rocket-per-minute bases might be a bit much.
Also, they mentioned in the previous blog post that they've done some optimization to keep the framerate from bogging down, so maybe even the x86 requirements are lower these days (or will be when those optimizations show up in the public version, which might not have happened yet).
> Switch and yet Switch can run Doom / Skyrim just fine.
That "just fine" hides a massive understatement on just how compromised these games are on Switch. They usually run on settings even below lowest possible on PC and regularly drop resolution way down to even 640x360, making everything extremely oily and blurry.
These games run and can be enjoyable - but "fine" is not really a word that fits into their performance :)
A modern 1.6 GHz laptop CPU like the 1240U is probably fine for Factorio.
A Core 2 Duo E8400 at 3 GHz probably isn't.
They don't want to test and list the minimum CPU for every CPU generation of the last decade or more when stating system requirements. So they set the stated requirements a little higher than is probably needed on newer generations, so people don't get angry and disappointed that their older-gen CPU doesn't run well despite having the listed clock speed.
The Switch CPU isn't actually that terrible. You can run into issues with the number of cores. The main thing you will find is that generally most codebases are full of potential optimizations, but they aren't acted on until you find yourself constrained.
The platform variance for x86/x64 CPUs is... way bigger. The actual instructions per clock can be wildly different and have a far bigger impact on performance than just the clock speed.
I wonder if they could release a Pi 4 port or finally a mobile version (controls will be troublesome).
It's still a year away according to the end of the blog, but it's nice to get an insight into the amount of work necessary for a port.
(oops, that was the expansion, misread)
> But we are releasing on 28th October 2022
Unless I'm misunderstanding this section, that's about 5 weeks away.
> it still won't be ready sooner than in a year from now
This is a reference to the expansion.
This comment was about the upcoming Factorio expansion.
When I've played games like RCT1, which can handle thousands of guests in the park, thousands of placed items, every square customized, and hundreds of rides running simultaneously with real physics, it makes me wonder why Factorio needs such intense system requirements for a mostly 2D production game.
For one thing, RCT1 was famously written in hand-tuned assembly language, but the main reason AFAIU is the coupling of components. Early Factorio isn't that bad, but in later factories it has to constantly simulate every machine on the map all of the time, because everything is running regardless of your presence and everything is interconnected. That complexity scales up rapidly as your factory expands. Games like RCT aren't as dynamic and tightly coupled, so the simulation can use a lot of simplifications to cut down on CPU use.
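A toy illustration of that coupling (nothing like Factorio's real engine): every machine gets touched every tick because its output may feed something else on the map, so cost grows with factory size even when nothing interesting is happening near the player:

```cpp
#include <vector>

struct Machine {
    double craft_progress = 0.0;
    double craft_speed    = 0.05;  // progress per tick
    bool   has_inputs     = true;  // kept up to date by neighbouring belts/inserters
};

void simulate_tick(std::vector<Machine>& machines) {
    // O(number of machines), 60 times per second, player nearby or not.
    for (Machine& m : machines) {
        if (!m.has_inputs) continue;
        m.craft_progress += m.craft_speed;
        if (m.craft_progress >= 1.0) {
            m.craft_progress -= 1.0;
            // the finished item would go to an output buffer here, potentially
            // waking up downstream machines elsewhere on the map
        }
    }
}
```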
This kind of determinism testing is a fairly common approach in games that implement shared-state multiplayer, and I think that's why it was implemented for Factorio as well. Otherwise you get drift between clients that breaks the game.
One alternative is to run everything on the server, but that introduces another set of problems.
IIRC that's what Satisfactory does and it causes all sorts of synchronization issues and glitches - non-hosting players not seeing buildings or stuff on belts or even glitching into/colliding with things that were not there a split second ago, as they loaded from the server too slowly.
That won't really happen in Factorio due to the shared state, and it makes possible insanely complex layouts that all players can see and interact with exactly the same. The downside usually is that eventually - just after you got that 20 GW nuclear power plant with trucked water and steam going - the slowest player starts to have issues as their hardware can no longer keep up with the complexity of the simulation.
Then you either start shedding players or call it a day and start again with a different objective or mod.
I built a similar system for a Doom port--compatibility was an important issue for me, and conveniently I was adding client/server multiplayer with state deltas, so I had all the infra built to do a full dump every tic. It sounds wild, but if your game is already deterministic and you support saving, you're 80% of the way there.
Automated testing is great! More game developers should get serious about using it, even if you're not doing full coverage unit testing like business software.
Yes! Automated testing is a super power. I haven’t had a single bug in production for 5+ years because of automated tests.
What's CRC?
I'm guessing cyclic redundancy check. You turn the game state in each tick into a number, then play the exact same game on both platforms. If you get the same stream of numbers, you can be pretty sure the game is behaving identically.
It's likely a hash of the game state. They save input recordings and play them back, expecting to always receive the same hash for the game state every frame.
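As a hypothetical sketch of the idea (names and structure are mine, not Factorio's): fold the relevant simulation state into one checksum per tick, log it on both platforms for the same replay, and the first tick where the logs diverge is where determinism broke:

```cpp
#include <cstdint>
#include <cstddef>

// A simple software CRC-32 (reflected, poly 0xEDB88320); any hash would do.
std::uint32_t crc32_step(std::uint32_t crc, const void* bytes, std::size_t len) {
    const unsigned char* p = static_cast<const unsigned char*>(bytes);
    while (len--) {
        crc ^= *p++;
        for (int k = 0; k < 8; ++k)
            crc = (crc >> 1) ^ (0xEDB88320u & (0u - (crc & 1u)));
    }
    return crc;
}

struct EntityState { std::int32_t x, y; std::uint32_t energy; };

// Hash only well-defined fields, never padding bytes or pointer values,
// otherwise the checksum itself becomes non-deterministic.
std::uint32_t checksum_tick(const EntityState* entities, std::size_t count,
                            std::uint64_t tick_number) {
    std::uint32_t crc = 0xFFFFFFFFu;
    crc = crc32_step(crc, &tick_number, sizeof tick_number);
    for (std::size_t i = 0; i < count; ++i) {
        crc = crc32_step(crc, &entities[i].x, sizeof entities[i].x);
        crc = crc32_step(crc, &entities[i].y, sizeof entities[i].y);
        crc = crc32_step(crc, &entities[i].energy, sizeof entities[i].energy);
    }
    return crc;
}
```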
Just this week I started on a similar system for a game engine I work on (Zelda Classic). Main reason was for testing purposes, but another benefit is that it is SO cool to see the game replayed for you. Like those self playing pianos.
cyclic redundancy check