Part of the original dispute was Epic offering a 20% discount on V-Bucks if the payment was processed through Epic instead of Apple.
The end user got a nice discount and Epic still pockets more because interchange fees are ~2.5% and not 30%.
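To put rough numbers on it (using a $9.99 V-Bucks bundle purely as an example): through Apple, Epic keeps about $6.99 after the 30% cut; selling the same bundle directly at $7.99 with ~2.5% in processing fees nets roughly $7.79. The player pays less and Epic still takes home more.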
The Epic Games Store had a similar strategy, cut the fee for devs and incentivize end users to move onto their platform (weekly free games). They still make a boatload of money even though it's not as much as Valve's money printer.
In general I see this as a great thing for devs (indie in particular) if it triggers more competition to bring platform fees down across the board.
Epic isn't being overtly greedy with end users (yet)
Retailers take far less than 30%. Consoles provide first-party QA, disc/cartridge manufacturing and distribution, bandwidth, and customer service among other services. Whole apps on the mobile stores also have similar services provided by the platform - testing (nowhere near as rigorous as consoles), discovery, bandwidth, customer service, etc. In both cases, a large cut can be reasonably argued for.
IAP is nothing more than moving money around, in an app the user has already discovered and downloaded, that will affect a user's account on the developer's servers. The service to the end user is indistinguishable from what is provided by any other payment gateway for ~3% (Stripe, Square, PayPal, etc.). Your card is charged and you have something new in the app.
The only reason I can think of for why IAP takes the same cut as whole apps is that most if not all apps would be "free" but open up to a screen that makes you pay for the app via IAP to take advantage of the reduced fee.
This could be mitigated by ToS restrictions preventing this exact situation, but there would still be a ton of gray area like "pro" versions of apps.
Opening up to third-party payment processors for IAP would create a vacuum in one of their highest-margin and most consistent revenue sources, so they won't be doing that willingly. Opening up to third-party stores would be more tolerable, but any sufficiently large developer will move to their own store, do everything themselves, and pocket the remaining ~27.5% (if they are their own payment gateway, interchange fees are ~2.5%).
It's an interesting situation because the platforms want to be paid for all the services I listed above, but in Epic's case they already have the infrastructure to handle everything on their own, and they offer a fully-featured game for free, with the only source of revenue being a conversion of actual money to V-Bucks.
There is no place where the platform can provide a service that Epic would get any value from, but they are imposing a 30% fee in the only place they can, payment processing.
I 100% agree and I also work in AAA. On the subject of StackOverflow being worthless... I've been working for 4 years now and I have learned pretty consistently over that time that the best way to solve a problem is to just keep digging deeper into the system with the issue.
You will eventually figure out that either the system has a bug or you used it wrong. And along the way you will familiarize yourself more with the system. (and debugging tools!)
The learning effect of this snowballs the more you do it. I'm a year and a half into a UE4 project and am now the "engine person" who people come to with questions or odd crashes.
I have seen every single pattern this book describes used somewhere within Unreal. They are all super valuable to know especially within game programming where problems are novel and often open ended.
>I have learned pretty consistently over that time that the best way to solve a problem is to just keep digging deeper into the system with the issue.
>You will eventually figure out that either the system has a bug or you used it wrong. And along the way you will familiarize yourself more with the system. (and debugging tools!)
I identify with this so much. It's a different kind of programming, where you're finding the source of the bleeding instead of just bandaging the wound, which is much more like what I encountered in the tech industry.
10 years ago, Stackoverflow was fantastic. I feel a bit like its usefulness has dwindled.
But it's never been about debugging deep problems. Nobody can look at what's happening in your debugger. It's mostly useful for how to use specific libraries and frameworks or basic language features.
Unreal's Python integration is incredible. I made some modifications to it at work to run a version of WinPython (both for loose scripts outside the engine and pip access)
It's great for complex asset pipelines and quick one-off editor scripts, at least in my experience.
The only thing that felt a bit wonky to me was attempting to use bitflag enums.
Trying to find a GC-related crash on a stale pointer that is only reproducible in a Shipping build on a single platform is fun (depending on your definition of fun)
(Shipping in UE4 means release mode with full optimization enabled, most logging & profiling stripped out, etc)
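For anyone who hasn't chased one of these: the usual culprit is a raw UObject pointer the garbage collector can't see. A minimal sketch of the setup (AMyActor and the member names are made up for illustration):

```cpp
#include "GameFramework/Actor.h"
#include "Engine/DataAsset.h"
#include "MyActor.generated.h"

// Minimal sketch of the classic UE4 stale-pointer setup. The GC only tracks UObject
// references it can see through reflection, i.e. UPROPERTY() members. A plain raw
// pointer is invisible to it, so the pointed-to object can be collected underneath you.
UCLASS()
class AMyActor : public AActor
{
    GENERATED_BODY()

public:
    UPROPERTY()
    UDataAsset* TrackedAsset = nullptr;   // Reflected: the GC sees this reference and
                                          // keeps the object alive while it's held here.

    UDataAsset* UntrackedAsset = nullptr; // Not reflected: the GC is free to collect the
                                          // object, and a later dereference crashes --
                                          // often only when GC timing lines up, which is
                                          // why it can repro in one build config/platform.
};
```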
It's possible that it's an async compute task, which could potentially miss a frame and show old data (instead of the whole frame missing vsync).
Also this demo is supposed to be running on a PS5 devkit, which means that you'd need a devkit to run it, which means that you'd need to sign NDAs and join their developer programs and whatnot.
Having worked with current-gen consoles (meaning I can't go into any amount of detail), I can say it's not a trivial thing to get a demo like this running well on PCs. This demo is likely making use of every platform-specific feature available to them.
That said, the demo might be accessible through some back channels if you're already a UE4 licensee and have a PS5 devkit.
The link I shared wasn't UE5, and their other videos specifically indicate that they are running on PC - unbelievably modest PC hardware at that.
> it's not a trivial thing to get a demo like this running well on PCs
This tired old argument.
The current theory for how the meshing is done is something similar to mesh shaders (available on commodity PC hardware since 2018)[1]. Here is this "PS5 platform-specific feature" running on PC in 2018[2].
As for the lighting, NVIDIA has already had this "platform-specific feature" on PCs for some time now. It's called RTX. In 2018[3] (using DLSS), and in 2020[4] (no apparent DLSS usage, though it may have improved).
Both next-gen consoles are essentially PCs. Their primary advantage is tightly coupled hardware (e.g. memory latency, the absurdly fast PS5 SSD). While dedicated raytracing silicon on AMD is currently unique to the PS5 (AMD claims they can emulate DXRT on Navi), it has been around for more than a year in consumer hands in the form of RTX.
I work in AAA. I'm talking about lower-level things like picking which "type" of GPU memory to allocate, access to specific registers in shaders, etc. PC didn't have real async compute capabilities until DX12, for example.
On the CPU side, yeah, it's 100% just a normal computer, but nothing will be interrupting your threads. I think Windows 10 tries to do something similar with its new Game Mode too.
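To make the GPU-memory point a bit more concrete for the PC side: the closest analogue there is choosing a D3D12 heap type when creating a resource, which is far coarser than what console APIs expose. A rough sketch (error handling omitted; `device` is just an existing ID3D12Device):

```cpp
#include <d3d12.h>
#include "d3dx12.h" // CD3DX12_* helper structs from the official d3dx12 helpers

// Rough sketch: on PC/D3D12 the "type" of GPU memory is expressed as a heap type.
// DEFAULT = GPU-local VRAM, UPLOAD = CPU-writable staging memory the GPU reads over
// the bus, READBACK = GPU-writable memory the CPU reads back afterwards.
void AllocateBuffers(ID3D12Device* device, UINT64 sizeInBytes,
                     ID3D12Resource** gpuLocal, ID3D12Resource** staging)
{
    const CD3DX12_RESOURCE_DESC bufferDesc = CD3DX12_RESOURCE_DESC::Buffer(sizeInBytes);

    // Fast GPU-local memory: what you want for anything the GPU touches every frame.
    const CD3DX12_HEAP_PROPERTIES defaultHeap(D3D12_HEAP_TYPE_DEFAULT);
    device->CreateCommittedResource(&defaultHeap, D3D12_HEAP_FLAG_NONE, &bufferDesc,
                                    D3D12_RESOURCE_STATE_COMMON, nullptr,
                                    IID_PPV_ARGS(gpuLocal));

    // CPU-visible upload memory: slow for the GPU, but the CPU can map and fill it.
    const CD3DX12_HEAP_PROPERTIES uploadHeap(D3D12_HEAP_TYPE_UPLOAD);
    device->CreateCommittedResource(&uploadHeap, D3D12_HEAP_FLAG_NONE, &bufferDesc,
                                    D3D12_RESOURCE_STATE_GENERIC_READ, nullptr,
                                    IID_PPV_ARGS(staging));
}
```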
Sorry for assuming the link was the PS5 one. I have a UDN account and their login setup sometimes just dumps me to their homepage, so I made the assumption that it was the same video that I had seen everywhere else.
AMD GCN absolutely supports async compute[1]. Radeon cards for years would only make use of the ACEs in pure compute contexts, as OpenGL and DX11 had no concept of a secondary command queue and could not make use of them. This is a big part of the reason why Vulkan/DX12 require so much boilerplate to get a triangle rendered.
The PS3's SPU definitely counts as async compute especially with how it was used later in the console lifecycle[2] once people had time to familiarize themselves with it.
However, in the current gen consoles, you don't have to deal with a different ISA, command queuing, and shared memory between the GPU and CELL processor. You are only writing HLSL/GLSL/PSSL and setting up an aggressive amount of fencing to transition resources between readable and writable states within the GPU.
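For the PC-minded, the rough D3D12 equivalent of that dance looks something like this (a simplified sketch; the queues, command lists, fence, and resource are assumed to already exist, and error handling is omitted):

```cpp
#include <d3d12.h>
#include "d3dx12.h"

// Simplified sketch of the fencing/transition pattern in D3D12 terms:
// 1) a fence makes the graphics queue wait until the async compute queue has finished,
// 2) a barrier transitions the resource from GPU-writable (UAV) to GPU-readable (SRV).
void SyncComputeIntoGraphics(ID3D12GraphicsCommandList* computeList,
                             ID3D12GraphicsCommandList* graphicsList,
                             ID3D12CommandQueue* computeQueue,
                             ID3D12CommandQueue* graphicsQueue,
                             ID3D12Fence* fence, UINT64& fenceValue,
                             ID3D12Resource* resource)
{
    // Submit the async compute work that wrote `resource` through a UAV.
    computeList->Close();
    ID3D12CommandList* computeLists[] = { computeList };
    computeQueue->ExecuteCommandLists(1, computeLists);

    // Fence: graphics work submitted after this Wait() won't start on the GPU
    // until the compute queue signals that it has reached this point.
    ++fenceValue;
    computeQueue->Signal(fence, fenceValue);
    graphicsQueue->Wait(fence, fenceValue);

    // Barrier: transition the resource from writable to readable before the
    // graphics pass samples it.
    const CD3DX12_RESOURCE_BARRIER toReadable = CD3DX12_RESOURCE_BARRIER::Transition(
        resource,
        D3D12_RESOURCE_STATE_UNORDERED_ACCESS,
        D3D12_RESOURCE_STATE_PIXEL_SHADER_RESOURCE);
    graphicsList->ResourceBarrier(1, &toReadable);
    // ... record the graphics pass that reads the resource, then close/submit graphicsList.
}
```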
I'm 25 and work at a relatively large game studio. My boss and the whole chain up (3 people) are all in their late 40s or older. I've learned so much from them, as well as from the senior engineers that I sit next to.
After my experience here, a low average age for an engineering team is a red flag. You're better off for not having been offered or taking that job.
My i7-2600k @ 4.4GHz agrees with you. The only reason I would upgrade would be for better USB 3+ support (I have a hard time with anything more complex than a SuperSpeed flash drive).
SSDs are by far the most cost-effective upgrade you can get nowadays. HDDs tend to be the bottleneck for boot times and general "snappiness".
The reason I would upgrade from there is PCIe 3.0+, and thus support for fast NVMe drives. They are almost as much of a step up from a SATA SSD as a SATA SSD was from spinning rust, IMHO
IIRC the Sandy Bridge generation could only do PCIe 2.0.
Very true, I did just recently pass up a great deal on an NVMe drive because I can't use it in my current setup. I believe PCIe 2.0 will also bottleneck USB 3.2 (or Gen 2x2 or whatever the naming scheme is now), and whatever GPU I upgrade to next (I read that it's a bottleneck with the GTX 1080 and up)
NVMe is the only thing that will cause a noticeable improvement for me though, seeing as I still game on a 1080p60 monitor and generally don't need that sort of speed from any USB peripheral.
Still, the processor itself kicks ass and I think the only reason most people would need to upgrade is for newer peripherals.
I work in games doing mainly graphics work - it's amazing how many of these concepts still exist and have been recycled in interesting ways. Well worth the read if you're in my line of work.
For example, the concept of "sorted spans" in Quake is conceptually the same as how "light culling" is done in deferred and forward+ rendering pipelines. The first I'd heard of the technique was how Battlefield 3 used the PS3's SPU to do light culling for 64x64 blocks of pixels at a time.
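A stripped-down, CPU-side sketch of the tile-bucketing idea (real implementations run on the GPU or, in the BF3 case, on the SPUs, and also cull against each tile's min/max depth; the names here are made up):

```cpp
#include <algorithm>
#include <cstdint>
#include <vector>

// Simplified sketch of tiled light culling: divide the screen into tiles and build a
// per-tile list of lights whose screen-space bounds touch that tile. During shading,
// each tile then only loops over its own list instead of every light on screen.
struct LightBounds { int minX, minY, maxX, maxY; }; // light's screen-space AABB, in pixels

std::vector<std::vector<uint32_t>> CullLightsIntoTiles(
    const std::vector<LightBounds>& lights,
    int screenWidth, int screenHeight, int tileSize /* e.g. 64 */)
{
    const int tilesX = (screenWidth  + tileSize - 1) / tileSize;
    const int tilesY = (screenHeight + tileSize - 1) / tileSize;
    std::vector<std::vector<uint32_t>> tileLights(tilesX * tilesY);

    const uint32_t lightCount = static_cast<uint32_t>(lights.size());
    for (uint32_t lightIndex = 0; lightIndex < lightCount; ++lightIndex)
    {
        const LightBounds& b = lights[lightIndex];

        // Skip lights entirely off screen.
        if (b.maxX < 0 || b.maxY < 0 || b.minX >= screenWidth || b.minY >= screenHeight)
            continue;

        // Clamp the light's pixel bounds to the tile grid.
        const int firstTileX = std::max(0, b.minX / tileSize);
        const int firstTileY = std::max(0, b.minY / tileSize);
        const int lastTileX  = std::min(tilesX - 1, b.maxX / tileSize);
        const int lastTileY  = std::min(tilesY - 1, b.maxY / tileSize);

        for (int ty = firstTileY; ty <= lastTileY; ++ty)
            for (int tx = firstTileX; tx <= lastTileX; ++tx)
                tileLights[ty * tilesX + tx].push_back(lightIndex);
    }
    return tileLights;
}
```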
From my experience (graduated in 2016), most interviewing is centered around algorithmic complexity, or at least regurgitating logarithmic-complexity algos.
Potential hires still in or just out of school should have no problem answering those questions, but a few years out and most people forget those skills since most of the time the answer is to use an existing implementation or find a way to avoid the problem entirely. All of the people I know with a 4-year CS degree learned all about that stuff in their data structures/intro to algo classes.
I work in games and have had to implement a few data structures on my own (mainly specialized trees and graphs). I've seen them help performance a ton, and I've also had to scrap one or two of them because the naive implementation was faster. Nowadays a lot of indirection means your processor is spending most of its time waiting on memory reads, while flat arrays can be loaded into CPU caches a lot more efficiently.
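A tiny illustration of that last point (deliberately simplified; profile before believing anything):

```cpp
#include <vector>

// Simplified illustration of why the "naive" flat array sometimes wins: summing values
// stored contiguously walks memory linearly and is cache/prefetch friendly, while a
// node-based structure chases pointers and can stall on a cache miss at every node.

struct Node { float value; Node* next; };

float SumLinkedList(const Node* head)
{
    float sum = 0.0f;
    for (const Node* n = head; n != nullptr; n = n->next)
        sum += n->value; // each hop may land on a cold cache line
    return sum;
}

float SumFlatArray(const std::vector<float>& values)
{
    float sum = 0.0f;
    for (float v : values)
        sum += v; // contiguous reads; the hardware prefetcher does the heavy lifting
    return sum;
}
```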