This is really cool. I'm using an M1 Pro at the moment, and with all of the settings turned on I maintain around 50FPS. This seems pretty reasonable to me. Reflections, ambient occlusion, bloom, etc.
It looks like changing the shadow map resolution breaks things:
None of the supported sample types (UnfilterableFloat|Depth) of [Texture "Directional Shadow Depth Texture"] match the expected sample types (Float).
- While validating entries[3] as a Sampled Texture.
Expected entry layout: {sampleType: TextureSampleType::Float, viewDimension: 2, multisampled: 0}
- While validating [BindGroupDescriptor ""G-Buffer Textures Input Bind Group""] against [BindGroupLayout "GBuffer Textures Bind Group"]
- While calling [Device].CreateBindGroup([BindGroupDescriptor ""G-Buffer Textures Input Bind Group""]).
// redacted...
webgpu-sponza-demo/:1 WebGPU: too many warnings, no more warnings will be reported to the console for this GPUDevice.
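For anyone hitting the same class of error in their own WebGPU code: it means a depth texture is being bound to a bind group layout entry declared as a filterable "float" texture, which depth formats don't support. A minimal sketch of the usual fix (hypothetical layout entry, not the demo's actual code):

  const gBufferLayout = device.createBindGroupLayout({
    entries: [
      // ...other G-buffer entries...
      {
        binding: 3,
        visibility: GPUShaderStage.FRAGMENT,
        // Depth textures only support the "depth" or "unfilterable-float"
        // sample types, so declaring "float" here fails bind group validation.
        texture: { sampleType: 'depth', viewDimension: '2d' },
      },
    ],
  })

(On the WGSL side the binding would then be a texture_depth_2d.)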
Nice demo - small tip/bug report: when I enable performance stats I get “FPS: 30.2ms”, which doesn’t make dimensional sense - one side of this statement has the wrong units.
If “30.2ms” is how long it took to render one frame, then label it “frame time”, not FPS (frames per second, or frame rate). Or if you want to show FPS, compute an actual FPS value, i.e. 1000 / frame_time_ms.
(Frame time is a better metric for performance optimization work than the more popularly known frame rate, because frame time is linear, and frame rate isn’t.)
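To make the two concrete (hypothetical overlay code, not the demo's; frameTimeMs and statsLabel are placeholders):

  const fps = 1000 / frameTimeMs  // e.g. 30.2 ms per frame -> ~33.1 FPS
  statsLabel.textContent =
    'Frame time: ' + frameTimeMs.toFixed(1) + ' ms (' + fps.toFixed(1) + ' FPS)'

The linearity point in practice: going from 5 ms to 7 ms and from 28 ms to 30 ms is the same 2 ms of extra work, but the first drops you from 200 to ~143 FPS while the second only drops you from ~36 to ~33 FPS.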
It's nice that you can do this. But 3D rendering in the browser is rare. It's been working for years, with WebGL. Here are some examples.[1] Once in a while you see 3D models you can rotate and zoom. There are 3D games in the browser.[2] Unity will target WebGL if desired. Despite fairly good technology for 3D in the browser, it hasn't really caught on.
WebGPU is more powerful; it's basically Vulkan Lite. Threading is limited, bindless resources are at least two years away, and there's only one queue to the GPU. This limits performance to roughly OpenGL levels. It's not clear there's a big market for slightly better 3D in the browser. You can't do an AAA title in the browser yet, because the browser environment is too weak. But in a few years, maybe.
Unclear where this is going. The near future might be a world in which the only way you can run unapproved programs is via a browser. Phones mostly only run apps from approved app stores, and Windows in S mode only runs approved apps from Microsoft's store. Each year, the restrictions seem to get tighter.
In which case the only way to do 3D anything without paying off the platform operator will be to use WebGPU or WebGL.
[1] https://webglsamples.org/
[2] https://www.crazygames.com/t/3d
It's exciting to see WebGPU move past requiring "this nightly version of a specific browser", but I still had to go over to my desktop for this instead of Safari on iOS. There had been some rumbling that this might be changing in 18.2 https://news.ycombinator.com/item?id=42110252 but I just tried resetting the feature flags to defaults and it was still off by default for me :/.
In this part of the code:
private onKeyDown = (e: KeyboardEvent) => {
// @ts-expect-error Deprecated but still available
this.presedKeys[e.keyCode] = true
}
The suppressed error was trying to highlight that ".keyCode" causes a broken experience when the user has a non-QWERTY keyboard. Switching to ".code" gives consistent, position-based behavior ("KeyW" is always where W is on QWERTY, even when the user is on e.g. AZERTY) for less work than suppressing the error. For user-facing control instructions things get a bit more complicated/dicey if you want 100% polish https://developer.mozilla.org/en-US/docs/Web/API/Keyboard/ge... but if there is a step to skip, it's properly labeling WASD in the user's layout rather than having movement be randomly positioned keys.
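A minimal sketch of the suggested change, keeping the snippet's own property name (the pressed-keys map would just be keyed by strings like "KeyW" instead of numeric key codes):

  private onKeyDown = (e: KeyboardEvent) => {
    // e.code reports the physical key position ("KeyW", "KeyA", ...) regardless
    // of the active layout, and it isn't deprecated, so no suppression needed
    this.presedKeys[e.code] = true
  }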
In the Safari section of the Settings app there is a screen with all the debug flags; you can turn on the flag for WebGPU and then this demo works fine. Be warned, there are many shiny things in that menu… touch nothing but the lamp…
Most people don’t do this or know about it, so it’s still wrong to say that “WebGPU is supported on iOS Safari”. But if you want your iOS Safari to support WebGPU so you can check out demos like this, it can.
One of the big unlocks of WebGPU - shown here - is many lights in the scene, which is not possible with WebGL.
You might notice this still looks pretty dated, and that's primarily because the scene doesn't include ambient occlusion, which is usually the most important lighting feature to fake for realistic looking lighting.
What do you mean? Many lights rendering is very possible in WebGL. You need MRT (Multiple Render Target) support, which is widely available, and you can use that to implement a deferred pipeline. Here's an example from my favorite WebGL library: https://oframe.github.io/ogl/examples/?src=mrt.html
MRT support is available in WebGL 2 by default and in WebGL 1 with an extension.
You might be referring to some of the newer GPU-side light culling algorithms using compute shaders. I think that's the only major drawback of WebGL, the lack of compute shaders, but that can be worked around with some effort.
The only thing which I'd call unreasonable to implement in WebGL would be the fancier virtual geometry approaches like Nanite, but for 98% of web 3D graphics WebGPU still seems excessive to me. Maybe around 2030 it'll be stable and widely available enough to start using for everything.
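To make the MRT point concrete, the WebGL 2 setup for a multi-target G-buffer pass is only a few calls. A sketch, assuming an existing WebGL2 context `gl` and framebuffer dimensions `width`/`height`:

  // Assumes `gl`, `width`, `height` exist; two targets, e.g. albedo + normal
  const gBufferFbo = gl.createFramebuffer()
  gl.bindFramebuffer(gl.FRAMEBUFFER, gBufferFbo)
  const attachments = [gl.COLOR_ATTACHMENT0, gl.COLOR_ATTACHMENT1]
  attachments.forEach((attachment) => {
    const tex = gl.createTexture()
    gl.bindTexture(gl.TEXTURE_2D, tex)
    gl.texStorage2D(gl.TEXTURE_2D, 1, gl.RGBA8, width, height)
    gl.framebufferTexture2D(gl.FRAMEBUFFER, attachment, gl.TEXTURE_2D, tex, 0)
  })
  // One geometry pass writes all targets at once; a lighting pass then reads them
  gl.drawBuffers(attachments)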
You don't even need MRT to do deferred shading. You can do a light pre-pass as described here https://diaryofagraphicsprogrammer.blogspot.com/2008/03/ligh.... There are limitations, because without MRT you can only output the normal in RGB and roughness in A, but it works.
You can support many lights in WebGL just like you would with OpenGL. I implemented a rendering engine in WebGL2 a couple of months ago with deferred shading and support for hundreds of point lights using light volumes. I also implemented ambient occlusion, and directional lights with cascaded shadow mapping.
>One of the big unlocks of WebGPU - shown here - is many lights in the scene, which is not possible with WebGL.
Why not? My Firefox won't run the demo, so I don't know what it is doing that wouldn't be possible with WebGL. AFAIK there is no inherent reason why you can't render many lights using WebGL.
The slow progress of WebGPU and WebTransport has hurt my enthusiasm for games on the web. I was so excited 6-7 years ago but it feels like everything slowed to a crawl.
> Even with WebGL 2.0, there is nothing at the level of iOS and Android OpenGL ES 3.x games, after a decade.
I'm not so persuaded the barrier here is as technological as this forum is predisposed to believe, although I will concede that the Resident Evil iOS battery melter has no web equivalent to date.
The real problem is the web audience is wildly different to other platforms, and has very different expectations which prioritize speed of loading and then extreme long form engagement with little threat. This has created a very different ecosystem, and one that when it encounters something technically impressive goes "oh nice" then moves swiftly on to something else.
For example, you could 100% do Minecraft on the web today, with P2P multiplayer and everything else, and it's kind of revealing that this isn't a huge thing already.
But you absolutely could do Infinity Blade. No one does because it's not worth the effort. (I would argue this was true on iOS too - the games that made money did not look like Infinity Blade).
That recent Marble Madness-alike https://news.ycombinator.com/item?id=42212644 was a far better fit for the audience on the web, and is not technically unimpressive, considering how smooth and responsive it is, along with the image quality.
And I don't have the same amount of assets, but in terms of rendering features this is more than Infinity Blade: https://luduxia.com/whichwayround/
I suggest you go and look at YouTube videos of Infinity Blade, because that game doesn't use physically based lighting models or even have real-time shadowing of any kind. It is all just big textures covering 90% of the screen.
That is my point: there isn’t a technological barrier. It is a business one.
If you made Infinity Blade and put it on the web today what would you get in return for your efforts? Complaints about how it runs better on newer devices than some six year old low end Android running Firefox, and people trying to hack it to change the assets and repackage it on crazygames.
It is a technology barrier as well, because browsers don't provide the tooling native APIs do.
Starting with the lack of mechanisms to actually control the GPU or work around possible driver issues, the lack of debugging tools, and no way to actually fit PlayStation 2, Xbox 360, or Dreamcast class games into the browser sandbox, let alone anything more modern.
It is a black box regarding user and developer experience alike.
You just ignore all contradictory evidence because you don’t understand as much as you think, and have just a superficial grasp of what you are looking at, while having a very nostalgic view of the past.
I am not saying you will get dx12 level games in a browser, certainly not on a phone browser, but your concept of what you are looking at and the real limitations are completely off.
To be honest you come off as stunningly offensive on this subject, but I know you well enough from other areas to know you are far from stupid.
The web environment today is nothing like as hard to work with as the Android NDK was in the early years. Source: I led the tech side at EA doing this, among other things.
It is not in my head; it is what I can see in browsers today across desktop and mobile devices, outside of streaming native rendering or ShaderToy demos.
The PS2 remark could as well be Flash 3D games, given what is available.
This WebGPU demo I'm linking to proves that statement to be incorrect; it's far superior to iOS and Android OpenGL ES content: https://play.spacelancers.com/
What “real games” on mobile devices are you thinking of? I assumed when you said real games you meant desktop games only hardcore gamers play.
So why does one game crashing on a tablet prove anything?
I think you’re right about browsers not providing enough graphics debugging tools… at least half the entire problem is browsers. They also don’t provide storage APIs that can deal with game assets, nor robust APIs for audio & controllers & peripherals. For better or worse, the current set of anti-cheat software for competitive games can’t run in the browser. The other half of the problem is distribution and ecosystem.
To a first approximation, roughly 0% of the problem is WebGL, at least for mobile games, casual games, and most non-AAA games. Graphics is the one thing that’s more or less there and good enough; it’s everything else that’s missing.
It proves there’s a bug in one indie web game, and nothing more. It proves nothing about the process or the ecosystem or what can happen in the future with APIs.
> Even with WebGL 2.0, there is nothing at the level of iOS and Android OpenGL ES 3.x games, after a decade.
The main thing it seems you’re confused about is that ES3 and WebGL2 are very similar, WebGL2 was designed to be compatible with ES3. Why do you believe that ES3 is far superior, and what features, exactly, do you believe ES3 has that WebGL2 doesn’t?
edit: I see you also wrote a webgpu ray tracer, very impressive! I am slowly working on a browser based 3d game in my spare time, your projects are right on my interests. I see you used MIT license on your ray tracer, do you have a license for the sponza demo?
hey, I turn things off dynamically if the framerate dips below 60fps for longer than 2 seconds. Some people will have them unchecked automatically (and of course they have the ability to turn them back on if they really want to).
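Something like this, I assume (hypothetical sketch of that kind of budget check, not the actual demo code):

  let belowTargetSince: number | null = null

  const checkPerformance = (now: number, frameTimeMs: number) => {
    if (1000 / frameTimeMs < 60) {
      belowTargetSince = belowTargetSince ?? now
      if (now - belowTargetSince > 2000) {
        // hypothetical helper: e.g. turn off reflections first, then SSAO, bloom...
        disableNextExpensiveFeature()
        belowTargetSince = null
      }
    } else {
      belowTargetSince = null // back above 60fps, reset the 2-second window
    }
  }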
Neat demo, but it runs pretty poorly on my 1650 Ti. It looks like the framerate is being displayed in the wrong unit, so I'm not sure exactly how fast it's rendering. Also glad to see a TAA toggle; when it's on, the ghosting artifacts are pretty atrocious.
WebGPU tech demo running in modern browsers showcasing various rendering techniques like deferred rendering with 400+ dynamic lights, Hi-Z screen space reflections and cascaded shadow mapping.
The point stands. Your comment added nothing of value to the discussion. You don't need to scrounge karma points by tl;dr-ing stuff for other people. We can all click on the link.
Is it just me, or is performance kinda... bad? Don't get me wrong, it's a stable 144 FPS, but it feels like it got my GPU's fans spinning faster than some modern games did (and indeed, power draw was fairly high). Similarly, when I launched it on my iPhone, even after it automatically disabled reflections, the phone got noticeably warm in my hand in under a minute.
Unless this was meant to be more of a stress test?
webgpu-sponza-demo/:1 No available adapters.
index-BeB41sTJ.js:2288 Uncaught (in promise) TypeError: Cannot read properties of null (reading 'requestDevice')
at yn.initialize (index-BeB41sTJ.js:2288:13559)
at async index-BeB41sTJ.js:2288:13792
Uncaught (in promise) ReferenceError: GPUShaderStage is not defined
On Chrome:
TypeError: Cannot read properties of null (reading 'requestDevice')
It would be great if it showed some message, like, "your browser is not supported", instead of just showing an indicator which spins forever. At first I thought it was downloading a large WASM file and waited for a minute...
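For reference, the standard WebGPU feature-detection dance that would replace the endless spinner looks roughly like this (showMessage is a stand-in for whatever UI the demo uses):

  if (!navigator.gpu) {
    showMessage('This browser does not support WebGPU.') // showMessage is hypothetical
  } else {
    const adapter = await navigator.gpu.requestAdapter()
    if (!adapter) {
      // This is the "No available adapters" case: the API exists but is
      // disabled or unsupported for this GPU/OS/driver combination.
      showMessage('WebGPU is present but no GPU adapter is available.')
    } else {
      const device = await adapter.requestDevice()
      // ...continue initialization with the device
    }
  }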
That "WebGPU: Disabled" can be anything from "Chrome considers the driver too buggy to enable WebGPU on by default for users" to "Chrome doesn't support that GPU/OS/Driver combo for WebGPU at all". You can try force enabling some various GPU flags in chrome://flags/ but whether that's successful will depend on the particular setup.
Until that switches from "Disabled", no WebGPU content or demos will load in your Chrome instance.
You can see overall user support https://web3dsurvey.com/webgpu. Particularly Safari on iOS/macOS and most browsers on Linux are still yet to start rolling out support by default.
You don't need a dedicated graphics processing unit to display video from a computer! Back before GPUs were even a thing the framebuffer would just be a region of memory and the video controller would turn that into a signal. Nowadays the latter is assumed to be integrated with the GPU because "who wouldn't need dedicated graphics processing in a desktop"?
You have to distinguish between the renderer and the assets it is rendering. In the video you linked to there are a huge amount of very specific assets (especially textures) and a few effects are doing a lot of heavy lifting.
In the demo on this page it is fair to say the starting camera position is about the least impressive location in the whole scene though. If you move it to hover up nearer where the lights are you see a lot more is going on.
Firefox: 'Uncaught (in promise) ReferenceError: GPUShaderStage is not defined <anonymous> https://gnikoloff.github.io/webgpu-sponza-demo/assets/index-... <anonymous> https://gnikoloff.github.io/webgpu-sponza-demo/assets/index-... index-BeB41sTJ.js:422:31 "
> Uncaught (in promise) DOMException: WebGPU is not yet available in Release or Beta builds.
Uncaught (in promise) ReferenceError: GPUShaderStage is not defined
edit: seems to be related to this issue https://github.com/gfx-rs/wgpu/issues/5186
Uncaught (in promise) DOMException: WebGPU is not yet available in Release or Beta builds. initialize https://gnikoloff.github.io/webgpu-sponza-demo/assets/index-... <anonymous> https://gnikoloff.github.io/webgpu-sponza-demo/assets/index-... <anonymous> https://gnikoloff.github.io/webgpu-sponza-demo/assets/index-...
https://discussions.unity.com/t/webgpu-support-in-unity-6-1/...
Even with WebGL 2.0, there is nothing at the level of iOS and Android OpenGL ES 3.x games, after a decade.
Additionally, browser vendors haven't yet provided any debugging tools.
That recent example was designed for desktop; for example, it lacks gyro use and doesn't respond well to touch.
That demo looks more like a PS2 kind of thing, year-2000 technology.
You have very serious rose-tinted spectacles.
All of the really impressive rendering taking place in the browser consists of ShaderToy samples and demoscene competition entries.
You definitely would not recover your dev cost.
When will EA prove folks like myself wrong?
It proves how fragile the whole process is after a decade.
UE5 WebGPU demo: https://play.spacelancers.com/
Company website: https://simplystream.com/
I suppose they are unchecked by default so that the demo runs out of the box on worse hardware.
Out of curiosity, are there any of these features that couldn't be done with WebGL 2?
Interesting key ordering in the instructions.
WebGPT has entered the chat.
Did you do the whole reversed-Z trick?
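(For anyone unfamiliar: reversed-Z means swapping near/far in the projection so floating-point depth precision is spent where it matters, then flipping the depth test. In WebGPU terms it's roughly the following, though I'm only guessing at how this demo configures it:)

  const depthStencil = {
    format: 'depth32float',
    depthWriteEnabled: true,
    depthCompare: 'greater', // would be 'less' in a conventional, non-reversed setup
  }
  // ...and in the render pass descriptor, clear depth to 0.0 instead of 1.0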
I despise implementing cascaded shadow maps, and have a lot of respect for anyone that makes them work.
=======================
* Canvas: Hardware accelerated
* Canvas out-of-process rasterization: Enabled
* Direct Rendering Display Compositor: Disabled
* Compositing: Hardware accelerated
* Multiple Raster Threads: Enabled
* OpenGL: Enabled
* Rasterization: Hardware accelerated
* Raw Draw: Disabled
* Skia Graphite: Disabled
* Video Decode: Hardware accelerated
* Video Encode: Software only. Hardware acceleration disabled
* Vulkan: Disabled
* WebGL: Hardware accelerated
* WebGL2: Hardware accelerated
* WebGPU: Disabled
* WebNN: Disabled
If he didn't, he would get the message "This demo requires the modern WebGPU graphics API to run. Seems like your browser does not support it."