Great article. The description of how they handle shaders is just bonkers to me.
Is that really what you’d have to go through to have a working system with plugin shaders from 3rd parties on multiple backends? Or is it mostly the result of time and of trying to keep backwards compatibility with existing plugins?
Telling external devs “Write a copy in every shader language” would certainly be easier for the core team but that’s obviously undesirable.
Transpiling shaders is what most game engines have done for a decade now. Everybody thinks it's stupid in that field as well, but there is no viable alternative.
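The usual flow, for anyone who hasn't seen it: the shader gets compiled once to an intermediate form (SPIR-V), and each backend's source is machine-generated from that. Here's a rough sketch of that idea using SPIRV-Cross; this is the generic engine approach rather than OBS's own effect-file translators, and real code would set per-backend options and handle errors:

```cpp
// Sketch of the "author once, transpile per backend" flow with SPIRV-Cross.
// Assumes `spirv` already holds a shader compiled to SPIR-V (e.g. by glslang
// or dxc); per-backend options and error handling are omitted.
#include <cstdint>
#include <string>
#include <vector>

#include <spirv_cross/spirv_glsl.hpp>
#include <spirv_cross/spirv_hlsl.hpp>
#include <spirv_cross/spirv_msl.hpp>

struct BackendShaders {
    std::string glsl;
    std::string hlsl;
    std::string msl;
};

BackendShaders transpile(const std::vector<uint32_t>& spirv) {
    BackendShaders out;

    spirv_cross::CompilerGLSL glsl(spirv);   // OpenGL / GLES backends
    out.glsl = glsl.compile();

    spirv_cross::CompilerHLSL hlsl(spirv);   // Direct3D backends
    out.hlsl = hlsl.compile();

    spirv_cross::CompilerMSL msl(spirv);     // Metal backend
    out.msl = msl.compile();

    return out;
}
```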
“Mac-only” was disappointing to read, but OBS’ render performance has been fine on macOS and Linux even with older hardware. James Webb calls anything heavier than helium “metal.”
That's beside the point though; the OS has been trash for realtime encoding for over a decade now. At the very least you have to write a script to repeatedly renice the process back to the top when it tries to protect you from the excessive thermal load lmao
While Metal might be easier to use, I'm pretty sure it is still easier to have to worry about Vulkan alone than Vulkan+Metal. And Metal predating Vulkan is really only of concern to code that existed before Vulkan was made available (which wasn't that much).
I'm no expert on the topic, so I maybe understood only 5% of what I read, but I wish we had more posts like this. Announcements without any technical details sound like marketing pieces.
I’m more excited about the upcoming support for VST3, but this is still welcome news. It is far easier than getting hardware encoding working with Rockchip SoCs on Linux.
> Metal takes Direct3D's object-oriented approach one step further by combining it with the more "verbal" API design common in Objective-C and Swift in an attempt to provide a more intuitive and easier API for app developers to use (and not just game developers) and to further motivate those to integrate more 3D and general GPU functionality into their apps.
Slightly off-topic perhaps, but I find it amazing that an OS-level 3D graphics API can be built in such a dynamic language as Objective-C; I think it really goes to show how much optimization has been put into `objc_msgSend()`... it does a lot of heavy lifting in the whole OS.
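To make the "dynamic" part concrete: every Objective-C method call, Metal included, ends up as a plain C call through `objc_msgSend()`, which is why that one function gets so much optimization attention. A tiny illustration, callable from C++ on a Mac (link against Foundation; leaks and error checks ignored):

```cpp
// Every [receiver message] in Objective-C lowers to a call like this.
// Here we send `new`, `description` and `UTF8String` purely from C++.
#include <objc/message.h>
#include <objc/runtime.h>
#include <cstdio>

int main() {
    // Equivalent of: id obj = [NSObject new];
    Class cls = objc_getClass("NSObject");
    id obj = ((id (*)(Class, SEL))objc_msgSend)(cls, sel_registerName("new"));

    // Equivalent of: NSString *desc = [obj description];
    id desc = ((id (*)(id, SEL))objc_msgSend)(obj, sel_registerName("description"));

    // -UTF8String gives us a C string we can print.
    const char* s = ((const char* (*)(id, SEL))objc_msgSend)(desc, sel_registerName("UTF8String"));
    std::printf("%s\n", s);
}
```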
Modern graphics APIs minimize the number of graphics API calls vs. OpenGL and similar. Vulkan/Metal/DirectX 12 will have you pass command buffers with many commands in them instead of separate API calls for everything.
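Roughly what that looks like with Apple's metal-cpp bindings (from memory, so treat it as a sketch rather than gospel): the per-frame work is just recording commands into one buffer and committing it once, instead of a long stream of independent state-setting calls like classic OpenGL.

```cpp
// Sketch of the command-buffer model with metal-cpp. Assumes the queue,
// pipeline, vertex buffer and render pass descriptor were created once at
// startup; the frame loop only records commands and commits one buffer.
#include <Foundation/Foundation.hpp>
#include <Metal/Metal.hpp>

void draw_frame(MTL::CommandQueue* queue,
                MTL::RenderPipelineState* pipeline,
                MTL::Buffer* vertexBuffer,
                MTL::RenderPassDescriptor* passDesc)
{
    MTL::CommandBuffer* cmd = queue->commandBuffer();
    MTL::RenderCommandEncoder* enc = cmd->renderCommandEncoder(passDesc);

    enc->setRenderPipelineState(pipeline);
    enc->setVertexBuffer(vertexBuffer, /*offset=*/0, /*index=*/0);

    // Many draws can be recorded here; none of them reach the GPU yet.
    enc->drawPrimitives(MTL::PrimitiveTypeTriangle,
                        NS::UInteger(0), NS::UInteger(3));

    enc->endEncoding();
    cmd->commit();   // the whole batch is submitted in one call
}
```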
In the early 2000s there was a book on using Direct3D from C# that was pretty influential as far as changing people's assumption that you couldn't do high-performance graphics in a GC'd language. In the end a lot of the ideas overlap with what C/C++ gamedevs do, like structuring everything around fixed-size tables allocated at load time and then minimal dynamic memory usage within the frame loop. The same concepts can apply at the graphics API level: minimize any dynamic language overhead by dispatching work in batches that reference preallocated buffers. That gets the language runtime largely out of the way.
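The pattern looks something like this (names made up for illustration; the same shape works in C# or any GC'd language): allocate the tables once at load, write into them during the frame, and hand the whole batch over with a single call.

```cpp
// Sketch of the "fixed tables at load time, no allocation in the frame loop"
// pattern. DrawItem and submit_batch() are hypothetical placeholders.
#include <array>
#include <cstddef>

struct DrawItem {
    std::size_t mesh;      // index into a preloaded mesh table
    std::size_t material;  // index into a preloaded material table
    float       transform[16];
};

constexpr std::size_t kMaxDrawsPerFrame = 4096;

struct FrameQueue {
    std::array<DrawItem, kMaxDrawsPerFrame> items{};  // allocated once
    std::size_t count = 0;

    void push(const DrawItem& d) {
        if (count < kMaxDrawsPerFrame)
            items[count++] = d;            // no heap allocation here
    }
    void reset() { count = 0; }            // reuse the same storage next frame
};

// Stand-in for whatever actually records the draws into the graphics API.
void submit_batch(const DrawItem*, std::size_t) { /* one call per batch */ }

void frame(FrameQueue& q) {
    q.reset();
    // ... game code pushes draws into q ...
    submit_batch(q.items.data(), q.count);  // hand the whole batch over at once
}
```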
No, it doesn't. You won't find it used much, if at all, at these levels of the OS. Once you get past Cocoa and friends, it's restricted subsets of C++ (IOKit, for example).
No it's not: the compiler for MSL is of course C++ because it's LLVM, but the runtime is absolutely written in Objective-C (there weren't even C++ bindings until recently).
No, I mean what is inside the Objective-C objects. Essentially everything on macOS has an Objective-C API but is implemented using C++. Have you ever noticed the ".cxx_destruct" method on like all objects?
What you are talking about are C++ wrappers around the Metal Objective-C API. Yes, it is weird, as they are going C++ -> Objective-C -> C++. Why not go directly? Because Apple does not ship C++ system frameworks.
I hope modern GPU APIs are just a stepping stone to something simpler. OpenGL is loved and hated, and I have grown to love it after using the new stuff.
It says in passing, “As the Metal backend is only supported on Apple Silicon devices, GPU and CPU share the same memory”, in the part talking about the differences between the Direct3D and Metal render pipelines.
Not sure why though, because Metal 3 is still supported on a bunch of Intel Macs...
I actually used an M1 MacBook Air for encoding/compositing by sending the video/audio sources over from my main PC with DistroAV (LAN).
Worked reasonably well (you can send camera/VTuber output and captured video from game and any overlays separately, or just use the setup in a similar way to a capture card and run ONLY the game on the gaming PC and everything else on the Mac), but added some complexity to it all.
A beefy Nvidia GPU would make that setup unnecessary, unless you want to play games directly on the Mac.
Streaming video from a camera? In general the newer Mac Minis were fine already just because the M-series chips are very fast, but hopefully this should make it much more efficient.
Not all streamers are game streamers, and not all OBS users are streamers. I installed it on all of my workstations for its screen capture and virtual camera features.
I think the point extends well beyond the specific app/OS example though, even though the article talks about macOS exclusively. Both macOS and Windows have built-in tools that offer direct recording functionality: on macOS you trigger it with Command+Shift+5 (or launch it via QuickTime, as jasonlotito noted), on Windows with Win+Shift+S. Both of these use the same OS APIs OBS Studio uses to get the screen content, but they skip the step of needing a renderer at all.
You need to install 3rd-party software (Blackhole) to even get desktop audio for screen recording with QuickTime. After about an hour of troubleshooting settings I gave up and used OBS, especially since I was in a public space at the time and the Blackhole config disabled my headphones, so for a moment you could hear a loud YouTube tutorial playing through my Mac speakers. Also, the shortcut to stop screen recording in QuickTime sucks; it’s something like CMD+CTRL+ESC, and you need to have it memorized because there’s no “Stop Recording” button.
I've had a lot of issues using the QuickTime screen recorder, especially when it comes to recording from an iOS simulator for app/game development and needing to produce preview videos.
Does anyone know if an AMD 8845HS with 780M graphics (running Fedora) can do this? Ideally with very low system resource usage (I only have 16GB RAM) and very little storage space used; one or two frames per second is enough, ideally compressing even more if nothing has changed on the screen for a while, and ideally creating a new file every eight hours or so.
It should work, yes. Fedora by default disables hardware-accelerated video encoding, but if you use Flatpak versions of software (in this case the Flatpak version of GPU Screen Recorder) then it should work. Even 12-year-old GPUs work.
Lower framerate doesn't really decrease video size much, because modern codecs already spend very few bits on frames where nothing changes, but you can lower the bitrate/quality setting for the recording to reduce the video quality a bit and decrease the size.
NVIDIA has a "lower overhead" screen recorder, no? It's Alt+F9 or something. AFAIK it's supposed to be optimized, because they own the stack and all. It's probably Windows-only though.
https://devblogs.microsoft.com/directx/introducing-advanced-...
“OBS Studio Gets A New Renderer: How OBS Adopted Metal”
And I call it great music.
But they’ve clearly learned a lot that will help in the future with other modern APIs like DX12 or Vulkan.
Vulkan support was introduced in OBS Studio 25.0 in March 2020, 5.5 years ago.
Metal DOES... but only on Apple hardware.
The term is Objective-C++.
For AAA titles with newer graphics, well, you can always send the screen of the PC with the Nvidia card through a capture card.
Back in my days of streaming, circa 2017, macOS was not an option. Today I'd do it with any M-series Mac without a second thought.
I hope the next version actually works in some capacity.
Turning off nearly everything iCloud- or Spotlight-related is a pretty good start; disable network access and you may find even more pearls of wisdom.
- you are recording your screen but not streaming
- you are not customizing what goes into your recording
Then use something else. GPU screen recorder has a lower overhead and produces much smoother recordings: https://git.dec05eba.com/gpu-screen-recorder/about/
A famously missing macOS feature. Loopback is yonder: https://rogueamoeba.com/loopback/
> the shortcut to stop screen recording on QuickTime sucks, it’s like CMD+CTRL+ESC
I just stop it from the menu bar, then on the resultant video press Cmd-T (trim) to lop off that footage.
Edit: I think you might have skipped reading the post. It's about OBS on macOS, where QuickTime exists. Your suggestion seems geared toward Linux.