- recording your screen but not streaming
- you are not customizing what goes into your recording
Then use something else. GPU Screen Recorder has lower overhead and produces much smoother recordings: https://git.dec05eba.com/gpu-screen-recorder/about/
Edit: I think you might have skipped reading the post. It's about OBS on macOS, where QuickTime exists. Your suggestion seems geared toward Linux.
A famously missing macOS feature. Loopback is yonder: https://rogueamoeba.com/loopback/
> the shortcut to stop screen recording on QuickTime sucks, it’s like CMD+CTRL+ESC
I just stop it from the menu bar, then on the resultant video press Cmd-T (trim) to lop off that footage.
For AAA titles with newer graphics, well, you can always capture the screen of the PC with the Nvidia card through a capture card.
Back in my streaming days, circa 2017, macOS was not an option. Today I'd do it with any M-series Mac without a second thought.
“OBS Studio Gets A New Renderer: How OBS Adopted Metal”
But they’ve clearly learned a lot that will help in the future with other modern APIs like DX12 or Vulkan.
That's beside the point though; the OS has been trash for realtime encoding for over a decade now. At the very least you have to write a script to repeatedly renice the process back to the top when it tries to protect you from the excessive thermal load lmao
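Roughly what that babysitting script amounts to — a minimal sketch in Swift, assuming you've already found the encoder's PID; the -10 priority and 2-second interval are arbitrary, and negative nice values need root:

```swift
import Foundation

// Re-apply a high priority every few seconds, undoing the OS's demotion.
// The PID is taken from the command line; everything here is illustrative.
let pid = CommandLine.arguments.dropFirst().first ?? "0"

while true {
    let renice = Process()
    renice.executableURL = URL(fileURLWithPath: "/usr/bin/renice")
    renice.arguments = ["-n", "-10", "-p", pid]
    try? renice.run()
    renice.waitUntilExit()
    Thread.sleep(forTimeInterval: 2) // until the scheduler demotes it again
}
```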
> Metal takes Direct3D's object-oriented approach one step further by combining it with the more "verbal" API design common in Objective-C and Swift in an attempt to provide a more intuitive and easier API for app developers to use (and not just game developers) and to further motivate those to integrate more 3D and general GPU functionality into their apps.
Slightly off-topic perhaps, but I find it amazing that an OS-level 3D graphics API can be built in such a dynamic language as Objective-C; I think it really goes to show how much optimization has been put into `objc_msgSend()`... it does a lot of heavy lifting in the whole OS.

In the early 2000s there was a book on using Direct3D from C# that was pretty influential in changing people's assumption that you couldn't do high-performance graphics in a GC'd language. In the end, a lot of the ideas overlap with what C/C++ gamedevs do, like structuring everything around fixed-size tables allocated at load time and then minimal dynamic memory usage within the frame loop. The same concepts can apply at the graphics API level: minimize any dynamic-language overhead by dispatching work in batches that reference preallocated buffers. That gets the language runtime largely out of the way.
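That "preallocated tables, batched dispatch" idea maps straight onto Metal. A minimal sketch in Swift — the class, capacity, and per-sprite layout are illustrative assumptions, not anything from OBS:

```swift
import Metal

final class SpriteBatch {
    static let maxSprites = 4096
    static let spriteStride = MemoryLayout<SIMD4<Float>>.stride // x, y, w, h

    let buffer: MTLBuffer
    private var count = 0

    init(device: MTLDevice) {
        // One allocation at load time; never grown inside the frame loop.
        buffer = device.makeBuffer(length: Self.maxSprites * Self.spriteStride,
                                   options: .storageModeShared)!
    }

    func beginFrame() { count = 0 }

    func push(_ sprite: SIMD4<Float>) {
        guard count < Self.maxSprites else { return } // drop, don't reallocate
        buffer.contents()
            .advanced(by: count * Self.spriteStride)
            .storeBytes(of: sprite, as: SIMD4<Float>.self)
        count += 1
    }

    func encode(into encoder: MTLRenderCommandEncoder) {
        // One instanced draw for the whole batch: objc_msgSend and ARC get
        // hit a handful of times per frame instead of once per sprite.
        encoder.setVertexBuffer(buffer, offset: 0, index: 0)
        encoder.drawPrimitives(type: .triangleStrip, vertexStart: 0,
                               vertexCount: 4, instanceCount: count)
    }
}
```

The dispatch cost is paid per batch rather than per object, which is why the runtime stays largely out of the way.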
Is that really what you’d have to go through to have a working system with plugin shaders from 3rd parties on multiple backends? Or is it mostly the result of time and of trying to keep backwards compatibility with existing plugins?
Telling external devs “Write a copy in every shader language” would certainly be easier for the core team but that’s obviously undesirable.
I hope the next version actually works in some capacity.