I use Keynote to make my presentations, and one time I wanted to build a presentation with someone else. I asked my friend, who has worked at Apple for 20 years, "How do you guys build Keynote presentations together? There doesn't seem to be an easy way to do that."
He said, "We don't collaborate at Apple because of the (perceived) risk of leaks. None of our tools are built for collaboration". Apple is famously closed about information sharing, to the point where on some floors every office requires its own badge, and sometimes even the cabinets within.
So it doesn't surprise me that their video editing tools are designed for a single user at a time.
Edit: This happened about six years ago; they have since added some collaboration tools. However, this is more about the attitude at Apple in general and why their own tools lag on collaboration.
Edit 2: After the replies I thought I was going crazy. I actually checked my message history and found the discussion. I knew this happened pre-COVID, but it was actually in 2013, 12 years ago. I didn't think it was that long ago.
It was quite common to have remote desktop cards in high-end machines so that you could hide the machines away somewhere quiet. The edit stations/Flame/Baselight machines all had a fucktonne of 15K SAS drives in them, so they were really noisy.
You couldn't invite a director in to see what you were doing when all you could hear was disk/fan whine.
They were quite expensive because they needed to be able to encode and send 2K video with decent bit depth and chroma sampling (i.e. not 4:2:0, but 4:4:4), at low latency. Worse still, they needed to be calibratable so that you could make sure that the colour you saw was the colour on the other end.
Alas, I can't remember what they are called, thankfully, because they are twats to manage.
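The expense claim above checks out with quick arithmetic. A rough, illustrative sketch (assuming 2K DCI resolution at 24 fps and 10-bit samples; the exact figures are assumptions, not from the comment):

```python
# Rough, illustrative math: uncompressed bandwidth for a 2K DCI stream.
def stream_gbps(width, height, fps, bit_depth, samples_per_pixel):
    """Bandwidth in gigabits per second for an uncompressed stream."""
    return width * height * samples_per_pixel * bit_depth * fps / 1e9

# 4:4:4 keeps 3 samples per pixel; 4:2:0 averages 1.5 after chroma subsampling.
print(f"4:4:4: {stream_gbps(2048, 1080, 24, 10, 3.0):.2f} Gb/s")  # ~1.59
print(f"4:2:0: {stream_gbps(2048, 1080, 24, 10, 1.5):.2f} Gb/s")  # ~0.80
```

So the card had to move well over a gigabit per second before any compression, which helps explain why the hardware wasn't cheap.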
Interesting, but this misses perhaps the most embarrassing part: They're using Avid and not FCP.
I also don't buy the author's rationale for remote editing; it's oddly archaic: "high-end video production is quite storage-intensive, which is why your favorite YouTuber constantly talks about their editing rigs and network-attached storage. By putting this stuff offsite, they can put all this data on a real server."
Storage is cheap now, and desktop computers are more than powerful enough for any video editing. Any supposed advantage of remote "real servers" is going to be squandered by having to send everything over the Internet. The primary benefit of remote editing (and the much-hyped "camera to cloud") is fast turnaround, which you need for stuff like reality TV and news. But a dramatic series like Severance?
It is pretty baffling that Apple would create a PR vehicle that impugns its products like this. It would be better to say nothing. After Apple acquired Shake, they splashed Lord of the Rings, King Kong, and other major tentpoles on the Apple homepage at every opportunity... of course not mentioning that Weta was rendering those movies on hundreds of Linux servers instead of Macs. But at least Shake was the same product across all platforms, and it really was the primary effects tool on all those movies.
> "(they do not mention the use of Jump Desktop, which seems like a missed opportunity to promote a small-scale Mac developer. C’mon Apple, do better.)"
Oh boy, this is just a minor infraction in Apple's history of disrespect toward developers. They do this, and worse, to major development partners too. I'm not going to name names, but after one such partner funded the acquisition of material on its own equipment and that material was used in a major product keynote... Apple not only neglected to credit or even mention that partner, but proceeded to show the name of a totally uninvolved competitor in its first slide afterward. The level of betrayal there was shocking.
Avid does have a cloud-based solution. This isn't that.
It's a clever way to have your media centralized and yet have access to editors all over the world.
And a modern AVID system does not struggle with a few editors accessing the same footage.
First of all, it's usually a proxy format, and secondly, the storage can deliver a combined 800 MB/s per box, sustained, for x number of editors at the same time.
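A quick sanity check on that throughput figure. Assuming a DNxHD 36-class proxy at roughly 36 Mb/s per stream (an assumed figure for illustration, not stated in the comment):

```python
# Illustrative only: how many concurrent proxy streams a shared-storage box
# sustaining 800 MB/s could feed. The 36 Mb/s proxy bitrate is an assumed
# DNxHD 36-class figure.
def max_streams(storage_mb_per_s, proxy_mbit_per_s):
    per_stream_mb_per_s = proxy_mbit_per_s / 8  # megabits -> megabytes
    return int(storage_mb_per_s // per_stream_mb_per_s)

print(max_streams(800, 36))  # far more streams than a few editors need
```

Even with a much heavier proxy, a handful of editors is nowhere near saturating that kind of storage.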
My home internet is fiber, 3 Gbps/3 Gbps up/down. Tucked away under the staircase, where my fiber ONT terminates, is my server room. I have half a dozen boxes running various things: four identical 2012 i7 Mac minis running Linux/KVM and hosting various critical services (Pi-hole, home automation, HomeKit Secure Video, etc.).
Then there's a giant former gaming PC with 7 HDD bays running the entire storage backend for a whole load of GoPro/Osmo/Insta360 videos I capture, with rclone to Google Photos for backup. I don't edit any videos; it's just there to capture memories so that at some point, when AI tools get good enough, I can have one generate clips. The same box runs my Plex server with HW transcoding.
Then there is the actual gaming PC, a mini-ITX box running Steam Remote Play. It has power, a network cable, and a fake HDMI dongle that emulates a monitor to trick the GPU into thinking something is actually plugged in.
Basically everything I do with desktop PCs at home is via some sort of remote interface.
Remote gaming is probably the most demanding of all of these. Low-latency HW-accelerated solutions, e.g. Parsec or Steam Link, are incredible technologies.
I carry an Apple TV + PS5 controllers to friends' houses and play the latest games across the internet.
> "In other words, little of the horsepower being used in this editing process is actually coming from the Mac Mini on this guy’s desk... I’m not entirely sure we were supposed to see that, but there it is. Oops."
Sounds like this author didn't watch the whole video. They are completely open about the fact that the editing team collaborated through remoting. At 5:20 an editor specifically says they "remoted into the Mac mini."
The second half of the post raises an arguably good question about the need for fancy Macs when cloud-based workflows only require glorified terminals. But that too may be misplaced here; it's entirely possible that the team members each do local editing work and then host their own collaboration sessions.
There are a number of reasons why the industry centralises, particularly in post. One of them is the fact that the shot footage is insured, and those policies have very strict clauses about handling the material. Yes, this applies to an all-digital production just as it would have in the film era.
The linked promotional materials [0] say that they remote into a Mac mini running Avid.
> he works on iMac, which remotes into a separate Mac mini that runs Avid
So the conjecture from the article that the Mac mini isn't powerful enough is false.
> In other words, little of the horsepower being used in this editing process is actually coming from the Mac Mini on this guy’s desk. Instead, it’s being driven by another Mac on the other side of a speedy internet connection
And based on other comments here, this is a pretty common way to do things.
I would like to use this comment to mention Parsec. It's unbelievable how much snappier it feels compared to the default Screen Sharing. What is their secret sauce?!
I just wish it didn't require an internet connection for authentication
Video editing is not as portable as coding; there ain't no git for it. It doesn't surprise me that they have to do that: I imagine it's simply speedier and comfier to connect to a desktop that already has the work in progress in its latest state, instead of ensuring everything is synced across the different devices one uses. I also imagine that beefy MBPs with M3 and upwards could handle 4K editing of Severance (or maybe 8K), and they'd edit on local machines if that were actually more convenient than connecting to a remote desktop. It's a bit shameful to admit, but still something we have to deal with while having such crazy advances in technology.
Meta: If I had to rank software features of an NLE when I was employed as an editor, key-to-photon (or click-to-sample, etc.) latency would rank #1, far outpacing all other concerns. It's fundamental to the rhythmic feel of the result, and prevents fatigue.
Avid bent over backwards to optimize for that in their software. I can't imagine cloud/remote editing being a good artist tool.
That explains why the framerate is so bad during some of the playback scenes. It also makes sense that, with multiple editors sharing the same editing tasks, it's easier to share a single resource with the scenes loaded from local storage and manipulate it remotely, versus trying to pull that content to your machine and then push it back.
I've been working this way for a long time. Not video editing, but it's the same principle -- I want to be over here (with my monitor, keyboard and mouse) but the large, complex, performance-sensitive environment I need to use is over there.
Jump is excellent, BTW.
The article seems confused, though. It says it's unclear whether Macs are being used to edit the show, but since the editors are remoting from one Mac to another, it seems unambiguous.
The flavor of both the local machine and the remote machine makes a difference. The OS of the machine you're remoting into makes the biggest difference, but since different OSes have their own ways of handling input devices, the local machine's OS is significant too. Every combo has its quirks, but I find Mac-to-Mac over Jump to be good.
He's saying that Apple stuff is hard for IT people to configure, customize, and virtualize, but isn't Apple's whole selling point that you don't need to be an IT person to use their products? It's a different market.
I think that's why a lot of tech companies now give their employees Apple laptops (they are easy for employees to self-support) but use everything but Apple in the data centers.
I think Apple and Microsoft are both prepping us for a future in which our computers are, mostly, mere terminals for their host of cloud services, rather than personal computing devices. This may be a test run/demonstration of whether and how a highly interactive, compute-intensive task like video editing can be performed under such a paradigm.
I've seen NICE DCV used for this too. Amazon bought them, so it's free if the server end is on AWS, but they will also sell you licenses for your own hardware. It's essentially 4K60 video streaming where the video is your desktop, and they use all the tricks they've developed for media streaming here as well.
A colleague used to work for Apple, outside the US, and described his development environment as SSH’ing into a physical machine in Cupertino and working exclusively in a terminal with 100ms latency, because they weren’t allowed on site machines.
> If you want to run a Mac in the cloud, it has to be a full machine in most cases.
If cloud companies have the opportunity, they will divide the resources of that Mac into 30 VMs and then meter access to the point where it would have been cheaper to go out and acquire the hardware yourself.
Unpopular opinion, but Apple should stick to its guns and maybe create a physical Mac rack server with legal and technical restrictions on maximum tenancy.
A super easy way to work on big video files without worrying about the hassle of remote desktop and the back-and-forth with the team, versioning, etc. is LucidLink (https://www.lucidlink.com/), a content-creation collaboration tool lots of studios use. The app makes accessing cloud files as smooth and fast on your laptop as if they were local.
I'll post a text I sent to a friend of mine a day ago: "I use my iPhone to access PWAs on my server so I can use it as a computer. I use my computer for X forwarding so I can use my server's programs."
I'm not the norm, but isn't it telling that when I don't want to use your hardware, I have to? I want to enjoy these products, but their immutability compared to prior versions is a thorn in my side.
I'm surprised at this point Apple still doesn't have some sort of solution for cloud/remote editing integrated into Final Cut. What I mean is a native desktop GUI but with the video files streaming from a remote location for the previews, thumbnails, etc. Heck, the GUI could even be a web app.
I do this all the time and get laughed at. I try to explain the exact same reasons but no one pays attention. I guess I just needed the Big Tech gatekeepers to tell the sheep that it works. Among sheep, it's not about the message but the messenger.
I kind of think the 80 TB of video files might have contributed to that? Maybe it was easier to use Jump Desktop to do it on another computer than it would have been to copy and pass around the video files.
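A back-of-the-envelope sketch supports this. Moving 80 TB over a network link takes a long time even on fast connections (the 80% protocol efficiency below is an assumed figure for illustration):

```python
# Back-of-the-envelope: time to move 80 TB over a network link.
# The 80% protocol efficiency is an assumed illustrative figure.
def transfer_days(terabytes, link_gbps, efficiency=0.8):
    bits = terabytes * 1e12 * 8
    seconds = bits / (link_gbps * 1e9 * efficiency)
    return seconds / 86400

print(f"gigabit:    {transfer_days(80, 1):.1f} days")
print(f"10 gigabit: {transfer_days(80, 10):.1f} days")
```

Over a gigabit line that's more than a week of sustained transfer per copy, before anyone edits a frame; remoting into the machine that already holds the media sidesteps all of it.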
Couldn't the traffic be LAN? Everyone keeps mentioning 'over the internet' - the device they're doing the editing on could be in a different room in the same building over gigabit++ speeds.
Nope, cloud and local processing are always going to be two separate things; neither will replace the other. Cloud has been around for a while, and if you look at games, nobody wants to play through a service like Stadia.
Looks good; I don't see the drawback for this use case.
"These editors aren't working on Macs, per se. They're working around them. Sure, there's an Apple logo in the top-left corner (two, actually), but it feels superfluous, knowing that the software isn’t directly on the machine and it [could] just as easily be running on a Windows or Linux box a thousand miles away"
But the source AND target of the remote connections are both Macs; pretty straightforward.
I mean, of course. The source video files for an entire season of 4K TV are friggin' huge, and you want different editors to be able to work in different locations.
The article argues:
> To me, though, it highlights a huge issue with Apple’s current professional offerings. They are built to work on a single machine. At least for high-end use cases, the remote workflow threatens to cut them out of the equation entirely...
This is hardly a "huge issue". Plenty of people work on a single machine. Once your project gets too big, you move more and more to remote and cloud. It's a spectrum, and you want a machine flexible enough to handle both.
Kind of funny to me that they have to go so "thin client" with this.
You'd think there'd be some kind of "mipmap gateway" component to network-aware video editors, that incrementally re-renders scrub-quality and preview-quality renders of the timeline as the client tells it about project changes, and then streams those rendered changes back down the pipe to the client, proactively, into a local cache — without the client ever needing to (or even being allowed to!) hold the raw assets.
Then the local "fat client" editing UI could be snappy at pretty much all times — except for just after modifying the timeline, when it'd have to flush (some variable amount of) the preview cache. (And even then, the controls would still respond; just the preview and timeline-thumbnails would jitter, until the [active part of the] re-cache finished.)
Would this enable piracy? No! Who's going to want to release a 480p rip of a TV episode at this point? (And 480p is all you need for a functional live preview when lining up ADR or B-roll or whatever else. Anything needing closer examination — VFX, say — could be rendered and sent by the gateway "on demand", as stills [on play-head stop] or as short clips [on first play after range-selection].)
(It would enable leaks... but so does RDP, if you combine it with local video-capture software. So that's nothing new.)
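The gateway idea above could be sketched as a client-side rendition cache that is invalidated per clip when the timeline changes. Everything here (names, keys, quality tiers) is invented for illustration:

```python
# Sketch of the "mipmap gateway" client cache described above. All names
# and structures are hypothetical illustrations of the idea.
from dataclasses import dataclass, field

@dataclass
class PreviewCache:
    # (clip_id, second, quality) -> rendered preview bytes
    renditions: dict = field(default_factory=dict)

    def get(self, clip_id, second, quality):
        return self.renditions.get((clip_id, second, quality))

    def put(self, clip_id, second, quality, frame):
        self.renditions[(clip_id, second, quality)] = frame

    def invalidate_clip(self, clip_id):
        # On a timeline edit, drop only the affected clip's renditions;
        # everything else stays warm, so the UI keeps feeling snappy.
        self.renditions = {k: v for k, v in self.renditions.items()
                           if k[0] != clip_id}

cache = PreviewCache()
cache.put("shot_01", 0, "scrub", b"low-res frame")
cache.invalidate_clip("shot_01")         # timeline changed under shot_01
print(cache.get("shot_01", 0, "scrub"))  # gone until the gateway re-sends it
```

The point of the per-clip invalidation is exactly the "only the active part re-caches" behavior: controls stay responsive while stale thumbnails refill in the background.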
> requirements in its EULA that seem designed to protect its hardware business above all else
To get this you have to understand Apple's business model. They sell style, quality, exclusivity, and ease of use. They can't ensure those things if they separate the hardware from the software. I'm sure they would love to make money from software licenses without the hardware, but it would end up creating new problems that would dilute the value of their product.
The proof is in the pudding. They're the most valuable company in the world because of their limitations, not despite them.
Doing video editing full-time over remote desktop must be painful. Perhaps a 20% productivity hit from that decision alone?
Kinda surprised professional video editing software hasn't been designed with this exact use case in mind, i.e. a worker running local software doing the editing, backed by a remote server with tens of terabytes of storage and high-powered GPUs.
The software would do standard-definition and basic rendering fast locally, and simultaneously request that the 8K data be rendered remotely and downloaded, so a full-fidelity preview can be seen after a few seconds.
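That two-track flow (instant local proxy, asynchronous remote full-quality render) can be sketched like so; all function names here are invented stand-ins:

```python
# Toy sketch: show a cheap local proxy immediately while a full-quality
# remote render is requested in the background. Render functions are
# hypothetical stand-ins, not a real editor's API.
from concurrent.futures import ThreadPoolExecutor
import time

def render_local_proxy(frame_no):
    return f"frame{frame_no}@480p"      # instant, low fidelity

def render_remote_full(frame_no):
    time.sleep(0.1)                     # stand-in for the server round trip
    return f"frame{frame_no}@8k"

def preview(frame_no, pool):
    pending = pool.submit(render_remote_full, frame_no)  # kick off remote job
    shown_now = render_local_proxy(frame_no)             # display this first
    return shown_now, pending.result()                   # swap in when ready

with ThreadPoolExecutor(max_workers=2) as pool:
    proxy, full = preview(42, pool)
print(proxy, "->", full)
```

The design choice is that the user never waits on the network to see *something*; the full-fidelity frame simply replaces the proxy when it arrives.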
Oh yeah, this is completely true. It would be a shoebox that barely ran Windows 98 and it wouldn't make much difference, and Apple's tools have completely failed to keep pace with this reality.
> ...please use the original title, unless it is misleading or linkbait; don't editorialize.

> Please don't use HN primarily for promotion. It's ok to post your own stuff part of the time, but the primary use of the site should be for curiosity.

https://news.ycombinator.com/newsguidelines.html
Why Apple's Severance gets edited over remote desktop software (tedium.co)
565 points by shortformblog | 29 March 2025 | 341 comments
Yes, I use Avid; feel free to ask.
Why the sensationalism?
[0] https://www.apple.com/newsroom/2025/03/how-the-mind-splittin...
[1] https://www.jumpdesktop.com
[2] https://parsec.app
https://en.m.wikipedia.org/wiki/Thin_client
Otherwise, what is the point of doing it on a Mac?
Here is a talk from Netflix about cloud workstations for their artists: https://aws.amazon.com/solutions/case-studies/netflix-workst...
Isn't the editing software on Macs? Can't see what point is being made here.
I think there have also been a handful of remote desktop packages purpose-built for this.
* More powerful machines centrally located
* COVID-19 practices make lots of people in one place undesirable
* It's easy for rogue editors to steal stuff, and this prevents that