For algorithms, a little memory outweighs a lot of time

(quantamagazine.org)

Comments

cperciva 21 May 2025
Minus the fuzz: A multitape Turing machine running in time t can be simulated using O(sqrt(t log t)) space (and typically more than t time).

https://arxiv.org/abs/2502.17779
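For a rough sense of the gap, here is a purely illustrative back-of-the-envelope comparison in Python (constants and the base of the logarithm are glossed over; this shows only the shape of the bound, not anything from the paper):

```python
# Illustrative only: compare a time budget t with sqrt(t * log t),
# the shape of the space bound, ignoring all constant factors.
import math

for t in (10**6, 10**9, 10**12):
    space = math.sqrt(t * math.log2(t))
    print(f"t = {t:.0e} -> sqrt(t log t) ~ {space:,.0f}")
```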

xlii 22 May 2025
From the “Camel Book”, one of my favorite programming books (not because it was enlightening, but because it was entertaining), on the Perl philosophy:

“If you’re running out of memory, you can buy more. But if you’re running out of time, you’re screwed.”

whatever1 21 May 2025
Lookup tables with precalculated things for the win!

In fact I don’t think we would need processors anymore if we were centrally storing all of the operations ever done in our processors.

Now fast retrieval is another problem for another thread.
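A minimal sketch of that "store it once, look it up forever" idea, which is just memoization (nothing processor-specific, purely illustrative):

```python
# Cache every result the first time it is computed; repeats become lookups.
from functools import lru_cache

@lru_cache(maxsize=None)
def slow_op(x: int) -> int:
    return sum(i * i for i in range(x))  # stand-in for an expensive operation

slow_op(10_000)  # computed once
slow_op(10_000)  # answered from the cache: space spent, time saved
```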

LPisGood 21 May 2025
I think it is very intuitive that more space beats the pants off of more time.

In time O(n) you can only touch O(n) cells on a tape, but a tape of n cells (over a two-symbol alphabet) has 2^n possible configurations, so you can do so much more with n space than with n time.
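A tiny illustration of that counting argument (nothing rigorous, just the numbers):

```python
# n steps can only ever touch n cells, but n cells over a two-symbol alphabet
# already admit 2**n distinct contents.
for n in (8, 16, 32, 64):
    print(f"n = {n:2d}: cells touchable in n steps = {n}, possible tape contents = {2**n}")
```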

ChrisMarshallNY 22 May 2025
Interesting. It's one of those things that I’ve always “just assumed,” without thinking about it.

I did a lot of raster graphics programming in my career, and graphics work makes heavy use of lookup tables.

Yesterday, I posted a rather simple tool I wrote[0]: a server that “frontloads” a set of polygons into a database and then uses them at query time. It’s fairly fast (but I’m sure it could be a lot faster). I wrote it in a few hours and got pretty good performance right out of the starting gate.

Pretty basic stuff. I doubt the pattern is unique, but it’s one that I’ve used for ages. It’s where I try to do as much “upfront” work as possible, and store “half-baked” results into memory.

Like I said, I always “just worked that way,” and never really thought about it. There’s been a lot of “rule of thumb” stuff in my work. Didn’t have an MIT professor to teach it, but it’s sort of gratifying to see that it wasn’t just “old wives” stuff.

There’s probably a ton of stuff that we do, every day, that we don’t think about. Some of it almost certainly originated from really smart folks like him, finding the best way (like the “winding number” algorithm, in that server[1]), and some of it also probably comes from “grug-brained programmers,” simply doing what makes sense.

[0] https://news.ycombinator.com/item?id=44046227

[1] https://github.com/LittleGreenViper/LGV_TZ_Lookup/blob/e247f...
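This isn't the linked code, just a minimal sketch of the winding-number point-in-polygon test referenced in [1], for anyone who hasn't seen it (assumes a simple polygon given as a vertex list):

```python
def is_left(p0, p1, p2):
    # > 0 if p2 is left of the directed line p0 -> p1, < 0 if right, 0 if on it.
    return (p1[0] - p0[0]) * (p2[1] - p0[1]) - (p2[0] - p0[0]) * (p1[1] - p0[1])

def winding_number(point, polygon):
    # polygon: list of (x, y) vertices, without repeating the first at the end.
    wn = 0
    for i in range(len(polygon)):
        a, b = polygon[i], polygon[(i + 1) % len(polygon)]
        if a[1] <= point[1]:
            if b[1] > point[1] and is_left(a, b, point) > 0:
                wn += 1  # upward crossing with the point strictly to the left
        else:
            if b[1] <= point[1] and is_left(a, b, point) < 0:
                wn -= 1  # downward crossing with the point strictly to the right
    return wn  # non-zero means the point is inside

square = [(0, 0), (4, 0), (4, 4), (0, 4)]
print(winding_number((2, 2), square) != 0)  # True
print(winding_number((5, 5), square) != 0)  # False
```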

senfiaj 22 May 2025
I always thought that the "reverse" relationship between time and space requirements is explained by a simple fact: when you have a set of algorithms with certain time and space requirements and you constrain one of these parameters, the other parameter will not necessarily be (and in practice often isn't) optimal. But that doesn't necessarily mean the faster algorithm will require less memory, or vice versa.

This "reverse" relationship works with other parameters. Such as in sorting algorithms, where besides time and space requirements you have stability requirement. If you constrain the sorting algorithms by stability, you won't have any advantage (will likely to have some disadvantage) in performance over non constrained algorithms. And wee see that the known stable sorting algorithms are either slower or require more memory. Nobody has yet invented a stable sorting algorithm that has comparable performance (both time and space) to non stable sorting algorithms. That's it.

IvanK_net 21 May 2025
I am confused. If a single-tape Turing machine receives a number N in binary and is supposed to write N ones on the tape, to the right of the binary representation of N, it performs at least N steps.

If you expect N ones at the output, how can this machine be simulated in space smaller than N?

This machine must decrement the number N at the beginning of the tape and move to the end of the tape to write a "1", so it runs in time O(N^2), not O(N)? (It takes N "trips" to the end of the tape, and the trips take 1, 2, 3, ..., N steps.)

Since Turing machines cannot jump to an arbitrary place on the tape in constant time (like real computers can), does this result have any impact on real computers?
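A back-of-the-envelope check of that quadratic step count (assuming the k-th trip from the counter to the end of the written block costs about k moves; illustrative only):

```python
# Total moves if writing the k-th "1" costs about k steps: sum(1..N) ~ N^2 / 2.
def approx_steps(n: int) -> int:
    return sum(range(1, n + 1))

for n in (10, 100, 1000):
    print(n, approx_steps(n))  # 55, 5050, 500500 -- grows quadratically
```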

schmorptron 22 May 2025
As an aside, I really enjoy a lot of the Quanta Magazine articles! They manage to write pieces that appeal both to compsci people and to interested "outsiders". The bird's-eye view and the informal "how and why" they pack into their explanations often give me new perspectives and things to think about!
bgnn 22 hours ago
The poetic style of Quanta makes it hard to understand what this actually means. Can someone familiar with the topic explain whether this applies to real-world computers, or is it just a theoretical scenario? Does it mean we need more memory in computers, even if they have to run at a lower clock speed?
ziofill 22 May 2025
At the cost of sounding ridiculous: can there be a notion of "speed of light" in the theory of computation, determining the ultimate limit of memory (space) vs runtime?
kazinator 22 May 2025
Give algorithm designers a little more memory, and they will find a way to shave off the time.

Give operating system development organizations a lot more memory and they will find a way to worsen the response time.

amelius 22 May 2025
> Williams’ proof established a mathematical procedure for transforming any algorithm — no matter what it does — into a form that uses much less space.

Ok, but space is cheap, and we usually want to spend extra space to save processing time.

I.e., the opposite.

willmarquis 21 May 2025
This is just a reminder that memory isn’t just a constraint, it’s a resource.
nottorp 22 May 2025
You can determine both the time and space complexity for any algorithm, should you need to. And in some cases you can adjust to have more of one or the other, depending on your particular constraints.

More at eleven.

vishnugupta 22 May 2025
Rainbow Tables FTW!
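For anyone unfamiliar: rainbow tables are a classic time/memory trade-off that stores only the endpoints of hash/reduce chains instead of a full precomputed dictionary. A toy Hellman-style sketch in Python (real rainbow tables use a different reduction function per step and deal with chain merges; this is purely illustrative, over 4-digit PINs):

```python
import hashlib

DOMAIN = 10_000      # candidate passwords are the PINs "0000".."9999"
CHAIN_LEN = 100      # longer chains: less memory stored, slower lookups
NUM_CHAINS = 400     # number of (endpoint -> start) pairs kept

def h(pw: str) -> str:
    return hashlib.md5(pw.encode()).hexdigest()

def reduce_(digest: str) -> str:
    # Not an inverse of h; just a projection back into the password domain.
    return f"{int(digest, 16) % DOMAIN:04d}"

# Precompute: walk each chain, but store only its start and its endpoint.
table = {}
for i in range(NUM_CHAINS):
    start = f"{(i * 25) % DOMAIN:04d}"
    pw = start
    for _ in range(CHAIN_LEN):
        pw = reduce_(h(pw))
    table[pw] = start

def lookup(target: str):
    # Try each possible position of the target hash within a chain.
    for offset in range(CHAIN_LEN):
        pw = reduce_(target)
        for _ in range(offset):
            pw = reduce_(h(pw))
        if pw in table:
            # Possible hit: rebuild that chain from its start to find the preimage.
            cand = table[pw]
            for _ in range(CHAIN_LEN):
                if h(cand) == target:
                    return cand
                cand = reduce_(h(cand))
    return None  # not covered by the table (coverage is probabilistic)

print(lookup(h("4375")))  # likely prints "4375" if that PIN landed on a chain
```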
golly_ned 22 May 2025
“If the problem is solved next week, Williams will be kicking himself. Before he wrote the paper, he spent months trying and failing to extend his result”

What a strange, sad way to think about this. Academia is perverse.