Raspberry Pi Lidar Scanner (github.com)
677 points by Venn1 | 19 April 2025 | 185 comments

Comments

One thing I'd suggest for any hardware product: when writing up your bill of materials, provide links and show estimated costs. Sure, these will change, but having a rough idea of the costs is really helpful, especially when coming across it from places like HN. It can make the difference in whether someone decides to try it on their own or not. It's the ballpark figures that matter, not the specifics.
You did all that research, so write it down, if for no one but yourself! Providing links is highly helpful because part names can be funky, and links help people (including your future self) tell whether something is the same thing or not. It's always noisy, but these things reduce noise. Importantly, they take no time while you're doing the project: you literally bought the parts, so you already have the link and the price. It saves you a lot of hassle, not just others. Document, because no one remembers anything after a few days or weeks. It takes 10 seconds to write it down and 30 minutes to do the thing all over again, so be lazy and document. I think this is one of the biggest lessons I learned when I started as an engineer: you save yourself so much time. You just have to fight that dumb part of your head that is trying to convince you it doesn't save time. (Same with documenting code[0])
Here, I did a quick "15 minute" look. It may not be accurate:
Lidar (one of):
- LD06: $80 https://www.aliexpress.us/item/3256803352905216.html
- LD19: $70 https://www.amazon.com/DTOF-D300-Distance-Obstacle-Education/dp/B0B1V8D36H
- STL27L: $160 https://www.dfrobot.com/product-2726.html
Camera and lens: $60 https://www.amazon.com/Arducam-Raspberry-Camera-Distortion-Compatible/dp/B0B1MN721K
Raspberry Pi 4: $50
NEMA17 42-23 stepper: $10 https://www.amazon.com/SIMAX3D-Nema17-Stepper-Motor/dp/B0CQLFNSMJ
That gives us $200-$280 before counting the power supply and buck converter.
[0] When I wrote the code only me and god understood what was going on. But as time marched on, now only god knows.
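To make the ballpark arithmetic explicit, here it is as a throwaway script (prices are the rough figures from the list above and will drift; the power supply and buck converter are left out, as noted):

```python
# Ballpark bill-of-materials total, using the rough prices listed above.
# Prices will change; the power supply and buck converter are not included.

lidar_options = {"LD06": 80, "LD19": 70, "STL27L": 160}  # pick one

common_parts = {
    "camera and lens": 60,
    "Raspberry Pi 4": 50,
    "NEMA17 stepper": 10,
}

common_total = sum(common_parts.values())  # $120
for name, price in lidar_options.items():
    print(f"{name}: ~${price + common_total}")
# LD06: ~$200
# LD19: ~$190
# STL27L: ~$280
```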
Max range 12 meters.[1] Beyond that is where it seems to start to get expensive: the light source, filters, and sensors all have to get better.
Good enough for most small robots. Maybe good enough for the minor sensors on self-driving cars, the ones that cover the vehicle perimeter so kids and dogs are reliably sensed. The big long-range LIDAR up top is still hard.
[1] https://www.ldrobot.com/
The Sketchfab examples are fantastic; being able to move around in a 3D space feels like some kind of sci-fi simulation.
The mouse controls are confusing the heck out of me. It shows a 'grab' icon, but nothing about it behaves like grabbing: the movement direction is the opposite of what I expect, which feels completely unnatural.
But to be alive when it's possible for gifted individuals to create technology like this is just incredible.
It may be in the project, as I've only skimmed it so far (I will read through it properly soon), but do you have any data on accuracy? Say, over 10 m (or less, if this lidar doesn't work at that distance).
I'm familiar with the FARO scanners, which use a different type of mechanism. Their accuracy is good enough for building things.
I've discovered there are several markets for scanners… among them are people who need accuracy and people who are creating content for media like games.
Thank you so much for sharing this project. It's truly unbelievable.
I've been toying with photogrammetry a little lately, specifically for scanning indoor rooms and spaces. So far I'm finding Metashape the most suitable for it, though some of the precision isn't great (I'm still improving my technique). I mostly want to convert the interior of one real building into a digital model for preservation and analysis. I had briefly considered LIDAR but put it in the too-hard/too-expensive bucket. This project seems to challenge that assumption.
What does the software post-processing look like for this? Can I get a point cloud that I can then merge with other data (like DSLR photographs for texturing)?
I see in their second image[1] that some of the wall is not scanned because it was blocked by a hanging lamp, and possibly the LIDAR could not see over the top of the couch either. Can I merge two (or more) point clouds to see around objects and corners? Will software be able to self-align common walls/points to identify that it's in the same physical room, or will that require some jiggery-pokery (see the alignment sketch after this comment)? Is there a LIDAR equivalent of coded targets or ARTags[0]? Would this scale to multiple rooms?
Is this even worth considering, or will it be more hassle than it's worth compared to well-done photogrammetry?
(Apologies for the peak-of-Mount-Stupid questions; I don't know what I don't know.)
[0] https://en.wikipedia.org/wiki/ARTag
[1] https://github.com/PiLiDAR/PiLiDAR/raw/main/images/interior....
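Not specific to PiLiDAR, but as a rough answer to the merging/alignment question above: two overlapping scans can often be aligned with a generic registration pipeline, e.g. Open3D's coarse feature-based RANSAC followed by ICP refinement. How well it self-aligns depends on overlap and geometric detail; the file names and voxel size below are placeholders, and a minimal sketch looks something like this:

```python
import open3d as o3d

# Two overlapping scans of the same room (placeholder file names).
src = o3d.io.read_point_cloud("scan_position_1.ply")
dst = o3d.io.read_point_cloud("scan_position_2.ply")

voxel = 0.05  # metres; tune to the scanner's noise and point density

def preprocess(pcd):
    """Downsample and compute FPFH features for coarse matching."""
    down = pcd.voxel_down_sample(voxel)
    down.estimate_normals(
        o3d.geometry.KDTreeSearchParamHybrid(radius=voxel * 2, max_nn=30))
    fpfh = o3d.pipelines.registration.compute_fpfh_feature(
        down, o3d.geometry.KDTreeSearchParamHybrid(radius=voxel * 5, max_nn=100))
    return down, fpfh

src_down, src_fpfh = preprocess(src)
dst_down, dst_fpfh = preprocess(dst)

# Coarse global alignment: RANSAC over FPFH feature correspondences.
coarse = o3d.pipelines.registration.registration_ransac_based_on_feature_matching(
    src_down, dst_down, src_fpfh, dst_fpfh, True, voxel * 1.5,
    o3d.pipelines.registration.TransformationEstimationPointToPoint(False), 3,
    [o3d.pipelines.registration.CorrespondenceCheckerBasedOnDistance(voxel * 1.5)],
    o3d.pipelines.registration.RANSACConvergenceCriteria(100000, 0.999))

# Fine alignment with point-to-plane ICP, seeded by the coarse transform.
fine = o3d.pipelines.registration.registration_icp(
    src_down, dst_down, voxel * 0.4, coarse.transformation,
    o3d.pipelines.registration.TransformationEstimationPointToPlane())

# Apply the transform to the full-resolution scan and merge the clouds.
merged = src.transform(fine.transformation) + dst
o3d.io.write_point_cloud("merged.ply", merged)
print("fitness:", fine.fitness, "rmse:", fine.inlier_rmse)
```

For what it's worth, the terrestrial-scanning equivalent of coded targets is usually reference spheres or checkerboard targets placed in the overlap region; with enough shared geometry between scans they are often unnecessary.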
Hi! Thanks for sharing this amazing work. I’m curious about the scalability and performance of PiLiDAR when deployed on large-scale outdoor datasets. Have you benchmarked it on datasets like SemanticKITTI or nuScenes? If so, could you share any insights on runtime, memory usage, and how well it generalizes beyond the indoor scenes used in your paper?
Oh hey! This is exactly what I was looking for just a couple weeks ago! I've had parts to prototype something roughly equivalent to this sitting in my cart on Amazon for a couple weeks now, but I've been very uncertain on my choice of actual lidar scanner.
I'll have to look into this as a starting point when I get back from Easter vacation.
For home improvement projects, this could be quite useful for generating a point-cloud map of places that are hard to get to. For example, I have drywall installations I would love to get behind to check how things look; this would be great for that.
The GY-521 in particular, and the MPU6050 in general, make quite poor IMUs. Why do you use them? And what for in this particular case?
What do they do in this setup?
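I can't speak for why this project includes one, but the usual job of a cheap IMU in a tripod scanner is gravity-referenced tilt, i.e. levelling the point cloud. A minimal readout sketch on a Pi, assuming the default ±2 g range and the smbus2 library (this is not PiLiDAR's actual code):

```python
import math
from smbus2 import SMBus

MPU6050_ADDR = 0x68   # default I2C address (AD0 pin low)
PWR_MGMT_1 = 0x6B     # power-management register
ACCEL_XOUT_H = 0x3B   # start of the accelerometer output registers

def read_word(bus, reg):
    """Read a signed 16-bit big-endian value from two consecutive registers."""
    hi, lo = bus.read_i2c_block_data(MPU6050_ADDR, reg, 2)
    val = (hi << 8) | lo
    return val - 65536 if val & 0x8000 else val

with SMBus(1) as bus:  # I2C bus 1 on a Raspberry Pi
    bus.write_byte_data(MPU6050_ADDR, PWR_MGMT_1, 0)  # wake the chip from sleep

    # Accelerometer readings in g at the default +/-2 g range (16384 LSB/g).
    ax = read_word(bus, ACCEL_XOUT_H) / 16384.0
    ay = read_word(bus, ACCEL_XOUT_H + 2) / 16384.0
    az = read_word(bus, ACCEL_XOUT_H + 4) / 16384.0

    # Gravity-referenced tilt; even a noisy IMU is fine for levelling a scan.
    roll = math.degrees(math.atan2(ay, az))
    pitch = math.degrees(math.atan2(-ax, math.hypot(ay, az)))
    print(f"roll {roll:+.1f} deg, pitch {pitch:+.1f} deg")
```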
It's not obvious what the heck this is without reading into it. A full 4π-steradian scanner? A 360-degree, single-channel LIDAR? A fisheye camera plus a single-channel LIDAR plus monocular depth-estimation networks to cover everything not in the plane of the lidar?
It would be great to clarify what it is in the first sentence.
It's impressive that the cost of usable LIDAR tech is well within the reach of personal projects now. The sensors used on the first self-driving cars (from companies like SICK) likely perform much better, but a price point of several thousand dollars is not really viable for experimentation at home.
Not to make everything political, but I wonder how the US tariffs will affect electronics-adjacent hobbies. Anecdotally, the flashlight community on Reddit has been panicking a little about this.