Ok, maybe I'm too focused on this potential solution already. Let me take a step back and explain the problem.
We want to do life-long SLAM. Our approach would be to keep a set of submaps that covers the whole area and to remove those submaps that are, to some degree, covered by newer submaps (hence #409). The problem we see is that our submaps contain drift error, which creates discontinuous and/or double walls when stitching them together. I think the "Cartographer way" to improve this would be to reduce the submap size, but ours are already quite small and performance seems to degrade if we go much lower. Intuitively, I would also guess that the repetitiveness of the environments our robots drive in dictates a lower bound on the submap size. A rough sketch of the pruning idea follows below.
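To make the pruning idea concrete, here is a minimal, simplified sketch of the heuristic we have in mind. This is not Cartographer code; `Submap`, `CoveredFraction` and the coverage threshold are made-up names for illustration, and the real criterion would probably accumulate coverage over all newer submaps (or per cell) rather than checking pairwise:

```c++
#include <algorithm>
#include <vector>

// Hypothetical, simplified submap record: an axis-aligned bounding box in the
// map frame plus an insertion index (higher index == newer submap).
struct Submap {
  int index;
  double min_x, min_y, max_x, max_y;
};

// Fraction of `older`'s area that is overlapped by `newer`.
double CoveredFraction(const Submap& older, const Submap& newer) {
  const double dx = std::max(
      0.0, std::min(older.max_x, newer.max_x) - std::max(older.min_x, newer.min_x));
  const double dy = std::max(
      0.0, std::min(older.max_y, newer.max_y) - std::max(older.min_y, newer.min_y));
  const double older_area =
      (older.max_x - older.min_x) * (older.max_y - older.min_y);
  return older_area > 0.0 ? (dx * dy) / older_area : 0.0;
}

// Drop every submap whose area is covered above `threshold` by any newer one.
std::vector<Submap> PruneCoveredSubmaps(const std::vector<Submap>& submaps,
                                        double threshold) {
  std::vector<Submap> kept;
  for (const Submap& candidate : submaps) {
    bool covered = false;
    for (const Submap& other : submaps) {
      if (other.index > candidate.index &&
          CoveredFraction(candidate, other) >= threshold) {
        covered = true;
        break;
      }
    }
    if (!covered) kept.push_back(candidate);
  }
  return kept;
}
```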
Anyway, loop closures were working well after some tuning, so the OccupancyGrid code that was recently removed produced a very decent map, much better than stitching together the imperfect submaps. So we thought about improving the submaps' probability_grid using the same process of replaying the respective scans along the optimized trajectory. On second thought, though, a question to you: would this also improve the scan matching against the submap? To my understanding, scans are matched against the probability grid, so improving it should improve the matches. After all, what we want to achieve is improved localization; nice-looking maps are just considered an indicator of reliability.
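To illustrate what we mean by "replaying the scans along the optimized trajectory", here is a rough, self-contained sketch. The types and names (`Pose2D`, `Grid`, `InsertScan`, `ScoreScan`, the 5 cm resolution) are made up for illustration and are not Cartographer's API; a real grid update would also handle the miss cells along each ray:

```c++
#include <cmath>
#include <map>
#include <utility>
#include <vector>

// Minimal stand-ins (not Cartographer types) just to illustrate the idea.
struct Pose2D { double x, y, theta; };                 // optimized node pose
using Scan = std::vector<std::pair<double, double>>;   // points in sensor frame
using Grid = std::map<std::pair<int, int>, double>;    // cell -> hit probability

constexpr double kResolution = 0.05;  // 5 cm cells, an arbitrary choice

std::pair<int, int> ToCell(double x, double y) {
  return {static_cast<int>(std::floor(x / kResolution)),
          static_cast<int>(std::floor(y / kResolution))};
}

// Re-insert one scan into the grid using its *optimized* pose, i.e. the pose
// after global optimization rather than the pose at insertion time.
void InsertScan(const Scan& scan, const Pose2D& pose, Grid* grid) {
  for (const auto& p : scan) {
    const double wx = pose.x + std::cos(pose.theta) * p.first -
                      std::sin(pose.theta) * p.second;
    const double wy = pose.y + std::sin(pose.theta) * p.first +
                      std::cos(pose.theta) * p.second;
    // Crude "hit" update only; misses along the ray are ignored here.
    double& prob = (*grid)[ToCell(wx, wy)];
    prob = std::min(1.0, prob + 0.55 * (1.0 - prob));
  }
}

// A toy scan-match score: mean occupancy of the cells a scan would hit at a
// candidate pose. Double walls flatten this score surface, which is why we
// expect a cleaner grid to give better matches.
double ScoreScan(const Scan& scan, const Pose2D& candidate, const Grid& grid) {
  double sum = 0.0;
  for (const auto& p : scan) {
    const double wx = candidate.x + std::cos(candidate.theta) * p.first -
                      std::sin(candidate.theta) * p.second;
    const double wy = candidate.y + std::sin(candidate.theta) * p.first +
                      std::cos(candidate.theta) * p.second;
    const auto it = grid.find(ToCell(wx, wy));
    sum += (it != grid.end()) ? it->second : 0.0;
  }
  return scan.empty() ? 0.0 : sum / scan.size();
}
```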
About your questions:

1. The output of asset_writer main does in fact look much better; the double walls are gone (see attached images, especially the lower end).
2. OK, I was mostly referring to the following snippet. Could you explain what exactly is going to be removed? I'm not sure I understood you correctly.
```c++
range_data_inserter.Insert(
    carto::sensor::TransformRangeData(
        Decompress(node.constant_data->range_data),
        node.pose.cast<float>()),
    &probability_grid);
}
```
3. Thanks for the pointer; I was not even aware of this code or the option to define actions for the asset_writer. However, if we wanted to improve the scan matching, a "single large (sub)map" would not help... would it? Or are you suggesting replacing a couple of submaps with such an optimized, bigger submap and then using that for localization? That could actually work for us as well.
4. Sounds cool, but this was not initially our aim when we opened this issue.
____

Attached images