
[Question] Is it possible to run on the internal IMU only? #1

Closed
kamibukuro5656 opened this issue Nov 2, 2020 · 26 comments

@kamibukuro5656
Contributor

Hello.
Thank you for publishing your excellent work.

Now I have a question about the IMU.
I'm trying to find out whether it is possible to run LiLi-OM using only the internal IMU of the Horizon.
Have you ever tried to run LiLi-OM with only the internal IMU?

I computed the orientation with a Madgwick filter and fed it into LiLi-OM, as shown below.
Screenshot from 2020-11-01 23-16-35 (2)

This attempt worked well in the case of the "FR-IOSB-Long.bag".
Screenshot from 2020-11-01 23-16-35 (3)

However, it did not work with "2020_open_road.bag", which is available in the official Livox Loam repository.
(https://terra-1-g.djicdn.com/65c028cd298f4669a7f0e40e50ba1131/demo/2020_open_road.bag)
Screenshot from 2020-11-01 23-45-07 (2)

Can these problems be solved by parameter tuning?
Or are the built-in IMU and the Madgwick filter simply not accurate enough?

I would appreciate any advice you can give me.

@radeonwu

radeonwu commented Nov 2, 2020

I have a similar query.
It seems the backend module subscribes to the external imu_topic "/imu/data", which contains orientation values.

@kailaili

kailaili commented Nov 2, 2020

Thank you for your interest in the work and all the enquiries. Here's the answer from our side:

  • We noticed that the accelerometer readings from the Livox internal IMU are normalized to 1 and cannot be reliably recovered to their actual values. The external one (Xsens MTi-670) provides generally more accurate inertial measurements.

  • The external IMU does provide orientation values given by the filtering algorithm of Xsens. However, they are not used in the backend fusion. The system only requires 6-axis IMU readings (accelerometer + gyroscope).

  • You can definitely use the internal IMU readings of the Livox Horizon by recovering the accelerometer values in some way (we also tried that), and the Madgwick filter should suffice for filtering. Since the hardware changes in this case, you may need to tune the parameters to achieve good performance.

I hope this can help you. Don't hesitate if you have any further questions in this regard.

@kamibukuro5656
Contributor Author

Thank you for your answer.
It was very helpful.

However, I have a different opinion about the built-in IMU's accelerometer being normalized to 1.
The accelerometer values simply appear to be scaled down by a factor of ten, or close to it.
The graph below compares the acceleration of the built-in IMU and the Xsens MTi-670.
In this graph, the acceleration of the internal IMU is multiplied by 10 and fed into the Madgwick filter.
The output values look almost identical.

What do you think about this?

Screenshot from 2020-11-03 10-04-40
*The data is FR-IOSB-Long.bag.

@kailaili

kailaili commented Nov 3, 2020

Hi, thanks for posting the graph.
In fact, we're also unsure about the normalization (the norm is not strictly 1 either). Multiplying the accelerometer measurements by a factor of 10 is also what we tried at the beginning, before we used the external IMU, since it looks reasonable (we did not preprocess the data to get orientation estimates, though). The internal IMU can deliver reasonably good results if you multiply the accelerometer readings by some factor and retune the parameters. But instead of using values that we're unsure about, we would rather provide something reliably inferable. The /imu/data values are raw readings from the Xsens MTi-670.

@kamibukuro5656
Contributor Author

Thank you.

I understand the problems with the internal IMU and the reasons for using an external IMU.

I am going to work on this problem for a while.
I will share the information with you if there are any developments.

@radeonwu

radeonwu commented Nov 5, 2020

I noticed kamibukuro5656 has closed this thread, yet I would like to reopen it to further discuss the suitability of a 6-axis IMU output for this repo.

As we understand it, the Livox internal IMU is 6-axis and its output contains accel and gyro values but no direct orientation elements, while the external Xsens IMU is 9-axis and outputs orientation values. Encouraged by kamibukuro5656's report that he could run the sample FR-IOSB-Long dataset with LiLi-OM by adding an IMU filter to create orientation estimates, I did a similar test with the KA_Urban_East dataset, and here is the result I got:

  • IMU filter wise, I tried both imu_filter_madgwick and imu_complementary_filter.
  • For the raw input to the IMU filter, I tried both /livox/imu from the internal IMU and /imu/data from the external IMU (for /imu/data, I zeroed the orientation element values here).
    But it turned out similar to the result kamibukuro5656 got on 2020_open_road.bag.
    Some further digging into the IMU filter shows that, when there is no magnetometer, the filter outputs an orientation based on some random initial value, at least for the yaw angle.

So my question is: can a 6-axis IMU output satisfy the running requirements of this repo? If yes, how exactly?
Thanks in advance for your reply.

@kailaili

kailaili commented Nov 5, 2020

In principle, there's no need to even run a filter additionally. The 9-axis IMU information is read by the system as a standard ROS msg, but the orientation is never used anywhere except for orientation initialization at the first frame. The system just uses the orientation reading at the very first frame to transform the frame to the ENU coordinates. Other than that, it never uses any orientation readings (neither raw measurements nor anything from any filter).

For doing the same thing, you can also average the accelerometer of the first several frames to initialize the orientation. The assumption here is that the sensor starts from static state (e.g., you put it on the ground or hold it still), so that the averaged acceleration can be regarded as the gravity. Such an orientation initialization is also applicable to our recorded data sets.
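
For illustration, here is a minimal sketch of that initialization idea (my own code, not taken from LiLi-OM): average the first accelerometer samples, treat the mean as gravity, and build the quaternion that rotates the measured gravity direction onto the world up axis (yaw stays arbitrary).

```python
import numpy as np

def init_orientation_from_gravity(accel_samples):
    """Estimate an initial orientation from averaged accelerometer readings,
    assuming the sensor is static so the mean acceleration equals gravity."""
    g_body = np.mean(np.asarray(accel_samples, dtype=float), axis=0)
    g_body /= np.linalg.norm(g_body)           # unit gravity direction in the body frame
    g_world = np.array([0.0, 0.0, 1.0])        # "up" axis of the ENU world frame

    # Shortest-arc quaternion rotating g_body onto g_world; yaw is left undetermined.
    v = np.cross(g_body, g_world)
    w = 1.0 + np.dot(g_body, g_world)
    q = np.array([w, v[0], v[1], v[2]])
    return q / np.linalg.norm(q)                # [w, x, y, z]

# Example: a sensor lying flat and still measures roughly [0, 0, 9.81] m/s^2,
# which yields approximately the identity quaternion [1, 0, 0, 0].
print(init_orientation_from_gravity([[0.0, 0.0, 9.81]] * 100))
```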

In summary, the system only needs a proper orientation initialization (either of the two ways mentioned above) and 6-axis IMU readings.

@radeonwu

radeonwu commented Nov 7, 2020

hi, kailaili,
Thank you for the reply.
I believe you are right. For the KA_Urban_East dataset, I manipulated the /imu/data input from the external IMU a bit before feeding it to backendFusion:
a. the first time, re-publish the full /imu/data message; from then on, keep the orientation value at zero
b. keep the orientation value at zero all the time
As a result, under (a) LiLi-OM runs properly, and under (b) the result is wrong.
Furthermore, under setting (b), adding imu_filter_madgwick for pre-processing also lets LiLi-OM run the KA_Urban_East dataset with a correct result.

About the second orientation initialization approach in your post, I am a bit curious: with that approach, the z-axis will align to world U (up), while the x and y axes are not aligned to world E and N. Does this mean that for the initial orientation, z alignment is enough? Correct me if I am wrong. [I was trying to read through the backendFusion source code and the paper, but I do not fully understand it yet.]

@kamibukuro5656
Contributor Author

kamibukuro5656 commented Nov 7, 2020

Hi radeonwu

There are a couple of things I can answer regarding your question.
First of all, LiLi-OM needs proper roll and pitch angles as initial values.
In case (b), it won't work because the initial roll and pitch angles are 0, which is not appropriate.
Secondly, if we apply the Madgwick filter, it will decompose the initial acceleration to estimate the initial roll and pitch angles.
This is why it works well when the Madgwick filter is applied.
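
For reference, a small sketch of that decomposition using the standard accelerometer tilt formulas (a common sign/axis convention, not code taken from the filter itself): with the sensor static, roll and pitch follow directly from the measured gravity components, while yaw stays unobservable without a magnetometer.

```python
import math

def roll_pitch_from_accel(ax, ay, az):
    """Tilt angles from a static accelerometer reading (gravity only)."""
    roll = math.atan2(ay, az)
    pitch = math.atan2(-ax, math.sqrt(ay * ay + az * az))
    return roll, pitch  # yaw cannot be recovered from the accelerometer alone

print(roll_pitch_from_accel(0.0, 0.0, 9.81))  # flat and level -> (0.0, 0.0)
```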

If there is any mistake in my understanding, please correct it.

@kamibukuro5656
Contributor Author

I also got the following answer from Livox about the unit of the built-in IMU:
Livox-SDK/livox_ros_driver#63

The unit is G.
Therefore, it seems correct to multiply the acceleration of the internal IMU by 9.8.
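
For illustration, a minimal rospy republisher along these lines (the output topic name is my own placeholder, and this is not the script from the pull request mentioned later in this thread): it subscribes to /livox/imu, converts the acceleration from G to m/s^2, and republishes it.

```python
#!/usr/bin/env python
# Minimal sketch: republish the Livox internal IMU with the acceleration
# converted from G to m/s^2 so that downstream nodes see metric units.
import rospy
from sensor_msgs.msg import Imu

G = 9.80665  # standard gravity [m/s^2]

def callback(msg):
    msg.linear_acceleration.x *= G
    msg.linear_acceleration.y *= G
    msg.linear_acceleration.z *= G
    pub.publish(msg)

if __name__ == "__main__":
    rospy.init_node("livox_imu_converter")
    pub = rospy.Publisher("/livox/imu_converted", Imu, queue_size=200)
    rospy.Subscriber("/livox/imu", Imu, callback, queue_size=200)
    rospy.spin()
```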

@radeonwu

radeonwu commented Nov 7, 2020

Thanks kamibukuro5656 for the prompt reply.
Regarding the "orientation reading at the very first frame to transform the frame to the ENU coordinates", could anyone point me to the corresponding lines in BackendFusion.cpp?

@radeonwu

radeonwu commented Nov 7, 2020

(quoting kamibukuro5656's note above on the unit of the built-in IMU)

This is a good update, thanks for sharing.
Regarding the Livox internal IMU's output, I find that some element values other than accel-z also differ from the external IMU. Two examples from the KA_Urban_East dataset are shown below.

Screenshot from 2020-11-04 00-01-41
Screenshot from 2020-11-03 23-57-45

@kailaili

kailaili commented Nov 7, 2020

(quoting radeonwu's question above about the orientation initialization lines)

You can find it starting from line 624 in BackendFusion.cpp, in the function imuHandler.

@kailaili

kailaili commented Nov 7, 2020

(quoting the exchange above about the internal IMU's unit and the differing element values)

As mentioned before, we also tried the internal IMU at the beginning, but we noticed that it is not as good as the external Xsens MTi-670. Therefore, we provided the external one. But you can definitely continue trying with the internal one now that the unit issue is clarified.

@kailaili

kailaili commented Nov 7, 2020

(quoting kamibukuro5656's explanation above about the roll and pitch initial values)

Thank you for your update. The second initialization method that I mentioned above is basically what the Madgwick filter does. A side note: orientations in LiLi-OM are always represented by quaternions, so if you want a "zero" orientation, it should be [1, 0, 0, 0]^T.
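
A small illustration of that side note using the standard sensor_msgs/Imu message (a freshly constructed message carries an all-zero orientation, which is not a valid rotation):

```python
from sensor_msgs.msg import Imu

msg = Imu()
# "Zero" orientation means the identity quaternion, not (0, 0, 0, 0):
msg.orientation.w = 1.0  # [w, x, y, z] = [1, 0, 0, 0]
msg.orientation.x = 0.0
msg.orientation.y = 0.0
msg.orientation.z = 0.0
```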

@kamibukuro5656
Contributor Author

Hello.

Finally, I was able to get good results with only the built-in IMU of Horizon.
I used Horizon to scan a small park in my neighborhood.
As a result, Lili-OM worked well with the built-in IMU only.
The SLAM results match up well with the aerial photos, as shown in the figure below.

birds_eye
Screenshot from 2020-11-08 20-29-27

Finally, I'll summarize what I did to get these results.

  1. Multiply the acceleration of the internal IMU by 9.8
  2. Estimate the appropriate initial roll and pitch angles (using the Madgwick Filter)
  3. Perform parameter tuning of Lili-OM

Once again, thanks for publishing the great results.
Thank you.

@radeonwu

radeonwu commented Nov 8, 2020

Hi, kamibukuro5656,
It looks great.
Can I further check with you,

  1. For the internal IMU output, does only the acceleration in the Z direction need to be multiplied by 9.8, or do accel-x/accel-y need to be multiplied as well?
  2. For LiLi-OM, which parameters are essential to tune for good performance?

Thanks

@kailaili

kailaili commented Nov 8, 2020

(quoting kamibukuro5656's success report above)

Happy to hear that!

@radeonwu

@kailaili @kamibukuro5656 ,

I am back again for help ^_^. I kept trying the internal IMU data with the Madgwick filter, yet I haven't succeeded on my own dataset. I am wondering where I might have gone wrong, or whether certain parameters should be tuned further. Your suggestions are greatly appreciated.

The most common symptom is that after LiLi-OM runs for some time, the trajectory starts to drift. A screenshot is attached here for illustration.

Screenshot from 2020-11-15 20-44-57

The rqt_graph screenshot is also included here,
Screenshot from 2020-11-15 21-04-32

I recorded a small dataset using the Livox, with the link attached here for reference:
https://drive.google.com/file/d/1vHGjzoyxBcN9wFLxiz0qAz63DWeNQhfs/view?usp=sharing
The data contains a small loop, driving in and out of a car park; topics include a single lidar, the internal IMU, and an external IMU (also 6-axis, for comparison).

@radeonwu

An update on the issue I raised last time:
The reason the algorithm previously did not work properly on this dataset is that I overlooked the special sign convention of the external IMU's angular velocity, i.e., the signs of the pitch and yaw velocity values have to be reversed. After fixing that, the algorithm could finish the mapping process.
Anyway, it was my own mistake. I will open a separate issue to discuss non-convergence in U-turn situations.

(quoting my earlier help request above)

@YushengWHU

(quoting kamibukuro5656's success report above)

Thanks for your work, but how do you project the lidar map results onto the Google map: by directly transforming the coordinates into BLH, or by other methods? So far I have only seen it done with picture-editing programs like Photoshop.

@simon-rob

@kamibukuro5656 I am keen to try the Horizon using the internal IMU. Have you posted your changes/code so that I could give it a try?

Many thanks

Simon

@kamibukuro5656
Contributor Author

@simon-rob

I have submitted the script needed to run LiLi-OM on the internal IMU.
Please check pull request #19.

@simon-rob

@kamibukuro5656

Many thanks for submitting this, I will give it a go!

@mbcel

mbcel commented May 11, 2021

I am also seeing a time shift between the internal IMU and the external IMU (clearly visible in angular_velocity/y in the plot below), which could be catastrophic for the filter. Is this compensated for anywhere?

However, I am also not sure whether rqt_plot uses the msg header timestamps or the system time for the x-axis, so this could also just be due to my setup.

imu_plot
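
One way to check this independently of rqt_plot is to compare the header stamps straight from the bag; a rough sketch (the bag path is a placeholder for your own recording):

```python
import itertools
import rosbag

# Quick check: print the first few header stamps of each IMU topic so that
# their time bases can be compared directly, independent of plotting tools.
with rosbag.Bag("my_data.bag") as bag:
    for topic in ("/livox/imu", "/imu/data"):
        msgs = itertools.islice(bag.read_messages(topics=[topic]), 5)
        print(topic, [m.header.stamp.to_sec() for _, m, _ in msgs])
```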

@aiqibaobei

I think this issue helped me a lot in deploying this project with the internal IMU. If anyone wants to use the internal IMU, remember to set the x and y acceleration and angular velocity to negative.
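
A minimal illustration of that sign flip on a sensor_msgs/Imu message (whether your setup actually needs it depends on the frame conventions of your driver):

```python
from sensor_msgs.msg import Imu

def flip_xy_signs(msg):
    """Negate the x/y acceleration and angular velocity of an Imu message in place."""
    msg.linear_acceleration.x = -msg.linear_acceleration.x
    msg.linear_acceleration.y = -msg.linear_acceleration.y
    msg.angular_velocity.x = -msg.angular_velocity.x
    msg.angular_velocity.y = -msg.angular_velocity.y
    return msg
```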
