Is there an optimal choice of configuration space for Lie group integration schemes applied to constrained MBS?
June 18, 2024 · Declared Dead · arXiv.org
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
Andreas Mueller, Zdravko Terze
arXiv ID
2407.03353
Category
math.NA: Numerical Analysis
Cross-listed
cs.RO
Citations
2
Venue
arXiv.org
Last Checked
2 months ago
Abstract
Recently, various numerical integration schemes have been proposed for simulating the dynamics of constrained multibody systems (MBS). These integration schemes operate directly on the MBS configuration space considered as a Lie group. For discrete spatial mechanical systems there are two Lie groups that can be used as configuration space: $SE(3)$ and $SO(3)\times\mathbb{R}^{3}$. Since the performance of the numerical integration scheme clearly depends on the underlying configuration space, it is important to analyze the effect of using either variant. For constrained MBS a crucial aspect is constraint satisfaction. In this paper the constraint violations observed for the two variants are investigated. It is concluded that the $SE(3)$ formulation outperforms the $SO(3)\times\mathbb{R}^{3}$ formulation if the absolute motions of the rigid bodies, as part of a constrained MBS, belong to a motion subgroup. In all other cases the two formulations are equivalent, and the $SO(3)\times\mathbb{R}^{3}$ formulation should then be preferred since the $SE(3)$ formulation is numerically more complex.
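The abstract contrasts two configuration-space choices: on $SE(3)$ the exponential map couples rotation and translation through the left Jacobian, while on $SO(3)\times\mathbb{R}^{3}$ they update independently. A minimal sketch of one explicit update step under each choice (illustrative only: the function names and the simple explicit update are assumptions for exposition, not the paper's actual integration schemes):

```python
import numpy as np

def hat(w):
    """Map a 3-vector to its skew-symmetric matrix [w]x."""
    return np.array([[0.0, -w[2], w[1]],
                     [w[2], 0.0, -w[0]],
                     [-w[1], w[0], 0.0]])

def exp_SO3(w):
    """Rodrigues' formula: exponential map on SO(3)."""
    th = np.linalg.norm(w)
    if th < 1e-12:
        return np.eye(3)
    K = hat(w / th)
    return np.eye(3) + np.sin(th) * K + (1.0 - np.cos(th)) * (K @ K)

def exp_SE3(w, v):
    """Exponential map on SE(3): the translation part is coupled to the
    rotation through the left Jacobian V(w)."""
    th = np.linalg.norm(w)
    R = exp_SO3(w)
    if th < 1e-12:
        V = np.eye(3)
    else:
        K = hat(w)
        V = (np.eye(3)
             + (1.0 - np.cos(th)) / th**2 * K
             + (th - np.sin(th)) / th**3 * (K @ K))
    return R, V @ v

def step_SO3xR3(R, r, w, v, h):
    """One explicit step on SO(3) x R^3: rotation and translation
    are updated independently of each other."""
    return R @ exp_SO3(h * w), r + h * v

def step_SE3(R, r, w, v, h):
    """One explicit step on SE(3): the translation increment passes
    through the coupled exponential and the current rotation."""
    dR, dr = exp_SE3(h * w, h * v)
    return R @ dR, r + R @ dr
```

For zero angular velocity the two updates coincide; with nonzero angular velocity the translation increments differ, which is the source of the differing constraint-violation behavior the abstract discusses.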
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt: Numerical Analysis (all 👻 Ghosted)
Deep learning-based numerical methods for high-dimensional parabolic partial differential equations and backward stochastic differential equations
PDE-Net: Learning PDEs from Data
Efficient tensor completion for color image and video recovery: Low-rank tensor train
Tensor Ring Decomposition
Machine learning approximation algorithms for high-dimensional fully nonlinear partial differential equations and second-order backward stochastic differential equations
Died the same way: 👻 Ghosted
Language Models are Few-Shot Learners
PyTorch: An Imperative Style, High-Performance Deep Learning Library
XGBoost: A Scalable Tree Boosting System