Maximizing Ink in Partial Edge Drawings of k-plane Graphs
August 23, 2019 · Declared Dead · International Symposium on Graph Drawing and Network Visualization
"No code URL or promise found in abstract"
Evidence collected by the PWNC Scanner
Authors
Matthias Hummel, Fabian Klute, Soeren Nickel, Martin Nöllenburg
arXiv ID
1908.08905
Category
cs.CG: Computational Geometry
Cross-listed
cs.DS
Citations
8
Venue
International Symposium on Graph Drawing and Network Visualization
Last Checked
2 months ago
Abstract
Partial edge drawing (PED) is a drawing style for non-planar graphs, in which edges are drawn only partially as pairs of opposing stubs on the respective end-vertices. In a PED, by erasing the central parts of edges, all edge crossings and the resulting visual clutter are hidden in the undrawn parts of the edges. In symmetric partial edge drawings (SPEDs), the two stubs of each edge are required to have the same length. It is known that maximizing the ink (or the total stub length) when transforming a straight-line graph drawing with crossings into a SPED is tractable for 2-plane input drawings, but NP-hard for unrestricted inputs. We show that the problem remains NP-hard even for 3-plane input drawings and establish NP-hardness of ink maximization for PEDs of 4-plane graphs. Yet, for k-plane input drawings whose edge intersection graph forms a collection of trees or, more generally, whose intersection graph has bounded treewidth, we present efficient algorithms for computing maximum-ink PEDs and SPEDs. We implemented the treewidth-based algorithms and report on a brief experimental evaluation.
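The objective described in the abstract can be made concrete with a small sketch. The Python code below (my own illustration, not the authors' treewidth-based algorithm; the function names and the discretization of stub lengths to crossing positions are assumptions) brute-forces a maximum-ink SPED for a tiny straight-line drawing: each edge gets one symmetric stub fraction, and every crossing must be hidden on at least one of its two edges.

```python
import math
from itertools import product

def seg_intersection_param(p1, p2, p3, p4):
    """Return (t, u), the fractional positions of a proper crossing
    along segments p1p2 and p3p4, or None if they do not cross."""
    (x1, y1), (x2, y2), (x3, y3), (x4, y4) = p1, p2, p3, p4
    dx1, dy1 = x2 - x1, y2 - y1
    dx2, dy2 = x4 - x3, y4 - y3
    denom = dx1 * dy2 - dy1 * dx2
    if denom == 0:
        return None  # parallel or collinear
    t = ((x3 - x1) * dy2 - (y3 - y1) * dx2) / denom
    u = ((x3 - x1) * dy1 - (y3 - y1) * dx1) / denom
    return (t, u) if 0 < t < 1 and 0 < u < 1 else None

def max_ink_sped(edges):
    """Brute-force maximum-ink SPED for a tiny straight-line drawing.

    Each edge gets a symmetric stub fraction delta in [0, 1/2] at both
    endpoints.  A crossing at fractional position t along an edge is
    hidden on that edge iff delta <= min(t, 1 - t) (equality: the stub
    ends exactly at the crossing).  A valid SPED hides every crossing
    on at least one of its two edges; ink = sum of 2 * delta * length.
    """
    n = len(edges)
    lengths = [math.dist(a, b) for a, b in edges]
    cands = [{0.5} for _ in range(n)]  # 0.5 = draw the edge fully
    crossings = []                     # (i, j, limit_i, limit_j)
    for i in range(n):
        for j in range(i + 1, n):
            hit = seg_intersection_param(*edges[i], *edges[j])
            if hit:
                t, u = hit
                li, lj = min(t, 1 - t), min(u, 1 - u)
                crossings.append((i, j, li, lj))
                cands[i].add(li)  # stopping exactly at a crossing
                cands[j].add(lj)  # is always a candidate optimum
    best_ink, best_choice = -1.0, None
    for choice in product(*(sorted(c) for c in cands)):
        if all(choice[i] <= li or choice[j] <= lj
               for i, j, li, lj in crossings):
            ink = 2.0 * sum(d * L for d, L in zip(choice, lengths))
            if ink > best_ink:
                best_ink, best_choice = ink, choice
    return best_ink, best_choice
```

For two length-4 edges crossing at a quarter of each, the optimum keeps one edge fully drawn and clips the other at the crossing, for an ink of 6 out of a possible 8. The paper's point is precisely that this exponential enumeration can be replaced by polynomial dynamic programming when the edge intersection graph is a tree or has bounded treewidth.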
Community Contributions
Found the code? Know the venue? Think something is wrong? Let us know!
Similar Papers
In the same crypt · Computational Geometry
Dynamic Planar Convex Hull · R.I.P. 👻 Ghosted
TEMPO: Feature-Endowed Teichmüller Extremal Mappings of Point Clouds · R.I.P. 👻 Ghosted
Explainable Artificial Intelligence for Manufacturing Cost Estimation and Machining Feature Visualization · R.I.P. 👻 Ghosted
Coresets for Clustering in Euclidean Spaces: Importance Sampling is Nearly Optimal · R.I.P. 👻 Ghosted
Momen(e)t: Flavor the Moments in Learning to Classify Shapes · R.I.P. 👻 Ghosted
Died the same way · 👻 Ghosted
Language Models are Few-Shot Learners · R.I.P. 👻 Ghosted
PyTorch: An Imperative Style, High-Performance Deep Learning Library · R.I.P. 👻 Ghosted
XGBoost: A Scalable Tree Boosting System · R.I.P. 👻 Ghosted