I want to use DeleteCases
(or any appropriate function) to remove elements of a list, where each element is itself a list of fixed size. The rule I wish to apply is that any element which already appears elsewhere in the list, up to multiplication of its values by -1, should be removed. But I don't know how to do this kind of tricky pattern matching. For example, I have the following list:
myData = {{h -> 255.155, c -> 0, s -> -10000.},
{h -> -255.155, c -> 0, s -> 10000.},
{h -> 0, c -> 0, s -> 10000.},
{h -> 0, c -> 0, s -> -10000.},
{h -> 255.155, c -> 0, s -> 10000.},
{h -> -255.155, c -> 0, s -> -10000.},
{h -> -255.155, c -> 1870.83, s -> 3535.53},
{h -> 255.155, c -> -1870.83, s -> -3535.53},
{h -> 0, c -> 1870.83, s -> 3535.53},
{h -> 0, c -> -1870.83, s -> -3535.53},
{h -> 255.155, c -> 1870.83, s -> 3535.53},
{h -> -255.155, c -> -1870.83, s -> -3535.53},
{h -> 255.155, c -> -4000., s -> 0},
{h -> -255.155, c -> 4000., s -> 0},
{h -> 0, c -> 4000., s -> 0},
{h -> 0, c -> -4000., s -> 0},
{h -> 255.155, c -> 4000., s -> 0},
{h -> -255.155, c -> -4000., s -> 0},
{h -> 255.155, c -> 1870.83, s -> -3535.53},
{h -> -255.155, c -> -1870.83, s -> 3535.53},
{h -> 0, c -> 1870.83, s -> -3535.53},
{h -> 0, c -> -1870.83, s -> 3535.53},
{h -> 255.155, c -> -1870.83, s -> 3535.53},
{h -> -255.155, c -> 1870.83, s -> -3535.53},
{h -> 255.155, c -> 0, s -> 0},
{h -> -255.155, c -> 0, s -> 0},
{h -> 0, c -> 0, s -> 0}}
As can be seen, the first element myData[[1]] (which is a list of three rules) is a duplicate, up to minus signs, of myData[[2]], myData[[5]], and myData[[6]]. But I don't know how to get Mathematica to remove these.
As an added bonus, it would be nice if, among the elements that are duplicates up to -1, the one with the fewest negative values were kept and all the others removed. (In the example above, myData[[5]] would be among those kept.)
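For reference, a direct attempt would be DeleteDuplicates with an explicit comparison test, as in the following sketch; note that it keeps the first member of each sign-equivalence class, which is not necessarily the one with the fewest negatives:

```mathematica
(* Two entries count as duplicates when their coordinate values agree up
   to sign; an explicit pairwise test like this is O(n^2) in the length *)
DeleteDuplicates[myData,
  Abs[{h, c, s} /. #1] == Abs[{h, c, s} /. #2] &]
```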
Answer
The problem with both Union-based and DeleteDuplicates-based solutions is that, with an explicit comparison function, both have quadratic complexity in the size of the list. Here is code which should be much faster for larger lists:
Reap[
Sow[#, Abs[{{h, c, s} /. #}]] & /@ myData,
_,
First@#2[[Ordering[UnitStep[-{h, c, s} /. #2]]]] &
][[2]]
(*
{{h->255.155,c->0,s->10000.},{h->0,c->0,s->10000.},
{h->255.155,c->1870.83,s->3535.53},{h->0,c->1870.83,s->3535.53},
{h->255.155,c->4000.,s->0},{h->0,c->4000.,s->0},
{h->255.155,c->0,s->0},{h->0,c->0,s->0}}
*)
It is based on tagging the list entries with some function of the entry that serves as an equivalence tag, and then postprocessing the resulting lists. It should have a near-linear complexity with respect to the length of the list.
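A sketch of the same equivalence-tag idea can also be written with GatherBy, grouping entries whose coordinates agree up to sign and then explicitly keeping, from each group, the entry with the fewest negative values (the "bonus" requirement):

```mathematica
(* Group entries whose coordinate values agree up to sign, then keep
   from each group the entry with the fewest negative values *)
First[SortBy[#, Count[{h, c, s} /. #, _?Negative] &]] & /@
  GatherBy[myData, Abs[{h, c, s} /. #] &]
```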