Bug introduced in 10.0 and fixed in 11.0
I am using ArrayPlot to show some arrays with values from 0 to 1. I want 0 to be black and 1 to be white, so I am using:
ColorFunction -> Function[a, RGBColor[a, a, a]]
(I realize I could use GrayLevel, but I eventually want to play with the components of RGBColor.)
The problem is that, in the context of ArrayPlot, this color function misbehaves on all-zero arrays: the cells are drawn empty (opacity 0) instead of black. If even one value in the array is greater than zero, however small, all the cells plot fine.
Note that I am using ColorFunctionScaling -> False.
I finally found a workaround: giving RGBColor an explicit alpha channel. I don't think that should be necessary, though.
Code example
{ArrayPlot[{{0}},
  ColorFunction -> Function[a, RGBColor[a, a, a]],
  ColorFunctionScaling -> False],
 ArrayPlot[{{0}},
  ColorFunction -> Function[a, RGBColor[a, a, a, 1]],
  ColorFunctionScaling -> False]}
The first produces an empty (transparent) square, the second a black one. Both should be black, no?
Other information
- The same happens with MatrixPlot.
- No problem with other plot functions, like DensityPlot or ListDensityPlot (even with InterpolationOrder -> 0).
- I'm using Mathematica 10.4.0.0.