
graphics - Coloring with Hue for a function on a lattice grid


I wish to color a 2-dimensional lattice grid according to the value of a function at each lattice node. More specifically, if I have 9 angles in a 3×3 array,


angles={{0, π, π}, {0, 0, π/2}, {π/2, 0, 3 π/3}}

then these angles can be plotted on a lattice grid with the following Mathematica code:



angles = {{0, π, π}, {0, 0, π/2}, {π/2, 0, 3 π/3}};
GraphicsGrid[
 Map[Graphics[{
     LightGray, Circle[{0, 0}, 1],
     Hue[#/(2 π), .6, .8], Thick, Arrowheads[Medium],
     Arrow[{{0, 0}, {Cos[#], Sin[#]}}]}] &,
  angles, {2}]]

Here the colouring is done with Hue according to the value of each angle. However, I now want to colour each node according to a function f[i, j] instead; to be specific,


f[i_,j_]:=Cos[angles[[i + 1, j]] - angles[[i, j]]] + Cos[angles[[i, j+1]] - angles[[i, j]]]; 


with


angles[[n+1,i_]]:=angles[[1,i]];
angles[[i_,n+1]]:=angles[[i,1]];

i.e., periodic boundary conditions.
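These two lines are only schematic (Part cannot be given pattern-based definitions like this), but the intended periodic wrapping can be written with Mod, for example. A minimal sketch, using the placeholder names wrap and fPeriodic (not from the original post):


wrap[k_, n_] := Mod[k - 1, n] + 1;  (* fold index n + 1 back to 1 *)
fPeriodic[i_, j_] := With[{n = Length[angles]},
  Cos[angles[[wrap[i + 1, n], j]] - angles[[i, j]]] +
   Cos[angles[[i, wrap[j + 1, n]]] - angles[[i, j]]]]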


In the first piece of code, Hue is applied to angles[[i,j]] (via #), which is straightforward. But is it possible to use f[i,j] instead, with f defined as above?
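One possibility (a sketch only, not the accepted answer below) is to switch from Map to MapIndexed, which passes the node position {i, j} as #2; the value of f at each node can then be rescaled into Hue's 0–1 range. This uses the placeholder fPeriodic from above and the fact that a sum of two cosines lies in [-2, 2]:


GraphicsGrid[
 MapIndexed[
  Graphics[{LightGray, Circle[{0, 0}, 1],
     Hue[Rescale[fPeriodic @@ #2, {-2, 2}, {0, 1}], .6, .8],
     Thick, Arrowheads[Medium],
     Arrow[{{0, 0}, {Cos[#1], Sin[#1]}}]}] &,
  angles, {2}]]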


P.S. This question is related to: data visualization on a lattice grid


Thanks!


dbm


P.S. The final code, thanks to BoLe's answers:



Clear["Global`*"];
n := 10
angles = Table[RandomReal[{-Pi, Pi}], {i, n}, {j, n}];
f[here_, down_, right_] := Cos[down - here] + Cos[right - here]
(* g collects the value at {i, j} together with its lower and right
   neighbours, wrapping at the last row/column with a sign flip *)
g[list_, {i_, j_}] :=
 Module[{m, n}, {m, n} = Dimensions[list]; {list[[i, j]],
   If[i != m, list[[i + 1, j]], -list[[1, j]]],
   If[j != n, list[[i, j + 1]], -list[[i, 1]]]}]
GraphicsGrid[
 MapIndexed[
  Graphics[{LightGray, Circle[{0, 0}, 1],
Hue[Rescale[f @@ g[angles, #2], {-2, 2}, {0, 1}]], Thin,
Arrowheads[Small], Arrow[{{0, 0}, {Cos[#], Sin[#]}}]}] &,
angles, {2}]]

