
expression manipulation - Assume symbolic elements as reals in TensorExpand


Let's assume that a and b are complex variables, and A[t], B[u], and C[t, u] are matrices.


In[1]   TensorExpand[(a A[t].B[u] + b C[t, u]).C[t, u]]

Out[1] (b C[t, u]).C[t, u] + (a A[t].B[u]).C[t, u]

Now this is not quite what I want. Since a and b are complex scalars, the parentheses are unnecessary. But wait, I can let Mathematica know that they are not vectors or matrices:


In[1]  $Assumptions = Element[{a, b}, Complexes]
TensorExpand[(a A[t].B[u] + b C[t, u]).C[t, u]]
Out[1] b C[t, u].C[t, u] + a A[t].B[u].C[t, u]
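Note that assigning `$Assumptions` like this changes global session state, not just this one evaluation. A quick sketch of checking this (session output paraphrased, not from the original post):

```mathematica
(* $Assumptions is a global variable: the assignment above persists
   for the rest of the session and affects every later call to
   Simplify, Refine, TensorExpand, etc. *)
$Assumptions
(* Element[{a, b}, Complexes] -- still set, session-wide *)
```

This is part of the motivation for wrapping the job in a function that sets the assumptions only where they are needed.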

But I want to take this one step further: let's write a function that takes care of the job:


In[1]  myExpand[expr_] := Module[{},
$Assumptions = Element[_Symbol, Complexes];

Return[TensorExpand[expr]];
]
Out[1] (b C[t, u]).C[t, u] + (a A[t].B[u]).C[t, u]

Well, that didn't work as I wanted; apparently $Assumptions expects concrete symbols, and a pattern like _Symbol is not expanded to mean "every symbol". But I can still collect all the symbolic variables from the expression:


In[1]  symbolicQ[x_] := MatchQ[Head[x], Symbol];
myExpand[expr_] := Module[{},
$Assumptions = Element[DeleteDuplicates[Select[Level[expr, Infinity], symbolicQ]], Complexes];
Return[TensorExpand[expr]];
]
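As a sanity check, this is what the Select/Level combination actually collects from the example expression (a sketch; the exact ordering may differ):

```mathematica
symbolicQ[x_] := MatchQ[Head[x], Symbol];
expr = (a A[t].B[u] + b C[t, u]).C[t, u];
(* walk every level of the expression tree, keep only atomic symbols *)
DeleteDuplicates[Select[Level[expr, Infinity], symbolicQ]]
(* something like {a, t, u, b} *)
```

Note that the indices t and u are swept up along with the coefficients a and b, so they end up in the assumptions as well.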


However, this feels clumsy. Is there a better way? If there are several, I'd prefer the more performant one.



Answer



This works:


myExpand[expr_] := Assuming[
Cases[expr, _Symbol, All] ∈ Reals,
TensorExpand@expr
]

myExpand[(a A[t].B[u] + b C[t, u]).C[t, u]]

(* b C[t, u].C[t, u] + a A[t].B[u].C[t, u] *)
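For comparison, this is what the `Cases[expr, _Symbol, All]` part of the function produces on its own (a sketch; ordering may differ):

```mathematica
Cases[(a A[t].B[u] + b C[t, u]).C[t, u], _Symbol, All]
(* duplicates such as t and u appear more than once; Element accepts
   the list as-is, so no DeleteDuplicates step is needed *)
```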

A few notes:



  • Your Module is unnecessary: used like this, with an empty list of local variables, it doesn't do anything

  • Return is also unnecessary (even with Module). See here

  • If you want to temporarily set $Assumptions, use Assuming. In general (i.e. when there's no function that does it for you), use Block: Block[{$var=…},…]

  • Use Cases[expr,pat,level] instead of Select[Level[expr,level],check]

  • Use MatchQ[expr,_head] instead of MatchQ[Head[expr],head]. This also makes Cases more natural here
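Putting the Block advice to work, a temporary-scoping variant of the same function could look like this (a sketch equivalent to the Assuming version above):

```mathematica
(* Block restores $Assumptions when it exits, so the global state is
   untouched even if TensorExpand throws or aborts. Unlike Assuming,
   which appends to any existing assumptions, this replaces them for
   the duration of the call. *)
myExpand[expr_] := Block[
  {$Assumptions = Cases[expr, _Symbol, All] ∈ Reals},
  TensorExpand[expr]
]
```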



