
scoping - What are some advanced uses for Block?


I read the answers to this question (What are the use cases for different scoping constructs?) and this one (Condition, Block, Module - which way is the most memory and computationally efficient?).


According to those, Block is safer (if something aborts, it restores the values) and faster (perhaps something to do with the low-level pointer redirection that I believe it uses) than Module, but less memory-efficient if the function is defined a certain way.


That being said, (1) why does Leonid say that Module is "safer" when its garbage collection is not as good, and (2) if I am to use Module most of the time, what are some of the "advanced" uses which require Block?



Answer




Safety


Module is safer than Block because:




  • It is a lexical scoping construct, which means that variable bindings are only tied to a specific piece of code. Variables outside that piece of code are never affected by these bindings.


    In contrast, Block basically binds a variable to a piece of the execution stack, not to a piece of code. Such bindings are much harder to understand and debug, since the execution stack is not carved in stone: it is dynamic and usually data-dependent.




  • The way Module resolves variable collisions is such that the integrity of inner or outer level bindings is never broken (at least in theory - in practice the lexical scoping is emulated in Mathematica and can be broken, but let's say this is very unlikely to happen by itself).


    In contrast, with nested Block-s the variable simply takes the value (re)defined most recently, and those different Block-s can be in different functions - while nested Module-s normally live in a single function.





Both of these points lead to the same conclusion: code which uses Block is harder to understand and debug. Basically, it is almost the same as using global variables (which are, however, guaranteed to get their values back after Block finishes).
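A minimal sketch of the difference, using illustrative names x and showX: a Block binding is visible to any function called while the Block executes, whereas a Module variable is a fresh symbol that called functions never see.

ClearAll[x, showX];
showX[] := x;  (* reads the global symbol x *)

Block[{x = 1}, showX[]]

(* 1 *)

Module[{x = 1}, showX[]]

(* x *)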


Advanced uses of Block


Probably the main one is to change the order of evaluation non-trivially, in a way not easily possible with other constructs. Block-ed functions or symbols forget what they were, and therefore evaluate to themselves. This often makes it possible to alter the order of evaluation of expressions in non-trivial ways.
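As a minimal illustration, Block-ing even a built-in symbol such as Sin makes it inert inside the Block:

Block[{Sin}, Sin[0]]

(* Sin[0] *)

Sin[0]

(* 0 *)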


I will show a couple of examples.


Example: emulating OptionValue


Here is one, from this answer: a possible emulation of OptionValue, which is one of the most magical parts of the pattern-matcher:


Module[{tried},
  Unprotect[SetDelayed];
  SetDelayed[f_[args___, optpt : OptionsPattern[]], rhs_] /;
      ! FreeQ[Unevaluated[rhs], autoOptions[]] :=
    Block[{tried = True},
      f[args, optpt] :=
        Block[{autoOptions}, autoOptions[] = Options[f]; rhs]
    ] /; ! TrueQ[tried];
  Protect[SetDelayed];
]

The usage:


Options[foo] = {bar -> 1};
foo[OptionsPattern[]] := autoOptions[]

foo[]


(* {bar -> 1} *)
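For comparison, here is the built-in mechanism being emulated, using a fresh symbol foo2 for illustration:

Options[foo2] = {bar -> 1};
foo2[OptionsPattern[]] := OptionValue[bar]

foo2[]

(* 1 *)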

The Villegas-Gayley trick of function redefinition


(call : f[args___]) /; ! TrueQ[inF] :=
  Block[{inF = True},
    your code;
    call
  ]

This allows you to inject your own code into another function while avoiding infinite recursion. It is very useful both for user-defined and built-in functions, as sketched below.
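For instance, here is a minimal sketch that injects a tracing Print into the built-in Sin; the guard variable $insideSin is a name chosen for this example:

Unprotect[Sin];
(call : Sin[args___]) /; ! TrueQ[$insideSin] :=
  Block[{$insideSin = True},
    Print["Sin called with: ", {args}];
    call  (* re-evaluates with the guard set, so this rule no longer fires *)
  ];
Protect[Sin];

Sin[Pi/2]

(* prints Sin called with: {Pi/2}, then returns 1 *)

The injected rule can be removed afterwards by Unprotect-ing Sin again and clearing its DownValues.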


Safe memoization


fib[n_] :=
  Block[{fib},
    fib[0] = fib[1] = 1;
    fib[k_] := fib[k] = fib[k - 1] + fib[k - 2];
    fib[n]
  ]


The point here is that the memoized values will be cleared automatically when Block exits.
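A quick check, assuming the definition above has been evaluated:

fib[20]

(* 10946 *)

Length[DownValues[fib]]

(* 1 *)

Only the outer definition remains; no fib[k] = ... values leak out.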


Making sure the program does not end up in an illegal state in case of Aborts or exceptions


a = 1; b = 2;
Block[{a = 3, b = 4},
Abort[]
]

The point here is that the values of a and b are guaranteed not to be altered globally by the code inside Block, whatever it is.
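After the aborted evaluation above, the global values are intact:

{a, b}

(* {1, 2} *)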


Change the order of evaluation, or change some function's properties



Comparison operators are not Listable by default, but we can make them so, locally:


Block[{Greater},
SetAttributes[Greater, Listable];
Greater[{1, 2, 3, 4, 5}, {5, 4, 3, 2, 1}]
]

(* {False, False, False, True, True} *)
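And the attribute change does not leak out of the Block:

MemberQ[Attributes[Greater], Listable]

(* False *)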

Preventing premature evaluation


This is a generalization of the standard memoization idiom f[x_]:=f[x] = ..., which will work on arguments that are arbitrary Mathematica expressions. The main problem here is to treat arguments containing patterns correctly, and to avoid premature evaluation of the arguments. The Block trick is used to avoid infinite recursion while implementing the memoization.



ClearAll[calledBefore];
SetAttributes[calledBefore, HoldAll];
Module[{myHold},
  Attributes[myHold] = {HoldAll};
  calledBefore[args___] :=
    (
      Apply[Set,
        Append[
          Block[{calledBefore},
            Hold[Evaluate[calledBefore[Verbatim /@ myHold[args]]]] /.
              myHold[x___] :> x],
          True]];
      False
    )
]

Block is used here to prevent the premature evaluation of calledBefore. The difference between this version and the naive one shows up on expressions involving patterns, such as this:


calledBefore[oneTimeRule[(head:RuleDelayed|Rule)[lhs_,rhs_]]]
calledBefore[oneTimeRule[(head:RuleDelayed|Rule)[lhs_,rhs_]]]


(*
False
True
*)

where the naive f[x_]:=f[x]=... idiom will give False both times.


Creating local environments


The following function allows you to evaluate some code under certain assumptions, by changing the $Assumptions variable locally. This is just the usual temporary change to a global variable, expressed as a function.


ClearAll[computeUnderAssumptions];
SetAttributes[computeUnderAssumptions, HoldFirst];

computeUnderAssumptions[expr_, assumptions_List] :=
Block[{$Assumptions = And[$Assumptions, Sequence @@ assumptions]},
expr];
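A usage sketch, assuming x has no global value:

computeUnderAssumptions[Simplify[Sqrt[x^2]], {x > 0}]

(* x *)

Simplify[Sqrt[x^2]]

(* Sqrt[x^2] *)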

Local UpValues


This example came from a Mathgroup question, where I answered using the Block trick.


The problem is as follows: one has two (or more) long lists stored in indexed variables:


sym[1] = RandomInteger[10^6, 10^6];
sym[2] = RandomInteger[10^6, 10^6];
sym[3] = ...


One has to perform a number of operations on them, but somehow knows (symbolically) that Intersection[sym[1],sym[2]] == 42 (not true for the above lists, but this is for the sake of example). One would therefore like to avoid the time-consuming computation


Intersection[sym[1],sym[2]];//AbsoluteTiming

(*
{0.3593750, Null}
*)

in such a case, and use that symbolic knowledge. The first attempt is to define a custom function like this:


ClearAll[myIntersection];

Attributes[myIntersection] = {HoldAll};
myIntersection[sym[i_], sym[j_]] := 42;
myIntersection[x_, y_] := Intersection[x, y];

This uses the symbolic answer for sym[_] arguments and falls back to the normal Intersection for all others. It has the HoldAll attribute to prevent premature evaluation of the arguments. And it works in this case:


myIntersection[sym[1], sym[2]]

(* 42 *)

but not here:



a:=sym[1];
b:=sym[2];
myIntersection[a,b];//Timing

(* {0.359,Null} *)

The point is that, having given myIntersection the HoldAll attribute, we prevented it from matching the sym[_] pattern for a and b, since it does not evaluate them and so does not know what they store at the moment of the match. Without that capability, the utility of myIntersection is very limited.


So, here is the solution using the Block trick to introduce local UpValues:


ClearAll[myIntersectionBetter];
Attributes[myIntersectionBetter] = {HoldAll};

myIntersectionBetter[args___] :=
Block[{sym},
sym /: Intersection[sym[a_], sym[b_]] := 42;
Intersection[args]];

What this does is Block the values of sym[1], sym[2], etc. inside its body, and use UpValues for sym to softly redefine Intersection for them. If the rule does not match, the "normal" Intersection automatically comes into play after execution leaves Block. So now:


myIntersectionBetter[a,b]

(* 42 *)


This seems to be one of the cases where it would be rather hard to achieve the same result by other means. I find local UpValues a generally useful technique, and have used them in a couple more situations where they also saved the day.


Enhanced encapsulation control


This will load the package but not add its context to the $ContextPath:


Block[{$ContextPath}, Needs[your-package]]

This will disable any global modifications that the package being loaded could make to a given symbol:


Block[{symbolInQuestion}, Needs[the-package]]

There are many more applications; Block is a very versatile device. For some more intricate ones, see e.g. this answer, which provides a means for new definitions to be tried before older ones - a feature that would be very hard to get by other means. I will add more examples as they come to mind.

