Dr. William Bricken: Laws of Form and Boolean Algebra
This post contains some cool philosophical and scientific (logic) material, posted here publicly for the first time after its recent appearance in the “Laws of Form Forum” Yahoo group (where I have been a member since 2003).
Dr. William Bricken has also given me his permission to re-post what follows.
–Who is Dr. Bricken?
Dr. William Bricken is currently a Research Professor of Education at the University of Washington and a Consultant for Interval Research Corporation, where he is working on unifying hardware and software approaches to computation. Dr. Bricken’s prior positions include Principal Scientist of UW’s Human Interface Technology Laboratory, where he designed and implemented the Virtual Environment Operating System and interactive tools of the VR environment; Director of Autodesk Research Lab, which developed the Cyberspace CAD prototype of virtual reality; and Principal Research Scientist at Advanced Decision Systems, where he pioneered high-performance inference engines, visual programming systems, and instructable interfaces. Dr. Bricken holds a multidisciplinary Ph.D. in Research Methodology, Educational Psychology and Computer Science from Stanford, and degrees in Statistics (MS, Stanford), Education (DipEd, Monash University, Australia), and Social Psychology (BA, UCLA). Before entering industry, Dr. Bricken was an Assistant Professor of Education at the University of Hawaii and at Monash University, specializing in General Methods of Teaching…
On Jul 25, 2009, at 2:18 AM, omadeon2 wrote:
Now, since this is a closed group (visible only to members), I must ask for your permission to copy this comment of yours to my public blog
https://lawsofform.wordpress.com …as an answer to similar objections, raised there. (I believe by a person who has also asked similar questions here).
Thanks in advance (if permission is granted)
Sure, any postings I personally submit to this list can be taken as being in the public domain.
Um, so in response to requests to “show me the difference” between LoF and Boolean algebra, here’s a demonstration of difference by counting to two:
Spatial notations address different representational concepts than do string notations, just as a picture of a house conveys different experiential concepts than the word “house”. Spatial notation is iconic, to some extent it represents what it means, whereas string notation strictly separates syntax and semantics.
The “isomorphic with Boolean algebra” interpretation of LoF imposes a particular set of representational concepts on LoF that are foreign to LoF, concepts such as “commutative”, “dual”, “arity”, and “object”. LoF illustrates a single concept, “containment”, with a single token (the same concept takes two tokens in string notation). LoF has a singular basis; the minimal basis for Boolean algebra is two.
Since iconic languages are based on different concepts, they suggest a different way of seeing. Since they incorporate meaning, a great way to understand how they work is to spend time using these languages to solve difficult problems. (see footnote)
And since both LoF and Boolean algebra can be interpreted as “ “, there’s a (somewhat conventional) dilemma. LoF shows that there are different *formal* ways to organize our thoughts (and perceptions, emotions, etc.) to achieve the same objectives. Although the output of a silicon circuit and a “LoF machine” would be identical, the internal architecture of each is substantively different. Output equivalence is not isomorphism; for that you also need to show some sort of process equivalence.
Nonetheless, LoF shows that we can achieve rational thought by visualizing bounded spaces and by deleting irrelevancies, as well as by the traditional means of rearranging textual symbols and “reading” strings to bring them back to life.
Footnote: my CS professor told us that to begin to understand Boolean computation, we should prove (ie algebraically, not by exhaustive search) the distributivity of if-then-else (a three-valued Boolean function commonly used as a branching structure in software languages): Show
(IF (IF a THEN b ELSE c) THEN d ELSE e) =
(IF a THEN (IF b THEN d ELSE e) ELSE (IF c THEN d ELSE e))
This can be done by hand, but it gets messy…
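For readers who want to see the identity confirmed mechanically first, here is a small Python sketch (my addition, not part of the professor’s exercise, which asked for an algebraic proof) that checks it by exhaustive enumeration:

```python
from itertools import product

def ite(p, q, r):
    """if-then-else as a Boolean function: q when p holds, otherwise r."""
    return q if p else r

# Check the distributivity identity over all 2^5 = 32 truth assignments.
for a, b, c, d, e in product([False, True], repeat=5):
    lhs = ite(ite(a, b, c), d, e)
    rhs = ite(a, ite(b, d, e), ite(c, d, e))
    assert lhs == rhs

print("identity holds in all 32 cases")
```

This is only a sanity check; the algebraic proof (say, by expanding IF p THEN q ELSE r as (p AND q) OR (NOT p AND r) and rearranging) is the instructive part.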
For folks with access to languages and implementations, it would be quite possible, should there be an interest, to build a benchmark list of twenty or so logic problems that are good for comparing algorithms (um, not to “race” algorithms by comparing speed of computation, because that is very hardware dependent). When theorem provers were just getting refined, in the early 80s, researchers found that their implementations worked great for some problems, and poorly if at all for others. It turns out that implementations exhibit a feature that mathematical theories do not: they are sensitive to internal process and representation.
Morphism arguments give no consideration to the efficiency of achieving proof of an arbitrary assertion. Algorithm complexity theory works toward putting bounds on the computational cost of worst case and average case theorems. And benchmark comparison is intended to assess the utility of an implementation.
NOTE (by Omadeon): The original text by Dr. Bricken that inspired my request to re-post, is this:
On Aug 6, 2008, at 3:05 PM, Alex Harvey wrote:
Does anyone have access to Flaws of Form, or can anyone provide a summary (and rebuttal?) of their criticisms?
Ahh, one of my favorite subjects. It is from Cull&Frank that we get
(1) “At best, Brown has produced a new axiomatization for Boolean algebra.”
In their article, Cull&Frank demonstrate that they have almost no understanding of LoF.
If I had to identify the crux of their misunderstanding, it would be
(2) “By allowing “1” to replace “mark” [they write SB‘s form of the mark here], “0” to replace the blank, “V” for concatenation, and “XOR“ [they write the circled-plus token here], the axioms appear as 1 V 1 = 1 [and] 1 XOR 1 = 0″
This is analogous to saying “let’s replace plus by minus, and then addition will be subtraction”.
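A throwaway illustration of the analogy (my own sketch, not from C&F or Bricken): LoF’s axiom of calling says a mark beside a mark condenses to a single mark, i.e. “1 op 1 = 1” under C&F’s substitution. OR satisfies this; XOR does not, so reading concatenation as XOR changes the algebra, just as swapping plus for minus changes arithmetic:

```python
# LoF "calling": mark mark = mark, so under C&F's substitution 1 op 1 must be 1.
or_result = 1 | 1   # OR:  1 V 1 = 1, consistent with calling
xor_result = 1 ^ 1  # XOR: 1 (+) 1 = 0, contradicts calling

assert or_result == 1 and xor_result == 0
print("OR respects calling; XOR does not")
```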
A couple more quotes that indicate their, um, arrogance:
(3) “Brown violates the most basic truth of information theory: at least two symbols are required to convey any information.”
(4) “Brown merely replaces the ordinary ideographic notation of mathematics with a positional or analytic notation.”
(5) “… the shape, nature, etc. of the signs one chooses to convey information with are irrelevant to the mathematical content of what is conveyed.”
- Cull&Frank are basically advocating a very common attitude: if it is different from what I know, then it is wrong.
An alternative perspective is that Brown (well, actually C.S. Peirce did this at the turn of the 20th century) provides an example of a system that falsifies assertions (1),(3),(4), and (5) above.
LoF *is* ideographic and conventional mathematical notation is not. Conventional notation *is* positional and LoF is not, so statement (2) is, at best, confused.
Consider the various logical interpretations of the mark, i.e:
True, not False, False implies False, True or False
This is sufficient to demonstrate that LoF is not Boolean Algebra (although it can be interpreted as Boolean Algebra). There is a many-to-one map from Boolean Algebra to LoF, so LoF is not isomorphic to Boolean Algebra. By replacing “blank” by “0”, the many-to-one map is degraded into an isomorphism, permitting these blind men to claim that because they are holding onto the elephant’s leg, all elephants are trees.
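The many-to-one map can be seen in a few lines of Python (a sketch of mine, assuming the standard truth-functional readings): the four Boolean expressions listed above are distinct as expressions, yet each evaluates to the same value, the one the mark denotes:

```python
T, F = True, False

# Four distinct Boolean readings of the single LoF mark.
readings = {
    "True":                T,
    "not False":           not F,
    "False implies False": (not F) or F,  # material implication F -> F
    "True or False":       T or F,
}

# All four collapse to one value: many Boolean expressions, one mark.
assert all(readings.values())
print("all four readings evaluate to True")
```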
Since the deeper meaning of the absence of a mark is that emptiness permeates the representational substrate (a rather obvious observation), to replace emptiness by “0” would require arbitrarily many “0”s. That is, C&F are viewing the representational space as positional and structured, permitting, for instance, only one correct place to put the “0”. This is also apparent in the idea of writing “V” for concatenation.
Since LoF is a *spatial* notation, concatenation is not well defined. It is better called “sharing a space”, invoking none of the burdensome structure of a linear, typographical notation. And with sharing space (think of a bunch of people in a room), there is no “V” that connects objects within the space into pairings.
Yes, SB violates C&F’s understanding of “information theory”, because C&F forget that the Shannon/Weaver brand of information theory (which I presume they had in mind) addresses sequential streams of binary variations, not spatial arrays. I’m sorry, but since LoF is prima facie evidence of a system that uses only one token (in space) to communicate information, it seems rather bizarre that C&F would imply that they are not even able to look at LoF forms.
Aside: here’s another grand example of this type of blindness:
Additive systems for whole numbers have been in use for several thousands of years. You “add” by shoving piles of objects together; it is a visceral and visual spatial experience. In symbolic arithmetic, you add by memorizing token combination rules (the addition table). The common rules of algebra deny that additive systems exist.
Let’s see, for completion: interpreting “spatial containment” as XOR is deeply wrong. In particular, the essential property of LoF forms that permits their interpretation as Boolean Algebra is *pervasion*: any form on the outside has complete access to the inside, at any depth (SB is not very clear about this, but Peirce is). The mark does not exclude crossing from the outside inwards; this asymmetry frees LoF of the dualism that permeates Boolean structures. So freed, it is then simple to build “logic” without any concept of False. Or, to be a bit more accurate, Truth is confounded with Existence.
There are many other aspects of self-contradiction in C&F’s article, indicating that the abusive tone is just that: not based on study or reflection, but a tirade against the unknown.