Wikipedia’s new references to Multiple Form Logic, etc.

•February 14, 2009 • 7 Comments
Inclusion-exclusion illustrated for three sets
Image via Wikipedia

Multiple Form Logic is my own extension of George Spencer-Brown‘s Laws of Form. Wikipedia’s recent references to it have been good news. They include two links to my older site, which is still valid, although there is now a preferred mirror-site, hosted in my personal domain.

These Wikipedia references were apparently written by some (unknown) people (probably researchers from the official Laws of Form Forum). Here they are:


The Multiple Form Logic, by G.A. Stathis, “generalises [the primary algebra] into Multiple Truth Values” so as to be “more consistent with Experience.” Multiple Form Logic, which is not a boundary formalism, employs two primitive binary operations: concatenation, read as Boolean OR, and infix “#”, read as XOR. The primitive values are 0 and 1, and the corresponding arithmetic is 11=1 and 1#1=0. The axioms are 1A=1, A#X#X = A, and A(X#(AB)) = A(X#B).
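Under the quoted reading (concatenation as Boolean OR, infix “#” as XOR), the three axioms can be checked mechanically over {0, 1}. This is a quick sketch of mine, not part of the Wikipedia text:

```python
from itertools import product

def cat(a, b):   # concatenation, read as Boolean OR
    return a | b

def xor(a, b):   # infix "#", read as XOR
    return a ^ b

bits = (0, 1)
# Axiom 1: 1A = 1
assert all(cat(1, a) == 1 for a in bits)
# Axiom 2: A # X # X = A
assert all(xor(xor(a, x), x) == a for a, x in product(bits, repeat=2))
# Axiom 3: A(X # (AB)) = A(X # B)
assert all(cat(a, xor(x, cat(a, b))) == cat(a, xor(x, b))
           for a, x, b in product(bits, repeat=3))
```

The stated arithmetic also falls out directly: `cat(1, 1)` is 1 and `xor(1, 1)` is 0.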


Τhe Multiple Form Logic, by G.A. Stathis, owes much to the primary algebra.

BTW, in case you got the… wrong idea(!) I did not write these references! Oh no! 🙂 Besides, although I feel grateful towards these… nice people, I do not agree with… everything they wrote! 🙂 E.g. as regards their remark that Multiple Form Logic “is not a boundary formalism”… I believe that, on the contrary, Multiple Form Logic IS a “boundary formalism”; a very fundamental and radical one, in fact! My view is also that Multiple Form Logic changes the way we think of boundaries as such, enhancing the ontological nature or (if you prefer) the existential fabric of (our Reality consisting of) boundaries, in (at least) two ways: Continue reading ‘Wikipedia’s new references to Multiple Form Logic, etc.’

A Lacanian view of Multiple Form Logic

•January 22, 2010 • 4 Comments
  1. Lacan’s Real, imaginary and symbolic in Multiple Form Logic

  2. A Lacanian interpretation of Logic implication in M.F. Logic

  3. A Lacanian interpretation of Multiple Form Logic’s Axiom 3

by OMADEON ©2010

Jacques Lacan

1. Lacan’s Real, imaginary and Symbolic in Multiple Form Logic

Multiple Form Logic can be interpreted in terms of Jacques Lacan’s three realms of the “Real“, the “Imaginary” and the “Symbolic“. I.e. Reality is an External Space, located outside the Imaginary, as well as outside the Symbolic. The imaginary contains the symbolic, endlessly (re-)creating (inside its own space) Symbolic representations of the Real.

2. A Lacanian interpretation of Logic implication in M.F. Logic

In Multiple Form Logic, logic implication is a configuration of Forms, where a (Symbolic) signifier is assigned (by the imagination) to a signified (Real) object. The Symbolic signifier is inside the boundary (of the Lacanian “imaginary”) and the Real (signified) is “out there”. The “implication operator” I is the boundary of perception, i.e. the imagination itself:

The boundary of perception (or the Lacanian “imaginary”) I, contains the “premiss” P, as a Symbolic signifier of the Real “conclusion” R (object signified), which is “out there”.

  • So, Logic implication is (nothing but) the Act of perception, itself!
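A minimal sketch of this reading, assuming the standard Laws-of-Form rendering of implication as “(P) R” — the premiss P crossed (inside the boundary), sharing a space with R outside it:

```python
def cross(x):
    """The boundary / mark: logical negation under the Boolean reading."""
    return not x

def implies(p, r):
    """(P) R -- the signifier P inside the boundary, R outside: (not P) or R."""
    return cross(p) or r

assert implies(False, False) is True
assert implies(False, True) is True
assert implies(True, True) is True
assert implies(True, False) is False  # the only failing case
```

The truth table is the usual one for material implication; the boundary reading merely relocates it into spatial form.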

Continue reading ‘A Lacanian view of Multiple Form Logic’

Dr. William Bricken: Laws of Form and Boolean Algebra

•July 30, 2009 • 6 Comments
Boolean logic
Image via Wikipedia

This post contains some cool philosophical and scientific (Logic) material, EXCLUSIVELY posted here in public, after its recent first appearance in the “Laws of Form Forum” Yahoo group (of which I have been a member since 2003).

Dr. William Bricken has also given me his permission to re-post what follows.

Who is Dr. Bricken?

Dr. William Bricken is currently a Research Professor of Education at the University of Washington and a Consultant for Interval Research Corporation, where he is working on unifying hardware and software approaches to computation. Dr. Bricken’s prior positions include Principal Scientist of UW’s Human Interface Technology Laboratory, where he designed and implemented the Virtual Environment Operating System and interactive tools of the VR environment; Director of Autodesk Research Lab, which developed the Cyberspace CAD prototype of virtual reality; and Principal Research Scientist at Advanced Decision Systems, where he pioneered high-performance inference engines, visual programming systems, and instructable interfaces. Dr. Bricken holds a multidisciplinary Ph.D. in Research Methodology, Educational Psychology and Computer Science from Stanford, and degrees in Statistics (MS Stanford), Education (DipEd, Monash University, Australia), and Social Psychology (BA, UCLA). Before entering industry, Dr. Bricken was an Assistant Professor of Education at the University of Hawaii and at Monash University, specializing in General Methods of Teaching…


On Jul 25, 2009, at 2:18 AM, omadeon2 wrote:

Mr. Bricken,
Congrats for your eloquent and detailed response to Mr. Harvey, explaining the relationship between LoF and Boolean algebra.

Now, since this is a closed group (visible only to members) I must ask for your permission to copy this comment of yours in my public blog
…as an answer to similar objections raised there. (I believe by a person who has also asked similar questions here.)
Thanks in advance (if permission is granted)


William Bricken:

Sure, any postings I personally submit to this list can be taken as being in the public domain.

Um, so in response to requests to “show me the difference” between LoF and Boolean algebra, here’s a demonstration of difference by counting to two:

Spatial notations address different representational concepts than do string notations, just as a picture of a house conveys different experiential concepts than the word “house”. Spatial notation is iconic: to some extent it represents what it means, whereas string notation strictly separates syntax and semantics.

The “isomorphic with Boolean algebra” interpretation of LoF imposes a particular set of representational concepts on LoF that are foreign to it, concepts such as “commutative”, “dual”, “arity”, and “object”. LoF illustrates a single concept, “containment”, with a single token (the same concept takes two tokens in string notation). LoF has a singular basis; the minimal basis for Boolean algebra is two.

Since iconic languages are based on different concepts, they suggest a different way of seeing. Since they incorporate meaning, a great way to understand how they work is to spend time using these languages to solve difficult problems. (see footnote)

And since both LoF and Boolean algebra can be interpreted as “logic”, there’s a (somewhat conventional) dilemma. LoF shows that there are different *formal* ways to organize our thoughts (and perceptions, emotions, etc) to achieve the same objectives. Although the output of a silicon circuit and a “LoF machine” would be identical, the internal architecture of each is substantively different. Output equivalence is not isomorphism; for that you need also to show some sort of process equivalence.

Nonetheless, LoF shows that we can achieve rational thought by visualizing bounded spaces and by deleting irrelevancies, as well as by the traditional means of rearranging textual symbols and “reading” strings to bring them back to life.

-William Bricken

Footnote: my CS professor told us that to begin to understand Boolean computation, we should prove (i.e. algebraically, not by exhaustive search) the distributivity of if-then-else (a three-argument Boolean function commonly used as a branching structure in software languages): Show

(IF (IF a THEN b ELSE c) THEN d ELSE e) = (IF a THEN (IF b THEN d ELSE e) ELSE (IF c THEN d ELSE e))

This can be done by hand, but it gets messy.
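The footnote asks for an algebraic proof, but the identity can at least be sanity-checked exhaustively over all 32 truth assignments. A quick sketch of mine, not from the forum thread:

```python
from itertools import product

def ite(c, t, e):
    """The three-argument if-then-else Boolean function."""
    return t if c else e

# Distributivity of if-then-else, checked for every assignment:
# (IF (IF a THEN b ELSE c) THEN d ELSE e)
#   = (IF a THEN (IF b THEN d ELSE e) ELSE (IF c THEN d ELSE e))
for a, b, c, d, e in product((False, True), repeat=5):
    assert ite(ite(a, b, c), d, e) == ite(a, ite(b, d, e), ite(c, d, e))
```

Exhaustive checking is, of course, exactly what the professor said *not* to rely on for the proof; it only confirms there is no counterexample to find.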

For folks with access to languages and implementations, it would be quite possible, should there be an interest, to build a benchmark list of twenty or so logic problems that are good for comparing algorithms (um, not to “race” algorithms by comparing speed of computation, cause that is very hardware dependent). When theorem provers were just getting refined, in the early 80s, researchers found that their implementations worked great for some problems, and poorly if at all for others. Turns out that implementations exhibit a feature that mathematical theories do not: they are sensitive to internal process and representation.

Morphism arguments give no consideration to the efficiency of achieving proof of an arbitrary assertion. Algorithm complexity theory works toward putting bounds on the computational cost of worst case and average case theorems. And benchmark comparison is intended to assess the utility of an implementation.


NOTE (by Omadeon): The original text by Dr. Bricken that inspired my request to re-post, is this:

Re: [lawsofform] Criticisms of LoF, Flaws of Form, Cull & Frank, 1979 IJGS

On Aug 6, 2008, at 3:05 PM, Alex Harvey wrote:

Does anyone has access to Flaws of form, or can anyone provide a  summary (and rebuttal?) of their criticisms?

Hi Alex,

Ahh, one of my favorite subjects. It is from Cull&Frank that we get

(1) “At best, Brown has produced a new axiomatization for Boolean algebra.”

In their article, Cull&Frank demonstrate that they have almost no understanding of LoF.

If I had to identify the crux of their misunderstanding, it would be

(2) “By allowing “1” to replace “mark” [they write SB‘s form of the mark here], “0” to replace the blank, “V” for concatenation, and XOR [they write the circled-plus token here], the axioms appear as 1 V 1 = 1 [and] 1 XOR 1 = 0″

This is analogous to saying “let’s replace plus by minus, and then addition will be subtraction”.

A couple more quotes that indicate their, um, arrogance:

(3) “Brown violates the most basic truth of information theory: at least two symbols are required to convey any information.”

(4) “Brown merely replaces the ordinary ideographic notation of mathematics with a positional or analytic notation.”

(5) “… the shape, nature, etc. of the signs one chooses to convey information with are irrelevant to the mathematical content of what is conveyed.”

  • Cull&Frank are basically advocating a very common attitude: if it is different from what I know, then it is wrong.

An alternative perspective is that Brown (well, actually C.S. Peirce did this at the turn of the 20th century) provides an example of a system that falsifies assertions (1),(3),(4), and (5) above.

LoF *is* ideographic and conventional mathematical notation is not. Conventional notation *is* positional and LoF is not, so statement (2) is, at best, confused.

Consider the various logical interpretations of the mark, i.e.:

True, not False, False implies False, True or False

This is sufficient to demonstrate that LoF is not Boolean Algebra (although it can be interpreted as Boolean Algebra). There is a many-to-one map from Boolean Algebra to LoF, so LoF is not isomorphic to Boolean Algebra. By replacing “blank” by “0”, the many-to-one map is degraded into an isomorphism, permitting these blind men to claim that because they are holding onto the elephant’s leg, all elephants are trees.
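To make the many-to-one point concrete, here is a small sketch of mine (not Bricken’s) evaluating the four logical readings of the mark listed above; four distinct Boolean expressions all collapse onto the same single value, i.e. onto one LoF form:

```python
# Four distinct Boolean expressions, each a valid logical reading
# of the single LoF mark (implication read as: p implies q == (not p) or q).
readings = {
    "True": True,
    "not False": not False,
    "False implies False": (not False) or False,
    "True or False": True or False,
}

# Many Boolean expressions, one form: the map is many-to-one.
assert all(value is True for value in readings.values())
```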

Since the deeper meaning of the absence of a mark is that emptiness permeates the representational substrate (a rather obvious observation), to replace emptiness by “0” would require arbitrarily many “0”s. That is, C&F are viewing the representational space as positional and structured, permitting, for instance, only one correct place to put the “0”. This is also apparent in the idea of writing “V” for concatenation.

Since LoF is a *spatial* notation, concatenation is not well defined. It is better called “sharing a space”, invoking none of the burdensome structure of a linear, typographical notation. And with sharing space (think of a bunch of people in a room), there is no “V” that connects objects within the space into pairings.

Yes, SB violates C&F’s understanding of “information theory”, because C&F forget that the Shannon/Weaver brand of information theory (which I presume they had in mind) addresses sequential streams of binary variations, not spatial arrays. I’m sorry, but since LoF is prima facie evidence of a system that uses only one token (in space) to communicate information, it seems rather bizarre that C&F would imply that they are not even able to look at LoF forms.

Aside: here’s another grand example of this type of blindness:

Additive systems for whole numbers have been in use for several thousands of years. You “add” by shoving piles of objects together; it is a visceral and visual spatial experience. In symbolic arithmetic, you add by memorizing token combination rules (the addition table). The common rules of algebra deny that additive systems exist.

Let’s see, for completion: interpreting “spatial containment” as XOR is deeply wrong. In particular, the essential property of LoF forms that permits their interpretation as Boolean Algebra is *pervasion*: any form on the outside has complete access to the inside, at any depth (SB is not very clear about this, but Peirce is). The mark does not exclude crossing from the outside inwards; this asymmetry frees LoF of the dualism that permeates Boolean structures. So freed, it is then simple to build “logic” without any concept of False. Or, to be a bit more accurate, Truth is confounded with Existence.

There are many other aspects of self-contradiction in C&F’s article, indicating that the abusive tone is just that: not based on study or reflection, but a tirade against the unknown.

-William Bricken


“What is Quantum Psychology?” by Eddie Oshins

•February 7, 2009 • 1 Comment


Eddie Oshins (1945-2003):

Eddie Oshins ’66 died late last year (2003), after a forty-year struggle with mental illness so severe that it would have justified a life of treatment, hospitalization, and withdrawal in anyone less brilliant and less courageous. In fact, Eddie is one of Reed’s greatest scientific achievers. In the late seventies, Eddie cracked the mathematical/physics formulae smuggled out of a Soviet Gulag. Written on toilet paper in a virtually unreadable hand and notation, these were Yuri Orlov’s breakthrough insights on “grey logic”: of what happens between the “0” and the “1” in computer language: neither is not or is, but maybe. At the time of his death, Eddie was working on the “quantum physics” of schizophrenia: something that fits in well with his own, brilliant popularization of Orlov’s math as adding the Yin/Yang to cybernetics. Before Orlov, Eddie was one of the few to “stand up” to the “current wisdom” that computers could “think.” Eddie, I think now you can fly free from the fear that never paralyzed you for long and just laugh, as you almost always could.

– Thomas Forstenzer, “goodbye, Eddie Oshins” (2004)


Continue reading ‘“What is Quantum Psychology?” by Eddie Oshins’

Hello world! – new blog about “LAWS OF FORM”

•February 7, 2009 • 12 Comments

This is a blog about George Spencer-Brown’s ideas (expounded in “Laws of Form“) and also about his numerous philosophical disciples, a rather big crowd that includes extraordinary individuals as well as ordinary people (who have been influenced by George Spencer-Brown‘s ideas): e.g. Louis Kauffman, Richard Shoup, Art Collings, Dave Keenan, William Bricken, Tom McFarlane, Ben Goertzel, Eddie Oshins, Francisco Varela, Natalia Petrova, Jeff James, etc. and… myself – through Multiple Form Logic.

This blog’s header depicts the two fundamental axioms or “initials” of George Spencer-Brown‘s “Primary Arithmetic“: The image on the left is the “Law of Calling” and the one on the right is the “Law of Crossing“, i.e.

Law of Calling

Law of Crossing
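Writing the mark as a pair of parentheses, the two initials can be played with as string rewrites. A toy sketch of mine (not from GSB’s book), which reduces any expression built from marks down to either the mark or the void:

```python
def reduce_form(form):
    """Reduce a primary-arithmetic expression to '()' (the mark) or '' (the void)."""
    while True:
        step = form.replace("()()", "()")  # Law of Calling:  ()() = ()
        step = step.replace("(())", "")    # Law of Crossing: (()) =  [blank]
        if step == form:
            return form
        form = step

assert reduce_form("()()") == "()"      # calling: the call again is the call
assert reduce_form("(())") == ""        # crossing: to recross is not to cross
assert reduce_form("((()()))") == "()"  # nested forms reduce too
```

This naive reducer works here because both initials are purely local patterns; a full treatment of Laws of Form (variables, the algebra’s initials) needs more than substring replacement.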

There is a certain revival of George Spencer-Brown’s ideas, taking place nowadays, and this blog will -hopefully- contribute to creative public discussions about GSB’s ideas.

George Spencer-Brown

There is already a Yahoo group dedicated to “Laws of Form” (started many years ago by Mr. Richard Shoup) called the “Laws of Form forum”. However, all discussions in that Yahoo group are private, i.e. not visible to non-members (and to Search Engines).
