28 September 2006

It's turtles all the way down

I haven't yet grasped the relationship between proof and truth. And the ground is sufficiently tricky that I have a hard time trusting anything I read. I got something from the local public library called Fundamentals of Mathematics, which has an introductory chapter on model theory—but they go around proving things about models and logic using ordinary mathematical reasoning. This is like writing a Python compiler in Python: one feels that some bootstrapping must be involved somewhere. It's probably in a later chapter, but in the meantime my brain involuntarily rejects everything the book claims. Unproductive.

Metamath responds

Norm Megill, the man behind Metamath, responded to my recent post about Metamath and hygienic macros:

You are correct that substitution in Metamath is the simple replacement of a variable with a (well-formed) expression and thus isn't hygienic, in Scheme's terminology. In logic, the term used for “hygienic substitution” is “proper substitution,” which Metamath does not implement directly.

Instead, Metamath natively implements a simpler concept of “disjoint variables” as an optional proviso on an axiom (and inherited by theorems), that will forbid certain substitutions. In more standard terminology, this means, for example, that (if the axiom says so) a certain bound variable may not occur in a wff. In this case, the wff may have to be transformed, in the proof, to an equivalent wff with the bound variable renamed before we are allowed to substitute it for the wff variable of the axiom. In effect, the work of renaming bound variables, in order to make a sound (hygienic) substitution, is done by the proof itself rather than by the proof verifier. This simplifies the underlying primitive concepts needed by the Metamath language itself, with a consequence that the proof verifier can be very simple.

How simple is Metamath's verifier? About 300 lines of Python does the trick.

Simplicity is not the motivation in Lisp; in fact Lisp is generally more complicated than Scheme. Rather, Lisp programs actually take advantage of symbols getting crossed. (If you know Lisp macros, you know what I mean, and if you don't, I can't easily explain it.) I know Metamath does something similar in at least one place: its definition of proper substitution.

([y / x]φ ↔ ((x = y → φ) ∧ ∃x(x = y ∧ φ)))

I wonder if similar tricks are used elsewhere.

27 September 2006

Amazingly bad

It reads like a deliberate insult:

To distribute a .NET Framework application both to 32- and 64-bit platforms, build two MSI packages, one targeted at a 32-bit and the other a 64-bit computer.

I have to say, Microsoft's support for 64-bit architectures is a total nightmare. MSI can't compete with that, but it's pretty darned awful.

.NET supposedly supports XCOPY deployment. Even MSI can make a directory and put files in it, and as far as I know the MSI database contains nothing platform-specific. What's the problem here?

I know the answer. The problem is that XCOPY deployment doesn't really work. Not for everybody, certainly not for apps with legacy dependencies. What's a tech writer supposed to do? Document how things would work if the world were ideal, because they might work that way for your app? Or give everyone the ugly workaround that most real-world apps need to apply?

Blaaargh.

26 September 2006

A bouquet of Penny Arcade

It's a comic strip about video games. Be forewarned: most of the joke with Penny Arcade is that the characters are phenomenally foul-mouthed.

I picked these out for oliviao, even though it seems likely there's not a single Penny Arcade strip she would actually like...

“This is that Star Ocean?”

“I'm watching my sim watch TV.”

“I thought you said we were going to level up together!”

“Ann, as your uncle, I respect your youthful vigor.”

“Dude. I just got back from the Serenity preview showing!”

“You told me this book a) wasn't fantasy, and b) contained no dragons.”

25 September 2006

Metamath isn't hygienic

Every axiom, definition, and theorem in Metamath is actually a template: whatever you plug in for the free variables, the result is a statement the system considers true. So, for example:

Q → (P → Q)

allows you to write sentences like

(A = 0) → ((∀x ∈ ℝ (0 < x → A < x)) → (A = 0))

This is kind of like a Scheme macro, but a really surprising thing about Metamath is that the sort of substitution it employs is not hygienic. If you happen to be using a local variable x, and you plug in a formula that has x free, the formula will “capture” the x in its new context.

In Scheme, if you plug a bit of code into another bit of code this way, and both bits of code are using a variable named x, Scheme behaves as though it's silently renaming one of the variables so the two don't conflict... unless the two x's actually refer to the same global variable or something—you can see it gets rather complicated. Lisp doesn't bother. It just plugs in the code. If variables get crossed, they get crossed.
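A toy model of the unhygienic approach, in Python. The formula encoding and the `substitute` helper here are my own illustration, not Metamath's actual machinery:

```python
# Formulas as nested tuples; substitution is blind replacement,
# with no regard for binders -- the Metamath/Lisp way.

def substitute(template, var, expr):
    """Replace every occurrence of `var` in `template` with `expr`."""
    if template == var:
        return expr
    if isinstance(template, tuple):
        return tuple(substitute(part, var, expr) for part in template)
    return template

# Template: forall x. (P -> x > 0), where P is a formula variable.
template = ('forall', 'x', ('->', 'P', ('>', 'x', '0')))

# Plug in a formula whose free variable happens to be named x:
result = substitute(template, 'P', ('=', 'x', '0'))
# Now the x in (x = 0) has been captured by the enclosing forall.
```

A hygienic expander would rename the bound x before substituting; here the plugged-in formula's x silently changes meaning.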

Apparently doing it the Lisp way makes the Metamath proof-checker a lot simpler. Interesting.

15 September 2006

A strange postgrad class

This is the kind of course I'd like to take: PHYS771: Quantum Computing Since Democritus. There's a tolerably cool math puzzle at the end of the first lecture.

08 September 2006

No, not SOAP. We're talking actual soap here.

Scott Aaronson has a cool article (PDF) that looks at half a dozen proposed ways of solving really difficult computational problems—such as using soap bubbles or black holes, or time-traveling, or playing Russian roulette.

The paper is playful but serious—a serious survey of the far-out. A lot of it is mathematical gibberish to me, but it's amusing all the way through, and there's a fun “think big—no bigger than that” section at the end.

I have one super-geeky nitpick, though:

Even many computer scientists do not seem to appreciate how different the world would be if we could solve NP-complete problems efficiently. I have heard it said, with a straight face, that a proof of P = NP would be important because it would let airlines schedule their flights better, or shipping companies pack more boxes in their trucks! One person who did understand was Gödel. In his celebrated 1956 letter to von Neumann (see [69]), in which he first raised the P versus NP question, Gödel says that a linear or quadratic-time procedure for what we now call NP-complete problems would have “consequences of the greatest magnitude.” For such a procedure “would clearly indicate that, despite the unsolvability of the Entscheidungsproblem, the mental effort of the mathematician in the case of yes-or-no questions could be completely replaced by machines.”

But it would indicate even more.... [Ed.: color added]

Here Scott shifts carelessly between two completely different hypotheticals (blue and orange). The statement he lambasts, that P=NP would be important for logistics, is actually an overstatement. A proof that P=NP (blue) almost certainly would have no practical consequences at all, unless I'm wildly mistaken, because a great many problems in P are not practically solvable. Who cares if every problem that can be solved in O(2^n) time can also be solved in O(n^1000000) time? In practice the latter is probably worse.
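Rough numbers back this up (my own illustration; n = 100 is an arbitrary stand-in for a "practical" input size):

```python
from math import log2

# Compare the two bounds by the bit-length of their values:
# log2(2^n) = n, while log2(n^1000000) = 1000000 * log2(n).
n = 100
exp_bits = n                      # 2^100 is about a 100-bit number
poly_bits = 1_000_000 * log2(n)   # n^1000000 is millions of bits long
assert poly_bits > exp_bits       # the "polynomial" bound is far larger
```

The n^1000000 bound only drops below 2^n once n grows past roughly twenty million, far beyond any input size where a constant like that matters.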

By contrast, a discovery that NP-complete problems can be solved in quadratic time (orange) would be a mind-blowing, probably world-changing event. But this is unlikely in the extreme.

06 September 2006

Gradual typing

Seen on Lambda the Ultimate: Gradual Typing for Functional Languages, a paper that introduces a new flavor of lambda calculus which they call (this is an approximation) λ?.

Hard to explain why language geeks should be excited about this. I am.

We present a formal type system that supports gradual typing for functional languages, providing the flexibility of dynamically typed languages when type annotations are omitted by the programmer and providing the benefits of static checking when function parameters are annotated.

The proofs in the paper are machine-verified!

Relationships

I have a big file full of thoughts about programming languages. Re-reading it today I found this (edited):

Component-container relationships. I suspect these are all over the place. This is always a hassle. I wish someone would make it easy.

One idea is to put the component class lexically inside the container class:

        class Container {
            class Component {
                // each Component is part of a Container
            }
        }

Hmmm... I need to think about this some more.

And I realized: I now know the solution to this problem. In fact I've implemented it. Relationships.

I mean relationships in the sense of entity-relationship diagrams. Relationships describe how entities relate to each other, like the bolded words below:

His new paper cites 37 sources. (This could perhaps be written in code like this: paper1.cites(paper2))

Tom Hanks starred in the movie Big. (actor.starredIn(movie))

The patient Joe Bloggs is treated in the cardiology ward. (patient.isTreatedIn(ward))

Of course parent-child relationships are one case of this, but there are many others.

Okay. Now consider some hypothetical Python code:

    class Component(object):
        ...

    class Container(Component):
        ...

    # Define the relationship between these two classes...
    containment = OneToManyRel(
        Container, 'container', 'contains', Component, 'contents')

Here's what the OneToManyRel does:

  • Adds a container property to class Component. Each Component's container is always either a Container object or None. It's initially None, but if you like, you can just initialize it in Component.__init__ like any other attribute.

  • Adds the reverse property, contents, to class Container. This property is an object that looks like a set. It's initially empty. The two properties are automatically kept in sync, so if you do thing.container = None, the thing is automatically removed from its container's contents set.

  • Adds a method Container.contains(Component) that can be used to query the relationship.

  • Keeps a data structure of all the relationships, so you can do containment.items(), which returns a list of (Container, Component) pairs.
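The behavior above can be sketched in a few dozen lines of Python using properties. This is my own guess at how such a thing might look, not the post's 355-line implementation; the argument order is inferred from the containment example, and the contents view is recomputed on each access rather than kept as a live set:

```python
class OneToManyRel(object):
    """One 'one' object relates to many 'many' objects (a sketch)."""

    def __init__(self, one_cls, one_name, query_name, many_cls, many_name):
        rel = self
        self._pairs = set()  # (one, many) pairs

        def get_one(many):
            for one, m in rel._pairs:
                if m is many:
                    return one
            return None

        def set_one(many, new_one):
            old = get_one(many)
            if old is not None:
                rel._pairs.discard((old, many))
            if new_one is not None:
                rel._pairs.add((new_one, many))

        # e.g. component.container: readable, writable, initially None
        setattr(many_cls, one_name, property(get_one, set_one))

        # e.g. container.contents: the set of this object's components
        setattr(one_cls, many_name,
                property(lambda one: {m for o, m in rel._pairs if o is one}))

        # e.g. container.contains(component): a membership query
        setattr(one_cls, query_name,
                lambda one, many: (one, many) in rel._pairs)

    def items(self):
        return list(self._pairs)


class Component(object): pass
class Container(object): pass

containment = OneToManyRel(
    Container, 'container', 'contains', Component, 'contents')

box, thing = Container(), Component()
thing.container = box  # the reverse view updates automatically
assert box.contains(thing) and box.contents == {thing}
thing.container = None
assert box.contents == set()
```

Because the properties are installed on the classes, every assignment to `thing.container` goes through the relationship's bookkeeping, which is what keeps the two sides in sync.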

Here's a taste of how powerful this can be:

    class BuildStep(object):
        ...

    class File(object):
        ...

    # Each BuildStep produces some Files called its targets.
    # Each File is produced by one BuildStep called its buildStep.
    produces = OneToManyRel(BuildStep, 'buildStep', 'produces', File, 'targets')

    # Each BuildStep depends on some Files called its sources.
    # Each File is depended upon by some BuildSteps called its dependants.
    dependsOn = ManyToManyRel(BuildStep, 'dependants', 'dependsOn', File, 'sources')

That's very brief, compared to what you'd have to type to get this the traditional way.

So I have an implementation. In addition to OneToManyRel and ManyToManyRel, it has things called SymmetricRel and EquivalenceRel. (Warts: My current implementation doesn't use weakrefs, so it keeps objects alive until explicitly removed, which is kind of silly. Also, the properties currently look like iterators, not sets.) The whole thing is 355 lines of Python code.
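For what it's worth, the weakref wart looks fixable: store the pairs through weak references, so that garbage-collected objects drop out of the relation on their own. A sketch (my own, not part of the implementation described above):

```python
import weakref

class WeakPairs(object):
    """Relationship storage that doesn't keep its members alive."""

    def __init__(self):
        self._pairs = set()  # pairs of weakrefs

    def add(self, a, b):
        # The callback fires when either object is collected.
        self._pairs.add((weakref.ref(a, self._cleanup),
                         weakref.ref(b, self._cleanup)))

    def _cleanup(self, dead):
        self._pairs = {(ra, rb) for ra, rb in self._pairs
                       if ra is not dead and rb is not dead}

    def items(self):
        return [(ra(), rb()) for ra, rb in self._pairs]
```

In CPython, dropping the last reference to either object is enough to trigger the cleanup, so the relation never hands back a dead pair.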

All this is inspired by Inform 7 relations. I wonder about Ruby. I can only assume Ruby on Rails automatically does stuff like this for databases; I don't know about regular old in-memory objects though. I imagine this would be a breeze to implement in Ruby.

Anyway. I don't know how generally useful it will be, but it's certainly encouraging to come across an old gripe about the exact problem this solves.