How to model an Alloy type

Alloy has a very interesting type system. Each value is actually a tuple set. Scalars are just a tuple set with an arity of 1 and size of 1.

In the parser, each expression has a type. This type defines the possible signatures that can be used for each column. For a scalar number the type is [[Int]]; for iden the type is [[univ],[univ]]. If you have the following declaration:

pred foo[ n : some Foo+Bar ] {}

The type for n will be [[Foo,Bar]]. That is, the structure is:

   Type ::= ColumnType[arity]
   ColumnType ::= some Sig

This modeling leaves out the multiplicity (the some in the foo[ n : some Foo+Bar ] example); that multiplicity is part of the declaration. However, it is also possible to specify multiplicities inside a relation declaration:

pred foo[ n : Foo one-> lone Bar ] {}

This information is stored in the expression, not in the type.

On the Java side I’ve got a lot of freedom, and I want to use this freedom to create a very good API. I have a TType interface to represent the Alloy type.

public interface TType {
    TColumn[] getColumns();   // one column per position in the tuples
    int arity();              // number of columns
}

public interface TColumn {
    TSignature[] sets();      // the signatures that may appear in this column
}
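
To make this concrete, here is a minimal sketch of how the type [[Foo,Bar]] from the foo[ n : some Foo+Bar ] example could be represented with these interfaces. The SimpleType and SimpleColumn classes (and the fooSig/barSig values in the usage comment) are only illustrations I made up, not part of the actual API:

// Hypothetical sketch, not the real API: a straightforward
// array-backed implementation of TType and TColumn.
class SimpleColumn implements TColumn {
    private final TSignature[] sets;
    SimpleColumn(TSignature... sets) { this.sets = sets; }
    @Override public TSignature[] sets() { return sets; }
}

class SimpleType implements TType {
    private final TColumn[] columns;
    SimpleType(TColumn... columns) { this.columns = columns; }
    @Override public TColumn[] getColumns() { return columns; }
    @Override public int arity() { return columns.length; }  // one column per tuple position
}

// Usage: n : some Foo+Bar has arity 1 and a single column
// whose possible signatures are Foo and Bar.
// TType typeOfN = new SimpleType(new SimpleColumn(fooSig, barSig));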

I am thinking of adding a multiplicity to TColumn:

enum Multiplicity {
    lone,
    one,
    some,
    set
}

public interface TColumn {
    TSignature[] sets();
    Multiplicity multiplicity();
}
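
As a hedged illustration (the MultColumn name is my own invention, and mapping the arrow multiplicities to columns this way is only one plausible reading), the declaration n : Foo one -> lone Bar would then become a type with two columns, each carrying its own multiplicity:

// Hypothetical sketch: one multiplicity per column, reusing the
// SimpleType sketch from above. Not part of the proposed API.
class MultColumn implements TColumn {
    private final Multiplicity multiplicity;
    private final TSignature[] sets;

    MultColumn(Multiplicity multiplicity, TSignature... sets) {
        this.multiplicity = multiplicity;
        this.sets = sets;
    }

    @Override public TSignature[] sets() { return sets; }
    @Override public Multiplicity multiplicity() { return multiplicity; }
}

// Usage: n : Foo one -> lone Bar could be read as [[Foo one],[Bar lone]],
// so the arrow multiplicities end up in the type rather than in the expression.
// TType typeOfN = new SimpleType(
//         new MultColumn(Multiplicity.one, fooSig),
//         new MultColumn(Multiplicity.lone, barSig));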

The advantage is that the multiplicity is now integrated into the Alloy type system in a uniform way. Currently it is stored partly in the declaration, partly in the expression, and partly in the type.

This does not affect the inner workings of Alloy in any way; it is only a layer on top that will be used to write Java applications. That said, it might influence how Alloy evolves over the long run.

I highly appreciate any feedback.

I’m not an expert on the type system of Alloy, but your post reminded me of the fact that, AFAIK, there are discrepancies between the type system as it is described in [Edwards, Jackson, and Torlak. A type system for object models. 2004] and as it is implemented in the tool (the former is richer). I suppose it would be nice to base your work, if possible, on the article so that the outcome is ready for a possible re-implementation of the type system. In particular, a type is a union of products of basic types and, if I’m not mistaken, an expression may have various arities (in the case of overloading). Also, the authors seem to argue against the consideration of multiplicities in types: “Type distinctions based on multiplicity – between singletons and general sets, and between functions and relations – spoil the uniformity of the navigation syntax. (In OCL, for example, the target multiplicity of a relation determines whether an out-of-domain navigation results in undefined or an empty set.)”.

I found this copy, but it does not contain anything about multiplicity?

I’m not sure I can see the practical effect of this spoiling? Currently in Alloy the Type object seems ok, but we keep the cardinality constraints in different places? It feels like having the cardinality inside the ColumnType would make the model a lot more uniform?

For example:

pred foo[ n : Int ] {}
run { foo[ Int ]  || foo[none]}

If we did type checking on cardinality constraints, this could give two type errors? (And prevent some pain, especially for newcomers …)
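
To show what such a check could look like, here is a minimal sketch assuming both the declared parameter multiplicity and the multiplicity of the actual argument are available as Multiplicity values; the satisfies method and its subsumption rules are my own guess, not anything Alloy currently implements:

// Hypothetical sketch of a multiplicity compatibility check:
// may a value with multiplicity `actual` be passed where `declared` is required?
static boolean satisfies(Multiplicity actual, Multiplicity declared) {
    switch (declared) {
        case set:  return true;  // any set is acceptable
        case lone: return actual == Multiplicity.lone || actual == Multiplicity.one;
        case some: return actual == Multiplicity.some || actual == Multiplicity.one;
        case one:  return actual == Multiplicity.one;
        default:   return false;
    }
}

// For foo[ n : Int ], where n is implicitly declared one:
//   foo[ Int ]  -- Int is a general set      -> satisfies(set, one)  is false
//   foo[ none ] -- none is at most one atom  -> satisfies(lone, one) is false
// so both calls could be reported as type errors.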