To me, the problem
? t ==?== ?->?
where the first ? has an empty local context
is always Uncertain. It remains to be fully understood
why unification gives Failure; it may be correct if
t has type (Rel k). I wrap it in the refiner.
the refiner was not checking that the resulting type
of an application was the expected one in the case where
the application is ((lambda...) ?...) and ?...
resolves to the empty ? vector.
the TypeError exception is back in place inside the Type modules
basic_rg: some typing-related code is back in place inside the Type module
log: some improvements in the interface
The ~delta argument was accidentally saved in the closure of aux that was
passed to all the functions defined by the reduction strategy. So most of the
terms were shared by the KAM environment in whd normal form with delta=max_int
(that is, not delta/iota-reduced at all, just beta-normal).
Thus many heavy computations were not shared, completely losing the point
of having a KAM.
The solution was to fix the API so that delta is no longer hidden, and then
really share computations with delta=0 (the most common case, I think).
The bad behaviour is still present for other values of delta; an imperative
cache may be employed.
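The closure pitfall described above can be illustrated with a minimal OCaml sketch. All names here (make_whd, whd, the toy term type) are hypothetical, not the actual Matita API; the point is only the shape of the bug: when ~delta is captured once at closure-creation time, a cache living inside that closure can never share work across call sites built with different delta values, and a closure accidentally built with delta = max_int silently returns barely-reduced results everywhere.

```ocaml
(* Toy term type and a stand-in for an expensive normalization. *)
type term = Var of int | App of term * term

let rec cost = function
  | Var _ -> 1
  | App (a, b) -> 1 + cost a + cost b

(* Buggy shape: ~delta is captured in the closure, and so is the cache.
   Two closures built with different delta never share computations,
   and every call site silently uses whatever delta was baked in. *)
let make_whd ~delta =
  let cache : (term, int) Hashtbl.t = Hashtbl.create 17 in
  fun (t : term) ->
    match Hashtbl.find_opt cache t with
    | Some r -> r
    | None ->
        let r = if delta = 0 then cost t else cost t + delta in
        Hashtbl.add cache t r;
        r

(* Fixed shape: delta is an explicit argument of every call and part of
   the cache key, so all calls with delta = 0 really share their work. *)
let cache : (int * term, int) Hashtbl.t = Hashtbl.create 17

let whd ~delta (t : term) =
  match Hashtbl.find_opt cache (delta, t) with
  | Some r -> r
  | None ->
      let r = if delta = 0 then cost t else cost t + delta in
      Hashtbl.add cache (delta, t) r;
      r
```

The fix in the log follows the second shape: exposing delta in the API makes it a cache key rather than hidden closure state, which is exactly what restores sharing for delta=0.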
basic_rg: bugfix in AST to allow attributes in global entries
xml: the exported XML data now comply with a DTD
icons: crux icon for the browser's address line
Makefile: "lint" entry added for XML validation
- Bug fixed in definition of big_op.
- ltb now goes to booleans
- the meaning of big_x n is to sum from 0 to (pred n);
in particular big_x O is the neutral element.
Thus the isomorphism is broken.
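The big_x convention above can be sketched in OCaml. The names (big, the integer instantiation) are hypothetical, not the development's actual definitions; the sketch only shows the stated convention that iterating from 0 to (pred n) makes the n = 0 case the neutral element.

```ocaml
(* Hypothetical model of the big_x convention: big neutral op n f folds
   op over f 0 .. f (pred n), so big neutral op 0 f is just neutral. *)
let big (neutral : int) (op : int -> int -> int) (n : int) (f : int -> int) =
  let rec aux i acc = if i >= n then acc else aux (i + 1) (op acc (f i)) in
  aux 0 neutral

(* With op = (+) and neutral = 0: big 0 (+) 3 f computes f 0 + f 1 + f 2,
   and big 0 (+) 0 f is the neutral element 0. *)
```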
Ferruccio Guidi [Thu, 20 Aug 2009 09:57:51 +0000 (09:57 +0000)]
- sort inclusion must be restricted to the term backbone in order to avoid
validity of non-normalizing terms (see the omega.aut example)
- command-line options -a and -p rearranged
Ferruccio Guidi [Sun, 16 Aug 2009 21:13:06 +0000 (21:13 +0000)]
- performance data added for reference
- interface of the Hierarchy module improved
- library for managing abstract layer representation files added
- toplevel improved: the analysis of the automath source is now optional
Since the introduction of saturation, an assert false is now possible
(i.e. a locked meta vs a non-flexible term, since some saturations have not
been fully performed yet)
Setoids, setoids1, sets, and the like. The mess begins.
Note: the (partially) interesting part of the development is that canonical
structures (provided by unification hints) allow performing setoid rewriting
by hieroglyphs without having to use "rich operators". On the other hand, after
the application what you get is always an enriched structure, and thus either
it is normalized away or the theory becomes a mix of rich/unrich. The latter
phenomenon happens anyway because, in algebraic structures, you need to put
things into rich structures to have all the properties you need.
algebra/magmas does not work because the refinement of projections is too weak