text \<open>
  An Isar proof body serves as mathematical notepad to compose logical
  content, consisting of types, terms, and facts.
\<close>
subsection \<open>Types and terms\<close>
notepad
begin
  txt \<open>Locally fixed entities:\<close>
  fix x  \<comment> \<open>local constant, without any type information yet\<close>
  fix x :: 'a  \<comment> \<open>variant with explicit type-constraint for subsequent use\<close>

  fix a b
  assume "a = b"  \<comment> \<open>type assignment at first occurrence in concrete term\<close>
  txt \<open>Definitions (non-polymorphic):\<close>
  define x :: 'a where "x = t"

  txt \<open>Abbreviations (polymorphic):\<close>
  let ?f = "\<lambda>x. x"
  term "?f ?f"

  txt \<open>Notation:\<close>
  write x (\<open>***\<close>)
end
subsection \<open>Facts\<close>
text\<open>
A fact is a simultaneous list of theorems. \<close>
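text \<open>
  For example, a single fact name may refer to several theorems at
  once (an ad-hoc sketch with hypothetical propositions \<open>A\<close> and \<open>B\<close>):
\<close>

notepad
begin
  assume ab: A B  \<comment> \<open>one fact consisting of two theorems\<close>
  thm ab  \<comment> \<open>refers to both theorems simultaneously\<close>
end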
subsubsection \<open>Producing facts\<close>
notepad
begin
  txt \<open>Via assumption (``lambda''):\<close>
  assume a: A

  txt \<open>Via proof (``let''):\<close>
  have b: B \<proof>

  txt \<open>Via abbreviation (``let''):\<close>
  note c = a b
end
subsubsection \<open>Referencing facts\<close>
notepad
begin
  txt \<open>Via explicit name:\<close>
  assume a: A
  note a

  txt \<open>Via implicit name:\<close>
  assume A
  note this

  txt \<open>Via literal proposition (unification with results from the proof text):\<close>
  assume A
  note \<open>A\<close>

  assume "\<And>x. B x"
  note \<open>B a\<close>
  note \<open>B b\<close>
end
subsubsection \<open>Manipulating facts\<close>
notepad
begin
  txt \<open>Instantiation:\<close>
  assume a: "\<And>x. B x"
  note a
  note a [of b]
  note a [where x = b]

  txt \<open>Backchaining:\<close>
  assume 1: A
  assume 2: "A \<Longrightarrow> C"
  note 2 [OF 1]
  note 1 [THEN 2]

  txt \<open>Symmetric results:\<close>
  assume "x = y"
  note this [symmetric]

  assume "x \<noteq> y"
  note this [symmetric]

  txt \<open>Adhoc-simplification (take care!):\<close>
  assume "P ([] @ xs)"
  note this [simplified]
end
subsubsection \<open>Projections\<close>
text \<open>
  Isar facts consist of multiple theorems. There is notation to project
  interval ranges.
\<close>
notepad
begin
  assume stuff: A B C D
  note stuff(1)
  note stuff(2-3)
  note stuff(2-)
end
subsubsection \<open>Naming conventions\<close>
text\<open> \<^item> Lower-case identifiers are usually preferred.
\<^item> Facts can be named after the main term within the proposition.
\<^item> Facts should \<^emph>\<open>not\<close> be named after the command that
introduced them (@{command "assume"}, @{command "have"}). This is
misleading and hard to maintain.
\<^item> Natural numbers can be used as ``meaningless'' names (more
appropriate than \<open>a1\<close>, \<open>a2\<close> etc.)
\<^item> Symbolic identifiers are supported (e.g. \<open>*\<close>, \<open>**\<close>, \<open>***\<close>). \<close>
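text \<open>
  For illustration, a hypothetical mixture of such names (propositions
  and the term \<open>x = y\<close> are arbitrary assumptions):
\<close>

notepad
begin
  assume eq: "x = y"  \<comment> \<open>named after the main term\<close>
  assume 1: A and 2: B  \<comment> \<open>``meaningless'' numeric names\<close>
  assume *: C  \<comment> \<open>symbolic identifier\<close>
  note eq and 1 2 and *
end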
subsection \<open>Block structure\<close>
text\<open>
The formal notepad is block structured. The fact produced by the last
entry of a block is exported into the outer context. \<close>
notepad
begin
  {
    have a: A \<proof>
    have b: B \<proof>
    note a b
  }
  note this
  note \<open>A\<close>
  note \<open>B\<close>
end
text \<open>
  Explicit blocks as well as implicit blocks of nested goal
  statements (e.g.\ @{command have}) automatically introduce one extra
  pair of parentheses in reserve. The @{command next} command allows
  one to ``jump'' between these sub-blocks.
\<close>
notepad
begin
  {
    have a: A \<proof>
  next
    have b: B
    proof -
      show B \<proof>
    next
      have c: C \<proof>
    next
      have d: D \<proof>
    qed
  }

  txt \<open>Alternative version with explicit parentheses everywhere:\<close>

  {
    {
      have a: A \<proof>
    }
    {
      have b: B
      proof -
        {
          show B \<proof>
        }
        {
          have c: C \<proof>
        }
        {
          have d: D \<proof>
        }
      qed
    }
  }
end
text\<open> For example, see \<^file>\<open>~~/src/HOL/Isar_Examples/Group.thy\<close>. \<close>
subsection \<open>Special names in Isar proofs\<close>
text \<open>
  \<^item> term \<open>?thesis\<close> --- the main conclusion of the
  innermost pending claim

  \<^item> term \<open>\<dots>\<close> --- the argument of the last explicitly
  stated result (for infix application this is the right-hand side)

  \<^item> fact \<open>this\<close> --- the last result produced in the text
\<close>
notepad
begin
  have "x = y"
  proof -
    term ?thesis
    show ?thesis \<proof>
    term ?thesis  \<comment> \<open>static!\<close>
  qed
  term "\<dots>"
  thm this
end
text\<open>Calculational reasoning maintains the special fact called
``\<open>calculation\<close>'' in the background. Certain language
elements combine primary \<open>this\<close> with secondary \<open>calculation\<close>.\<close>
subsection \<open>Transitive chains\<close>
text \<open>
  The idea is to combine \<open>this\<close> and \<open>calculation\<close>
  via typical \<open>trans\<close> rules (see also @{command print_trans_rules}):
\<close>
thm trans
thm less_trans
thm less_le_trans
notepad
begin
  txt \<open>Plain bottom-up calculation:\<close>
  have "a = b" \<proof>
  also
  have "b = c" \<proof>
  also
  have "c = d" \<proof>
  finally
  have "a = d" .

  txt \<open>Variant using the \<open>\<dots>\<close> abbreviation:\<close>
  have "a = b" \<proof>
  also
  have "\<dots> = c" \<proof>
  also
  have "\<dots> = d" \<proof>
  finally
  have "a = d" .

  txt \<open>Top-down version with explicit claim at the head:\<close>
  have "a = d"
  proof -
    have "a = b" \<proof>
    also
    have "\<dots> = c" \<proof>
    also
    have "\<dots> = d" \<proof>
    finally
    show ?thesis .
  qed
next
  txt \<open>Mixed inequalities (require suitable base type):\<close>
  fix a b c d :: nat

  have "a < b" \<proof>
  also
  have "b \<le> c" \<proof>
  also
  have "c = d" \<proof>
  finally
  have "a < d" .
end
subsubsection \<open>Notes\<close>
text\<open> \<^item> The notion of \<open>trans\<close> rule is very general due to the
flexibility of Isabelle/Pure rule composition.
\<^item> User applications may declare their own rules, with some care
about the operational details of higher-order unification. \<close>
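text \<open>
  For example, a hypothetical relation \<open>r\<close> whose transitivity rule is
  declared as \<open>trans\<close> participates in calculations as usual (an ad-hoc
  sketch, with the rule and the facts about \<open>r\<close> merely assumed):
\<close>

notepad
begin
  fix r :: "'a \<Rightarrow> 'a \<Rightarrow> bool"
  assume [trans]: "\<And>x y z. r x y \<Longrightarrow> r y z \<Longrightarrow> r x z"

  have "r a b" \<proof>
  also have "r b c" \<proof>
  finally have "r a c" .
end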
subsection \<open>Degenerate calculations\<close>
text \<open>
  The idea is to append \<open>this\<close> to \<open>calculation\<close>, without rule composition.
  This is occasionally useful to avoid naming intermediate facts.
\<close>
notepad
begin
  txt \<open>A vacuous proof:\<close>
  have A \<proof>
  moreover
  have B \<proof>
  moreover
  have C \<proof>
  ultimately
  have A and B and C .
next
  txt \<open>Slightly more content (trivial bigstep reasoning):\<close>
  have A \<proof>
  moreover
  have B \<proof>
  moreover
  have C \<proof>
  ultimately
  have "A \<and> B \<and> C" by blast
end
text \<open>
  Note that for multi-branch case splitting, it is better to use
  @{command consider}.
\<close>
section \<open>Induction\<close>
subsection \<open>Induction as Natural Deduction\<close>
text \<open>
  In principle, induction is just a special case of Natural
  Deduction (see also \secref{sec:natural-deduction-synopsis}). For
  example:
\<close>
thm nat.induct
print_statement nat.induct
notepad
begin
  fix n :: nat
  have "P n"
  proof (rule nat.induct)  \<comment> \<open>fragile rule application!\<close>
    show "P 0" \<proof>
  next
    fix n :: nat
    assume "P n"
    show "P (Suc n)" \<proof>
  qed
end
text\<open> In practice, much more proof infrastructure is required.
The proof method @{method induct} provides:
\<^item> implicit rule selection and robust instantiation
\<^item> context elements via symbolic case names
\<^item> support for rule-structured induction statements, with local
parameters, premises, etc. \<close>
notepad
begin
  fix n :: nat
  have "P n"
  proof (induct n)
    case 0
    show ?case \<proof>
  next
    case (Suc n)
    from Suc.hyps show ?case \<proof>
  qed
end
subsubsection \<open>Example\<close>
text \<open>
  The subsequent example combines the following proof patterns:

  \<^item> outermost induction (over the datatype structure of natural
  numbers), to decompose the proof problem in top-down manner

  \<^item> calculational reasoning (\secref{sec:calculations-synopsis}) to
  compose the result in each case

  \<^item> solving local claims within the calculation by simplification
\<close>
lemma
  fixes n :: nat
  shows "(\<Sum>i = 0..n. i) = n * (n + 1) div 2"
proof (induct n)
  case 0
  have "(\<Sum>i = 0..0. i) = (0::nat)" by simp
  also have "\<dots> = 0 * (0 + 1) div 2" by simp
  finally show ?case .
next
  case (Suc n)
  have "(\<Sum>i = 0..Suc n. i) = (\<Sum>i = 0..n. i) + (n + 1)" by simp
  also have "\<dots> = n * (n + 1) div 2 + (n + 1)" by (simp add: Suc.hyps)
  also have "\<dots> = (n * (n + 1) + 2 * (n + 1)) div 2" by simp
  also have "\<dots> = (Suc n * (Suc n + 1)) div 2" by simp
  finally show ?case .
qed
text\<open>This demonstrates how induction proofs can be done without
having to consider the raw Natural Deduction structure.\<close>
subsection \<open>Induction with local parameters and premises\<close>
text \<open>
  Idea: Pure rule statements are passed through the induction
  rule. This achieves convenient proof patterns, thanks to some
  internal trickery in the @{method induct} method.

  Important: Using compact HOL formulae with \<open>\<forall>/\<longrightarrow>\<close> is a
  well-known anti-pattern! It would produce useless formal noise.
\<close>
notepad
begin
  fix n :: nat
  fix P :: "nat \<Rightarrow> bool"
  fix Q :: "'a \<Rightarrow> nat \<Rightarrow> bool"

  have "P n"
  proof (induct n)
    case 0
    show "P 0" \<proof>
  next
    case (Suc n)
    from \<open>P n\<close> show "P (Suc n)" \<proof>
  qed

  have "A n \<Longrightarrow> P n"
  proof (induct n)
    case 0
    from \<open>A 0\<close> show "P 0" \<proof>
  next
    case (Suc n)
    from \<open>A n \<Longrightarrow> P n\<close> and \<open>A (Suc n)\<close> show "P (Suc n)" \<proof>
  qed

  have "\<And>x. Q x n"
  proof (induct n)
    case 0
    show "Q x 0" \<proof>
  next
    case (Suc n)
    from \<open>\<And>x. Q x n\<close> show "Q x (Suc n)" \<proof>
    txt \<open>Local quantification admits arbitrary instances:\<close>
    note \<open>Q a n\<close> and \<open>Q b n\<close>
  qed
end
text\<open>The @{method induct} method can isolate local parameters and
premises directly from the given statement. This is convenient in
practical applications, but requires some understanding of what is
going on internally (as explained above).\<close>
notepad
begin
  fix n :: nat
  fix Q :: "'a \<Rightarrow> nat \<Rightarrow> bool"
  fix x :: 'a
  assume "A x n"
  then have "Q x n"
  proof (induct n arbitrary: x)
    case 0
    from \<open>A x 0\<close> show "Q x 0" \<proof>
  next
    case (Suc n)
    from \<open>\<And>x. A x n \<Longrightarrow> Q x n\<close>  \<comment> \<open>arbitrary instances can be produced here\<close>
      and \<open>A x (Suc n)\<close> show "Q x (Suc n)" \<proof>
  qed
end
subsection \<open>Advanced induction with term definitions\<close>
text \<open>
  Induction over subexpressions of a certain shape is delicate to
  formalize. The Isar @{method induct} method provides infrastructure
  for this.

  Idea: sub-expressions of the problem are turned into a defined
  induction variable, often accompanied by fixing of auxiliary
  parameters in the original expression.
\<close>
notepad
begin
  fix a :: "'a \<Rightarrow> nat"
  fix A :: "nat \<Rightarrow> bool"

  assume "A (a x)"
  then have "P (a x)"
  proof (induct "a x" arbitrary: x)
    case 0
    note prem = \<open>A (a x)\<close>
      and defn = \<open>0 = a x\<close>
    show "P (a x)" \<proof>
  next
    case (Suc n)
    note hyp = \<open>\<And>x. n = a x \<Longrightarrow> A (a x) \<Longrightarrow> P (a x)\<close>
      and prem = \<open>A (a x)\<close>
      and defn = \<open>Suc n = a x\<close>
    show "P (a x)" \<proof>
  qed
end
section \<open>Natural Deduction\<close>

subsection \<open>Rule statements\<close>

text \<open>
  Isabelle/Pure ``theorems'' are always natural deduction rules,
  which sometimes happen to consist of a conclusion only.

  The framework connectives \<open>\<And>\<close> and \<open>\<Longrightarrow>\<close> indicate the
  rule structure declaratively. For example:
\<close>

thm conjI
thm impI
thm nat.induct
text \<open>
  The object-logic is embedded into the Pure framework via an implicit
  derivability judgment \<^term>\<open>Trueprop :: bool \<Rightarrow> prop\<close>.

  Thus any HOL formula appears atomic to the Pure framework, while
  the rule structure outlines the corresponding proof pattern.

  This can be made explicit as follows:
\<close>
notepad
begin
  write Trueprop (\<open>Tr\<close>)

  thm conjI
  thm impI
  thm nat.induct
end
text \<open>
  Isar provides first-class notation for rule statements as follows.
\<close>

text \<open>
  Introductions and eliminations of some standard connectives of
  the object-logic can be written as rule statements as follows. (The
  proof ``@{command "by"}~@{method blast}'' serves as sanity check.)
\<close>
lemma "(P \<Longrightarrow> False) \<Longrightarrow> \<not> P" by blast
lemma "\<not> P \<Longrightarrow> P \<Longrightarrow> Q" by blast

lemma "P \<Longrightarrow> Q \<Longrightarrow> P \<and> Q" by blast
lemma "P \<and> Q \<Longrightarrow> (P \<Longrightarrow> Q \<Longrightarrow> R) \<Longrightarrow> R" by blast

lemma "P \<Longrightarrow> P \<or> Q" by blast
lemma "Q \<Longrightarrow> P \<or> Q" by blast
lemma "P \<or> Q \<Longrightarrow> (P \<Longrightarrow> R) \<Longrightarrow> (Q \<Longrightarrow> R) \<Longrightarrow> R" by blast

lemma "(\<And>x. P x) \<Longrightarrow> (\<forall>x. P x)" by blast
lemma "(\<forall>x. P x) \<Longrightarrow> P x" by blast

lemma "P x \<Longrightarrow> (\<exists>x. P x)" by blast
lemma "(\<exists>x. P x) \<Longrightarrow> (\<And>x. P x \<Longrightarrow> R) \<Longrightarrow> R" by blast

lemma "x \<in> A \<Longrightarrow> x \<in> B \<Longrightarrow> x \<in> A \<inter> B" by blast
lemma "x \<in> A \<inter> B \<Longrightarrow> (x \<in> A \<Longrightarrow> x \<in> B \<Longrightarrow> R) \<Longrightarrow> R" by blast

lemma "x \<in> A \<Longrightarrow> x \<in> A \<union> B" by blast
lemma "x \<in> B \<Longrightarrow> x \<in> A \<union> B" by blast
lemma "x \<in> A \<union> B \<Longrightarrow> (x \<in> A \<Longrightarrow> R) \<Longrightarrow> (x \<in> B \<Longrightarrow> R) \<Longrightarrow> R" by blast
subsection \<open>Isar context elements\<close>
text\<open>We derive some results out of the blue, using Isar context
elements and some explicit blocks. This illustrates their meaning
wrt.\ Pure connectives, without goal states getting in the way.\<close>
notepad begin
  {
    fix x
    have "B x" \<proof>
  }
  have "\<And>x. B x" by fact

  {
    obtain x :: 'a where "B x" \<proof>
    have C \<proof>
  }
  have C by fact
end
subsection \<open>Pure rule composition\<close>
text\<open>
The Pure framework provides means for:
\<^item> backward-chaining of rules by @{inference resolution}
\<^item> closing of branches by @{inference assumption}
Both principles involve higher-order unification of \<open>\<lambda>\<close>-terms
modulo \<open>\<alpha>\<beta>\<eta>\<close>-equivalence (cf.\ Huet and Miller). \<close>
notepad
begin
  assume a: A and b: B
  thm conjI
  thm conjI [of A B]  \<comment> \<open>instantiation\<close>
  thm conjI [of A B, OF a b]  \<comment> \<open>instantiation and composition\<close>
  thm conjI [OF a b]  \<comment> \<open>composition via unification (trivial)\<close>
  thm conjI [OF \<open>A\<close> \<open>B\<close>]

  thm conjI [OF disjI1]
end
text\<open>Note: Low-level rule composition is tedious and leads to
unreadable~/ unmaintainable expressions in the text.\<close>
text\<open>Idea: Canonical proof decomposition via @{command fix}~/
@{command assume}~/ @{command show}, where the body produces a
natural deduction rule to refine some goal.\<close>
notepad
begin
  fix A B :: "'a \<Rightarrow> bool"

  have "\<And>x. A x \<Longrightarrow> B x"
  proof -
    fix x
    assume "A x"
    show "B x" \<proof>
  qed

  have "\<And>x. A x \<Longrightarrow> B x"
  proof -
    {
      fix x
      assume "A x"
      show "B x" \<proof>
    } \<comment> \<open>implicit block structure made explicit\<close>
    note \<open>\<And>x. A x \<Longrightarrow> B x\<close>
      \<comment> \<open>side exit for the resulting rule\<close>
  qed
end
text\<open>
Idea: Previous facts and new claims are composed with a rule from
the context (or background library). \<close>
notepad
begin
  assume r\<^sub>1: "A \<Longrightarrow> B \<Longrightarrow> C"  \<comment> \<open>simple rule (Horn clause)\<close>

  have A \<proof>  \<comment> \<open>prefix of facts via outer sub-proof\<close>
  then have C
  proof (rule r\<^sub>1)
    show B \<proof>  \<comment> \<open>remaining rule premises via inner sub-proof\<close>
  qed

  have C
  proof (rule r\<^sub>1)
    show A \<proof>
    show B \<proof>
  qed

  have A and B \<proof>
  then have C
  proof (rule r\<^sub>1)
  qed

  have A and B \<proof>
  then have C by (rule r\<^sub>1)

next

  assume r\<^sub>2: "A \<Longrightarrow> (\<And>x. B\<^sub>1 x \<Longrightarrow> B\<^sub>2 x) \<Longrightarrow> C"  \<comment> \<open>nested rule\<close>

  have A \<proof>
  then have C
  proof (rule r\<^sub>2)
    fix x
    assume "B\<^sub>1 x"
    show "B\<^sub>2 x" \<proof>
  qed

  txt \<open>The compound rule premise \<^prop>\<open>\<And>x. B\<^sub>1 x \<Longrightarrow> B\<^sub>2 x\<close> is better
    addressed via @{command fix}~/ @{command assume}~/ @{command show}
    in the nested proof body.\<close>
end
text \<open>
  There is nothing special about logical connectives (\<open>\<and>\<close>, \<open>\<or>\<close>,
  \<open>\<forall>\<close>, \<open>\<exists>\<close> etc.). Operators from set-theory or lattice-theory work
  analogously. It is only a matter of rule declarations in the
  library; rules can also be specified explicitly.
\<close>
notepad
begin
  have "x \<in> A" and "x \<in> B" \<proof>
  then have "x \<in> A \<inter> B" ..

  have "x \<in> A" \<proof>
  then have "x \<in> A \<union> B" ..

  have "x \<in> B" \<proof>
  then have "x \<in> A \<union> B" ..

  have "x \<in> A \<union> B" \<proof>
  then have C
  proof
    assume "x \<in> A"
    then show C \<proof>
  next
    assume "x \<in> B"
    then show C \<proof>
  qed

next

  have "x \<in> \<Inter>A"
  proof
    fix a
    assume "a \<in> A"
    show "x \<in> a" \<proof>
  qed
end
text \<open>
  Combining these characteristics leads to the following general
  scheme for elimination rules with cases:

  \<^item> prefix of assumptions (or ``major premises'')

  \<^item> one or more cases that enable establishing the main conclusion
  in an augmented context
\<close>
notepad
begin
  assume r:
    "A\<^sub>1 \<Longrightarrow> A\<^sub>2 \<Longrightarrow>  \<comment> \<open>assumptions\<close>
      (\<And>x y. B\<^sub>1 x y \<Longrightarrow> C\<^sub>1 x y \<Longrightarrow> R) \<Longrightarrow>  \<comment> \<open>case 1\<close>
      (\<And>x y. B\<^sub>2 x y \<Longrightarrow> C\<^sub>2 x y \<Longrightarrow> R) \<Longrightarrow>  \<comment> \<open>case 2\<close>
      R  \<comment> \<open>main conclusion\<close>"

  have A\<^sub>1 and A\<^sub>2 \<proof>
  then have R
  proof (rule r)
    fix x y
    assume "B\<^sub>1 x y" and "C\<^sub>1 x y"
    show ?thesis \<proof>
  next
    fix x y
    assume "B\<^sub>2 x y" and "C\<^sub>2 x y"
    show ?thesis \<proof>
  qed
end
text\<open>Here \<open>?thesis\<close> is used to refer to the unchanged goal
statement.\<close>
subsection \<open>Rules with cases\<close>
text \<open>
  Applying an elimination rule to some goal leaves the goal statement
  unchanged, but allows the context to be augmented in the sub-proof
  of each case.

  Isar provides some infrastructure to support this:

  \<^item> native language elements to state eliminations

  \<^item> symbolic case names

  \<^item> method @{method cases} to recover this structure in a
  sub-proof
\<close>
lemma
  assumes A\<^sub>1 and A\<^sub>2  \<comment> \<open>assumptions\<close>
  obtains
    (case\<^sub>1) x y where "B\<^sub>1 x y" and "C\<^sub>1 x y"
  | (case\<^sub>2) x y where "B\<^sub>2 x y" and "C\<^sub>2 x y"
  \<proof>
subsubsection \<open>Example\<close>
lemma tertium_non_datur:
  obtains
    (T) A
  | (F) "\<not> A"
  by blast
notepad
begin
  fix x y :: 'a
  have C
  proof (cases "x = y" rule: tertium_non_datur)
    case T
    from \<open>x = y\<close> show ?thesis \<proof>
  next
    case F
    from \<open>x \<noteq> y\<close> show ?thesis \<proof>
  qed
end
datatype foo = Foo | Bar nat

notepad
begin
  fix x :: foo
  have C
  proof (cases x)
    case Foo
    from \<open>x = Foo\<close> show ?thesis \<proof>
  next
    case (Bar a)
    from \<open>x = Bar a\<close> show ?thesis \<proof>
  qed
end
subsection \<open>Elimination statements and case-splitting\<close>
text \<open>
  The @{command consider} command states rules for generalized
  elimination and case splitting. This is like a toplevel statement
  \<^theory_text>\<open>theorem obtains\<close> used within a proof body; or like a
  multi-branch \<^theory_text>\<open>obtain\<close> without activation of the local
  context elements yet.

  The proof method @{method cases} is able to use such rules with
  forward-chaining (e.g.\ via \<^theory_text>\<open>then\<close>). This leads to the
  subsequent pattern for case-splitting in a particular situation
  within a proof.
\<close>
notepad begin
  consider (a) A | (b) B | (c) C \<proof>
    \<comment> \<open>typically \<^theory_text>\<open>by auto\<close>, \<^theory_text>\<open>by blast\<close> etc.\<close>
  then have something
  proof cases
    case a
    then show ?thesis \<proof>
  next
    case b
    then show ?thesis \<proof>
  next
    case c
    then show ?thesis \<proof>
  qed
end
subsection \<open>Obtaining local contexts\<close>
text\<open>A single ``case'' branch may be inlined into Isar proof text
via @{command obtain}. This proves \<^prop>\<open>(\<And>x. B x \<Longrightarrow> thesis) \<Longrightarrow>
thesis\<close> on the spot, and augments the context afterwards.\<close>
notepad
begin
  fix B :: "'a \<Rightarrow> bool"

  obtain x where "B x" \<proof>
  note \<open>B x\<close>

  txt \<open>Conclusions from this context may not mention \<^term>\<open>x\<close> again!\<close>
  {
    obtain x where "B x" \<proof>
    from \<open>B x\<close> have C \<proof>
  }
  note \<open>C\<close>
end
end