Is automatic theorem proving and proof searching easier in linear and other propositional substructural logics which lack contraction?
Where can I read more about automatic theorem proving in these logics and the role of contraction in proof search?
Other resources can be found among the references of Kaustuv Chaudhuri's thesis "The Focused Inverse Method for Linear Logic", and you might be interested in Roy Dyckhoff's "Contraction-Free Sequent Calculi for Intuitionistic Logic", which is about contraction but not about linear logic.
There are opportunities for efficient proof search in linear logic, but I don't think current work indicates that it's easier than proof search in non-substructural logics. The problem is that if you want to prove $C \vdash A \otimes B$ in linear logic, you face an extra question that doesn't arise in ordinary proof search: is $C$ used to prove $A$, or is $C$ used to prove $B$? In practice, this "resource nondeterminism" is a big problem for proof search in linear logic.
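For concreteness, here is the right rule for $\otimes$ in a standard two-sided sequent presentation of linear logic (a textbook rule, not tied to any particular prover), read bottom-up during search:

$$\frac{\Gamma \vdash A \qquad \Delta \vdash B}{\Gamma, \Delta \vdash A \otimes B}\;(\otimes R)$$

The linear context must be partitioned between the two premises, so a context with $n$ formulas admits $2^n$ candidate splits. In the example above, the search has to guess between $C \vdash A$ with $\cdot \vdash B$ and $\cdot \vdash A$ with $C \vdash B$.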
Per the comments, Lincoln et al.'s 1990 paper "Decision problems for propositional linear logic" is a good reference if you want to get technical about words like "easier."
Assuming that the complexity of the provability problem would satisfy you, the landscape of complexity results for substructural logics with and without contraction is rather intricate. I'll try to survey here what is known for propositional linear logic and related substructural propositional logics. The short answer is that adding contraction sometimes helps (e.g. LLC is decidable, while LL isn't) and sometimes hurts (e.g. MALL is PSPACE-complete, while MALLC is ACKERMANN-complete).
Some of the main open questions here are whether MELL is decidable, and what the complexity of the implicational fragment $\mathrm{T}_\to$ of ticket logic (a relevance logic related to R) is.
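To put the data points from the two paragraphs above side by side (a restatement of the claims already made, not additional results):

$$\begin{array}{ll}
\text{LL} & \text{undecidable} \\
\text{LLC (LL + contraction)} & \text{decidable} \\
\text{MALL} & \text{PSPACE-complete} \\
\text{MALLC (MALL + contraction)} & \text{ACKERMANN-complete} \\
\text{MELL} & \text{decidability open}
\end{array}$$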
No, it is only ever harder.
Just as the decision problem for intuitionistic propositional logic is harder than that of classical propositional logic, so too is propositional linear logic harder still. With either exponentials (which don't lack contraction) or various flavours of noncommutative connective, the logic becomes undecidable, and even the weakling classical MALL is PSPACE-complete. By contrast, the decision problem for classical propositional logic is coNP-complete, and for intuitionistic propositional logic, PSPACE-complete. (Offhand, I don't know the complexity of intuitionistic MALL.)
I recommend Pat Lincoln's exposition in section 6 of his survey "Linear logic", SIGACT News, 1992. We have learned a bit more since then (in particular, there are now results for a large family of linear logics), but the basic picture is there.
In a certain way, this is what makes proof search for linear logic interesting: the hardness of the decision problem makes space for more interesting notions of computation, and linear logic is hard in so many different ways. Andrej pointed to Dale Miller's An Overview of Linear Logic Programming; this is a good place to look, since Miller has done more than anyone else to develop the idea of proof search as computation.
Perhaps Dale Miller's An Overview of Linear Logic Programming is a good starting point?