@article {haesaert2004non, title = {A non-probabilistic Approach to Inductive Prediction}, year = {2004}, abstract = {

The underlying idea behind the adaptive logics of inductive generalization is that most inductive reasoning can be explicated by simple qualitative means. Therefore, those classical models are selected that are ‘as uniform as possible’ with respect to a certain set of (empirical) data. This led to the question whether the same idea of uniformity can be applied when no generalizations are derivable. It is clear that in this case one may still be interested in making some direct inductive predictions. The main problem with this kind of prediction is that we lack a decision theory for it. In the present paper we make some proposals to deal with this problem. Our purpose here is to get more control over the difficult aspects of inductive prediction. In order to do so, we will not proceed in a probabilistic context, but will apply the idea of minimizing the abnormalities in uniform models, an idea that derives from the adaptive logic programme.

1 Aim of this paper

In our [1], we have presented some adaptive logics for induction based on Classical Logic (henceforth: CL). The underlying idea of these adaptive logics of induction is that most inductive reasoning does not proceed in terms of probabilities, and cannot be explicated in terms of probabilities, but can be explicated by rather simple qualitative means. In that paper we presented, for example, the adaptive logic for inductive generalization IL+m: from a set of data and (possibly falsified) background knowledge, inductive generalizations are ‘derived’. In the same paper we also

}, author = {Haesaert, Lieven} }