James Nguyen (University of Notre Dame)
It is commonly assumed that idealised models misrepresent their target systems, and as such the role of idealisations in science seems mysterious: how can misrepresentations accurately predict or explain target system behaviour, or generate understanding of it? In this paper I argue that this appearance of mystery rests on a mistake. That models distort their target systems does not entail that they misrepresent them. Drawing on interpretational accounts of scientific representation -- accounts that depart from the idea that models are `intended copies' of their targets -- I argue that models which distort aspects of their target systems, even with respect to features essential to the target behaviour, can nevertheless be accurate representations of those very features, provided that they are interpreted appropriately. I illustrate this way of thinking about idealisation in science with two examples commonly assumed to involve `essential idealisations': interpreting flat 2D maps of the earth's surface, and models that take some parameter to a limit.