There is a spectrum of different ways of understanding what information is. These range from quantitative analyses, which treat information in terms of uninterpreted patterns of data, to qualitative analyses, which treat information semantically, as a matter of meaningful structures. What is the relation between these different approaches? Are there invariants underlying the different uses of the term ‘information’ in disciplines such as computer science, economics, genetics, neuroscience, physics, and statistics? Is there some sort of logic that transcends these different uses? How, for example, are we able to transform quantitative information deriving from meteorological sensors into the sorts of qualitative information that are useful for human decision-making? According to Shannon and Weaver, information refers to the degree of uncertainty present in a message. Does a view along these lines provide a viable starting point for a unified analysis of information, or must we look elsewhere? This issue of The Monist will take questions such as these as a basis for fostering new research in the philosophy of information.

- call for papers at The Monist, advisory editor Luciano Floridi (due April 30, 2016)

In case we want to put The Monist in our sights again, this issue looks related to the idea of a "continuous information theory" that I've been blathering about from time to time. I haven't yet done enough background research to say whether that's new, or how it relates to other field theories.
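For my own reference, the Shannon-and-Weaver line above is the standard entropy story: the "degree of uncertainty" in a message drawn from a distribution p over symbols x is

    H(X) = -\sum_x p(x) \log_2 p(x)

which is maximal when every symbol is equally likely and zero when the message is fully determined.

And as a toy version of the meteorological-sensors question (the readings and thresholds below are entirely made up, just to make the quantitative-to-qualitative move concrete), collapsing numeric readings into labels is a lossy map, and the entropy drop measures how much gets thrown away:

import math
from collections import Counter

def entropy(values):
    """Shannon entropy (in bits) of the empirical distribution of values."""
    counts = Counter(values)
    n = len(values)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

# Made-up hourly temperature readings from a sensor (degrees C).
readings = [2.1, 3.5, 8.0, 12.4, 15.9, 19.2, 22.7, 25.3, 24.8, 18.6]

# One crude quantitative-to-qualitative mapping: threshold into labels.
def qualitative(t):
    return "cold" if t < 5 else ("mild" if t < 20 else "hot")

labels = [qualitative(t) for t in readings]
print(labels)
print(entropy([round(t) for t in readings]), "bits in the rounded readings")
print(entropy(labels), "bits left after labeling")

The labeled stream carries well under half the bits of the rounded readings here; whatever a "continuous information theory" turns out to be, it presumably has to say which of the discarded bits mattered for the decision at hand.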
About Floridi -- here are some of his works:
Floridi, Luciano. The Philosophy of Information. Oxford: Oxford University Press, 2011.
—. The Fourth Revolution: How the Infosphere is Reshaping Human Reality. Oxford: Oxford University Press, 2014.
And another, shorter one that I looked through:
Floridi, Luciano. Information: A Very Short Introduction. Oxford: Oxford University Press, 2010.