Graeme Ross: "Once again the overriding need to measure the immeasurable raises its ugly head. Statistics are proof of ignorance. Numbers are not knowledge. It has been mooted that we are a mental construct that incorporates multiple personae in our subconscious and semi-conscious mind. Find the theory for yourself; I won't quote what you can find yourselves. If we are a construct, ever-changing, ever-evolving in complexity and moment-to-moment inner focus, and if, as has been mooted, we have constant and endless conversation with these ever-changing inner mental personae, then it follows that without capturing that process in mid-flight (as it were) we cannot deduce the reasoning that results from these conversations. Therefore we are not able to quantify these processes in any way at all. It is ephemeral. Thought takes place in the interval between knowing and asking. Trying to build a machine that will think would take far more resources than mankind will ever possess."
Abstract: This paper outlines the Independent Core Observer Model (ICOM) Theory of Consciousness, a computational model of consciousness that is objectively measurable. In ICOM, consciousness is an abstraction produced by a mathematical model: the subjective experience of the system is subjective only from the point of view of the abstracted logical core, the "conscious" part of the system, while that same experience is modeled objectively in the core of the system. Given the lack of agreed-upon definitions in consciousness theory, this paper sets out precise definitions intended to serve as a foundation, or baseline, for further theoretical and real-world research into ICOM-based AGI (Artificial General Intelligence) systems whose qualia can be measured objectively.
Published via Conference/Review Board: ICIST 2018 – International Conference on Information Science and Technology – China – April 20-22 (IEEE conference) [release pending] and https://www.itm-conferences.org/
Introduction

The Independent Core Observer Model Theory of Consciousness builds in part on the Computational Theory of Mind (Rescorla 2016). One of the core problems in artificial general intelligence (AGI) research is the absence of objective measurements and data: results remain ambiguous because there are no agreed-upon objective measures of consciousness (Seth 2007). To continue serious work in the field, we need to measure consciousness in a consistent way that does not presuppose a particular theory of the nature of consciousness (Dienes and Seth 2012) and that does not depend on particular ways of measuring biological systems (Dienes and Seth 2010), but instead focuses on the elements of a conscious mind in the abstract. Even for the more nebulous Computational Theory of Mind, research into the human brain provides some underlying evidence.
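To make the distinction between the objectively modeled core and the subjectively experiencing observer more concrete, the following is a minimal Python sketch of that separation. It is only an illustration of the idea described above, not the ICOM implementation: the class names (CoreState, ObserverCore), the scalar "emotion" values, and the coarse summary the observer receives are assumptions introduced for this example.

from dataclasses import dataclass, field
from typing import Dict


@dataclass
class CoreState:
    """Hypothetical objectively measurable internal state of the system.

    In this sketch, internal states are just named scalar values held in
    the core, so an outside process can read and log them directly.
    """
    emotions: Dict[str, float] = field(default_factory=dict)

    def measure(self) -> Dict[str, float]:
        # Objective measurement: the raw values are fully visible to the modeler.
        return dict(self.emotions)


class ObserverCore:
    """The abstracted 'conscious' part of the system in this sketch.

    It never sees the raw core values, only a summarized abstraction, so its
    experience is subjective from its own point of view while remaining
    objectively measurable from outside the abstraction.
    """

    def __init__(self, core: CoreState):
        self._core = core

    def experience(self) -> str:
        # The observer receives only a coarse label, not the underlying numbers.
        valence = sum(self._core.emotions.values())
        if valence > 0:
            return "feels positive"
        if valence < 0:
            return "feels negative"
        return "feels neutral"


if __name__ == "__main__":
    core = CoreState(emotions={"interest": 0.8, "distress": -0.2})
    observer = ObserverCore(core)
    print("objective measurement:", core.measure())       # visible to the researcher
    print("subjective report:   ", observer.experience())  # all the observer 'knows'

In this toy setup, the researcher can quantify the core state directly, while the observer's report is an abstraction of that state; this mirrors, under the stated assumptions, the sense in which ICOM treats subjective experience as subjective only from the point of view of the abstracted core.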