Cognition, historically considered a uniquely human capacity, has recently been found to be an ability of all living organisms, from single cells upward. This study approaches cognition from an info-computational stance, in which structures in nature are seen as information, and processes (information dynamics) are seen as computation, from the perspective of a cognizing agent. Cognition is understood as a network of concurrent morphological/morphogenetic computations unfolding as a result of the self-assembly, self-organization, and autopoiesis of physical, chemical, and biological agents. The present-day human-centric view of cognition still prevailing in major encyclopedias leaves a variety of open problems. This article considers recent research on morphological computation, morphogenesis, agency, basal cognition, the extended evolutionary synthesis, the free energy principle, cognition as Bayesian learning, active inference, and related topics, offering new theoretical and practical perspectives on problems inherent to the old computationalist cognitive models, which were based on abstract symbol processing and unaware of the actual physical constraints and affordances of the embodiment of cognizing agents. A better understanding of cognition is centrally important for future artificial intelligence, robotics, medicine, and related fields.
Both the Cybersemiotics and Info-computationalist research programmes represent attempts to unify the understanding of information, knowledge, and communication. The first takes into account phenomenological aspects of signification, insisting on the human experience "from within". The second adopts solely the view "from the outside", based on scientific practice, with an observing agent generating intersubjective knowledge in a research community. The process of knowledge production, embodied in networks of cognizing agents interacting with the environment and developing through evolution, is studied on different levels of abstraction in both frames of reference. In order to develop scientifically tractable models of the evolution of intelligence in informational structures, from pre-biotic/chemical to living networked intelligent organisms, including the implementation of those models in artificial agents, the basic-level language of Info-Computationalism has been shown to be suitable. There are, however, contexts in which we deal with complex informational structures essentially dependent on human first-person knowledge, where a high-level language such as Cybersemiotics is the appropriate tool for conceptualization and communication. Two research projects are presented in order to exemplify the interplay of info-computational and higher-order approaches: the Blue Brain Project, in which the brain is modeled as an info-computational system, an in silico simulation of biological brain function, and Biosemiotics research on genes, information, and semiosis, in which the process of semiosis is understood in info-computational terms. The article analyzes the differences and convergences of the Cybersemiotics and Info-computationalist approaches, which, by placing their focus on distinct levels of organization, help elucidate the processes of knowledge production in intelligent agents.
In this paper, we analyze axiomatic and constructive issues of unconventional computations from a methodological and philosophical point of view. We explain how the new models of algorithms and unconventional computations change the algorithmic universe, making it open and allowing increased flexibility and expressive power that augment creativity. At the same time, the greater power of new types of algorithms also results in the greater complexity of the algorithmic universe, transforming it into an algorithmic multiverse and demanding new tools for its study. That is why we analyze new powerful tools brought forth by local mathematics, local logics, logical varieties, and the axiomatic theory of algorithms, automata, and computation. We demonstrate how these new tools allow efficient navigation in the algorithmic multiverse. Further work includes the study of natural computation by unconventional algorithms and constructive approaches.
Three special issues of the journal Entropy have been dedicated to the topics of "Information-Processing and Embodied, Embedded, Enactive Cognition". They addressed morphological computing, cognitive agency, and the evolution of cognition. The contributions show the diversity of views present in the research community on the topic of computation and its relation to cognition. This paper is an attempt to elucidate current debates on computation that are central to cognitive science. It is written in the form of a dialogue between two authors representing opposed positions on the issue of what computation is and could be, and how it can be related to cognition. Given the different backgrounds of the two researchers, which span physics, philosophy of computing and information, cognitive science, and philosophy, we found a Socratic dialogue an appropriate form for this multidisciplinary/cross-disciplinary conceptual analysis. We proceed as follows. First, the proponent (GDC) introduces the info-computational framework as a naturalistic model of embodied, embedded, and enacted cognition. Next, objections are raised by the critic (MM) from the point of view of the new mechanistic approach to explanation. Subsequently, the proponent and the critic provide their replies. The conclusion is that there is a fundamental role for computation, understood as information processing, in the understanding of embodied cognition.
The entropies of Shannon, Rényi, and Kolmogorov are analyzed and compared, together with their main properties. The entropy of particular antennas with a pre-fractal shape, also called fractal antennas, is studied. In particular, their entropy is linked to the fractal geometry of their shape and to their physical performance.
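The Shannon and Rényi entropies compared above can be illustrated with a minimal sketch (function names are illustrative; Kolmogorov complexity is omitted here, since it is uncomputable in general):

```python
import math

def shannon_entropy(p, base=2):
    # Shannon entropy H(p) = -sum_i p_i log(p_i), with 0*log(0) taken as 0.
    return -sum(pi * math.log(pi, base) for pi in p if pi > 0)

def renyi_entropy(p, alpha, base=2):
    # Renyi entropy H_a(p) = log(sum_i p_i^a) / (1 - a) for a != 1;
    # it converges to the Shannon entropy in the limit a -> 1.
    if alpha == 1:
        return shannon_entropy(p, base)
    return math.log(sum(pi ** alpha for pi in p if pi > 0), base) / (1 - alpha)

# For a uniform distribution over n outcomes, every Renyi entropy
# (including Shannon's) equals log(n) in the chosen base.
uniform = [0.25] * 4
print(shannon_entropy(uniform))      # 2.0 bits
print(renyi_entropy(uniform, 2.0))   # 2.0 bits (collision entropy)

# For a non-uniform distribution, H_a decreases as alpha grows.
skewed = [0.5, 0.25, 0.25]
print(shannon_entropy(skewed))       # 1.5 bits
```

Note that for skewed distributions the Rényi entropy is monotonically non-increasing in α, which is the property that makes the family a useful one-parameter generalization of Shannon's measure.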
The gas turbine was one of the most important technological developments of the early 20th century, and it has had a significant impact on our lives. Although some researchers have worked on predicting the performance of three-shaft gas turbines, the effects of deteriorated components on other primary components, and of physical faults on the component measurement parameters, when considering variable inlet guide vane scheduling and the secondary air system of three-shaft gas turbine engines, have remained unexplored. In this paper, design-point and off-design performance models for a three-shaft gas turbine were developed and validated using the GasTurb 13 commercial software. Since the input data were limited, some engineering judgment and optimization processes were applied. The developed models were then validated using the engine manufacturer's data. After validation, physical faults were implanted into the non-linear steady-state model via the component health parameters to investigate the performance of the gas turbine under deterioration conditions. The effects of common faults, namely fouling and erosion in the primary components of the case-study engine, were simulated during full-load operation. The fault simulation results demonstrated that, as the severity of a fault increases, the component performance parameters and measurement parameters deviate linearly from the clean state. Furthermore, the sensitivity of the measurement parameters to fault location and type was discussed; as a result, these parameters can be used to determine the location and kind of fault during the development of a diagnostic model.