In the span of two million years, the human brain has nearly tripled in size. What accounted for such meteoric growth in a time span that is, in evolutionary terms, a veritable blip on the radar?
Researchers believe that most of the mass expansion of the human brain can be attributed to the evolutionary development of the region known as the prefrontal cortex. The basic activity of this brain region is the orchestration of thoughts and actions in accordance with our internal goals, most often referred to as executive function. Executive function relates to our ability to differentiate among conflicting thoughts; to determine good and bad, better and best, same and different; to evaluate the future consequences of current activities; to work toward a defined goal; to predict outcomes; and to form expectations based on our actions. Some neuroscientists refer to this region as the experience simulator.
Artificial intelligence, machine learning, and neural networks are all based on some form of emulated human cognition. The problem has always been that the techniques used to emulate cognition take a logical approach rather than an intuitive one. We can think of logic as low-level cognition and intuition as higher-level cognition. Intuition is based on wisdom, and wisdom in this sense can be thought of as the culmination of collective experiences: a collection of memories that we rely upon to simulate an experience. We don’t need to be hit by a train to know that playing on the train tracks increases the risk of a negative outcome, because we can simulate that experience in our prefrontal cortex. We can greatly improve artificial intelligence and machine learning algorithms by creating a mechanism to pre-process intuitive computation of sensory input. This can be accomplished in a number of ways, but perhaps the most promising of these approaches utilizes metaheuristic optimization.
Instead of employing the traditional approach, where we attempt to emulate the human brain’s capacity to process millions upon millions of neural responses simultaneously in real time, heuristics distribute the executive function to an assemblage of low-level actors working collectively toward a common goal. This type of collective intelligence can be readily found in nature, e.g., schools of fish, flocks of birds, ant colonies, and swarms of bees. In these scenarios, the executive function is accomplished by the collective intelligence of the assembled actors. A school of fish darting about in unison has no single leader fish. Ants working collectively can create complex and intricate structures as high as eight feet tall, yet individual ants have a relatively low level of intelligence. Metaheuristic algorithms are nothing new; their use as experience generators to create artificial intuition, however, is. The potential utility of optimization algorithms for AI and artificial neural networks is promising. Conceptually, such algorithms would be deployed to simulate "wisdom" via collective intelligence, thereby creating a relative intuition. For example, if challenged to predict the optimal placement of a robotic appendage used to provide mobility for a device traversing unfamiliar terrain, metaheuristic swarm algorithms could be employed to simulate a neural network of sensory input. Similar optimization heuristics could also be used in population health research and cohort registries. PortOne Technology Group is experimenting with these concepts using particle swarm optimization algorithms in R. The video below is a visual representation of the PSO function at work (note that this simulation was performed in MATLAB).
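To make the "no single leader" idea concrete, here is a minimal particle swarm optimization sketch. It is an illustrative stand-in written in Python, not the R or MATLAB code referenced above; the function name, hyperparameters, and test objective are my own assumptions. Each particle remembers its personal best position, and the swarm shares a global best; no individual particle directs the search, yet the collective converges on the optimum.

```python
import numpy as np

def pso(objective, dim=2, n_particles=30, iters=100, bounds=(-5.0, 5.0),
        w=0.7, c1=1.5, c2=1.5, seed=42):
    """Minimal particle swarm optimization. Each particle is a low-level
    actor; the 'executive function' emerges from the shared global best."""
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    pos = rng.uniform(lo, hi, (n_particles, dim))   # random initial positions
    vel = np.zeros((n_particles, dim))
    pbest = pos.copy()                               # each particle's personal best
    pbest_val = np.array([objective(p) for p in pos])
    g = pbest[pbest_val.argmin()].copy()             # swarm's shared global best
    g_val = pbest_val.min()
    for _ in range(iters):
        r1, r2 = rng.random((2, n_particles, dim))
        # Velocity blends inertia, attraction to personal best, and
        # attraction to the global best (the collective's shared memory).
        vel = w * vel + c1 * r1 * (pbest - pos) + c2 * r2 * (g - pos)
        pos = np.clip(pos + vel, lo, hi)
        vals = np.array([objective(p) for p in pos])
        improved = vals < pbest_val
        pbest[improved] = pos[improved]
        pbest_val[improved] = vals[improved]
        if pbest_val.min() < g_val:
            g_val = pbest_val.min()
            g = pbest[pbest_val.argmin()].copy()
    return g, g_val

# Sphere function as a stand-in objective: global minimum 0 at the origin.
best_pos, best_val = pso(lambda x: float(np.sum(x ** 2)))
```

In a robotics setting, the objective function would instead score candidate appendage placements against sensory input, but the swarm mechanics would be unchanged.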
UPDATE: One of my colleagues suggested that I provide more practical examples of how metaheuristic algorithms can be used to solve real-world optimization, clustering, forecasting, and classification problems. My work in this area is purely experimental at this stage, but I believe in the power of metaheuristic approaches within the context of big data analytics. With that, here are just a few practical applications that warrant exploration:
Swarm intelligence and deep learning within the context of big data, Internet of Things (IoT), cloud computing, and fog computing
Evolutionary algorithms to train neural networks, echo state networks, or recurrent neural networks
Parallelizing metaheuristic algorithms on Hadoop/MapReduce for big data processing
Swarm intelligence algorithms for security challenges in big data environments
Bio-inspired algorithms in green computing
Clustering in big data analytics using metaheuristic algorithms
Variants of cuckoo search algorithms and flower pollination algorithms for fog-to-cloud computing and IoT
Swarm intelligence algorithms for optimizing fog, IoT, and cloud computing protocols
Swarm-intelligence-based intrusion detection for the cybersecurity of fog, cloud, and IoT environments
Bio-inspired ontology-based big data and open data applications
Metaheuristic frameworks for big data simulation
Bio-inspired system architectures and infrastructures
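As a toy illustration of the second item above (evolutionary training of neural networks), the sketch below evolves the weights of a single logistic neuron using only selection and mutation, with no gradients. Everything here, the function names, population size, mutation scale, and the OR-gate task, is an illustrative assumption, not code from my experiments.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def evolve_neuron(X, y, pop_size=40, generations=200, sigma=0.3, seed=0):
    """Tiny evolutionary algorithm: a population of weight vectors for a
    single logistic neuron undergoes truncation selection and Gaussian
    mutation. No backpropagation is involved."""
    rng = np.random.default_rng(seed)
    n = X.shape[1] + 1  # weights plus a bias term
    pop = rng.normal(0.0, 1.0, (pop_size, n))
    for _ in range(generations):
        preds = sigmoid(X @ pop[:, :-1].T + pop[:, -1])     # (samples, pop)
        mse = np.mean((preds - y[:, None]) ** 2, axis=0)
        elite = pop[np.argsort(mse)[: pop_size // 2]]        # keep the best half
        children = elite + rng.normal(0.0, sigma, elite.shape)  # mutate copies
        pop = np.vstack([elite, children])
    preds = sigmoid(X @ pop[:, :-1].T + pop[:, -1])
    mse = np.mean((preds - y[:, None]) ** 2, axis=0)
    return pop[mse.argmin()]

# OR gate: linearly separable, so one neuron is enough for this toy task.
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([0, 1, 1, 1], dtype=float)
w = evolve_neuron(X, y)
out = sigmoid(X @ w[:-1] + w[-1])
```

The same selection-and-mutation loop scales conceptually to larger networks (as in neuroevolution), where the genome is simply a longer weight vector.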