(From now on, biology-related topics will also be discussed in this section of the website.)
The critical part of perception is breaking down the environment and then reconstructing it. There are far more elements in the environment than can possibly be considered, so there must be an input filter that removes a portion of the outside environment. Input sensors (such as the eye and ear) then decompose the remaining elements, which are reconstructed in the brain.
So the mechanism that generates the approximation of the environment is key. That is done by wave probability. For example, when an object is viewed from different perspectives its shape appears to differ (a circle, for instance, is seen as an ellipse from an oblique angle). However, when the input sensors receive information about different aspects of that object, it is reconstructed in the brain. Wave probability then assigns probabilities to past memories: memories that are very far from that object are filtered out, while others receive higher probabilities. The person might be uncertain at first about which memories to link to the object, but as more data comes in the probability curve gets better and better, and more relevant memories are attached to it.
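A minimal sketch of this kind of assignment, assuming memories and incoming observations can be represented as simple feature vectors; the function `assign_weights`, the similarity measure, and the cutoff below are hypothetical illustrations, not a fixed model of wave probability:

```python
import numpy as np

def assign_weights(memories, observations, cutoff=1e-2):
    """Assign a probability weight to each stored memory given the
    observations received so far; weights sharpen as data accumulates."""
    # Average the observations seen so far into one estimate of the object.
    estimate = np.mean(observations, axis=0)
    # Similarity between the estimate and each memory (closer -> larger weight).
    dists = np.linalg.norm(np.asarray(memories) - estimate, axis=1)
    weights = np.exp(-dists)
    # Memories too far from the object are filtered out entirely.
    weights[weights < cutoff] = 0.0
    return weights / weights.sum()

# Each row is a stored memory; the views are noisy observations of one object.
memories = [[1.0, 0.0], [0.9, 0.1], [-1.0, 5.0]]
views = [[1.1, 0.05], [0.95, 0.0], [1.0, 0.1]]
for n in range(1, len(views) + 1):
    print(n, assign_weights(memories, views[:n]))
```

As more views arrive, the weights concentrate on the nearby memories while the distant one stays filtered out, which is the behavior described above.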
So the discussion of problem solving, in which a complicated problem is broken down into its constituent elements, and the perception of the environment are linked together, and both are linked to wave probability. Wave probability is the filtering mechanism that is critical in the process of perceiving the environment. It is said that rigorous mathematical precision cannot be applied in physics; perhaps that is because approximation is encoded into nature, similar to uncertainty. When an environment is observed, the point at infinity cuts off a portion of the space. For example, imagine an infinitely long path with trees on both sides. Eventually the trees shrink to a point, and the entire space beyond it is accumulated in the point at infinity. If there were no point at infinity there would be no observation, because the path would continue forever. It is essential that the point at infinity cuts off a portion of space and provides an approximation. This may be true of any observable phenomenon in the physical world.
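As an illustration of the vanishing-point idea, here is a small pinhole-projection sketch; the focal length and tree positions are arbitrary values chosen only to show the convergence:

```python
# Pinhole projection: a point at lateral offset x and depth z projects to
# x' = f * x / z on the image plane, so as z grows the trees on both sides
# of the path converge toward the same image point (the point at infinity).
f = 1.0                        # focal length (arbitrary units)
x_left, x_right = -2.0, 2.0    # lateral positions of the two tree rows

for z in [1, 10, 100, 1000, 10_000]:
    left = f * x_left / z
    right = f * x_right / z
    print(f"depth {z:>6}: left tree at {left:+.4f}, right tree at {right:+.4f}")
```

Everything beyond some depth lands within a vanishingly small region of the image, which is one way to read the claim that the point at infinity provides an approximation of the space.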
UPDATE 1:
If approximation is indeed encoded into nature, then, since probability is linked to observation, uncertainty and approximation are connected topics: one is derived from the other.
UPDATE 2:
The brain is my dad's area of expertise. He is busy with his own research, but I will ask him to advise me on this article at family gatherings.
I will write the proposal for the research and send it to the physician I mentioned a few days ago. I will ask her to be the co-author of the article.
UPDATE 3:
This is the proposal I will send her. Most of it has been discussed before.
Most environments consist of so many different elements that it is practically impossible to consider all of them. For example, a moving object is more likely to be noticed in a room where all other objects are fixed, so the senses may ignore the other elements in the room and concentrate on that object. If the moving object is removed, other objects that were not noticed at first might become noticeable.
The same is true in social contexts. When a catastrophe occurs, people tend to give more weight to things related to that incident and to ignore less relevant incidents, so it is more probable that those less relevant incidents will not be considered.
So a type of probability assignment process is necessary to attach different weights to different elements in the environment. This process relies on learning: for evolutionary reasons, for example, some past incidents are stored in memory and retrieved in similar situations, and those memories assign different weights to different elements in the environment. Standard probability theory lacks such a learning process. In fact, in some situations probability relies on learning, while in others standard probability is totally detached from the concept of learning. For example, to discover the probability of something, an experiment is repeated many times; as the number of experiments grows, the probability distribution becomes more and more accurate. This is a learning process: all previous outcomes are recorded, and the probability distribution is produced from them.
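A minimal sketch of that learning process, where the estimated probability of an outcome is simply its recorded frequency and sharpens as more experiments are run; the biased outcome below is just an illustrative stand-in for a repeated experiment:

```python
import random

random.seed(0)
p_true = 0.3   # the unknown probability being estimated
hits = 0
for n in range(1, 10_001):
    hits += random.random() < p_true   # record the outcome of each experiment
    if n in (10, 100, 1_000, 10_000):
        print(f"after {n:>6} experiments: estimate = {hits / n:.4f}")
```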
However, in other situations, learning and memory are completely detached from standard probability. Consider this example: there are N balls in a bag and one of the balls is distinct. According to standard probability, the probability of randomly picking the distinct ball is 1/N. But in standard probability each experiment is independent: if a ball is picked each time and returned to the bag afterwards, the probability of picking the distinct ball equals 1/N every time. In mathematical terms the experiments are independent; it does not matter which balls were picked in previous rounds, which is the same as saying the process lacks memory. In the real world, however, things are different. If one picks balls repeatedly, returning them to the bag each time, and the distinct ball is never picked, then as time goes on one psychologically assigns more and more weight to the distinct ball being picked in the next round. That is because in this experiment memory is included.
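To make the contrast concrete, here is a sketch comparing the standard memoryless draw with a toy memory-weighted draw in which the distinct ball's weight grows each round it is not picked; the particular weight-update rule is purely illustrative, not a claim about how such a probability must be defined:

```python
import random

random.seed(1)
N = 10
distinct = 0   # index of the distinct ball

# Standard probability: each draw is independent, P(distinct) = 1/N always.
standard_p = [1 / N for _ in range(5)]

# Memory-included picking: every round the distinct ball is not drawn,
# its weight is increased before the next round (toy rule: +0.5 per miss).
weights = [1.0] * N
memory_p = []
for _ in range(5):
    memory_p.append(weights[distinct] / sum(weights))
    pick = random.choices(range(N), weights=weights)[0]
    if pick != distinct:
        weights[distinct] += 0.5   # memory of past misses raises its weight
    else:
        weights = [1.0] * N        # reset once the distinct ball is drawn

print("standard:   ", [f"{p:.3f}" for p in standard_p])
print("with memory:", [f"{p:.3f}" for p in memory_p])
```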
It will be explained in later writings that this new type of probability that includes memory (also called wave probability) is closely linked to quantum mechanics and to how the universe functions at the quantum scale.
This article aims to study whether the same ideas can be applied to how the brain assigns probability distributions to different elements in the environment and to different memories; in other words, whether weight assignment in the brain is based on wave probability.