You could also call neural networks "Brainstorms" -Courtesy of ideachampions.com-
In the text these networks are presented as two opposing viewpoints, but while I do acknowledge that semantic information networks can stand on their own, parallel distributed (connectionist) models seem to take cues from the semantic network and work in conjunction with it. While semantic models of information distribution work in a strictly logical fashion (such as one would find in an encyclopedia or thesaurus), any other model of information processing will incorporate episodic memory into the network of connections. Human memory and computer A.I. function similarly in this sense. While it might be logical, and part of a semantic network, to connect the terms 'monkey' and 'banana' to one another, it would require a personal (episodic) experience to make a connection between the terms 'Lamborghini' and 'banana'. Depending on the strength of that neural link, 'Lamborghini' and 'banana' may become more closely and more readily associated than the semantic 'monkey'-'banana' link. You need look no further than the internet-accessible A.I. Cleverbot to see this kind of network in action. Ask it a series of rational or logical questions that should have "straightforward" answers and you may find it occasionally gets one wrong. When this happens, it is because it has been taught those answers by the outside world; the connection to the original semantic knowledge has become more distant from its source than the newer, 'episodic' knowledge.
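To make the idea of link strength concrete, here is a minimal sketch of an associative network with weighted connections, written in Python. The concept names, starting weights, and retrieval rule are all my own illustrative assumptions, not a model taken from the textbook or from Cleverbot itself; the point is only to show how a repeatedly reinforced episodic link ('Lamborghini'-'banana') can end up more accessible than a baseline semantic link ('monkey'-'banana').

```python
from collections import defaultdict


class AssociativeNetwork:
    """Toy associative network: concepts joined by weighted links."""

    def __init__(self):
        # adjacency map: concept -> {neighbor: link strength}
        self.links = defaultdict(dict)

    def associate(self, a, b, strength=0.1):
        """Create or strengthen a bidirectional link between two concepts."""
        for x, y in ((a, b), (b, a)):
            self.links[x][y] = self.links[x].get(y, 0.0) + strength

    def recall(self, cue):
        """Return the cue's neighbors, strongest (most accessible) first."""
        return sorted(self.links[cue].items(), key=lambda kv: kv[1], reverse=True)


net = AssociativeNetwork()

# Semantic (encyclopedia-style) knowledge: a modest baseline link.
net.associate("monkey", "banana", strength=0.3)

# A repeated personal (episodic) experience keeps strengthening its link.
for _ in range(5):
    net.associate("Lamborghini", "banana", strength=0.2)

print(net.recall("banana"))
# [('Lamborghini', 1.0), ('monkey', 0.3)] -- the episodic link now comes up first.
```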
This picture has nothing to do with anything. I just like it... I also want to see if it sparks any implicit memory for my professor. ('answer' in the next section, but it won't be implicit anymore once I have given you the cue to trigger the memory.) -Courtesy of abcpastor.wordpress.com-
Very clear early memory. Now you know more about the theory behind the how and why of the experience. Thoughtful consideration of the other material as well. Lots of "grist for the mill" with respect to opportunities for further introspection.