Last week we discussed AI: Artificial Intelligence. This week we were asked to consider music as part of our journey through the theme of cybercultures. As I have been wondering about the term “artificial intelligence,” I have concluded, perhaps later than others, that AI simply refers to intelligence that is created outside the holder of the intellect itself. That intelligence is then somehow inserted into the recipient, activated, and implemented. I know this is not profound, but I have to go through my process here. The real question I have been struggling with is this: how does the robot or cyborg, as an “intelligent” entity, grow?
I looked at clips from a variety of films and other posts submitted by classmates. Most notable, and what I spoke about in our Google Hangout session, were the societal parameters robots/cyborgs will be expected to live by, and whether artificial “beings” will be able not only to mimic human emotions but to understand the subtleties those emotions must take on in given circumstances. Music may be one of the most difficult areas in which to measure intelligent application. Robots, like Data in Star Trek: The Next Generation, can mimic thousands of musicians. The question, however, is can he “feel” the music he is playing? Composers will tell us that music is felt and that it is emotional. Hence we come back to the question of whether robots/cyborgs can truly assimilate emotions or produce them in and of themselves.