A central commonality between culture and subjectivity is that each owes much of its definition to the existence of mere differences in preference or taste, however arbitrary or meaningful those differences may turn out to be.
Postmodernism and behavior
Let us define postmodernism here as emphasizing nurture (cultural or societal influence) over nature (biological influence).
Where do we ground human behavior? Does nature or nurture determine it?
If behavior is granted to be psychological, rather than only physiological, then we must contend with whether we act with agency.
The more agentically we act, the stronger nurture’s role is.
In other words: the more free will we humans possess, the more postmodern we are.
We shouldn’t ignore our biology. But to what extent does it constrain or define what we do?
Genetics predisposes us toward certain behaviors over others. This amounts to tendency, though, not hard determinism.
Similarly, our environments play a role in what we can do, should do, and ultimately do.
What does societal and/or cultural influence look like? Society consists in two or more people who agree (i.e., they enter into a “contract”) on certain axioms for living. These axioms comprise said society’s ethics.
Culture, I have argued, consists in preferred modes of being and doing. Such constitutes our style, or “art” of being. We have “tastes” for and against certain modes of living.
Societally, then, our behavior is governed by our agreed-upon ethics. Culturally, what we do should capitalize on our desired ways of being.
Arguably, society is largely a modern construct. Culture may be granted to be more evolutionarily recent: it is postmodern.
Postmodern living is how we freely choose to cope with facticity (including law and the world of physical objects: what we are “thrown” into the world amidst). Societal living consists in fulfilling our formal roles as social beings, e.g. providing for our families and others.
Postmodernity grants us individual freedom, given that we act sufficiently as responsible social agents.

Reflections on gender vs. sex
I view gender as mostly a cultural construct, though in more mainstream literature (as well as certain legal contexts) it’s regarded as a social one.
The concept of gender is “newer” than its biological analogue (sex) in both science and, seemingly, pop discourse. As such, the collective understanding of the former is less developed in Western society.
As a socio-cultural concept, then, gender is closer to character (a social concept) than to temperament (biological). By definition, the idea of sex becomes more relevant to one’s temperament than to character.

“Are we approaching robotic consciousness?” (video response)
[Original post: http://intjforum.com/showpost.php?p=5159026&postcount=2]
Video:
My thoughts (I’m ‘S’)—
Video: King’s Wise Men/NAO demonstration
S: Impressive. I’d be interested to see where the researchers involved take this from here.
Prof. Bringsjord: “By passing many tests of this kind—however narrow—robots will build up and collect a repertoire of abilities that start to become useful when put together.”
S: He didn’t touch on how said abilities could be “added up” into a potential singular (human-like) robot. Though Bringsjord doesn’t explicitly seem to be committing to such, the video’s narrator himself claims to see it as “much like a child learning individual lessons about its actual existence” and then “putting what it learns all together”; this is the same sort of reductionist optimism that drove and characterized artificial intelligence’s first few decades of work (before the field realized its understandings of mind and humanness were sorely wanting). So the narrator (verbally, at least) endorses the view that robotic abilities can be added up within a single unit, a view I have yet to see proof of, or sufficient reason to be confident in.
(Being light, for a moment: Bringsjord could well have a capitalistic, division-of-labor sort of robotic-societal scenario ready-at-mind in espousing statements like this…)
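For concreteness, here is a minimal Python sketch of the sort of inference the King’s Wise Men/NAO demonstration is popularly reported to involve: a robot that cannot tell which “pill” it received tries to say so, hears its own voice, and updates. The Robot class, the silenced flag, and the hear_self step are illustrative assumptions of mine, not the researchers’ actual setup or code; the point is only how narrow and pre-scripted the “self-awareness” being tested can look.

```python
# Toy sketch of the King's Wise Men / NAO self-knowledge inference
# (hypothetical reconstruction; not the researchers' actual code).

class Robot:
    def __init__(self, name, silenced):
        self.name = name
        self.silenced = silenced        # given the "dumbing pill": cannot vocalize
        self.knows_own_status = False

    def try_to_speak(self, utterance):
        """Attempt to vocalize; a silenced robot produces no sound."""
        return None if self.silenced else utterance

    def hear_self(self, sound):
        """Hearing its own voice lets the robot infer it was not silenced."""
        if sound is not None:
            self.knows_own_status = True


def run_demo():
    robots = [Robot("R1", silenced=True),
              Robot("R2", silenced=True),
              Robot("R3", silenced=False)]

    for r in robots:
        # Initially, no robot can tell which pill it received.
        sound = r.try_to_speak("I don't know which pill I received.")
        r.hear_self(sound)              # the self-monitoring step
        if r.knows_own_status:
            print(f"{r.name}: Sorry, I know now; I was able to speak, "
                  f"so I did not receive the dumbing pill.")


if __name__ == "__main__":
    run_demo()
```

Framed this way, the robot’s “pass” reduces to a single pre-specified update rule triggered by hearing its own output, which is part of why I’m skeptical that such narrow abilities straightforwardly add up to anything human-like.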
Narrator: “The robot talked about in this video is not the first robot to seem to display a sense of self.”
S: ‘Self’ is a much trickier and more abstract notion to handle, especially in this context. No one in the video defines it, or tries to say whether or how it’s related to sentience or consciousness (defined in the two ways the narrator points to), and few philosophers and psychologists have done a good job with it as of yet, either. See Stan Klein’s work for the best modern treatment of self that I’ve yet come across.
Video: Guy with the synthetic brain
S: Huh…alright, neat, provided that’s actually real. Sort of creepy (uncanny valley, anyone?), but at least he can talk Descartes…not that I know why anyone would usefully care to do so, mind, at this specific point in cognitive science’s trajectory.
Dr. Hart: “The idea requires that there is something beyond the physical mechanisms of thought that experiences the sunrise, which robots would lack.”
S: Well, yeah: the “physical mechanisms of thought” don’t equal the whole, sum-total experiencer. Also, I’m not sure what he means by something being “beyond” the physical mechanisms of thought…sort of hits my ears as naive dualism, though that might only be me tripping on semantics.
Prof. Hart (?): “The ability of any entity to have subjective perceptual experiences…is distinct from other aspects of the mind, such as consciousness, creativity, intelligence, or self-awareness.”
S: Not much of a fan of treating creativity and intelligence as “aspects of the mind”; same goes for consciousness, for hopefully more obvious reasons. Maurice Merleau-Ponty is the one to look into with respect to “subjective perceptual experiences”, specifically his Phenomenology of Perception.
Narrator: “No artificial object has sentience.”
S: Well, naturally it’s hard to say w/r/t their status of having/not having “subjective perceptual experience”, but feelings are currently being worked on in the subfield of affective computing. (There’s still much work re. emotion to be done in psychology and the harder sciences before said subfield can *really* be considered in the context of robotic sentience, though.)
Narrator: “Sentience is the only aspect of consciousness that cannot be explained…many [scientists] go as far as to say it will never be explained by science.”
S: They may think so, and perhaps for good philosophical reasons; but that won’t stop, and indeed isn’t stopping, some researchers from trying.
Narrator: “Before this [NAO robot speaking in the King’s Wise Men], nobody knew if robots could ever be aware of themselves; and this experiment proves that they can be.”
S: “Aware of themselves” again leads to the problem briefly alluded to above, regarding the philosophical and scientific impoverishment of the notion of ‘self’. I know what the narrator is attempting to get at, but I still believe the point deserves pushing.
Narrator: “The question should be, ‘Are we nearing robotic phenomenological consciousness?’”
S: Yep! And indeed, you have people like Hubert Dreyfus arguing for “Heideggerian AI” as a remedy for AI’s current inability to exhibit “everyday coping”, i.e. operating with general intelligence and situational adaptability in the world (or “being-in-the-world”, a la Heidegger).
In cognitive science terms, this basically boils down to the main idea underlying embodied cognition, a big move away from the old Cartesian or “representational” view of mind.
Narrator: “When you take away the social constructs, categories, and classes that we all define ourselves and each other by, and just purely looking at what we are as humans and how incredibly complex we are as beings, and how remarkably well we function in a way, actually really amazing, and kind of beautiful, too…so smile, because being human means that you’re an incredible piece of work.”
S: I wish the narrator had forgone the cheesy-but-necessary-for-his-documenting-purposes part about humans’ beauty and complexity, in favor of going a bit further into the obviously difficult and tricky territory of the “social constructs, categories, and classes” that we “all define ourselves and each other by”, and how near or far robots can be said to be from having truly human-like socio-cultural sensibilities and competencies.