
They say you are what you eat, and today this is truer than ever: our consumption and presence in the digital space actively construct our identity. Erving Goffman, in The Presentation of Self in Everyday Life (1956), points to the performativity of identity across social contexts. With AI, the performance is now choreographed by sophisticated algorithms optimized for engagement rather than authenticity, nudging our interests in specific directions. The 'algorithmic self' that emerges from such interactions is a product of the 'gaze' of the algorithm, which, drawing on its vast collections of our data – past behaviors, interactions, and preferences – dictates which aspects of our identity are amplified, minimized, or erased. The AI-mediated experience thus becomes 'uncanny': caught between the real and the hyperreal, familiar yet strange, controlled yet out of control.
We are categorized and assigned meaning through the algorithmic gaze. Unlike the human gaze, which is subjective and interpretive, this gaze is statistical, reductive, and indifferent to nuance. Identity in machine logic is whatever is quantifiable – race, gender, or engagement metrics. AI imposes a rigid, mathematical ontology onto the fluid nature of human identity. For instance, social media platforms sort users into predefined gender labels, so non-binary identities are ignored. Similarly, algorithmic recruitment filters candidates through keyword-driven CV scanning, reducing professional worth to quantifiable metrics rather than holistic potential. Humans are reduced to patterns and data points, to commodities to be exploited. But as with any human invention, AI is limited. If one doesn't fit its existing categories, or it fails to recognize them, does one, in its logic, exist? "Gender Shades", a study by Joy Buolamwini and Timnit Gebru at MIT, found that commercial facial-analysis systems misclassified darker-skinned faces, particularly those of women, at far higher rates than lighter-skinned ones, reinforcing biases that can amount to erasure from digital spaces. This algorithmic exclusion reflects deeper philosophical concerns about inclusion and egalitarian development.
Phenomenologically, presence is fundamental to identity; our consciousness is always directed towards something, shaping how we perceive and experience reality. The algorithm flattens this directedness. We are not 'present' in the digital space, yet we carry an 'assigned' identity there; our conscious experience is dictated by the algorithm, and we can exercise little agency over our own perception. Identity is reduced to data points tailored to machine logic, stripped of its depth and reality. Our 'feeds', from regular content to ads, function as scripts not just for who we are but for what we are expected to be. This digital identification has real-world implications. Algorithms optimized for engagement push forward content we are more likely to consume, creating 'echo chambers' and endless feedback loops. Exposure to diverse opinions shrinks, identity forms in a limited, unidirectional manner, and the individual ends up as a simulacrum.
This forces us to ask: when identity is sculpted by AI, do we shape ourselves, or are we curated performances optimized for machine logic?
Edited by Sreepriya Ramesh
