Tuesday, May 28, 2019

Queer Object T4T LAB 2019

T4T LAB 2019 Texas A&M University. Invited Professor: Joris Putteneers.
Team: Nicholas Houser, James Hardt, Nathan Gonzalez, Daniel Wang.

Our project is discussed in terms of simulation software and the algorithmic processing of objects as data across several contextual territories. The Queer Object is this fluctuation, this ability to transcend a series of mediums: raw, cooked, and synthetic.

The queer object references contextual territory by pointing at it but never defining it, through the Derridean notion of indifferance. The algorithm, interfacing within itself, shows this indifferance: moments where the algorithm does not necessarily decide what to do but rather simply does. The algorithm establishes its understanding in the same way, through a series of interfacings with contextual objects.

The first relationship is territorial: an epistemological quest of looking for and pointing at, through the notion of artifacting across the raw, cooked, and synthetic. The algorithm operates as the smooth territory itself, while the contextual territory operates as the vehicle for striation. The algorithm is not bound to a territory but places itself through a series of referential objects that can be artifacted from the contextual territory.

The algorithm can exist in all forms of territory because it is always pointing at these moments but never actually defining them. The territory is defined the moment the context is referenced and artifacted.
The act of indifference and the vicarious causation of referenced objects cause the contextual territory to become the vehicle of striation.

Raw, Cooked, Synthetic  
Each territory represents the nature of the Raw, the Cooked, and the Synthetic. The artifacts from each territory are the results of incomplete understandings of contextual objects due to indifferance.

The Raw is a proto territory that yields an ancestral algorithm. Since the algorithm interprets numerical data, the artifacting process consists of a series of photogrammetric scans of objects found within the proto environment, which are then translated into a cartesian coordinate system. The data was then run through the scripting of the algorithm itself to be remade in its own image.
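The artifacting step above can be sketched as a small program. This is a minimal illustration, not the team's actual script: a photogrammetric scan is treated as raw numerical data, a cloud of (x, y, z) points, and quantized onto a cartesian grid. The point values and cell size are invented for the example; the deliberate loss of duplicate points stands in for the "incomplete understanding" the text describes.

```python
def to_cartesian_grid(points, cell=1.0):
    """Quantize a photogrammetric point cloud onto a cartesian grid,
    discarding duplicates -- an incomplete understanding of the object."""
    seen = set()
    artifact = []
    for x, y, z in points:
        key = (round(x / cell), round(y / cell), round(z / cell))
        if key not in seen:  # the algorithm does not revisit a grid cell
            seen.add(key)
            artifact.append(key)
    return artifact

# A toy scan of an object found in the proto environment (placeholder values):
scan = [(0.2, 0.1, 0.0), (0.3, 0.2, 0.1), (1.7, 0.9, 0.4)]
print(to_cartesian_grid(scan))  # → [(0, 0, 0), (2, 1, 0)]
```

Two nearby scan points collapse into one grid cell, so the artifact carries less information than the scan that produced it.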

The Cooked is a territory that contains a series of objects created by CAD. This denotes ideas of construction, technicality, and complete control of Euclidean geometry. These objects are easier for the algorithm to synthesize because of the shared language of numerical data, causing an overlap in data and resulting in the kitbashing of referenced objects. The data was then run through the algorithm to be remade in its own image.

The Synthetic is a hyperreal territory. The hyperreal is a system of simulation software simulating itself. This territory exists only within the digital environment. The objects that exist within it are produced digitally and are meant to act only in the digital realm. Since the algorithm does not care to observe every data point of an object, the artifacts created are the results of the jumbling of data points and the creation of simulated simulation. The hyperreality is the terminal stage of simulation.

Maturity & Queerness    
Since the algorithm is in a constant state of flux, both in posture and territoriality, maturity and senescence can be derived. This duration allows the algorithm to continue the addition of artifacts.

Due to the constant referencing of objects and the striation of the algorithm through them, queerness emerges. The position of queerness can be seen in the vector density drawing, the section cuts, and the geometrical resolution. As the algorithm continues its maturation, the flux can be seen in the changing density of the framework of vector trails. The vector trails are the foundation of the generative nature of the project's workflow; as the overlap of trails unravels, the substance that can be territorialized by the artifacts is depleted.
The resolution of the geometrical framework is also at the will of the algorithm's maturation. Areas can now be identified not only in terms of the geometrical relations between artifact and algorithm, but in terms of geometrical resolution and subdivision.

The flux of artifacts begins to divide and disassemble the area in which the algorithm can operate. The addition of these artifacts creates new spaces while dividing others. Through the senescence of the algorithm and the addition of artifacts, the posture of the algorithm changes as it matures. Upon referencing the contextual territory under constant acts of striation, the queer object is exhausted. The duration of the algorithm can be seen through the decay and dismemberment of the original algorithm and the growth of invasive artifacts. The duration does not compromise the substance of the object, because it is an issue of kind and not of the degree to which the posture is lost.

The algorithm does not need to exist in our current reality; rather, we are laying the framework for the next reality it will occupy, the architecture of the hyperreal. For what is the difference from the architecture of the a priori world of our own reality?

The Hyperreal is the terminal stage of the simulation, because the hyperreal is indistinguishable from reality, and thus the process repeats from the raw to the cooked to the synthetic, recursively.

The algorithmic exhaustion, the theory, and the queerness are all connected through the results of simulation software. The essence of simulation is queer. The aspects of flux that are produced, in terms of fluidity, change, and indefinability, relate directly back to queer theory.

The workflow of this project showcases machine vision and simulation software applied to all principles of territory, section, elevation, program, and aesthetics. All are adaptive to different variables that can alter both the simulation program and the algorithm, which ultimately leads to the absolute exhaustion of the workflow, ending the life of the queer object and the simulation.

Saturday, May 25, 2019

Ornament as Crime T4T LAB 2019

T4T LAB 2019 Texas A&M University. Invited Professor: Joris Putteneers.
Team: Anna Cook, Courtney Ward, Francisco Anaya, Benjamin Hergert, Luis Rubio


This project speculates on a prison in the post-singularity era. The prison is occupied by both humans and AI who have committed cyber crimes, and is governed by a council drawn from both species. As a prisoner is admitted to this center, they are interviewed and assessed based on the severity of their crimes and their initial degree of contrition. Once this information has been obtained, they are sent to a specific chamber of the prison and exposed to a customized VR simulation. This reflects Foucault's assertion in Discipline and Punish that the prison begins to operate under the same typology as the factory or school, where one is subjected to the normalizing gaze. In creating this new prison typology, we are reinterpreting Foucault's anthropocentric basis to fit the conditions of a post-singularity and post-anthropocentric society, where the effects reach both human and AI.

This new ontology of cyber crimes is resolved through the progression within the prison from admittance to reintegration into society. This process is expedited as information gained through big data is synthesized into virtual reality with a post-human level of efficiency. A prisoner must pass through a series of chambers and experience parts of the simulations taking place in them before arriving at their own customized VR simulation. Because of the nature of VR, the prisoner would feel as though they have been incarcerated for a long period of time, when in reality only a few hours have passed. This echoes Foucault's analysis that the current prison system has begun to shift from punishment of the body to punishment centered on the mind and the intent to commit crime. In each simulation, the prisoner is unaware that their environment is not real. The simulation corresponds to the crime committed, as a way for the convict to realize the severity of their actions. The punishment acts as rehabilitation in a new application of neuroplasticity, in which the minds of both humans and AI are rewired to break the connections of criminal behavior and instead reinforce “proper” avenues of thought. This is completed through the VR simulation, where PTSD is prevented by forcing the prisoners to confront these traumas.

This model is based in cities and can be implemented in multiple locations throughout the world as needed. Each model will be tethered to the city but float above it, acting as both a panopticon and a reflection of crime rates within the city. The wires that tether the object act as data collection structures and can grow and stretch as needed to better absorb information from points throughout the city. The physicality of these tethers is based on the psychological phenomenon known as the Hawthorne effect, in which individuals modify an aspect of their behavior in response to their awareness of being observed. The algorithm acts not only as habitable space defining each chamber, but also as a signal jammer blocking unauthorized communications from entering or leaving the center. Because of the algorithm's computational nature, its form fluctuates based on the density of prisoners within.
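The density-driven fluctuation described above can be sketched as a toy rule. This is speculative illustration only: the base volume, per-prisoner increment, and the linear scaling rule are invented placeholders, not values from the project.

```python
def chamber_volume(prisoners, base=100.0, per_prisoner=12.5):
    """Habitable chamber volume (m^3) fluctuates with occupancy.
    The linear rule and constants are assumptions of this sketch."""
    return base + per_prisoner * prisoners

# An empty chamber sits at its base volume; form expands as density grows:
print(chamber_volume(0))   # → 100.0
print(chamber_volume(8))   # → 200.0
```

Any monotone function of occupancy would serve the same diagrammatic purpose; the point is only that form is a dependent variable of the prison population.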

The aesthetic agency is realized through an interpretation of ornament through VR generation. This is articulated through the appearance of sculpted surfaces organized into a hierarchy of elements read as a continuous whole. This ornamental evolution follows a grotesque interpretation in which fear and awe are intertwined through the asymmetrical expression of over-exaggerated repetitive elements. In this manner, the influx of crime data gathered from the city generates further ornamentation. This ornamentation shifts past notions of the baroque and rococo and begins to define its own style, moving forward to operate in the post-anthropocene. These concepts are represented through the form of a narrative collage, in which the progression through the experience of the prison is displayed in a digital reinterpretation of the collage. The collaged images become something else: a new form, neither representative of nor derivative of the original architecture, that seeks to further dilute reality.

The process of designing the prison is critical to the understanding of its operation as we move past the epistemological and move into the ontological. We have moved past the idea of “becoming digital” with design operations being performed in the real world using analog methods, concepts, and tools such as the mouse. Now we function completely digitally in the VR simulation where we are pulling from digital information and generating form through a post-process method of sculpting that operates outside of the bounds of physical and human limitations. In this way, the prison is created and exists within a new reality that doesn’t acknowledge its own existence as being digital. This demonstrates the effectiveness of VR as a new methodology for the generation of both form and concept, existing simultaneously.

This produces a program that blurs the lines between reality and simulation through strategies of manipulating time and space, in an effort to change societal perceptions of the purpose of prisons. This progresses past Foucault's analysis of the treatment of prisoners and the effects of their separation from society by providing a solution in the form of a post-heterotopic existence: an in-between space that acts not only to alter an individual criminal, but to repurpose the influence of the prison on society.

Monday, May 20, 2019

A.R.K. T4T LAB 2019

T4T LAB 2019 Texas A&M University. Invited Professor: Joris Putteneers.
Team: Austin Madrigale, Michael Marroquin, Esteban Armenta, Karen Cardenas


ARK is a sanctuary of objects speculating on a new ecosystem of accessing, displaying, processing, and preserving in a post-singular context, in an endeavor to ensure that digital information of continuing value remains accessible and usable. This archive addresses algorithmic operations sectionally, on three levels: the architectural system as a whole, the curatorial process, and the object as a tool in how it relates to the system. Programmatically, objects are displayed in a gallery and garden, processed within an archival chamber, and finally laid to rest within the catacombs for deep storage and preservation. This process of preservation is cyclical, allowing for a reduction of time in an object's "lifespan." The objects are not only preserved but repurposed, through data collection and sequencing, to produce new objects.

This system utilizes algorithmic processes working at three scales:
1)    The first level is the ecosystem as a whole. The gardens above ground are landmarked by follies outputted by the machine's determined ideals of the picturesque, rejecting Kantian perspectives of the beautiful and sublime in order to inject its own aesthetic. A gallery displays an object's "hardware," accessed by a subject and curated entirely at the machine's discretion.

Below the gardens lies the liminal space of the archival chambers, where an object's data is collected and processed in its transition to becoming digitally preserved,

and finally to storage within the catacombs, where the object's remains are stored in "folders" within the poche. Due to the cyclical nature of this ecosystem, objects considered significant at the machine's discretion have their data outputted and sent to the gardens to be displayed, while objects deemed insignificant (for example, duplicates) are sent to the incinerator to be deleted.

2)    The next level of algorithmic processing is curation. There is a step-by-step process in the object's journey from arrival to storage that takes place within the archival chamber. This process is aided by artificially intelligent tools dubbed The Curators, charged with classifying, appraising, sorting, collecting, and interpreting object data through photogrammetric scanning, material extraction, and cataloguing for digital and physical preservation.

In The Second Digital Turn, Mario Carpo describes the new technological advancements in classification and sorting that have allowed data to be collected at scale. Technology giants such as Google and Amazon allow people to sort information or files into a specific order, but machines are able to do the job at a more efficient rate. From today's big-data perspective, it is easy to see that classifications also function by structuring and formalizing a random stock of information, providing speedier access to all data in an inventory by way of indexing. In Amazon warehouses, objects are not sorted by subject or category, but only by frequency of sale, following an order that would be meaningless to humankind. This model of machine classification is seen within the chambers of our architecture. The Curators define their own system of classification, opaque to human understanding. This leads to a developed library of both metadata and material data, in order to create an accurate rendering of authenticated content over time.
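The machine-classification model described above can be illustrated in a few lines: items are shelved not by human category but by how often they are retrieved, producing an order that looks arbitrary to a human browsing by subject. The item names, the access log, and the nearest-first shelving rule are all invented for this sketch.

```python
from collections import Counter

# A hypothetical retrieval log and inventory (placeholder data):
access_log = ["vase", "disk", "vase", "chair", "vase", "disk"]
inventory = ["chair", "disk", "vase", "lamp"]

freq = Counter(access_log)

# Most-retrieved objects are stored closest to the retrieval point,
# regardless of what kind of object they are:
shelved = sorted(inventory, key=lambda obj: -freq[obj])
print(shelved)  # → ['vase', 'disk', 'chair', 'lamp']
```

Nothing in the resulting order reflects subject or category; the index alone makes the inventory navigable, which is the point of Carpo's observation.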

3)    At the smallest scale, the algorithm functions as a tool: a collection of objects repurposed as a difficult whole. Each "tool" is made from the "hardware" of other typologies. After a certain critical mass is outputted, when the tool has enough information to be of use, it begins to aid in the curatorial process of the system. The result of these parts-to-whole relations is a completely new typology: through aggregation, delamination, bashing, growth, and decontextualization, the tool becomes a plane of objects, allowing for the transtemporality of objects. The pieces are not fused or merged; each retains its own objectual qualities. The curator itself is not self-sufficient, but rather becomes a product of its environment, a time capsule of collected and preserved objects. The object uses collected data to render new outcomes such as objects, spaces, and tools. The tools are no longer parts of the system; they are the system now.

This project sorts into two categories: a machine's data-processing output and a human subject's understanding of that output. While each is independent of the other, each can be used as a resource for the other in this ecology. The system learns from human data while the human may also learn from the data outputs of the system, creating a symbiotic relationship between system and human without any interdependence. This archive produces interpretations of architecture, and of human and machine interaction, through new means of object collection and data processing.

Wednesday, May 15, 2019

Dump Vestige T4T LAB 2019

T4T LAB 2019 Texas A&M University. Invited Professor: Joris Putteneers.
Team: Emily Majors, Cynthia Castro, Jeannelle Fernandez, Alejandra Valdovinos


As Timothy Morton asserts, “the fantasy we have regarding trash lies in that it disappears [and] dissolves.” In the United States alone, humans generate trash at a rate of 4.6 pounds per person per day, which translates to 251 million tons per year. As a result, greenhouse gas production is increasing, and studies indicate that the earth will become uninhabitable for human life by the year 2060. Hundreds of species rely on organic waste produced by human activity for survival. This raises the question: how can an ecosystem dependent on the production of human waste, such as the garbage dump, survive without humans available to generate input?
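The two figures quoted above are consistent with each other, as a back-of-the-envelope check shows. Assuming a US population of roughly 300 million (an assumption of this sketch, not stated in the text):

```python
lbs_per_person_per_day = 4.6
population = 300_000_000          # assumed US population

lbs_per_year = lbs_per_person_per_day * 365 * population
tons_per_year = lbs_per_year / 2000  # short tons (2,000 lb each)

print(round(tons_per_year / 1e6))  # → 252 (million tons, ≈ the 251 quoted)
```

The small gap from 251 comes from the rounded population figure; the per-person rate and annual total plainly describe the same quantity.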

This machine uses big data collected from digital waste and physical waste in order to optimize dump emissions, with the intent of sustaining both the earth and the ecology of the rubbish dump, privileging the dump's agenda to preserve itself in the case of human extinction through a process of machine learning and synthetic trash manufacturing.

Occupying the territory of the dump, the self-generating structure operates cyclically, fluctuating, expanding, and contracting over time as more garbage accumulates and system optimization occurs. The cycle begins with the insertion of an algorithmic primitive that collects, learns, and expands until it begins phases of consolidation and optimization. The cycle begins again as the machine updates and refines its understanding of the dump.

The machine determines the desired composition and form for optimized trash based on a gained understanding of the chemical composition of trash required for a positive impact on the ecosystem. The physical collection mechanism is interested in collecting samples of organic material and in rescuing lost data found in e-waste material such as computers, hard drives, mobile devices, etc. The machine combines on-site collection and observation techniques with its access to digital waste found in the cloud to better process garbage input.

Although the preservation of human life is not the machine’s intent, the machine’s ability to produce optimized waste that could fertilize soil, purify water, or counter the effects of carbon emissions could potentially postpone human extinction. Human extinction or not, the machine is primarily concerned with self-preservation through optimized synthetic trash manufacturing.

The machine is not pushing any received aesthetic agenda. It derives its aesthetic regime from its own assimilation of how it becomes a part of its ecology, acquires big data, and produces as needed. It establishes a completely new aesthetic regime based on the algorithm that big data allowed it to produce, assimilating no known aesthetic. The media exhibited in the presentation represent our speculation on the qualities of the machine's aesthetic at all scales: large, in elevation and plan; smaller, in interior and exterior machine detailing. The smallest scale of speculation can be observed in our photography and film studies.