Tag: data

  • Reading Before Prompting

    Reading Before Prompting

    Step aside Virtual Reality and Blockchain, Artificial Intelligence is now the king of hype. As ever, Neil Leach provides a thought-provoking lecture in which he summarises AI in Architecture, available through the Digital Learning Futures YouTube page.

    On the back of my PhD thesis about architecture and data, I’m currently fighting the Instagram FOMO and the urge to become part of a new wave of “prompt engineers” using generative tools built on Generative Adversarial Networks (GANs) and diffusion models, such as Midjourney, DALL-E or Dream Studio. While the prompt tools are a lot of fun, it remains to be seen whether they will result in new types of real-world material forms, as opposed to speculative pixel-based images. Before I join the prompt hype, I’m aiming to read the literature mentioned in Leach’s lecture. I have included a list below to whet your appetite.

    AI in Architecture and Design

    Bernstein, P. (2022). Machine Learning: Architecture in the Age of Artificial Intelligence. United Kingdom: RIBA Publishing.

    Carta, S. (2022). Machine Learning and the City: Applications in Architecture and Urban Design. United Kingdom: Wiley.

    Chaillou, S. (2022). Artificial Intelligence and Architecture: From Research to Practice. Germany: Walter de Gruyter GmbH.

    del Campo, M. (n.d.). Neural Architecture: Design and Artificial Intelligence. United States: Oro Editions.

    As, I., Basu, P., Talwar, P. (2022). Artificial Intelligence in Urban Planning and Design: Technologies, Implementation, and Impacts. Netherlands: Elsevier Science.

    Leach, N. (2021). Architecture in the Age of Artificial Intelligence: An Introduction to AI for Architects. United Kingdom: Bloomsbury Publishing.

    Leach, N., del Campo, M. (Eds.) (2022). Machine Hallucinations: Architecture and Artificial Intelligence. United Kingdom: Wiley.

    Manovich, L. (2018). AI Aesthetics. Russia: Strelka Press.

    AI and Society

    Clark, A. (2003). Natural-born Cyborgs: Minds, Technologies, and the Future of Human Intelligence. United Kingdom: Oxford University Press.

    Harari, Y. N. (2017). Homo Deus: A Brief History of Tomorrow. United States: HarperCollins.

    Hayles, N. K. (1999). How We Became Posthuman: Virtual Bodies in Cybernetics, Literature, and Informatics. Chicago: University of Chicago Press.

    Mitchell, M. (2020). Artificial Intelligence: A Guide for Thinking Humans. United States: Picador.

    Tegmark, M. (2017). Life 3.0: Being Human in the Age of Artificial Intelligence. United Kingdom: Penguin Books Limited.

  • The Architecture of Data Surveillance

    The Architecture of Data Surveillance

    My research investigates the influence of data and related technologies on architectural culture. I’m intrigued by how our methods of measuring the world shape our understanding of reality and how this understanding impacts our actions that, in turn, reshape that reality. The built environment is saturated with data. While buildings might not yet directly engage with data technologies, they support social activities at the service of data production and circulation. This article examines how data surveillance, which is migrating from online to offline spaces, is beginning to impact the role of architecture and its physical products.

    Architecture’s Response

    To grasp the impact of data surveillance, we first need to comprehend what architecture is and how it intersects with data and technology. This begins with understanding the general role of architecture. Buildings, designed with architectural thought, are complex arrangements of objects and spaces. These spaces – both physical and psychological – provide environments for people to live, work, and thrive. Architecture is crucial in shaping social interactions. It does this by creating spaces that foster the exchange of ideas and beliefs, helping to shape our culture. It also communicates visually, through building exteriors and internal spaces. In doing so, architecture uses materials to reflect culture, while also setting the stage for its evolution. Architects, to different extents, consider the existing context in their designs to understand prevalent beliefs and values. Through its spatial arrangements, architecture serves broader social, economic, political, and cultural needs.

    Data subservient to material

    Architecture, which shapes how people move and interact, also influences how people connect with digital networks. This network interaction suggests a new understanding of architecture as the physical space where digital interactions occur, from social chat to finding information and even buying products online. Digital bits, or machine data, set the backdrop for these interactions and help shape people’s decisions. Take a look at private spaces like shopping centres and airports – they’re already designed with this ‘data-influenced’ architecture. Shoshana Zuboff, in her book ‘The Age of Surveillance Capitalism’ (Zuboff, 2019), explores how online tracking now impacts our physical world. She emphasises that digital platforms, which make money from excess data, have started influencing physical spaces. The ‘Smart City’ movement explores data-rich urban development, but generally, these technical projects view architecture and data as separate; inert buildings produce space, while data and analysis provide information. But this is changing. Zuboff uses the example of Sidewalk Labs’ project in Toronto to illustrate how urban planning and design decisions are starting to be shaped by data infrastructure as much as physical space. When precincts aim to maximise data collection for civic and commercial use, infrastructure influences public space design and becomes crucial in architectural plans.

    Toronto Quayside

    The urban plan for Toronto Quayside aimed to use information about the environment to organise its precincts. The approach of Sidewalk Labs was to utilise digital sensors to monitor sound, air quality, light levels, and humidity to determine the most appropriate activities for each area. These activities could be commercial, residential, civic, or industrial. At the root of this approach was an intention to optimise property value and investment returns. The key architectural concept borrowed to enable this optimisation was adaptive space, an approach from the cybernetic 1960s that imagined architecture as a constantly reconfigurable material system.
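
    To make the sensing-to-zoning logic concrete, here is a minimal sketch of the kind of scoring such an approach implies. The sensor names, thresholds and use categories below are illustrative assumptions of mine, not Sidewalk Labs’ actual model.

    ```js
    // Hypothetical sketch: rank candidate uses for a precinct from sensor readings.
    // Thresholds and categories are illustrative assumptions, not Sidewalk Labs' model.
    const readings = { noiseDb: 62, airQualityIndex: 38, daylightLux: 12000 };

    const candidateUses = {
      residential: r => (r.noiseDb < 55 ? 1 : 0) + (r.airQualityIndex < 50 ? 1 : 0),
      commercial:  r => (r.daylightLux > 8000 ? 1 : 0) + (r.noiseDb < 70 ? 1 : 0),
      civic:       r => (r.airQualityIndex < 60 ? 1 : 0) + (r.daylightLux > 5000 ? 1 : 0),
    };

    // The "most appropriate" activity is simply the highest-scoring use.
    const ranked = Object.entries(candidateUses)
      .map(([use, score]) => [use, score(readings)])
      .sort((a, b) => b[1] - a[1]);

    console.log(ranked); // e.g. [["commercial", 2], ["civic", 2], ["residential", 1]]
    ```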

    The images from Sidewalk Labs offer a glimpse of this adaptive aim. The early renders from Michael Green Architects suggest a civic street space organised around temporary furniture and structures that were easily removed or relocated. The images suggest an architecture reduced to a stable and static structural grid obeying the reconfigurable spatial logic of a shopping mall. From a property viewpoint, the concept makes sense. If a business isn’t getting enough attention, it can be readily replaced by one that might make more money. However, data infrastructure increases the speed of this change, suddenly altering the spatial purpose and experience. This change may be easy for those who are conditioned to the rapid change of digital society, but it potentially alienates those who are not digital natives.

    Buildings do not exist forever, but they do serve as a register of time that reflects culture. When buildings reconfigure around data, they express a new sense of time based on information flow. If data is the new backdrop to the urban, surplus becomes the new desire and shapes new expectations for buildings. What happens when a building is reduced to a structure, and its social, cultural and political influence is redistributed to an interior reconfiguration?

    Toronto Quayside — Early concept by Michael Green Architects

    Material subservient to data

    The role of architecture goes beyond being just a structural grid; it’s crucial to consider the wider effects this reduction could have on buildings and urban spaces. Imagine a city where architecture is guided not by human needs, but by data flow—where movements within the city are dictated by the potential value of behavioural information. This approach doesn’t only affect the physical space; it could also reshape our cultural understanding of the city.

    Think about a restaurant in Toronto’s Quayside area. Normally, such a place would make financial decisions based on the lived experiences and social interactions of the street—things like the food culture and daily social use by office workers during the week or families on Sundays. But if these decisions start to be guided by data collected by a landlord or food delivery platform, things change. Small patterns of supply and demand start to influence these financial decisions. This detailed information can be good for business, but it creates two different realities—one shaped by culture and the other by data. As rents increase to reflect commercial potential, those who reflect the cultural life of the street could soon find their rent unaffordable.

    What happens to the life of an urban street if usage changes and human routines are disrupted by this micromanagement of commercial interactions? If culture can’t keep up with the literal view provided by data, does the city stop organising itself around social information?

    It’s not hard to think that city spaces could become available only to those with access to data, serving private interests over the public’s. Some argue that Sidewalk Labs was aiming to create new revenue streams for urban development. While these streams would make a development commercially viable, the focus on data surveillance worryingly risked turning civic space into a profit instrument. In the case of Toronto’s Quayside, this suggested the physical would become secondary to the data.

    Present-day commercial development is financed through real estate value, where the quality of human space indexes to sales or rental returns. When urban space is developed around the value of data, spatial quality is no longer a financial concern, and the focus shifts to technology. Urban space becomes technically optimised rather than designed. This approach also introduces a new framework for measuring success, one based on metrics that align with information flow rather than human experience.

    Consequences for architecture

    The example of Toronto’s Quayside acts as a canary in the coal mine for a future we are heading towards, one where data becomes the logic for urban development. As data monitoring becomes part of our physical world and data value management becomes a new contract for the city, architects can either work with these new data rules or counteract them with different development methods. Unfortunately, some architects have given in to developers. They should not be called architects but ‘Spatial Engineers’. These Spatial Engineers use formulaic design patterns to reveal the hidden potential value in space. Keller Easterling has noted the influence of such spatial engineering in the repeated spatial patterns found in free trade zones and commercial areas worldwide, where capital extraction takes precedence over human experience.

    The architect and their architectural creations are at risk of being unable to keep up with the speed of data. As the stability of architecture turns into a data-driven organisation of temporary material arrangements, architects need to either get involved in the speed of data or start contributing to an ethical reappropriation. Architecture needs a vocabulary to participate in the developing smart city discourse and help rebalance human social needs with real-estate extraction.

    In conclusion, the transition towards data-driven urban development presents both challenges and opportunities for architects. As the Toronto Quayside example illustrates, cities of the future could be shaped more by data flow than by human needs, with profound implications for our shared urban spaces. As architects, we must engage with these changes, either by embracing the new rules of data or by advocating for alternative approaches that balance human social needs with commercial interests. This is not merely a question of architecture, but of society itself. In the data-saturated city of the future, the decisions we make now will shape our collective urban experience for generations to come.

    References

    Zuboff, S. (2019). The Age of Surveillance Capitalism: The Fight for a Human Future at the New Frontier of Power. United States: PublicAffairs.

  • The Ethics of Data

    The Ethics of Data

    As part of Thomas Fisher’s exploration of ethics in architecture, he considers the influence of data on humanity (Fisher 2018). While his conclusions offer little consideration of architecture as a built object, he highlights how data shapes humanity’s relationship with nature. In doing so, though, he misses the importance of culture and human emotion in shaping behaviour, assuming that all organisms operate through rational data processing.

    Data-ism

    Fisher begins by associating data with how organisms and machines sense the world, before concentrating on data’s role in algorithmically produced science. A possible reading of Fisher’s argument is that two scenarios exist involving data and the future of humanity’s influence on the planet, both involving Yuval Noah Harari’s concept of data-ism. The first is a move from a mindset of human dominance over nature to one of reciprocal benefit through understanding, communicating with and caring for nature. In this scenario, data offers a common ground for humans, animals and machines to relate to each other and produce reciprocally beneficial outcomes. However, Fisher warns that just as data-ism could tend towards equality, it could as quickly create a new hierarchy. For Harari, data-ism either brings ecological balance through an extended sense of ethics or transposes a new order onto the world defined by contribution to data flow. In the latter scenario, Fisher argues, humans must resist the reductive world view of data-ism by avoiding technologies of data extraction, to prevent their demise.

    Equity

    Unpacking these two scenarios provides some important considerations regarding data. Initially, Fisher considers data’s influence in scenario one, a reshaped ethics that brings the natural world, humans and machines together through a common data register. The ecological ethics of data-ism attempts to bring equity to all via data processing, treating human, machine and natural systems as “data processors”. Organisms existing as biochemical algorithms is a useful concept, but a misleading one. It is possible to imagine all forms of life as data processors, but doing so presumes a singular character of data. By using Harari’s concept, Fisher places the same assumption onto data whether it is processed digitally or biochemically, without considering the existence of the data itself. The comparison between computers and brains relies on data existing in an equivalent format, but this ignores the difference between analogue and digital data: the former records continuous change in material, while the latter encodes discrete state changes as ones and zeros, or bits. Harari acknowledges this difference in Homo Deus, speculating that human consciousness could be due to analogue material processes rather than symbolic digital logic (Harari 2016, p. 228). Fisher does not acknowledge this; instead he uses data sensing as a premise to explore equality between all organisms, and the relationship between humanity and its technologies.

    Infosphere

    On reading Fisher’s argument, I recall Floridi’s theory of the “Infosphere”, which posits that the machines humans invent always sit between a user and an unseen helper, what Floridi refers to as a “prompter” (Floridi 2014). While Floridi’s theory is concerned primarily with the future of technology, he draws out a trend relevant to Fisher’s view of data. Floridi provides a history of technology as an evolution of user and prompter: starting as a link between humanity and nature, then between humanity and technology, before a third order that removes the human “user” and replaces organisms with machines, thus predicting the removal of humans from technological development. Floridi describes this as humans no longer existing in the loop of innovation but “on the loop”, predicting that humans progress from users to beneficiaries (Floridi 2014). Fisher and Floridi speculate on a different ethical relationship resulting from a new understanding of humanity as beneficiaries of technology and nature rather than users. In this move, both Floridi and Fisher introduce data’s influence on humanity through the way technology influences culture, but only Floridi recognises that culture sets the requirements for technology in the first place.

    Data Processing

    While it is arguable that data exists as a constant between material and binary representation, the latter is a purely machine-interpretable version of data; humans can neither register nor semantically interpret digital data. Therefore, while I agree with Fisher, Harari and Floridi that data exists as reality before sensory experience, there is a problem in presuming that life is simply data processing through algorithms. The data-processing view cannot account for the influence of culture, which requires as much consideration as nature and technology. While nature and technology can, in theory, interact through data, cultures rely on the human interpretation of data into information, which in turn relies on human-constructed systems of meaning. Culture, understood as the social norms and beliefs negotiated between humans, both responds to and influences our understanding of nature and technology, meaning that any prediction of non-anthropocentric futures must resist the hyper-rational world view that data seduces us into. From an ethical viewpoint, Fisher misses the role of culture in modulating human understanding of reality through nature and technology.

    Complexity

    Tegmark’s theory of artificial intelligence provides a useful set of ideas for thinking about culture and data. In Tegmark’s explanation of life as an evolving process of replication and retention of complexity, he argues that three stages of development occur: life initially grown biologically, then designed culturally, and finally learnt computationally (Tegmark 2017). Like Floridi, Tegmark’s interest lies in the future of intelligent technology, but a point of difference is his discussion of culture’s influence on the industrial revolution and future technology: the human need to retain and pass on information. It is fashionable to think of machines as having independence from humanity and culture, in the way dystopian futures predict unchecked technological evolution. Many of these futures ignore the cultural reasons technology came to be in the first place. Both architecture and technology respond to culture, in turn helping to reshape it, but architecture and technology cannot come into existence through self-interest; they require assistance from humans.

    Culture

    In the rapid progression of science through algorithmic data analysis, it is easy to presume data as a given and overlook the influence of culture. The digital humanities, however, increasingly recognise data as a captured sample of reality shaped by the technology used. The act of experiencing and measuring the world sets up a circular relationship, one where a world view sets a framework for acting, which then perversely produces data to support that world view. Rob Kitchin refers to this framework as a “socio-technical assemblage” of people, institutions, apparatus and beliefs that shape complex systems of data production (Kitchin 2014). Kitchin’s concept is useful for critiquing Fisher’s argument for human resistance to data flow, which presumes that data-ism exists through the intentions of machines rather than of humans themselves. The perceived threat of superior data analysis and processing by machines would only endanger humanity if it exclusively helped solve machine problems. But as I have argued, technology is ultimately shaped by human needs, and solving machine problems should ultimately help humans, machines and nature. The real hierarchical imbalance could occur if humans use the superior analysis they developed to further benefit themselves at nature’s expense, rather than using it to come closer to nature. Any consideration of data and technology in the future of humanity must include the importance of shaping cultures around social equality across all life, rather than continued inequality between humans and other organisms. Data’s real effect, therefore, is in shifting or maintaining the human world view, which means that data does not create the future; culture does.

    References

    Fisher, Thomas, 2018, *The Architecture of Ethics*, Routledge

    Floridi, Luciano, 2014, *The Fourth Revolution: How the Infosphere is Reshaping Human Reality*, OUP Oxford

    Harari, Yuval Noah, 2016, *Homo Deus: A Brief History of Tomorrow*, Random House

    Kitchin, Rob, 2014, *The Data Revolution: Big Data, Open Data, Data Infrastructures and Their Consequences*, Sage UK

    Tegmark, Max, 2017, *Life 3.0: Being Human in the Age of Artificial Intelligence*, Penguin UK

  • The Future of Architecture in Data-ism

    The Future of Architecture in Data-ism

    In Homo Deus: A Brief History of Tomorrow, Yuval Noah Harari extends the history of the human race, explored in his previous book Sapiens, into a speculative future involving humanity, nature and technology. Harari’s thinking centres on a common ground introduced by data: that organisms and computing machines both produce intelligence through algorithms. Harari draws on a modern scientific theory that living things exist as biochemical data processors, while computers process digital data, making them comparable in experiencing the world and making decisions. This equivalence in detecting and processing the world provides Harari with a possible future scenario where the biochemical cannot process or absorb the scale and speed of data flow required by “super-intelligent” digital algorithms. While Harari concedes that the assumption that life is reducible to data flow and decision-making falls short of explaining consciousness, the assumption sets up a scenario where humans understand and identify themselves through their contribution to data flow in the “Internet of Everything” (Harari 2016). Harari’s last chapter specifically explores the “religion” of data-ism, which considers humanity’s future in a world organised by contribution to data flow. While Harari explores the impact on nature, technology and humanity, there is room to consider architecture’s role in data-ism, and the potential influence of data-ism on architecture as a way of thinking.

    Decentering

    Two distinct possibilities appear to exist, both concerning the actual and perceived role of humans in the universe. The first is for humans to de-centralise themselves in the world and become more aware of the benefits from caring for nature and the natural environment, rather than manipulating and mistreating them. The other scenario is for the replacement of humans at the centre by algorithms that become more intelligent than humans, resulting in humanity becoming manipulated rather than the manipulator. My understanding of how technology progresses as a response to human problems, and the fact that intelligent algorithms can make decisions but do not have independent intentions, leads me to have faith in scenario one.

    Algorithmic Influence

    The main reason for siding with scenario one comes from understanding the means by which intelligent algorithms influence the physical world. The majority of algorithms exert agency through information; examples of this agency appear in the fields of investing, music and art. However, algorithms have trouble influencing the material world directly because they do not have bodies. The counter-argument is the existence of robots, which do affect the physical, but algorithms do not produce robots, humans do. While an algorithm can influence the physical world through a robot, the human-defined purpose of the robot determines the outcome. The critical point is that despite much materialist theorising and Hollywood speculation, algorithms cannot build robots and are always limited by their interface with the material world. The interface is a human construct. Some predict a future where intelligent algorithms begin to create material interfaces by depositing matter through 3D printing, but the 3D printer is itself a human-designed interface. I am not claiming that algorithms will never produce material interfaces, just that they will require human help to do it. Therefore we must keep our eye on the ball so as not to create the terminator – I know that was what you were thinking.

    Behaviour

    The real threat from algorithms is human behavioural manipulation via information. Political interests realised long ago that semantic information manipulates humans. In this contemporary “post-truth” era, propaganda can come from both humans and algorithms (bots), with many humans unaware of its real origin. The outcomes are stark: algorithms help elect seemingly unelectable leaders, algorithms promote crazy, illogical responses to threatening human viruses, and algorithms even coax human behaviour in physical space. So while superior algorithms seem physically harmless to humans, the real threat is manipulation.

    Architecture’s Role?

    So to return to the question – what is architecture’s role in data-ism? If Harari’s future is correct and humans become manipulated toward maintaining a data flow critical to intelligent algorithms, then the built environment could shift from supporting human functions to determining them, but how would this happen? We know that algorithms cannot shape material space through direct action, so they require agents able to do their bidding: hello humans! If an algorithm knows that a particular arrangement of material space will produce data for its upkeep, via distributed sensors in the built environment, it could convince a human to construct it. Construction requires visual communication, which is part of the traditional architect’s toolkit, but this requirement for visible information changes when materials are shaped and organised through digital fabrication. In this process, data potentially passes from algorithm to fabrication machine, then into material parts coordinated through the logic of a jigsaw. Rather than relying on a level of construction knowledge and skill, as architects do, the algorithm looks for anyone with hands to connect modular and discrete elements into easily assembled wholes. In the case of large-scale 3D printing, human labour need not apply, as matter can be constructed through algorithmic decisions – although the guilty fabrication machine would still rely on humans for its existence.

    If this scenario seems far-fetched, look to Toronto Quayside, where Sidewalk Labs hoped intelligent algorithms would direct flexible construction systems, and planning rules promoting renovation, to organise the urban environment to maximise rental returns and human happiness. Therefore, while algorithms have no direct influence on the built environment, they need not worry, as they have the means to manipulate a specific chemical in biochemical algorithms: dopamine. Those involved in imagining and maintaining the built environment must have a heightened awareness of the information sources influencing their decision-making, to be confident they are not inadvertently puppets for the new algorithmic overlords.

    Harari, Y. N., 2016, Homo Deus: A Brief History of Tomorrow, Random House

  • Architecture, Data and Super Users

    Architecture, Data and Super Users

    In “Super-Users: Design Technology Specialists and the Future of Design”, Randy Deutsch argues that the Architecture, Engineering and Construction (AEC) industry needs a new type of professional skillset called, yes you guessed it, a “super-user”. On first reading, it is clear that the prefix “super” describes the ability to achieve things outside of the normal, but the noun “user” is harder to place in an architectural context – a user of what? Deutsch’s previous writings, BIM and Integrated Design, Data-Driven Design and Construction and Convergence: The Redesign of Design, offer some clues: they are users of digital technology.

    Future Architect

    Deutsch argues that as technology seeps into all aspects of architectural practice, a new professional actor will solve the AEC industry’s problem of optimising productivity. This actor’s primary focus is to unleash the capabilities of technology and help architects navigate the future of automation. Super-Users suggests that Deutsch’s version of a future architectural profession is one where Silicon Valley’s techno-optimism transfers onto the workflows and processes of the AEC industry; this is, after all, where the term “user” originates. However, while the “user” in Silicon Valley shapes products around the desires or intended behaviours of people, Deutsch’s “super-user” relocates the site of design onto the human. The super-user’s skills involve creating and manipulating digital tools; put simply, the super-user is a designer who can code.

    Unchecked Algorithms

    While all of Deutsch’s books explore technology and data in architectural practice, Super-Users takes a different approach, positioning the human at the centre of design action rather than technology. This change in attitude is significant and reflects broader concerns around the impact of unchecked Artificial Intelligence in all professional domains. In Data-Driven Design and Construction, Deutsch argues that architecture must address the demands for optimisation and efficiency in professional services by removing human intuition from decision-making. Data analytics, the statistical basis of the data-driven approach, enables a calculation of architecture. In Super-Users he changes tack, placing greater importance on human intuition in design and assigning technology a role of augmenting rather than replacing the human. I presume Deutsch’s change in position stems from a realisation that while a workforce tasked with servicing technology produces better profits, it ultimately reduces the agency of humanity. Super-Users, therefore, represents a move in architecture to realign with its humanist origins rather than its increasingly corporate tendencies.

    Normalising Construction

    Super-Users is not detached from political-economic ideology, though; far from it. Deutsch’s research interests concern the future of the architectural profession, particularly in the United States. To contribute to the built environment and hopefully achieve commercial success, architectural practices must either participate in the AEC industry or take matters into their own hands. Pursuing alternatives requires devising alternative funding strategies, business structures and construction techniques; there is much to gain but also to lose in any attempt at reformation. In participating, architects must negotiate the modern context of development, and the critical role capital and profit play in its functioning. As Jon Goodbun et al. point out, the AEC industry is unashamedly capitalist (Goodbun 2014). Development is a business model that encourages buildings to become desire-creating commodities at an extreme scale, absorbing in the process the economic surplus from the creativity of architects. Rather than challenge the status of architecture as a commodity, Deutsch embraces it, presenting the “super-user” architect as the one able to deliver the quickest, cheapest and highest quality. In doing so, Deutsch inadvertently places the architectural profession in the capitalist race to the bottom, or what Rifkin refers to as the tendency of the market to converge on zero marginal costs (Rifkin 2012). Rather than challenge the economic status quo, the super-user helps maintain the story of capitalism, which Harvey links to an increase in extreme inequality over the last thirty years (Harvey 2014), and which is one reason housing affordability is such a problem in western capitalist economies (Parvin 2019).

    Selling Ideas

    It is easy for me to lament Deutsch’s hesitation to recognise the super-user’s role in increasing the architect’s subservience to capital accumulation, as I do not currently practise architecture commercially. Instead, I read Deutsch’s Super-Users from an academic research perspective and have concerns with the way he claims a new type of skill set is required in architecture. It’s almost as if he needs to sell a book. Deutsch describes the skills as: interpersonal skills, collaboration, conversation, problem identifying and solving, entrepreneurialism, teachability, knowledge sharing, storytelling, question asking, and the ability to think in 3D. These skills condense into five attributes: design, communication, research, work ethic, and tool agnosticism. In this move, Deutsch doesn’t describe a new professional actor; he simply describes the modern architect, with one slight difference. That difference is that being “tool agnostic” redirects the super-user’s knowledge from the basis of architecture to technology.

    Exploited Hackers

    This alignment of the super-user to tools draws parallels with Stewart Brand’s praise of hackers in 1972. Brand identified the hacker as a class of people who were producers rather than consumers of technology, creating a counter to the “rigid and unimaginative technocrats” (Brand 1972). But as Evgeny Morozov points out, rather than disrupting, the hacker became the prize employee for employers seeking extreme productivity (Morozov 2014). In Brand’s ethos, hackers had the future skills to accommodate themselves within the system rather than try to reform it. Just as the technological independence of the 1970s hackers became absorbed into the neoliberal economic machine, super-users set up a similar mastery of technology for the benefit of capital.

    References

    Brand, Stewart, 1972, Spacewar: Fanatic Life and Symbolic Death Among the Computer Bums, Rolling Stone 7th December 1972.

    Goodbun, Jon, Klein, Michael, Rumpfhuber, Andreas, Till, Jeremy, 2014, The Design of Scarcity, Strelka Press, Moscow

    Harvey, David, 2014, Seventeen Contradictions and the End of Capitalism, Oxford University Press

    Morozov, Evgeny, 2014, Making It, The New Yorker – http://www.newyorker.com/magazine/2014/01/13/making-it-2

    Parvin, Alistair, 2019, A New Social Contract, Medium – https://medium.com/@AlastairParvin/a-new-social-contract-359c426e0f61

  • Architecture is a History of Data.

    Architecture is a History of Data.

    Data is often viewed as a contemporary architectural phenomenon, emerging with digital technology and the internet. However, data has always played a crucial role in architectural history. From the earliest civilisations to the present, architects have utilised data for design and construction coordination, decision justification, and creation of enduring structures.

    Ancient Data

    The earliest examples of data in architectural history can be traced back to ancient times. Architects relied on simple tools and techniques to measure and document environmental details during this period. The ancient Egyptians, for instance, used a unit of measurement known as the cubit when designing and constructing significant structures like pyramids and temples. The cubit, determined by the forearm length from the elbow to the middle finger’s tip, ensured precise proportioning and alignment of each building component.

    Likewise, the ancient Greeks employed advanced mathematical principles to construct temples and other public edifices. Mathematicians such as Pythagoras and Euclid devised intricate systems of geometry and proportion that architects applied, leading to structures with flawless symmetry and balance. In Greek society, specific numbers were believed to link architecture with nature and spirituality. Vitruvius suggested that the proportions observed in the human body were sufficient for architects to create the perfect building. Consequently, throughout antiquity, data in the form of numbers, ratios and measurements associated architecture with human experience, nature, and the cosmos.

    Digital Technology

    In today’s world, digital technology has revolutionised the use of data by architects. They can now use advanced software to model designs in three dimensions, enabling them to explore various options and test ideas before construction. Additionally, sensors and other state-of-the-art technologies allow them to gather real-time data on building performance, optimising energy efficiency and indoor air quality. However, in these instances, the data is mostly non-human, consisting of pulses, signals, bits, and numbers representing mechanical and electrical observations rather than human sensory experiences.

    The Future

    The history of architecture is deeply intertwined with data. From the earliest civilisations to the modern era, it has transitioned from human-centred measurements and considerations of shelter and social interaction to non-human sensing driven by economic desires for efficiency and control. Architects have always used data to inform their designs, but is it serving our best interests today? While tools and techniques evolve, the fundamental role of data in architecture – connecting design to human experience and reality – should endure. As we look to the future, the increasing importance of data in shaping tomorrow’s buildings and cities calls for a more profound consideration.

  • The Problem Context of Australia’s AEC Industry, and Steps to Address it.

    The Problem Context of Australia’s AEC Industry, and Steps to Address it.

    In just three weeks at the ARC Training Centre for Next-Gen Architectural Manufacturing (Arch Manu), I’ve already begun to grasp the importance of this research group. This five-year journey promises to be more than just academic training; it will be an opportunity to fundamentally reshape how we address sustainability within the Architecture, Engineering, and Construction (AEC) industry.

    The backdrop to Arch Manu’s mission is a set of formidable challenges confronting Australia’s AEC sector. The Australian government recognises the urgent need for innovation and digital transformation in architectural services. The goal is to ensure that the industry remains competitive, efficient, and sustainable in a rapidly changing environment.

    Our objectives are ambitious and clear:

    1. Streamlining Production: We aim to use machines and automation to eliminate bottlenecks in production, creating a fluid transition from design to production. This reimagines how architecture is realised in practice.
    2. Integrating Advanced Technologies: We aim to help embed Artificial Intelligence, Machine Learning, Digital Fabrication, and Big Data analytics into the industry. These tools are essential for unlocking new levels of efficiency, competitiveness, and productivity.
    3. Empowering SMEs through Digital Transformation: Small to medium-sized enterprises are crucial to the industry but often lag in adopting new technologies. We will develop strategies to integrate these businesses into the digital age and create network effects of connectivity.
    4. Maximising Data and Tools: We will leverage data and digital tools to unlock gains in design quality, maintenance strategies, and operational efficiency across the sector.
    5. Exploiting the Power of Digital Twins: We will provide the expertise and resources needed to utilise digital twins—virtual replicas of physical assets—which are set to revolutionise architectural design and management.
    6. Bridging the Skills Gap: Our training programmes address the skills gap between traditional architectural practices and the demands of advanced manufacturing. These programmes foster a new generation of professionals ready to lead the industry.
    7. Fostering Industry-Wide Collaboration: Arch Manu breaks down the silos that inhibit innovation. By promoting collaboration and sharing knowledge, we remove barriers that have historically stymied progress in digital practices and sustainable business models.

    The work at Arch Manu addresses current industry challenges and provides a blueprint for the future. I’m relishing the prospect of contributing to changes that could ripple through the industry for decades to come.

    Please click the link to find out more about the Arch Manu Centre.

  • Digital Fabrications

    Digital Fabrications

    Galo Canizares is a rare practitioner: an assistant professor of architecture who is also an advanced user of digital technology, with projects that explore the edges of the web and data. He is right up my Straße. His work helps me position my practice, as he is somewhat of an outlier. Take his book “Digital Fabrications”, for instance: a collection of stories describing experiments with digital tools, it is a little confusing. If you knew Galo was an architect and you read the title, you would presume there would be examples of structures and physical objects scattered through the pages, but this is not the case. Digital Fabrications is about the digital image, and “fabrications” relates to fabricating reality more than to the fabric of reality.

    Interface

    I often find myself exploring ideas outside architecture, getting lost down the rabbit holes of technology’s cultural influence. Galo Canizares is the same; he is architecturally trained but has successfully expanded his practice to investigate and experiment with the digital, particularly the effects of digital interfaces on design. Today, he argues, software is ubiquitous and influences culture through the information we produce, consume and share via technical interfaces. Digital Fabrications highlights how architects often use a variety of software packages and applications without thinking critically about how these tools have changed their work.

    Contexts

    The book unfolds like a fun journey through unrelated explorations, first setting the stage with the political influence of software and interfaces, then taking an exciting detour to narrate the history of Earth from a Martian perspective. This shift to science fiction is inventive and a surprising twist, but it veers away from the book’s expected theme of digital interfaces and dilutes the overall message.

    Experiments

    The projects featured in this book are technically impressive. Canizares develops an “absurdly dumb Twitter bot that would potentially say smart things” (p. 129), incorporating an artificial architecture-culture personality into the design. The web drawing app Malevi.ch interprets the twentieth-century Suprematist artist Kazimir Malevich’s theory of irrational space as an interactive interface. The written explanation is interesting, but the outcome, a force-directed figure-ground pattern generator, doesn’t quite match the justification. It prompts one to question whether the tool was actually an experiment with matter.js that was post-rationalised as a critical analysis of Malevich’s work.

    The Malevi.ch app by Galo and Jose Canizares

    Discussion

    Canizares’s digital interface experiments are noteworthy, but his two essays titled ‘Everything is Software’ are the real standouts. These pieces connect digital media theory with architectural discourse, defining two key terms: ‘postdigital’ and ‘postorthographic’. The postdigital concept is exemplified by Carlo Listroti, who uses digital technology to craft architectural drawings that outstrip human precision and speed. In contrast, the postorthographic concept aligns with the work of Casey Reas, who uses pixels to generate images from data. ‘Postorthographic’ signifies a shift from traditional 3D-to-2D drawing techniques — like plans, elevations, and sections — to creating images based on pixels. Canizares argues that this postorthographic approach ushers in a new form of dominant visual communication through social media, which in turn significantly impacts our social behaviours and collective cultural history.

    So What?

    I enjoyed reading the book; however, my critique would be that, at times, I found it difficult to follow the central point. A greater hierarchy of ideas through headings would have been beneficial, though I can understand the author’s choice, as this would steer the writing into a more academic tone. I appreciate how the book links the past to the future through digital media, but I was left pondering: so what do we do? Canizares does well to convince the reader that interfaces and interaction are the future spheres of political influence, but he treats them as foregone conclusions, inevitable. This gap invites an alternative and critical response, one where designers reappropriate or hack interfaces to counteract the political influence gained from interaction.

    Canizares, G. (2019). Digital Fabrications: Designer Stories for a Software-based Planet. United States: ORO Editions/Applied Research & Design.

  • Exporting generative lines

    Exporting generative lines

    In a previous post, I wrote about making drawings by tracing over generated line patterns to stimulate the student design process. At the time, I identified a problem with getting the generated patterns out of the computer, alongside being able to create different patterns.

    Since then, I played around with adding a simple interface that allows someone to determine the number of lines, generate a random pattern using a button, and export it as a png for a drawing background.
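
    A minimal sketch of that interface using plain HTML canvas is shown below. The element ids and the download wiring are my assumptions about how such a tool might be put together, not the exact code behind the embedded example.

    ```js
    // Assumes the page contains <input id="lineCount">, <button id="drawLines">,
    // <button id="exportPng"> and <canvas id="sheet" width="800" height="600">.
    const canvas = document.getElementById("sheet");
    const ctx = canvas.getContext("2d");

    function drawRandomLines(count) {
      ctx.fillStyle = "#fff";
      ctx.fillRect(0, 0, canvas.width, canvas.height); // white drawing background
      ctx.strokeStyle = "#222";
      for (let i = 0; i < count; i++) {
        ctx.beginPath();
        ctx.moveTo(Math.random() * canvas.width, Math.random() * canvas.height);
        ctx.lineTo(Math.random() * canvas.width, Math.random() * canvas.height);
        ctx.stroke();
      }
    }

    // Generate a new pattern with the chosen number of lines.
    document.getElementById("drawLines").addEventListener("click", () => {
      drawRandomLines(Number(document.getElementById("lineCount").value));
    });

    // Export the current canvas as a PNG for use as a drawing background.
    document.getElementById("exportPng").addEventListener("click", () => {
      const link = document.createElement("a");
      link.href = canvas.toDataURL("image/png");
      link.download = "line-pattern.png";
      link.click();
    });
    ```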

    The next step is to layer more information into the line generator. I am interested in generating figure-ground patterns or playing around with grid arrangements as a catalyst for new ideas.

    Example

    Choose the number of lines, and push the Draw Lines button to see it in action.

  • Generative Inspiration

    Generative Inspiration

    The architecture design studio I coordinate starts the semester with a generative drawing exercise. The first design brief requires students to produce random lines using bamboo skewers or other stick-like materials. The forced process aims to promote quick drawings that ignore composition decisions and allows students to make visual judgements in finding and documenting patterns. These patterns inspire form-making with no reference to interior habitation requirements or external influences; these come later. The only connection to architecture is that these patterns must result in a habitable poche wall.

    Innocent By Design – Pinterest – Concept Sketch

    Randomness

    The act of releasing and then drawing the skewers introduces randomness into the design process. Unlike those in art or music, architects are often conflicted about randomness, as it removes design agency. While it is true that randomness in architecture offers little meaning in itself, it does stimulate creativity, break conventional thinking and suggest unconsidered outcomes.

    Although the random line process ends in a drawing, it takes students a long time to release and trace the lines into a starting composition. When teaching this opening project, I am always interested in how the students follow the rules; they act procedurally to get to a final drawing. The students don’t realise it, but they are working the way a computer program operates. As students plot and join points into lines, they follow a repeatable function that a computer could complete easily and rapidly.

    Hand drawn random lines – Morpholio Trace

    With this in mind, I abstracted the process into a repeatable recipe for the computer, an algorithm capable of producing different digitally generated patterns. The aim became to explore customised software that produces spatial inspiration.
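
    Written out as code, the recipe is tiny. This is a sketch of the idea rather than the exact generator used in the studio: pick two random points, join them, and repeat.

    ```js
    // The skewer exercise abstracted into a repeatable recipe.
    function randomLine(width, height) {
      return {
        x1: Math.random() * width, y1: Math.random() * height,
        x2: Math.random() * width, y2: Math.random() * height,
      };
    }

    // Running the same recipe with different counts (or seeds) yields new patterns.
    const pattern = Array.from({ length: 30 }, () => randomLine(800, 600));
    ```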

    Generative Design

    As Benedikt Gross et al. argue in Generative Design, producing customised software changes a designer’s analogue performance into “orchestrating the decision-making process of the computer” (Gross et al., p. 4). Rather than producing one drawing, if the designer abstracts a process into rules and runs these with different inputs, the result is multiple images. Abstracting and decomposing problems into smaller chunks through a computer language engages the computer’s talent for repetition, randomness and logic, and produces outcomes that any analogue process would require obsessive dedication to match.

    An initial test was to see how software could produce random lines across a canvas, similar to the bamboo design exercise. Although the outcome was similar, several opportunities arose. One was that the sketch could become interactive, allowing a user to adjust variables for their own inspiration. Another was that the sketch could output an open file format, such as an SVG or PNG, suitable for future design work. The final idea was that the tool could take on the next stage of the design tutorial process, where students generate figure-ground poche diagrams from the generated line patterns.
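
    As a sketch of the open-format idea, the same line data can be serialised straight into an SVG string ready for a vector editor; the function below illustrates the approach and is not the studio tool itself.

    ```js
    // Serialise a generated line pattern (as produced by randomLine above) to SVG.
    function linesToSvg(lines, width, height) {
      const body = lines
        .map(l => `  <line x1="${l.x1}" y1="${l.y1}" x2="${l.x2}" y2="${l.y2}" stroke="black"/>`)
        .join("\n");
      return `<svg xmlns="http://www.w3.org/2000/svg" width="${width}" height="${height}">\n${body}\n</svg>`;
    }
    ```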

    The code-generated sketch using HTML canvas.

    The sketch regenerates each time the browser refreshes. The next stage is to explore interactive adjustment through a control panel, using a p5.js plugin, and an integrated button to export an SVG file for Inkscape.
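
    A possible p5.js version of that next step is sketched below, using p5’s built-in createSlider and createButton for the control panel. True SVG export for Inkscape would need an add-on renderer such as the p5.js-svg library, so the save button here writes a PNG as a stand-in.

    ```js
    // Hypothetical p5.js sketch: slider sets the line count, buttons redraw and save.
    let countSlider;

    function setup() {
      createCanvas(800, 600);
      noLoop(); // only redraw on demand

      countSlider = createSlider(5, 100, 30, 1); // min, max, start, step
      createButton("Draw Lines").mousePressed(() => redraw());
      createButton("Save").mousePressed(() => saveCanvas("line-pattern", "png"));
    }

    function draw() {
      background(255);
      stroke(30);
      for (let i = 0; i < countSlider.value(); i++) {
        line(random(width), random(height), random(width), random(height));
      }
    }
    ```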