On the surface, the way we document buildings may still look the same. Modeling software, like any other tool in history, helps architects depict, design, and produce drawings for construction. But something has fundamentally changed in the ambitions of architectural visualization, and with it, the relationship between representation and reality. Digital information has never been more closely coupled with the material logics and deployable flows of construction and operations across a building’s life.
In the last two decades, Building Information Modeling (BIM) has emerged as a de facto industrial-strength medium of the global Architectural, Engineering, and Construction (AEC) industry. BIM takes on an increasingly credible vision in search of greater construction efficiencies, error reductions, labor savings, and other frontiers of optimization. New modes of representation have emerged to support ever tighter synchronizations of building information, across ever larger surface areas of spatial management. Bigness meets “BIMness.”
Today, BIM is regarded as occupying the same digitally “disruptive” plane as big data, robotics, and Virtual/Augmented Realities (VR and AR). Beyond 3D modeling, it has acquired a worldview of no less than seven dimensions: a 4D model simulates construction time; 5D estimates building cost; 6D analyzes energy and sustainability outcomes; and 7D manages facilities using a model of the building as-built. While 7D applications are still far from achieving the ubiquity that 3D BIM software currently enjoys, architecture’s “information turn” spells a new paradigm for cataloging the built environment as a single database of linked object data, from the scale of a wall or door element to (ostensibly) a building’s entire operational life.
This warrants a materialist understanding of “architectural visualization”—one that is not reduced to an affective or rhetorical function, as is the case with renderings and other architectural eye-candy. Rather, “visualization” compounds information (visual and nonvisual) in an “optically consistent space” for the calculated coordination of an entire industry. Modeling software is a “‘universal exchanger’ that allows work to be planned, dispatched, realized, and responsibility to be attributed.” Such an optical instrument manages material facts, secures expertise, and orders realities. Visualization is therefore bound up with the apparatus of vision itself—how one sees makes possible what one sees, which makes possible what one plans to do. Unlike a rendering, the technical image is not a suggestive picture, but a proxy. Its graphics are not relevant for their symbolic qualities, but for their sociotechnical dispositions. What does “7D” allow us to see (and do)? What human-machine relations and coordinated worlds are brokered by software?
Google “Building Information Model,” and you will find two distinct visual conditions: either ultrageneric solid building mass, economically cast in some default grey Shaded View on a WYSIWYG interface; or fluorescent pink, yellow, and acid green MEP and other engineering elements, densely compacted into a barely-there building envelope. If the first conveys self-evidence and cost reliability, the second conveys coordination expertise. This dichotomy points to a BIM model’s graphically split personality—an AEC Jekyll and Hyde.
Reyner Banham once described the High Tech impulse as “the most recent way of bringing advanced engineering within the discipline of architecture.” BIM visuality exemplifies this union. At first, building information guts appear no different from the 1960s/1970s proclivity for seeing pipes before architecture, or pipes-as-architecture. One is reminded of François Dallegret’s canonical Anatomy of a Dwelling, which illustrated Banham’s “baroque ensemble of domestic gadgets.” But BIM’s lurid ducts depart from the High Tech milieu of tubular exuberance. Unlike Dallegret’s ornamental compositions, today’s labyrinthian HVAC layouts are militantly coordinated ensembles, demonstrating an ability to work-as-modeled.
Unlike the iconographic Centre Pompidou, whose exposed services are didactically denoted (general blue for air-conditioning, yellow for circulating electricity, and so on), BIM color-codes act more as durable office standards, disciplining screen vision across space and time, projects, workers and departments. For example, according to GSA BIM standards, the blue used for Compressed Air (RGB 0, 0, 255) is numerically distinguished from Domestic Cold Water (RGB 0, 63, 255), even though they look almost exactly the same. BIM’s pedantic chromatic spectrum hints at the sheer density of consultant information entering the singular scene of the model to the point where colors are not optically but numerically verified.
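The distinction between optical and numerical verification can be sketched in a few lines. The two RGB triples below are the GSA values cited above; the comparison functions are illustrative inventions, not part of any BIM standard:

```python
# Two GSA standard colors that are nearly indistinguishable on screen,
# yet numerically distinct to the coordination software.
COMPRESSED_AIR = (0, 0, 255)        # blue
DOMESTIC_COLD_WATER = (0, 63, 255)  # almost the same blue

def same_system_color(a: tuple, b: tuple) -> bool:
    """Machine check: exact numeric equality, not visual similarity."""
    return a == b

def perceptibly_close(a: tuple, b: tuple, tolerance: int = 64) -> bool:
    """Human check (rough): channel-wise difference within a tolerance."""
    return all(abs(x - y) <= tolerance for x, y in zip(a, b))

print(same_system_color(COMPRESSED_AIR, DOMESTIC_COLD_WATER))  # False: distinct systems
print(perceptibly_close(COMPRESSED_AIR, DOMESTIC_COLD_WATER))  # True: look alike
```

The eye passes what the database fails: two systems that read as one color remain, to the software, strictly separate entities.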
Even though they are not photorealistic, BIM graphics are somehow deeply trustworthy: the building looks overwhelmingly buildable. The use of screenshots on AEC company websites to convey BIM services to a non-technical audience may contribute to this solvency. Champions of the paperless domain, screenshots mark an epistemological shift from human-checked printed documents toward a “virtually witnessed” reality. Developed by the CAD Project at MIT in the 1960s, screenshots established visual conventions to show a computer-oblivious public that human-machine interactivity existed. According to Matthew Allen, “Computer-aided design and the interactive computer needed a public relations campaign in 1960 because their novelty was easy to miss.” The screenshot thus contributed to the historical development of computer coordination as proof of professional expertise and to the cultivation of confidence in interactive machine intelligence.
Screenshots would later record (self-)evidence of computer-generated space planning. Used to assess architectural program layouts in the 1980s, MIT’s experimental IMAGE interface was “an interactive graphics-based computer system for multi-constrained spatial synthesis.” IMAGE’s parameter-based space ranking and evaluation routines were tested in several “real-world” settings: MIT’s planning office, an architectural office, and in the studio classroom. A study was also conducted for the US Army Corps of Engineers, where IMAGE was used to analyze the hypothetical design of a recreation center for planning inconsistencies. While in no way claiming to generate total spatial solutions, IMAGE “provided quick, inexpensive, and wholly accountable evaluations of submitted design proposals” and was regarded as “a tool capable of helping designers to converge on design solutions.” The study concluded that IMAGE was, indeed, “helpful in the development of an architectural space program” and “could play several useful roles for agencies that monitor large numbers of building design contracts.” Machine-responsible space planning would later become integrated into smart modeling practices to the point where, today, entire rooms are parametrically guaranteed.
From analysis to application, digitalized space planning routines have become extremely efficient for Foucauldian typologies like operating theaters and prison cells, with highly prescriptive design briefs, predictable layouts and replicable object sets.
Depiction, Detection, Simulation
In strictly 2D drawings, building elements are geometrically translated and pictorially ordered into legible relationships. Drawn around the time that the Centre Pompidou was being constructed, a drawing from James Gowan’s 1975 Housing at East Hanningfield shows a pictorial composition of pipes and trusses, hand-drawn and color-coded for presentation. Even pre-CAD, the drawing reads startlingly like BIM: the solid building recedes, foregrounding a suspended architecture of object-oriented services. But unlike BIM, these objects are inert.
In the same year, Charles Eastman, software developer/architect/putative “father of BIM,” proposed that 3D elements could be cataloged and coordinated in precise alphanumeric (i.e. machine-readable) terms, which, when assembled in a relational model, would eliminate the repetitive labor of updating 2D drawings and facilitate “any type of quantitative analysis.” In 1986, software developer Robert Aish (now at Autodesk) published a seminal paper on the first proto-BIM model RUCAPS, which built on this newfound agency for building modeling. No longer just representational plots of data, digital models could be sufficiently imbued with “compatible, if not synonymous” descriptions to become motivated inspectors of that data.
As Eastman portended, drawings could now reliably read and redraw each other from a master model: moving a window or a desk in plan would cause other sections, elevations, and perspective views to be regenerated accordingly. The model was now able to coordinate itself. Yet the ability of models to identify their own clashes and become comprehensive object databases depended on improved computer processing and storage hardware. Machine coordination has a material base—one acknowledged almost half a century ago as an impediment.
Combing through a BIM model, the machine-eye of coordination software flags discrepancies for us. Compare Gowan’s East Hanningfield section with recent screenshots of an automated clash detection routine, featured on the website of Arkansas contractor Nabholz Corporation under “The Beauty of BIM.” While Gowan’s drawing looks technical (a depiction), Nabholz’s images enact the technical (a calculus). The model coordinator shifts from a human position outside the model (like a doctor interpreting an x-ray, or an architect reading a drawing), to a point-of-view inside the model (like a camera probe for internal examination).
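At its computational core, the machine-eye’s clash routine commonly reduces to geometric intersection tests between modeled elements. A minimal sketch using axis-aligned bounding boxes follows; the element names and coordinates are hypothetical, and production software uses far more refined geometry:

```python
from dataclasses import dataclass

@dataclass
class Box:
    """Axis-aligned bounding box for a modeled element (min/max corners)."""
    name: str
    min_pt: tuple  # (x, y, z)
    max_pt: tuple

def clashes(a: Box, b: Box) -> bool:
    """Two boxes clash if their extents overlap on all three axes."""
    return all(a.min_pt[i] < b.max_pt[i] and b.min_pt[i] < a.max_pt[i]
               for i in range(3))

# Hypothetical elements: a supply duct and a drain pipe sharing a ceiling void.
duct = Box("HVAC_SupplyDuct_01", (0, 0, 2.4), (5, 0.5, 2.8))
pipe = Box("PLB_DrainPipe_07", (4, 0.2, 2.6), (6, 0.4, 2.7))

# Flag every clashing pair for human review; the machine detects, it does not resolve.
report = [(duct.name, pipe.name)] if clashes(duct, pipe) else []
print(report)
```

The routine only flags overlapping volumes; deciding which element moves remains, as the essay notes, a matter for the human coordinator.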
Yet Nabholz’s images are not being used for technical calculation. As Vladimir Bazjanac observed in 1975, at the early implementation of computer-aided architectural simulation: “What [architects and clients] expected from a computer model was credibility, not precision.” The business of BIM software thrives on clashes, and their detection, between different documents and models. The sales model of collaborative platforms—surely, the very act of coordination—cannot exist without the visualization of conflicting information.
Today, the invisible hand of building coordination extends beyond 3D. In a 1982 drawing of the Patera prefabricated building system by Hopkins Architects (another High-Tech protagonist), a yellow crane symbolizes the phase and nature of site labor. In a 2016 video advertisement by Synchro, a 4D construction management software company, however, the crane simulates project-specific site activity in fast-forward video sequence. The primary source of detail is no longer in the drawing, but the corpus of a database from which objects, drawings, reports, timelines, and virtual experiences are retrieved and regenerated.
While drawings express physical labor, clash reports and time simulations tell us how the machine saves humans from it (or so the refrain goes). Over three centuries of innovation in industrial mechanization—from the Newcomen steam engine to the Frankfurt Kitchen and the Amazon Mechanical Turk—automation under capitalism never quite plays out for the worker in the end. Such a paradox of labor even exists in computer graphics: Blinn’s Law states that even as “technology advances, render processing time remains constant.” 3D animators tend to respond to reduced hardware barriers by making more computationally demanding simulations, rather than rendering the same scenes more quickly.
For now, the clash detection feature starring in Nabholz’s reputational images does indeed seem to be time-saving. But such software advancements necessitate an expanded skillset of human-machine partnerships, which require of their users even more kinds of checking activity, before, with, and behind the machine. “Human error” has not gone away, and no matter how reliable the automaton, fears of impending financial and legal liability are felt by its ever-conscientious operators.
René Magritte once admitted to the indexical limits of the image: Ceci n’est pas une pipe. The treacherous pipe paradigm has held more or less true through 2D and 3D architectural drawing and modeling: the image is no substitute for the real. That was only true until simulation became a possibility. In Autodesk Revit, a pipe can be expected to perform engineering analysis and optimize fluid flows. Pipe Types now submit to “real-world systems” such as gravity and pressure. Far more than just “a” pipe, today’s parametric pipe is uniquely identified: Logiwaste_DN 400 Pipe_4311-15-TT021SS.rfa.
Pipes aside, a surfeit of exacting construction materials, calculated structural elements, product-specific appliances, and engineered finishes are parametrically available and awaiting architectural specification—even clear primer, an invisible undercoat material, can be accurately applied to your model. Smart coordination turns architectural “space” into discrete, clickable assets, each guaranteed by a supplier. All across the BIM object-world, generic representations give way to proxies—offering ever greater coordination between the virtual and the real.
With great object specification comes great responsibility for its storage. Just as new archival practices arose over the nineteenth century to meet the eruption of photographs, today’s data spreadsheets, object libraries, and clash reports register archive anxieties surrounding the mass proliferation of building data. Along with this comes, of course, liabilities, intellectual properties, and desires to control data stock. Today, virtual components are pre-approved and stockpiled in 3D warehouses on cloud or server, readily deployable for any project. Global offices like Arup have amassed in-house inventories of over 25,000 intelligent BIM objects for their engineering and architecture projects.
Software platforms are new keepers of the trade library. Even Trimble—owner of the much-maligned free modeling software, SketchUp—now offers Trimble MEP and “one stop shop” BIM content management platforms boasting eight million components. Object properties become object property. From Duravit sinks to Miele dishwashers, many major manufacturers now provide comprehensive BIM objects for entire product ranges. As architects gravitate toward clean, correct componentry for model fidelity, the menu variety and quality of virtual objects influence which supplier gets the job.
As BIMObject, BIMSmith, RevitCity and other trade platforms gain currency, smaller players lacking BIM expertise or overheads to invest in the virtual object economy may consequently be disadvantaged in the real supply chain.
Information Rich and Poor
Coordination is not without class distinction: namely, “smart” versus “dumb” computer graphics. Hito Steyerl once described and defended the “poor image”: ragged and ripped digital artifacts which pervade the web as “lumpen proletarian,” in contrast to their elite high-resolution counterparts. If smart graphics in today’s architectural software means parametric, product-specific, and behaviorally-accurate models, dumb graphics refer to uncoordinated 2D CAD blocks, generic furniture models, and error-prone free warehouse downloads. Unable to simulate anything, but nevertheless circulating widely and stashed like ammunition on office servers and student hard-drives alike, such graphical detritus—architecture’s poor image—escapes total management.
Through the eyes of the Smart City, the classist smart/dumb divide maps onto actual things: smart (sensor-laden) versus dumb (sensorless). According to one internet-of-things hardware provider: “Without a means to sense, capture, process, and send [collected] information, most of these objects remain dumb objects.” Sensorless stuff tends to elude software’s committed grip at the cost of participation. Will “dumb” devices one day be illegible to the smart city model? Among the potentially digitally endangered: non-compliant materials, obsolete appliances, broken (object) Families… As digital models gradually become proxies for real building conditions, would the “poor” building image cease to be insurance-claimable, to be recognized as valid entities deserving of servicing or services?
At the 7D scale, model coordination occurs long after the building is built. The BIM model, in all its exploitable data-richness, is tooled into facility management software. With EcoDomus, for example, building managers wield “assets in 3D to track mechanical, electrical, plumbing systems and identify areas of concern, i.e. plumbing leaks.” In ARCHIBUS Personnel & Occupancy, one can interact with rooms in a “live” building plan for occupancy status updates, providing “improved reporting on employee headcounts and locations, average room areas, room availability, space benchmarking, [and] occupancy rates.”
Lifecycle coordination invites a diverse array of interfaces: from diagrammatic dashboards to model views that let managers see through walls into substructure, to geomapped point-cloud interiors with RPG-style navigation, to “Enterprise Information Modeling,” which combines BIM, geospatial, human resources, and financial data for a “[full life-cycle] view of an organization and its processes”—from a pipe to a property portfolio. In this entrepreneurial world-view, everything is an asset. The denser the information, the greater the potential Return on Investment (ROI).
Technicolor coordination extends beyond the screen, overflowing onto building sites and operational facilities. Contractors, clients, consultants, building managers, and maintenance workers may be goggled and gazing up at AR-projected duct layouts, or waving smartphones around building sites to identify variously hidden MEP elements. In Autodesk’s mobile facilities asset management software, BIM 360 Ops, building models are highly specified and rendered in site-specific, navigable, and portable detail. The building model can now provide preventative maintenance, prompting the replacement or repair of worn parts before systems fail.
These management systems make the upkeep of building infrastructure appear effortless—occupants may never have to see a light being replaced or elevator being fixed. The goal of 7D, it seems, is to wrest the data-rich model into an “electronic owner’s manual.” The energies poured into modeling a world in such precise, determinant terms bend in service of remote asset governance.
As the drive to integrate units of information at ever greater speeds of life-cycle synchronization intensifies, technical precision is increasingly equated with fact. The cybernetic building of material facts, inside and outside the computer, can be summed up in emerging technologies like the “digital twin.” Defined by IBM as a “dynamic, virtual model of the physical structure, powered by the massive amounts of [sensor] data that a single structure generates around the clock,” the digital twin is used to “offload some of the tedium of facility management” by automatically ordering replacement parts, regulating maintenance, switching devices on/off according to building occupation levels, and other efficiency-boosting feedback operations. Originally conceived for digital factories, digital twins now remotely manage offshore oil rigs, wind farms, and other extra-large infrastructures.
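One such efficiency-boosting feedback operation—switching devices according to occupancy—can be caricatured as a single sensor-to-actuator rule. The threshold, zone names, and sensor feed below are invented for illustration, not drawn from any vendor’s system:

```python
# A caricature of one digital-twin feedback rule: devices follow occupancy.
# The threshold, zone names, and sensor feed are hypothetical.
OCCUPANCY_THRESHOLD = 5  # people per zone before full servicing engages

def regulate(zone_occupancy: dict) -> dict:
    """Return an on/off decision per zone from live occupancy counts."""
    return {zone: ("on" if count >= OCCUPANCY_THRESHOLD else "off")
            for zone, count in zone_occupancy.items()}

# One tick of the loop: sensor data in, actuation commands out.
sensor_feed = {"Lobby": 12, "Meeting_Room_3": 2, "Plant_Room": 0}
print(regulate(sensor_feed))  # {'Lobby': 'on', 'Meeting_Room_3': 'off', 'Plant_Room': 'off'}
```

Scaled up across thousands of sensors and rules, this loop is the “constant vigilance” of the self-regulating system described below: observation feeding continuously back into actuation.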
The digital twin of Amsterdam’s Schiphol Airport links BIM and GIS to real-time incident data, and can simulate “potential operational failures throughout the entire complex, which saves us both time and money.” The model is now a virtual replica of reality that tests for future scenarios, and readjusts. The recursive feedback loop so familiar to cybernetics theory today approaches a “constant vigilance.” The observant, self-regulating system is advocated as a means of predictively managing aging assets, or conversely, accelerating profit gains.
In Rated Agency, Michel Feher posits that today’s investments are secured by demonstrating creditworthiness, be it through “resources, a reputable record, or wise projects.” Building simulations carry reputational value and investment-worthiness, choreograph vision and labor, and guarantee an error-free reality. No longer just envisioning-instruments, software becomes a new data-rich site for owning, issuing, predicting, and verifying claims about infrastructure.
7D vision points to a larger logistical apparatus. Architectural visualization is no longer about building form per se, but about forms of building information and their management. Drawings now draw as many things as possible together so as to harness them at every turn in a building’s life. As the building information model becomes a new locus of power and prophecy, how that power is enacted becomes just as much a question of politics as it is of visuality.
Software as Infrastructure is a project by e-flux Architecture as part of “Eyes of the City” at the 2019 Bi-City Biennale of Urbanism\Architecture (Shenzhen).
Amelyn Ng is an Australian architect, writer, and cartoonist currently working on issues in graphics, epistemology, and theories of information-richness. She is a 2019 Wortham Fellow at the Rice School of Architecture, and a graduate of the Critical, Curatorial and Conceptual Practices (CCCP) program at Columbia GSAPP.