Where does the liability lie?
When there's a design flaw, a building defect, an HVAC issue, or a safety failure and AI was part of the process, who is responsible for the error?
For example, let's say AI software determines the optimal concrete mix for a particular job in a certain area of the country. If the software fails to consider regional factors (perhaps assuming the project is in a warm, dry climate instead of New England), the resulting mix could be unsuitable, leading to premature cracking or structural issues in the freeze and thaw of the Northeast. Who's at fault? The software developer, the general contractor who approved its use, the subcontractor who entered the data, or the developer who demanded AI integration to keep the project on budget?
Traditional construction contracts don't address the use of AI, meaning there's no clear or uniform way to assign legal responsibility if a project fails due to a software error. Industry-standard contracts like those from the American Institute of Architects (AIA) will eventually include detailed language about AI, specifying why it's used, who's responsible for inputting and verifying data, and how its outputs are applied. But that's years away. For now, the AIA has only gone so far as to issue guidance and pass resolutions encouraging the profession to adopt AI responsibly.
Legal cases take time to move through the courts, and it will likely be several years before a body of case law provides some clarity. In the meantime, contractors need to cover their bases and work with legal counsel to update contracts to include language about AI. Even interim or supplemental clauses that define roles, responsibilities and data ownership can help reduce exposure.
Data privacy and ownership
Open-source AI thrives on data. But where that data ends up, who owns it, and how it's used are issues construction firms must take seriously. A current case in Illinois (AXG Roofing LLC v. RB Global Inc et al) pits construction companies against equipment rental providers, alleging that they inflated prices by sharing real-time, confidential data.
Instead of reducing costs, firms could potentially pay more if the data gathered by AI platforms exposes buying habits that suppliers can exploit. When using AI, it's imperative to know who has access to the data, whether the platform is closed or open source, whether the data is being sold or shared with third parties, and what protections exist against misuse.
The same questions apply to AI-assisted design tools. Traditional contracts typically define ownership of the work product among the owner, architect and engineer. But what about when AI creates or refines those designs through a series of collaboratively made inputs or prompts? Who owns the design then?
These issues can become especially critical when projects change hands. If an architect or engineer is fired or quits, can they retain and reuse the data that was entered into the project's AI? Does the project owner get the rights? Unless the contract addresses this, both sides may find themselves at a legal standstill.
Employee rights and insurance
Cameras have become increasingly common in every aspect of life, including on the jobsite. AI software integrated with cameras can monitor workers' safety compliance, detect exposed wiring, or identify other hazardous conditions. While the potential safety benefits are huge, the technology raises serious privacy concerns.
If employees are constantly being recorded, how are those recordings stored and used? Were all workers informed? Did they consent? Can footage be used for disciplinary action? What happens if those systems are hacked and sensitive