Key Takeaways

  • Residents near Santiago ran a human-powered chatbot to draw attention to the water use of AI data centers
  • The 12-hour Quili.AI project processed more than 25,000 global requests
  • The initiative underscores mounting tensions over data center expansion in drought-stricken Chile

About 50 residents in Quilicura, a municipality on the outskirts of Santiago, spent their Saturday doing something that looked half whimsical, half pointed: manually powering a chatbot. Their project, Quili.AI, was designed to mimic the experience of prompting an AI system, but with humans typing the responses and sketching the images. It wasn't just a performance; it was a message about the hidden environmental costs tied to the region's growing concentration of data centers.

Organizers from the environmental group Corporación NGEN said the 12-hour experiment handled more than 25,000 prompts from across the world. Some requests were playful, like the user who wanted an image of a sloth playing in the snow. Instead of instant output, a volunteer politely asked the user to hold on because a human was on the job. A hand‑drawn image eventually followed: a cartoon-like sloth sitting in a mound of snowballs. Not exactly a multimodal transformer model, but that wasn't the point.

Quilicura has become a major hub for cloud infrastructure, especially hyperscale facilities. With that growth has come rising local concern about water usage. AI workloads, especially those running on high‑performance chips, have increased cooling requirements and energy demand globally. The specifics vary depending on equipment and climate, but the underlying tension remains the same: Who bears the resource burden of the digital services used everywhere else?

Quili.AI’s organizers framed the project as a reminder that every casual AI query has an environmental footprint—one that might be easy to ignore if you are thousands of miles away. Lorena Antiman, representing Corporación NGEN, said the goal was to spotlight the “hidden water footprint behind AI prompting” and to encourage more mindful use. Her point wasn’t that AI should be abandoned; instead, she argued that the rise of effortless querying should come with increased awareness. It is a perspective increasingly echoed in academic research, which has begun to quantify the water and energy costs of large-scale model training and inference.

Interestingly, some answers came faster than others. Local cultural queries—like how to make sopaipillas—were fielded quickly. When the volunteers didn't know something, they roamed the community center looking for someone who did. The process had a certain charm, almost a reminder of how knowledge once traveled in tight‑knit communities before automation filled the gaps. It raises a subtle question: just because we can get instant answers from AI, should we?

The backdrop here is heavier. Chile has faced more than a decade of severe drought, which experts say has contributed to a series of destructive wildfires. Water scarcity is not a hypothetical concern. And in drought‑prone regions, data centers are drawing fresh scrutiny for the water they use in cooling systems. Not every facility consumes water, of course; some rely solely on air cooling or other methods. But enough do that the debate persists.

Tech giants have already staked their claims in the greater Santiago area. Amazon, Microsoft, and Google are among the companies that have built or planned major data centers there. Google, for instance, has touted its Quilicura facility—launched in 2015—as the most energy‑efficient data center in Latin America. The company has also pointed to investments in wetlands restoration and irrigation near the Maipo River basin. That said, a separate Google project near Santiago faced a court challenge over water usage, showing how fraught the issue has become. Community skepticism doesn’t always disappear just because a company publishes sustainability stats or funds conservation work.

The Quili.AI stunt reflects a growing global narrative: the need to reconcile AI advancement with its environmental footprint. Industry leaders often highlight efficiency improvements, and they are not wrong—per‑compute efficiency has improved dramatically over the past decade. Yet total demand keeps growing faster. AI’s resource curve bends upward even as chips and cooling systems improve. This is why small, symbolic gestures like a human‑run chatbot resonate. They hint at the scale of the invisible machinery behind each prompt.

For businesses watching from afar, especially those operating or relying on cloud services, the Quili.AI project may feel like a fringe demonstration. But its themes are creeping into regulatory discussions, stakeholder expectations, and sustainability reporting frameworks. Enterprises that deploy AI at scale will likely face more pressure to account for—or even measure—the water and energy tied to their usage. Could demand‑side transparency become part of enterprise AI governance? It is not impossible.

This may be the unexpected power of a human‑drawn sloth in a pile of snowballs. It is an odd little artifact that hints at a larger question: in a world chasing ever‑faster digital experiences, how often do we pause to think about the real‑world resources enabling them?