I&IC Workshop #3 with Algopop at ECAL, brief: “Botcaves”

Note: I publish here the brief that Matthew Plummer-Fernandez (a.k.a. Algopop) sent me before the workshop he’ll lead next week (17-21.11) with 2nd and 3rd year BA Media & Interaction Design students at ECAL.

This workshop will take place in the frame of the I&IC research project, about which we had the occasion to exchange ideas prior to the workshop. It will investigate the idea of very low power computing, situated processing, data sensing/storage and automated data treatment (“bots”) that could be highly distributed into everyday life objects or situations. In doing so, the project will undoubtedly address the idea of “networked objects”, which, due to the low capacities of their computing parts, will become major consumers of cloud based services (computing power, storage). Yet, following the hypothesis of the research: what kind of non-standard networked objects/situations could emerge, based on what kind of decentralized, personal cloud architecture?

The subject of this workshop also explains some recent posts on this blog, which could serve as resources or tools during the workshop, as the students will work on personal “bots” that gather, process, host and expose data.

Stay tuned for more!





Algorithmic and autonomous software agents known as bots are increasingly participating in everyday life. Bots can potentially gather data from both physical and digital activity, store and share data in the ‘cloud’, and develop ways to communicate and learn from their databases. In essence, bots can animate data, making it useful, interactive, visual or legible. Although software-based, bots require hardware to run from, and it is this underexplored crossover between the physical and digital presence of bots that this workshop investigates.

You will be asked to design a physical ‘housing’ or ‘interface’, either bespoke or hacked from existing objects, for your personal bots to run from. These botcaves would be present in the home, workspace or elsewhere, permitting novel interactions between the digital and physical environments that these bots inhabit.

Raspberry Pis, template bot code, APIs, cloud storage, existing services (Twitter, IFTTT, etc) and physical elements (sensors, lights, cameras, etc) may be used in the workshop.
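As a hint of what such template bot code might look like, here is a minimal sketch of a sensing-and-publishing loop. The sensor reading is simulated (`read_sensor` is a hypothetical stand-in); on an actual Raspberry Pi it would be replaced by a GPIO or I2C call, and the message produced by `status_message` could be pushed to the Twitter or IFTTT services mentioned above.

```python
import json
import random
import time
from pathlib import Path


def read_sensor():
    """Simulated temperature reading in degrees Celsius.

    On a Raspberry Pi this would be replaced by an actual
    sensor call (GPIO, I2C, camera, etc.).
    """
    return round(random.uniform(18.0, 24.0), 1)


def store_reading(log_path, reading):
    """Append a timestamped reading to a local JSON log.

    This local log is the bot's own memory; it could also be
    synced to a personal cloud storage service. Returns the
    number of readings stored so far.
    """
    log = json.loads(log_path.read_text()) if log_path.exists() else []
    log.append({"t": time.time(), "value": reading})
    log_path.write_text(json.dumps(log))
    return len(log)


def status_message(log_path):
    """Summarise the stored data as a short message.

    A bot could tweet this, or send it to an IFTTT webhook.
    """
    log = json.loads(log_path.read_text())
    latest = log[-1]["value"]
    avg = sum(entry["value"] for entry in log) / len(log)
    return f"Latest: {latest}°C, average over {len(log)} readings: {avg:.1f}°C"


if __name__ == "__main__":
    path = Path("readings.json")
    store_reading(path, read_sensor())
    print(status_message(path))
```

The point of the sketch is the gather/store/expose cycle itself, not the specific sensor or service: each of the three functions marks a seam where a physical element or an existing cloud service could be plugged in.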



British/Colombian artist and designer Matthew Plummer-Fernandez makes work that critically and playfully examines sociocultural entanglements with technologies. His current interests span algorithms, bots, automation, copyright, 3D files and file-sharing. He was awarded a Prix Ars Electronica Award of Distinction for the project Disarming Corruptor, an app for disguising 3D print files as glitched artefacts. He is also known for his computational approach to aesthetics translated into physical sculpture.

For research purposes he runs Algopop, a popular tumblr that documents the emergence of algorithms in everyday life as well as the artists that respond to this context in their work. This has become the starting point for a practice-based PhD funded by the AHRC at Goldsmiths, University of London, where he is also a research associate at the Interaction Research Studio and a visiting tutor. He holds a BEng in Computer Aided Mechanical Engineering from King’s College London and an MA in Design Products from the Royal College of Art.



Maps of data center locations

Data centers are unevenly distributed, and it’s intriguing to observe the way they’re located spatially. It’s difficult to find world maps, but here are some examples I found interesting; the list is by no means exhaustive (many data centers are not documented). Also note that queries on image search engines about data center geography lead to weather-related visualizations (weather generally influences energy/water consumption for this infrastructure).

The largest US data-centers (by Nicolas Rapp, data by Dave Drazen at Geo-Tel)


Thingful, search engine for data

Note: Thingful is a “search engine” (beta version) for data and for the artifacts/sensors that produce data (the coming “Internet of Things”, so to say, but to date mainly weather stations, aircraft and Rastracks). Already quite loaded, the content of the search engine and its map will no doubt explode in the near future. It is a project by former creators of Pachube (in particular Usman Haque), which was an open webservice to “store, share and discover” data from realtime sensors. Pachube was sold to LogMeIn in 2011 (somehow a sad destiny for an open data project, but that is a different story) and then became Xively.



Amazon presents Echo, a new cloud-enabled AI for the home

Thursday, November 13, 2014

Echo is a connected object for the home, activated by voice recognition. It’s a loudspeaker connected to the “cloud” via Wifi, so its main use seems to be streaming music. It’s apparently able to understand and answer queries phrased in “natural language”, like “Play some Henry Mancini” (which activates your Amazon Music Library, Prime Music, TuneIn or iHeartRadio account). Of course, its main features are shopping-oriented, but a few aren’t: you can ask for information about, say, Ronald Reagan, and it taps into Wikipedia and reads the page; it’s linked to weather forecasting services, so you can ask about tomorrow’s weather; and you can manage personal to-do lists. Unsurprisingly, “Echo’s brain is in the cloud, running on Amazon Web Services so it continually learns and adds more functionality over time”. The object also has a dedicated control app, which runs on Fire OS, Amazon’s new smartphone operating system.

I&IC Workshop #2 with James Auger at HEAD: UI proposals

As a follow-up to the scenarios produced in our second workshop, we decided to specify the type of user interfaces that could emerge from the projects proposed by the participants. More specifically, we took each of the contexts they worked on and created a set of graphical user interfaces to show how the cloud computing service might appear. They are not meant to be exclusive, but they illustrate the functions and possible usages discussed during the workshop. To some extent, they summarize the findings in a pragmatic way. Each contextual category features 2 or 3 interfaces that reflect different types of target groups: user, system administrator, priest, sport coach, etc.



SQM: The Quantified Home, (2014). Edited by Space Caviar

Note: an interesting project/book by Space Caviar about the “house” under the pressure of “multiple of forces - financial, environmental, technological, geopolitical -”, to read in the frame of I&IC. Through its title, the book obviously addresses the question of domesticity immersed in technologies and the monitoring of its data.

While our project gravitates around “networked objects/spaces”, the question of their monitoring, as well as of the production and use of data (“pushed” into the cloud?), immediately arises, of course.

In this context, we must also point out Google’s and Apple’s efforts to tap into the “quantified house” with Nest and HomeKit.


Via Space Caviar