I&IC’s OwnCloud Core Processing Library (evolution)

Note: “I&IC OwnCloud Core Processing Library” (working title) is part of a home cloud kit, which was described in a previous post and which will be composed of four artifacts, both physical and digital.

The kit will be distributed freely at the end of the project.

-

[image: motivations_usages_problems_01_02]

One Processing project with one cloud, while the server (OwnCloud) can still be accessed through a regular interface (the OwnCloud client). The upper part (white) is the “network/server side” of the project (OwnCloud), hosted on a Linux server, while the bottom (grey dots) is the user or “client side”. It can consist of connected objects or environments, interfaces, or visualizations of various sorts.

 

The I&IC OwnCloud Core Processing Library is now composed of a “client side” component and a “server side” component (IICloud’s Addon).

The “client side” part of the library (“user side” vs. “network/server side” in the illustrations above and below) can be used from Processing to access OwnCloud server(s) and manipulate files. The benefit of the core library is that it mashes up a set of heterogeneous functionalities into one single library (it has therefore been renamed the I&IC OwnCloud Core Processing Library, as it is more closely related to our research).
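The library’s own API is not reproduced here; as an illustration of the underlying mechanism only, recall that OwnCloud exposes its files over WebDAV (at the well-known `/remote.php/webdav/` path) with HTTP Basic authentication. A minimal sketch of the two building blocks a client needs, using only the Java standard library — the helper names, server address and credentials below are placeholders, not part of the actual library:

```java
import java.nio.charset.StandardCharsets;
import java.util.Base64;

public class OwnCloudWebdav {

    // OwnCloud serves user files over WebDAV at this well-known path.
    static String webdavUrl(String serverBase, String remotePath) {
        if (serverBase.endsWith("/")) {
            serverBase = serverBase.substring(0, serverBase.length() - 1);
        }
        if (!remotePath.startsWith("/")) {
            remotePath = "/" + remotePath;
        }
        return serverBase + "/remote.php/webdav" + remotePath;
    }

    // Value of the HTTP "Authorization" header for Basic authentication.
    static String basicAuth(String user, String password) {
        String token = Base64.getEncoder()
                .encodeToString((user + ":" + password).getBytes(StandardCharsets.UTF_8));
        return "Basic " + token;
    }

    public static void main(String[] args) {
        // Placeholder server and credentials, for illustration only.
        System.out.println(webdavUrl("https://cloud.example.org", "sketch/data.txt"));
        System.out.println(basicAuth("demo", "demo"));
    }
}
```

A GET or PUT request against such a URL, carrying that header, is enough to read or write a file on the server; this is the kind of plumbing the library wraps behind Processing-friendly functions.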

The reasons for an I&IC OwnCloud Core Processing Library

Besides the reflection produced by the overall Inhabiting & Interfacing the Cloud(s) project and the related necessity to provide “access to tools” to a larger community (described at length in the founding document of the project and in a former post about the setting up of this library), new paradigms may arise in the global organization of server farms. These new paradigms may in return generate new ways to organize files on cloud servers (through a different handling of the redundancy principle, for example, or a different use of file duplication, etc.), allowing for new projects.

In order to address the stakes of the I&IC design research and to prepare such outputs/proposals, we have developed the OwnCloud Core Processing Library, which allows a software layer to be set up on top of the hardware layer.

 

To download and learn how to use the OwnCloud Core Processing Library, we’ve prepared a post in the Cook Books section of this site.

 

[images: owncloud_logo, processing2-logo]

Cookbook > How to set up Processing to use the OwnCloud Core Processing Library

We will describe how to use the OwnCloud Core Processing Library within the Processing framework, starting from a blank sketch. The library’s functions will be refined and new ones may be developed; additional libraries will also be added in order to propose higher-level functions more deeply linked to the IICloud(s) project.

 

[image: own_processing_logo]

Towards a new paradigm: Fog Computing

[image: Data-Gravity_big]

 

The Internet of Things is emerging as a model, and the network routing all the IoT data to the cloud is at risk of getting clogged up. “Fog is about distributing enough intelligence out at the edge to calm the torrent of data, and change it from raw data over to real information that has value and gets forwarded up to the cloud,” says Todd Baker, head of Cisco’s IOx framework. Fog Computing, which is somehow different from Edge Computing (we didn’t quite get how), is definitely a new business opportunity for the company, whose challenge is to package converged infrastructure services as products.

However, one interesting aspect of this new buzzword is that it adds something new to the existing model: after all, cloud computing is based on the old client-server model, except that the cloud is distributed by nature (ahem, even though data is centralized). That’s the big difference. There’s a basic rule that sums up the IT industry’s race towards new solutions: Moore’s law. The industry’s three building blocks are storage, computing and network. Computing power doubles every 18 months, and storage follows closely (its exponential curve is almost identical). However, if we graph network growth, it appears to follow a straight line.
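The gap between these growth curves can be made concrete with a little arithmetic. The 18-month doubling period comes from the text above; the linear slope for network capacity is an assumed illustrative value (here, +50% of the initial capacity per year), chosen only to show how quickly an exponential curve outruns any straight line:

```java
public class GrowthGap {

    // Capacity multiplier for a quantity that doubles every `doublingMonths`.
    static double exponentialFactor(double years, double doublingMonths) {
        return Math.pow(2.0, years * 12.0 / doublingMonths);
    }

    // Illustrative linear growth: a fixed increment of the initial capacity per year.
    static double linearFactor(double years, double incrementPerYear) {
        return 1.0 + incrementPerYear * years;
    }

    public static void main(String[] args) {
        double years = 9.0;
        // Compute/storage: 9 years = 6 doubling periods, so a factor of 2^6 = 64.
        System.out.println("compute/storage: x" + exponentialFactor(years, 18.0));
        // Network (assumed slope): 1 + 0.5 * 9 = a factor of 5.5.
        System.out.println("network:         x" + linearFactor(years, 0.5));
    }
}
```

Whatever slope one assumes for the straight line, the exponential curve eventually dwarfs it; that widening gap is the pressure behind the “data gravity” idea discussed next.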

Network capacity is a scarce resource, and that’s not going to change any time soon: it’s the backbone of the infrastructure, built piece by piece with colossal amounts of cables, routers and fiber optics. This constraint forces the industry to find disruptive solutions, and the paradigm arising from the clash between these growth rates now has a name: data gravity.

Maps of data center locations

Data centers are unevenly distributed, and it’s intriguing to observe the way they’re located spatially. It’s difficult to find world maps, but here are some examples I found interesting; the list is by no means exhaustive (many data centers are not documented). Also note that queries about data center geography on image search engines lead to weather-related visualizations (weather generally influences the energy/water consumption of this infrastructure).

The largest US data-centers (by Nicolas Rapp, data by Dave Drazen at Geo-Tel)

[image: DATA_MAP_FULL]

Cloud Computing workshop at CHI2011

It’s as if the human-computer interaction community hasn’t really addressed cloud computing yet, especially in the context of personal cloud services. An exception is the workshop “Designing interaction for the cloud”, organized by a team from Liverpool John Moores University. Their goal was to bring together researchers and practitioners from various fields and “examine the impact of cloud computing on the design of the user experience at the individual and organizational level”.

The workshop introductory paper highlights various research issues related to the following challenges:

  • Design for a fragmented user experience
  • Interoperability – goals and reality
  • Personal clouds and multi-sensory environments
  • The cloud in non-commercial application domains, e.g. medicine, education
  • Privacy and trust as UX issues
  • UI standards and processes in Cloud design

Reblog > Floating Datacenters

The prototypes of the “Google Navy” have been discovered on both coasts. But are they floating data centers? Or some kind of marketing facility for Google Glass? This perspective pushes further the question of the legal borders of the physical nature of data. It relates to our research in a sociological way, and makes me think of Sealand’s data center HavenCo in international waters (even if the scale of the infrastructure is in no way similar).

-

Via DataCenterKnowledge