COCOS

A middleware for self-organizing sensor networks that uses data-parallel paradigms to enable flexible grouping of sensor nodes.

  • Duration: from 2004
  • Funding: DFG Project

In the near future, sensor networks will be deployed on a large scale. As the number of possible scenarios that use sensor networks grows continuously, the need for a suitable middleware approach becomes more and more pressing. At the moment, sensor networks are programmed at a fairly low level, which is time-consuming and error-prone. In this project, a middleware for self-organizing sensor networks was designed that uses data-parallel paradigms to enable flexible grouping of sensor nodes. These nodes can then be operated collectively, to aggregate values or to control distributed groups of actuators together.

The COCOS project was funded by the DFG (Deutsche Forschungsgemeinschaft, the German Research Foundation) within the SPP 1140. It started on October 1st, 2004 and consists of three layers.

Copra, the COmmunication PRocessing Architecture, is a communication framework that enables communication among different nodes. It supplies components such as routing or MAC, which can be combined according to the needs of the application.
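
As a rough illustration of such composable communication components, a protocol stack could be assembled from exchangeable layers as sketched below. The class and method names (Layer, CsmaMac, FloodingRouter, send) are invented for this sketch and are not the actual Copra API.

    #include <cstdint>
    #include <vector>

    // Hypothetical sketch of a composable protocol stack in the spirit of Copra;
    // all names and interfaces are invented for illustration only.
    struct Packet {
        std::vector<std::uint8_t> payload;
        std::uint16_t destination;
    };

    // Every component exposes the same minimal interface, so layers can be
    // stacked according to the application's needs.
    class Layer {
    public:
        explicit Layer(Layer* lower = nullptr) : lower_(lower) {}
        virtual ~Layer() = default;
        virtual void send(Packet& p) { if (lower_) lower_->send(p); }
    protected:
        Layer* lower_;
    };

    class CsmaMac : public Layer {
    public:
        void send(Packet& p) override {
            (void)p;   // carrier sensing and frame transmission would happen here
        }
    };

    class FloodingRouter : public Layer {
    public:
        using Layer::Layer;
        void send(Packet& p) override {
            // add a routing header, pick the next hop(s), then hand down to the MAC
            Layer::send(p);
        }
    };

    int main() {
        CsmaMac mac;
        FloodingRouter router(&mac);      // the stack is composed to fit the application
        Packet p{{0x01, 0x02}, 7};
        router.send(p);
    }
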
Chips, the Convenient High-level Invocation Protocol Suite, provides remote method calls, which allow one node to control another remotely.
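
At its core, such a remote call marshals a method identifier and its arguments into a message and dispatches it to the matching handler on the receiving node. The sketch below only illustrates this principle; the message layout and all names are assumptions, not the Chips wire format.

    #include <cstdint>
    #include <cstring>
    #include <iostream>
    #include <map>
    #include <vector>

    // Hypothetical remote-invocation sketch; not the actual Chips protocol.
    using MethodId = std::uint8_t;
    using Handler  = void (*)(const std::vector<std::uint8_t>&);

    std::map<MethodId, Handler> dispatchTable;

    // Caller side: marshal the method id and its argument into one message.
    std::vector<std::uint8_t> marshalCall(MethodId id, std::int16_t speed) {
        std::vector<std::uint8_t> msg(1 + sizeof(speed));
        msg[0] = id;
        std::memcpy(&msg[1], &speed, sizeof(speed));
        return msg;
    }

    // Callee side: look up and invoke the handler for the received method id.
    void dispatch(const std::vector<std::uint8_t>& msg) {
        auto it = dispatchTable.find(msg[0]);
        if (it != dispatchTable.end())
            it->second(std::vector<std::uint8_t>(msg.begin() + 1, msg.end()));
    }

    // A handler the remote node registers, e.g. to control its motors.
    void setMotorSpeed(const std::vector<std::uint8_t>& args) {
        std::int16_t speed;
        std::memcpy(&speed, args.data(), sizeof(speed));
        std::cout << "remote node sets motor speed to " << speed << "\n";
    }

    int main() {
        dispatchTable[1] = setMotorSpeed;   // registration on the callee
        dispatch(marshalCall(1, 200));      // "transmission" plus dispatch of the call
    }
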
Cocos, the COordination and COoperation Spaces, is the highest layer. Here, groups of nodes or even the whole sensor network can be controlled using collective operations. These operations include the control of sensors and actuators as well as in-network aggregation and preprocessing of values.
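
Conceptually, a collective operation lets the application address a whole group at once instead of node by node. The sketch below illustrates this idea with an invented NodeGroup abstraction (it is not the real Cocos spaces interface): a collective read that aggregates to the maximum value, and a collective write that drives all actuators of the group.

    #include <cstdint>
    #include <iostream>
    #include <utility>
    #include <vector>

    // Hypothetical sketch of collective operations over a node group;
    // the NodeGroup abstraction is invented here and is not the Cocos API.
    struct Node {
        std::uint16_t id;
        std::int16_t sample() const { return static_cast<std::int16_t>(id * 3 % 17); }  // dummy reading
        void setLamp(bool on) const {
            std::cout << "node " << id << " lamp " << (on ? "on" : "off") << "\n";
        }
    };

    class NodeGroup {
    public:
        void add(Node n) { nodes_.push_back(n); }

        // Collective read with in-network aggregation: only the maximum survives.
        std::pair<std::uint16_t, std::int16_t> maxSample() const {
            std::pair<std::uint16_t, std::int16_t> best{0, INT16_MIN};
            for (const Node& n : nodes_) {          // in the real network this folds hop by hop
                std::int16_t v = n.sample();
                if (v > best.second) best = {n.id, v};
            }
            return best;
        }

        // Collective actuator control: one call reaches every group member.
        void setLamps(bool on) const { for (const Node& n : nodes_) n.setLamp(on); }

    private:
        std::vector<Node> nodes_;
    };

    int main() {
        NodeGroup group;
        for (std::uint16_t id = 1; id <= 5; ++id) group.add({id});
        auto result = group.maxSample();
        std::cout << "highest value " << result.second << " at node " << result.first << "\n";
        group.setLamps(true);
    }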

Below are a few example videos showing the Lego RCX robots that were used in the development and evaluation of the project. Reflex, the operating system we use as the basis for Cocos, was ported to these robots.

The first two videos show the robots moving, which only requires Reflex. The robots drive around until they hit an obstacle, then turn in a random direction and continue their movement; a rough sketch of this behaviour follows the video links.

Random Motion 1
Random Motion 2
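
The behaviour shown in these clips is a simple reactive loop, sketched below in plain C++. This is not Reflex code: the hardware calls are stubs invented for the sketch, and the endless motion is bounded so the example terminates.

    #include <cstdlib>
    #include <iostream>

    // Stubbed hardware calls; on the RCX these would drive the real motors and bumper.
    static bool bumperPressed() { return std::rand() % 10 == 0; }
    static void driveForward()  { std::cout << "forward\n"; }
    static void turn(bool left) { std::cout << (left ? "turn left\n" : "turn right\n"); }

    int main() {
        for (int step = 0; step < 20; ++step) {   // bounded here; the robot keeps going forever
            driveForward();
            if (bumperPressed())
                turn(std::rand() % 2 == 0);       // random direction, then continue moving
        }
    }
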

The next two videos show a simple robot ballet. One robot decides the moves and transmits them to all others. This has been realized in three different ways: once with Copra, once with Chips, and once using the spaces from Cocos. As there is no visible difference, only two videos are shown here, with either 3 or 9 robots.

Robot Ballet 1
Robot Ballet 2

The final demonstrator of the first phase of the Cocos project is a sensing and aggregating scenario. It is divided into three phases. First, all robots move around randomly. On a signal from one of them, they all switch to the measurement phase. The third phase realizes in-network processing: the robot with the highest sampled value is determined and transmits it to a base station. To make this visible in the video, it turns around and switches its light bulb on. It also plays a tune. After this, the robots start again with the random motion of phase one.

Sensing and Aggregating, one lamp
Sensing and Aggregating, two lamps
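
The per-robot control cycle of this demonstrator can be sketched roughly as follows. Everything here is invented for illustration (the Phase enum, the Robot struct, the dummy sensor reading); it is not the actual Cocos code, and the in-network reduction is only simulated by computing the maximum centrally.

    #include <cstdint>
    #include <iostream>

    enum class Phase { RandomMotion, Measuring, Aggregating };

    struct Robot {
        std::uint16_t id;
        std::int16_t lastSample = 0;

        std::int16_t readSensor() const { return static_cast<std::int16_t>(id * 7 % 31); }  // dummy sensor

        void step(Phase phase, std::int16_t groupMaximum) {
            switch (phase) {
            case Phase::RandomMotion:
                break;   // drive around and avoid obstacles, as in the motion sketch above
            case Phase::Measuring:
                lastSample = readSensor();
                break;
            case Phase::Aggregating:
                // in-network processing has reduced all samples to the group maximum;
                // only the robot that produced it reports to the base station
                if (lastSample == groupMaximum)
                    std::cout << "robot " << id << " turns, lights its bulb and reports "
                              << groupMaximum << " to the base station\n";
                break;
            }
        }
    };

    int main() {
        Robot robots[3] = {{1}, {2}, {3}};

        // Phase 1: random motion until one robot signals the switch.
        for (Robot& r : robots) r.step(Phase::RandomMotion, 0);

        // Phase 2: every robot takes a measurement.
        std::int16_t maximum = INT16_MIN;
        for (Robot& r : robots) {
            r.step(Phase::Measuring, 0);
            if (r.lastSample > maximum) maximum = r.lastSample;
        }

        // Phase 3: the maximum is determined in the network and the winner reports.
        for (Robot& r : robots) r.step(Phase::Aggregating, maximum);
    }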