Log your Android resources on Cosm!

Inspired by CosmX (announced on the Cosm Blog as a recent hack-day project), which updates a Cosm feed with your system’s current CPU usage, I have developed a similar Android app.

The app logs CPU usage, available memory, data usage (transmitted and received KB) and battery level:

When you start the app, it asks for a Cosm Feed ID and a key (with update and create permissions). Users can also configure which resources will be logged and how often the feed will be updated (1 minute, 5 minutes, 15 minutes, 30 minutes, 1 hour or 2 hours).

The CPU usage and available memory are polled every 10 seconds, and the overall average is used to update the feed.

I am thinking of adding more features, like call duration and number logs, especially if users like the app and find it useful. A background service is used to monitor and log the resources; users can start/stop the service through the application’s main interface.
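For those curious about the implementation, here is a minimal sketch (not the actual app code) of how such a background logging service can poll and average a resource, using CPU load read from /proc/stat as the example. The LoggerService name is hypothetical, and the Cosm upload and the user-configured update interval are left out.

import android.app.Service;
import android.content.Intent;
import android.os.Handler;
import android.os.IBinder;

import java.io.BufferedReader;
import java.io.FileReader;

public class LoggerService extends Service {

    private final Handler handler = new Handler();
    private long prevIdle = 0, prevTotal = 0;
    private double cpuSum = 0;
    private int samples = 0;

    private final Runnable poller = new Runnable() {
        public void run() {
            cpuSum += readCpuUsage();
            samples++;
            // At the configured update interval the average (cpuSum / samples) would be
            // sent to the Cosm feed and the accumulators reset; that part is omitted here.
            handler.postDelayed(this, 10000); // poll every 10 seconds
        }
    };

    // Reads the aggregate "cpu" line of /proc/stat and returns the CPU usage
    // fraction since the previous call (0 on the first call or on error).
    private double readCpuUsage() {
        try {
            BufferedReader reader = new BufferedReader(new FileReader("/proc/stat"));
            String[] f = reader.readLine().split("\\s+"); // "cpu user nice system idle iowait ..."
            reader.close();
            long idle = Long.parseLong(f[4]);
            long total = 0;
            for (int i = 1; i < f.length; i++) total += Long.parseLong(f[i]);
            double usage = (prevTotal > 0 && total > prevTotal)
                    ? 1.0 - (double) (idle - prevIdle) / (total - prevTotal) : 0;
            prevIdle = idle;
            prevTotal = total;
            return usage;
        } catch (Exception e) {
            return 0;
        }
    }

    @Override
    public int onStartCommand(Intent intent, int flags, int startId) {
        handler.post(poller);            // started from the app's main interface
        return START_STICKY;
    }

    @Override
    public void onDestroy() {
        handler.removeCallbacks(poller); // stopped from the app's main interface
    }

    @Override
    public IBinder onBind(Intent intent) {
        return null;
    }
}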

You can download the apk and Android source files from: http://code.google.com/p/cosm-android-logger

You can also get the application through Google Play:

QRCode

https://play.google.com/store/apps/details?id=doukas.cosm.androidresources

EDIT (29/6/12): Entering the Cosm key manually really sucks, so I have updated the app to use a QR code reader for that purpose. Simply go to http://goqr.me/ or any alternative and generate a QR code with your key. Then use the ‘Read QR code’ button ;-)
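(For reference, one common way to implement such a ‘Read QR code’ button at the time was to delegate the scan to the ZXing Barcode Scanner app via an intent; whether the app does exactly this is my assumption, and the request code below is arbitrary. The methods live inside the settings/main Activity.)

private static final int REQUEST_QR = 0x0ba7; // arbitrary request code

private void readKeyFromQrCode() {
    // Delegate the scan to the ZXing Barcode Scanner app (must be installed)
    Intent intent = new Intent("com.google.zxing.client.android.SCAN");
    intent.putExtra("SCAN_MODE", "QR_CODE_MODE");
    startActivityForResult(intent, REQUEST_QR);
}

@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    if (requestCode == REQUEST_QR && resultCode == RESULT_OK) {
        String cosmKey = data.getStringExtra("SCAN_RESULT"); // the API key encoded in the QR code
        // store the key (e.g. in SharedPreferences) and use it for the feed updates
    }
}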

The GeoSensor – A low-cost weather station

The GeoSensor is a new IoT-based project by Hugo Lavalle, an Arduino and IoT fan.

The main goal of the project is to provide makers with a low-cost weather station that gives users direct access to meteorological data.

The idea behind this effort comes from the fact that Hugo is a sailor and has always wanted access to meteorological data from the place where he usually sails. A commercial weather station is too expensive, so he decided to build one himself.

These are the main components proposed for the basic version of the project:

  • Wind and rain sensors
  • Arduino microcontroller
  • Ethernet shield
  • Temperature sensor
  • Cables, electrical material, Arduino case, etc.

Hugo has just launched a crowd-funding campaign for his project, and it is going pretty well. If the project gets funded, he will donate the first station to the marina he uses for sailing, located at the Parque da Cidade in Jundiaí/SP.

Source code and h/w schematics will be released as Open Source at the end of the project.

You can help Hugo with his project here.

If the fundraising goes well, Hugo plans to add more features, like:

  • Including more sensors, like air and soil humidity.
  • Enabling alarms for heavy rains that may cause flooding and mudslides, frequent problems in Brazil during summer rains.
  • Investigating the possibility of measuring river levels.
  • Building a wireless station, with WiFi or a ZigBee/Ethernet gateway.
  • GPS integration.
  • Building a mobile station with a GPRS connection.
  • Using solar power.

Hugo reports that the book has helped him a lot with his project! Thank you so much Hugo and good luck with your project! :-)

UPDATE (26/6/2012): Hugo reports that once the first prototype is ready, he will propose the creation of a national GeoSensors network. In that network, each participant would be a maker and/or manager of a GeoSensor. The resources could also be obtained from crowd funding. More details are on the website where Hugo shares the progress of the prototype.


Feedback from OpenIoT – Notes from “The Body” discussion group

The OpenIoT weekend is over and, I have to admit, I had never thought it would be such an interesting and inspiring event! Lots of great people, a great place (the Google Campus) and great food as well!

People who attended had different backgrounds and views on open data and user rights, but after two long days full of discussions (and debates in some cases) we all managed to form a basic document that describes the fundamental rights of users for enabling the Open Internet of Things.

The document mainly addresses issues such as:

  • Licensing (Data ownership, licensors, creators, etc.)
  • Accessibility (Royalties, parsers, etc.)
  • Timeliness (data resolution, etc.)
  • Privacy (what data is being collected, why, etc.)
  • Transparency

You can find the document here (and yes, please do put your name on it if you support what you read)!

On the first day I was leading “The Body” discussion group (about the Quantified Self, wellness, user location tracking, etc.). More than 10 people attended and contributed great ideas and concerns about how such data should be treated and what the users’ rights are.

As promised, I will soon compile the notes and put them online (probably in the existing Google Group) so that people can review them, comment, and carry the discussion on online, so that we can come up with something more concrete about QS and Open Data in the IoT!

For now, I am posting here the notes that contain the main highlights of the discussion (special thanks to Christine Perey for writing them down and also for being very inspiring!)


Check some photos from the event here.

If you missed the event, or missed following the #openiot hashtag on Twitter, you can see the whole (tweeted) event stream here!

Many thanks to the people who worked hard to make this event such a huge success: Alex Sonsino, Diana Proca, the volunteers, Usman and Ed from Cosm, Trevor Harwood (for his great effort on the initial draft and the Google Group), the speakers, and anyone else I might have forgotten. Many thanks to everyone who attended and contributed to the Document!

It’s been really nice to meet you guys!

The Internet of Things can be Open; it is our right to demand it, and it is our right to shape the way it can be!

Open the Internet of Things! Spread the news to the world! ;-)

You can also check the weekend’s write-up from Andrew Back here.

Talk/Listen to your Cosm feeds!

Wouldn’t it be cool if you could use your voice on your mobile device to retrieve information about your Cosm feeds (and maybe activate a trigger)? This way you could learn your room’s temperature (or whatever else you might be monitoring with your favorite microcontroller platform) while driving, jogging, etc.!

When this idea came to mind I first thought of Siri. Then I turned on the WiFi on my iPhone and tried to find out what Siri knows about the IoT and Cosm. The first question was very general, to see if Siri had any idea what I was talking about when I wanted to find out my room’s temperature. The response was completely disappointing!

One could say that I should not have expected Siri to be able to answer such a question (well, at least it could have searched Google for the temperature :P ), so I tried again with a more… let’s say… fundamental question!

(Side note: I expect such a voice recognition system to have difficulties understanding my Greek-English accent, but despite Cosm being much easier to pronounce than Pachube, Siri thought on every attempt that I was asking about ‘cousin’ and not ‘Cosm’. I had to manually edit the query and correct it.)

Siri thought I was just kidding around; then I realized that maybe Siri is not yet familiar with Cosm’s new name:

That was only just a bit closer! Cosm guys, if you read this, you need to talk to Apple :P

Since there is no official voice application for Android, I gave up and decided to build one myself. As I didn’t want to jailbreak my iPhone and I am not very familiar with Objective-C, I used Android for that purpose.

I have developed a small app that requires only the tap of a button and then asks you for the feed id you are interested in. It looks for the available datastreams, asks which one you are interested in and reports the current datastream value. Finally, it asks whether you would like to activate a trigger or not (not a Cosm trigger, just a REST call to a remote server). Voice recognition works pretty well; the voice SDK needs improvement, but it is quite acceptable for a proof of concept!
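For those interested in the Android side, the app builds on two standard APIs: RecognizerIntent for speech input and TextToSpeech for the spoken replies. The snippet below is a minimal sketch of that round trip inside the app’s Activity, not the actual app code; the method names are illustrative and the Cosm lookup and trigger call are omitted.

// Requires android.speech.RecognizerIntent, android.speech.tts.TextToSpeech,
// android.content.Intent imports; tts is initialized in onCreate().
private TextToSpeech tts;
private static final int REQUEST_SPEECH = 1;

private void askForFeedId() {
    // Speak the question, then launch the built-in speech recognizer
    tts.speak("Which feed are you interested in?", TextToSpeech.QUEUE_FLUSH, null);
    Intent intent = new Intent(RecognizerIntent.ACTION_RECOGNIZE_SPEECH);
    intent.putExtra(RecognizerIntent.EXTRA_LANGUAGE_MODEL,
            RecognizerIntent.LANGUAGE_MODEL_FREE_FORM);
    intent.putExtra(RecognizerIntent.EXTRA_PROMPT, "Say the Cosm feed id");
    startActivityForResult(intent, REQUEST_SPEECH);
}

@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    if (requestCode == REQUEST_SPEECH && resultCode == RESULT_OK) {
        // The recognizer returns candidate transcriptions, best match first
        String feedId = data.getStringArrayListExtra(RecognizerIntent.EXTRA_RESULTS).get(0);
        // ...fetch the feed's datastreams from the Cosm API, ask which one,
        // read its current value out loud, then offer to fire the (REST) trigger.
    }
}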

See a small demo video here:

The trigger is just a REST call on a website, not an actual Cosm trigger!

The complete Android source can be found here. Warning: the code is really a mess; I just made it as a proof of concept. It works quite well, but it could definitely be optimized and implemented using threads! Feel free to use it, modify it, improve it!

Leading ‘The Body’ discussion group @Open Internet of Things Assembly

The committee of the OpenIoT has honored me by assigning me the lead of ‘The Body’ discussion group on the first day (June 16th).

There will be a small presentation about the Quantified Self (QS), how and why people can track information about themselves and the context around them for their own benefit, and how this is related to the Open Internet of Things and Data Control/Privacy/Licensing.

The main aim of the session is to discuss QS Data Control/Privacy/Licensing and generate up to 5 key points related to QS that should be added to the Open IoT framework. More information here.

Get your Open IoT Assembly ticket:

Eventbrite - Open Internet Of Things Assembly

The full schedule:

Interesting IoT findings and some news

I thought it would be nice to share some interesting IoT findings of the last few days:

A ship transmits location, speed and destination to Cosm! 

https://cosm.com/feeds/3818 The captain must be a real IoT fan!


I have found interesting patterns in my heart rate while looking at the data history on Cosm.

As explained in a previous post (My Heart Rate on the (Cosm) Cloud!), I am using an ear-clip heart pulse monitor and my Android phone to transmit my average pulse rate (bpm) to Cosm (https://cosm.com/feeds/59362). Looking back at the data I notice interesting patterns in my pulse: I can easily find out when I am sitting and working (most of the time), when I am mobile, and when I am wearing the sensor after lunch! The most interesting thing, however, is how easy it is nowadays to save and visualize personal data online. I remember back in 2006, when I first started doing research on on-body sensors, we had to build two applications: one for collecting the data from the sensors and one for storing and visualizing them on a computer or a mobile phone (the Windows CE and J2ME days!).


New hardware platforms for the IoT are being published and are dww (definitely worth watching):

The first one is a Kickstarter project asking for your support. Tōd connects real-world actions to mobile devices and to the Web. It consists of ultra-small, low-power, Bluetooth 4.0-enabled Smart Beacons that are programmable and come with great features.


The second one is electric imp, a new startup that has managed to build a tiny WiFi card (the size of an SD card) at very low cost (around $25) with low power consumption. It also comes with great dev kits that enable direct Arduino communication.

The IoT Open Assembly is gaining publicity and attendees. It will be a great event featuring interesting talks and will also give participants the opportunity to contribute to the Open Internet of Things Document and Vision. I am really excited to attend the event.

They have recently released the event schedule, so check it out and book an early ticket for ‘being part of the Open Internet of Things’ by June 1st!


Cosm on mobile devices
Some of you might already know that I have built an Android application for viewing feeds on Pachube (I have not updated it for the Cosm rebrand yet): https://play.google.com/store/apps/details?id=pachube.andorid
Now I am developing something more special, named ‘VocalCosm’…details will follow soon!

My Heart Rate on the (Cosm) Cloud!

Recently Seeedstudio (many thanks!) has provided me with a Grove Heart Rate ear-clip sensor:

This cool (and very low-priced) sensor is attached to your ear and detects your heart’s pulse by transmitting infrared light and measuring the absorption variation caused by the blood flow in your ear lobe. The product’s site also provides the Arduino code for detecting the beats and calculating an average heart rate (in bpm - beats per minute). The sensor comes with a Grove connector, so setting up and running the code took less than 5 minutes! (Thanks again @seeedstudio for providing me with a complete Grove kit.)

After playing with it a while I realized that I could make a cool Cloud-based heart rate tracker by simply using an ADK board and my Android phone. This way I could be completely mobile (given that the 9V battery that powers the ADK board can last!).

I modified the Arduino code to send the heart rate to the Android phone using ADB, and also made a simple Android app that takes the heart rate and sends it to Cosm (formerly Pachube) using the jpachube library.

This is the graph generated by Cosm. (EDIT: The battery has died, I will put the sensor on again sometime tomorrow. EDIT (14/05): I’ve put it on again for a few hours!)

EDIT: I will occasionally put on the sensor during the following days.

Despite being very mobile (the cable is long enough to reach my pocket, where both boards and the mobile phone are), I am sure the graph-feed will stop being live quite soon (I will either get bored, the battery will die, or I will take it off to go to sleep…).

The code for the Arduino is the following:


/************************* 2011 Seeedstudio **************************
* File Name : Heart rate sensor.pde
* Author : Seeedteam
* Version : V1.0
* Date : 30/12/2011
* Description : This program can be used to measure heart rate;
* the lowest pulse rate the program can measure is set to 30 bpm.
*************************************************************************/

//Modified by @BuildingIoT
//for communication with Android

#include <SPI.h>
#include <Adb.h>

// Adb connection.
Connection * connection;

// Elapsed time for ADC sampling
long lastTime;

unsigned char pin = 13;
unsigned char counter = 0;
unsigned int heart_rate = 0;
unsigned long temp[21];
unsigned long sub = 0;
volatile unsigned char state = LOW;
bool data_effect = true;
// You can change this to suit your system; 2000 means 2 seconds.
// The measurement restarts if the interval between two beats exceeds it.
const int max_heartpluse_duty = 2000;

void setup() {
  pinMode(pin, OUTPUT);
  Serial.begin(9600);
  //Serial.println("Please put on the ear clip.");
  delay(5000); // give the user time to put on the ear clip
  array_init();
  //Serial.println("Heart rate test begin.");
  attachInterrupt(0, interrupt, RISING); // set interrupt 0, digital pin 2

  // Initialise the ADB subsystem.
  ADB::init();

  // Open an ADB stream to the phone on tcp port 4567. Auto-reconnect.
  connection = ADB::addConnection("tcp:4567", true, adbEventHandler);
}

void loop() {
  digitalWrite(pin, state);
}

void sum() // calculate the heart rate
{
  if(data_effect)
  {
    heart_rate = 1200000 / (temp[20] - temp[0]); // 60*20*1000 / total time of 20 beats
    //Serial.print("Heart_rate_is:\t");
    Serial.println(heart_rate);
    // Send the heart rate to the Android app as a 2-byte value over the ADB connection
    connection->write(2, (uint8_t*)&heart_rate);
    ADB::poll();
  }
  data_effect = 1; // reset the validity flag
}

void interrupt()
{
  temp[counter] = millis();
  state = !state;
  //Serial.println(counter,DEC);
  //Serial.println(temp[counter]);
  switch(counter)
  {
    case(0):
      sub = temp[counter] - temp[20];
      //Serial.println(sub);
      break;
    default:
      sub = temp[counter] - temp[counter-1];
      //Serial.println(sub);
      break;
  }
  if(sub > max_heartpluse_duty) // restart if the interval between two beats exceeds 2 seconds
  {
    data_effect = 0; // mark the measurement as invalid
    counter = 0;
    Serial.println("Heart rate measure error, test will restart!");
    array_init();
  }
  if(counter == 20 && data_effect)
  {
    counter = 0;
    sum();
  }
  else if(counter != 20 && data_effect)
    counter++;
  else
  {
    counter = 0;
    data_effect = 1;
  }
}

void array_init()
{
  for(unsigned char i = 0; i != 20; ++i)
  {
    temp[i] = 0;
  }
  temp[20] = millis();
}

// Event handler for the ADB connection.
void adbEventHandler(Connection * connection, adb_eventType event, uint16_t length, uint8_t * data)
{
}

For the Android app, all that is needed is an Activity that runs the ADB (MicroBridge) server and communicates with the Arduino board:


package buildingiot.heartrate;

import java.io.IOException;

import android.app.Activity;
import android.os.Bundle;
import android.util.Log;
import android.widget.TextView;

import org.microbridge.server.Server;
import org.microbridge.server.AbstractServerListener;

public class HeartRateOnCloudActivity extends Activity {

    // Create TCP server (based on MicroBridge LightWeight Server).
    // Note: This Server runs in a separate thread.
    Server server = null;

    int heartrate = 0;

    TextView textView1;

    /** Called when the activity is first created. */
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);

        textView1 = (TextView) findViewById(R.id.textView1);

        // Create TCP server (based on MicroBridge LightWeight Server)
        try {
            server = new Server(4567); // must match the port used in the Arduino sketch ("tcp:4567")
            textView1.setText("Starting server..");
            server.start();
            textView1.setText("server started!");
        } catch (IOException e) {
            Log.e("Seeeduino ADK", "Unable to start TCP server", e);
            textView1.setText("server not started!!");
        }

        server.addListener(new AbstractServerListener() {

            @Override
            public void onReceive(org.microbridge.server.Client client, byte[] data) {
                if (data.length < 2) return;
                // The Arduino sends the bpm as a 2-byte little-endian integer
                // (see connection->write() in the sketch above).
                heartrate = (data[0] & 0xFF) | ((data[1] & 0xFF) << 8);
                // onReceive() runs on the server thread, so update the UI on the UI thread
                runOnUiThread(new Runnable() {
                    public void run() {
                        textView1.setText(heartrate + " bpm");
                    }
                });
            }
        });
    }
}
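The Activity above only displays the bpm value; in the app itself the upload is handled by the jpachube library. As a rough illustration of what that update amounts to, the same thing can be done with a plain HTTP PUT to the Cosm v2 API in CSV format. The datastream name below is a placeholder, the API key is yours, and the call must run off the UI thread.

// Needs java.net.URL, java.net.HttpURLConnection and java.io.OutputStreamWriter imports.
// Illustration only: push the latest bpm to the Cosm feed with a raw HTTP PUT.
private void sendToCosm(int bpm) throws IOException {
    URL url = new URL("https://api.cosm.com/v2/feeds/59362.csv");
    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
    conn.setRequestMethod("PUT");
    conn.setRequestProperty("X-ApiKey", "YOUR_COSM_API_KEY"); // your key
    conn.setDoOutput(true);
    OutputStreamWriter out = new OutputStreamWriter(conn.getOutputStream());
    out.write("heart_rate," + bpm);   // CSV body: datastream id (placeholder), current value
    out.close();
    Log.d("HeartRateOnCloud", "Cosm responded: " + conn.getResponseCode());
    conn.disconnect();
}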

To make it all work you need to have an ADB-enabled Arduino board like this one.

More examples on Android and Arduino communication can be found in my book.

Controlling your coffee machine and your room’s light through the Web

(a.k.a. the SmoothWakeAlarm Project).

This is a smooth-wake-alarm system based on the Flyport Ethernet and Lighting Nest. The system can automatically dim your room light before you wake up and also activate the coffee machine on time. An Android app allows you to set the internal alarm clock and an alarm for activating the coffee machine.

The app communicates with the Flyport web server through a REST API. The Flyport includes all the essential libraries for time management (internal alarm, setting time automatically through SNTP). The Lighting Nest has 3 relay switches on board that can be used to activate any external circuit.

Equipment used:

  • 1 Flyport Ethernet module
  • 1 Flyport Lighting Nest
  • 1 LED
  • 1 Resistor (optional, for the LED)
  • Power source

Implementation:

The project consists of two parts: a) the Android application that will be used for setting the wake alarm and setting the time for dimming the light (LED) and activating the coffee machine (using the Lighting Nest’s relay switch). b) The hardware for activating the coffee machine and dimming the LED.

The Flyport Ethernet provides all the essential functionality for receiving the wakeup and coffee machine activation time (through the embedded Web Server) and includes libraries for time control (like getting actual time through SNTP and setting an internal alarm).

The Lighting Nest provides relay switches (up to 5A) that can be used to directly activate the coffee machine.

To emulate the light dimming, PWM is used on a LED. Lighting Nest provides a PWM output on the EXPANSION connector.

One of the cool things is that the time handling is performed completely on the Flyport module since the accompanying libraries make time handling a piece of cake!

The code:

The Flyport comes with a very handy and simple IDE for editing, compiling and uploading the code to the board. Flyport projects consist of several source files (many of them auto-generated when creating the project through the IDE wizard). Two basic ones need editing when creating a custom project: HTTPApp.c and taskFlyport.c. The first includes all the essential libraries and routines for implementing the HTTP communication part of the Flyport web server (this includes handling the GET requests from clients), and the second handles the logic flow of the program (variable initialization, main execution loop, etc.).

A sample of the HTTPApp.c that handles the GET requests for setting the wake-up and coffee machine activation times follows:


HTTP_IO_RESULT HTTPExecuteGet(void)
{
    BYTE *hrs;
    BYTE *mins;
    BYTE filename[20];

    // STEP #1:
    // The function MPFSGetFilename retrieves the name of the requested cgi,
    // in this case "alarm.cgi", and puts it inside the filename variable.
    // Make sure BYTE filename[] above is large enough for your longest name
    MPFSGetFilename(curHTTP.file, filename, 20);

    // STEP #2:
    // Handling of the cgi requests

    if(!memcmp(filename, "alarm.cgi", 9)) // Is the requested file name "alarm.cgi"?
    {
        // STEP #3:
        // The complete request is contained inside the system variable curHTTP.data.
        // Using the function HTTPGetArg it is possible to read the arguments
        // of the cgi request from curHTTP.data. In this case we read the
        // arguments "hours" and "minutes" from the request
        // "alarm.cgi?hours=x&minutes=y" and assign them to the respective variables.

        hrs = HTTPGetArg(curHTTP.data, (BYTE *)"hours");
        mins = HTTPGetArg(curHTTP.data, (BYTE *)"minutes");
        hours = atoi((char*)hrs);
        minutes = atoi((char*)mins);
        alarmSet = TRUE;
        UARTWrite(1,"Got Alarm request!");
    }

    if(!memcmp(filename, "coffee.cgi", 10)) // Is the requested file name "coffee.cgi"?
    {
        hrs = HTTPGetArg(curHTTP.data, (BYTE *)"hours");
        mins = HTTPGetArg(curHTTP.data, (BYTE *)"minutes");
        coffeeHours = atoi((char*)hrs);
        coffeeMinutes = atoi((char*)mins);
        coffeeAlarm = TRUE;
        UARTWrite(1,"Got Alarm request!");
    }
    return HTTP_IO_DONE;
}

The taskFlyport.c, which uses both the internal clock (RTCC) and SNTP for getting the real time from the Internet, is the following:


#include "taskFlyport.h"
#include "time.h"
#include "rtcc.h"

//SETTINGS
int GMT_hour_adding = 3;
int LightDimmingOffset = 5; //Define how many minutes before alarm to start dimming the Light

//TIME VARIABLES
time_t now;
struct tm *ts;
DWORD epoch=0;
DWORD epochtime=0xA2C2A;
t_RTCC mytime;

char dateUTC[100];
char dateUTC1[50];
//HELPFUL VARIABLES
extern BOOL alarmflag;
extern int hours, minutes;
extern int coffeeHours, coffeeMinutes;
extern BOOL alarmSet;
extern BOOL coffeeAlarm;

//Dimming variables
BOOL startDimming;
int dimmStep;
int dimmPercentage;
void FlyportTask()
{
// Flyport waiting for the cable connection
while (!MACLinked);
vTaskDelay(100);
UARTWrite(1,"Flyport ethernet connected to the cable... hello world!\r\n");

vTaskDelay(200);
UARTWrite(1,"waiting...");

while(epoch<epochtime) {
vTaskDelay(50);
epoch=SNTPGetUTCSeconds();
}

UARTWrite(1, "done!\r\n");
coffeeHours = 0;

//GET SNTP time and set it:
epoch=SNTPGetUTCSeconds();
now=(time_t)epoch;
ts = localtime(&now);

ts->tm_hour = (ts->tm_hour + GMT_hour_adding);
// Correct if overflowed hour 0-24 format
if(ts->tm_hour > 24) {
ts->tm_hour = ts->tm_hour - 24;
}
else if(ts->tm_hour < 0) {
ts->tm_hour = ts->tm_hour +24;
}

//Set the RTC according to SNTP time
mytime.year = 12;
mytime.month = 4;
mytime.day = 21;
mytime.hour = ts->tm_hour;
mytime.min = ts->tm_min;
mytime.sec = ts->tm_sec;
RTCCWrite (&mytime);

//Initialize dimming variables
dimmStep = 100/LightDimmingOffset;
dimmPercentage = dimmStep;

while(1)
{
// Main user's firmware loop

//Read the time and check for alarms set
epoch=SNTPGetUTCSeconds();
now=(time_t)epoch;
ts = localtime(&now);

ts->tm_hour = (ts->tm_hour + GMT_hour_adding);
// Correct if overflowed hour 0-24 format
if(ts->tm_hour > 24) {
ts->tm_hour = ts->tm_hour - 24;
}
else if(ts->tm_hour < 0) {
ts->tm_hour = ts->tm_hour +24;
}

if(coffeeAlarm==TRUE) {
//set the coffee machine alarm:
t_RTCC myalarm;
myalarm.hour = coffeeHours;
myalarm.min = coffeeMinutes;
myalarm.sec = 00;
RTCCSetAlarm(&myalarm, 1, EVERY_DAY);
RTCCRunAlarm(1);

coffeeAlarm = FALSE;

//debug:
UARTWrite(1,"Coffee Alarm set!\r\n");
}

strftime(dateUTC1, sizeof(dateUTC1), "%H:%M.%S", ts);

sprintf(dateUTC,"%s\r\n",dateUTC1);
UARTWrite(1,dateUTC);

//Check the dimming alarm
if(alarmSet==TRUE) {
//check if it is dime to start dimming the light:
if(minutes < LightDimmingOffset) {
if((hours -1) == ts->tm_hour && (60-LightDimmingOffset+minutes) == ts->tm_min) {
//start dimming!
startDimming = TRUE;
alarmSet = FALSE;
//turn on the lamp
PWMInit(1,1000,5);
PWMOn(11, 1);
}

}

else {
if(hours == ts->tm_hour && (minutes-LightDimmingOffset) == ts->tm_min) {
//start dimming!
startDimming = TRUE;
alarmSet = FALSE;

//to be used with PWM on Flyport
//turn on the lamp
PWMInit(1,1000,5);
PWMOn(11, 1);
}

}

//Debug:
UARTWrite(1,"Alarm Set\r\n");
}

//Check if the coffee alarm has turned on
if (alarmflag==FALSE) {
UARTWrite(1,"Alarm not triggered\r\n");
}
else {
UARTWrite(1,"Alarm triggered!!!\r\n");
alarmflag = FALSE;
//turn on the coffee machine:
IOPut(P21, toggle);
}

//to be used with PWM on Flyport
//check if dimming has started:
if(startDimming==TRUE && dimmPercentage<100) {
PWMDuty(dimmPercentage, 1);
dimmPercentage+=dimmStep;
}

UARTFlush(1);

//Delay the loop for 10 secs
vTaskDelay(1000);
}
}

The Android application simply has a UI for setting the internal alarm clock and the coffee machine activation time. Both times are sent through GET requests to the Flyport web server, which handles the rest.
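The alarm.cgi and coffee.cgi endpoints and their hours/minutes parameters come from the HTTPApp.c listing above; a minimal sketch of the Android side of such a request could look like the following (the Flyport’s IP address is an assumption, and the call must run off the UI thread).

// Needs java.net.URL, java.net.HttpURLConnection and android.util.Log imports.
// coffee.cgi is handled the same way, just with a different path.
private void setWakeAlarm(int hours, int minutes) throws IOException {
    URL url = new URL("http://192.168.1.115/alarm.cgi?hours=" + hours + "&minutes=" + minutes);
    HttpURLConnection conn = (HttpURLConnection) url.openConnection();
    Log.d("SmoothWakeAlarm", "Flyport replied: " + conn.getResponseCode());
    conn.disconnect();
}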

Check the video with the demo:

Get the complete Flyport and Android code here.

SuperDuplex: An InfraRed Bootloader for Arduino

NoMi Design has implemented a very cool Arduino bootloader based on 38 kHz infrared remote modulation. The coolest thing is that you can use the Arduino IDE and program your ATmega chip remotely (through IR) without any modifications to the IDE!

As the guys describe in their blog entry, they have managed to resolve transmission issues like echo and demodulation. IR cannot beat wireless reprogramming using ZigBee or any other wireless/wired interface, but it is a very interesting solution that brings many ideas for home projects!

Pattern Recognition for the Air Quality Egg – Part two

While the idea of pattern recognition for the Air Quality Egg is gaining attention and people are willing to help, I thought I would use existing feeds on Pachube and see if there are any patterns easily recognizable through standard techniques.

When searching for ‘Air Quality Egg’ feeds on Pachube, I found two of them that seem to consist of prototypes constantly reporting sensor values:


So initially, I used the Pachube API to retrieve past values of the AirQuality, CO, NO2, temperature and humidity datastreams. The values are taken from various days between 16/04 and 25/04 and from various time slots within each day (at 1-hour intervals). About 900 datastream entries were collected.
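Roughly, the retrieval looks like the sketch below (one request per datastream). The feed id and datastream name are placeholders, and the start/end/interval parameters follow the Pachube/Cosm v2 history API as I recall it, so double-check them against the API docs before relying on them.

import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;

public class HistoryFetcher {
    public static void main(String[] args) throws Exception {
        // Hourly CO history between 16/04 and 25/04 as CSV (interval is in seconds)
        URL url = new URL("https://api.cosm.com/v2/feeds/FEED_ID/datastreams/CO.csv"
                + "?start=2012-04-16T00:00:00Z&end=2012-04-25T00:00:00Z&interval=3600");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestProperty("X-ApiKey", "YOUR_COSM_API_KEY");

        BufferedReader in = new BufferedReader(new InputStreamReader(conn.getInputStream()));
        String line;
        while ((line = in.readLine()) != null) {
            System.out.println(line); // each line is "timestamp,value"
        }
        in.close();
        conn.disconnect();
    }
}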

Then I used the WEKA data mining tool to do some rough analysis using K-Means clustering, with 3 clusters as input (corresponding to potential air quality levels like good, medium and bad). Even without knowing what the Air Quality datastream actually represents (and what its values mean), clustering appears to have done a good job of identifying the clusters (with respect to Air Quality) and also visualizing the association with the other sensor data (humidity, temperature, NO2 and CO):

The first picture (click to view actual size) depicts the Air Quality (X-axis) vs CO (Y-axis) association. The cluster identification maps quite clearly to what could be interpreted as low, medium and very good air quality. A first assumption from this graph could be that the collected CO level range (0-25) does not seem to affect the air quality sensor readings much.

The second image visualizes the correlation between air quality and humidity. Again the 3 different AQ levels seem to be easily distinguishable, but in this case humidity significantly affects the AQ sensor readings; high humidity corresponds to low AQ readings.

The NO2 levels in this image also do not seem to affect the AQ much. Temperature also seems to have no impact:

So far, this initial analysis has demonstrated that:

a) The Air Quality on the two selected sites can be grouped into 3 distinct clusters.

b) Only humidity levels seem to affect the sensor readings for the Air Quality.

Probably the measured range of NO2 and CO levels is too small to play a significant role in AQ. Hopefully, when the Eggs arrive at their owners’ and users start generating more data, the analysis will show more interesting results.

In the meantime I plan to make this an online service where users can enter feed ids and the data will be automatically clustered and visualized. Also, based on the clusters, an initial training model can be built so that the service can assign new feed data to one of the cluster categories.
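For reference, the WEKA side of the K-Means analysis above, plus the assignment of data points to one of the three clusters, boils down to a few lines. This is only a sketch, assuming the retrieved values have been saved to an ARFF file with one numeric attribute per datastream; the file name and attribute layout are placeholders.

import weka.clusterers.SimpleKMeans;
import weka.core.Instances;
import weka.core.converters.ConverterUtils.DataSource;

public class EggClustering {
    public static void main(String[] args) throws Exception {
        // ARFF with AirQuality, CO, NO2, temperature and humidity attributes (placeholder file)
        Instances data = DataSource.read("airquality.arff");

        SimpleKMeans kmeans = new SimpleKMeans();
        kmeans.setNumClusters(3);   // good / medium / bad air quality
        kmeans.setSeed(10);
        kmeans.buildClusterer(data);

        System.out.println(kmeans); // prints centroids and cluster sizes

        // Once the model is built, new feed values can be assigned to one of the
        // three clusters (the basis for the online service idea above):
        for (int i = 0; i < data.numInstances(); i++) {
            System.out.println(i + " -> cluster " + kmeans.clusterInstance(data.instance(i)));
        }
    }
}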

Any volunteers to help with J2EE and the web front end?

Charalampos