Hardware challenges for the IoT

Last month I was invited by Aeris Communications to present an IoT-related topic at the IoTSiliconValley Meetup. The Meetup group focuses on talks about the Internet of Things and M2M. Since previous talks had already introduced the concept of IoT to attendees and discussed various IoT platforms, I decided to present the hardware challenges for the IoT from both the maker's and the manufacturer's perspectives.

In addition, the presentation included a brief introduction to the openPicus Flyport platform and a short demo.

The slides of the presentation can be found here:

The meeting took place at the famous Hacker Dojo!
More than 60 people attended, with different backgrounds, different levels of experience in IoT and prototyping, and different perspectives on the hardware challenges.
The WiFi vs RF debate section was a bit confusing (a title like WiFi vs Non-WiFi would probably have been more appropriate), but it was quite interesting to see how attendees reacted to questions about using WiFi as a current networking solution for IoT products, about the future of ZigBee, and about the overall IoT hardware challenges.
People also showed great interest in the Flyport and asked many questions about its hardware features, WiFi connectivity, programming environment and project integration.
Overall, it was a great experience, and hopefully more IoT presentations will follow (both near and far away from Silicon Valley…)
Many thanks to Aeris Communications for hosting the meetup and to everyone who attended!!

The WiFi and RF debate for IoT and Sensors

The IoT concept has been around for a while, and applications and platforms will soon start hitting the market (whether the market is mature enough for the IoT could be another debate…). It has never been easier to build a prototype, make a nice 3D-printed case, deploy a backend application on a Cloud platform (or use one of the numerous existing IoT Web applications) and use crowdfunding to get started.

So, I am quite sure there are a number of people out there investigating potential solutions for M2M communication. In addition, a number of vendors are offering solutions that try to address the major IoT communication challenges: cost, power consumption and range.

Lately there have been many debates over low-power WiFi and RF (including ZigBee, 6LoWPAN, etc.) networks in the context of IoT applications and sensor devices. In the absence of a clearly successful radio communication technology for IoT devices, people (developers, researchers, etc.) seem to have been divided into two groups: Group A argues that new WiFi implementations (featuring low energy consumption) can be used for IoT internetworking, while Group B opposes this and suggests that WiFi is too complicated for sensor and M2M communication.

Was WiFi ever meant to be used for M2M and IoT? Definitely not, so why bother with the comparison, and why do vendors care about providing enhanced WiFi modules that can be integrated into devices? There are some benefits for sure:

WiFi pros:

  • Internet gateways (i.e. WiFi access points) are already deployed, so IoT devices can have direct connectivity to the Internet without the need for additional infrastructure (setup cost, maintenance cost, etc.).
  • Almost every smartphone is WiFi enabled, so communication with WiFi devices is quite easy and direct without the need for additional hardware.
  • It is an established standard, supporting a full TCP/IP stack, meaning that when developing applications you need to focus only on the application level programming for message and information exchange.
  • Integrated security (WPA/WPA2 encryption and authentication)


WiFi cons:

  • Cost (even the Electric Imp, at $25, is much more expensive than a pair of RF transceivers)
  • Even low-power WiFi consumes more power than plain RF (and the communication protocol introduces much unnecessary overhead)
  • Low-power WiFi has poor indoor performance

RF on the other hand has also great benefits over WiFi in the context of M2M:

  • Higher transmission range and better indoor performance
  • Better price


Standards like ZigBee and 6LoWPAN include security implementations and mechanisms for error correction/retransmission, etc., so that developers can focus on the application level as well.

Who is the winner?

I think there can be two answers here: a) depends on the application/use case, b) the hybrid solution.

For instance, if an IoT system relies on a limited number of devices and direct mobile communication, then WiFi currently looks like the best solution.

For case a) there are several options. An Arduino with a WiFi shield, or a Flyport WiFi module, can be used for quick prototyping (and can also be integrated into systems, especially the Flyport). The soon-to-be-available Electric Imp is another example of a WiFi-integrated device with low power consumption that can interact with the physical world and enable IoT applications.

For the hybrid solution, the idea is that devices are interconnected through RF and a gateway is then used to provide connectivity with the rest of the infrastructure and/or the Internet.

What’s missing here? Standards for gateway-to-device communication, address translation (like NAT) for physical devices so that they can be accessed remotely, etc.
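To illustrate the address-translation idea, a gateway could keep a NAT-like table mapping public device names to local RF addresses, so that remote clients address devices by name while the gateway handles the RF side. This is just a minimal sketch; the class, names and addresses below are all hypothetical:

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical sketch of device address translation on a hybrid gateway:
// remote clients use public names, the gateway resolves them to local
// RF addresses (analogous to a NAT table).
public class DeviceNat {

    private final Map<String, Integer> table = new HashMap<String, Integer>();

    // Register a device under a public name (like adding a NAT entry).
    public void register(String publicName, int rfAddress) {
        table.put(publicName, rfAddress);
    }

    // Resolve a public name to the RF address the gateway should use,
    // or -1 if the device is unknown.
    public int resolve(String publicName) {
        Integer addr = table.get(publicName);
        return (addr == null) ? -1 : addr;
    }
}
```

A real standard would of course also have to specify discovery, registration and security, which is exactly the gap mentioned above.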

A very interesting RF-based approach is the EnOcean Alliance and the upcoming Flyport EnOcean gateway.


Which RF technology is going to win?

Well, ZigBee has failed, for a number of reasons, to penetrate the home market (even Bluetooth 4 seems to compete better) and I am not quite sure it will be given a second chance in the IoT era. 6LoWPAN looks more promising, incorporates IPv6 support (though I am not sure why every device should have an IP address) and might have a better chance.

I would like your opinion on the entire debate, so please go ahead and answer the following polls:

WiFi vs RF for IoT: I would currently develop a product using:


WiFi vs RF for IoT: Which one will eventually win?


About ZigBee:


Log your Android resources on Cosm!

Inspired by CosmX (announced on the Cosm Blog as a recent hack-day project), which updates a Cosm feed with your system's current CPU usage, I have developed a similar Android app.

The app logs CPU usage, available memory, data usage (transmitted and received Kb) and battery level:

When you start the app, it asks for a Cosm Feed ID and a key (with update and create permissions). Users can also configure which resources will be logged and how often the feed will be updated (1 minute, 5 minutes, 15 minutes, 30 minutes, 1 hour, 2 hours).

The CPU usage and available memory are polled every 10 seconds, and an overall average is calculated and used for updating the feed.
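The averaging between feed updates can be sketched roughly as follows; the class and method names are illustrative and not taken from the actual app:

```java
// Sketch of the averaging done between feed updates: a sample is
// accumulated on every poll, and a single mean value is pushed to the
// feed when the update interval elapses.
public class ResourceAverager {

    private double sum = 0;
    private int count = 0;

    // Called on every polling interval (e.g. every 10 seconds).
    public void addSample(double value) {
        sum += value;
        count++;
    }

    // Called when the feed is updated: returns the mean of the samples
    // collected so far and resets the accumulator for the next window.
    public double flushAverage() {
        double avg = (count == 0) ? 0 : sum / count;
        sum = 0;
        count = 0;
        return avg;
    }
}
```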

I am thinking of adding more features, like call duration and number logs, especially if users like the app and find it useful. A background service is used to monitor and log the resources; users can start/stop the service through the application's main interface.

You can download the apk and Android source files from: http://code.google.com/p/cosm-android-logger

You can also get the application through Google Play:



EDIT (29/6/12): Entering the Cosm key manually really sucks, so I have updated the app to use a QR code reader for that purpose. Simply go to http://goqr.me/ (or any alternative) and generate a QR code with your key. Then use the ‘Read QR code’ button ;-)

The GeoSensor – A Low cost weather station

The GeoSensor is a new IoT-based project by Hugo Lavalle, an Arduino and IoT fan.

The main scope of the project is to provide makers with a low-cost weather station that will allow users to have direct access to meteorological data.

The idea behind this effort comes from the fact that Hugo is a sailor and has always wanted access to meteorological data from the place where he usually sails. A commercial weather station was too expensive, so he decided to build one himself.

These are the main components proposed for the basic version of the project:

  • Wind and rain sensors
  • Arduino microcontroller
  • Ethernet shield
  • Temperature sensor
  • Cables, electric material, arduino case, etc.

Hugo has just launched a crowd-funding campaign for his project, and it is going pretty well. If the project gets funded, he will donate the first station to the marina he uses for sailing, located at the Parque da Cidade in Jundiaí/SP.

Source code and h/w schematics will be released as Open Source at the end of the project.

You can help Hugo with his project here.

In case the fundraising goes well, Hugo plans to add more features like:

  • Including more sensors, like air and soil humidity.
  • Enabling alarms for heavy rains, which may cause flooding and mudslides, frequent problems in Brazil during the summer rains.
  • Investigating the possibility of measuring river levels.
  • Building a wireless station, with WiFi or a ZigBee/Ethernet gateway.
  • GPS integration.
  • Building a mobile station with a GPRS connection.
  • Using solar power.

Hugo reports that the book has helped him a lot with his project! Thank you so much Hugo and good luck with your project! :-)

UPDATE (26/6/2012): Hugo reports that after the first prototype is ready, he will propose the creation of a national GeoSensors network. In that network, each participant would be a maker and/or manager of a GeoSensor. The resources could also be obtained from crowd funding. More details can be found on the website where Hugo shares the progress of the prototype.



Feedback from OpenIoT – Notes from “The Body” discussion group

The OpenIoT weekend is over and I have to admit I had never thought it would be such an interesting and inspiring event! Lots of great people, a great place (the Google Campus) and great food as well!

People who attended had different backgrounds and thoughts about open data and user rights, but after two long days full of discussions (and debates, in some cases) we all managed to form a basic document that describes the fundamental user rights for enabling the Open Internet of Things.

The document mainly suggests issues about:

  • Licensing (Data ownership, licensors, creators, etc.)
  • Accessibility (Royalties, parsers, etc.)
  • Timeliness (data resolution, etc.)
  • Privacy (what data is being collected, why, etc.)
  • Transparency

You can find the document here (and yes, please do put your name on it if you support what you read)!

On the first day I led “The Body” discussion group (about the Quantified Self, wellness, user location tracking, etc.). More than 10 people attended and contributed great ideas and concerns about how such data should be treated and what the user rights are.

As promised, I will soon compile the notes and put them online (probably using the existing Google Group) so that people can review them, comment, and move the discussion online, and so that we can come up with something more concrete about QS and Open Data in the IoT!

For now, I am posting here the notes that contain the main highlights of the discussion (special thanks to Christine Perey for writing them down and also for being very inspiring!)


Check some photos from the event here.

If you missed the event, or missed following #openiot on Twitter, you can see the whole (Tweet) event stream here!

Many thanks to the people who worked hard to make this event such a huge success: Alex Sonsino, Diana Proca, the volunteers, Usman and Ed from Cosm, Trevor Harwood (for his great effort on the initial draft and the Google group), the speakers, and anyone else I might have forgotten. Many thanks to everyone who attended and contributed to the Document!

It’s been really nice to meet you guys!

The Internet of Things can be Open; it is our right to demand it, and it is our right to shape what it can be!

Open the Internet of Things! Spread the news to the world! ;-)

You can also check the weekend’s write-up from Andrew Back here

Talk/Listen to your Cosm feeds!

Wouldn’t it be cool if you could use your voice on your mobile device for retrieving information about your Cosm feeds (and activate a trigger maybe)? This way you can learn about your room’s temperature (or whatever else you might be monitoring using your favorite microcontroller platform) while driving, jogging, etc.!

When this idea came to my mind, I first thought of Siri. So I turned on the WiFi on my iPhone and tried to find out what Siri knows about IoT and Cosm. The first question was very general, to see if Siri had any idea what I was talking about when asking for my room's temperature. The response was completely disappointing!

One could say that I should not expect Siri to be able to answer such a question (well, at least it could search Google for the temperature :P), so I tried again with a more.. let's say.. fundamental question!

(Side note: I expect such a voice recognition system to have difficulties understanding my Greek-English accent, but despite Cosm being much easier to pronounce than Pachube, Siri thought on every attempt that I was asking about ‘cousin’ and not ‘cosm’. I had to manually edit the query and correct it.)

Siri thought that I was just kidding around; then I realized that maybe Siri is not familiar yet with the new name of Cosm:

That was only just a bit closer! Cosm guys, if you read this, you need to talk to Apple :P

Since there is no official voice application for Cosm, I gave up on Siri and considered building one myself. As I didn't want to jailbreak my iPhone and I am not very familiar with Objective-C, I used Android for that purpose.

I have developed a small app that requires only the tap of a button and then asks you for the feed ID you are interested in. It looks for available datastreams, asks which one you are interested in and reports the current datastream value. Finally, it asks whether you would like to activate a trigger or not (not a Cosm trigger, just a REST call to a remote server). Voice recognition works pretty well; the voice SDK needs improvement, but it is quite acceptable for a proof of concept!

See a small demo video here:

The trigger is just a REST call on a website, not an actual Cosm trigger!
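For anyone curious how the datastream lookup could work, here is a rough sketch of extracting a datastream's current value from a Cosm v2 feed response. The sample JSON payload and the regex-based parsing are simplified assumptions for illustration; the real app fetches the feed over HTTP (api.cosm.com/v2/feeds/<id>.json with an X-ApiKey header) and would normally use a proper JSON parser:

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Sketch: pull the "current_value" of a given datastream out of a
// Cosm v2 feed JSON document. Assumes "id" precedes "current_value"
// within each datastream object, as in the sample payload used here.
public class FeedParser {

    public static String currentValue(String json, String datastreamId) {
        Pattern p = Pattern.compile(
            "\"id\"\\s*:\\s*\"" + Pattern.quote(datastreamId) +
            "\"[^}]*\"current_value\"\\s*:\\s*\"([^\"]+)\"");
        Matcher m = p.matcher(json);
        return m.find() ? m.group(1) : null; // null if the datastream is absent
    }
}
```

The app can then feed the returned value into the text-to-speech engine.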

The complete Android source can be found here. Warning: the code is really a mess; I just made it as a proof of concept. It works quite well, but it could definitely be optimized and implemented using threads! Feel free to use it, modify it, improve it!

Leading ‘The Body’ discussion group @Open Internet of Things Assembly

The committee of the OpenIoT has honored me by assigning me the lead of ‘The Body’ discussion group on the first day (June 16th).

There will be a small presentation about the Quantified Self (QS), how and why people can track information about themselves and the context around them for their own benefit, and how this is related to the Open Internet of Things and Data Control/Privacy/Licensing.

The main scope of the session is to discuss QS Data Control/Privacy/Licensing and generate up to 5 key points related to QS that should be added to the Open IoT framework. More information here.

Get your Open IoT Assembly ticket:

Eventbrite - Open Internet Of Things Assembly

The full schedule:

Interesting IoT findings and some news

I thought it would be nice to share some interesting IoT findings of the last few days:

A ship transmits its location, speed and destination to Cosm!

https://cosm.com/feeds/3818 The captain must be a real IoT fan!


I have found interesting patterns in my heart pulse rate while looking at the data history on Cosm.

As explained in a previous post (My Heart Rate on the (Cosm) Cloud!), I am using an ear-clip heart pulse monitor and my Android phone to transmit my average pulse rate (bpm) to Cosm (https://cosm.com/feeds/59362). Looking back at the data, I notice interesting patterns in my pulse: I can easily tell when I am sitting and working (most of the time), when I am mobile, and also when I am wearing the sensor after lunch! The most interesting part, however, is how easy it is nowadays to save and visualize personal data online. I remember back in 2006, when I first started doing research on on-body sensors, we had to build two applications: one for collecting the data from the sensors and one for storing and visualizing them on a computer or a mobile phone (the Windows CE and J2ME old days!).


New hardware platforms for the IoT are being announced and are definitely worth watching:

The first one is a Kickstarter project asking for your support. Tōd connects real world actions to mobile devices and to the Web. It consists of ultra small and low power Bluetooth 4.0 enabled Smart Beacons that are programmable and come with great features.


The second one is the electric imp, a new startup that has managed to build a tiny WiFi card (the size of an SD card) at very low cost (around $25) with low power consumption. It also comes with great dev kits that enable direct Arduino communication.

The Open IoT Assembly is gaining publicity and attendees. It will be a great event featuring interesting talks, and it will also give participants the opportunity to contribute to the Open Internet of Things Document and Vision. I am really excited to attend the event.

They have recently released the event schedule, so check it out and book an early ticket for ‘being part of the Open Internet of Things’ before June 1st!


Cosm on mobile devices
Some of you might already know that I have built an Android application for viewing feeds on Pachube (I have not updated it to say Cosm yet): https://play.google.com/store/apps/details?id=pachube.andorid
Now I am developing something more special, named ‘VocalCosm’…details will follow soon!

My Heart Rate on the (Cosm) Cloud!

Recently Seeedstudio (many thanks!) has provided me with a Grove Heart Rate ear-clip sensor:

This cool (and very low-priced) sensor attaches to your ear and can detect your heart's pulse by transmitting infrared light and checking the absorption variation caused by the blood flow in your ear lobe. The product's site also provides the Arduino code for detecting the beats and calculating an average heart rate (in bpm, beats per minute). The sensor comes with a Grove connector, so setting it up and running the code took less than 5 minutes! (Thanks again @seeedstudio for providing me with a complete Grove kit.)

After playing with it for a while, I realized that I could make a cool Cloud-based heart rate tracker by simply using an ADK board and my Android phone. This way I could be completely mobile (as long as the 9V battery that powers the ADK board lasts!).

I modified the Arduino code to send the heart rate to the Android phone using ADB, and also made a simple Android app that takes the heart rate and sends it to Cosm (formerly Pachube) using the jpachube library.

This is the graph generated from Cosm: (EDIT: The battery has died; I will put it on again sometime tomorrow.) EDIT (14/05): I've put it on again for a few hours!

EDIT: I will occasionally put on the sensor during the following days.

Despite being very mobile (the cable is long enough to reach my pocket, where both boards and the mobile phone are), I am sure the graph-feed will stop being live quite soon (I will either get bored, the battery will die, or I will take it off to go to sleep…)

The code for the Arduino is the following:

/************************* 2011 Seeedstudio **************************
 * File Name   : Heart rate sensor.pde
 * Author      : Seeedteam
 * Version     : V1.0
 * Date        : 30/12/2011
 * Description : This program can be used to measure heart rate.
 *               The lowest pulse the program can detect is 30 bpm.
 *********************************************************************/

//Modified by @BuildingIoT
//for communication with Android

#include <SPI.h>
#include <Adb.h>

// Adb connection.
Connection * connection;

// Elapsed time for ADC sampling
long lastTime;

unsigned char pin = 13;
unsigned char counter = 0;
unsigned int heart_rate = 0;
unsigned long temp[21];
unsigned long sub = 0;
volatile unsigned char state = LOW;
bool data_effect = true;
const int max_heartpluse_duty = 2000; // Max allowed interval between two pulses (ms).
                                      // The measurement restarts if it is exceeded.

void setup()
{
  pinMode(pin, OUTPUT);
  //Serial.println("Please put on the ear clip.");
  //Serial.println("Heart rate test begin.");
  array_init();
  attachInterrupt(0, interrupt, RISING); //set interrupt 0, digital port 2

  // Initialise the ADB subsystem.
  ADB::init();

  // Open an ADB stream to the phone's shell. Auto-reconnect.
  connection = ADB::addConnection("tcp:4567", true, adbEventHandler);
}

void loop()
{
  digitalWrite(pin, state); // Blink the LED on every detected pulse
  ADB::poll();              // Service the ADB connection
}

void sum() //calculate the heart rate and send it to the phone
{
  if (data_effect) {
    heart_rate = 1200000 / (temp[20] - temp[0]); // bpm over the last 20 pulses
    connection->write(2, (uint8_t*)&heart_rate);
  }
  data_effect = 1; //sign bit
}

void interrupt() //triggered on every rising edge from the sensor
{
  temp[counter] = millis();
  state = !state;
  if (counter == 0) {
    sub = temp[counter] - temp[20];
  } else {
    sub = temp[counter] - temp[counter - 1];
  }
  if (sub > max_heartpluse_duty) { //set 2 seconds as max heart pulse duty
    data_effect = 0; //sign bit
    counter = 0;
    Serial.println("Heart rate measure error, test will restart!");
    array_init();
  }
  if (counter == 20 && data_effect) {
    counter = 0;
    sum();
  } else if (counter != 20 && data_effect) {
    counter++;
  } else {
    counter = 0;
    data_effect = 1;
  }
}

void array_init()
{
  for (unsigned char i = 0; i != 20; ++i) {
    temp[i] = 0;
  }
  temp[20] = millis();
}

// Event handler for the shell connection.
void adbEventHandler(Connection * connection, adb_eventType event, uint16_t length, uint8_t * data)
{
  // No data is expected back from the phone in this sketch.
}

For the Android app, all that is needed is an Activity that implements the ADB server and communicates with the Arduino board:

package buildingiot.heartrate;

import java.io.IOException;

import android.app.Activity;
import android.os.Bundle;
import android.util.Log;
import android.widget.TextView;

import org.microbridge.server.Server;
import org.microbridge.server.AbstractServerListener;

public class HeartRateOnCloudActivity extends Activity {

	// TCP server (based on the MicroBridge LightWeight Server).
	// Note: this server runs in a separate thread.
	Server server = null;

	int heartrate = 0;

	TextView textView1;

	/** Called when the activity is first created. */
	@Override
	public void onCreate(Bundle savedInstanceState) {
		super.onCreate(savedInstanceState);
		setContentView(R.layout.main); // layout and view ids as defined in the project resources
		textView1 = (TextView) findViewById(R.id.textView1);

		//Create TCP server (based on MicroBridge LightWeight Server)
		try {
			server = new Server(4568); //Use the same port number used in the ADK Main Board firmware
			textView1.setText("Starting server..");
			server.start();
			textView1.setText("server started!");
		} catch (IOException e) {
			Log.e("Seeeduino ADK", "Unable to start TCP server", e);
			textView1.setText("server not started!!");
		}

		server.addListener(new AbstractServerListener() {

			@Override
			public void onReceive(org.microbridge.server.Client client, final byte[] data) {
				// onReceive is called from the server thread, so any UI
				// update must be posted back to the UI thread.
				runOnUiThread(new Runnable() {
					public void run() {
						String bpm = new String(data);
						textView1.setText(bpm + " bpm");
					}
				});
			}
		});
	}
}

To make it all work you need to have an ADB-enabled Arduino board like this one.

More examples on Android and Arduino communication can be found in my book.

Controlling your coffee machine and your room’s light through the Web

(a.k.a. the SmoothWakeAlarm Project).

This is a smooth-wake-alarm system based on the Flyport Ethernet and Lighting Nest. The system can automatically dim your room light before you wake up and also activate the coffee machine on time. An Android app allows you to set the internal alarm clock and an alarm for activating the coffee machine.

The app communicates with the Flyport web server through a REST API. The Flyport includes all the essential libraries for time management (internal alarm, setting time automatically through SNTP). The Lighting Nest has 3 relay switches on board that can be used to activate any external circuit.

Equipment used:

  • 1 Flyport Ethernet module
  • 1 Flyport Lighting Nest
  • 1 Led
  • 1 Resistor (optionally for the LED)
  • Power source


The project consists of two parts: a) the Android application, used for setting the wake alarm, the time for dimming the light (LED), and the coffee machine activation (using the Lighting Nest's relay switch); b) the hardware for activating the coffee machine and dimming the LED.

The Flyport Ethernet provides all the essential functionality for receiving the wakeup and coffee machine activation time (through the embedded Web Server) and includes libraries for time control (like getting actual time through SNTP and setting an internal alarm).

The Lighting Nest provides relay switches (up to 5A) that can be used to directly activate the coffee machine.

To emulate the light dimming, PWM is used on a LED. Lighting Nest provides a PWM output on the EXPANSION connector.

One of the cool things is that time handling is performed completely on the Flyport module, since the accompanying libraries make it a piece of cake!
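The "start dimming N minutes before the alarm" arithmetic, including the hour borrow when the alarm minutes are smaller than the offset, can be sketched as follows. The class is purely illustrative (the firmware does this in C inside taskFlyport.c):

```java
// Sketch of the dim-start computation: dimming begins `offset` minutes
// before the alarm time, borrowing an hour when the alarm minutes are
// smaller than the offset.
public class DimAlarm {

    // Returns {hour, minute} at which dimming should begin.
    public static int[] dimStart(int alarmHour, int alarmMinute, int offset) {
        if (alarmMinute < offset) {
            // Borrow an hour: e.g. 07:03 with a 5-minute offset -> 06:58
            int hour = (alarmHour - 1 + 24) % 24; // wrap around midnight
            return new int[] { hour, 60 - offset + alarmMinute };
        }
        return new int[] { alarmHour, alarmMinute - offset };
    }
}
```

With a 5-minute offset, an alarm at 07:30 yields a dim start of 07:25, while 07:03 yields 06:58.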

The code:

The Flyport comes with a very handy and simple IDE for editing, compiling and uploading the code to the board. Flyport projects consist of several source files (many of them auto-generated when creating the project through the IDE wizard). Two of them are the basic ones that need editing when creating a custom project: HTTPApp.c and taskFlyport.c. The first includes all the essential libraries and routines for implementing the HTTP communication part of the Flyport web server (this also means handling the GET requests from clients), and the second handles the logic flow of the program (variable initialization, main execution loop, etc.).

A sample of the HTTPApp.c code that handles the GET requests for setting the alarm and coffee machine activation times follows:

BYTE *hrs;
BYTE *mins;
BYTE filename[20];

// STEP #1:
// The function MPFSGetFilename retrieves the name of the requested cgi,
// in this case "alarm.cgi" or "coffee.cgi", and puts it inside the
// filename variable. Make sure BYTE filename[] above is large enough
// for your longest name.
MPFSGetFilename(curHTTP.file, filename, 20);

// STEP #2:
// Handling of the cgi requests

if(!memcmp(filename, "alarm.cgi", 9)) // Is the requested file name "alarm.cgi"?
{
    // STEP #3:
    // The complete request is contained inside the system variable curHTTP.data.
    // Using the function HTTPGetArg it is possible to read the arguments
    // of the cgi request from curHTTP.data. In this case we are reading the
    // arguments "hours" and "minutes" from the request
    // "alarm.cgi?hours=x&minutes=y" and assigning them to the
    // respective variables.
    hrs = HTTPGetArg(curHTTP.data, (BYTE *)"hours");
    mins = HTTPGetArg(curHTTP.data, (BYTE *)"minutes");
    hours = atoi((char*)hrs);
    minutes = atoi((char*)mins);
    alarmSet = TRUE;
    UARTWrite(1,"Got Alarm request!");
}

if(!memcmp(filename, "coffee.cgi", 10)) // Is the requested file name "coffee.cgi"?
{
    hrs = HTTPGetArg(curHTTP.data, (BYTE *)"hours");
    mins = HTTPGetArg(curHTTP.data, (BYTE *)"minutes");
    coffeeHours = atoi((char*)hrs);
    coffeeMinutes = atoi((char*)mins);
    coffeeAlarm = TRUE;
    UARTWrite(1,"Got Coffee Alarm request!");
}
return HTTP_IO_DONE;

The taskFlyport.c file, which uses both the internal clock (RTCC) and SNTP for getting real time from the Internet, is the following:

#include "taskFlyport.h"
#include "time.h"
#include "rtcc.h"

int GMT_hour_adding = 3;
int LightDimmingOffset = 5; //How many minutes before the alarm to start dimming the light

time_t now;
struct tm *ts;
DWORD epoch = 0;
DWORD epochtime = 0xA2C2A;
t_RTCC mytime;

char dateUTC[100];
char dateUTC1[50];
extern BOOL alarmflag;
extern int hours, minutes;
extern int coffeeHours, coffeeMinutes;
extern BOOL alarmSet;
extern BOOL coffeeAlarm;

//Dimming variables
BOOL startDimming;
int dimmStep;
int dimmPercentage;

void FlyportTask()
{
    // Flyport waiting for the cable connection
    while (!MACLinked);
    UARTWrite(1,"Flyport ethernet connected to the cable... hello world!\r\n");

    // Wait until SNTP returns a valid epoch time
    while (epoch < epochtime) {
        epoch = SNTPGetTime();
    }
    UARTWrite(1, "done!\r\n");
    coffeeHours = 0;

    //GET SNTP time and set it:
    now = (time_t) epoch;
    ts = localtime(&now);

    ts->tm_hour = (ts->tm_hour + GMT_hour_adding);
    // Correct if overflowed hour 0-24 format
    if (ts->tm_hour > 24) {
        ts->tm_hour = ts->tm_hour - 24;
    } else if (ts->tm_hour < 0) {
        ts->tm_hour = ts->tm_hour + 24;
    }

    //Set the RTC according to SNTP time (the date is hard-coded here)
    mytime.year = 12;
    mytime.month = 4;
    mytime.day = 21;
    mytime.hour = ts->tm_hour;
    mytime.min = ts->tm_min;
    mytime.sec = ts->tm_sec;
    RTCCWrite(&mytime);

    //Initialize dimming variables
    dimmStep = 100 / LightDimmingOffset;
    dimmPercentage = dimmStep;

    // Main user's firmware loop
    while (1) {
        //Read the time and check for alarms set
        epoch = SNTPGetTime();
        now = (time_t) epoch;
        ts = localtime(&now);

        ts->tm_hour = (ts->tm_hour + GMT_hour_adding);
        // Correct if overflowed hour 0-24 format
        if (ts->tm_hour > 24) {
            ts->tm_hour = ts->tm_hour - 24;
        } else if (ts->tm_hour < 0) {
            ts->tm_hour = ts->tm_hour + 24;
        }

        if (coffeeAlarm == TRUE) {
            //set the coffee machine alarm:
            t_RTCC myalarm;
            myalarm.hour = coffeeHours;
            myalarm.min = coffeeMinutes;
            myalarm.sec = 00;
            RTCCSetAlarm(&myalarm, 1, EVERY_DAY);

            coffeeAlarm = FALSE;

            UARTWrite(1,"Coffee Alarm set!\r\n");
        }

        strftime(dateUTC1, sizeof(dateUTC1), "%H:%M.%S", ts);

        //Check the dimming alarm
        if (alarmSet == TRUE) {
            //check if it is time to start dimming the light:
            if (minutes < LightDimmingOffset) {
                if ((hours - 1) == ts->tm_hour && (60 - LightDimmingOffset + minutes) == ts->tm_min) {
                    //start dimming!
                    startDimming = TRUE;
                    alarmSet = FALSE;
                    //turn on the lamp
                    PWMOn(11, 1);
                }
            } else {
                if (hours == ts->tm_hour && (minutes - LightDimmingOffset) == ts->tm_min) {
                    //start dimming!
                    startDimming = TRUE;
                    alarmSet = FALSE;
                    //to be used with PWM on Flyport
                    //turn on the lamp
                    PWMOn(11, 1);
                }
            }
            UARTWrite(1,"Alarm Set\r\n");
        }

        //Check if the coffee alarm has turned on
        if (alarmflag == FALSE) {
            UARTWrite(1,"Alarm not triggered\r\n");
        } else {
            UARTWrite(1,"Alarm triggered!!!\r\n");
            alarmflag = FALSE;
            //turn on the coffee machine:
            IOPut(P21, toggle);
        }

        //to be used with PWM on Flyport
        //check if dimming has started:
        if (startDimming == TRUE && dimmPercentage < 100) {
            PWMDuty(dimmPercentage, 1);
            dimmPercentage += dimmStep; //advance one dimming step per iteration
        }

        //Delay the loop for 10 secs
        vTaskDelay(1000); //FreeRTOS ticks of 10 ms on the Flyport
    }
}

The Android application simply has a UI for setting the internal alarm clock and the coffee machine activation time. Both times are sent through a GET request to the Flyport Web server, which handles the rest.
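As an illustration, the GET requests the app needs to issue against the cgi handlers shown above could be built like this. The Flyport's IP address is an assumed example, while the cgi names and parameters match the firmware:

```java
// Sketch of building the request URLs for the Flyport's alarm.cgi and
// coffee.cgi handlers. The host address below is an assumption; in the
// app the URL would then be opened with e.g. HttpURLConnection.
public class AlarmClient {

    // Builds e.g. "http://192.168.1.115/alarm.cgi?hours=7&minutes=30"
    public static String buildUrl(String host, String cgi,
                                  int hours, int minutes) {
        return "http://" + host + "/" + cgi +
               "?hours=" + hours + "&minutes=" + minutes;
    }
}
```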

Check the video with the demo:

Get the complete Flyport and Android code here.