Design is fine: Workshop at UDK Berlin.

Last week our Gadgeteer tour led us to Berlin, where Prof. Gesche Joost heads the Design Research Lab at UDK Berlin. Six designers and creatives came together for the 2-day workshop next to the waterside of the Spree. Interestingly, for the first time we had participants with a background mainly in design and the arts. Nevertheless, they mastered the technical hurdles quickly and produced two impressive projects:

The Bus Ticker:

Bus ticker: Shows the next buses.

An analog display shows the arrival times of the next two buses at the nearby bus stop. To this end, a server script was written that scraped the website of the Berlin transport association (BVV) and returned the buses’ arrival times in a Gadgeteer-friendly format. A Wi-Fi module connects to the server and sends the data to an OLED display. Two servos control the analog display, telling users when it’s time to pack up, put on their shoes and run for the bus. Or when it’s simply too late…
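
The article doesn’t show the script itself. As a rough sketch of the idea (the function name and the response format are our assumptions, not the workshop’s actual code), the server side could condense the scraped departures into a compact string that the Gadgeteer side can parse with simple string splits:

```python
# Sketch of a "Gadgeteer-friendly" response format (our assumption):
# "LINE:MINUTES;LINE:MINUTES" for the next two departures.
def format_departures(departures, limit=2):
    """departures: list of (bus_line, minutes_until_arrival) tuples,
    assumed to be pre-sorted by arrival time after scraping."""
    next_buses = departures[:limit]
    return ";".join(f"{line}:{minutes}" for line, minutes in next_buses)

# The two servos could then map the two minute values to needle positions.
print(format_departures([("M41", 3), ("194", 12), ("M41", 25)]))  # M41:3;194:12
```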

The Problem Generator:

Problem Generator: Collecting votes from local communities.

This device is meant to be put up in neighborhoods to collect opinions on communal topics, such as “Do we need another slide for our local playground?” or “Should we invest in a local power station for electric cars?”. Community members vote by swiping their cards (Yes/No) in front of the device. Multiple questions can be answered in sequence. The results are stored on an SD card.

A vote is given by swiping a Yes/No card.

The multitouch display shows the question while an RFID reader notices the proximity of Yes/No tags. Once swiped, visual confirmation is given by a colored OLED display and a summary statistic is displayed at the end of the question sequence.

Pong, Carts and Smart Plants: Workshop at DFKI Saarbrücken

This week we stopped by the DFKI at Saarland University; specifically, we visited Antonio Krüger’s group, the Innovative Retail Lab. With 14 participants this was our biggest group so far and, after Darmstadt, our second 2-day workshop. Hence, people had more time to work on their own projects with Gadgeteer. This is what they came up with:


Pong:

Using the IR distance sensor and a joystick, this ancient game was given a physical shape, including multiplayer capabilities, with a slight twist: player one controlled the ball using the joystick, while player two had to defend with the paddle, which was controlled by the distance sensor. The graphical UI ran on the T35 touch display:

Pong – using IR distance sensor and a joystick

The Plant Watch:

This gadget found a use for our OLED displays. The three techie magicians working on this project combined a temperature and a moisture sensor to be placed in a plant’s pot. The data were live-streamed to a web server, from which the smart watch polled the current readings.

Smart Watch – temperature, moisture and location

Smart watch – monitoring plants’ health

Additionally, they integrated a GPS sender that could be placed in a dog’s collar to show the pet’s location on the arm-worn device.

Smart watch – dog monitoring

If we had a device like that in production, we’re convinced that even our techies could grow a green thumb.

LED Space Invaders:

Another old classic: Space Invaders. This group got our 8×8 LED matrices working by bringing an I2C connection to life. While hostile projectiles rained down from the sky, the player had to move their ship to horizontal safety using two buttons for left and right, plus a hidden split-ship command.

Space Invaders – 4 LED 8×8 matrices

Space Invaders – Game Over

Here’s a live demo video:

A special feature even prevented the player from dying. Absolutely gorgeous!

Barcode Reader for Shopping Carts:

Inspired by their current research efforts around the RFID-enhanced shopping mall, this group designed a gadget to be mounted on a shopping cart. When RFID-tagged items are placed into the cart, passing the RFID reader on the way, a product description is shown on the display and the ‘virtual’ shopping cart is updated.

Shopping Cart Gadget

Shopping Cart Gadget: Scanning articles

On the sidelines, our Robo got an XBee extension, so he can now receive commands wirelessly and be remotely controlled.

Sound, tele-presence and robo tinkering: Workshop at ETH Zurich

Third stop: ETH Zurich. We started off early in the morning and welcomed 11 participants to our first workshop ‘abroad’. When we broke out into groups, two 2-person teams decided to collaborate by splitting their project into two larger modules: one input and one output device. The result was the

Remote controlled camera for tele-presence:

Telepresence – Remote Control

The idea was to hold a device in hand with a mounted screen showing the remote conversation partner. When the device was turned, the compass sent its measurements via the Wi-Fi module to a web server, which in turn controlled the pointing direction of the remote camera. That camera was positioned by servos and received its input from the web server via another Wi-Fi module. A great idea, nice packaging and our first project including network functionality.

Telepresence – Remotely controlled camera

Sine Wave Generator:

In the first hands-on part, this group got off to a quick start by applying visual processing techniques to the camera’s video stream. By identifying the brightest point in the frame, they managed to highlight the bright spot on the multitouch screen by drawing an ellipse around it.

Visual processing with Gadgeteer

But their real passion was audio. So they used the input from a potentiometer and a joystick to manipulate a sine wave and output it through the audio module. To visually display the frequency spectrum they were planning to use the 8×8 LED matrix, but due to time constraints, some soldering confusion and a mysterious exception emitted by the Gadgeteer library, the LEDs unfortunately never saw the light of day.

Audio generated by potentiometer input

Biometric Snake:

Snake – one of the old classics in video game history – must have inspired generations of programmers. And so it infected this team as well. But unlike the conservative version that used to be controlled by buttons, their gadgeteery solution used readings from the accelerometer to steer the snake’s direction. Add neat graphical output on the multitouch screen, and on top of that a connected pulse sensor that directly mapped the player’s heartbeat to the snake’s speed. So the more excitement, the faster the snake, the more excitement, the faster…

Biometric Snake – the player’s pulse affects the snake’s speed

The Robo Connection:

Robo – FEZ mini microcontroller

Finally someone took on the challenge of extending our lab pet Robo. This fellow runs on a FEZ Mini microcontroller and implements simple move commands. The team around Robo focused on building a serial connection between his microcontroller and .NET Gadgeteer in order to send commands and data back and forth. The idea was to have Robo spin around the room and look for RFID-tagged appliances. Hence, he was equipped with an RFID reader as well as an IR sensor to avoid collisions. A remote control was hooked up via Gadgeteer to manually steer Robo around the room, and a battery pack was strapped to his back so he could explore his environment wirelessly.

Robo extended: Gadgeteer connection, IR, RFID sensor and remote control

Once again, participants came up with amazing ideas that made the trip to Zurich worthwhile… despite a rather costly 40 EUR meal at the student cafeteria.

Game over.

Etch, Sketch and Angry Oranges: Workshop at University of Ulm

Our second stop on the Gadgeteer tour took place at the University of Ulm, home to the research groups of Michael Weber and Enrico Rukzio. 12 participants came together to hack away over the course of one day. Until the lunch break we introduced the .NET Gadgeteer platform, its beauties, quirks and possibilities, and let people try out different code samples and modules. The second part of the workshop focused on participants’ ideas for application fields of hardware prototyping. 4 groups formed and started translating their ideas into concrete hardware gadgets.


Two groups tackled our suggested challenge of building an RFID-based coffee dispenser: across several research institutions (ours included) we found tally sheets being used to keep track of coffee consumption. We therefore suggested bringing the coffee machine into the next century by augmenting cups with RFID chips and equipping the coffee machine itself with a tag reader and a display, resulting in

The Coffee Cashier:

Coffee Cashier

Including consumption statistics

In this scenario, each person owns a unique cup. When it is placed on the reader platform, the system increases the person’s coffee count and shows a statistic of recent coffee consumption. In case of an unknown tag, the system offers an on-device registration service. Pretty cool, and pretty scary once coffee statistics are being visualized…

Etch a Sketch:

Another group came up with a modern version of the Etch a Sketch drawing toy, originally invented by André Cassagnes in 1960: two potentiometers control the x and y direction of a constantly progressing line. Different drawing colors can be selected at the press of a button. An accelerometer detects when the device is shaken and erases the sketch accordingly. Nicely wrapped in a cardboard package, this toy is ready to be shipped.
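
The group’s actual C# code isn’t shown, but the core logic is simple enough to sketch in Python (all names here are ours, for illustration only): two potentiometer readings steer a cursor across a canvas, and a large acceleration magnitude clears it.

```python
# Illustrative logic sketch (not the workshop's code): potentiometer
# readings in [0.0, 1.0] steer a cursor; a shake erases everything.
WIDTH, HEIGHT = 32, 24

class EtchASketch:
    def __init__(self):
        self.pixels = set()          # drawn (x, y) positions

    def update(self, pot_x, pot_y):
        # Map the 0..1 potentiometer range onto canvas coordinates.
        x = min(int(pot_x * WIDTH), WIDTH - 1)
        y = min(int(pot_y * HEIGHT), HEIGHT - 1)
        self.pixels.add((x, y))
        return x, y

    def on_accelerometer(self, ax, ay, az, threshold=2.0):
        # Erase the sketch when the acceleration magnitude (in g)
        # indicates a shake rather than normal handling.
        if (ax * ax + ay * ay + az * az) ** 0.5 > threshold:
            self.pixels.clear()

sketch = EtchASketch()
sketch.update(0.5, 0.5)
sketch.on_accelerometer(0.0, 0.0, 1.0)   # resting: nothing happens
print(len(sketch.pixels))                # 1
sketch.on_accelerometer(2.5, 0.5, 1.0)   # shake: canvas cleared
print(len(sketch.pixels))                # 0
```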

Etch a Sketch

Artist at work

The fourth group explored the possibility of building a lie detector based on a pulse and a moisture sensor. Culprits would be hooked up to the machine, and their heart rate and sweat level would confirm or refute their testimony. The prototype draws a real-time graph of both, showing the person’s state of agitation.

Lie Detector: Behind the scenes

To complete the lie detector’s front end, it was dressed up in an orange costume and equipped with LEDs, a vibration motor and servos to show and express its happiness with the current testimony. In a split second, the relaxed, sun-soaked fruit can turn into an angry orange terminator.

A happy gadget

An angry orange


There seems to be no limit to the participants’ creativity and engagement. Neither the occasional blue screen (resetting the Gadgeteer mainboard sometimes has a disruptive effect on the connected computer) nor the rather limited access to building materials could keep the results from turning out astonishing.

Neatly packed into a one-day workshop, the strengths and limitations of Gadgeteer can be concisely exposed.


Rapid Prototyping with .NET Gadgeteer: Workshop at TU Darmstadt

The tour has begun. Last week we started off by conducting our first external workshop at TU Darmstadt with the Embedded Systems Group of Kristof Van Laerhoven. 10 participants joined in and hacked away on our .NET Gadgeteer kits. The workshop lasted 2 days, so we had enough time to introduce the Gadgeteer platform, walk participants through example projects and reserve an entire day for hacking on projects of their choice.

The workshop focused on hands-on experience. After letting participants implement the classic 5-minute camera (even though most of the time it takes us 6 due to compilation delays), they had the chance to enhance the camera. Various ideas were born to mount the camera on multiple servo motors and remote-control the resulting spy device. Over the course of the next day, two teams focused on a mobile camera, while one team ventured off to augment a table to detect knock positions by spreading multiple accelerometers across it.


Camera Tower:

This construct of two servos could be controlled by the joystick and sent a live camera stream to the touchscreen.

Oscilloscope: Bringing out the big guns..

Knock Detection:

Knock knock… only one accelerometer there.

The idea behind this spider-like construction was to use data from 3 accelerometers to detect the location of a knuckle knock on the table. Unfortunately, the project exceeded the bus capabilities of the FEZ Spider mainboard. Even though the mainboard generally supports connecting multiple accelerometers, data can only be read from one at a time because they share the same bus. Probably an I2C bus addressing problem, which we will forward to Microsoft. Nevertheless, a great idea.

The Batmobile:

Batmobile: Gadgeteer Workshop – TU Darmstadt

This heroic prototype combines the power of servos with clever engineering. A joystick controls two servo motors, which in turn drive the wheels of the vehicle. It can ride forward, turn left/right and… kick things off the table. The camera is currently more of a dummy weight than a spy gadget, but could be combined with the functionality of the Camera Tower.


A two-day workshop was just the right amount of time to let participants tinker with all the components, sensors and code fragments we dumped on them. Also, you can never fully prepare for all the ideas that eventually boil up from an engineer’s mind. But inventiveness helps out a great deal: tape rolls were turned into wheels, USB cables were cut open to be soldered to extension boards, and coffee cups were considered for controlling human breath rhythm.

There goes another USB cable..

We learned a great deal and are excited to see what participants of future workshops will come up with.

Building devices for Text-entry: Student Workshop at University of Stuttgart


Last week we conducted our first Gadgeteer workshop for a test audience comprising 18 Master-level students in the context of our advanced HCI lecture. After a short introduction to the Gadgeteer platform, C# and its quirks, we had every group build a digital camera with components from the basic Gadgeteer kit. It was interesting to observe how they split up the tasks of component assembly, programming and tangible frame building (we gave them several breadboards and pins to attach components to).


Camera with Display

Text Entry Challenge

This initial tinkering with the camera component went quite smoothly, and so we moved on to the main part of our workshop. In class we had recently discussed various text-entry methods such as T9, FrogPad and Twiddler. We therefore proposed the following challenge: we formed 4 groups of 4-5 students. Each group received a basic Gadgeteer kit including the following components:

  • UsbClientDP module
  • T35_display module
  • Button module

Additionally, we handed each group a random sensor and asked them to build a device for text entry. After 90 minutes they would have to type in the phrase:

"rapid prototyping with gadgeteer"

We would time how long it took each group to write the phrase to the screen using their input device. These are the sensors we picked:

  1. Compass
  2. Potentiometer
  3. IR distance sensor
  4. Accelerometer

Game on! Only 15 minutes in, our compass group presented a first prototype browsing through the alphabet using compass readings. We were aware that some input methods were more obvious and easier to tinker with than others, but life is not fair anyway. And so the group using the IR distance sensor ran into their first problems when they had to use the analog extension module in order to read values from the sensor.


Here are the prototypes our students created:

1. Text entry through compass:

Group 1: Compass and button as text input

This group figured out how to use the cardinal direction as discrete input to select a letter from the alphabet. The button was pressed to confirm the selected letter and send it to the screen.
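
A minimal sketch of such a mapping (our illustration, not the group’s actual code): divide the 360° compass circle into 26 equal sectors, one per letter, and let a button press confirm the current selection.

```python
# Hypothetical heading-to-letter mapping for compass-based text entry.
import string

def heading_to_letter(heading_degrees: float) -> str:
    # Normalize to [0, 360) and divide the circle into 26 equal sectors.
    sector = int((heading_degrees % 360.0) / 360.0 * 26)
    return string.ascii_lowercase[sector]

print(heading_to_letter(0))      # a  (north)
print(heading_to_letter(180))    # n  (south, halfway through the alphabet)
```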

2. Text entry through accelerometer:


Group 2: Accelerometer and button as text input


Using the accelerometer, this group designed an input device that browsed through the alphabet by tilting the breadboard: a left tilt for the previous letter, a right tilt for the next letter, and a backward tilt to correct errors. A button press confirmed the selected letter.

3. Text entry through potentiometer:


Group 3: Potentiometer and button as text input

This group found a way to cycle through the alphabet by reading the potentiometer’s values and was therefore able to move back and forth rather quickly. Again, a button press confirmed the current letter selection.

4. Text entry through IR distance sensor:


Group 4: IR sensor and button as text input

As mentioned, using the IR sensor proved to be a bit more challenging due to the necessity of using the extension module. Unfortunately, our student group failed to complete the prototype in time using this exciting input modality.

At the end of the challenge each group had to perform a live demonstration of their prototype. We timed how long it took them to write the designated phrase. The results were the following:

  • Group 1: Button + Compass: 133 sec
  • Group 2: Button + Accelerometer: 181 sec
  • Group 3: Button + Potentiometer: 56.1 sec
  • Group 4: Button + IR distance: did not finish

Congratulations to Group 3, who managed to build a text-entry method using a button and a potentiometer in less than 90 minutes, with a spectacular finishing time of 56.1 seconds.


Browsing the alphabet

Overall, the students showed a lot of involvement, and it was interesting to see how they split up the different tasks among each other. Running workshops like this alongside the actual lecture was well received and helped solidify the topics discussed in class.

We are excited to further experiment with rapid hardware prototyping on Microsoft’s Gadgeteer platform to support our teaching and to invite students to consolidate lecture knowledge in a hands-on fashion.

XBee ZigBee HowTo


This first setup describes how to configure two XBee Series 2 modules in order to communicate with each other.


In contrast to XBee Series 1 nodes, which use the plain 802.15.4 MAC layer protocol to communicate (point-to-point or point-to-multipoint only), Series 2 nodes use the much more powerful ZigBee protocol, which builds on top of 802.15.4. Because Series 1 devices can’t address their destination node within the payload, they must be flashed with a destination address using the X-CTU tool or a serial terminal program and AT commands. Series 2 devices do not just send raw data; they encapsulate their payload in a protocol structure similar to many others (e.g. TCP/IP). This way they can provide much broader functionality and can be used for building complex mesh sensor networks. Each node within a ZigBee network must take one of three roles:

  • Coordinator: Exactly one for each network
  • Router: None or many; additionally route packets to other devices
  • End Device: None or many
XBee ZigBee Series 2 module | XBee Explorer adapter | Gadgeteer XBee Adapter

XBee Device Configuration

Each XBee device must be flashed and configured with a certain firmware depending on its role in the network. Due to the XBee microcontroller’s memory constraints, not all functions can be supported by a single firmware. In our scenario, one device is flashed as Coordinator in API mode and the other as Router in AT mode. Both must be in the same network (same PAN ID), and there must be only one Coordinator. Put the first XBee device on the SparkFun Explorer and connect it via USB to the desktop computer. A COM port should be installed automatically (remember the port number). Now install the latest version of the X-CTU tool from the website [1] and run it.

API and AT Mode

The configuration described in the next part enables each ZigBee device configured as Router or End Device to send data received over its serial port to the Coordinator node (Router/End Device in AT mode). The Coordinator can send data to all other devices in the same PAN or address each device individually (Coordinator in API mode). This centralized topology enables End Devices to send and receive data through a transparent serial interface without the need for addressing. If dynamic peer-to-peer communication is needed, Routers and End Devices must also be set to API mode; in this case the API mode protocol must be supported by the microcontroller connected to the serial interface of the ZigBee Router or End Device. In a configuration with only two ZigBee devices, the Router/End Device sends all data to the Coordinator, and the Coordinator can broadcast all data to all Routers/End Devices (in this case only one). An even easier way to connect two ZigBee modules is to set both to AT mode; the Coordinator must then be configured with the destination address of the receiving module (DH/DL setting). The following configuration describes the mixed-mode variant, which can be used, for example, with the XBee Internet Gateway (XIG) to give ZigBee Routers/End Devices Internet access.
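
To make the “protocol structure” of API mode concrete, here is a small Python sketch (our illustration, based on the XBee API frame layout: start delimiter 0x7E, 16-bit length, frame data, checksum, with AP=2 escaping of reserved bytes):

```python
# Sketch of XBee API-mode (AP=2) frame assembly.
# Reserved bytes that must be escaped inside the frame:
# start delimiter 0x7E, escape 0x7D, XON 0x11, XOFF 0x13.
ESCAPE_BYTES = {0x7E, 0x7D, 0x11, 0x13}

def escape(payload: bytes) -> bytes:
    out = bytearray()
    for b in payload:
        if b in ESCAPE_BYTES:
            out += bytes([0x7D, b ^ 0x20])   # escape marker + XORed byte
        else:
            out.append(b)
    return bytes(out)

def build_frame(frame_data: bytes) -> bytes:
    length = len(frame_data).to_bytes(2, "big")      # MSB first
    checksum = 0xFF - (sum(frame_data) & 0xFF)       # XBee checksum rule
    body = escape(length + frame_data + bytes([checksum]))
    return bytes([0x7E]) + body                      # delimiter stays unescaped

# AT command frame "NI" (frame type 0x08, frame id 0x01):
frame = build_frame(bytes([0x08, 0x01, ord("N"), ord("I")]))
print(frame.hex())  # 7e000408014e495f
```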

Coordinator Configuration:

Select the correct COM port and press “Test/Query”. If this works, go to the “Modem Configuration” tab and start by flashing the newest firmware onto your device. After updating the firmware (Modem: XB24-ZB, Function Set: ZIGBEE COORDINATOR API, Version 21A7 or newer), press the “Write” button. Then press “Restore” to reset all default values and get a clean configuration. Now set all values as follows and press “Write” again. Congratulations, you have set up the Coordinator module!

  • PAN ID = 1 (or any other value)
  • AP = 2 (enable API mode with escaping)
  • NI = MCI-BEE-1 (human-readable name)

If your modem (here XB24-ZB) cannot be found within X-CTU, try to update the tool automatically or download the missing modem definition (a .zip file) from Digi’s repository and install it manually (X-CTU → Modem Configuration → Download new versions… → File…).

Router Configuration:

After updating the firmware (Modem: XB24-ZB, Function Set: ZIGBEE ROUTER AT, Version 20A7 or newer), press the “Write” button. Afterwards press “Restore” to reset to the default values. Now set all values as follows and press “Write” again:

  • PAN ID = 1 (MUST be the same as the Coordinator’s!)
  • Destination High DH = 0
  • Destination Low DL = 0 (the Coordinator’s address)
  • JV = 1 (the Router tries to rejoin the Coordinator after startup)
  • NI = MCI-BEE-2 (human-readable name)
Coordinator Configuration in X-CTU tool

Optionally, our MCI-BEE-2 could have been configured as an End Device. Your XBees should be configured now. This can be checked by connecting the Coordinator in API mode to a USB port using the SparkFun Explorer and starting the “Terminal” tab within the X-CTU tool. Data received from other modules in the same PAN will be displayed in the window, enclosed in the API mode protocol structure.

Gadgeteer Quick Start Guide

This is a quick start guide for .NET Gadgeteer and the FEZ Spider Kit. It includes instructions for the installation of necessary software and drivers of the Gadgeteer platform as well as for the first Gadgeteer program.

GHI Spider Kit

Software Installation

GHI Electronics, the manufacturer of the Spider Kit, describes the three steps necessary for setting up the .NET Gadgeteer development environment. Follow these steps in the given order. Even if you already have a full version of Visual Studio 2010 installed, we recommend installing Microsoft Visual C# Express 2010 nevertheless. In the second step, install the Microsoft .NET Micro Framework 4.2 QFE2 SDK (not 4.1!). For the last step you need to create an account on the GHI Electronics website in order to download and install the GHI NETMF v4.2 and .NET Gadgeteer Package.

If you want your FEZ Spider to run out of the box, the mainboard firmware should have the matching version (if not, you will soon be able to read another article about how to update your firmware). The firmware version is displayed on the T35 display module after you connect it to the Gadgeteer mainboard and start the system for the first time.

First Gadgeteer Application

Run Visual C# Express and create a new Project.

Create a new NETMF 4.2 project

You will start within the hardware layout view. Drag and drop the following modules from the left toolbox onto the white space near the spider mainboard:

  • UsbClientDP module
  • T35_display module
  • Button module
Right-click on the white space and select “Connect all modules”. The IDE will automatically connect all your selected modules to compatible ports on the Spider mainboard. Now all you need to do is connect your physical modules the same way. Notice that all sockets on the mainboard as well as on the modules are labeled (X, Y, U, S, etc.) to describe their compatibility (e.g. socket-A modules can only be plugged into A sockets on the mainboard).

Initial hardware configuration

Plug the USB mini cable into the UsbClientDP module and connect it to a USB 2.0 port on your computer (hint: USB 3.0 has not worked for us!). When the Spider Kit is connected via USB for the first time, Windows will try to install the driver. If this fails, point the driver installation routine to the following folder (you can find the device as EMX in your system’s device manager):

C:\Program Files (x86)\GHI Electronics\GHI Premium NETMF v4.2 SDK\USB Drivers\GHI_NETMF_Interface

After successfully installing the driver you should see a “GHI NETMF Debug Interface” device in your device manager:

Successfully installed Gadgeteer driver

Switch back to Visual C# Express and select the Program.cs tab, which holds the skeleton code for your first Gadgeteer application. Enter the following lines and hit the “Start Debugging” button at the top of Visual Studio:

First Gadgeteer program code

You can monitor the deployment process in the status bar of Visual Studio. If the process takes too long, try resetting the Spider mainboard by pressing the reset button. If deployment is successful, a debug window comes up. Press the button module and see what happens on the display. Congratulations, you just wrote your first Gadgeteer application!

Further web resources on Gadgeteer