The Junction journey of our team – Maksym, Raul, Christian, Carlos and me, Eerik – actually began on September 12th at Spark Hub in Tartu, at an event called Tech Race Tartu. Tech Race Europe is a hackathon touring cities around Europe, and as it's meant for everyone regardless of skillset, we decided to give it a go and have some fun solving tons of challenges. After the event we thought we had done okay and didn't even think about winning it.
To our surprise, we ended up winning the event by solving all kinds of coding challenges (for example, coding and playing back Queen's Bohemian Rhapsody in Python). That win got us automatically accepted to Junction 2019, Europe's leading hackathon, where we could put our machine learning, data engineering and software development skills to the test. And as a bonus, we got tickets to sTARTUp Day, too.
Before Tech Race we hadn't actually planned to join Junction this year, as we had been there before and didn't feel like hacking through days and nights on coffee without sleep again. However, after being automatically accepted we decided to go and have some fun in Helsinki. We had roughly two months until the main event, so we started preparing (mainly organizing accommodation and transportation tickets). Well, those two months passed by way too fast…
An experience to inspire
On Friday, November 15th we arrived in Helsinki. It was a breezy, yet beautiful day. This time we had actually decided to book accommodation in order to get at least some quality sleep in a bed, instead of on a floor in the middle of hundreds of other hackers (which, of course, was an experience in itself the previous year). So we headed to our apartment (we even had a sauna there!). But first things first: we went sightseeing in Helsinki and enjoyed our stay, though we still didn't have a clear idea about a specific track or project. After unpacking we grabbed some snacks from the local K-Market and finally headed to the Junction venue on the Aalto University campus to receive our event passes.
The venue looked awesome! The School of Arts, Design & Architecture building is huge and decorated in various colors, which gives it a cozy and creative vibe. Hundreds of hackers were navigating their way through the building from one track to another – and so did we. We decided to visit every booth (track) to see what kind of problems they were trying to solve, what tools they were providing and what kind of data they had. This year the tracks were very data-related, which matches the general direction in the world, too.
Choosing our challenge to solve with machine learning
While visiting all the booths we didn't find a clear favorite challenge to tackle. Though many tracks and problems were appealing, there unfortunately weren't any mind-blowing gadgets to play around with or to build our prototype on (compared to last year). This set us back a bit, so we grabbed some swag and energy drinks on the way back to our apartment to brainstorm the problem further. We also got some pizza and cake for our birthday boy Raul!
We spent many hours playing around with various ideas, including:
- analyzing the “DNA of Helsinki” – tourist movement data from the Port of Helsinki – by combining 1.7 billion data rows with other data sources;
- building an innovative solution using Finland’s open data (several hundred data sets) in cooperation with Microsoft Azure, leveraging Azure’s data science tooling and data & AI services;
- using data such as food ingredients, pricing, product availability and basket data to optimize and steer customers’ purchase behavior.
In the end we didn’t choose any of those and decided to hack Helvar and VTT’s track, “Making Spaces Brighter with Radar Technology”, instead. The idea was to develop futuristic applications capable of delivering state-of-the-art utility (lighting, temperature, air conditioning) optimization by applying machine learning and AI algorithms to data from radar sensors. Fortunately, the organizers of the track provided the hackers with demo tools, equipment and data. We got two datasets to start with: one from an indoor garage and another from a meeting room. Both sets were collected using an RGB camera and a 60 GHz 2D radar, and the datasets were time-synchronized. A virtual demo environment for simulating the behavior of the lights was also provided, controllable via a dedicated API, along with demo luminaires that could likewise be controlled via the API.
Let there be light
By midnight on Friday (the initial project idea submission deadline) we had managed to frame our main idea: use machine learning on radar and camera data to model shared spaces and optimize their utilization. Radar data is used to monitor the number of people in a specific area, and this information is used to model the space’s occupancy over time. The occupancy model is then used to set the best global settings for light, room temperature and air conditioning. For personalized or local adjustments, the high-resolution radar is used to estimate the user’s pose and recognize gesture commands.
In our physical demo we decided to present a local application of the product, which would adjust the local lighting settings and could be controlled using body gestures. When building the prototype we used a webcam as a proxy for radar detection, as we didn’t have a separate radar to prototype with. So we used our laptop’s webcam and a pre-trained model called PoseNet for user pose extraction. Setting up PoseNet wasn’t complicated, but there were some challenges nevertheless. For example, gesture detection worked better if the person was visible at full length (including legs), and the room lighting needs to be good to guarantee the best detection.
As we didn’t want the detection to constantly change the local lighting setting, we created start and end gestures. To “turn on” and “turn off” the application, we used a gesture where both wrists are brought close together. Since PoseNet gives us the wrists’ coordinates, we set a threshold for how close the wrists should be in order to toggle the application.
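The toggle check itself boils down to a distance test on two keypoints. Here is a minimal sketch of that idea in Python, assuming the wrists arrive as `(x, y)` pixel coordinates from the pose model; the threshold value and function names are illustrative, not our actual demo code:

```python
import math

# Hypothetical threshold in pixels; in practice we tuned this by hand.
TOGGLE_THRESHOLD = 50

def wrist_distance(left_wrist, right_wrist):
    """Euclidean distance between the two wrist keypoints (x, y) in pixels."""
    dx = left_wrist[0] - right_wrist[0]
    dy = left_wrist[1] - right_wrist[1]
    return math.hypot(dx, dy)

def should_toggle(left_wrist, right_wrist, threshold=TOGGLE_THRESHOLD):
    """True when the wrists are close enough together to count as the toggle gesture."""
    return wrist_distance(left_wrist, right_wrist) < threshold
```

For example, `should_toggle((100, 200), (120, 210))` fires because the wrists are only about 22 pixels apart, while hands held wide apart do not trigger it.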
After successfully “turning on” the application, the light level could be controlled by moving the right arm up or down. The color temperature was controllable with the same hand by moving it from left to right. To save the current setting, the “turn off” gesture had to be executed (putting the wrists together). We also added a state where the application couldn’t be turned on for 2 seconds right after turning it off, to ensure it wouldn’t be turned on accidentally. To improve the user experience, we also included sound effects for turning the application on and off.
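The mapping from hand position to light settings, plus the 2-second lockout, can be sketched as follows. All ranges, names and the kelvin values are assumptions for illustration (a 640×480 webcam frame, a 2700–6500 K temperature range); the post doesn't show the real implementation:

```python
import time

# Illustrative webcam frame dimensions in pixels (assumed, not from the demo).
MIN_Y, MAX_Y = 0, 480
MIN_X, MAX_X = 0, 640
COOLDOWN_SECONDS = 2.0  # lockout after "turn off" to avoid accidental re-activation

def brightness_from_wrist_y(y):
    """Map the right wrist's vertical position to a 0-100 light level (hand up = brighter)."""
    y = min(max(y, MIN_Y), MAX_Y)
    return round(100 * (MAX_Y - y) / (MAX_Y - MIN_Y))

def color_temp_from_wrist_x(x, warm=2700, cool=6500):
    """Map the right wrist's horizontal position to a color temperature in kelvin."""
    x = min(max(x, MIN_X), MAX_X)
    return round(warm + (cool - warm) * (x - MIN_X) / (MAX_X - MIN_X))

class GestureController:
    """Tracks on/off state, refusing to turn back on during the cooldown window."""

    def __init__(self):
        self.active = False
        self.last_off = 0.0

    def toggle(self, now=None):
        """Handle one toggle gesture; returns the new on/off state."""
        now = time.monotonic() if now is None else now
        if self.active:
            self.active = False
            self.last_off = now
        elif now - self.last_off >= COOLDOWN_SECONDS:
            self.active = True
        return self.active
```

A toggle gesture arriving within two seconds of turning the lights controller off is simply ignored, which is what keeps an accidental second wrist-touch from immediately re-activating the application.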
For the virtual demo, the organizers provided an environment where we set up a simulated office with 14 rooms, each controlled in a different way. “Room 01” was controlled by the physical demo application, meaning that whenever we adjusted the local settings with our gestures, the simulation environment changed accordingly in real time. “Room 02” was controlled by a regression model trained on historical data: depending on the day of the week and the hour of the day, the light changed based on that history. The other rooms were controlled by manual input, meaning that lighting conditions changed when people were added or removed manually.
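The post doesn't show the actual regression model behind “Room 02”, but the core idea – predict a light level from weekday and hour using history – can be approximated with a simple per-slot historical average. This is a minimal stand-in sketch, not the real model:

```python
from collections import defaultdict

class OccupancyLightModel:
    """Predicts a light level from (weekday, hour) using historical averages.

    A minimal stand-in for the regression model described above; the actual
    features and model family used in the demo are not shown in the post.
    """

    def __init__(self):
        self._sums = defaultdict(float)
        self._counts = defaultdict(int)

    def fit(self, records):
        """records: iterable of (weekday, hour, light_level) tuples from history."""
        for weekday, hour, level in records:
            key = (weekday, hour)
            self._sums[key] += level
            self._counts[key] += 1

    def predict(self, weekday, hour, default=0.0):
        """Average historical light level for this slot, or `default` if unseen."""
        key = (weekday, hour)
        if self._counts[key] == 0:
            return default  # no history for this slot: keep lights at the default
        return self._sums[key] / self._counts[key]
```

With such a model, a Monday-morning slot that was historically bright stays bright, while a slot that was always empty defaults to lights off, which is the behavior the simulated room exhibited.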
How much hacking is too much?
At some point at night we discovered that the light bulbs outside our building were blinking. For a moment we got super suspicious and thought that we were controlling the street lights of the entire block. Things like that happen when you don’t get enough sleep.
After two days of hacking we were finally ready to demo our prototype. We had breakfast, cleaned the apartment, polished our prototype and set off towards the venue. Right after we stepped outside, a huge gust of wind hit us and the pizza boxes got their wings and took off. That was a sign!
Eventually we got to the venue on time and quickly set up our prototype for presentation. Luckily we got a whole classroom to ourselves, so we didn’t have to worry about distractions in front of the webcam possibly messing up the demo. First the track organizers visited us to listen to our presentation and try a live demo of the physical application. In general, they seemed very satisfied. After a while other groups started visiting us, and at some point the whole classroom was full of enthusiastic hackers.
All in all, we did very well and were satisfied with our end result at Junction. The team was great, and we all had a good time and enjoyed the event. And as we didn’t have much time to see the city, we decided to go back soon to visit Helsinki again!
By Eerik Muuli, Developer at STACC
If you’d like to be a part of our cool community of hackers, consider joining STACC and apply now!