How LookingForTable predicts a value [Neural Nets]

LookingForTable is perhaps the most famous ShowMeYourCode project at the moment. It's an Android application that helps people decide where to go based on how full each store is.

As the description on play store says:

“A common problem when you want to go out is that you never know whether the restaurant or coffee shop you want to visit is out of space. If it is, you need to change your plans and go somewhere else, maybe far away, and the loop might continue. So, ‘Looking For Table’ is the solution. People who are already in a store can enter the number of available spots, and the store itself can do it with a special account. This way, everyone who wants to go out will know where they are most likely to find a table for the night.”

All this sounds great, but the problem we had is that the app needed many users to be accurate. Specifically, we needed at least one entry per hour in every store for the app to actually work for its users. And since we hadn't reached many store owners yet, users' entries were essential. But the whole thing is a vicious circle: if you don't have users, you don't have entries, so the app isn't useful, and then you don't get users!

The solution was machine learning. Using a neural network, we created a feature that predicts the number of empty tables based on previous entries. The algorithm actually learns from experience, and after a long time it's going to be a really good predictor.

As we said, we used a neural network for this task. If you are not familiar with them, take a look at our first and second tutorials (in Greek) or at a video like the one below.

Now that you know some basic stuff about neural networks, let's break down the code and see where the magic happens!

First, we need to define our model: the inputs, the outputs, and the hidden layer(s). We chose a 3-10-5 strategy. That means we'll have 3 inputs, 10 neurons in the one and only hidden layer, and 5 outputs. Let's see what all this means:
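To make the 3-10-5 shape concrete, here is a minimal NumPy sketch of the forward pass of such a network. The weights are random placeholders (in the real app they are learned from user entries), and the sigmoid/softmax activation choices are assumptions for illustration; the project's actual code is in the notebooks linked below.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative random weights, scaled small to keep the sums well-behaved.
# In the real app these values come from training on user entries.
W1 = rng.normal(size=(3, 10)) * 0.1   # input (3) -> hidden layer (10 neurons)
b1 = np.zeros(10)
W2 = rng.normal(size=(10, 5)) * 0.1   # hidden (10) -> output (5 classes)
b2 = np.zeros(5)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def softmax(z):
    e = np.exp(z - z.max())  # subtract the max for numerical stability
    return e / e.sum()

def predict(x):
    """Forward pass: 3 inputs -> 10 hidden neurons -> 5 class probabilities."""
    hidden = sigmoid(x @ W1 + b1)
    return softmax(hidden @ W2 + b2)

# Example input: [day_of_week, hour, store_id]
probs = predict(np.array([4.0, 21.0, 7.0]))
print(probs.shape)  # (5,) — one probability per occupancy class
```

In practice the inputs would also be normalized before training, but the shape of the computation is the point here.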

The 3 inputs are: day of week, time, and store id. So, as you can see, we'll have one neural network for all the stores. For “time” we obviously keep only the hour, not the minutes or seconds.
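Turning a timestamped entry into these three inputs could look like the snippet below. The `to_features` helper is a hypothetical illustration, not the app's actual code.

```python
from datetime import datetime

def to_features(timestamp, store_id):
    """Build the 3-element input vector: [day_of_week, hour, store_id].

    Only the hour is kept from the time; minutes and seconds
    are discarded, as the article notes.
    """
    return [timestamp.weekday(), timestamp.hour, store_id]

entry = datetime(2017, 4, 25, 21, 37, 12)  # a Tuesday evening, 21:37:12
print(to_features(entry, store_id=7))      # [1, 21, 7]
```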

The output is a one-hot encoded vector over the classes [empty, almost empty, half, almost full, full]. The size of the store (its number of tables) matters for this calculation, as you'll see in the process.
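As a sketch of how a raw "free tables" entry might be bucketed relative to the store's size, here is one way to build such a one-hot target with five classes to match the five outputs. The thresholds (and the "full" label for the fifth class) are assumptions for illustration, not the project's actual values.

```python
LABELS = ["empty", "almost empty", "half", "almost full", "full"]

def occupancy_one_hot(free_tables, total_tables):
    """Bucket the occupied fraction of a store into 5 classes, one-hot encoded."""
    occupied = (total_tables - free_tables) / total_tables
    if occupied < 0.2:
        idx = 0  # empty
    elif occupied < 0.4:
        idx = 1  # almost empty
    elif occupied < 0.6:
        idx = 2  # half
    elif occupied < 0.9:
        idx = 3  # almost full
    else:
        idx = 4  # full
    return [1 if i == idx else 0 for i in range(5)]

# 10 free tables out of 20 -> half occupied
print(occupancy_one_hot(free_tables=10, total_tables=20))  # [0, 0, 1, 0, 0]
```

This is also where the store size enters the picture: the same number of free tables means very different things for a small café and a large restaurant.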

Let's jump to the Python code. Since the best way to understand what is happening is a Jupyter notebook, you can continue reading there 🙂

Training Notebook · Prediction Notebook · Get the whole repo

The final result when there isn't any entry looks something like this:

[Screenshot of the prediction in the app]

Download the new version of LookingForTable on the Play Store and discover all the new features 🙂


Studying at the Computer Engineering & Informatics Department. Member of the Fedora community. Interested in mobile apps and machine learning.
