Evangelos Tsiamalos
Personal Website


PROJECTS

Data Science Guides (Ongoing)

 As a Teaching Assistant for Data Science Discovery and as a Lead Instructor of Data Science at the DPI Digital Scholars program in Chicago, I went through the process of creating a number of Data Science Guides. Some of them can be viewed on the Data Science Discovery website. I am still writing new guides to be included on the website, so this is an ongoing project.

Statistical Learning (Ongoing)

Study of Splines: Dimension Reduction and the Double Descent Phenomenon

We first use a method to obtain the prediction matrix of a LOESS model. We then use it to compute fast leave-one-out cross validation and generalised cross validation scores.
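For any linear smoother with prediction matrix S (so that ŷ = S y), both scores can be computed from a single fit. A minimal sketch of the idea, using an ordinary polynomial fit as a stand-in for the LOESS smoother (only the diagonal and trace of S are needed):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 50
x = np.linspace(0, 1, n)
y = np.sin(2 * np.pi * x) + rng.normal(scale=0.3, size=n)

# Prediction (hat) matrix of a linear smoother: y_hat = S @ y.
# A cubic polynomial fit stands in here; for LOESS one would assemble
# the rows of S from the local weighted least-squares fits instead.
X = np.vander(x, 4)
S = X @ np.linalg.solve(X.T @ X, X.T)
y_hat = S @ y
h = np.diag(S)  # leverages

# Fast LOOCV: no refitting, only the leverages h_ii are needed.
loocv = np.mean(((y - y_hat) / (1 - h)) ** 2)

# GCV replaces each h_ii by the average, trace(S) / n.
gcv = np.mean(((y - y_hat) / (1 - np.trace(S) / n)) ** 2)
```

The key point is that both criteria reuse the fitted values and the diagonal of S, so model selection over smoothing parameters costs one fit per candidate rather than n.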

In the second part, we cluster time series by reducing their dimension using natural cubic splines.
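The idea can be illustrated as follows (a hypothetical sketch, using an ordinary cubic spline basis built from truncated powers rather than the natural cubic splines of the project): each long series is summarised by a handful of spline coefficients, and those low-dimensional vectors are what get clustered.

```python
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
t = np.linspace(0, 1, 100)

# Two groups of noisy series with different underlying shapes.
series = np.vstack(
    [np.sin(2 * np.pi * t) + rng.normal(scale=0.1, size=t.size) for _ in range(10)]
    + [np.cos(2 * np.pi * t) + rng.normal(scale=0.1, size=t.size) for _ in range(10)]
)

# Cubic spline basis via truncated powers (illustrative; the project
# used natural cubic splines), orthonormalised for numerical stability.
knots = np.linspace(0.2, 0.8, 4)
B = np.column_stack([t ** p for p in range(4)]
                    + [np.clip(t - k, 0, None) ** 3 for k in knots])
Q, _ = np.linalg.qr(B)

# Each 100-point series is reduced to 8 spline coefficients.
coefs = (Q.T @ series.T).T          # shape (20, 8)
labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(coefs)
```

Clustering in coefficient space is much cheaper than clustering the raw series, and the basis projection also acts as a smoother of the noise.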

Lastly, we observe the double descent phenomenon using ridgeless regression.
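The setup can be sketched as follows (a hypothetical illustration, not the project's code): minimum-norm least squares via the pseudoinverse, with the test error traced as the number of features p crosses the interpolation threshold p = n.

```python
import numpy as np

rng = np.random.default_rng(2)
n_train, n_test, d = 40, 200, 100
beta = rng.normal(size=d) / np.sqrt(d)
X_train = rng.normal(size=(n_train, d))
X_test = rng.normal(size=(n_test, d))
y_train = X_train @ beta + rng.normal(scale=0.5, size=n_train)
y_test = X_test @ beta + rng.normal(scale=0.5, size=n_test)

def ridgeless_test_error(p):
    # "Ridgeless" = minimum-norm least squares: the pseudoinverse picks
    # the interpolating solution of smallest norm when p > n_train.
    coef = np.linalg.pinv(X_train[:, :p]) @ y_train
    pred = X_test[:, :p] @ coef
    return np.mean((y_test - pred) ** 2)

# Interpolation threshold at p = n_train = 40; the test error typically
# peaks near it and descends again as p grows past it.
ps = [5, 20, 35, 40, 45, 60, 100]
errors = [ridgeless_test_error(p) for p in ps]
```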

Study of Linear Regression Regularisers and Principal Component Regression

We implement the Lasso and Ridge regularisers as well as Principal Component Regression on a linear regression problem, using cross validation to select the tuning parameters.
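With scikit-learn, the three approaches and their cross-validated tuning might be sketched like this (simulated data; the penalty and component grids are illustrative choices, not the project's):

```python
import numpy as np
from sklearn.datasets import make_regression
from sklearn.decomposition import PCA
from sklearn.linear_model import LassoCV, RidgeCV, LinearRegression
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import Pipeline

X, y = make_regression(n_samples=200, n_features=30, n_informative=5,
                       noise=5.0, random_state=0)

# Lasso and Ridge: built-in cross validation over the penalty strength.
lasso = LassoCV(cv=5).fit(X, y)
ridge = RidgeCV(alphas=np.logspace(-3, 3, 13), cv=5).fit(X, y)

# Principal Component Regression: PCA followed by ordinary least squares,
# cross-validating the number of retained components.
pcr = GridSearchCV(
    Pipeline([("pca", PCA()), ("ols", LinearRegression())]),
    {"pca__n_components": [2, 5, 10, 20]},
    cv=5,
).fit(X, y)
```

Both regularisers shrink coefficients toward zero (the Lasso can zero them out exactly), while PCR regularises by discarding low-variance directions instead.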

Investigation of KNN algorithm

We study the behaviour of the KNN algorithm on data simulated from mixtures of normals. By varying the configuration of the data, we examine how the classifier behaves. We perform cross validation and compare our implementation (written from scratch) to Python's.
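A minimal sketch of the comparison (the mixture centres, noise scale, and k are illustrative, not the project's settings): simulate two classes from mixtures of normals, classify by majority vote among the k nearest points, and check against scikit-learn.

```python
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(3)

def sample_mixture(means, n):
    # Each point: pick a mixture component, then add Gaussian noise.
    centers = means[rng.integers(len(means), size=n)]
    return centers + rng.normal(scale=0.5, size=(n, 2))

X0 = sample_mixture(np.array([[0.0, 0.0], [2.0, 2.0]]), 100)
X1 = sample_mixture(np.array([[0.0, 2.0], [2.0, 0.0]]), 100)
X = np.vstack([X0, X1])
y = np.array([0] * 100 + [1] * 100)

def knn_predict(X_train, y_train, X_new, k=5):
    # From-scratch KNN: majority vote among the k nearest training points.
    d = np.linalg.norm(X_train[None, :, :] - X_new[:, None, :], axis=2)
    nearest = np.argsort(d, axis=1)[:, :k]
    return (y_train[nearest].mean(axis=1) > 0.5).astype(int)

ours = knn_predict(X, y, X, k=5)
theirs = KNeighborsClassifier(n_neighbors=5).fit(X, y).predict(X)
```

With continuous features, distance ties are essentially impossible, so the two implementations should agree on virtually every point.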

CWI Traineeship Project

 During my traineeship at CWI, Amsterdam, I worked on and studied various topics. The main focus was on statistical hypothesis testing, and more specifically on developing testing methods that are valid under optional stopping. I became much more familiar with information theory and martingale theory.

 I also undertook a small project focused on the robustness properties of exponential families and on whether the same behaviour can be observed in distributions that are not exponential families.

Here is my project's report: Full_Report.pdf

Neural Network Design and Implementation

  This is a project I built during my undergraduate studies, when I first started reading about Deep Learning. I was always curious to see how neural networks work, and I wanted to understand them. I decided to build a regression neural network with backpropagation from scratch in Python. I tried to build it with the highest level of generality I could, meaning that the user is free to determine the parameters themselves rather than relying on fixed values. I derived the equations of the backpropagation algorithm, implemented them, and tested the network on a time-series dataset.
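A condensed sketch of the idea (not the original code): one hidden layer with a tanh activation, mean-squared-error loss, and hand-derived gradients applied by plain gradient descent, with the layer sizes left as free parameters in the spirit of the project.

```python
import numpy as np

rng = np.random.default_rng(4)

# Toy regression data: one input, one output.
X = np.linspace(-1, 1, 64).reshape(-1, 1)
y = np.sin(np.pi * X)

# User-configurable sizes and learning rate (illustrative values).
n_in, n_hidden, n_out, lr = 1, 16, 1, 0.01
W1 = rng.normal(scale=0.5, size=(n_in, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(scale=0.5, size=(n_hidden, n_out)); b2 = np.zeros(n_out)

mse0 = np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2)  # before training

for _ in range(5000):
    # Forward pass.
    h = np.tanh(X @ W1 + b1)
    pred = h @ W2 + b2

    # Backward pass: gradients of the mean squared error, layer by layer.
    grad_pred = 2 * (pred - y) / len(X)
    grad_W2 = h.T @ grad_pred
    grad_b2 = grad_pred.sum(axis=0)
    grad_h = grad_pred @ W2.T
    grad_z = grad_h * (1 - h ** 2)      # tanh'(z) = 1 - tanh(z)^2
    grad_W1 = X.T @ grad_z
    grad_b1 = grad_z.sum(axis=0)

    # Gradient descent step (in-place updates).
    for p, g in ((W1, grad_W1), (b1, grad_b1), (W2, grad_W2), (b2, grad_b2)):
        p -= lr * g

mse = np.mean((np.tanh(X @ W1 + b1) @ W2 + b2 - y) ** 2)   # after training
```

The chain rule is applied once per layer: the error signal `grad_pred` is pushed back through the linear output layer, then through the tanh nonlinearity, yielding the gradient for every weight and bias in one pass.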

Code for Neural Network

Project report (written in Greek, as it was part of my undergraduate studies).