- Hi, this is Steffen. I enjoy building machine learning systems, including model development, followed by playing and experimenting to get a better understanding 💥💥 - In my previous life, I ran numerical simulations to understand the chemical interactions between water and our environment. 🌍🌊🌱
Building Agents & LLM Workflows
Hi, Over the last three months, I’ve been working on agents and LLM workflows. I had the opportunity to do this as part of a residency program at Merantix. I’m deeply thankful for the opportunity and would highly recommend the program to others. Being surrounded by people with shared goals was incredibly motivating and inspiring. I already miss the discussions and the knowledge exchange. Unlike typical work environments, the participants here were truly aligned, and everyone learned from each other. ...
Evaluation of RAG systems
Hi, The implementation for this article is here. RAGs are complex systems, which becomes obvious when you try to evaluate them: there are multiple aspects that need to be checked. Here, I look into different approaches to get a better understanding of the problems that arise when evaluating RAG systems. RAG system evaluation involves two distinct parts: retrieval and generation. For retrieval, context relevance and noise robustness are key factors in assessing quality, while for generation, answer faithfulness, answer relevance, negative rejection, information integration, and counterfactual robustness are important (Gao et al. 2024). ...
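To make the retrieval side a bit more concrete, here is a deliberately crude sketch of a context-relevance score based on token overlap. This is my own toy illustration, not a metric from the post or from Gao et al.; real evaluations typically use an LLM judge or learned relevance models.

```python
# Toy context-relevance score: what fraction of the question's tokens
# actually appear in the retrieved context? (Illustrative only.)

def context_relevance(question: str, context: str) -> float:
    """Fraction of question tokens that appear in the retrieved context."""
    q_tokens = set(question.lower().split())
    c_tokens = set(context.lower().split())
    if not q_tokens:
        return 0.0
    return len(q_tokens & c_tokens) / len(q_tokens)

score = context_relevance(
    "what is retrieval augmented generation",
    "retrieval augmented generation combines a retriever with a generator",
)
print(round(score, 2))  # 0.6 -- 3 of 5 question tokens appear in the context
```

A score like this says nothing about faithfulness or negative rejection on the generation side; those need separate checks.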
Understanding approximate nearest neighbor algorithm
Hi, This post is about approximate nearest neighbor (ANN) algorithms. The code for this post is here, where I provide an example of using a framework as well as a Python implementation. Most of the Python implementations were written with the help of an LLM. I’m amazed at how helpful they are for learning new things. I see them like a drunken professor: with the right approach, a very helpful tool. As a next step in understanding RAGs, I want to take a closer look at approximate nearest neighbor algorithms. Basically, their purpose is to find the closest vector to a query vector in a database. Since I’m also interested in the implementation, I mostly follow this amazing blog post. Vector search is the basic component of vector databases and their main purpose. ANN algorithms look for a close match instead of an exact match. This loss of accuracy improves efficiency, which enables searching through much bigger datasets, high-dimensional data, and real-time applications. ...
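For context, this is the exact baseline that ANN methods approximate: a brute-force scan over the whole database by cosine similarity. A small sketch with made-up random data (not the post's implementation):

```python
import numpy as np

# Brute-force exact nearest neighbor over a toy "database" of unit vectors.
# ANN algorithms trade a bit of this accuracy for much faster search.

rng = np.random.default_rng(0)
db = rng.normal(size=(1000, 64))               # 1000 vectors, 64 dimensions
db /= np.linalg.norm(db, axis=1, keepdims=True)

query = db[42] + 0.01 * rng.normal(size=64)    # slightly perturbed db vector
query /= np.linalg.norm(query)

scores = db @ query                            # cosine similarity (unit vectors)
print(int(np.argmax(scores)))                  # 42 -- the perturbed source vector
```

The O(n·d) cost of this scan is exactly what makes ANN structures (graphs, trees, quantization) attractive at scale.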
Short example of Information Retrieval
Hi, Some time ago, I did a small project on information retrieval. I think it’s a good idea to share it, with all its shortcomings. Here is the code. Sadly, the LLM part doesn’t work with the quantized model, so I commented it out. The project is a small information retrieval system for an FAQ, where I want to map the correct answer to a question. In my example, it’s a 1:1 mapping between question and answer, but it also works with multiple answers. ...
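The question-to-answer mapping idea can be sketched in a few lines with a bag-of-words cosine similarity. The FAQ entries below are made up for illustration; the actual project uses embeddings rather than raw word counts.

```python
# Toy FAQ retrieval: return the stored answer whose question is most
# similar to the user's query, using bag-of-words cosine similarity.
import math
import re
from collections import Counter

faq = {
    "How do I reset my password?": "Use the 'Forgot password' link.",
    "Where can I download the app?": "From the official app store.",
}

def bow_cosine(a: str, b: str) -> float:
    ca = Counter(re.findall(r"\w+", a.lower()))
    cb = Counter(re.findall(r"\w+", b.lower()))
    dot = sum(ca[t] * cb[t] for t in ca)
    na = math.sqrt(sum(v * v for v in ca.values()))
    nb = math.sqrt(sum(v * v for v in cb.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str) -> str:
    best_question = max(faq, key=lambda q: bow_cosine(query, q))
    return faq[best_question]

print(retrieve("reset password"))  # Use the 'Forgot password' link.
```

Supporting multiple answers per question only changes the dictionary values from strings to lists.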
Get embeddings for multiple data sources
Hi, Following my first short post about RAGs, I would like to provide a brief overview of embeddings, which are used to find similar objects in a vector database. To better understand how various transformer models handle different input data types, I created this notebook, in which I explore text, image, audio, and video data. I’ve chosen to skip the more traditional text embeddings (TF-IDF, Word2Vec, or GloVe), because there are already very good tutorials available. Additionally, I plan to discuss the training of embedding models in a separate blog post. For this post, I mostly use pretrained classification models, where I take the last layer before the prediction head as the embedding. ...
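The "last layer before the prediction head" trick can be shown with a tiny stand-in network. The random weights below are a made-up placeholder for a pretrained model; the point is only which activation you keep.

```python
import numpy as np

# Toy 2-layer classifier: the hidden activation (before the class logits)
# serves as the embedding. With a real pretrained model, you would grab
# the same penultimate activation instead of the prediction.

rng = np.random.default_rng(1)
W1 = rng.normal(size=(10, 4))    # input -> hidden ("embedding" layer)
W2 = rng.normal(size=(4, 3))     # hidden -> class logits (prediction head)

def embed(x: np.ndarray) -> np.ndarray:
    """Return the penultimate activation instead of the class prediction."""
    return np.tanh(x @ W1)       # shape (4,) -- the embedding

x = rng.normal(size=10)
emb = embed(x)
logits = emb @ W2                # what the classifier would normally output
print(emb.shape, logits.shape)   # (4,) (3,)
```

Two inputs can then be compared by the distance between their `embed(...)` vectors, which is exactly what a vector database indexes.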
Overview of RAG (Retrieval-Augmented Generation) systems
Hi, It’s been a while since my last post, mostly because of my own laziness. Over the past year, I’ve been working on several projects, one of which is a small RAG (Retrieval-Augmented Generation) system. I implemented it to combine external knowledge (in this case, internal safety documents) with a large language model (LLM). This approach allows the use of data that the LLM wasn’t trained on and also helps reduce hallucinations. ...
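The overall flow (retrieve relevant documents, then put them into the LLM's prompt) can be sketched end to end. Everything here is a made-up stand-in: the documents are invented, the retriever is a word-overlap toy, and `llm()` is a stub, not a real model call.

```python
# Minimal RAG flow: retrieve, build a prompt with the context, generate.

documents = [
    "Safety rule 1: wear protective goggles in the lab.",
    "Expense reports are due at the end of each month.",
]

def retrieve(query: str, k: int = 1) -> list[str]:
    def score(d: str) -> int:
        # toy retriever: rank documents by shared words with the query
        return len(set(query.lower().split()) & set(d.lower().split()))
    return sorted(documents, key=score, reverse=True)[:k]

def llm(prompt: str) -> str:
    # placeholder for a real model call
    return f"(model answer based on a prompt of {len(prompt)} characters)"

def rag_answer(query: str) -> str:
    context = "\n".join(retrieve(query))
    prompt = f"Answer using only this context:\n{context}\n\nQuestion: {query}"
    return llm(prompt)

print(rag_answer("what are the safety rules in the lab?"))
```

Grounding the prompt in retrieved text is what lets the model answer about documents it was never trained on.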
Deep Learning model explainability
Hi, In my first post, I looked into the explainability of classical machine learning models. As a next step, I’m interested in the explainability of neural networks. Model explainability is easy for simple models (linear regression, decision trees), and some tools exist for more complex algorithms (ensemble trees). For a deeper theoretical understanding, I highly recommend the book Interpretable Machine Learning by Christoph Molnar. The different approaches to model explainability are shown with a PyTorch model in this Kaggle notebook. ...
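One common neural-network attribution idea from that family is "gradient × input". The example below is my own toy illustration, not code from the notebook: for a single linear unit the gradient is just the weight vector, so the attribution can be written down without autograd.

```python
import numpy as np

# Gradient-times-input saliency for a toy linear "network".
# With a real PyTorch model you would get `grad` from autograd instead.

rng = np.random.default_rng(0)
w = rng.normal(size=5)           # stand-in for learned weights

def net(x: np.ndarray) -> float:
    return float(w @ x)          # toy network: a single linear unit

x = rng.normal(size=5)
grad = w                         # d net(x) / dx for a linear model
saliency = grad * x              # per-feature attribution
top_feature = int(np.argmax(np.abs(saliency)))
print(top_feature)
```

For deep nonlinear models the gradient depends on the input, which is what makes methods like integrated gradients necessary.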
Model explainability
Hi, Some months have passed since my last post. Model explainability is easy for simple models (linear regression, decision trees), and some tools exist for more complex algorithms (ensemble trees). With this post, I want to dig into the tools for interpreting more complex models. For a deeper theoretical understanding, I highly recommend the book Interpretable Machine Learning by Christoph Molnar. The different approaches to model explainability are shown with a RandomForest model in this Kaggle notebook. ...
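One model-agnostic tool from that toolbox is permutation importance: shuffle one feature and measure how much the model's error grows. The sketch below uses a known linear function as a stand-in for a fitted RandomForest, so the expected result is obvious; it is an illustration of the technique, not the notebook's code.

```python
import numpy as np

# Permutation importance on a toy "model": break one feature at a time
# and record the increase in mean squared error.

rng = np.random.default_rng(3)
X = rng.normal(size=(500, 3))
y = 3 * X[:, 0] + 0.1 * X[:, 2]            # feature 1 is irrelevant

def model(X: np.ndarray) -> np.ndarray:
    return 3 * X[:, 0] + 0.1 * X[:, 2]     # stand-in for a fitted model

base_err = np.mean((model(X) - y) ** 2)    # 0 by construction
importance = []
for j in range(3):
    Xp = X.copy()
    Xp[:, j] = rng.permutation(Xp[:, j])   # destroy feature j's information
    importance.append(np.mean((model(Xp) - y) ** 2) - base_err)

print(int(np.argmax(importance)))          # 0 -> feature 0 matters most
```

Because the model never looks at feature 1, its importance comes out exactly zero, while the heavily weighted feature 0 dominates.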
Implementing a Transformer Network from scratch
Hi, This post is about my implementation of an encoder transformer network from scratch, as a follow-up to understanding the attention layer, together with the Colab implementation. I use a simplified dataset, so I don’t expect great results. My approach is to build something from scratch in order to understand it in depth. I faced many challenges during my implementation, so I aligned my code with the BertSequenceClassifier from Hugging Face. My biggest challenge was getting the network to train at all. This took me several months of low focus and a proper de- and reconstruction of the architecture. Minor issues were missing skip connections and some data preparation problems. ...
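Since missing skip connections were one of the issues, here is a small sketch of what a transformer-style residual sublayer looks like. The shapes and the linear sublayer are made up; the point is the `x +` term, which is exactly what is lost when the skip connection is forgotten.

```python
import numpy as np

# Post-LN residual sublayer (as in the original BERT): the input is added
# back to the sublayer output before layer normalization.

def layer_norm(x: np.ndarray, eps: float = 1e-5) -> np.ndarray:
    return (x - x.mean(-1, keepdims=True)) / (x.std(-1, keepdims=True) + eps)

def residual_sublayer(x: np.ndarray, sublayer) -> np.ndarray:
    return layer_norm(x + sublayer(x))   # `x +` is the skip connection

rng = np.random.default_rng(0)
x = rng.normal(size=(2, 8))              # (tokens, hidden)
W = rng.normal(size=(8, 8))
out = residual_sublayer(x, lambda h: h @ W)
print(out.shape)  # (2, 8)
```

Both the attention block and the feed-forward block are wrapped this way, which is largely what keeps gradients flowing through a deep stack during training.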
Learning about time-series analysis
Hi, Recently, I had to work on a simple time-series analysis. I performed poorly, since I had never worked with time series before. I believe in a deterministic world, and in general, I prefer to find the causality behind a specific data behavior over simple empirical modeling. However, I understand the need for time-series analysis when there is not enough data available, the underlying processes are not understood, the complexity is not bearable, or there is no time or need for a proper understanding of the process. The goal is to make a prediction based on previous observations. In the traditional sense (ARIMA), you look at trend, seasonality, and cycles; in the more modern way, you throw the data into a model architecture (deep learning). In this context, I should mention the famous paper Statistical Modeling: The Two Cultures; I prefer to use algorithmic models and treat the data mechanism as unknown. I would add that the underlying data mechanism is deterministic, and we should use the collected data to get improved models. Anyway, let’s use the many resources in the time-series field to get better at it. ...
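As a minimal taste of the traditional side, here is the autoregressive core stripped out of ARIMA: fit y_t ≈ φ·y_{t−1} by least squares on simulated data and forecast one step ahead. The data and φ = 0.8 are invented for the sketch; real ARIMA additionally handles trend, differencing, and seasonality.

```python
import numpy as np

# Fit a bare-bones AR(1) model by least squares and forecast one step.

rng = np.random.default_rng(2)
y = [1.0]
for _ in range(200):                         # simulate an AR(1) with phi = 0.8
    y.append(0.8 * y[-1] + 0.1 * rng.normal())
y = np.array(y)

phi = (y[:-1] @ y[1:]) / (y[:-1] @ y[:-1])   # least-squares estimate of phi
forecast = phi * y[-1]                        # one-step-ahead prediction
print(round(float(phi), 2))                   # close to the true 0.8
```

The deep-learning alternative replaces this single coefficient with a learned architecture, but the prediction target (the next value from previous observations) stays the same.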