Elena's AI Blog

Python

Recommender Systems


Recommendation systems are algorithms that suggest relevant items to users. Depending on the application, these items could be movies, songs, products, or anything else. In this post, we explore the basics of collaborative and content-based filtering and code them in Python.
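To give a flavour of the collaborative filtering idea before diving into the post, here is a minimal sketch of user-based collaborative filtering with cosine similarity; the toy ratings matrix and item count are invented for illustration.

    import numpy as np

    # Toy user-item ratings matrix (rows: users, columns: items); 0 means "not rated".
    ratings = np.array([
        [5, 4, 0, 1],
        [4, 5, 1, 0],
        [1, 0, 5, 4],
        [0, 1, 4, 5],
    ], dtype=float)

    def cosine_similarity(a, b):
        return a @ b / (np.linalg.norm(a) * np.linalg.norm(b) + 1e-9)

    def recommend(user_idx, ratings, top_n=2):
        # Similarity of the target user to every other user.
        sims = np.array([cosine_similarity(ratings[user_idx], r) for r in ratings])
        sims[user_idx] = 0.0
        # Predicted score = similarity-weighted average of other users' ratings.
        scores = sims @ ratings / (sims.sum() + 1e-9)
        scores[ratings[user_idx] > 0] = -np.inf  # hide items already rated
        return np.argsort(scores)[::-1][:top_n]

    print(recommend(0, ratings))  # item indices recommended for user 0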

Super-girls don't cry in face-swaps


This post covers simple ways to create face swaps with code and AI tools such as InsightFace, and includes links to relevant research papers and GitHub repositories. We will also write some easy Python code for face detection and face swapping.
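The post itself builds on InsightFace; as a rough stand-in for the face detection step, here is a minimal sketch using OpenCV's bundled Haar cascade instead (the image path is a placeholder).

    import cv2

    # Load OpenCV's bundled Haar cascade for frontal faces.
    cascade = cv2.CascadeClassifier(
        cv2.data.haarcascades + "haarcascade_frontalface_default.xml"
    )

    image = cv2.imread("photo.jpg")  # placeholder path
    gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)

    # Detect faces and draw bounding boxes around them.
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    for (x, y, w, h) in faces:
        cv2.rectangle(image, (x, y), (x + w, y + h), (0, 255, 0), 2)

    cv2.imwrite("faces.jpg", image)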

OpenAI's Model Show-off


OpenAI's GPT models are highly sophisticated machine learning models used in fields such as natural language processing, coding assistance, and content creation. OpenAI's newest video-generating model, Sora, sets a new benchmark in video generation technology, which I briefly explore in this post.

What is Docker?


Docker lets you quickly deploy microservices, cloud-native architectures, or web apps. In this post, we will use Docker to create a reliable environment for Flask applications that efficiently manages dependencies and deployment intricacies.

Joking Flask App


In this post, I describe the process of building web applications using the Flask framework; we will create a website showing a random joke from a text file. We will learn about Jinja2 templates, static files, routing, and running Flask applications.
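As a taste of what the post builds, here is a minimal Flask sketch; the jokes.txt file and the templates/joke.html Jinja2 template are assumed names for illustration.

    import random
    from flask import Flask, render_template

    app = Flask(__name__)

    @app.route("/")
    def joke():
        # Read one joke per line from a text file (path assumed for illustration).
        with open("jokes.txt", encoding="utf-8") as f:
            jokes = [line.strip() for line in f if line.strip()]
        # Render templates/joke.html with a randomly chosen joke.
        return render_template("joke.html", joke=random.choice(jokes))

    if __name__ == "__main__":
        app.run(debug=True)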

Cool Wallpaper with QR code for iPhone


When my iPhone is locked, I can share my website address with a QR code on the wallpaper. How can we use reportlab and Python to generate a QR code for an iPhone wallpaper?
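One possible approach, sketched with reportlab's QrCodeWidget; the URL and the output size are placeholders, and rendering to PNG with renderPM requires Pillow.

    from reportlab.graphics import renderPM
    from reportlab.graphics.barcode.qr import QrCodeWidget
    from reportlab.graphics.shapes import Drawing

    url = "https://example.com"   # placeholder website address
    size = 600                    # output size in pixels, chosen for a wallpaper

    widget = QrCodeWidget(url)
    x1, y1, x2, y2 = widget.getBounds()
    # Scale the QR widget up to the requested size.
    drawing = Drawing(size, size, transform=[size / (x2 - x1), 0, 0,
                                             size / (y2 - y1), 0, 0])
    drawing.add(widget)
    renderPM.drawToFile(drawing, "qr_wallpaper.png", fmt="PNG")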

Bias-Variance Challenge


In machine learning, we usually start from a simple baseline model and progressively adjust its complexity until we reach the sweet spot with the best model performance. How can we do this? Let's go through the most essential machine learning concepts and the bias-variance challenge.
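As a quick illustration of the bias-variance challenge, here is a small sketch (not taken from the post) that uses scikit-learn's validation_curve on synthetic data to watch a decision tree move from underfitting to overfitting as its depth grows.

    import numpy as np
    from sklearn.datasets import make_classification
    from sklearn.model_selection import validation_curve
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_classification(n_samples=500, n_features=20, random_state=42)
    depths = range(1, 15)

    # Training vs cross-validation accuracy for increasing model complexity.
    train_scores, val_scores = validation_curve(
        DecisionTreeClassifier(random_state=42), X, y,
        param_name="max_depth", param_range=depths, cv=5
    )

    for d, tr, va in zip(depths, train_scores.mean(axis=1), val_scores.mean(axis=1)):
        print(f"depth={d:2d}  train={tr:.3f}  cv={va:.3f}")
    # Low depth: both scores are low (high bias).
    # High depth: training accuracy climbs while CV accuracy drops (high variance).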

Decision Tree versus Random Forest, and Hyperparameter Optimisation


Decision trees, with their elegant simplicity and transparency, stand in stark contrast to the robust predictive power of Random Forest, an ensemble of trees. In this post, we compare the key distinctions, advantages, and trade-offs between these two approaches. We will use Scikit-Learn to train and test both models and perform hyperparameter optimisation to tune each model's parameters for improved performance.
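A condensed sketch of the kind of comparison the post performs, using GridSearchCV on a built-in scikit-learn dataset; the parameter grids are illustrative, not the ones from the post.

    from sklearn.datasets import load_breast_cancer
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import GridSearchCV, train_test_split
    from sklearn.tree import DecisionTreeClassifier

    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

    models = {
        "decision_tree": (DecisionTreeClassifier(random_state=42),
                          {"max_depth": [3, 5, 10, None]}),
        "random_forest": (RandomForestClassifier(random_state=42),
                          {"n_estimators": [100, 300], "max_depth": [5, 10, None]}),
    }

    for name, (model, grid) in models.items():
        # Exhaustive search over the grid with 5-fold cross-validation.
        search = GridSearchCV(model, grid, cv=5)
        search.fit(X_train, y_train)
        print(name, search.best_params_,
              f"test accuracy={search.score(X_test, y_test):.3f}")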

Generate Music with AI


In this post, we get into music generation with AI. We briefly survey existing AI applications that generate audio, then look at how transformers are used while coding music generation with HuggingFace transformers in Python.
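As a minimal sketch of the transformer-based approach, here is one way to generate a short clip with the facebook/musicgen-small checkpoint; the prompt and token count are arbitrary, and the exact API may vary between transformers versions.

    import scipy.io.wavfile
    from transformers import AutoProcessor, MusicgenForConditionalGeneration

    processor = AutoProcessor.from_pretrained("facebook/musicgen-small")
    model = MusicgenForConditionalGeneration.from_pretrained("facebook/musicgen-small")

    # Describe the music you want as text.
    inputs = processor(text=["calm lo-fi piano with soft drums"],
                       padding=True, return_tensors="pt")
    audio = model.generate(**inputs, max_new_tokens=256)  # a few seconds of audio

    rate = model.config.audio_encoder.sampling_rate
    scipy.io.wavfile.write("musicgen_sample.wav", rate=rate, data=audio[0, 0].numpy())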

Loop like a Pro with Python Iterators


This post explains the basics of Python iterators and their popular alternatives, such as list comprehensions. While these alternatives can use more memory, they are still useful in practice. The post also covers advanced techniques for working with iterators, including the itertools module and creating generators with the yield keyword. By mastering iterators, readers can write elegant and efficient code and become better Python programmers.
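A small taste of the topics covered: the iterator protocol, a generator built with yield, and an itertools pipeline.

    import itertools

    # The iterator protocol: iter() returns an iterator, next() pulls values lazily.
    numbers = iter([1, 2, 3])
    print(next(numbers))  # 1

    # A generator function produces values on demand with `yield`.
    def squares(n):
        for i in range(n):
            yield i * i

    print(list(squares(5)))  # [0, 1, 4, 9, 16]

    # itertools builds efficient iterator pipelines without materialising lists.
    evens = itertools.islice(itertools.count(0, 2), 5)
    print(list(evens))  # [0, 2, 4, 6, 8]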

Audio Signal Processing with Python's Librosa


In this post, I focus on audio signal processing and working with WAV files. I use Python's Librosa library to extract waveform features commonly used in research and in applications such as gender prediction, music genre prediction, and voice identification. To succeed in these complex tasks, we need a clear understanding of how WAV files can be analysed, which I cover in detail with handy Python code snippets.
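A minimal sketch of the feature-extraction step, assuming a local file speech.wav (a placeholder path) and an installed Librosa.

    import librosa
    import numpy as np

    # Load a WAV file at its native sampling rate.
    y, sr = librosa.load("speech.wav", sr=None)

    # Common features for tasks like gender or genre prediction.
    mfcc = librosa.feature.mfcc(y=y, sr=sr, n_mfcc=13)       # timbre
    zcr = librosa.feature.zero_crossing_rate(y)               # noisiness
    centroid = librosa.feature.spectral_centroid(y=y, sr=sr)  # "brightness"

    print("duration (s):", len(y) / sr)
    print("MFCC shape:", mfcc.shape)
    print("mean zero-crossing rate:", float(np.mean(zcr)))
    print("mean spectral centroid:", float(np.mean(centroid)))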

Machine Learning Tests using the Titanic dataset


In this post, we create and evaluate several machine learning models using the Titanic dataset. We compare the performance of Logistic Regression, Decision Tree, and Random Forest classifiers from Python's scikit-learn library with a neural network created in TensorFlow. The Random Forest performed best!
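A condensed sketch of the scikit-learn part of the comparison; it assumes a local titanic.csv and keeps only a few numeric features for illustration.

    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score
    from sklearn.tree import DecisionTreeClassifier

    # Illustrative preprocessing: a handful of numeric features only.
    df = pd.read_csv("titanic.csv")
    df["Sex"] = (df["Sex"] == "female").astype(int)
    X = df[["Pclass", "Sex", "Age", "Fare"]].fillna(0)
    y = df["Survived"]

    models = {
        "logistic_regression": LogisticRegression(max_iter=1000),
        "decision_tree": DecisionTreeClassifier(random_state=42),
        "random_forest": RandomForestClassifier(random_state=42),
    }
    for name, model in models.items():
        scores = cross_val_score(model, X, y, cv=5)
        print(f"{name}: mean accuracy={scores.mean():.3f}")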

Data exploration and analysis with Python Pandas


In data science, there are so many terms describing concepts and techniques that it is easy to lose a clear understanding of all the components and steps involved. In this post, I fill that gap by explaining two essential components of data science: data analysis and data exploration. To clarify things, I demonstrate both approaches, compare them, and provide Python code using Pandas dataframes and plots.
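Typical first exploration steps look roughly like this (the dataset path is a placeholder, and plotting relies on matplotlib).

    import pandas as pd

    df = pd.read_csv("data.csv")      # placeholder dataset path

    # Exploration: get a feel for the data.
    print(df.shape)
    df.info()                         # column types and non-null counts
    print(df.describe())              # summary statistics
    print(df.isna().sum())            # missing values per column

    # Quick visual exploration with the built-in plotting API.
    df.hist(figsize=(10, 6))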

Python coding with ChatGPT


In this post, I did some Python coding with ChatGPT. We coded a neuron and a simple neural network, and learned how to train it. I am pleased with the result. I think ChatGPT has excellent potential for CS students and all coders who want to upgrade their skills efficiently. Is it the end of Stack Overflow? We cannot see the future. However, we still need social interaction with humans, and AI cannot substitute for human communication.

Git Commands and a Contribution Workflow


I have created a list of arguably the most useful Git commands and an example contribution workflow. I have also found a great JavaScript application for learning Git branching!

Linters and Git Pre-commit


It's great to focus on code development while keeping the coding style right. This can be achieved with automatic formatting checks before committing files to the code repository. In this post, I describe how to use pre-commit with Git hooks and show a simple setup for checking Python files.

Python classes and pigeons


Happy 1st of September, dear visitors. I have decided to write a letter to you. The letter concerns pigeons and Python classes, covering essential OOP concepts such as inheritance, polymorphism, and encapsulation.
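To hint at what the letter covers, here is a tiny, invented class hierarchy illustrating inheritance, polymorphism, and encapsulation.

    class Bird:
        def __init__(self, name):
            self._name = name            # encapsulation: "protected" by convention

        def speak(self):
            return f"{self._name} makes a sound"


    class Pigeon(Bird):                  # inheritance: a Pigeon is a Bird
        def speak(self):                 # polymorphism: same method, new behaviour
            return f"{self._name} says coo"


    class CarrierPigeon(Pigeon):
        def deliver(self, message):
            return f"{self._name} delivers: {message}"


    for bird in [Bird("Some bird"), Pigeon("Gosha"), CarrierPigeon("Pasha")]:
        print(bird.speak())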

TensorFlow: Romancing with TensorFlow and NLP


In this post, we will create a simple poem-generation model with the Keras Sequential API.
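A minimal sketch of such a model's structure; the vocabulary size, sequence length, and layer sizes are placeholders, and training data preparation is omitted.

    import tensorflow as tf

    vocab_size, seq_len = 5000, 20   # placeholder vocabulary size and input length

    # A minimal next-word prediction model: embed tokens, read them with an LSTM,
    # and predict a distribution over the vocabulary.
    model = tf.keras.Sequential([
        tf.keras.layers.Input(shape=(seq_len,)),
        tf.keras.layers.Embedding(vocab_size, 64),
        tf.keras.layers.LSTM(128),
        tf.keras.layers.Dense(vocab_size, activation="softmax"),
    ])
    model.compile(loss="sparse_categorical_crossentropy", optimizer="adam")
    model.summary()
    # Training would call model.fit(token_sequences, next_tokens, epochs=...).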

TensorFlow: Evaluating the Saved Bird Species Prediction Model


In this post, I describe the process of in-depth model evaluation. I reuse the previously created EfficientNetB0 model, which was fine-tuned on the 400 Bird Species Kaggle dataset. As a result, I find out which bird species are not predicted well.

TensorFlow: Transfer Learning (Fine-Tuning) in Image Classification


We used a 400 bird species dataset to build bird species prediction models based on EfficientNetB0 from Keras. The baseline model already showed an excellent accuracy of 0.9845. However, data augmentation did not help improve accuracy, which dropped slightly to 0.9690. This model with a data augmentation layer was then partially unfrozen, retrained with a lower learning rate, and reached an accuracy of 0.9850.

TensorFlow: Transfer Learning (Feature Extraction) in Image Classification


Image classification is a complex task. However, we can approach the problem by reusing state-of-the-art pre-trained models. Using patterns previously learned by other models is called "transfer learning." This way, we can efficiently apply well-tested models, potentially leading to excellent performance.
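A minimal feature-extraction sketch with a frozen EfficientNetB0 backbone (the architecture used elsewhere in this series); the number of classes and image size are placeholders.

    import tensorflow as tf

    num_classes, img_size = 400, 224   # placeholders matching a 400-class dataset

    # Pre-trained backbone without its classification head; weights stay frozen.
    base = tf.keras.applications.EfficientNetB0(include_top=False, weights="imagenet")
    base.trainable = False

    inputs = tf.keras.Input(shape=(img_size, img_size, 3))
    x = base(inputs, training=False)                   # feature extraction only
    x = tf.keras.layers.GlobalAveragePooling2D()(x)
    outputs = tf.keras.layers.Dense(num_classes, activation="softmax")(x)

    model = tf.keras.Model(inputs, outputs)
    model.compile(optimizer="adam",
                  loss="categorical_crossentropy",
                  metrics=["accuracy"])
    # model.fit(train_data, validation_data=val_data, epochs=5) with an image dataset.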

TensorFlow: Convolutional Neural Networks for Image Classification


In this post, I demonstrate CNN usage for bird recognition using TensorFlow and the Kaggle 400 bird species dataset. We observe how the model works with the original and augmented images.

TensorFlow: Evaluating the Regression Model


In this post, we evaluate four regression models built with TensorFlow. The MAE and MSE error metrics were used to compare the Sequential models while searching for the best neural network architecture with respect to the chosen hyperparameters.

TensorFlow: Regression Model


I have described regression modeling in TensorFlow. We predicted a numerical value and adjusted hyperparameters to improve the performance of a simple neural network. We generated a dataset, demonstrated a simple split into training and testing sets, visualised the data and the created neural network, and evaluated the model on the testing dataset.

TensorFlow: Global and Operation-level Seeds


When training machine learning models, we want to avoid any ordering biases in the data. In some cases, such as cross-validation experiments, it is essential to shuffle the data while ensuring that its order stays the same between different runs or system restarts. We can use operation-level and global seeds to achieve reproducible results.
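A short illustration of the two kinds of seeds:

    import tensorflow as tf

    # Global seed: affects all ops in this program that rely on randomness.
    tf.random.set_seed(42)
    print(tf.random.uniform([2]))        # reproducible across runs of the script

    # Operation-level seed: pins the randomness of a single op.
    data = tf.constant([1, 2, 3, 4, 5])
    print(tf.random.shuffle(data, seed=7))

    # With both a global and an operation-level seed set, the shuffled order
    # stays the same between different runs or system restarts.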

Tensors in TensorFlow


TensorFlow is a free, open-source library for machine learning created by Google Brain. TensorFlow has excellent functionality for building deep neural networks. I have chosen TensorFlow because it is robust, efficient, and can be used with Python. In this post, I write about how we can create, shuffle, and index tensors and get information about them, with simple examples.
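A few of the basics covered in the post:

    import tensorflow as tf

    # Creating tensors.
    scalar = tf.constant(7)
    matrix = tf.constant([[1., 2.], [3., 4.]])
    random_tensor = tf.random.uniform(shape=(2, 3))

    # Information about a tensor.
    print(matrix.shape, matrix.dtype, tf.rank(matrix).numpy())

    # Indexing works much like NumPy.
    print(matrix[0])       # first row
    print(matrix[:, 1])    # second column

    # Shuffling along the first dimension.
    print(tf.random.shuffle(matrix))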

Artificial Neural Networks


Artificial neural networks (ANNs) are the cornerstone of deep learning algorithms. The name and the architecture are adopted from the human brain's neural network. ANNs are designed to simulate human reasoning based on how neurons communicate, and they consist of a set of interconnected artificial neurons.

Python Programming Language


Python is relatively easy to learn and beginner-friendly. I like Python because you can program any kind of project with it. It is open-source and free for anyone to use. Python has well-tested machine learning libraries and a very supportive community. In this post, I give an overview of the basic syntax of the Python programming language. It will be useful for beginners or for people moving to Python from another programming language.

Tools and Data to Experiment with Machine Learning


The open-source Python library scikit-learn provides a comprehensive selection of machine learning techniques (regression, classification, clustering), feature selection, metrics, preprocessing, and other functionality. At the moment, scikit-learn lacks deep learning functionality; however, we can use TensorFlow with the Scikit Flow wrapper to create neural networks following the scikit-learn approach.