<?xml version="1.0" encoding="utf-8" standalone="yes"?><rss version="2.0" xmlns:atom="http://www.w3.org/2005/Atom"><channel><title>Posts on Antoine Boucher</title><link>https://antoineboucher.info/CV/blog/posts/</link><description>Recent content in Posts on Antoine Boucher</description><generator>Hugo</generator><language>en-us</language><lastBuildDate>Sat, 18 Apr 2026 12:00:00 -0400</lastBuildDate><atom:link href="https://antoineboucher.info/CV/blog/posts/index.xml" rel="self" type="application/rss+xml"/><item><title>Exploring movie similarities with vector search algorithms</title><link>https://antoineboucher.info/CV/blog/posts/vector-databases-similar-movies/</link><pubDate>Mon, 13 Apr 2026 12:00:00 -0400</pubDate><guid>https://antoineboucher.info/CV/blog/posts/vector-databases-similar-movies/</guid><description>&lt;p&gt;This page is a &lt;strong&gt;single walkthrough&lt;/strong&gt; of a movie-similarity thread: embeddings and nearest neighbors, then a second engine with &lt;strong&gt;two vector meanings&lt;/strong&gt;, then &lt;strong&gt;retrieval + generation&lt;/strong&gt; tied to your own rows. Short animated walkthroughs from that work live in the companion notebooks and Medium series rather than in this static site bundle.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;At a glance&lt;/strong&gt;&lt;/p&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Part 1 —&lt;/strong&gt; Build a &lt;strong&gt;PostgreSQL + pgvector&lt;/strong&gt; catalog from structured movie data; run &lt;strong&gt;kNN in SQL&lt;/strong&gt; with cosine (and other) distances.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Part 2 —&lt;/strong&gt; Reuse “vectors = similarity” with &lt;strong&gt;Qdrant + MovieLens&lt;/strong&gt;: &lt;strong&gt;dense&lt;/strong&gt; text embeddings for “movies like this phrasing,” &lt;strong&gt;sparse&lt;/strong&gt; rating vectors for “users like you.”&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Part 3 —&lt;/strong&gt; Use the same pgvector-backed rows as the &lt;strong&gt;retrieval layer&lt;/strong&gt; for a small &lt;strong&gt;RAG&lt;/strong&gt; flow (&lt;strong&gt;LangChain&lt;/strong&gt; + &lt;strong&gt;Ollama&lt;/strong&gt;): question → top rows → grounded answer.&lt;/li&gt;
&lt;/ul&gt;
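&lt;p&gt;To make the distance concrete: pgvector’s cosine distance operator orders rows by 1&amp;nbsp;−&amp;nbsp;cosine similarity. A minimal pure-Python sketch of that ranking (the three-dimensional “embeddings” and titles below are toy values, not the catalog from the walkthrough):&lt;/p&gt;

```python
import math

def cosine_distance(a, b):
    """Cosine distance = 1 - cosine similarity; this is the quantity
    pgvector's cosine operator ranks rows by."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return 1.0 - dot / (norm_a * norm_b)

# Rank a tiny toy catalog against a query vector (nearest first).
catalog = {
    "Movie A": [1.0, 0.0, 0.0],
    "Movie B": [0.9, 0.1, 0.0],
    "Movie C": [0.0, 1.0, 0.0],
}
query = [1.0, 0.0, 0.0]
ranked = sorted(catalog, key=lambda t: cosine_distance(catalog[t], query))
```

&lt;p&gt;In SQL, the same ranking is a one-liner over an embedding column: &lt;code&gt;ORDER BY embedding &amp;lt;=&amp;gt; '[1,0,0]' LIMIT 5&lt;/code&gt;.&lt;/p&gt;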
&lt;h2 id="visualizations"&gt;Visualizations&lt;/h2&gt;
&lt;p&gt;&lt;em&gt;Part 1 — pgvector / SQL: exploring similar movies from embeddings and distance metrics.&lt;/em&gt;&lt;/p&gt;</description></item><item><title>From BMC to pitch — QcES journey notes (Spring 2024)</title><link>https://antoineboucher.info/CV/blog/posts/qces-lean-discovery-pitch/</link><pubDate>Mon, 13 Apr 2026 10:00:00 -0400</pubDate><guid>https://antoineboucher.info/CV/blog/posts/qces-lean-discovery-pitch/</guid><description>&lt;p&gt;&lt;strong&gt;&lt;a href="https://antoineboucher.info/CV/blog/posts/qces-lean-discovery-pitch/"&gt;Full article in French&lt;/a&gt;&lt;/strong&gt; — same slug; you can also switch to &lt;strong&gt;FR&lt;/strong&gt; in the header.&lt;/p&gt;
&lt;h2 id="at-a-glance"&gt;At a glance&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;The &lt;strong&gt;Business Model Canvas (BMC)&lt;/strong&gt; describes how an organization &lt;strong&gt;creates, delivers, and captures&lt;/strong&gt; value; it is tested against three questions: &lt;strong&gt;desirability&lt;/strong&gt;, &lt;strong&gt;feasibility&lt;/strong&gt;, and &lt;strong&gt;viability&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;You usually start with &lt;strong&gt;customer segments&lt;/strong&gt; and &lt;strong&gt;value proposition&lt;/strong&gt;, then iterate—the model &lt;strong&gt;evolves&lt;/strong&gt; with the market.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Segmentation&lt;/strong&gt; makes the market &lt;strong&gt;concrete&lt;/strong&gt; (B2B vs B2C, crisp criteria) instead of vague labels (“doctors”, “parents”).&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Interviews&lt;/strong&gt; are for &lt;strong&gt;discovery&lt;/strong&gt;: the goal is to &lt;strong&gt;learn&lt;/strong&gt;, not to sell; &lt;strong&gt;connect&lt;/strong&gt;, don’t convince—and stay attached to the &lt;strong&gt;problem&lt;/strong&gt;, not your first idea of the solution.&lt;/li&gt;
&lt;li&gt;Structured &lt;strong&gt;feedback&lt;/strong&gt; (strengths + one growth angle) and a &lt;strong&gt;short spoken pitch&lt;/strong&gt; (no slides) clarify the idea early.&lt;/li&gt;
&lt;li&gt;A &lt;strong&gt;product roadmap&lt;/strong&gt; is a &lt;strong&gt;strategic view over time&lt;/strong&gt;, not a detailed project plan; it aligns vision, audience, horizon, metrics, and resources.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;PoC&lt;/strong&gt;, &lt;strong&gt;prototype&lt;/strong&gt;, and &lt;strong&gt;MVP&lt;/strong&gt; play different roles: technology check, user interaction learning, then a &lt;strong&gt;first market version&lt;/strong&gt; you can stress-test with real users or buyers.&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Market and value proposition&lt;/strong&gt;: account for external forces (macro, industry, trends) and express value as &lt;strong&gt;offer + customer benefit&lt;/strong&gt;.&lt;/li&gt;
&lt;li&gt;A solid &lt;strong&gt;pitch&lt;/strong&gt; often follows &lt;strong&gt;Hook → Believe → Join&lt;/strong&gt;: lead with the problem, show credibility and differentiation, then make a &lt;strong&gt;specific ask&lt;/strong&gt;.&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;&lt;em&gt;This is a personal write-up based on QcES materials (Spring 2024 cohort) and facilitators; it is not an official program document.&lt;/em&gt;&lt;/p&gt;</description></item><item><title>Python library for MarketWatch virtual trading</title><link>https://antoineboucher.info/CV/blog/posts/marketwatch-python-trading/</link><pubDate>Mon, 13 Apr 2026 10:00:00 -0400</pubDate><guid>https://antoineboucher.info/CV/blog/posts/marketwatch-python-trading/</guid><description>&lt;p&gt;I published &lt;strong&gt;&lt;a href="https://pypi.org/project/marketwatch/"&gt;marketwatch&lt;/a&gt;&lt;/strong&gt; on PyPI: a small Python client for the &lt;a href="https://www.marketwatch.com"&gt;MarketWatch&lt;/a&gt; &lt;strong&gt;virtual stock game&lt;/strong&gt; (paper trading), not live brokerage access. If you want to script watchlists, pull game or portfolio data, or experiment with automation against the game, it wraps the flows in a straightforward API.&lt;/p&gt;
&lt;h2 id="links"&gt;Links&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;&lt;strong&gt;Package:&lt;/strong&gt; &lt;a href="https://pypi.org/project/marketwatch/"&gt;pypi.org/project/marketwatch&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Documentation:&lt;/strong&gt; &lt;a href="https://antoinebou12.github.io/marketwatch/"&gt;antoinebou12.github.io/marketwatch&lt;/a&gt;&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Source &amp;amp; issues:&lt;/strong&gt; &lt;a href="https://github.com/antoinebou12/marketwatch"&gt;github.com/antoinebou12/marketwatch&lt;/a&gt;&lt;/li&gt;
&lt;/ul&gt;
&lt;h2 id="what-it-can-do"&gt;What it can do&lt;/h2&gt;
&lt;ul&gt;
&lt;li&gt;Create and manage &lt;strong&gt;watchlists&lt;/strong&gt;&lt;/li&gt;
&lt;li&gt;Read &lt;strong&gt;game&lt;/strong&gt; details and settings&lt;/li&gt;
&lt;li&gt;Inspect &lt;strong&gt;portfolio&lt;/strong&gt;, positions, and pending orders&lt;/li&gt;
&lt;li&gt;&lt;strong&gt;Buy&lt;/strong&gt; and &lt;strong&gt;sell&lt;/strong&gt; (in-game)&lt;/li&gt;
&lt;li&gt;Fetch the &lt;strong&gt;leaderboard&lt;/strong&gt; for a game&lt;/li&gt;
&lt;/ul&gt;
&lt;p&gt;Useful if you are exploring automated strategies or small bots &lt;strong&gt;inside the game’s rules&lt;/strong&gt;—see the docs for method names and return shapes.&lt;/p&gt;</description></item><item><title>GraphQuon 2025 — University of Toronto</title><link>https://antoineboucher.info/CV/blog/posts/graphquon-2025-university-of-toronto/</link><pubDate>Sat, 15 Nov 2025 10:00:00 -0500</pubDate><guid>https://antoineboucher.info/CV/blog/posts/graphquon-2025-university-of-toronto/</guid><description>&lt;p&gt;&lt;strong&gt;GraphQuon&lt;/strong&gt; is the annual Quebec–Ontario pre-SIGGRAPH workshop. The &lt;strong&gt;2025&lt;/strong&gt; edition took place &lt;strong&gt;15–16 November 2025&lt;/strong&gt; at the &lt;strong&gt;University of Toronto&lt;/strong&gt;.&lt;/p&gt;
&lt;p&gt;Series page: &lt;a href="https://www.dgp.toronto.edu/graphquon/"&gt;Dynamic Graphics Project — GraphQuon&lt;/a&gt;.&lt;/p&gt;</description></item><item><title>A Method to Install Python Packages for Add-ons &amp; Plugins in Blender (Windows, Blender 4.2+)</title><link>https://antoineboucher.info/CV/blog/posts/blender-python-packages/</link><pubDate>Sat, 08 Feb 2025 12:00:00 -0400</pubDate><guid>https://antoineboucher.info/CV/blog/posts/blender-python-packages/</guid><description>&lt;h2 id="introduction"&gt;Introduction&lt;/h2&gt;
&lt;p&gt;Blender is a powerhouse for 3D creation, offering a Python API that allows users to extend its functionality with scripts, add-ons, and plugins. However, one challenge developers face is &lt;strong&gt;installing external Python packages&lt;/strong&gt; within Blender’s &lt;strong&gt;isolated Python environment&lt;/strong&gt;.&lt;/p&gt;
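&lt;p&gt;The crux is that packages must be installed with Blender’s own interpreter, not the system one. A minimal sketch of that baseline (assuming a recent Blender where &lt;code&gt;sys.executable&lt;/code&gt; points at the bundled Python; the package name is only an example):&lt;/p&gt;

```python
import subprocess
import sys

def pip_command(package):
    """Build the pip invocation for the interpreter running this script.

    Inside recent Blender versions, sys.executable is Blender's bundled
    Python, so the install lands in Blender's site-packages.
    """
    return [sys.executable, "-m", "pip", "install", package]

def install(package):
    """Run pip inside the current interpreter; raises on failure."""
    subprocess.check_call(pip_command(package))
```

&lt;p&gt;Older Blender builds exposed the bundled interpreter as &lt;code&gt;bpy.app.binary_path_python&lt;/code&gt; instead, which is part of why a more general method is worth having.&lt;/p&gt;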
&lt;p&gt;Unlike system-wide Python installations, Blender bundles its own Python interpreter, making standard package installations tricky. This article presents &lt;strong&gt;a more general and robust method&lt;/strong&gt; to install Python dependencies for Blender add-ons and plugins — ensuring a smooth workflow across different versions.&lt;/p&gt;</description></item><item><title>GraphQuon 2024 — ÉTS (Montréal)</title><link>https://antoineboucher.info/CV/blog/posts/graphquon-2024-ets/</link><pubDate>Sat, 09 Nov 2024 09:00:00 -0500</pubDate><guid>https://antoineboucher.info/CV/blog/posts/graphquon-2024-ets/</guid><description>&lt;p&gt;&lt;strong&gt;GraphQuon&lt;/strong&gt; (formerly MOTOGRAPH) is the annual Quebec–Ontario pre-SIGGRAPH workshop for East-Canadian computer graphics labs. The &lt;strong&gt;2024&lt;/strong&gt; edition ran &lt;strong&gt;9–10 November 2024&lt;/strong&gt; at &lt;strong&gt;École de technologie supérieure (ÉTS)&lt;/strong&gt; in Montréal.&lt;/p&gt;
&lt;p&gt;Official site: &lt;a href="https://graphquon.github.io/"&gt;graphquon.github.io&lt;/a&gt;.&lt;/p&gt;</description></item><item><title>Multiple Technical Indicators Backtesting on Multiple Tickers using Python</title><link>https://antoineboucher.info/CV/blog/posts/multiple-indicators-backtesting/</link><pubDate>Thu, 30 May 2024 15:00:00 -0400</pubDate><guid>https://antoineboucher.info/CV/blog/posts/multiple-indicators-backtesting/</guid><description>&lt;h2 id="introduction"&gt;Introduction&lt;/h2&gt;
&lt;p&gt;In this report, we present an experiment with technical indicators using the BatchBacktesting project available on GitHub at the following link: &lt;a href="https://github.com/AlgoETS/BatchBacktesting"&gt;BatchBacktesting&lt;/a&gt;.&lt;/p&gt;
&lt;h2 id="installing-dependencies"&gt;Installing Dependencies&lt;/h2&gt;
&lt;p&gt;To get started, install the necessary libraries:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;!pip install pandas numpy httpx rich&lt;/code&gt;&lt;/pre&gt;
&lt;h2 id="importing-modules"&gt;Importing Modules&lt;/h2&gt;
&lt;p&gt;Here are the modules to import for the script:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;import pandas as pd
import numpy as np
from datetime import datetime
import httpx
import concurrent.futures
import glob
import warnings
from rich.progress import track&lt;/code&gt;&lt;/pre&gt;</description></item><item><title>Economics of LEGO Sets with Data Science</title><link>https://antoineboucher.info/CV/blog/posts/economics-lego-data-science/</link><pubDate>Thu, 30 May 2024 12:00:00 -0400</pubDate><guid>https://antoineboucher.info/CV/blog/posts/economics-lego-data-science/</guid><description>&lt;p&gt;As a data enthusiast and LEGO fan, I decided to delve into the world of LEGO using historical data. My goal was to understand the trends, pricing, and characteristics of LEGO sets over time. Using datasets from Rebrickable and analysis tools like Pandas, Matplotlib, and Scikit-Learn, I conducted a comprehensive analysis. Here’s a journey through the history and economics of LEGO sets.&lt;/p&gt;
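&lt;p&gt;As a sketch of the kind of question the analysis asks — average part count per release year — here is a toy frame shaped like Rebrickable’s &lt;code&gt;sets.csv&lt;/code&gt; (column names assumed from that dataset; the values are made up):&lt;/p&gt;

```python
import pandas as pd

# Toy stand-in for Rebrickable's sets.csv (assumed columns:
# set_num, name, year, num_parts); values are illustrative only.
sets = pd.DataFrame({
    "set_num": ["100-1", "200-1", "300-1", "400-1"],
    "name": ["Castle", "Spaceship", "Town", "Pirates"],
    "year": [1980, 1980, 1990, 1990],
    "num_parts": [300, 500, 420, 680],
})

# Average part count per release year -- the kind of trend tracked below.
avg_parts = sets.groupby("year")["num_parts"].mean()
```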
&lt;h2 id="dataset-overview"&gt;Dataset Overview&lt;/h2&gt;
&lt;p&gt;The datasets used for this analysis include various aspects of LEGO sets, parts, and themes:&lt;/p&gt;</description></item><item><title>Experimenting with technical indicators using Python and backtesting</title><link>https://antoineboucher.info/CV/blog/posts/experimentation-indicateurs-backtesting/</link><pubDate>Tue, 14 May 2024 20:00:00 -0400</pubDate><guid>https://antoineboucher.info/CV/blog/posts/experimentation-indicateurs-backtesting/</guid><description>&lt;h2 id="introduction"&gt;Introduction&lt;/h2&gt;
&lt;p&gt;In this report, we present an experiment with technical indicators using the BatchBacktesting project available on GitHub at the following link: &lt;a href="https://github.com/AlgoETS/BatchBacktesting"&gt;BatchBacktesting&lt;/a&gt;.&lt;/p&gt;
&lt;h2 id="installing-dependencies"&gt;Installing Dependencies&lt;/h2&gt;
&lt;p&gt;To get started, install the necessary libraries:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;!pip install pandas numpy httpx rich&lt;/code&gt;&lt;/pre&gt;
&lt;h2 id="importing-modules"&gt;Importing Modules&lt;/h2&gt;
&lt;p&gt;Here are the modules to import for the script:&lt;/p&gt;
&lt;pre&gt;&lt;code&gt;import pandas as pd
import numpy as np
from datetime import datetime
import httpx
import concurrent.futures
import glob
import warnings
from rich.progress import track&lt;/code&gt;&lt;/pre&gt;</description></item><item><title>Making Caddy, AWS EC2, CloudWatch, Step Functions, and Lambda Work Together</title><link>https://antoineboucher.info/CV/blog/posts/caddy-ec2-cloudwatch-lambda/</link><pubDate>Tue, 14 May 2024 18:00:00 -0400</pubDate><guid>https://antoineboucher.info/CV/blog/posts/caddy-ec2-cloudwatch-lambda/</guid><description>&lt;h2 id="introduction"&gt;Introduction&lt;/h2&gt;
&lt;p&gt;Creating a robust and scalable web infrastructure can be both complex and costly. However, with the right tools and a little bit of creativity, you can build a cost-effective and efficient solution. In this article, we will walk through setting up a Caddy web server on AWS EC2, integrating it with AWS CloudWatch for monitoring, and using AWS Step Functions and Lambda to automate and streamline operations. This guide aims to provide a comprehensive approach to setting up a low-cost dashboard using these technologies.&lt;/p&gt;</description></item><item><title>A Journey to AWS Certified Cloud Practitioner</title><link>https://antoineboucher.info/CV/blog/posts/aws-certified-cloud-practitioner/</link><pubDate>Tue, 14 May 2024 16:00:00 -0400</pubDate><guid>https://antoineboucher.info/CV/blog/posts/aws-certified-cloud-practitioner/</guid><description>&lt;p&gt;I’m thrilled to share that I’ve recently obtained the AWS Certified Cloud Practitioner certification from Amazon Web Services (AWS)! This accomplishment represents a significant milestone in my professional journey, and I want to take this opportunity to highlight some of the incredible tools that made this achievement possible.&lt;/p&gt;
&lt;p&gt;AWS Skill Builder and AWS Cloud Quest were instrumental in my preparation, providing an engaging and comprehensive learning experience. In this article, I’ll share my study plan and how these AWS tools can help anyone aiming to enhance their cloud computing skills.&lt;/p&gt;</description></item><item><title>Predicting Stock Prices with Monte Carlo Simulations</title><link>https://antoineboucher.info/CV/blog/posts/predicting-stock-prices-monte-carlo/</link><pubDate>Tue, 14 May 2024 09:00:00 -0400</pubDate><guid>https://antoineboucher.info/CV/blog/posts/predicting-stock-prices-monte-carlo/</guid><description>&lt;h2 id="introduction"&gt;Introduction&lt;/h2&gt;
&lt;p&gt;In finance, decisions are rarely about a single “forecast” price: they are about &lt;strong&gt;ranges&lt;/strong&gt;, &lt;strong&gt;tail risk&lt;/strong&gt;, and &lt;strong&gt;how wrong&lt;/strong&gt; simple models can be. This article walks through a &lt;strong&gt;Monte Carlo path simulation&lt;/strong&gt; in Python: we estimate drift and volatility from historical closes, simulate many future price paths (a geometric Brownian–style discrete step), and summarize the result as a &lt;strong&gt;distribution&lt;/strong&gt;—the right object for risk-style questions (bands, percentiles, coverage against a hold-out period).&lt;/p&gt;</description></item><item><title>Kinectron + p5.js — sketch controls and GIF export</title><link>https://antoineboucher.info/CV/blog/posts/kinectron-p5-sketch-gif/</link><pubDate>Fri, 15 Mar 2024 10:00:00 -0400</pubDate><guid>https://antoineboucher.info/CV/blog/posts/kinectron-p5-sketch-gif/</guid><description>&lt;h3 id="introduction"&gt;Introduction&lt;/h3&gt;
&lt;p&gt;This tutorial will guide you through setting up a Kinectron sketch in p5.js, which includes functionality for stopping and playing the sketch, as well as saving it as a GIF.&lt;/p&gt;
&lt;h3 id="prerequisites"&gt;Prerequisites&lt;/h3&gt;
&lt;ul&gt;
&lt;li&gt;Basic knowledge of JavaScript and p5.js.&lt;/li&gt;
&lt;li&gt;The p5.js and Kinectron libraries installed.&lt;/li&gt;
&lt;li&gt;A Kinect v2 or Azure Kinect DK.&lt;/li&gt;
&lt;li&gt;A Kinectron server running.&lt;/li&gt;
&lt;li&gt;A local or online environment that supports JavaScript and p5.js.&lt;/li&gt;
&lt;/ul&gt;
&lt;h3 id="step-1-set-up-your-environment"&gt;Step 1: Set Up Your Environment&lt;/h3&gt;
&lt;p&gt;Ensure you have the p5.js and Kinectron libraries included in your HTML file.&lt;/p&gt;</description></item><item><title>Byzantium’s first workshop — Solidity and an ERC-20 token on Ethereum</title><link>https://antoineboucher.info/CV/blog/posts/byzantium-solidity-ethereum-workshop/</link><pubDate>Mon, 11 Mar 2024 18:30:00 -0400</pubDate><guid>https://antoineboucher.info/CV/blog/posts/byzantium-solidity-ethereum-workshop/</guid><description>&lt;p&gt;&lt;strong&gt;Byzantium&lt;/strong&gt; ran its &lt;strong&gt;first Ethereum workshop&lt;/strong&gt;: a hands-on session where attendees went from a &lt;strong&gt;Solidity / ERC-20&lt;/strong&gt; starter (via &lt;strong&gt;OpenZeppelin&lt;/strong&gt;) to &lt;strong&gt;deploying a token&lt;/strong&gt; and &lt;strong&gt;swapping transfers&lt;/strong&gt; with each other. &lt;strong&gt;Khalil Anis Zabat&lt;/strong&gt; led the session.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;&lt;a href="https://antoineboucher.info/CV/blog/posts/byzantium-solidity-ethereum-workshop/"&gt;Full article in French&lt;/a&gt;&lt;/strong&gt; (same slug — you can also switch to &lt;strong&gt;FR&lt;/strong&gt; in the header).&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;Links:&lt;/strong&gt; &lt;a href="https://www.linkedin.com/posts/antoineboucher12_retour-sur-notre-tout-premier-workshop-activity-7173128307155156992-tQy4"&gt;LinkedIn thread / Byzantium recap&lt;/a&gt; · &lt;a href="https://lnkd.in/e-9T5-MX"&gt;Deployed contract (short link)&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;Thanks to the facilitator and everyone who joined.&lt;/p&gt;</description></item><item><title>Create a portfolio with Hugo (week 1)</title><link>https://antoineboucher.info/CV/blog/posts/portfolio-hugo-week-1/</link><pubDate>Sat, 06 Jan 2024 10:00:00 -0400</pubDate><guid>https://antoineboucher.info/CV/blog/posts/portfolio-hugo-week-1/</guid><description>&lt;h2 id="introduction"&gt;Introduction&lt;/h2&gt;
&lt;p&gt;Welcome to my personal blog, a chronicle of my journey in developing a multifaceted portfolio using Hugo. As a software engineer, I am excited to share the nuances of building a dynamic and interactive website, where my professional skills intersect with personal passions. This inaugural post marks the beginning of a series in which I&amp;rsquo;ll delve into various aspects of web development, data analysis, and the integration of advanced web technologies.&lt;/p&gt;</description></item><item><title>LiDAR apartment scan with Rhino on iPhone</title><link>https://antoineboucher.info/CV/blog/posts/rhino-lidar-apartment-scan/</link><pubDate>Tue, 02 Jan 2024 10:00:00 -0400</pubDate><guid>https://antoineboucher.info/CV/blog/posts/rhino-lidar-apartment-scan/</guid><description>&lt;h2 id="introduction"&gt;Introduction&lt;/h2&gt;
&lt;p&gt;We used the Rhino app on an iPhone with LiDAR to scan our apartment and make clearer decisions about layout and furniture.&lt;/p&gt;
&lt;h3 id="lidar-and-rhino"&gt;LiDAR and Rhino&lt;/h3&gt;
&lt;p&gt;LiDAR captures depth quickly; Rhino on iPhone turns those scans into workable 3D geometry for review on device.&lt;/p&gt;
&lt;h3 id="process"&gt;Process&lt;/h3&gt;
&lt;p&gt;We walked room by room while the phone mapped space; Rhino updated the model as we moved.&lt;/p&gt;
&lt;h3 id="screenshots"&gt;Screenshots&lt;/h3&gt;
&lt;p&gt;&lt;img src="https://antoineboucher.info/CV/blog/posts/rhino-lidar-apartment-scan/images/Screenshot-from-2024-01-02-22-41-35.png" alt="Scan workflow"&gt;&lt;/p&gt;
&lt;p&gt;&lt;img src="https://antoineboucher.info/CV/blog/posts/rhino-lidar-apartment-scan/images/Screenshot-from-2024-01-02-22-42-01.png" alt="Model review"&gt;&lt;/p&gt;
&lt;h3 id="still-frames-from-the-scan"&gt;Still frames from the scan&lt;/h3&gt;
&lt;p&gt;&lt;img src="https://antoineboucher.info/CV/blog/posts/rhino-lidar-apartment-scan/images/1781a950-9e10-4bcd-b142-1711d9e73881.jpg" alt="Room capture"&gt;&lt;/p&gt;</description></item><item><title>My journey in software engineering</title><link>https://antoineboucher.info/CV/blog/posts/software-engineering-journey/</link><pubDate>Sat, 30 Dec 2023 10:00:00 -0400</pubDate><guid>https://antoineboucher.info/CV/blog/posts/software-engineering-journey/</guid><description>&lt;p&gt;This site’s bio sums up the slice of the field I care about most: &lt;strong&gt;backend&lt;/strong&gt;, &lt;strong&gt;platform&lt;/strong&gt;, and &lt;strong&gt;DevSecOps&lt;/strong&gt;. This post is a longer look at how I think about that journey — not a timeline of jobs, but the ideas that kept showing up once I stopped treating “shipping features” as the only scoreboard.&lt;/p&gt;
&lt;h2 id="from-features-to-systems"&gt;From features to systems&lt;/h2&gt;
&lt;p&gt;Early on, progress often feels linear: tickets closed, endpoints added, screens shipped. That work matters. Over time, though, the interesting problems sit one level up: how services talk to each other, how failures propagate, how a change in one team’s repo affects everyone else on Monday morning. Backend engineering stops being “write the handler” and becomes “design something that stays understandable when you’re not in the room.”&lt;/p&gt;</description></item><item><title>Creating a professional résumé with JSON Resume</title><link>https://antoineboucher.info/CV/blog/posts/professional-resume-json-resume/</link><pubDate>Sat, 10 Sep 2022 10:00:00 -0400</pubDate><guid>https://antoineboucher.info/CV/blog/posts/professional-resume-json-resume/</guid><description>&lt;h2 id="introduction"&gt;Introduction&lt;/h2&gt;
&lt;p&gt;In today&amp;rsquo;s digital world, having an online resume is crucial for showcasing your professional profile. One effective way to create an online resume is by using the JSON Resume npm package. This package allows you to write your resume in JSON and then export it to various formats such as HTML, PDF, or even integrate it into your personal website.&lt;/p&gt;
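&lt;p&gt;As a sketch of what such a resume looks like on disk — the section and field names follow the JSON Resume schema (&lt;code&gt;basics&lt;/code&gt;, &lt;code&gt;work&lt;/code&gt;, &lt;code&gt;education&lt;/code&gt;); the values are placeholders:&lt;/p&gt;

```python
import json

# Minimal resume shaped after the JSON Resume schema's top-level sections.
# Field names follow jsonresume.org (v1.0.0); the values are placeholders.
resume = {
    "basics": {
        "name": "Jane Doe",
        "label": "Software Engineer",
        "email": "jane@example.com",
    },
    "work": [
        {
            "name": "Acme Corp",          # organization name
            "position": "Backend Developer",
            "startDate": "2020-01-01",
        }
    ],
    "education": [],
}

# Serialize to the JSON file the tooling consumes.
print(json.dumps(resume, indent=2))
```

&lt;p&gt;Pointing the JSON Resume CLI at a file like this is what produces the HTML or PDF exports described below.&lt;/p&gt;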
&lt;h2 id="json-resume-format"&gt;JSON Resume Format&lt;/h2&gt;
&lt;p&gt;JSON Resume is a community-driven open-source initiative to create a JSON-based standard for resumes. The format is lightweight and easy to use, making it perfect for building tools around it.&lt;/p&gt;</description></item><item><title>D2C OpenAI plugin — diagrams with PlantUML, Mermaid, and D2</title><link>https://antoineboucher.info/CV/blog/posts/d2c-openai-diagram-plugin/</link><pubDate>Tue, 06 Sep 2022 10:00:00 -0400</pubDate><guid>https://antoineboucher.info/CV/blog/posts/d2c-openai-diagram-plugin/</guid><description>&lt;p&gt;Github: &lt;a href="https://lnkd.in/en3dSVuQ"&gt;https://lnkd.in/en3dSVuQ&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;Plugin URL: &lt;a href="https://lnkd.in/exVNZMnT"&gt;https://lnkd.in/exVNZMnT&lt;/a&gt;&lt;/p&gt;
&lt;p&gt;D2COpenAIPlugin is a plugin for ChatGPT that enables users to generate diagrams using PlantUML, Mermaid, and D2. It extends ChatGPT with a straightforward way to produce a wide range of diagrams directly from a conversation.&lt;/p&gt;
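&lt;p&gt;As an illustration of the kind of output involved, here is a hand-written Mermaid sequence diagram of a plausible request flow (the flow itself is illustrative, not the plugin’s documented protocol):&lt;/p&gt;

```mermaid
sequenceDiagram
    participant User
    participant ChatGPT
    participant Plugin as D2COpenAIPlugin
    User->>ChatGPT: "Draw a login sequence diagram"
    ChatGPT->>Plugin: diagram source (Mermaid / PlantUML / D2)
    Plugin-->>ChatGPT: rendered diagram
    ChatGPT-->>User: diagram in the chat
```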
&lt;p&gt;For a &lt;strong&gt;prompt-in-the-chat&lt;/strong&gt; workflow (AIPRM template, cache hit/miss sequence examples, and canvas-tool tips), see &lt;strong&gt;&lt;a href="https://antoineboucher.info/CV/blog/posts/chatgpt-airprm-sequence-diagrams/"&gt;Diagram prompts with ChatGPT and AIPRM&lt;/a&gt;&lt;/strong&gt; — complementary to this plugin-based approach.&lt;/p&gt;
&lt;p&gt;&lt;img src="https://antoineboucher.info/CV/blog/posts/d2c-openai-diagram-plugin/images/1692387139389.jpeg" alt="1692387139389.jpeg"&gt;&lt;/p&gt;
&lt;p&gt;🤖 ChatGPT UML Plugins - DEMO&lt;/p&gt;</description></item><item><title>Diagram prompts with ChatGPT and AIPRM (PlantUML, Mermaid, and more)</title><link>https://antoineboucher.info/CV/blog/posts/chatgpt-airprm-sequence-diagrams/</link><pubDate>Tue, 06 Sep 2022 10:00:00 -0400</pubDate><guid>https://antoineboucher.info/CV/blog/posts/chatgpt-airprm-sequence-diagrams/</guid><description>&lt;p&gt;The &lt;a href="https://www.aiprm.com/"&gt;AIPRM&lt;/a&gt; browser extension gives you reusable prompt templates inside ChatGPT. Combined with a small &lt;strong&gt;structured prompt&lt;/strong&gt; (diagram type, what to draw, why, and which tool), you get consistent output whether you want text-first formats like &lt;strong&gt;PlantUML&lt;/strong&gt; or &lt;strong&gt;Mermaid&lt;/strong&gt;, or a recipe for redrawing the same flow in a canvas tool.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;&lt;a href="https://antoineboucher.info/CV/blog/posts/chatgpt-airprm-sequence-diagrams/"&gt;Full article in French&lt;/a&gt;&lt;/strong&gt; (same slug — you can also switch to &lt;strong&gt;FR&lt;/strong&gt; in the site header).&lt;/p&gt;
&lt;h2 id="aiprm-prompt-template-copy-and-adapt"&gt;AIPRM prompt template (copy and adapt)&lt;/h2&gt;
&lt;p&gt;Fill one line per dimension. You can paste the block below into ChatGPT (with or without AIPRM) and edit the bracketed values.&lt;/p&gt;</description></item><item><title>Expo Manger Santé 2023 — olives, kiosks, and discoveries</title><link>https://antoineboucher.info/CV/blog/posts/expo-manger-sante-2023/</link><pubDate>Tue, 06 Sep 2022 10:00:00 -0400</pubDate><guid>https://antoineboucher.info/CV/blog/posts/expo-manger-sante-2023/</guid><description>&lt;p&gt;I recently took part in &lt;strong&gt;Expo Manger Santé 2023&lt;/strong&gt; at &lt;strong&gt;Place des Congrès&lt;/strong&gt; in Montreal. Event photos were taken by &lt;strong&gt;OS7Media&lt;/strong&gt; (&lt;a href="mailto:os7mediamatrix@gmail.com"&gt;os7mediamatrix@gmail.com&lt;/a&gt;) — thank you for the shots used here.&lt;/p&gt;
&lt;p&gt;&lt;img src="https://antoineboucher.info/CV/blog/posts/expo-manger-sante-2023/images/antoine1.jpeg" alt="Antoine1"&gt;
&lt;img src="https://antoineboucher.info/CV/blog/posts/expo-manger-sante-2023/images/antoine2.jpeg" alt="Antoine2"&gt;&lt;/p&gt;
&lt;h2 id="a-successful-sales-endeavor"&gt;A Successful Sales Endeavor&lt;/h2&gt;
&lt;p&gt;As a salesman, I am passionate about the rich, savory taste of olives and their health benefits. Over two days, I had the opportunity to share this passion with attendees, which translated into remarkable sales, netting $300. It was not just about the sales, though; it was about the connections made and the stories shared over the love of olives.&lt;/p&gt;</description></item><item><title>GitHub Copilot session at Cédille (with GitHub &amp; Arctiq)</title><link>https://antoineboucher.info/CV/blog/posts/github-copilot-cedille-session/</link><pubDate>Tue, 06 Sep 2022 10:00:00 -0400</pubDate><guid>https://antoineboucher.info/CV/blog/posts/github-copilot-cedille-session/</guid><description>&lt;p&gt;At &lt;strong&gt;Cédille&lt;/strong&gt;, we hosted a session with &lt;strong&gt;GitHub&lt;/strong&gt; and &lt;strong&gt;Arctiq&lt;/strong&gt; focused on &lt;strong&gt;GitHub Copilot&lt;/strong&gt; and AI-assisted development. Highlights included &lt;strong&gt;Copilot Chat&lt;/strong&gt; with &lt;code&gt;/createNotebook&lt;/code&gt; for quick Jupyter notebooks from existing code, and pointers to &lt;strong&gt;&lt;a href="https://githubnext.com"&gt;GitHub Next&lt;/a&gt;&lt;/strong&gt; experiments.&lt;/p&gt;
&lt;p&gt;Thanks to speakers &lt;strong&gt;Thierry Madkaud&lt;/strong&gt; and &lt;strong&gt;Eldrick Wega&lt;/strong&gt;, and to everyone who joined.&lt;/p&gt;
&lt;p&gt;&lt;strong&gt;&lt;a href="https://antoineboucher.info/CV/blog/posts/github-copilot-cedille-session/"&gt;Full article in French&lt;/a&gt;&lt;/strong&gt; (same slug — you can also switch to &lt;strong&gt;FR&lt;/strong&gt; in the header).&lt;/p&gt;</description></item><item><title>Live chat and support platforms compared (3CX, ManyChat, Kommunicate, Chatwoot)</title><link>https://antoineboucher.info/CV/blog/posts/livechat-platform-notes/</link><pubDate>Tue, 06 Sep 2022 10:00:00 -0400</pubDate><guid>https://antoineboucher.info/CV/blog/posts/livechat-platform-notes/</guid><description>&lt;p&gt;These notes come from comparing options for &lt;strong&gt;website live chat&lt;/strong&gt;, &lt;strong&gt;chatbots&lt;/strong&gt;, and a &lt;strong&gt;shared support inbox&lt;/strong&gt;. The products below are not interchangeable: some are full communications stacks, others are marketing automation, and one is an open-source helpdesk. &lt;strong&gt;Pricing, channels, and features change often&lt;/strong&gt;—treat this as orientation, then confirm on each vendor’s site.&lt;/p&gt;
&lt;h2 id="3cx"&gt;3CX&lt;/h2&gt;
&lt;p&gt;&lt;a href="https://www.3cx.com"&gt;3CX&lt;/a&gt; is primarily &lt;strong&gt;UCaaS / PBX&lt;/strong&gt; (phones, meetings, extensions). Its &lt;strong&gt;web live chat&lt;/strong&gt; and related widgets sit in that same ecosystem, which helps if you already route voice and chat through 3CX and want one vendor for queues and agents.&lt;/p&gt;</description></item><item><title>Run:ai on AWS — webinar notes (inference &amp; autoscaling)</title><link>https://antoineboucher.info/CV/blog/posts/runai-aws-inference-webinar/</link><pubDate>Tue, 06 Sep 2022 10:00:00 -0400</pubDate><guid>https://antoineboucher.info/CV/blog/posts/runai-aws-inference-webinar/</guid><description>&lt;p&gt;Notes from the &lt;strong&gt;Run:ai&lt;/strong&gt; webinar on running and scaling &lt;strong&gt;inference&lt;/strong&gt; workloads on &lt;strong&gt;AWS&lt;/strong&gt; (Americas). Run:ai focuses on scheduling, visibility, and efficiency for GPU-backed models in shared environments.&lt;/p&gt;
&lt;h2 id="dashboard"&gt;Dashboard&lt;/h2&gt;
&lt;p&gt;Overview of jobs and resource usage.&lt;/p&gt;
&lt;p&gt;&lt;img src="https://antoineboucher.info/CV/blog/posts/runai-aws-inference-webinar/images/dashboard.jpeg" alt="Dashboard"&gt;&lt;/p&gt;
&lt;p&gt;&lt;img src="https://antoineboucher.info/CV/blog/posts/runai-aws-inference-webinar/images/dashboard1.jpeg" alt="Dashboard (alternate view)"&gt;&lt;/p&gt;
&lt;h2 id="cli"&gt;CLI&lt;/h2&gt;
&lt;p&gt;Command-line operations and automation.&lt;/p&gt;
&lt;p&gt;&lt;img src="https://antoineboucher.info/CV/blog/posts/runai-aws-inference-webinar/images/cli.jpeg" alt="CLI"&gt;&lt;/p&gt;
&lt;h2 id="models-and-load"&gt;Models and load&lt;/h2&gt;
&lt;p&gt;&lt;img src="https://antoineboucher.info/CV/blog/posts/runai-aws-inference-webinar/images/model.jpeg" alt="Model view"&gt;&lt;/p&gt;
&lt;p&gt;&lt;img src="https://antoineboucher.info/CV/blog/posts/runai-aws-inference-webinar/images/multi.jpeg" alt="Multi-instance / scaling"&gt;&lt;/p&gt;
&lt;h2 id="workload-management"&gt;Workload management&lt;/h2&gt;
&lt;p&gt;&lt;img src="https://antoineboucher.info/CV/blog/posts/runai-aws-inference-webinar/images/managing.jpeg" alt="Managing workloads"&gt;&lt;/p&gt;
&lt;h2 id="infrastructure-view"&gt;Infrastructure view&lt;/h2&gt;
&lt;p&gt;&lt;img src="https://antoineboucher.info/CV/blog/posts/runai-aws-inference-webinar/images/servers.jpeg" alt="Servers"&gt;&lt;/p&gt;
&lt;h2 id="demo"&gt;Demo&lt;/h2&gt;
&lt;p&gt;&lt;img src="https://antoineboucher.info/CV/blog/posts/runai-aws-inference-webinar/images/demo.jpeg" alt="Demo"&gt;&lt;/p&gt;
&lt;h2 id="challenges"&gt;Challenges&lt;/h2&gt;
&lt;p&gt;&lt;img src="https://antoineboucher.info/CV/blog/posts/runai-aws-inference-webinar/images/challenges.jpeg" alt="Challenges slide"&gt;&lt;/p&gt;
&lt;hr&gt;
&lt;p&gt;For product details, see the official &lt;strong&gt;Run:ai&lt;/strong&gt; documentation and &lt;strong&gt;AWS&lt;/strong&gt; marketplace or partner listings.&lt;/p&gt;</description></item><item><title>Snapchat Lens Creator</title><link>https://antoineboucher.info/CV/blog/posts/snapchat-lens-creator/</link><pubDate>Tue, 06 Sep 2022 10:00:00 -0400</pubDate><guid>https://antoineboucher.info/CV/blog/posts/snapchat-lens-creator/</guid><description>&lt;p&gt;&lt;em&gt;Updated April 2026 with current Lens Insights figures.&lt;/em&gt;&lt;/p&gt;
&lt;p&gt;To date, my Snapchat lenses have accumulated &lt;strong&gt;6.21M plays&lt;/strong&gt;, &lt;strong&gt;12.11M views&lt;/strong&gt;, &lt;strong&gt;616.4k shares&lt;/strong&gt;, and &lt;strong&gt;6,893 favorites&lt;/strong&gt; (all-time, Lens Insights). What started as a personal interest in AR filters grew into paid client work on &lt;strong&gt;Fiverr&lt;/strong&gt; alongside my own experiments.&lt;/p&gt;
&lt;p&gt;Between &lt;strong&gt;2017 and 2020&lt;/strong&gt; I shipped &lt;strong&gt;42 lenses&lt;/strong&gt; for myself and clients. A few that carried the most usage include &lt;strong&gt;Go Crazy Facetime&lt;/strong&gt; (~2.9M plays), &lt;strong&gt;Face Ghosting&lt;/strong&gt; (~1.2M plays), and &lt;strong&gt;BIG SMILE&lt;/strong&gt; (~520k plays).&lt;/p&gt;</description></item><item><title>Snowflake Data-for-Breakfast Conference Insights</title><link>https://antoineboucher.info/CV/blog/posts/snowflake-data-for-breakfast/</link><pubDate>Tue, 06 Sep 2022 10:00:00 -0400</pubDate><guid>https://antoineboucher.info/CV/blog/posts/snowflake-data-for-breakfast/</guid><description>&lt;p&gt;Notes from the Snowflake Data-for-Breakfast conference on the Snowflake Cloud Data Platform, data warehousing, integration, and analytics—including a strong keynote from Infostrux.&lt;/p&gt;
&lt;h2 id="overview"&gt;Overview&lt;/h2&gt;
&lt;p&gt;&lt;img src="https://antoineboucher.info/CV/blog/posts/snowflake-data-for-breakfast/images/governed.jpeg" alt="Conference materials"&gt;&lt;/p&gt;
&lt;h2 id="key-takeaways"&gt;Key takeaways&lt;/h2&gt;
&lt;p&gt;&lt;img src="https://antoineboucher.info/CV/blog/posts/snowflake-data-for-breakfast/images/menu.jpeg" alt="Event menu"&gt;&lt;/p&gt;
&lt;h3 id="global-data-operations"&gt;Global data operations&lt;/h3&gt;
&lt;p&gt;A healthcare customer case study showed Snowflake managing secure data operations across three continents, simplifying partner data sharing while keeping high availability and strong SLAs.&lt;/p&gt;
&lt;p&gt;&lt;img src="https://antoineboucher.info/CV/blog/posts/snowflake-data-for-breakfast/images/sharethrough.jpeg" alt="Data sharing"&gt;&lt;/p&gt;
&lt;p&gt;Cloud data platforms remain a practical backbone for consolidation and analytics; this event was a useful snapshot of where Snowflake is heading.&lt;/p&gt;</description></item><item><title>Renpho scale, Home Assistant, and reverse-engineering the API</title><link>https://antoineboucher.info/CV/blog/posts/renpho-health-api-blueprint/</link><pubDate>Sun, 10 Oct 2021 10:00:00 -0400</pubDate><guid>https://antoineboucher.info/CV/blog/posts/renpho-health-api-blueprint/</guid><description>&lt;h2 id="inspiration-from-bryan-johnsons-blueprint-protocol"&gt;Inspiration from Bryan Johnson’s &amp;ldquo;Blueprint Protocol&amp;rdquo;&lt;/h2&gt;
&lt;p&gt;Personal health tracking, for me, started with Bryan Johnson’s &amp;ldquo;Blueprint Protocol&amp;rdquo;—a push for self-quantification that matched how I already thought about fitness. I wanted the same granularity for my own body, and a Renpho scale with bio-impedance turned out to be a practical way to get a steady stream of numbers beyond simple weight.&lt;/p&gt;
&lt;p&gt;&lt;img src="https://antoineboucher.info/CV/blog/posts/renpho-health-api-blueprint/images/blueprint.jpg" alt="Blueprint Protocol Inspiration"&gt;&lt;/p&gt;
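As a rough illustration of how bio-impedance readings become body-composition numbers, here is a minimal Python sketch. The coefficients follow the general shape of published single-frequency BIA regressions (the height²/resistance term dominates) but are illustrative only; Renpho's firmware uses its own proprietary model, and these numbers are not it.

```python
def fat_free_mass_kg(height_cm, weight_kg, resistance_ohm, reactance_ohm, male=True):
    """Estimate fat-free mass from one single-frequency BIA reading.

    Illustrative coefficients only -- real devices calibrate their own.
    """
    return (
        -4.104
        + 0.518 * height_cm**2 / resistance_ohm  # impedance index, the dominant term
        + 0.231 * weight_kg
        + 0.130 * reactance_ohm
        + 4.229 * (1 if male else 0)             # sex offset
    )

def body_fat_percent(height_cm, weight_kg, resistance_ohm, reactance_ohm, male=True):
    """Fat mass is whatever weight is left after fat-free mass."""
    ffm = fat_free_mass_kg(height_cm, weight_kg, resistance_ohm, reactance_ohm, male)
    return 100.0 * (weight_kg - ffm) / weight_kg

# Example reading: 180 cm, 80 kg, R = 480 ohm, Xc = 50 ohm
print(round(body_fat_percent(180, 80, 480, 50), 1))
```

The point is not the exact coefficients but the pipeline: the scale reports raw impedance, and every derived metric (fat %, muscle mass, water) is a regression on top of it, which is why pulling the raw values into Home Assistant is worthwhile.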
&lt;h2 id="forking-hass-renpho-and-the-home-assistant-ecosystem"&gt;Forking hass-renpho and the Home Assistant ecosystem&lt;/h2&gt;
&lt;p&gt;I found &lt;code&gt;hass-renpho&lt;/code&gt;, a custom integration that pulls Renpho scale data into Home Assistant. The project had gone quiet, and with the original maintainer unavailable I forked it to extend support for more of the metrics the hardware exposes.&lt;/p&gt;</description></item><item><title>Networking evolution — building a home network lab</title><link>https://antoineboucher.info/CV/blog/posts/home-networking-evolution/</link><pubDate>Mon, 06 Sep 2021 10:00:00 -0400</pubDate><guid>https://antoineboucher.info/CV/blog/posts/home-networking-evolution/</guid><description>&lt;h2 id="introduction"&gt;Introduction&lt;/h2&gt;
&lt;p&gt;Welcome to a new chapter in my blog where I dive into the intricacies of building a robust home networking system. As a software engineer with a passion for networking protocols and efficient computing, I&amp;rsquo;ve embarked on a journey to design a system that balances performance, security, and cost-effectiveness. This post will detail my experiences and the technical decisions I made along the way.&lt;/p&gt;
&lt;h2 id="embracing-the-challenge-of-home-networking"&gt;Embracing the Challenge of Home Networking&lt;/h2&gt;
&lt;p&gt;My interest in networking began during my academic years, when I learned about various protocols such as IP, VPN, and IP7. Motivated by the high costs of cloud computing, I set out to build a home-based system. My goal was to use older computers, minimizing expenses on hardware and online services, while still achieving a high degree of functionality and efficiency.&lt;/p&gt;
&lt;style&gt;
  /* Responsive card grid: each column is at most 22rem wide, and
     min(100%, 22rem) lets a column collapse to the full container
     width on narrow screens instead of overflowing. */
  .codepen-pen-grid {
    display: grid;
    gap: 1.25rem;
    grid-template-columns: repeat(auto-fill, minmax(min(100%, 22rem), 1fr));
  }
  .codepen-pen-grid .codepen-grid-card {
    display: flex;
    flex-direction: column;
    min-width: 0; /* let embedded iframes shrink rather than stretch the card */
  }
  .codepen-pen-grid .codepen-grid-card h3 {
    font-size: 1rem;
    font-weight: 600;
    margin: 0 0 0.35rem 0;
    line-height: 1.3;
  }
  .codepen-pen-grid .codepen-grid-card .codepen-grid-caption {
    font-size: 0.875rem;
    opacity: 0.85;
    margin: 0 0 0.75rem 0;
    line-height: 1.45;
  }
  .codepen-pen-grid .codepen-grid-card a.codepen-grid-open {
    margin-top: 0.5rem;
    font-size: 0.875rem;
    font-weight: 500;
    text-decoration: underline;
    text-underline-offset: 2px;
  }
  .codepen-pen-grid .codepen-grid-card a.codepen-grid-open:hover {
    opacity: 0.9;
  }
&lt;/style&gt;
&lt;div class="codepen-pen-grid not-prose my-8"&gt;&lt;div class="codepen-grid-card rounded-lg border border-black/10 bg-black/[0.02] p-3 dark:border-white/15 dark:bg-white/[0.03]"&gt;
 &lt;h3&gt;BMdzwx&lt;/h3&gt;
 &lt;p class="codepen-grid-caption"&gt;Compact front-end sketch—open on CodePen to see HTML, CSS, and JS.&lt;/p&gt;</description></item></channel></rss>