There is no shortage of methods for generating normally-distributed random
numbers from a source of uniformly-distributed random numbers. One such method
is the so-called Polar Method, a variation of the Box-Muller Transform, which I
have already described before. You might want to take a look at that post
before reading this one.
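As a quick sketch of the idea: the polar method draws points uniformly in a square, rejects those outside the unit circle, and transforms each surviving point into a pair of independent standard normal samples. A minimal Python version (the function name and structure are mine, not from the post) might look like this:

```python
import math
import random

def polar_gaussian():
    """Return two independent standard normal samples.

    Marsaglia's polar method: draw (u, v) uniformly in the square
    [-1, 1] x [-1, 1], reject points outside the unit circle, then
    scale the accepted point. Unlike the basic Box-Muller Transform,
    no sine or cosine is needed.
    """
    while True:
        u = random.uniform(-1.0, 1.0)
        v = random.uniform(-1.0, 1.0)
        s = u * u + v * v
        if 0.0 < s < 1.0:  # reject the origin and points outside the circle
            factor = math.sqrt(-2.0 * math.log(s) / s)
            return u * factor, v * factor
```

Each loop iteration is accepted with probability π/4 ≈ 0.785, so on average fewer than 1.3 iterations are needed per pair of samples.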
This algorithm is named after George Edward Pelham Box and Mervin Edgar Muller,
who published it in a two-page paper in 1958. The idea was not original,
though: it had already appeared in the 1934 book Fourier Transforms in the
Complex Domain, by Raymond E. A. C. Paley and Norbert Wiener. Stigler’s
law strikes again!
We don’t know for sure when the Euclidean Algorithm was created, nor by whom,
but it was made famous around 300 BC by the Elements – the magnum opus of the
Greek mathematician Euclid. Wikipedia describes it as “one of the oldest
algorithms in common use” and Knuth affectionately calls it “the granddaddy of
all algorithms”.
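The algorithm itself is short enough to restate here. A minimal Python sketch (my own phrasing of the classic remainder-based formulation, not taken from the post):

```python
def gcd(a, b):
    """Euclidean Algorithm for the greatest common divisor.

    The gcd is unchanged when one number is replaced by its
    remainder modulo the other; repeat until the remainder is zero.
    """
    while b != 0:
        a, b = b, a % b
    return a
```

For example, `gcd(48, 18)` walks through the remainders 12, 6, 0 and returns 6.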
This is a repository of more or less random programming things, made for my own
amusement and edification. I don’t know how this will evolve over time (if at
all), but I envision this as a collection of interactive visual explanations of
algorithms and data structures.
🎄 Feliz Natal! 🎅
(That’s “Merry Christmas!” – the post itself is in Portuguese 😉)
So it happened again: after some random event like a system upgrade or a power failure(?), booting somehow lands me in the Windows Boot Manager instead of GRUB. Here’s how I fixed it.
Just sharing the OpenMSX settings I am using to get an Xbox 360 controller working properly. The mappings are optimized for Konami games (in particular, Metal Gear 2: Solid Snake).
I’ve been following the Godot game engine and toying a little with it for some time now, and it has impressed me in several ways. One thing I missed, however, was a satisfying way to write more efficient code in those ever rarer situations in which GDScript couldn’t deliver the speed I wanted. I could write a module in C++, but that involved recompiling the whole engine and, well, programming in C++.
Now, with the upcoming Godot 3.0 (currently in alpha), a much nicer alternative has been introduced: GDNative. I tested it and it mostly worked. Here’s a summary of my experience.
In the second and final part of this conceptual introduction to Machine Learning (ML), I’ll discuss its relationship with other areas (like Data Science) and describe what I perceive as a common theme among many ML algorithms. Emphasis on “what I perceive”: don’t take this as the truth.
“Machine Learning” is not just a buzzword — arguably, it is two. Almost everybody seems to be using Machine Learning (ML) in one way or another, and those who aren’t are looking forward to using it. It sounds like a good topic to know about. I did some nice Neural Network work with colleagues in school in the late 90s; I could brag that I have nearly 20 years of experience in the field, but that wouldn’t be exactly honest, as I haven’t done much ML since then.
Anyway, this is a fun, useful, and increasingly important field, so I guess it is time to do some ML for real. Here’s the first set of notes from my studies, in which I present some important concepts without getting into specific algorithms.