Rishi Sunak, the UK Prime Minister, has made waves in the past few days with his personal pledge that young people should have to study maths until 18. As he said in his speech:
“Every opportunity I’ve had in life began with the education I was so fortunate to receive and it’s the single most important reason why I came into politics: to give every child the highest possible standard of education.”
Rishi Sunak, January 2023
While on the surface this may seem like a great idea, critics have been keen to pull apart all the ways that compulsory maths education is either a bad thing (from Simon Pegg’s passionate rant about the lack of focus on the arts, to the many people saying how much they hated maths at school) or simply won’t work (a lack of qualified teachers or investment in schools, or even children going to school hungry). All of these criticisms have merit, and I’m a big believer in ideas being criticised and pulled apart to get to something that works, rather than biased idealism.
One of the great benefits of lockdown for me is the time I have to catch up on some of the released papers that are not directly related to my day-to-day work. In the past week I’ve been catching up on some of the more general outputs from NeurIPS 2020. One of the papers that really caught my eye was “Ultra-Low Precision 4-bit Training of Deep Neural Networks” by Xiao Sun et al.
There’s no doubt that AI in its current form takes a lot of energy. You only have to look at some of the estimated costs of GPT-3 to see how the trend is pushing for larger, more complex models on larger, more complex hardware to get state-of-the-art results. These AI super-models take a tremendous amount of power to train, with costs out of the reach of individuals and most businesses. AI edge computing has been looking at moving ongoing training into smaller models on edge devices, but to get the accuracy and the speed, the default option is expensive dedicated hardware and more memory. Is there another way?
It’s been seven years of studying while working full time (and in some cases nearly double full-time hours!) and I’ve now finished the degree I started for “fun” because I wasn’t being intellectually challenged in the job I had at that time. I was sceptical of all aspects of the Open University but thought I’d give it a go, knowing that without a cost to me and an exam, I would never make the time to study. While I’ve been blogging about individual modules over the years, I’ve also had quite a few conversations with many of you reading this blog about the pros and cons of studying with the OU. One of the comments on my last post was from Korgan, who suggested I do a post about this, so I’ve combined their questions with all of the others I’ve had.
Seven years ago I was bored at work and desperate for a new challenge. My daughter had recently been born and I had decided to stop playing World of Warcraft. Needing a new challenge, I had toyed with an MBA but really wanted to do something for me. So I signed up for a BSc in Mathematics with the Open University, which I knew would take about six years part time while working. This week, I got the results for my final module and it was confirmed I had earned a first class honours degree. But why didn’t I do maths the first time round?
This week I was due to be sat in a large hall with about 200 other Open University students, taking my exam for module M347, the last of the modules for the BSc in Mathematics I started for “fun”. As with students in traditional universities, March 2020 brought a lot of uncertainty. While some modules were switched to coursework-based assessment, mine was confirmed to be a remote exam with the originally planned exam paper. The paper would be accessible as a PDF on the day of the exam and then submitted in two parts: a multiple-choice, computer-marked section and then a human-marked second section. We would not be time limited (other than by the 24 hours in the day!). So how did I feel about this, and how did it go?
There’s a trend in job descriptions where the company may be looking for “Data Science Unicorns”, “Python Ninjas”, “Rockstar developers”, or, more recently, the dreaded “10x developer”. When companies ask for this, it either means that they’re not sure what they need but want someone who can do the work of a team, or that they are deliberately targeting people who describe themselves this way. A couple of years ago this got silly with “Rockstar”, to the point that many less reputable recruitment agencies were overusing the term, inspiring this tweet:
Many of us in the community saw this and smiled. One man went further: Dylan Beattie created Rockstar, and it now has a community of enthusiasts supporting the language with interpreters and transpilers.
While on lockdown I’ve been watching a lot of recordings from conferences earlier in the year that I didn’t have time to attend. One of these was NDC London, where Dylan was giving the closing session on the Art of Code. It’s well worth an hour of your time and he introduces Rockstar through the ubiquitous FizzBuzz coding challenge.
After watching this I asked myself: could I write a (simple) neuron-based machine learning application in Rockstar and call myself a “Rockstar Neural Network” developer?
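For reference, the FizzBuzz challenge Dylan uses to introduce Rockstar is small enough to sketch in a few lines. Here it is in plain Python (the Rockstar version follows the same logic, just dressed up as power-ballad lyrics):

```python
def fizzbuzz(n):
    """Return the FizzBuzz sequence for 1..n as a list of strings."""
    out = []
    for i in range(1, n + 1):
        # Multiples of 3 say "Fizz", multiples of 5 say "Buzz",
        # multiples of both say "FizzBuzz"; everything else is the number.
        word = ("Fizz" if i % 3 == 0 else "") + ("Buzz" if i % 5 == 0 else "")
        out.append(word or str(i))
    return out

print(fizzbuzz(15))
```

The charm of the Rockstar version is that the same branching logic is expressed entirely in sentences that scan as song lyrics.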
Today I submitted the last assessment before the exam for my tutor to mark in my Mathematical Statistics module. For once I’m actually on track with my study, but it’s not been without difficulty. If you’ve been following my OU journey then you’ll know I work full time and have a family, so dedicated study time can often be a low priority. Up until the second week of March this year I had a reasonable routine: I’d spend my two-hour commute, Monday to Friday, going through the course materials, and then this extra maths wouldn’t impact work or home life.
My husband is a game developer and my contributions are usually of the sort where I look at what he’s done and say “hey, wouldn’t it be great if it did this”. While these are usually positive ideas, they’re mostly a pain to code. Today, however, I was able to contribute some of my maths knowledge to help balance one of his games.
Using an open API, he’d written a simple Pokémon battle game to be used on Twitch by one of our favourite streamers, FederalGhosts, and needed a way of determining a player’s level from their number of wins, and the number of wins required to reach the next level, without recursion. While this post is specifically about the win-to-level relationship, you could use any progression statistic by applying scaling. Here we want to determine:
Number of wins (w) required for a given level (l)
The current player level (pl) given a number of wins (pw)
Wins remaining to the next level (wr) for a player based on current wins (pw)
Let’s take a look at a few ways of doing this. Each section below has the equations and code examples in Python. Assume all code samples have the following at the top:
My LinkedIn news feed was lit up last week by a Medium post from Dario Radečić, originally published in December 2019, discussing how much maths is really needed for a job in data science. He starts by berating the answers on Quora from the PhD brainiacs who demand you know everything… While the article is fairly light-hearted, and is probably more an encouragement piece for anyone currently studying or trying to get that first job in data science, I felt that, as someone who hires data scientists, I could add some substance from the other side.
In December, Lample and Charton from Facebook’s Artificial Intelligence Research group published a paper stating that they had created an AI application that outperformed systems such as MATLAB and Mathematica when presented with complex equations. Is this a huge leap forward, or just an obvious extension of maths-solving systems that have been around for years? Let’s take a look.