A simple exercise in regressing a sinusoidal function

Frank Lanke Fu Tarimo
2 min read · Feb 7, 2021

My goal is simple: to learn the frequency of a sinusoid using stochastic gradient descent (SGD) in PyTorch. Periodic signals are abundant in nature, e.g. in sound, and it would be interesting to see what simple models we can build to generate such signals.

PyTorch feels like an easy off-the-shelf tool, so it's what I'm going to use.

In the snippet below, I initialise a sinusoidal signal with a known frequency, wave_frequency. Subsequently, we create a model that applies a linear transformation (affine, really, since the bias term is non-zero) to the input variable x and passes the result through torch.sin() to generate the output. We then iteratively minimise the discrepancy between the model's output and our desired wave using SGD with a mean-squared-error loss.
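The snippet itself doesn't survive in this copy of the post, so here is a minimal sketch of the setup just described; the specific frequency value, sample range, and learning rate are my assumptions, not the author's:

```python
import math

import torch

# Target: a sinusoid with a known frequency, wave_frequency.
# (The value 3.0, the sample range, and lr below are assumed.)
wave_frequency = 3.0
x = torch.linspace(0.0, 2.0 * math.pi, 200).unsqueeze(1)  # shape (200, 1)
y = torch.sin(wave_frequency * x)

# Affine transform of x followed by torch.sin(); the weight plays the
# role of the frequency estimate and the bias of a phase offset.
affine = torch.nn.Linear(1, 1)

optimiser = torch.optim.SGD(affine.parameters(), lr=0.01)
loss_fn = torch.nn.MSELoss()

for step in range(2):  # the text runs only 2 steps
    optimiser.zero_grad()
    prediction = torch.sin(affine(x))
    loss = loss_fn(prediction, y)  # mean squared error over the whole wave
    loss.backward()
    optimiser.step()

print(f"loss after {step + 1} steps: {loss.item():.4f}")
```

Note that every call to loss_fn sees the entire wave at once, which matters for the discussion below.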

Though the example above only iterated for 2 steps, I've tried longer runs, which still weren't able to recreate the desired signal. They tend to settle on some straight line that cuts diagonally across the screen.

Lots of questions come to mind as to what could improve the optimisation process. Is the learning rate too high or too low? Should I use more iterations? These are all valid concerns, but the most burning question, IMO, is: am I even doing SGD properly? In the implementation shown above, every sample point in the wave data is processed in every iteration. For some points in the dataset our estimate is lower than the true value, whereas for other points it is higher. I suspect that this creates contradicting per-sample gradients, which leaves us stuck at the local optimum we see in the plot above.
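That suspicion can be made concrete by writing out the per-sample gradient of the squared error with respect to a scalar frequency estimate w. In the sketch below the true frequency and the partially trained estimate are assumed values, chosen only to illustrate the effect:

```python
import math

import torch

wave_frequency = 3.0   # assumed true frequency
w = torch.tensor(0.5)  # a hypothetical, partially trained frequency estimate

x = torch.linspace(0.1, 2.0 * math.pi, 200)
y = torch.sin(wave_frequency * x)

# For a single sample x_i: d/dw (sin(w * x_i) - y_i)^2
#   = 2 * (sin(w * x_i) - y_i) * cos(w * x_i) * x_i
per_sample_grad = 2.0 * (torch.sin(w * x) - y) * torch.cos(w * x) * x

up = (per_sample_grad > 0).sum().item()
down = (per_sample_grad < 0).sum().item()
print(f"{up} samples push w up, {down} push it down")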

In an upcoming post, I will explore fixing this issue using proper SGD :D with PyTorch's Dataset and DataLoader.


Frank Lanke Fu Tarimo

PhD Candidate in Perception for Robotics. University of Oxford