gradient-descent-from-scratch

First, I lay out the foundational mathematics behind gradient descent; then I implement it and apply it to a demo dataset.

My implementation achieves 98% accuracy on the demo dataset.

I also used Numba's njit decorator to make training faster.

njit (from Numba) just-in-time compiles your Python functions to machine code, which is why they can run much faster, especially when the code spends most of its time in loops executing the same or similar operations over and over. Standard Python is interpreted rather than compiled, which is why it runs slower than compiled languages.
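For illustration, here is a minimal sketch, assuming a plain logistic-regression setup, of how the njit decorator can be applied to a gradient-descent training loop. The function name and the toy data are placeholders for this example, not the actual code in this repository.

```python
import numpy as np
from numba import njit


@njit
def train_logistic_regression(X, y, lr=0.1, epochs=1000):
    # Gradient descent on the logistic (cross-entropy) loss.
    # Numba compiles this function to machine code on its first call,
    # so the explicit Python loops below run at near-native speed.
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    for _ in range(epochs):
        grad_w = np.zeros(d)
        grad_b = 0.0
        for i in range(n):
            z = np.dot(X[i], w) + b
            p = 1.0 / (1.0 + np.exp(-z))   # sigmoid probability
            err = p - y[i]                  # prediction error for sample i
            grad_w += err * X[i]
            grad_b += err
        w -= lr * grad_w / n
        b -= lr * grad_b / n
    return w, b


# Hypothetical toy data just to demonstrate the call; not the repository's demo dataset.
rng = np.random.default_rng(0)
X = rng.normal(size=(500, 2))
y = (X[:, 0] + X[:, 1] > 0).astype(np.float64)
w, b = train_logistic_regression(X, y)
preds = (1.0 / (1.0 + np.exp(-(X @ w + b))) >= 0.5).astype(np.float64)
print("train accuracy:", (preds == y).mean())
```

The explicit per-sample loop is intentional: it is exactly the kind of code that is slow in interpreted Python but fast once njit compiles it.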

This was a homework assignment from 2024, but when I looked back on it I realized it was solid work and decided to upload it to GitHub.
The repository includes an image of the math derivation and a train/test loss plot. Reference: Logistic Regression, HW2 part B, https://courses.cs.washington.edu/courses/csep546/23wi/assignments/hw2.pdf
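For reference, a derivation like the one pictured typically works out the logistic-regression objective and its gradients. A minimal sketch of the standard unregularized form, for labels $y_i \in \{0, 1\}$ and $\sigma(z) = 1/(1+e^{-z})$ (the homework's exact formulation, e.g. with regularization or $\pm 1$ labels, may differ):

```math
\ell(w, b) = -\frac{1}{n}\sum_{i=1}^{n}\Big[\, y_i \log \sigma(w^\top x_i + b) + (1 - y_i)\log\big(1 - \sigma(w^\top x_i + b)\big) \Big]
```

```math
\nabla_w \ell = \frac{1}{n}\sum_{i=1}^{n}\big(\sigma(w^\top x_i + b) - y_i\big)\, x_i,
\qquad
\frac{\partial \ell}{\partial b} = \frac{1}{n}\sum_{i=1}^{n}\big(\sigma(w^\top x_i + b) - y_i\big)
```

These gradients are what the training loop above accumulates sample by sample before each parameter update.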
