First I lay out the foundational mathematics behind gradient descent, then I implement it and apply it to a demo dataset.
My implementation achieves 98% accuracy on the demo dataset.
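As a quick reference, this is the standard gradient descent update that the math section builds on (the loss $L$, weights $w$, and learning rate $\eta$ here are generic placeholders, not necessarily the exact notation used in the notes):

$$w_{t+1} = w_t - \eta \, \nabla L(w_t)$$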
I also use Numba's `njit` decorator to make training faster.
`njit` just-in-time compiles your Python functions into machine code, which is why the code can run much faster, especially when you have a high number of loops executing the same or similar operations. Python is normally interpreted rather than compiled, which is why it runs slower than compiled languages.
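For illustration, here is a minimal sketch of how `njit` can be applied to a loop-heavy gradient computation. This is not the repository's actual training code; the logistic-loss gradient below is just an assumed example of the kind of function that benefits from JIT compilation:

```python
import numpy as np
from numba import njit

@njit
def logistic_gradient(X, y, w):
    # Gradient of the mean logistic loss, written with explicit loops so
    # the speedup from JIT compilation is clearly visible.
    n, d = X.shape
    grad = np.zeros(d)
    for i in range(n):
        z = 0.0
        for j in range(d):
            z += X[i, j] * w[j]
        p = 1.0 / (1.0 + np.exp(-z))  # sigmoid prediction
        for j in range(d):
            grad[j] += (p - y[i]) * X[i, j]
    return grad / n

# The first call triggers compilation; subsequent calls run at native speed.
X = np.random.randn(1000, 5)
y = (np.random.rand(1000) > 0.5).astype(np.float64)
w = np.zeros(5)
print(logistic_gradient(X, y, w))
```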
This was a homework assignment of mine from 2024; when I looked back at it, I realized it was solid work and decided to upload it to GitHub.
The assignment is the Logistic Regression problem in HW2 Part B: https://courses.cs.washington.edu/courses/csep546/23wi/assignments/hw2.pdf