Commit 39c28eaa, authored May 20, 2021 by KangMin An
Update: added and revised some comments.
parent edecea14
1 changed file

server/src/data_processing/linear_regression.py
...
@@ -48,7 +48,7 @@ class LinearRegression:

    def cost_MAE(self, x, y, w, b):
        '''
        ### Cost function
-       - MAE (Mean Square Error) : 1/n * sigma|y_i - y_hat_i|
+       - MAE (Mean Absolute Error) : 1/n * sigma|y_i - y_hat_i|
        '''
        y_predict = self.predict(x, w, b)
        n = y_predict.shape[1]
...
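For reference, the two cost functions named in the docstring can be sketched in NumPy as follows; this is a minimal illustration of MAE = 1/n * sigma|y_i - y_hat_i| and MSE = 1/n * sigma(y_i - y_hat_i)^2, assuming x has shape (features, n), w has shape (1, features), and a simple linear predict helper, none of which are taken from the repository.

import numpy as np

def predict(x, w, b):
    # Hypothetical linear model: y_hat = w @ x + b (the shapes are assumptions).
    return np.dot(w, x) + b

def cost_MAE(x, y, w, b):
    # Mean Absolute Error: 1/n * sum(|y_i - y_hat_i|)
    y_predict = predict(x, w, b)
    n = y_predict.shape[1]
    return np.sum(np.abs(y - y_predict)) / n

def cost_MSE(x, y, w, b):
    # Mean Squared Error: 1/n * sum((y_i - y_hat_i)**2)
    y_predict = predict(x, w, b)
    n = y_predict.shape[1]
    return np.sum((y - y_predict) ** 2) / n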
@@ -81,6 +81,14 @@ class LinearRegression:

        return w_grad, b_grad

    def gradientDescent(self):
+       '''
+       ### Gradient descent
+       - The main driver method of the LinearRegression class.
+       - Passes the weights and bias to the cost function to compute the loss.
+       - Computes the partial derivative of the cost with respect to the weights and the bias.
+       - Updates the weights and bias using the partial derivatives and the learning rate.
+       - Repeats the above until the loss falls below the given value or 3000 iterations are reached.
+       '''
        iteration = 0
...
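The docstring above describes a standard gradient-descent loop. A self-contained sketch of that procedure, under the assumption of an MSE cost with analytic gradients and hypothetical names (learning_rate, loss_threshold), is shown below; it is not the class's actual implementation.

import numpy as np

def gradient_descent(x, y, w, b, learning_rate=0.01, loss_threshold=1e-6, max_iter=3000):
    # Illustrative stand-alone loop mirroring the steps listed in the docstring.
    iteration = 0
    while iteration <= max_iter:
        y_hat = np.dot(w, x) + b                                # predictions
        loss = np.mean((y - y_hat) ** 2)                        # MSE loss
        grad_w = -2.0 * np.dot(y - y_hat, x.T) / x.shape[1]     # dL/dw
        grad_b = -2.0 * np.mean(y - y_hat)                      # dL/db
        w = w - learning_rate * grad_w                          # update weights
        b = b - learning_rate * grad_b                          # update bias
        if loss <= loss_threshold:                              # stop when the loss is low enough
            break
        iteration += 1
    return w, b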
@@ -88,11 +96,11 @@ class LinearRegression:

        b = self.bias

        while iteration <= 3000:
-           loss = self.cost_MSE(self.train_x, self.train_t, w, b)
            grad_w, grad_b = self.gradient(self.train_x, self.train_t, self.delta, w, b)
+           loss = self.cost_MSE(self.train_x, self.train_t, w, b)

            if iteration % 100 == 0:
                print(iteration, " iters - cost :", loss)
                print("Gradients - W :", grad_w, ", b : ", grad_b)
...
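The gradient call above receives self.delta, which suggests (though the diff does not show it) that the gradients may be approximated numerically. Purely as an illustration of that idea, and not as the repository's actual gradient method, a central-difference approximation with step delta could look like this:

import numpy as np

def numerical_gradient(cost, x, y, delta, w, b):
    # Central-difference estimate of dC/dw and dC/db (illustrative only).
    w_grad = np.zeros_like(w)
    it = np.nditer(w, flags=["multi_index"])
    for _ in it:
        idx = it.multi_index
        w_plus, w_minus = w.copy(), w.copy()
        w_plus[idx] += delta
        w_minus[idx] -= delta
        w_grad[idx] = (cost(x, y, w_plus, b) - cost(x, y, w_minus, b)) / (2 * delta)
    b_grad = (cost(x, y, w, b + delta) - cost(x, y, w, b - delta)) / (2 * delta)
    return w_grad, b_grad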