machine learning - A perceptron algorithm with standard gradient descent in Python
I implemented a perceptron with standard (batch) gradient descent, but the number of misclassified examples does not decrease monotonically across epochs, as shown below. Could anyone help me figure out why?
```python
def predict(row, weights, bias):
    # Weighted sum of the features; row[-1] is the label, so it is skipped.
    activation = 0
    for i in xrange(len(row) - 1):
        activation += row[i] * weights[i]
    activation += bias
    return activation

def data_train(dataset, weights, bias, rolling_time):
    sum_loss_w = [0, 0, 0, 0]       # accumulated gradient, one entry per weight
    single_error_w = [0, 0, 0, 0]   # per-row gradient contribution
    loss_b = 0                      # accumulated gradient for the bias
    step_size = 1                   # learning rate, as in the output below
    for time in xrange(rolling_time):
        sum_loss = 0
        for row in dataset:
            prediction = predict(row, weights, bias)
            prediction_m = row[-1] * prediction
            if prediction_m <= 0:   # misclassified: label and activation disagree
                sum_loss += 1
                for i in xrange(len(single_error_w)):
                    single_error_w[i] = row[i] * row[-1]
                    sum_loss_w[i] += single_error_w[i]
                loss_b += row[-1]
        for element in xrange(len(weights)):
            weights[element] = weights[element] + step_size * sum_loss_w[element]
        bias = bias + step_size * loss_b
        print '>>time = %d, step_size = %d, error = %d' % (time, step_size, sum_loss)
    return weights, bias
```
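In case it matters, this is roughly how I call it (the file name, the comma-separated layout with four features and a +1/-1 label in the last column, and the epoch count here are just placeholders):

```python
# Placeholder driver: 'train.csv', the CSV layout, and the epoch count are assumptions.
weights = [0, 0, 0, 0]  # one weight per feature
bias = 0

dataset = []
with open('train.csv') as f:
    for line in f:
        # each row: four feature values followed by a +1/-1 label
        dataset.append([float(x) for x in line.strip().split(',')])

weights, bias = data_train(dataset, weights, bias, 10)
```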
And the results are below:
```
>>time = 0, step_size = 1, error = 1000
>>time = 1, step_size = 1, error = 394
>>time = 2, step_size = 1, error = 110
>>time = 3, step_size = 1, error = 570
>>time = 4, step_size = 1, error = 9
>>time = 5, step_size = 1, error = 314
>>time = 6, step_size = 1, error = 248
>>time = 7, step_size = 1, error = 41
>>time = 8, step_size = 1, error = 235
>>time = 9, step_size = 1, error = 203
```
How can I modify it so the error actually descends?
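My current guess is that `sum_loss_w` and `loss_b` are never reset inside the epoch loop, so every update applies the accumulated gradients of all previous epochs, and with `step_size = 1` the weights overshoot. Below is a sketch of the change I have in mind (resetting the accumulators each epoch and using a smaller, arbitrarily chosen step size); I am not sure this is the right fix:

```python
def data_train_v2(dataset, weights, bias, rolling_time, step_size=0.1):
    # Batch perceptron update: reset the gradient accumulators every epoch.
    for time in xrange(rolling_time):
        sum_loss_w = [0] * len(weights)
        loss_b = 0
        sum_loss = 0
        for row in dataset:
            # Misclassified when the label and the activation disagree in sign.
            if row[-1] * predict(row, weights, bias) <= 0:
                sum_loss += 1
                for i in xrange(len(weights)):
                    sum_loss_w[i] += row[i] * row[-1]
                loss_b += row[-1]
        for i in xrange(len(weights)):
            weights[i] += step_size * sum_loss_w[i]
        bias += step_size * loss_b
        print '>>time = %d, step_size = %.2f, error = %d' % (time, step_size, sum_loss)
    return weights, bias
```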
You can download the training data here: https://elearning.utdallas.edu/bbcswebdav/pid-1673953-dt-content-rid-16874944_1/xid-16874944_1