Logistic regression weights fit by batch gradient ascent:

```python
from numpy import exp, mat, ones, shape

def sigmoid(inX):
    return 1.0 / (1 + exp(-inX))

def gradAscent(dataMatIn, classLabels):
    dataMatrix = mat(dataMatIn)              # convert to NumPy matrix
    labelMat = mat(classLabels).transpose()  # convert to NumPy column vector
    m, n = shape(dataMatrix)
    alpha = 0.001                            # step size
    maxCycles = 500                          # number of gradient-ascent iterations
    weights = ones((n, 1))
    for k in range(maxCycles):               # heavy on matrix operations
        h = sigmoid(dataMatrix * weights)    # matrix mult: predictions for all samples
        error = labelMat - h                 # vector subtraction: label minus prediction
        weights = weights + alpha * dataMatrix.transpose() * error  # matrix mult: gradient step
    return weights
```
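Each iteration updates all weights at once using the whole dataset: with design matrix X, label vector y, and step size alpha, the update is w ← w + alpha · Xᵀ(y − sigmoid(Xw)), a step along the gradient of the log-likelihood. Below is a minimal usage sketch; the two-cluster toy dataset, the RNG seed, and the leading bias column (X0 = 1) are assumptions for illustration, not part of the original snippet.

```python
import numpy as np

# Minimal usage sketch (toy data invented for illustration).
# Two 2-D Gaussian clusters that are roughly linearly separable,
# plus a leading column of 1s so weights[0] acts as the intercept.
rng = np.random.default_rng(0)
x0 = rng.normal(-1.0, 0.5, size=(50, 2))      # class-0 points around (-1, -1)
x1 = rng.normal(1.0, 0.5, size=(50, 2))       # class-1 points around (1, 1)
features = np.vstack([x0, x1])
dataMatIn = np.hstack([np.ones((100, 1)), features]).tolist()
classLabels = [0] * 50 + [1] * 50

weights = gradAscent(dataMatIn, classLabels)  # 3x1 matrix: [bias, w1, w2]
predictions = sigmoid(np.mat(dataMatIn) * weights) > 0.5
print(weights)
print("training accuracy:", (predictions.A1 == np.array(classLabels)).mean())
```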
Source: http://lib.csdn.net/snippet/machinelearning/42883