python - Another sigmoidal regression equation question


I posted an earlier version of this question yesterday. I cannot seem to add this version to that posting because someone seems to have closed that posting to editing, so here is the new version in a new posting.

I have a script below that does the following things:
1.) Plots a best-fit curve to sigmoidal data.
2.) Re-sizes the data based on new max and min coordinates for x and y.
3.) Calculates and plots a new best-fit curve to the resized data.
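For reference, the min-max rescaling in step 2 can be sketched on its own like this (a minimal standalone sketch using NumPy, separate from the full script below; the function name `rescale` and the bounds 300/1200 are just for illustration):

```python
import numpy as np

def rescale(arr, lower=0.0, upper=1.0):
    """Linearly map the values of arr onto the interval [lower, upper]."""
    arr = np.asarray(arr, dtype=float)
    if lower > upper:                      # tolerate swapped bounds
        lower, upper = upper, lower
    shifted = arr - arr.min()              # step 1: shift so the minimum is 0
    scaled = shifted * (upper - lower) / shifted.max()   # step 2: scale the span
    return scaled + lower                  # step 3: shift up to the new minimum

x = np.array([821, 576, 473, 377, 326], dtype=float)
print(rescale(x, 300.0, 1200.0))   # the data's endpoints map to 300 and 1200
```

The key property is that the smallest input lands exactly on `lower` and the largest exactly on `upper`, with everything in between mapped linearly.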

Steps 1 and 2 seem to work fine, but step 3 does not. If you run the script, you will see that it plots a totally invalid curve for the re-sized data.

Can anyone show me how to revise the code below so that it creates and plots a true best-fit sigmoidal curve for the re-sized data? This needs to be reproducible when the data is re-sized across the spectrum of possible max and min values.

I seem to be able to track the problem down to new_p, which is defined in the following line of code:

new_p, new_cov, new_infodict, new_mesg, new_ier = scipy.optimize.leastsq(
    residuals, new_p_guess, args=(newx, newy), full_output=1, warning=True)

But I cannot figure out how to get any deeper into the problem than that. I think the problem may have something to do with the difference between global and local variables, but perhaps it is something else.

Here is the current draft of my complete code:

import numpy as np
import matplotlib.pyplot as plt
import scipy.optimize

def getminrr(age):
    maxhr = 208 - (0.7 * age)
    minrr = (60 / maxhr) * 1000
    return minrr

def sigmoid(p, x):
    x0, y0, c, k = p
    y = c / (1 + np.exp(-k * (x - x0))) + y0
    return y

def residuals(p, x, y):
    return y - sigmoid(p, x)

def resize(x, y, xmin=0.0, xmax=1.0, ymin=0.0, ymax=1.0):
    # create local copies of the input vectors
    newx = np.array([t for t in x])
    newy = np.array([t for t in y])
    # if the mins are greater than the maxs, flip them
    if xmin > xmax: xmin, xmax = xmax, xmin
    if ymin > ymax: ymin, ymax = ymax, ymin
    #--------------------------------------------------------------------------
    # The code below re-calculates the values in x and in y with these steps:
    #   1.) subtract the actual minimum of the input x-vector from each value of x
    #   2.) multiply each resulting value of x by the result of dividing the
    #       difference between the new xmin and xmax by the actual maximum of
    #       the input x-vector
    #   3.) add the new minimum to each value of x
    # Note: this is written in x-notation, but the identical process is
    # repeated for y
    #--------------------------------------------------------------------------
    newx -= x.min()                      # c -= a is equivalent to c = c - a
    newx *= (xmax - xmin) / newx.max()   # c *= a is equivalent to c = c * a
    newx += xmin                         # c += a is equivalent to c = c + a
    newy -= y.min()
    newy *= (ymax - ymin) / newy.max()
    newy += ymin
    return (newx, newy)

# declare the raw data to use in creating the logistic regression equation
x = np.array([821, 576, 473, 377, 326], dtype='float')
y = np.array([255, 235, 208, 166, 157], dtype='float')

# call the resize() function to re-calculate the coordinates used in the equation
minrr = getminrr(50)
maxrr = 1200
minlvet = (y[4] / x[4]) * minrr
maxlvet = (y[0] / x[0]) * maxrr

#x, y = resize(x, y, xmin=0.3, ymin=0.3)
newx, newy = resize(x, y, xmin=minrr, xmax=maxrr, ymin=minlvet, ymax=maxlvet)

print 'x is:  ', x
print 'y is:  ', y
print 'newx is:  ', newx
print 'newy is:  ', newy

# p_guess is the starting estimate for the minimization
p_guess = (np.median(x), np.median(y), 1.0, 1.0)
new_p_guess = (np.median(newx), np.median(newy), 1.0, 1.0)

# leastsq() calls the residuals function with the initial guess parameters
# and the x and y vectors.  full_output=1 means the function returns all
# optional outputs.  Note that the residuals function in turn calls the
# sigmoid function.  leastsq() returns the parameters p that minimize the
# least-squares error of the sigmoid function with respect to the original
# x and y coordinate vectors sent to it.
p, cov, infodict, mesg, ier = scipy.optimize.leastsq(
    residuals, p_guess, args=(x, y), full_output=1, warning=True)
new_p, new_cov, new_infodict, new_mesg, new_ier = scipy.optimize.leastsq(
    residuals, new_p_guess, args=(newx, newy), full_output=1, warning=True)

# unpack the optimal values for each element of p returned by leastsq()
x0, y0, c, k = p
print('''Reference data:
x0 = {x0}
y0 = {y0}
c = {c}
k = {k}
'''.format(x0=x0, y0=y0, c=c, k=k))

new_x0, new_y0, new_c, new_k = new_p
print('''New data:
new_x0 = {new_x0}
new_y0 = {new_y0}
new_c = {new_c}
new_k = {new_k}
'''.format(new_x0=new_x0, new_y0=new_y0, new_c=new_c, new_k=new_k))

# create numpy arrays of x-values at which to evaluate the fitted curves
xp = np.linspace(x.min(), x.max(), x.max() - x.min())
new_xp = np.linspace(newx.min(), newx.max(), newx.max() - newx.min())
# vectors containing the y-values corresponding to the x-values in xp
pxp = sigmoid(p, xp)
new_pxp = sigmoid(new_p, new_xp)

# plot the results
plt.plot(x, y, '>', xp, pxp, 'g-')
plt.plot(newx, newy, '^', new_xp, new_pxp, 'r-')
plt.xlabel('x')
plt.ylabel('y', rotation='horizontal')
plt.grid(True)
plt.show()

Try this:

import numpy as np
import matplotlib.pyplot as plt
import scipy.optimize

def getminrr(age):
    maxhr = 208 - (0.7 * age)
    minrr = (60 / maxhr) * 1000
    return minrr

def sigmoid(p, x):
    x0, y0, c, k = p
    y = c / (1 + np.exp(-k * (x - x0))) + y0
    return y

def residuals(p, x, y):
    return y - sigmoid(p, x)

def resize(arr, lower=0.0, upper=1.0):
    # work on a local copy
    result = arr.copy()
    # if the lower bound is greater than the upper bound, flip them
    if lower > upper: lower, upper = upper, lower
    #--------------------------------------------------------------------------
    # The code below re-calculates the values with these steps:
    #   1.) subtract the actual minimum of the input vector from each value
    #   2.) multiply each resulting value by (upper - lower) divided by the
    #       actual maximum of the shifted vector
    #   3.) add the new lower bound to each value
    #--------------------------------------------------------------------------
    result -= result.min()
    result *= (upper - lower) / result.max()
    result += lower
    return result

# declare the raw data to use in creating the logistic regression equation
x = np.array([821, 576, 473, 377, 326], dtype='float')
y = np.array([255, 235, 208, 166, 157], dtype='float')

# call the resize() function to re-calculate the coordinates used in the equation
minrr = getminrr(50)
maxrr = 1200
# x[-1] returns the last value in x
minlvet = (y[-1] / x[-1]) * minrr
maxlvet = (y[0] / x[0]) * maxrr

print(minrr, maxrr)
#x, y = resize(x, y, xmin=0.3, ymin=0.3)
newx = resize(x, lower=minrr, upper=maxrr)
newy = resize(y, lower=minlvet, upper=maxlvet)

print 'x is:  ', x
print 'y is:  ', y
print 'newx is:  ', newx
print 'newy is:  ', newy

# p_guess is the starting estimate for the minimization
p_guess = (np.median(x), np.min(y), np.max(y), 0.01)
new_p_guess = (np.median(newx), np.min(newy), np.max(newy), 0.01)

# leastsq() calls the residuals function with the initial guess parameters
# and the x and y vectors.  full_output=1 means the function returns all
# optional outputs.  Note that the residuals function in turn calls the
# sigmoid function.  leastsq() returns the parameters p that minimize the
# least-squares error of the sigmoid function with respect to the original
# x and y coordinate vectors sent to it.
p, cov, infodict, mesg, ier = scipy.optimize.leastsq(
    residuals, p_guess, args=(x, y), full_output=1, warning=True)
new_p, new_cov, new_infodict, new_mesg, new_ier = scipy.optimize.leastsq(
    residuals, new_p_guess, args=(newx, newy), full_output=1, warning=True)

# unpack the optimal values for each element of p returned by leastsq()
x0, y0, c, k = p
print('''Reference data:
x0 = {x0}
y0 = {y0}
c = {c}
k = {k}
'''.format(x0=x0, y0=y0, c=c, k=k))

new_x0, new_y0, new_c, new_k = new_p
print('''New data:
new_x0 = {new_x0}
new_y0 = {new_y0}
new_c = {new_c}
new_k = {new_k}
'''.format(new_x0=new_x0, new_y0=new_y0, new_c=new_c, new_k=new_k))

# create numpy arrays of x-values at which to evaluate the fitted curves
xp = np.linspace(x.min(), x.max(), x.max() - x.min())
new_xp = np.linspace(newx.min(), newx.max(), newx.max() - newx.min())
# vectors containing the y-values corresponding to the x-values in xp
pxp = sigmoid(p, xp)
new_pxp = sigmoid(new_p, new_xp)

# plot the results
plt.plot(x, y, '>', xp, pxp, 'g-')
plt.plot(newx, newy, '^', new_xp, new_pxp, 'r-')
plt.xlabel('x')
plt.ylabel('y', rotation='horizontal')
plt.grid(True)
plt.show()

(Plot: the original data with its green best-fit curve, and the resized data with its red best-fit curve.)

Your other related question was not closed; it appears you've registered twice, and Stack Overflow is not letting you edit the other question because it doesn't recognize this user and this user as the same person.

Mainly what I've done in the code above is alter new_p_guess. Finding the right values for the initial guess is kind of an art. If it could be done algorithmically, scipy wouldn't need you to supply it. A little analysis can help, as well as a "feel" for the data. Knowing in advance what the solution should roughly look like, and therefore what values are reasonable within the context of the problem, also helps. (Which is all just a long way of saying I guessed my way to choosing k=0.01.)
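That guessing can be made a little more systematic: read x0, y0, and c straight off the data and start k small enough that k*(x - x0) stays modest at this x-scale. The sketch below is only an illustration of that heuristic on the question's reference data; it uses leastsq without the old warning keyword (which current SciPy no longer accepts), so the call differs slightly from the script above:

```python
import numpy as np
from scipy.optimize import leastsq

def sigmoid(p, x):
    x0, y0, c, k = p
    return c / (1 + np.exp(-k * (x - x0))) + y0

def residuals(p, x, y):
    return y - sigmoid(p, x)

x = np.array([821, 576, 473, 377, 326], dtype=float)
y = np.array([255, 235, 208, 166, 157], dtype=float)

# data-driven starting point: the x-midpoint for x0, the y-floor for y0,
# the y-range for c, and a small k so the exponential stays well-scaled
p_guess = (np.median(x), y.min(), y.max() - y.min(), 0.01)
p, ier = leastsq(residuals, p_guess, args=(x, y))
print(p)
```

Because the guess is derived from the data itself, the same recipe keeps working after the data has been resized to a different range, which is what the question is really asking for.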

