Saturday, 28 February 2015

Review 5.5: Learn from User Clicks - Feeding Forward


We can also use users' feedback in the form of user behavior, for example, which link a user clicks after the ranked results are displayed. A classical way to learn from this signal is an artificial neural network.

In neural networks, we often use an S-shaped (sigmoid-like) function such as the hyperbolic tangent (tanh) to determine how a node responds to its input. There are two reasons to use it. First, the output falls in (-1, 1), so it is easy to rank. Second, the output changes more and more slowly as the input grows larger. This is just what we want, because we cannot assume every user will click on an answer that is appropriate for everyone.
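A minimal sketch of these two properties, using Scala's built-in math.tanh (the values are chosen only for illustration):

```scala
object TanhDemo extends App {
  // 1) tanh squashes any input into the open interval (-1, 1)
  val inputs  = List(-5.0, -1.0, 0.0, 1.0, 5.0)
  val outputs = inputs.map(math.tanh)
  outputs.foreach(o => assert(o > -1.0 && o < 1.0))

  // 2) the curve flattens as the input grows: stepping from 1 to 2
  //    changes the output far more than stepping from 4 to 5
  val smallInputStep = math.tanh(2.0) - math.tanh(1.0)
  val largeInputStep = math.tanh(5.0) - math.tanh(4.0)
  assert(smallInputStep > largeInputStep)

  println(outputs.map(o => f"$o%.3f").mkString(", "))
}
```

The flattening slope is what keeps one unusually strong input from dominating the node's response.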

Now, let us go into how the feedforward algorithm works. Here is the sample code:
 def createTables() {
  val sqllist = List("create table if not exists hiddennode(create_key)",
      "create table if not exists wordhidden(fromid, toid, strength)",
      "create table if not exists hiddenlink(fromid, toid, strength)")
  // executing the statements against the database is omitted here
 }

 def setupNetwork(wordids: List[Int], linkids: List[Int]) = {
  this.wordids = wordids
  this.linkids = linkids
  this.hiddenids = getAllHiddenIds()
  // node activations, all initialised to 1.0
  ai = wordids.map(_ ⇒ 1.0)
  ah = hiddenids.map(_ ⇒ 1.0)
  ao = linkids.map(_ ⇒ 1.0)
  // weight matrices: input → hidden (si) and hidden → output (so)
  si = (for ( i ← 0 until wordids.length ) yield {
    (for ( j ← 0 until hiddenids.length )
      yield getStrength(wordids(i), hiddenids(j), 0)).toList
  }).toList
  so = (for ( i ← 0 until hiddenids.length ) yield {
    (for ( j ← 0 until linkids.length )
      yield getStrength(hiddenids(i), linkids(j), 1)).toList
  }).toList
 }

 def feedForward() = {
  // hidden activations: tanh of the weighted sum of the inputs
  this.ah = (for ( i ← 0 until hiddenids.length ) yield {
    var sum = 0.0
    for ( j ← 0 until wordids.length ) {
      sum += ai(j) * si(j)(i)
    }
    math.tanh(sum)
  }).toList
  // output activations: tanh of the weighted sum of the hidden outputs
  this.ao = (for ( i ← 0 until linkids.length ) yield {
    var sum = 0.0
    for ( j ← 0 until hiddenids.length ) {
      sum += ah(j) * so(j)(i)
    }
    math.tanh(sum)
  }).toList
  ao
 }

The algorithm works by looping over all hidden layer nodes and summing the outputs of the input layer multiplied by the strengths of the corresponding links. The output of each hidden node is the tanh of this weighted sum, which is passed on to the output layer. The output layer does the same thing: it multiplies the outputs of the previous layer by the strengths of their links, and then applies tanh to get the final output.
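The two passes above can be sketched as a self-contained numeric example. The network shape (2 input words, 1 hidden node, 2 output links) and the weight values are hypothetical, chosen only to make the arithmetic easy to follow:

```scala
object FeedForwardDemo extends App {
  // hypothetical strengths for a 2-input, 1-hidden, 2-output network
  val si = List(List(0.5), List(0.5))   // input → hidden strengths
  val so = List(List(0.3, -0.3))        // hidden → output strengths
  val ai = List(1.0, 1.0)               // input activations

  // hidden activations: tanh of the weighted sum of the inputs
  val ah = (0 until 1).map { i =>
    math.tanh((0 until ai.length).map(j => ai(j) * si(j)(i)).sum)
  }.toList

  // output activations: tanh of the weighted sum of the hidden outputs
  val ao = (0 until 2).map { i =>
    math.tanh((0 until ah.length).map(j => ah(j) * so(j)(i)).sum)
  }.toList

  println(ao)
}
```

Because the two output links carry opposite strengths (0.3 and -0.3), the two outputs come out as mirror images of each other, which makes the weighted-sum-then-tanh structure easy to verify by hand.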

Here is an example of running the code:

Because the network has not yet been trained, it gives the same rank to all URLs.
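A quick sketch of why the ranks tie: before training, every hidden-to-link strength still carries the same default value (0.1 here is an assumed placeholder, not the value the book's code uses), so every URL receives an identical score:

```scala
object UntrainedDemo extends App {
  // assumed uniform default strength before any training
  val defaultStrength = 0.1
  val ah   = List(0.76, 0.76)   // hidden activations (arbitrary values)
  val urls = 3
  // every hidden → link strength is identical
  val so = List.fill(ah.length)(List.fill(urls)(defaultStrength))

  val ao = (0 until urls).map { i =>
    math.tanh((0 until ah.length).map(j => ah(j) * so(j)(i)).sum)
  }.toList

  // all URLs end up with exactly the same score, hence the same rank
  assert(ao.distinct.length == 1)
  println(ao)
}
```

Only after training nudges individual strengths apart do the output scores separate and produce a meaningful ranking.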
