javascript, arrays, algorithm, iterative-deepening

Having trouble optimizing Iterative Deepening Rush Hour algorithm in JavaScript


I have a school assignment to write an iterative deepening algorithm that solves a 6x6 Rush Hour puzzle. I chose to use JavaScript of all things because I need the practice. I am, however, having big trouble optimizing the algorithm.

I tried to solve a puzzle whose solution is 8 levels deep in the tree, and found that I visited 7,350,669 nodes and it took my computer almost 13 minutes to solve.

I am looking for tips and for help in understanding the algorithm itself.

I made two classes, Node and Vehicle. The implementation of these might be part of the problem:

class Vehicle {
  constructor(x, y, length, horizontal) {
    this.x = x; // x position of the upper/left block of the vehicle
    this.y = y; // y position
    this.length = length; // length of the vehicle
    this.horizontal = horizontal; // boolean - false if vertical
  }
}

class Node {
  constructor(grid, vehicles, moved, depth) {
    this.grid = grid; // a 6x6 char array grid
    this.vehicles = vehicles; // array of vehicles on the game board
    this.moved = moved; // index of the vehicle moved in the last turn
    this.depth = depth; // depth of this node
  }
}

Am I right in assuming that having both a two-dimensional array for the grid and an array of the vehicles is overkill? When checking for possible moves I iterate over the vehicles array, but use the grid just to quickly check whether the vehicle has a free path ahead. I will get back to the problem I see with this.

I cannot post the code for the algorithm publicly, but here is how I understood IDDFS and implemented the algorithm (a rough sketch of these steps follows the list):

  1. Check whether the current node is the final state; if it is, add it to a solution array and return true.
  2. Otherwise, check whether you have reached maxDepth; if so, return false.
  3. If not, then for each vehicle in this state create new nodes for all the moves it can make (except for the vehicle that was moved in the last move).
  4. Visit all the children you just made (back to step 1) and delete them if they return false.
  5. If none of the children turn out to be the final state, go back to the parent node and explore its neighbours. Otherwise return true, and a chain reaction of returns ensues.
  6. Repeat until you find the final state. If you get back to the top, increase maxDepth by 1 and repeat the whole process.
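
Since I cannot share the actual code, here is a minimal JavaScript sketch of how I understood those steps (not my real implementation; isSolved and childrenOf are placeholders standing in for my goal test and move generator, and depthCap is just a safety bound):

// Placeholders for my real goal test and move generator (not shown).
// childrenOf is assumed to skip the vehicle stored in node.moved.
function isSolved(node) { /* does the red car reach the exit? */ return false; }
function childrenOf(node) { /* all legal single-step moves */ return []; }

function depthLimitedSearch(node, maxDepth, solution) {
  if (isSolved(node)) {           // step 1: goal test
    solution.push(node);
    return true;
  }
  if (node.depth >= maxDepth) {   // step 2: depth cutoff
    return false;
  }
  for (const child of childrenOf(node)) {   // step 3: expand moves
    // step 4: failed children are simply dropped (nothing keeps a reference)
    if (depthLimitedSearch(child, maxDepth, solution)) {
      solution.push(node);        // step 5: unwind along the solution path
      return true;
    }
  }
  return false;                   // step 5: backtrack to the parent
}

function iterativeDeepening(root, depthCap = 50) {
  // step 6: rerun the whole depth-limited search with a larger limit
  for (let maxDepth = 1; maxDepth <= depthCap; maxDepth++) {
    const solution = [];
    if (depthLimitedSearch(root, maxDepth, solution)) {
      return solution.reverse();  // path from the root to the final state
    }
  }
  return null;                    // no solution within the cap
}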

One problem I see is that my data structure might be a bit complicated. Because JavaScript passes objects and arrays of objects by reference, they have to be deep-copied using:

JSON.parse(JSON.stringify(node))

The major question here is: did I miss something? Is it correct to delete "bad" child nodes and keep going through the entire tree again and again in an iterative deepening algorithm, or did I misunderstand that? Are the nodes just supposed to be marked as "checked" and kept, so that they can be revisited later and the whole tree does not have to be generated again?


Solution

  • First, you have to profile your code execution. Otherwise it's guesswork and you could spend a lot of time changing your code for a 0.1% gain.

    Keeping the above in mind, you can clone objects much faster when you know their structure (as is the case here) by manually copying each property rather than using stringify.
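
    For example, a hand-rolled clone for the two classes in the question might look roughly like this (a sketch with a hypothetical cloneNode helper, assuming grid is a plain 6x6 array of single-character strings):

    function cloneNode(node) {
      // copy each row of the 6x6 grid; the characters themselves are primitives
      const grid = node.grid.map(row => row.slice());
      // Vehicle holds only primitives, so a field-by-field copy is a deep copy
      const vehicles = node.vehicles.map(
        v => new Vehicle(v.x, v.y, v.length, v.horizontal)
      );
      return new Node(grid, vehicles, node.moved, node.depth);
    }

    Besides skipping the string round trip, this also preserves the Vehicle and Node prototypes, which JSON.parse(JSON.stringify(...)) discards.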