A week in the trenches with SL's new pathfinding AI.

It certainly is an exciting time in SL! Pathfinding is Second Life's newest feature, and it's just around the corner. Content developers such as myself have been working on pathfinding for quite some time, but the limitations of LSL have kept all but the simplest pathfinding out of Second Life. For the past week I've been spending time on the beta grid testing out the new feature.

Currently, I have buzzards and beetles that crawl across my estate for people to kill and get resources from, but they don't use this new AI. In the past I've had Raiders and Marauders for people to combat, but their slow run times made them easier targets than I wanted them to be. The cooperation system between them wasn't as good as it could be, because they could only make simple choices about their nearby environment. They couldn't plan a route across a whole region to aid their allies.

The beauty of SL is that people can make whatever they want, and the way they rent the resources to do so is to ‘own’ virtual land. It's also a double-edged sword in that the environment is constantly changing. One of the biggest hurdles for even simple AI is knowing whether it's okay to go onto a parcel of land, or as I like to call it, “Is it okay to step here?”. Because I don't own all the land in the estate, the simple LSL AI needs to make a lot of decisions about its next step. It examines its next step with the following questions in mind (a rough LSL sketch of these checks follows the list):

  1. Does my next step put me on land where scripts are disallowed for me or the group I am in?
  2. Does my next step take me into a parcel I'm even allowed to enter, and if not, does that parcel share my group?
  3. If I can cross into the parcel, and scripts are allowed, are there enough prims to host my presence?
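
In LSL those checks look roughly like this. The parcel functions and flags are the standard ones, but the okToStepTo helper, the group comparison, and the one-prim threshold are just my own sketch:

```lsl
// Hypothetical helper: returns TRUE if it looks safe to step to "dest".
// Uses standard LSL parcel queries; the structure and thresholds are my own.
integer okToStepTo(vector dest)
{
    integer flags = llGetParcelFlags(dest);
    key parcelGroup = llList2Key(llGetParcelDetails(dest, [PARCEL_DETAILS_GROUP]), 0);
    key myGroup = llList2Key(llGetObjectDetails(llGetKey(), [OBJECT_GROUP]), 0);
    integer sameGroup = (parcelGroup == myGroup);

    // 1. Are scripts allowed to run for me (or my group) on that parcel?
    if (!(flags & PARCEL_FLAG_ALLOW_SCRIPTS))
    {
        if (!(sameGroup && (flags & PARCEL_FLAG_ALLOW_GROUP_SCRIPTS))) return FALSE;
    }

    // 2. Can my object even enter the parcel?
    if (!(flags & PARCEL_FLAG_ALLOW_ALL_OBJECT_ENTRY))
    {
        if (!(sameGroup && (flags & PARCEL_FLAG_ALLOW_GROUP_OBJECT_ENTRY))) return FALSE;
    }

    // 3. Is there prim capacity left to host my presence?
    integer used = llGetParcelPrimCount(dest, PARCEL_COUNT_TOTAL, FALSE);
    integer max = llGetParcelMaxPrims(dest, FALSE);
    if (max - used < 1) return FALSE;

    return TRUE;
}

default
{
    touch_start(integer n)
    {
        vector next = llGetPos() + <1.0, 0.0, 0.0>;
        if (okToStepTo(next)) llOwnerSay("Safe to step.");
        else llOwnerSay("Blocked.");
    }
}
```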

If the tiny AI object can't cross into the parcel, it has to think a few steps ahead and try to plan the shortest route around the parcel. Usually that's the big hang-up, because of the 64 KB memory limit on any single script. Eventually I started trying to implement a simple potential field in SL to help with navigation, but the execution time was terrible. I would have tried again now that the llList* functions have been sped up, but…
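
For the curious, this is roughly the shape of the potential-field step I was computing, reconstructed as a sketch; the OBSTACLES list, the gains, and the fall-off distance are made-up values for illustration. The point is that every single step means walking a list in LSL, which is where the execution time went:

```lsl
// Rough sketch of one potential-field step: attraction toward the goal,
// repulsion away from each known obstacle. Obstacle positions and gains
// are illustrative only.
list OBSTACLES = [<128.0, 120.0, 22.0>, <132.0, 140.0, 22.0>];

vector fieldStep(vector here, vector goal)
{
    vector force = llVecNorm(goal - here); // unit pull toward the goal

    integer i;
    integer n = llGetListLength(OBSTACLES);
    for (i = 0; i < n; ++i)
    {
        vector obs = llList2Vector(OBSTACLES, i);
        vector away = here - obs;
        float dist = llVecMag(away);
        if (dist < 8.0 && dist > 0.01)
        {
            // Repulsion falls off with the square of the distance.
            force += llVecNorm(away) * (4.0 / (dist * dist));
        }
    }
    return here + llVecNorm(force); // next ~1 m step to move toward
}

default
{
    state_entry()
    {
        vector next = fieldStep(llGetPos(), <200.0, 128.0, 22.0>);
        llOwnerSay("Next step: " + (string)next);
    }
}
```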

Right now I am playing with pathfinding in SL. It's been confirmed that it's an API extension of Havok-AI (video). A character (as LL is calling it) can be created in as little as 4 lines of code and have no script-time impact. Having tested it myself, I can confirm that in its current implementation, once a character is defined it will continue running even on no-script land, though it may be brain-dead. That means the physics engine is doing all the heavy lifting, and an object's behavior only changes when it's told to change by a script. This is great because we get to use the physics engine for more than placing avatars, shooting bullets, or playing Jenga.
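
As a taste, here's a minimal sketch of what that looks like; the CHARACTER_* option values and the destination offset are just numbers I've been playing with, not anything official:

```lsl
// Minimal pathfinding character: create it, then send it somewhere.
// Option values and the destination are example values only.
default
{
    state_entry()
    {
        llCreateCharacter([CHARACTER_DESIRED_SPEED, 4.0, CHARACTER_RADIUS, 0.5]);
        llNavigateTo(llGetPos() + <20.0, 0.0, 0.0>, []);
    }

    path_update(integer type, list reserved)
    {
        if (type == PU_GOAL_REACHED) llOwnerSay("Arrived.");
    }
}
```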

One of the Moles (a contractor for LL) was on the beta grid getting my feedback and answering my questions as best he could. From what I could gather, in order for the pathfinding AI to work, content creators will have to set some flags on their prims to help build a nav-mesh. A nav-mesh is what the AI looks at to determine where it can go. Floors, stairs, and ramps will need to be flagged as walkable, while walls and furnishings will need to be flagged as obstacles. Things that aren't flagged will be ‘ignored’, and my best understanding is that they're treated as obstacles. Land owners will be able to bake the mesh for their parcel, and estate owners will be able to bake the mesh for the region.
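
If I'm reading the docs right, you can already poke at a baked mesh from a script with llGetStaticPath, which hands back a list of waypoints with a status code tacked on the end. A hedged sketch, where the coordinates and character type are just example values:

```lsl
// Hedged sketch: ask the baked nav-mesh for a path between two points.
// As I understand it, the last list element is a status code and the
// rest are waypoint positions; coordinates here are examples only.
default
{
    touch_start(integer n)
    {
        vector start = llGetPos();
        vector goal = start + <30.0, 0.0, 0.0>;
        list result = llGetStaticPath(start, goal, 0.5, [CHARACTER_TYPE, CHARACTER_TYPE_A]);
        integer status = llList2Integer(result, -1);
        llOwnerSay("Status " + (string)status + ", waypoints: "
                   + (string)(llGetListLength(result) - 1));
    }
}
```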

Currently there are a lot of bugs, but that's to be expected with any new feature, especially in beta. One of the biggest is that it currently doesn't check for all the small steps I mentioned earlier. Sure, the AI can be aware of its physical surroundings, but it also needs to be aware of its system surroundings. I've filed a bug on the JIRA about this, and another issue relating to collisions. One big feature I find lacking currently is support for flying or swimming things. I believe once LL gets done with basic object and terrain walking, they'll expand the feature to support different types of AI. Until then, come test on the beta grid!

Here’s the full official video from Linden Lab about pathfinding.