Sunday, April 6, 2008

filaments

OK - so we need visual feedback about the manipulative control potential of limbs projecting from avatars.

Using inverse kinematics is supposedly computationally prohibitive if you are doing it in a "robot arm" simulation kind of way. I have two solutions:

1) Use real AI (which hasn't been developed yet) to control the limb sections of avatars via neurons receiving sensor data from the limbs and sending control signals back to the limb sections.

2) Here's what we can do right now: filaments. Make the line that extends from an avatar's hands actually do things. Way less calculation, and it still shows other avatars a physical connection between an avatar and a thing that can be manipulated. Make the color of a filament mean something; display menus along filaments (see the sketch just below).
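
Here is a minimal sketch of what a filament record might carry. The names (Filament, FilamentColor, the color meanings, the menu-as-labels idea) are all hypothetical illustrations, not anything that exists yet:

```python
from dataclasses import dataclass, field
from enum import Enum


class FilamentColor(Enum):
    """Hypothetical color semantics: what the filament's hue signals to onlookers."""
    GRASP = "green"    # avatar is physically moving the object
    QUERY = "blue"     # avatar is inspecting the object
    SCRIPT = "orange"  # avatar is editing the object's behavior


@dataclass
class Filament:
    """A visible line from an avatar's hand to a control point on an object."""
    avatar_id: str
    object_id: str
    control_point: str    # named attachment point offered by the object
    color: FilamentColor
    menu: list[str] = field(default_factory=list)  # labels displayed along the filament
```

The point of the sketch is just that a filament is data plus a picture: one small record tells the renderer what to draw and tells other avatars what kind of manipulation is going on.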


Was it Zelazny's Changeling where the guy with the natural magical skills comes home and sees all these crazy filaments of different colors that are an interface to the magic he can access?

Now, I am not suggesting we try to model knots... rather, all filaments connect to objects at specific control points offered by the objects. What we are working with here is a 3D "network graph" of nodes (objects and avatars in the metaverse) and filaments (which serve as edges as well as control mechanisms).
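
A rough sketch of that graph idea, with made-up names (MetaverseGraph, offer_control_point, attach) used purely for illustration: objects advertise named control points, and a filament only becomes an edge if it attaches at one of them.

```python
from collections import defaultdict


class MetaverseGraph:
    """Nodes are avatars and objects; filaments are edges attached at control points."""

    def __init__(self):
        self.control_points = {}            # object_id -> set of control points the object offers
        self.filaments = defaultdict(list)  # avatar_id -> list of (object_id, control_point) edges

    def offer_control_point(self, object_id: str, point: str) -> None:
        """An object advertises a named attachment point."""
        self.control_points.setdefault(object_id, set()).add(point)

    def attach(self, avatar_id: str, object_id: str, point: str) -> bool:
        """Attach a filament only at a control point the object actually offers."""
        if point not in self.control_points.get(object_id, set()):
            return False
        self.filaments[avatar_id].append((object_id, point))
        return True


# Usage sketch:
graph = MetaverseGraph()
graph.offer_control_point("door_42", "handle")
print(graph.attach("ben", "door_42", "handle"))  # True: valid control point
print(graph.attach("ben", "door_42", "hinge"))   # False: not offered by the object
```

Keeping attachment restricted to offered control points is what lets us skip the knots and the inverse kinematics: the graph only ever has to track which avatar is connected to which point, not how the line physically drapes between them.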

Many thanks to Patrick Lee, whom I hung out with back in my comp sci days at FSU; he was fascinated by the way magical user interfaces were described and loaned me the book.

Ben