 * move [LOCATION] - bot moves to the specified location using native pathfinding. It would be nice if it also used Quake's goal system along the way; that would give us free puzzle-solving, button-pushing, and surfacing for air.
 * move [OBJECT] - finds the location of the specified object and pathfinds to it using move [LOCATION].
 * echo [STRING] - reports a string back over the socket to the user (see the client sketch after this list).
 * say [STRING] - Quagent "speaks" the STRING aloud. Has limited range. Could be presented to the user via speech synthesis. Used during Quagent collaboration.
 * look [???] - sends an image over the socket back to the user representing what the agent can see - may be zoomed or variable-resolution? Use OpenGL.
 * lookdepth [???] - like look, but sends the OpenGL Z-buffer rather than the framebuffer back over the socket.
 * listen [TIME] - sends sounds back to the user for the specified time. This seems like it might be hard.
 * Invokable Quake Behavior (Implement last)
   * whereareyou? - returns what the bot usually says if you ask it that in a team game ("I'm by the rail gun in the blue base," that sort of thing).
   * followsmart [ENTITY] - smartly follows the given character using Quake's follow chat command.
 * Sensors
   * currentItem - returns a description of the currently held items.
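The commands above are plain text sent over a TCP socket, so a controller written in any language could drive a Quagent. Below is a minimal Python sketch of such a client; the host, port, newline-framed commands, and one-line text replies are all assumptions made for illustration, not a protocol this document specifies.

    # Minimal sketch of a Quagent-style client.  Everything concrete here --
    # host, port, newline-framed commands, one-line text replies -- is an
    # assumption for illustration, not a protocol this document specifies.
    import socket

    class QuagentClient:
        def __init__(self, host="localhost", port=6000):
            # One TCP connection per bot; the bot is assumed to appear in
            # the game when the connection is opened.
            self.sock = socket.create_connection((host, port))
            self.io = self.sock.makefile("rw", encoding="ascii", newline="\n")

        def send(self, command):
            # Write one command line, read one reply line.
            self.io.write(command + "\n")
            self.io.flush()
            return self.io.readline().strip()

        def close(self):
            self.io.close()
            self.sock.close()

    if __name__ == "__main__":
        bot = QuagentClient()
        print(bot.send("echo hello"))     # expect the string echoed back
        print(bot.send("move rail_gun"))  # move [OBJECT]: pathfind to an item
        print(bot.send("say follow me"))  # audible only to nearby Quagents
        bot.close()

Keeping the protocol line-oriented like this would make the bot-side parser in the game code trivial and let one small client exercise echo, move, say, and the rest without per-command plumbing.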