Danc of Lost Garden has posted a new game challenge that includes a design and graphics.
Under the cover of "testing the latest pyglet features" (but really
for the hell of it) I made a quick stab at it last night and today.
You can play with peas-1.zip; you'll need pyglet trunk r1760. It's nothing like
complete: you can place and remove blocks, and the peas will A* themselves up to
the highest point and "ninja" themselves off it; but the scoring, flag
raising and deathliness are missing.
The physics seems quite stable (using Verlet integration with a fixed timestep),
but the contact constraints on the corners are doubled, which is why the peas
seem to gain energy when bouncing off walls.
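For reference, position Verlet with a fixed timestep can be sketched like this. The names and constants are invented for illustration and are not taken from the peas source:

```python
# Minimal position-Verlet sketch with a fixed timestep and an
# accumulator, so the physics rate is decoupled from the frame rate.

DT = 1.0 / 120.0    # fixed physics step, an arbitrary choice here
GRAVITY = -400.0    # pixels / s^2, also arbitrary for this sketch

class Particle:
    def __init__(self, x, y):
        self.x, self.y = x, y
        # Velocity is implicit: (pos - prev) / DT
        self.prev_x, self.prev_y = x, y

    def step(self):
        # Position Verlet: next = 2*pos - prev + a*dt^2
        nx = 2.0 * self.x - self.prev_x
        ny = 2.0 * self.y - self.prev_y + GRAVITY * DT * DT
        self.prev_x, self.prev_y = self.x, self.y
        self.x, self.y = nx, ny

accumulator = 0.0

def update(frame_dt, particles):
    # Run as many fixed steps as the elapsed frame time covers.
    global accumulator
    accumulator += frame_dt
    while accumulator >= DT:
        for p in particles:
            p.step()
        accumulator -= DT
```

Contact constraints would then be applied by projecting positions after each step; doubling a constraint at a corner injects energy, which matches the bouncy walls described above.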
I made a botch of the AI movement: every specific pair of block faces that can
be moved between has to be listed explicitly (e.g., left side of standard block
to top side of left ramp, ...). I didn't get around to listing them all, so the
peas will refuse to move around certain arrangements that look trivially
passable. If you just place standard blocks they'll do fine. I'd rewrite this
as something a bit cleverer if I were going to continue any further...
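The kind of hand-listed table I mean looks roughly like this; every name here is invented and the real code is more involved, but it shows why any omitted pairing makes a pea refuse a move that looks trivial:

```python
# Hypothetical hand-enumerated movement table: each legal hop between
# (block type, face) pairs must be listed explicitly, and the A* search
# can only traverse edges that appear here.

LEGAL_MOVES = {
    ('block', 'left'):    [('ramp_left', 'top'), ('block', 'top')],
    ('block', 'top'):     [('block', 'left'), ('block', 'right')],
    ('ramp_left', 'top'): [('block', 'left')],
    # ... every other pairing has to be written out by hand; anything
    # forgotten becomes a move the peas silently refuse.
}

def can_move(src, dst):
    # True only if the hop was explicitly enumerated above.
    return dst in LEGAL_MOVES.get(src, [])
```

A cleverer rewrite would derive these edges from the block geometry instead of enumerating them.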
...which I don't think I will. These peas are cute, but I don't think the game
is going to turn out all that great, relative to the effort required to fix up
the AI and physics.
I was able to pick up a few pyglet bugs writing this. The sprite module seems
(now) pretty solid and absolutely useful for this kind of game -- I didn't need
to use pyglet.gl at all.
I ported delta-v to pyglet 1.0 beta 3, and also took the opportunity to fix some gameplay issues:
* Baddies now die when shot, rather than just being pushed away.
* The field of view is reduced.
* There is a more obvious visual indicator when the player is hurt.
* The player's health now regenerates gradually over time.
* Motion blur is greatly reduced.
* Survival mode score resets to zero before each attempt.
This makes the game far less frustrating.
Much improved, as predicted.
I ripped up the front of the tribot and replaced the jaws with an articulated
light sensor, which can sweep around +/-60 degrees from center. It took quite a
while to figure out how to gear it and get everything solid enough, what with
all these new-fangled Lego pieces I've not seen before.
This is what it does:
It's much faster than the last one -- when it gets it right, it can work out the
curvature of the line and steer exactly along it. When it loses the line, it
backtracks a little to find it. At the end of the video you can see the sensor
can't pick up the sharp turn in the line and the robot drives off into oblivion
(more work required).
On the programming side of things, I'm getting more used to the NXT environment.
I used two threads: one to control the sensor motor and one for the drive
motors. They communicate through two semaphores: the angle of the sensor motor,
and a flag that gets raised when the sensor has lost the line.
It would probably be feasible to do it in a single run-loop, albeit with more
state variables, and the NXT tool has fits with too many nested conditionals (it
also has fits about most other things, to be fair).
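The real program is graphical NXT-G, but the two-thread structure can be sketched as a rough Python analogue. All names here are invented; the shared state stands in for the two "semaphores" described above:

```python
import threading
import time

lock = threading.Lock()
sensor_angle = 0.0            # last angle at which the line was seen
line_lost = threading.Event() # raised when the sweep can't find the line
stop = threading.Event()

def steering(angle):
    # Map the sensor angle (+/-60 degrees) onto a -1..1 steering value.
    return max(-1.0, min(1.0, angle / 60.0))

def read_light():
    # Stand-in for the NXT light sensor; returns a 0-100 brightness.
    return 50

def sensor_thread():
    # Sweep the sensor back and forth, recording where the line is.
    global sensor_angle
    angle, step = 0.0, 5.0
    while not stop.is_set():
        angle += step
        if abs(angle) >= 60.0:
            step = -step              # reverse the sweep at the end stops
        if read_light() < 40:         # dark enough: on the line
            with lock:
                sensor_angle = angle
            line_lost.clear()
        else:
            line_lost.set()
        time.sleep(0.01)

def drive_thread():
    # Steer toward the last known line angle; back up when it's lost.
    while not stop.is_set():
        if line_lost.is_set():
            pass                      # reverse a little to re-find the line
        else:
            with lock:
                steer = steering(sensor_angle)
            # set left/right drive motor powers from `steer` here
        time.sleep(0.02)
```

In NXT-G the two loops run as parallel sequence beams rather than OS threads, but the data flow is the same shape.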
I find I have to tune the light/dark thresholds very carefully to avoid
incorrect readings, and I have to re-adjust them throughout the day, by trial
and error, as the ambient light changes. I still haven't had any luck with the
calibration.
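One way around the hand-tuned thresholds would be a quick calibration pass at startup that normalizes raw readings before classifying them. This is a hypothetical sketch, not what the NXT software's calibration actually does:

```python
# Build a light/dark classifier from a few sample readings taken over
# both the line and the floor under the current ambient light.

def make_classifier(samples):
    lo, hi = min(samples), max(samples)
    span = max(hi - lo, 1)

    def is_dark(reading):
        # Normalize into 0..1 so one threshold survives ambient changes,
        # provided calibration is redone under the new lighting.
        return (reading - lo) / span < 0.5

    return is_dark
```

The robot would sweep the sensor over line and floor once at power-on to collect the samples.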
Here's the source code; this time I've added some annotations that might give a
feel for what it's all about. Note that this is the only view of the program
you're given, and it can't be zoomed out (nor are there scrollbars...).
First Mindstorms robot. Here it is in action:
I started out with the tribot, which has building instructions but no real
programs. I had to rip up the front of it to move the light sensor further from
the floor -- in its initial position it couldn't tell the difference between
white and black.
You can show the current readings of the sensors on the device, which would be
handy when programming thresholds, but they don't seem to match up at all... so
it needed quite a bit of tweaking.
Here's the source code:
Lots of room for improvement...