I felt charty this evening, and this is what happened:
That's a breakdown by release and distribution of pyglet, from 1.0 alpha 1
through to 1.1 alpha 1 (alpha 2, released only hours ago, doesn't yet rate on this
chart).
Interesting to note: the eggs are popular for release, but not development. The
only reason they're featured in 1.0 alpha 1 is that they were actually linked
from the download page then. (Now people are only likely to grab an egg if
they're using easy_install from the command line).
The Windows installer is quite popular, which makes me glad I spent so much
time on it. (I'm not making installers available for the 1.1 alpha releases.)
Here are the download stats for the packages including documentation/examples.
Looks to me like the number of developers isn't really increasing, but perhaps
those developers are testing on more machines, and perhaps there are some runtime
installations being made on non-developer machines. Or maybe there are more
developers but they're happy to use old or online documentation.
There are currently 271 pyglet-users members (or 269, depending on which counter
on the same page you read).
Danc of Lost Garden has posted a new game challenge that includes a design and
graphics.
Under the cover of "testing the latest pyglet features" (but really
for the hell of it) I made a quick stab at it last night and today.
You can play with peas-1.zip; you'll need pyglet trunk r1760. It's nothing like
complete: you can place and remove blocks, and the peas will A* themselves up to
the highest point and "ninja" themselves off it, but the scoring, flag
raising and deathliness are missing.
The physics seems quite stable (using Verlet integration with a fixed timestep),
but the contact constraints on the corners are doubled, which is why the peas
seem to gain energy when bouncing off walls.
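The scheme above might be sketched like this (position Verlet with a fixed timestep; the names and constants are mine, not the game's):

```python
# Position Verlet with a fixed timestep: velocity is implicit in the
# difference between the current and previous positions.

DT = 1.0 / 60.0       # fixed timestep (an assumed value, not the game's)
GRAVITY = -9.8        # constant downward acceleration

def verlet_step(pos, prev_pos, accel, dt=DT):
    """Advance one coordinate by a single position-Verlet step."""
    new_pos = 2.0 * pos - prev_pos + accel * dt * dt
    return new_pos, pos   # current position becomes the new "previous"

# Drop a pea from rest at y = 10: after one step it has fallen g*dt**2.
y, prev_y = 10.0, 10.0
y, prev_y = verlet_step(y, prev_y, GRAVITY)
```

One nice property of this integrator is that constraints (like contacts) can be enforced by just moving positions; the implicit velocity follows along, which is part of why the stacking stays stable.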
I made a botch of the AI movement: every specific combination of blocks that can
be moved between has to be listed (e.g., left side of standard block to top side
of left ramp, ...). I didn't get around to listing them all, so the peas will
refuse to move around certain objects that seem trivial. If you just place
standard blocks they'll do fine. I would rewrite this as something a bit
cleverer if I were going to continue any further...
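The hand-listed approach might look something like this sketch: a table of enumerated face-to-face transitions searched with A*. The face names, costs, and functions are illustrative, not from the actual code; a cleverer rewrite would derive the edges from block geometry instead of listing them.

```python
import heapq

# Every legal (from_face, to_face) transition is enumerated explicitly --
# the approach described above. Miss one, and peas refuse that move.
EDGES = {
    ('block_left', 'ramp_top'): 1,
    ('ramp_top', 'block_top'): 1,
    ('block_top', 'block_right'): 1,
}

def neighbours(face):
    return [(b, cost) for (a, b), cost in EDGES.items() if a == face]

def a_star(start, goal, heuristic=lambda face: 0):
    """A* over the enumerated face graph (Dijkstra with a zero heuristic)."""
    frontier = [(heuristic(start), 0, start, [start])]
    seen = set()
    while frontier:
        _, g, face, path = heapq.heappop(frontier)
        if face == goal:
            return path
        if face in seen:
            continue
        seen.add(face)
        for nxt, step in neighbours(face):
            heapq.heappush(frontier,
                           (g + step + heuristic(nxt), g + step, nxt, path + [nxt]))
    return None   # no listed transition: the pea refuses to move

path = a_star('block_left', 'block_right')
```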
...which I don't think I will. These peas are cute, but I don't think the game
is going to turn out all that great, relative to the effort required to fix up
the AI and physics.
I was able to pick up a few pyglet bugs writing this. The sprite module seems
(now) pretty solid and absolutely useful for this kind of game -- I didn't need
to use pyglet.gl at all.
I ported delta-v to pyglet 1.0 beta 3, and also took the opportunity to fix some
issues:
* Baddies now die when shot, rather than just being pushed away.
* The field of view is reduced.
* There is a more obvious visual indicator when the player is hurt.
* The player's health is now restored gradually over time.
* Motion blur is greatly reduced.
* Survival mode score resets to zero before each attempt.
This makes the game far less frustrating.
Much improved, as predicted.
I ripped up the front of the tribot and replaced the jaws with an articulated
light sensor, which can sweep roughly 60 degrees either side of center. It took
quite a while to figure out how to gear it and get everything solid enough, what
with all these new-fangled Lego pieces I've not seen before.
This is what it does:
It's much faster than the last one -- when it gets it right, it can work out the
curvature of the line and steer exactly along it. When it loses the line, it
backtracks a little to find it. At the end of the video you can see the sensor
can't pick up the sharp turn in the line and the robot drives off into oblivion
(more work required).
On the programming side of things, I'm getting more used to the NXT environment.
I used two threads: one to control the sensor motor and one for the drive
motors. They communicate through two semaphores: the angle of the sensor motor,
and a flag that gets raised when the sensor has lost the line.
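The structure described above might be sketched in plain Python like this (the real program is NXT-G blocks, not Python; the angles, timings, and names here are all illustrative):

```python
import threading
import time

# Shared state standing in for the two "semaphores" described above.
sensor_angle = 0.0                # angle of the sensor motor, degrees
line_lost = threading.Event()     # raised when the sensor loses the line
stop = threading.Event()
lock = threading.Lock()

def sensor_thread():
    """Sweep the sensor, publish its angle, and flag a lost line."""
    global sensor_angle
    for angle in (-60, -30, 0, 30, 60):
        with lock:
            sensor_angle = angle
        if abs(angle) == 60:      # pretend the line vanishes at the extremes
            line_lost.set()
        time.sleep(0.01)
    stop.set()

def drive_thread(log):
    """Steer from the published angle; backtrack when the line is lost."""
    while not stop.is_set():
        if line_lost.is_set():
            line_lost.clear()
            log.append('backtrack')
        else:
            with lock:
                log.append(('steer', sensor_angle))
        time.sleep(0.01)
    if line_lost.is_set():        # catch a flag raised just before stopping
        log.append('backtrack')

log = []
threads = [threading.Thread(target=sensor_thread),
           threading.Thread(target=drive_thread, args=(log,))]
for t in threads:
    t.start()
for t in threads:
    t.join()
```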
It probably would be feasible to do it in a single run-loop, albeit with more
state variables, but the NXT tool has fits with too many nested conditionals (it
also has fits about most other things, to be fair).
I find I have to tune the tolerances of the light/dark sensing very carefully
to avoid incorrect readings, and I have to re-adjust them throughout the
day, by trial and error, as the ambient light changes. I still haven't had
any luck with the calibration.
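One way the tolerance problem is sometimes tackled is hysteresis: two thresholds instead of one, so readings jittering around a single cut-off don't flip the state back and forth. A minimal sketch, with illustrative threshold values rather than the robot's actual tuning:

```python
# Hysteresis thresholding: the on-line/off-line state only flips when the
# reading crosses the *far* threshold, which damps flicker around a single
# cut-off. The threshold values here are illustrative, not the robot's.

DARK_BELOW = 35    # flip to "on the line" only below this reading
LIGHT_ABOVE = 55   # flip back to "off the line" only above this one

def classify(readings):
    on_line = False
    states = []
    for r in readings:
        if on_line and r > LIGHT_ABOVE:
            on_line = False
        elif not on_line and r < DARK_BELOW:
            on_line = True
        states.append(on_line)
    return states

# Mid-range noise (40, 45, 50) doesn't flip the state back and forth.
states = classify([60, 40, 30, 45, 50, 58, 40])
```

It wouldn't remove the need to re-tune for ambient light, but it would at least make each tuning more forgiving.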
Here's the source code; this time I've added some annotations that might give a
feel for what it's all about. Note that this is the only view of the program
you're given, and it can't be zoomed out (nor are there scrollbars...).