Christmas Hack Day Zero: The Plan

I’m going to start working on my game idea tomorrow. I have a pretty good idea of the mechanics of a single level. I’ve picked a language (C, with prototyping in Ruby), a networking library (ZeroMQ), and a graphics library (SDL 2.0). I’ve created a skeleton project on GitHub.

My plan is to get at least an hour of work in every day, but hopefully a lot more. I do have social plans throughout the holidays — spending time with two different friends over this weekend, time with family on Christmas Eve and Christmas, a holiday party with friends later this week, and possibly some New Year’s stuff. I’ve also committed to getting out of the house every day and keeping up on my chores. These are the kinds of things you have to commit yourself to when you’re a chronic depressive.

Tomorrow I’ll be pretty busy cleaning up my house and spending time with a friend, but I’m hoping to at least start doing some concrete design. I hope to have basic graphics rendering by Monday, recognizable gameplay by the end of the day on Christmas, working netplay by the 28th, and a beatable single level by Monday the 30th. I’ll be making at least one status post every day.

I know nobody’s reading this but me, but I’m hoping to be able to look back on what I did and learn more about how I think and work.

Here goes!

CH ZeroMQ experiment part 1.5

Aside

More later, but I think this says something about how different my mindset is after working in C versus working in Ruby:

:!make
cc -Ibstring -g   -c -o duplex_msg.o duplex_msg.c
gcc duplex_msg.o bstring/bstrlib.c -lzmq -lczmq -o duplex_msg

:!./duplex_msg 5555 5556
hi
hi
HEARING >> duh: 238: hi? hi!
YO
HEARING >> duh: 424: hi!? YO? WOW!
FUCK
HEARING >> duh: 614: YO? WOW!? FUCK? YES!
WORKS
HEARING >> duh: 874: WOW!? FUCK? YES!? WORKS? HA HA HA!
YES YES YES
HEARING >> duh: 1226: FUCK? YES!? WORKS? HA HA HA!? YES YES YES? HAHAHAHAAHAAHAAA!

CH ZeroMQ experiments

I’m doing some experimental work to figure out whether the netplay design I have in mind would be feasible using ZeroMQ. I’m working in Ruby at the moment because it’s the most natural language for me to work in. The ZMQ guide has Ruby examples using ffi-rzmq, which maps pretty closely to the underlying API. Whatever design I prototype in Ruby should work equally well in another language.

I’m thinking about the networking requirements in tandem with my concurrency requirements, which is something ZMQ kind of encourages you to do. You’re encouraged to rely on ZMQ’s in-process communication instead of mutexes, condition variables, etc. to coordinate between your threads. This is worth trying out, although I don’t necessarily know how well it will play with whatever graphical toolkit I end up using.

So the way I’m envisioning this right now is that each player has an outgoing event queue and an incoming event queue. Messages can be received and delivered asynchronously. Successful delivery relies on TCP’s retry mechanisms; there are no acks in the protocol. Each of these queues lives in its own thread. An additional thread is responsible for running the simulation. It reacts to events fed into it from the incoming event queue and emits events for the outgoing event queue. It also listens for input from the UI thread (i.e., whatever the user is doing). Finally, it ticks the game logic around 60 times per second so that objects can move, AI can operate, etc.

What you end up having is the simulation thread using input from the current player, events from the other player, and the current state of the arena to 1) set up the next state of the arena so it can be displayed and 2) emit events for the other player’s arena.
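That loop can be sketched with Ruby’s stdlib Queue standing in for the ZMQ sockets. The message format and the `:quit` sentinel are my own placeholders, not part of the real design:

```ruby
# Stand-ins for the ZMQ sockets: one stdlib Queue per direction.
incoming = Queue.new   # events arriving from the other player
ui_input = Queue.new   # whatever the local player is doing
outgoing = Queue.new   # events to relay to the other player

tick = 0

# Simulation thread: merges remote events and local input and emits
# tagged events for the other side. (The real loop would also tick
# the game logic ~60 times/sec on a timer; that part is elided here.)
sim = Thread.new do
  loop do
    source, msg = incoming.empty? ? [:ui, ui_input.pop] : [:net, incoming.pop]
    break if msg == :quit
    tick += 1
    outgoing << "#{source}: #{tick}: #{msg}"
  end
end

# The local player does two things, then we shut the loop down.
ui_input << "pull lever"
ui_input << "open door"
ui_input << :quit
sim.join
```

Queue handles the cross-thread handoff safely on its own, which is exactly the property the ZMQ inproc sockets are supposed to provide in the full version.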

Setting up a minimal version of this design in Ruby was easy. In essence, the app works like a chat system — you type in text, hit enter, and the message is relayed to the local simulation thread. The simulation thread adds identifying information and its current tick to the message (so you can see it’s working) and passes it on to the output queue. The other instance of the app receives the message in its input queue, passes it into the simulation thread, and displays it.

Here’s one side of the exchange:

% ruby duplex_msg.rb 5555 5556 right
are you there
HEARING >> left: 132: are you there? I AM HERE!
holy shit dude
HEARING >> left: 705: holy shit dude? NO KIDDING RIGHT!
cannot believe this works
HEARING >> left: 1090: cannot believe this works? ME NEITHER!

And the other side:

% ruby duplex_msg.rb 5556 5555 left
HEARING >> right: 100: nothing? are you there!
I AM HERE
HEARING >> right: 586: I AM HERE? holy shit dude!
NO KIDDING RIGHT
HEARING >> right: 1003: NO KIDDING RIGHT? cannot believe this works!
ME NEITHER

Note the increasing tick number, which goes up in real time. This stands in for the local game logic. The last message sent is incorporated into the reply in order to simulate a change to the game state from the incoming event.

The implementation is pretty simple — 80 lines of Ruby. I’ve created a Gist for it. I’m pretty confident at this point that ZMQ will be useful for my purposes. This design seems to work acceptably for much higher volumes than I expect to require, too: piping ‘yes’ to both ends churns out a great deal of output and eats up a lot of CPU but proceeds diligently without any weird behavior.

CH events & networking

An event can be identified as a 32-bit integer. The most significant 16 bits represent an event group, and the least significant 16 bits identify the particular event. For the purposes of the initial build, though, I only expect to use group 0, which will be used for named built-in events. Within that group, I’m thinking of splitting the 16-bit word into two bytes, the first being a type (like KEY and SWITCH) and the second allowing up to 255 of that type of event to be predefined.

Obviously, I’m unlikely to make use of even 255 distinct events of any one type, but there’s no reason not to leave some room to grow. Defining predefined ranges of events should make it easier to think about puzzles in a symbolic manner — if you don’t use more than one type of key or switch etc. on a stage, you can just think of “key” as opposed to “key 1” or “event 7”. It’ll also make it easier to write level creation tools.
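The packing scheme above can be sketched in a few lines of Ruby. The type codes here are hypothetical, just to make the example concrete:

```ruby
# Pack/unpack the proposed 32-bit event ID. Layout:
#   bits 31..16: event group (group 0 = named built-in events)
#   bits 15..8:  event type within group 0 (e.g. KEY, SWITCH)
#   bits  7..0:  instance number, up to 255 predefined per type

TYPE_KEY    = 0x01   # hypothetical type codes, just for this sketch
TYPE_SWITCH = 0x02

def pack_event(group, type, instance)
  (group << 16) | (type << 8) | instance
end

def unpack_event(id)
  [(id >> 16) & 0xFFFF, (id >> 8) & 0xFF, id & 0xFF]
end

# "key b" might be group 0, type KEY, instance 2:
key_b = pack_event(0, TYPE_KEY, 2)
```

The nice thing about this layout is that a level tool can mask off the group and type bits to group related events without any lookup table.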

Events don’t necessarily have to be defined this way in the game’s internals, but we’ll need some way to represent them over the wire (and probably in saved level files). An alternative would be to actually represent them as string symbols: “key”, “key b”, “super special key”, etc. This would possibly make the network representation of events larger, but that might not be an issue.

As far as networking goes: it’s important that events be delivered successfully. It’s possibly important that they be delivered in order. On the other hand, we don’t expect to be particularly chatty, and we’re not trying to track realtime movement from the other side. As I understand it, this fits TCP better than UDP. My (poorly informed) understanding is that if you can transmit your data in discrete chunks of less than around 1500 bytes, and if you set NODELAY on your socket, you get relatively low latency from TCP. Everything I just wrote might be half-misremembered nonsense, though.
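If I do go the TCP route, setting NODELAY from Ruby looks like this. This is a loopback sketch only; the real host and port would come out of whatever handshake I end up with:

```ruby
require "socket"

# Sketch: disable Nagle's algorithm on a TCP socket so small event
# messages go out immediately instead of being batched. The loopback
# server here exists only to give us a connected socket to configure.
server = TCPServer.new("127.0.0.1", 0)          # port 0 = pick any free port
client = TCPSocket.new("127.0.0.1", server.addr[1])
client.setsockopt(Socket::IPPROTO_TCP, Socket::TCP_NODELAY, 1)

nodelay = client.getsockopt(Socket::IPPROTO_TCP, Socket::TCP_NODELAY)
client.close
server.close
```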

I’m interested in researching some open source netplay code. This might be my next step.

CH state transitions

Fixtures

Single-Press Button: <event n>

  • State “Open”
  • State “Closed”
  • Transition “Open->Closed”: when activated; emit <event n>

Two-state Lever: <event n>, <event m>

  • State “Up”
  • State “Down”
  • Transition “Up->Down”: when activated; emit <event n>
  • Transition “Down->Up”: when activated; emit <event m>

Weight-sensitive Plate: <event n>, <event m>

  • State “Up”
  • State “Down”
  • Transition “Up->Down”: when object enters; emit <event n>
  • Transition “Down->Up”: when object exits; emit <event m>

Double-Action Door: <event n>, <event m>

  • Quality “blocks movement”
  • State “Closed”
  • State “Opening”
  • State “Open”
  • Transition “Closed->Opening”: when receives <event n>
  • Transition “Opening->Open”: when receives <event n>; remove “blocks movement”
  • Transition “Opening->Closed”: when receives <event m>

Unlockable Door: <event n>

  • Quality “blocks movement”
  • State “Closed”
  • State “Open”
  • Transition “Closed->Open”: when receives <event n>; remove “blocks movement”

Props

Key: <event n>

  • Quality: emits <event n> on collision, when held

State And Transition Attributes

By default: every state has an image, every transition has an animation

Transition Triggers

  • object enters: something entered the fixture’s bounding box (and nothing was there before)
  • object leaves: something exited the fixture’s bounding box (and nothing else is there)
  • activated: the player activates (clicks on) the fixture
  • receives <event n>: event n was emitted globally, or it was emitted locally and reached this point

Transition Actions

  • emits <event n> (globally)
  • emits <event n> (locally, within m blocks)
  • add “quality”
  • remove “quality”

Qualities

  • blocks movement
  • emits <event n> on collision (when held)
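A minimal sketch of how these tables could drive a fixture at runtime. The class shape and the symbolic event names are my own placeholders, not a settled design; only the two-state lever is wired up:

```ruby
# Fixture: a state machine driven by transition tables like the ones
# above. transitions maps [state, trigger] => [new_state, action-or-nil].
class Fixture
  attr_reader :state, :qualities

  def initialize(initial, transitions, qualities = [])
    @state = initial
    @transitions = transitions
    @qualities = qualities
  end

  # Fire a trigger (:activated, :object_enters, etc.).
  # Returns the emitted event, or nil if nothing is emitted.
  def trigger(event)
    new_state, action = @transitions[[@state, event]]
    return nil unless new_state
    @state = new_state
    return nil unless action
    kind, arg = action
    case kind
    when :emit   then arg
    when :add    then @qualities << arg; nil
    when :remove then @qualities.delete(arg); nil
    end
  end
end

# The two-state lever from the table: activation toggles it and emits
# event n or event m (symbols stand in for real event IDs).
lever = Fixture.new(:up, {
  [:up,   :activated] => [:down, [:emit, :event_n]],
  [:down, :activated] => [:up,   [:emit, :event_m]],
})
```

A door would be the same shape, with `[:add, "blocks movement"]` / `[:remove, "blocks movement"]` actions instead of emits.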

CH control scheme

(It’s amazing how much thinking you can get done while waiting for duck stock to finish cooking)

One idea for keyboard and mouse:

  • WASD for movement
  • Left-click on an adjacent prop to pick it up; left-click on an empty space while holding a prop to drop or throw it; left-click on fixtures to interact with them
  • Right-click to glass an object (show it to your coplayer)

For mouse alone:

  • Right-click on an empty space to pathfind to it; (right-click on a fixture to glass it?)
  • Left-clicking as above; (left-click on a fixture out of range to glass it?)

For touch interfaces:

  • Tap on an empty space to pathfind to it
  • Tap on a fixture to interact with it
  • Drag an adjacent prop toward you to pick it up
  • Drag from yourself while holding a prop to drop or throw it
  • Tap and hold on a fixture to glass it (or maybe circle it?)

EDIT: Some of the puzzles I’m thinking of involve activating the item you’re holding (like to turn on a jetpack to get a period of flight). For keyboard input, that’s pretty easy — press ‘E’ or something like that. For other input mechanisms, it might be easiest to just have an icon/button that appears somewhere on the screen to activate whatever item you’re holding.

CH net interactions

Phase 1: Multiplayer discovery/handshake: two instances become aware of each other, one offers to host the game and becomes the host, the other accepts and becomes the guest.

Phase 2: Host picks a stage and transmits the arena data to the guest. This might just be “stage 123, player 2” or it might be a representation of the arena to allow sharing custom stages on the fly.

Phase 3: Host and guest eventually notify each other they’re ready to go. Some kind of tick synchronization might be necessary to make the timing work. This is the kind of stuff I’m doing this to learn!

Phase 4: Game behavior triggers various types of events:

  • “prop available at n”: a prop was placed on gateway n on the other arena and appears on gateway n on this arena. Details about the prop are transmitted, also.
  • “prop removed from n”: the prop was picked up from gateway n on the other arena and is no longer available on gateway n on this arena.
  • “event n triggered”: event n was triggered globally on the other arena and should be delivered to anything that consumes n on this arena.
  • “object glassed”: the other player flagged an entity for this player to see. Identifying information sufficient to display a picture of the object is transmitted as well.
  • “goal n reached”: the other player’s character has activated goal n. If the host knows that both its player and the guest have reached goal n, the stage is completed.

Phase 4 continues until the stage is completed (or either player has quit, going to Phase 5). At that point, flow returns to Phase 2.

Phase 5: One of the players quits and broadcasts a goodbye, or they don’t respond for a given timeout.
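For illustration, the phase-4 messages could be framed as a type tag plus a payload, serialized with stdlib JSON. The field names here are mine, not a settled wire format:

```ruby
require "json"

# One possible wire framing for the phase-4 messages: a "type" tag
# plus arbitrary payload fields, serialized as JSON.
def encode_event(type, payload = {})
  JSON.generate({ "type" => type.to_s }.merge(payload))
end

def decode_event(wire)
  msg = JSON.parse(wire)
  [msg.delete("type").to_sym, msg]
end

wire = encode_event(:prop_available, "gateway" => 3, "prop" => "key b")
```

JSON is probably oversized for the final protocol, but for figuring out the handshake and event flow it keeps everything inspectable from a terminal.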