Block Game using Pebble Accelerometer

Edit: A Gist of the file is now available, but it is not polished.

In snatches of downtime over the last few weeks, I created a stream of Pebble accelerometer data and integrated it into a new version of my Android game engine, which I plan to use over the summer for a proper implementation of a few game ideas I've toyed with over the last year or so.

After further small bits of work, I created a class called PebbleGestureModel, which receives new X, Y and Z data and performs threshold and duration checks (to prevent continuous firing) before executing abstract actions that are implemented upon instantiation. Below is an example with no actions assigned, using an acceleration threshold of 800 (1 g is approximately 1000), a minimum duration of 1000 ms between firing actions, and operating in tilt mode:

PebbleGestureModel model = new PebbleGestureModel(800, 1000L, PebbleGestureModel.MODE_TILT) {

	@Override
	public void onWristUp() {

	}

	@Override
	public void onWristRight() {

	}

	@Override
	public void onWristLeft() {

	}

	@Override
	public void onWristDown() {

	}

	@Override
	public void onActionEnd() {

	}
};

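To make the threshold-and-duration idea more concrete, here is a minimal standalone sketch in plain Java. This is not the actual PebbleGestureModel source (the class name, the axis-to-direction mapping and the sign conventions are all illustrative assumptions); it just shows how a threshold of 800 and a minimum interval between firings can gate the abstract actions:

```java
// Sketch only: illustrates threshold + duration gating for tilt gestures.
// Not the real PebbleGestureModel; names and axis mapping are assumptions.
abstract class TiltGestureSketch {
    private final int threshold;      // e.g. 800 (1 g is ~1000)
    private final long minIntervalMs; // minimum time between firings
    // Start far in the past so the first qualifying sample can fire.
    private long lastFiredMs = Long.MIN_VALUE / 2;

    TiltGestureSketch(int threshold, long minIntervalMs) {
        this.threshold = threshold;
        this.minIntervalMs = minIntervalMs;
    }

    // Feed one accelerometer sample; fires at most one action per call,
    // and never more often than once per minIntervalMs.
    public void update(int x, int y, long nowMs) {
        if (nowMs - lastFiredMs < minIntervalMs) return; // debounce
        if (x > threshold)       { onWristRight(); lastFiredMs = nowMs; }
        else if (x < -threshold) { onWristLeft();  lastFiredMs = nowMs; }
        else if (y > threshold)  { onWristUp();    lastFiredMs = nowMs; }
        else if (y < -threshold) { onWristDown();  lastFiredMs = nowMs; }
    }

    public abstract void onWristUp();
    public abstract void onWristDown();
    public abstract void onWristLeft();
    public abstract void onWristRight();
}
```

The duration check is what stops a single sustained tilt from firing an action on every sample in the stream.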
The result of this is a simple ‘game’ consisting of a randomly generated ‘world’ of 10 x 10 blocks, with two blocks nominated as the Finish and Player respectively. Touching the ‘world’ generates a new random one. At the moment the Player and Finish are randomly placed on valid non-solid tiles, but are not path-checked. If no path connects them, I just touch for a new one until a valid one is found.
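If I were to add that path check, a standard approach would be a breadth-first search over the non-solid tiles. This is a hypothetical sketch, not part of the engine; the boolean-grid representation of the world is an assumption for illustration:

```java
import java.util.ArrayDeque;

// Sketch: BFS reachability check between Player and Finish on a block grid.
// true = walkable (non-solid) tile; the grid format is illustrative only.
final class PathCheck {
    static boolean reachable(boolean[][] walkable, int sr, int sc, int fr, int fc) {
        int rows = walkable.length, cols = walkable[0].length;
        boolean[][] seen = new boolean[rows][cols];
        ArrayDeque<int[]> queue = new ArrayDeque<>();
        queue.add(new int[]{sr, sc});
        seen[sr][sc] = true;
        int[][] dirs = {{1, 0}, {-1, 0}, {0, 1}, {0, -1}};
        while (!queue.isEmpty()) {
            int[] cur = queue.poll();
            if (cur[0] == fr && cur[1] == fc) return true;
            for (int[] d : dirs) {
                int r = cur[0] + d[0], c = cur[1] + d[1];
                if (r >= 0 && r < rows && c >= 0 && c < cols
                        && walkable[r][c] && !seen[r][c]) {
                    seen[r][c] = true;
                    queue.add(new int[]{r, c});
                }
            }
        }
        return false;
    }
}
```

On a 10 x 10 world this check is cheap enough to run inside the world-generation loop, regenerating until a connected layout comes out.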


The Player block is controlled by accelerometer data from the Pebble, and can operate in two modes: MODE_FLICK and MODE_TILT. In MODE_FLICK a flick of the extended watchface-up wrist in each direction will trigger an abstract method to allow an action to be taken. Similarly in MODE_TILT the actions are triggered when the wrist is tilted left or right, or the arm is pointed up or down. The START button is used to start the data stream and the INSTALL button is used to install the streaming watchapp. The four black squares show the current actuating direction induced by the watch, and the first sample of the last received AppMessage (currently 5 samples per message) is shown at the bottom.

Here is a video of the ‘game’ in action, showing the accelerometer control:

I’m not releasing the source code for this yet, as it’s untidy due to its ad-hoc development and it doesn’t do much game-wise, but I may tidy it up and release it soon.

  1. Oragsa said:

    Hi Chris,
    I’m interested in your source code for detecting the gestures you are describing.
    Is it possible to upload it to your Github account?
    Btw, nice effort on your website in general. Your tutorials in particular were very helpful!

    • bonsitm said:

      Hi, Oragsa. I have updated the post with a link to a Gist containing the class source. I don’t have time for a full example though, so apologies for that.

      • Oragsa said:

TY for the fast response!
        Though I also need the implementation class for PebbleAccelPacket to make it work…
        (I checked your previous article on accelerometer stuff, but it didn’t have it.)
        Thanks in advance!

      • bonsitm said:

        Apologies again, I forgot that in my haste. It has now been added to the aforementioned Gist. You simply construct a PebbleAccelPacket from the tuples received in PebbleKit Android and pass to the update() function in the PebbleAccelModel to power the gestures. The abstract methods will be called when the data you supply matches predefined conditions.

  2. Oragsa said:

    @bonsitm / Chris
    Thanks, it’s working! Time to experiment and finetune 🙂
