Wednesday, August 31, 2011

Backbone.js: Telling the View to Delete the Model, Which Tells the View to Delete Itself

‹prev | My Chain | next›

I think that I have more or less figured out how to delete things in Backbone.js. I also have a halfway decent, event driven means of removing deleted things from the UI. But, to date, I am doing all of that deleting in the Javascript console. Tonight, I would like to add a UI element to do the deleting.

Adding the UI element is easy enough. I add an "X" inside a <span> with a class of "delete" to my calendar event template:
<script type="text/template" id="calendar-event-template">
  <span class="event" title="<%= description %>">
      <%= title %>
      <span class="delete">X</span>
  </span>
</script>
On the page, the "X" displays like:
To hook that "X" up to a function call, I need to add a click event to the View. For now, I stick with tracer bullets:
window.EventView = Backbone.View.extend({
      // ...
      events: {
        'click .delete': 'test'
      },
      test: function() {console.log("delete")},
    });
So, when a click event is received in this view for an element with the delete class, the "test" function is called. Sure enough, clicking on the X inside the delete <span> logs "delete" messages to the console:
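Under the hood, that events hash is roughly shorthand for jQuery event delegation scoped to the view's element. A minimal sketch of the equivalent manual wiring, assuming view holds an EventView instance that is already bound to an element (this is not Backbone's actual implementation, which also namespaces the events):
// Roughly what Backbone's delegateEvents() does with 'click .delete': 'test'.
// Sketch only; assumes `view` is an EventView instance bound to an element.
$(view.el).delegate('.delete', 'click', _.bind(view.test, view));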
Cool beans, but that is not the final target I hope to hit. So, I replace the test function with a handler for the "click .delete" event:
window.EventView = Backbone.View.extend({
      // ...
      events: {
        'click .delete': 'deleteClick'
      },
      deleteClick: function() {
        this.model.destroy();
      },
      remove: function() {
        $(this.el).find('.event').remove();
      }
    });
It feels a little awkward having a remove() method and a deleteClick(). The former removes the UI element from the page. The latter handles clicks that should signal the model to delete itself, which will, in turn, tell the view to remove itself from the page. I will worry about the odd feeling another day. For now, I am not quite done with my delete.

I am telling a CouchDB store to delete a record. Since CouchDB uses optimistic locking, I need to supply the revision ID when deleting the record. The revision is already stored in the model, so I have been deleting like this:
e.destroy({headers: {'If-Match': e.get("_rev")} })
It seems really wrong to me that the View should be responsible for knowing about this. But how to get the model to do this? I could create a new destroyWithRevision method on the model, but the view would still need to know to call this instead of the conventional destroy() method.
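For comparison, that rejected destroyWithRevision idea would look something like the following. This is a hypothetical sketch, never added to the code; it works, but every caller has to remember to use the special method instead of destroy():
    // Hypothetical alternative (not used): the revision handling moves into
    // the model, but the view still has to know to call destroyWithRevision().
    window.Event = Backbone.Model.extend({
      // ...
      destroyWithRevision: function() {
        return this.destroy({
          headers: {'If-Match': this.get("_rev")}
        });
      }
    });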

Luckily, Backbone.js does support overriding methods and calling the superclass's original method:

    window.Event = Backbone.Model.extend({
      // ...
      destroy: function() {
        Backbone.Model.prototype.destroy.call(this, {
          headers: {'If-Match': this.get("_rev")}
        });
      }
    });
That is slick. I call the destroy function that resides on the Backbone.Model prototype. Since I am invoking that function directly, I need to supply an object instance so that the method has a this (or self if you're a Rubyist) to which it can refer. That is the first argument to Javascript's call() method. Then I can supply the arguments that inform CouchDB of the revision being deleted.
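Outside of Backbone, a tiny, self-contained example of call() may make that clearer (the names here are made up purely for illustration):
// Plain Javascript illustration of Function.prototype.call: the first
// argument becomes `this` inside the function, the rest are passed through.
var greeter = {
  greeting: 'Hello',
  greet: function(name) { return this.greeting + ', ' + name; }
};
var other = { greeting: 'Howdy' };

greeter.greet('Backbone');             // => "Hello, Backbone"
greeter.greet.call(other, 'Backbone'); // => "Howdy, Backbone"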

Slick indeed. Now, when the view tells the model to delete itself, it can call the very conventional destroy() method—completely unaware of this complexity. As an added bonus, I am not losing any of the benefits of optimistic locking—if the loaded model was superseded before the user clicked the "X", the delete would fail. And yes, when I click the little "X", the calendar event goes away from the UI:


That is a good stopping point for tonight. Up tomorrow, I think that my partner in crime on the Recipes with Backbone book has given me some food for thought on how to improve my view.


Day #130

Tuesday, August 30, 2011

Backbone Removing from the UI Too!

‹prev | My Chain | next›

Last night, I figured out how to get Backbone.js to delete calendar events from my backend CouchDB store. In the end it required very little setup and code, which gives it a good feel—at least to this beginner. One deficiency of my approach, however, was that the calendar events were not immediately removed from the page. Tonight I hope to rectify this.

Even after I told Backbone to remove the "foo" event from the calendar and verified that it was, in fact, removed from the CouchDB store, it still remained on the web page for Sep 1:
After manually reloading the page, the bogus calendar event is no more, but there has to be a better way. Right?

After digging through the documentation a bit, I think that the solution is binding Backbone events. Specifically, binding a calendar event's "destroy" event to the view's remove() sounds about right. This is most certainly not the same thing as telling the model about the view and how to destroy it. That would constitute an insane degree of coupling between the view and model.

Rather, the model is always emitting events as it changes. Why not teach the view to listen for one such event (destroy, for instance) and to act accordingly? It might seem that the view now has too much knowledge of the model and that coupling is taking place in spite of my desires. But this is not the case. Something needs to bind the event emitter to the listener, but neither the listener nor the emitter cares to whom the event is being sent or from where the event came. I could just as easily trigger a manual destroy event and see the same view behavior.

Anyhow, to actually bind the two, this ought to work:
    window.EventView = Backbone.View.extend({
// ...
      initialize: function(options) {
        options.model.bind('destroy', this.remove, this);
      },
// ...
    });
When a view object is created, one of the attributes sent is the model being represented by the view:
view = new EventView({model: calendar_event, el: el});
So the bind() call adds a listener to the model's event emitting function—but only for 'destroy'. Models can be instructed to emit arbitrary event strings, but Backbone includes some built-in events like 'change' and 'destroy'. It is the latter in which I am interested today.
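A quick way to convince myself of that decoupling is to fire events by hand in the console. A sketch, assuming calendar_event is one of my Event models:
// Arbitrary event strings work just like the built-in ones.
calendar_event.bind('party', function() { console.log('woo!'); });
calendar_event.trigger('party');   // logs "woo!"

// Manually triggering 'destroy' should make the bound view remove itself,
// even though nothing was actually deleted from CouchDB.
calendar_event.trigger('destroy');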

On reception of such an event, my View object should call its own remove() method (built into Backbone.js). Theoretically, this should remove the calendar event from the calendar.

Theory is great, but in practice, this winds up removing the entire Sep 1 day from the calendar:
Yikes!

This occurs because the element to which I tie my views is the calendar day:

          var el = $('#' + event.get("startDate")),
              view = new EventView({model: event, el: el});
In my view class, I then append calendar events to this element via a specialized render() method:
    window.EventView = Backbone.View.extend({
//...
      render: function() {
        $(this.el).append(this.template(this.model.toJSON()));
        return this;
      }
    });
Since I am already off the rails for my render() method, I need to do the same for my remove() method. Fortunately, Backbone does not make it too hard to concoct such a thing:
    window.EventView = Backbone.View.extend({
//...
      remove: function() {
        $(this.el).find('.event').remove();
      }
    });
Now, when I delete the offending calendar event (still in the Javascript console), the event is removed both from the backend and from the calendar UI:
Best of all: no lost day.

I worry that the need for customized render() and remove() methods is Backbone's vinegar, telling me that I should refactor things a tad. I will circle back on that question in a day or two. But up tomorrow, I will try adding a UI element to delete calendar events so that I do not have to drop down to the Javascript console every time I spy an appointment I'd rather not keep.


Day #129

Monday, August 29, 2011

Deleting Things in Backbone.js

‹prev | My Chain | next›

After getting a pretty decent Backbone.js view implementation in place yesterday, today I would like to see if I can add a bit of interactivity to the beasty. The easiest thing seems to be deleting. "Easy" usually turns out to be a red flag, but who knows? Maybe this time it'll just work.

Anyhow, the first thing I need is a delete route in my express.js app. Nothing too fancy ought to be required. I just need to make an HTTP request with a method of "DELETE" to my CouchDB backend. The response from CouchDB can then be piped directly to my Backbone app. Something like this ought to do:
app.delete('/events/:id', function(req, res){
  var options = {
    method: 'DELETE',
    host: 'localhost',
    port: 5984,
    path: '/calendar/' + req.params.id
  };

  // Send the HTTP request with the DELETE options
  var couch_req = http.request(options, function(couch_response) {
    console.log("Got response: %s %s:%d%s", couch_response.statusCode, options.host, options.port, options.path);

    // Pipe the response from CouchDB to the browser
    couch_response.pipe(res);
  }).on('error', function(e) {
    console.log("Got error: " + e.message);
  });

  // Send the complete request.
  couch_req.end();
});
Now to my Backbone application. As usual when I am exploring, I use Chrome's Javascript console for interacting with page elements and Javascript objects. In this case, I would like to delete the Backbone model responsible for the "foo" calendar event on the first of next month:
In the console, I find that entry is the second of four calendar events (clearly, I need to investigate sorting another day):
Assuming that is an Event model, all I need do is call its destroy method and the offending event should be stricken from existence:
> e.destroy()
     => child
Hrm... Dunno what I expected. I suppose it has to be chainable, so maybe it worked..? Actually, no, it did not. Examining the express.js app's log, I see no log entries. Reloading the page, I see that the calendar event is still there. So what gives?

One of the first places I check is the Event model itself:
window.Event = Backbone.Model.extend({});
Say, that looks a bit spartan. Perhaps more is needed for Backbone to know how to delete a thing from the database.

After a bit of research, I find that yes, two things are needed for this to work: a URL root (e.g. /events) and a record ID. In retrospect, both make all kinds of sense. How else is Backbone supposed to infer the resource to be DELETEd?

Anyhow, the fix should be pretty easy. My new and improved Event model looks like:
    window.Event = Backbone.Model.extend({
      urlRoot : '/events',
      initialize: function(attributes) { this.id = attributes['_id']; }
    });
The urlRoot property is fairly self-explanatory. The initialize method for setting the model's id is less so. CouchDB stores document IDs in the "_id" attribute:
{
   "_id": "a38f51509190f265959bbb2b5d001128",
   "_rev": "1-174f31204df52e79a92c1ac875ac09a2",
   "startDate": "2011-09-01",
   "title": "foo",
   "description": "bar"
}
This is available in the model's attributes (I could call event.get("_id") to retrieve it), but Backbone has no way to tie it to the special id property of a Backbone model. So I link the two manually in the model's initializer.
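A quick console sanity check of that linkage might look like the following (assuming e holds the "foo" event model; the output is what I expect to see, not a transcript):
// The CouchDB "_id" and the Backbone id should now agree, so Backbone can
// build the DELETE URL from the urlRoot and the id.
e.get("_id")  // => "a38f51509190f265959bbb2b5d001128"
e.id          // => "a38f51509190f265959bbb2b5d001128"
e.url()       // => "/events/a38f51509190f265959bbb2b5d001128"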

Now I should be able to delete the bogus calendar event. Reloading the page and trying again I get:
Well, that is progress. I am seeing an AJAX request logged, but is it doing anything? Checking the express.js logs, I see that it is trying to do something, but failing:
Got response: 409 localhost:5984/calendar/a38f51509190f265959bbb2b5d001128
That is certainly progress, but 409?!

Ah, wait. This is CouchDB. I need to work a little harder to delete things. Specifically, I have to assure the database that I am acting on the same revision that is currently in the database. To work properly with this optimistic locking, I need to supply the revision as a query parameter:
DELETE /calendar/a38f51509190f265959bbb2b5d001128?rev=1-174f31204df52e79a92c1ac875ac09a2 HTTP/1.0
Or as an If-Match header:
DELETE /calendar/a38f51509190f265959bbb2b5d001128 HTTP/1.0
If-Match: "1-174f31204df52e79a92c1ac875ac09a2"
Hrm... I tend to think it would be easier to transmit the revision via the If-Match header. If I could set that in my Backbone application, then I ought to be able to pass that directly through to CouchDB:
app.delete('/events/:id', function(req, res){
  var options = {
    method: 'DELETE',
    host: 'localhost',
    port: 5984,
    path: '/calendar/' + req.params.id,
    headers: req.headers
  };

  var couch_req = http.request(options, function(couch_response) {
    // ...
  });

  couch_req.end();
});
I rather like that one-line change. Not having to futz with query parameters just feels cleaner.

But is it even possible to set HTTP headers in Backbone? Actually, it is relatively easy. Anything supplied in the destroy (or update or create) method is sent along to jQuery as an option. Since jQuery AJAX requests recognize a headers attribute, something like this ought to work:
> e.destroy({headers: {'If-Match':'1-174f31204df52e79a92c1ac875ac09a2'} })
And, finally, I see a change in the CouchDB response:
Got response: 200 localhost:5984/calendar/a38f51509190f265959bbb2b5d001128?rev=1-174f31204df52e79a92c1ac875ac09a2
Most importantly, I no longer have a bogus entry on the first:
That is good progress for tonight. I still have work to do with deleting. It would be nice to do this from the UI rather than the Javascript console. Also, I should not have to refresh my display to see the calendar event removed. But I will worry about those things tomorrow.


Day #128

Sunday, August 28, 2011

Getting Started with Backbone.js View Templates

‹prev | My Chain | next›

I take a little time today to pretty up the Funky Calendar application:
OK. It is not that pretty, but I do take some time to ensure that the UI elements are consistent throughout the DOM tree:
I do this not because I expect to do anything real with Funky Calendar. Rather, I would like to begin playing with some of the Backbone.js view stuff tonight. The hope is that a more consistent DOM will better lend itself to Backbone views.

From yesterday, I have Backbone view code that adds calendar events via a not-too-compact render() method:
    window.AppView = Backbone.View.extend({
      initialize: function() {
        Events.bind('all', this.render, this);
        Events.fetch();
      },
      render: function() {
        Events.each(function(event) {
          var start_date = event.get("startDate"),
              title = event.get("title"),
              description = event.get("description"),
              el = '#' + start_date;

          $('<span class="event" title="' + description + '">' +
              title +
            '</span>').appendTo($(el))
        });
      }
    });

    window.App = new AppView;
One thing that I can do to realize some cleaner code is to separate that view "template" out into a real template:
|<script type="text/template" id="calendar-event-template">
|  <span class="event" title="<%= description %>">
|      <%= title %>
|  </span>
|</script>
That is not so much a Backbone.js thing as it is a simple Underscore.js template. It is a bit of a cheat. It is a script that will not be executed because it is type="text/template". Since it is not evaluated, the contents remain undisturbed as a string child element, which can subsequently be extracted via a jQuery html() call.

To use that template, I replace the existing code with:
          var event_html = _.template($('#calendar-event-template').html());
          $(event_html({title: title, description: description})).appendTo($(el))
The _.template() Underscore method takes as its argument the "html" extracted from the type="text/template" <script> tag. The result of the _.template() call is a function that will return a string with the things inside <%= ... %> replaced with the key-values passed into the function.

Since the template only requires that title and description values be defined (and since I conveniently have local variables named title and description), I can invoke the function as event_html({title: title, description: description}). To append that to the existing table cell (whose ID is stored in the local variable el), I convert the resulting HTML string into a jQuery object and append it to the cell element:
$(event_html({title: title, description: description})).appendTo($(el))
At this point, I have separated (at least some of) my view logic out into a template. This does not quite qualify as the "Backbone way".

What I think qualifies as more of the Backbone way is to create an individual view that corresponds to a single calendar event. The result is much simpler conceptually. All that I need to implement in the view class is a template attribute and a render method:
    window.EventView = Backbone.View.extend({
      template: _.template($('#calendar-event-template').html()),
      render: function() {
        $(this.el).html(this.template(this.model.toJSON()));
        return this;
      }
    });
The template attribute is the very same template function from earlier. The render method becomes just as simple—it extracts the attributes from the model (which include the necessary title and description attributes) and passes them to the template function. Nice!

To instantiate and render these view objects, I need to tell last night's collection to fetch itself. Once retrieved, the building and rendering can commence:
    Events.fetch({success: function(collection, response) {
        collection.each(function(event) {
          var view = (new EventView({model: event, id: event.get("startDate")}));
          view.render();
        });
      }
    });
(I am pretty sure that some of that needs to go into an "Application View" class, but I'll worry about that another day)
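If I were to sketch that hypothetical "Application View" now, it might be shaped something like this (an untested sketch only, not what the page currently runs):
    // Hypothetical sketch: an application-level view owning the
    // fetch-and-render loop from above.
    window.AppView = Backbone.View.extend({
      initialize: function() {
        Events.fetch({success: this.render});
      },
      render: function(collection) {
        collection.each(function(event) {
          var view = new EventView({model: event, id: event.get("startDate")});
          view.render();
        });
      }
    });

    window.App = new AppView;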

And when I load up my page, I get... nothing. Cleverly placed console.log() statements reveal that the elements to which I am binding my view objects are not the <td> elements in the calendar table. Rather, they are new <div> tags:
This situation is addressed in the Backbone.js documentation on view constructors. To ensure that my existing ID is used rather than creating a new one, I need to pass in the element, not the ID:
        collection.each(function(event) {
          var el = $('#' + event.get("startDate")),
              view = new EventView({model: event, el: el});
          view.render();
        });
I might argue that, if an element with that ID already exists, it should certainly be used rather than creating a new element with the same ID, but I am prolly not proficient enough with Backbone to be able to quibble yet.

Anyhow, I have reduced my code footprint significantly with the view code rework. Up tomorrow, I think, I will begin investigating CRUD with Backbone.

Day #127

Saturday, August 27, 2011

Converting to Backbone.js

‹prev | My Chain | next›

Last night I got my super slick, Funky Calendar populated with events from a CouchDB backend via AJAX. As I consider the harsh realities of what faces me when trying to create, update, delete those events in the UI, I think, maybe, I could use some help. So let's see if Backbone.js might do the trick.

First up, I install backbone.js locally and source it in my Jade layout:
!!!
html
  head
    title= title
    link(rel='stylesheet', href='/stylesheets/style.css')
    script(src='/javascripts/jquery.min.js')
    script(src='/javascripts/backbone.js')
  body!= body
Now, I need a Backbone model to encapsulate my events:
script
  $(function() {
    window.Event = Backbone.Model.extend({});
  });
I do not think that I need any default values, specialized construction methods, or any other helper methods. Before moving on, I do a quick sanity check to make sure that that very simple thing is working. In fact, it is not. In my Javascript console, I see:
Uncaught TypeError: Cannot call method 'extend' of undefined  backbone.js:150
Checking out line 150 of backbone.js, I find:
  // Attach all inheritable methods to the Model prototype.
  _.extend(Backbone.Model.prototype, Backbone.Events, {
    // ...
Say, what's that funny underscore at the beginning of that line? Ohhh....

What I meant to say earlier was that I install backbone.js and underscore.js locally. I then source both in my site layout file:
!!!
html
  head
    title= title
    link(rel='stylesheet', href='/stylesheets/style.css')
    script(src='/javascripts/jquery.min.js')
    script(src='/javascripts/underscore.js')
    script(src='/javascripts/backbone.js')

  body!= body
(and it seems that underscore.js needs to come before backbone.js)

Now when I load the page, I get no errors. But, of course, I hope to do a little more than no errors, so now it is time to create a collection of events:
    window.EventList = Backbone.Collection.extend({
      model: Event
    });
Now I need to populate the EventList collection. The Backbone.js documentation seems to suggest bootstrapping the data inside the template as a preferred solution. I am not really a fan of this as it seems an opportunity for blocking the request as I look up the data. Besides, this is not how I did it with the AJAX version of Funky Calendar—I sent the static page, then loaded the events via an AJAX call.

To do the same in Backbone, I think I need to specify the URL for the collection and then add a parse() method to massage the CouchDB _all_docs resource into something that can be consumed by my Event model's initializer. CouchDB's response, proxied thru my app's /events resource, looks like:
{"total_rows":2,"offset":0,"rows":[
  {"id":"fdbed27594feb433c74e82eb910015e0",
   "key":"fdbed27594feb433c74e82eb910015e0",
   "value":{"rev":"2-b7c22d428e648a6cdd2978c213f79ec0"},
   "doc":{"_id":"fdbed27594feb433c74e82eb910015e0",
          "_rev":"2-b7c22d428e648a6cdd2978c213f79ec0",
          "startDate":"2011-08-25",
          "title":"create blog post",
          "description":"talk about node and CouchDB"}},
  {"id":"fdbed27594feb433c74e82eb91001f45",
   "key":"fdbed27594feb433c74e82eb91001f45",
   "value":{"rev":"1-2b18432cf6e63b82c6507ff28af9724c"},
   "doc":{"_id":"fdbed27594feb433c74e82eb91001f45",
          "_rev":"1-2b18432cf6e63b82c6507ff28af9724c",
          "startDate":"2011-08-26",
          "title":"blog again",
          "description":"add backbone into the node + couch mix"}}
]}
The following url and parse attributes on my collection ought to translate into an array of attributes that Backbone will send directly into the Event model's initializer:
    window.EventList = Backbone.Collection.extend({
      model: Event,
      url: '/events',
      parse: function(response) {
        return _(response.rows).map(function(row) { return row.doc ;});
      }
    });
I am using underscore.js to massage the rows in the CouchDB response into something with a map() function. I then map the rows to return just the doc attribute. This ought to produce the following array:
[{"_id":"fdbed27594feb433c74e82eb910015e0",
  "_rev":"2-b7c22d428e648a6cdd2978c213f79ec0",
  "startDate":"2011-08-25",
  "title":"create blog post",
  "description":"talk about node and CouchDB"},
 {"_id":"fdbed27594feb433c74e82eb91001f45",
  "_rev":"1-2b18432cf6e63b82c6507ff28af9724c",
  "startDate":"2011-08-26",
  "title":"blog again",
  "description":"add backbone into the node + couch mix"}]
To test this out, I instantiate an Events collection object and tell it to fetch() results from the server:
    window.Events = new EventList;
    Events.fetch();
Since I don't have this hooked up to anything in my UI, I drop down to the console to try this out:
Nice! Just like that, I have direct access to objects with the appropriate calendar event attributes.
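The sort of poking around I do in the console looks roughly like this (assumed commands with the output I expect, not an exact transcript):
// Assumed console exploration -- not an exact transcript.
Events.length                   // => 2
Events.at(0).get("title")       // => "create blog post"
Events.at(1).get("startDate")   // => "2011-08-26"
Events.at(0).get("_rev")        // => "2-b7c22d428e648a6cdd2978c213f79ec0"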

Last up is to actually get the event data to display in my currently empty calendar:
For that, I will need a Backbone.js view object, which I cleverly name AppView:
    window.AppView = Backbone.View.extend({
    });
In there, I need to initialize the object to render whenever my Events collection loads all of its data. I also need to ensure that all of the data is fetched from the /events resource:
    window.AppView = Backbone.View.extend({
      initialize: function() {
        Events.bind('all', this.render, this);
        Events.fetch();
      }
    });
If that does what I expect it to, then all I need is a render method for this AppView class:
    window.AppView = Backbone.View.extend({
      initialize: function() {
        Events.bind('all', this.render, this);
        Events.fetch();
      },
      render: function() {
        Events.each(function(event) {
          var start_date = event.get("startDate"),
              title = event.get("title"),
              description = event.get("description"),
              el = '#' + start_date;

          $(el).html(
            '<span title="' + description + '">' +
              title +
            '</span>'
          );
        });
      }
    });
I am not getting into Backbone's templating in that render function. Rather, I am just using jQuery for now—just like I did last night in my pure AJAX solution. I get the start date of the event, knowing that my HTML calendar cells have ISO 8601 date IDs just like my start dates (e.g. '#2011-08-27'). If the event start date is on my calendar, then I replace the matching element's html with a <span> containing the event title.

Once I have this class in place, all that is left is to instantiate it:
    window.AppView = Backbone.View.extend({
      initialize: function() { /* ... */},
      render: function() { /* ... */}
    });

    window.App = new AppView;
And now, I have my calendar events populated on my calendar via Backbone.js:
In the end, that turns out to be a heck of a lot more code than my pure AJAX solution (34 LOC vs 16). The idea behind Backbone.js is not to simplify simple applications. Rather it provides structure when building complex client-side applications. So I will pick back up tomorrow adding some complexity like creating, removing, and moving calendar events.

Day #126

Friday, August 26, 2011

A Simple Node.js + CouchDB Calendar

‹prev | My Chain | next›

Last night, I got a nice, little node.js and CouchDB app thrown together. The node (really express.js) app serves up simple HTML and also passes through requests to the CouchDB database. Tonight I would like to serve up a static HTML calendar and populate events from the CouchDB store.

For as long as I have been building HTML calendars, I have always put the ISO 8601 date on the day cells. In Jade templating this looks like:
...
  tr#week4
    td.sunday
    td.monday 22
    td.tuesday
    td.wednesday
    td#2011-08-25.thursday
    td#2011-08-26.friday
    td.saturday
...
The resultant HTML is then:
And the resultant page looks like:
The benefits of using ISO 8601 are numerous, which is why it is the de facto standard for XML and JSON dates and times. Since CouchDB is returning JSON, I can be pretty sure that it will be returning ISO 8601 (especially since I created the data in the first place). By identifying the date cells by ISO 8601, it will make it easy to tie date records to date cells. Let's have a look at what I mean...

Accessing the /events resource in my app returns:
{"total_rows":2,"offset":0,"rows":[
  {"id":"fdbed27594feb433c74e82eb910015e0",
   "key":"fdbed27594feb433c74e82eb910015e0",
   "value":{"rev":"2-b7c22d428e648a6cdd2978c213f79ec0"},
   "doc":{"_id":"fdbed27594feb433c74e82eb910015e0",
          "_rev":"2-b7c22d428e648a6cdd2978c213f79ec0",
          "startDate":"2011-08-25",
          "title":"create blog post",
          "description":"talk about node and CouchDB"}},
  {"id":"fdbed27594feb433c74e82eb91001f45",
   "key":"fdbed27594feb433c74e82eb91001f45",
   "value":{"rev":"1-2b18432cf6e63b82c6507ff28af9724c"},
   "doc":{"_id":"fdbed27594feb433c74e82eb91001f45",
          "_rev":"1-2b18432cf6e63b82c6507ff28af9724c",
          "startDate":"2011-08-26",
          "title":"blog again",
          "description":"add backbone into the node + couch mix"}}
]}
(this is just a pass-thru to CouchDB's _all_docs?include_docs=true)

To get those events into the calendar, I perform a jQuery getJSON call inside a document-ready:
  $(function() {
    $.getJSON('/events', function(data) {
      $.each(data.rows, function(i, rec) { add_event(rec.doc) });
    });
  });
For each of the rows in the events returned from CouchDB, I extract the document (the event itself) and make a call to add_event().

The add_event() function then exploits the fact that the cells in my calendar are identified with an ISO 8601 date:
  function add_event(event) {
    var date = event.startDate,
        title = event.title,
        description = event.description;

    $('#' + date).html(
      '<span title="' + description + '">' +
        title +
      '</span>'
    );
  }
If the startDate from CouchDB is 2011-08-26, then this function finds the correct cell via a jQuery $('#2011-08-26') selector. If that selector is found, then the inner HTML is replaced with the event's title (and the description in a <span> title attribute). If the calendar event is for a date not currently displayed, no worries, the selector returns an empty wrapped set, in which case the html() has nothing to do.
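Should I later need to compute one of those cell IDs from a Javascript Date, rather than hard-coding them in Jade, a small helper along these lines would do it (a sketch only; nothing in the app uses it yet):
  // Sketch: format a Date as the YYYY-MM-DD string used for calendar cell IDs.
  function isoDateId(date) {
    function pad(n) { return n < 10 ? '0' + n : '' + n; }
    return date.getFullYear() + '-' +
           pad(date.getMonth() + 1) + '-' +
           pad(date.getDate());
  }

  isoDateId(new Date(2011, 7, 26)); // => "2011-08-26"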

The result is a rather snappy:
Nice. There is no ability to modify or remove elements just yet, but it was quite easy to get this calendar populated quickly. Up tomorrow, I think I shall begin exploring doing this again, but with Backbone.js.


Day #126

Thursday, August 25, 2011

Pass-Thru Node.js and CouchDB

‹prev | My Chain | next›

Up tonight, I try to get started with a simple node.js / CouchDB application. I am a big fan of CouchDB with node.js because CouchDB speaks HTTP natively. There is no need for middleware or data drivers with CouchDB—I can just make HTTP requests and process the response or send it along to the client.

I already have the latest node.js installed and a CouchDB server running.

My first step is to create a simple express.js application to play with:
➜  repos  express calendar   
   create : calendar
   create : calendar/package.json
   create : calendar/app.js
   create : calendar/public/stylesheets
   create : calendar/public/stylesheets/style.css
   create : calendar/public/javascripts
   create : calendar/public/images
   create : calendar/views
   create : calendar/views/layout.jade
   create : calendar/views/index.jade
In the new "calendar" app directory, I need to install the express and jade packages from npm:
➜  calendar git:(master) ✗ npm install express jade
jade@0.14.2 ./node_modules/jade 
express-unstable@2.4.3 ./node_modules/express
├── mime@1.2.2
├── connect@1.6.0
└── qs@0.3.1
My next step is to use the Futon admin interface to create a database. I fancy a calendar app, so I name my database accordingly:


Next, I create a calendar event:


And an event for tomorrow:


To allow my simple express.js app to pass those events back to the browser, I need to establish a route for all events. In that route, I access the special _all_docs resource in CouchDB to pull back all records in the calendar DB (I prolly would not do that in a larger DB). Once the CouchDB response comes back, I write the data back to the browser:
app.get('/events', function(req, res){
  var options = {
    host: 'localhost',
    port: 5984,
    path: '/calendar/_all_docs'
  };

  http.get(options, function(couch_response) {
    console.log("Got response: %s %s:%d%s", couch_response.statusCode, options.host, options.port, options.path);

    res.contentType('json');

    // Send all couch data to the client
    couch_response.on('data', function (chunk) {
      res.write(chunk);
    });

    // When couch is done, so is this request
    couch_response.on('end', function (chunk) {
      res.end();
    });
  }).on('error', function(e) {
    console.log("Got error: " + e.message);
  });
});
That's a bit of work establishing 'data' and 'end' listeners. Fortunately, node has an answer for this case in the pipe method for all stream objects:
  http.get(options, function(couch_response) {
    console.log("Got response: %s %s:%d%s", couch_response.statusCode, options.host, options.port, options.path);

    couch_response.pipe(res)
  })
Any events emitted by couch_response will be sent to the original Response object. The result of accessing the /events resource is:


As can be seen in the screen shot, the actual event data is not being returned—just IDs and other metadata. To get the full event, I can create another express.js route with a similar callback:
app.get('/events/:id', function(req, res){
  var options = {
    host: 'localhost',
    port: 5984,
    path: '/calendar/' + req.params.id
  };

  http.get(options, function(couch_response) {
    console.log("Got response: %s %s:%d%s", couch_response.statusCode, options.host, options.port, options.path);

    couch_response.pipe(res);
  }).on('error', function(e) {
    console.log("Got error: " + e.message);
  });
});
This route makes use of some nifty parameter assignments in express.js. Named parameters in the route (the :id in /events/:id) are made available in the request params object (e.g. req.params.id). That bit of coolness aside, this is nearly identical to the all-events route from above.


That is all well and good, but I do not think that I want my client making dozens of calls to this resource for each event on a particular calendar. Instead, I go back into my /events route and add include_docs=true to the URL. This includes the documents (which are relatively small) along with the metadata:
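In the /events route itself, that change is just a matter of tacking the query parameter onto the _all_docs path; the rest of the route stays the same:
  // The /events route options, now asking CouchDB for the full documents.
  var options = {
    host: 'localhost',
    port: 5984,
    path: '/calendar/_all_docs?include_docs=true'
  };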


Cool beans. That is a good stopping point for tonight. Tomorrow I will hook those into jQuery Ajax calls to build a month-view calendar. And then the fun begins with a backbone.js equivalent.

Day #124

Backbone.js Chain

‹prev | My Chain | next›

I loved doing my SPDY chain / book. I loved it so much that I am doing it again!

This time, I am learning Backbone.js, the awesome little client-side Javascript framework for building interactive applications. My goal is to again produce a book. The timeline for Recipes with Backbone:
  • Sep 30—alpha version (available on the cheap)
  • Oct 31—beta version
  • Nov 30—final version
But wait! That's not all! This time around I am teaming up with the incomparable Nick Gauthier, who has been doing incredible things with Backbone.js for many a month over at Shortmail.

My chains have always been about improving my craft and maximizing my learning. In my original chain, I accomplished this by blogging every day. The idea is that I need to learn something every day well enough to write about it. Some days worked better than others, but the overall investment was crazy worth it.

For my SPDY chain, I decided to take the learning and kick it up a notch. In addition to the daily blogging, I also committed to producing the best book about the SPDY protocol that I possibly could. How better to learn a thing than work my ass off to produce the best book on said thing? I am quite proud of the end result. The SPDY Book is a strong tome on the subject. You should totally buy it from me or from The Pragmatic Bookshelf who are also distributing it.

This time around, I am kicking it up yet one more notch by working with Nick. I learned a ton blogging every day. I increased my learning tenfold writing a book. What will happen when I blog every day, write a book, and co-author a book with a super talented subject matter expert? Could it be a learning singularity? Maybe.

Stick around to find out…

Wednesday, August 24, 2011

No Connection Header in Node-SPDY

‹prev | My Chain | next›

Up today, I hope to finally resolve issue #1 of express-spdy.

According to draft #2 of the SPDY specification, servers should not set Connection or Keep-Alive headers. In draft #3, that changed to must not set:
The Connection, Keep-Alive, Proxy-Connection, and Transfer-Encoding headers are not valid and MUST not be sent.
But, as issue #1 points out, the Connection header is being set and is being set to Keep-Alive.

Last night, I tracked that down to the following node-spdy code:
var Response = exports.Response = function(cframe, c) {
//...
  this._headers = {
    'Connection': 'keep-alive'
  };
//...
};
The solution seems simple enough—remove the Connection key-value pair from headers. But before doing that for real, I need to make sure that Chrome will continue to allow connections without it. This would not be the first time that Chrome failed to follow the SPDY spec.

So, I comment out that line in node-spdy, start up my test express-spdy app and see:

As always, it looks very much like a vanilla express.js app, but, checking the SPDY tab in Chrome's about:net-internals, I see a beautiful SPDY session ongoing:
t=1314238477571 [st= 47]     SPDY_SESSION_SYN_REPLY  
                             --> flags = 0
                             --> content-length: 50360
                                 content-type: text/html
                                 status: 200 OK
                                 version: HTTP/1.1
                                 x-powered-by: Express
                             --> id = 1
Nice. The Connection header is definitely missing from there. Even better, subsequent requests to the sample app produce normal SPDY SYN_STREAMs and SYN_REPLYs with nary a RST_STREAM in sight:
t=1314239371362 [st=893838]     SPDY_SESSION_SYN_STREAM  
                                --> flags = 1
                                --> accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
                                    accept-charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3
                                    accept-encoding: gzip,deflate,sdch
                                    accept-language: en-US,en;q=0.8
                                    host: jaynestown.local:3000
                                    method: GET
                                    referer: https://jaynestown.local:3000/one.html
                                    scheme: https
                                    url: /two.html
                                    user-agent: Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/535.2 (KHTML, like Gecko) Chrome/15.0.854.0 Safari/535.2
                                    version: HTTP/1.1
                                --> id = 9
t=1314239371388 [st=893864]     SPDY_SESSION_SYN_REPLY  
                                --> flags = 0
                                --> accept-ranges: bytes
                                    cache-control: public, max-age=0
                                    content-length: 49489
                                    content-type: text/html; charset=UTF-8
                                    etag: "49489-1309920451000"
                                    last-modified: Wed, 06 Jul 2011 02:47:31 GMT
                                    status: 200 OK
                                    version: HTTP/1.1
                                    x-powered-by: Express
Cool. So it seems that Connection/Keep-Alive can be safely removed.

Now to drive this implementation with tests. I revert my change and create a new test/response-test.js file. For now, I am only testing the response headers, so the overall vows.js structure will be:
vows.describe('SPDY Response').
addBatch({
  'headers': {
    topic: function() {
      // Setup connection request and send response here
    },
    'it should not contain a Connection header': function(cframe) {
      // Assertions here
    }
  }
})
The assertions ought to be fairly straightforward:
'it should not contain a Connection header': function(cframe) {
      assert.equal(cframe.headers['Connection'], undefined);
    }
The topic of the test requires a little more effort. The topic will actually come through the connection object's write method. So I need a connection object, a bare-bones request, and a response to send along that connection:
topic: function() {
      var callback = this.callback,
          connection = {
            zlib: spdy.createZLib(),
            write: function(cframe) { callback(null, cframe); }
          },
          req = { data: { streamID: 1 } },
          response = spdy.createResponse(req, connection);

      response._flushHead();
    }
Only, when I run this test, I get:
➜  node-spdy git:(master) ✗ vows test/response-test.js
✗

  headers
    ✗ it should not contain a Connection header
      » An unexpected error was caught: {"0":128,"1":2,"2":0,"3":2,"4":0,"5....>
The unexpected error actually goes on for a looong time. That's just the control frame itself. Nothing too interesting there, so why is it being reported as an "unexpected error"? Hrm... something seems vaguely familiar with that error. Where have I seen that before? Oh, yeah. Sheesh, I keep coming across that. I am beginning to think this may be a vows issue, not a Chris issue. Anyhow, the solution is to ensure that the callback is being called with two values:
write: function(cframe) { callback(null, cframe); }
And, with that, I have my test fixed. And by "fixed", I mean broken:
➜  node-spdy git:(master) ✗ vows test/response-test.js --spec

♢ SPDY Response

  headers
    ✗ it should not contain a Connection header
      » expected undefined,
        got      'keep-alive' (==) // response-test.js:31
 
✗ Broken » 1 broken (0.018s)
Now, I remove the "Connection" header from node-spdy's Response class for good and, just like that, I have BDD'd a fix for my issue:
➜  node-spdy git:(master) ✗ vows test/response-test.js --spec

♢ SPDY Response

  headers
    ✓ it should not contain a Connection header
 
✓ OK » 1 honored (0.005s)
Yay!

I'm not sure exactly what I'll work on tomorrow, but I'm pretty sure it's gonna be awesome.

Day #122

Tuesday, August 23, 2011

Batching Vows (in express-spdy)

‹prev | My Chain | next›

Over the past few nights, I think that vows.js has more or less beaten me into submission. I had been trying to do some setup outside of vows' batches, but bitter experience seems to be telling me that this is a bad idea.

Ultimately, what I want to do inside my test suite is establish a simple express-spdy server, open an SSL/SPDY connection, send a SPDY request, and test the response. Rather than do any of that outside of the tests themselves, I can do them in "batches" to be run sequentially. That way, I can be guaranteed that connections are ready when I expect them to be.

The server needs to be in the first "batch" of tests—it needs to be running in order for connections to be made:
vows.describe('Express SPDY response').
addBatch({
  '[setup]': {
    'establish a simple SSL express server': function() { // ... }
  }
})
I can then craft the SPDY request and open a client SSL connection in the next batch:
.addBatch({
  '[setup]': {
    'open an SSL connection to the server': function() { // ... },
    'craft a simple SPDY request': function() { // ... }
  }
})
In neither of these batches do I have a topic or an actual vow (as seen below, the "vows" just set up a server, open an SSL client connection, and create a request object). I am abusing vows batches to ensure that things are done in the proper order. I apply a convention of describing these setup batches with the '[setup]' label. I think it kinda-sorta almost works.

In the last batch, I send the created request over the established connection and perform the actual test:
.addBatch({
  'Sending the request': {
    topic: function () {
      // Send request here
    },
    'should get a response with no Keep-Alive': function(cframe) {
      // Assertions here
    }
  }
})
I put the sending of the request in a separate batch because it needs to have both the connection and request in place. Multiple contexts in the same batch can be run in any order (or in parallel). In practice, they are executed sequentially, but I would rather not set myself up for failure in the future.

I could have also made the actual sending of the packet a sub-context of the connection setup. The implementation would have been cleaner, but I stick with separate batches to keep the --spec output as pretty (and readable) as possible.
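For reference, the sub-context version would be shaped something like this (a hypothetical sketch of the structure, not what I committed). The inner topic receives the outer topic as its argument, which is how the connection gets passed along without globals:
.addBatch({
  'open an SSL connection to the server': {
    topic: function() {
      // Wrap the connection in an object literal so vows treats the topic as
      // a plain value rather than an event emitter.
      return {connection: tls.connect(PORT, 'localhost', options, function() {})};
    },
    'Sending the request': {
      topic: function(parent) {
        var callback = this.callback,
            parser = spdy.createParser(spdy.createZLib());

        parent.connection.pipe(parser);
        parser.on('cframe', function(cframe) {
          if (cframe.headers.type == spdy.enums.SYN_REPLY) callback(null, cframe);
        });

        parent.connection.write(spdy_request, function(){});
      },
      'should get a response with no Keep-Alive': function(cframe) {
        assert.equal(cframe.data.nameValues['connection'], undefined);
      }
    }
  }
})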

As mentioned, the actual implementation of the test is muddled a bit by the use of global variables that pass the connection and SPDY request between batches. Ick, I know, but this is test code. Maintainability, while important, is not as important as readability. So...
var connection,
spdy_request;

vows.describe('Express SPDY response').
...
The complete test looks like:
var connection,
    spdy_request;

vows.describe('Express SPDY response').
addBatch({
  '[setup]': {
    'establish a simple SSL express server': function() {
      var server = express.createServer(options);

      server.get('/', function(req, res){
        res.send('wahoo');
      });

      server.listen(PORT);
      return true;
    }
  }
}).
addBatch({
  '[setup]': {
    'open an SSL connection to the server': function() {
      connection = tls.connect(PORT, 'localhost', options, function() {});
    },
    'craft a simple SPDY request': function() {
      spdy_request = spdy.createControlFrame(
        spdy.createZLib(),
        { type: spdy.enums.SYN_STREAM, streamID: 1, flags: 0 },
        { version: 'HTTP/1.1', url: '/', method: 'GET' }
      );
    }
  }
}).
addBatch({
  'Sending the request': {
    topic: function () {
      var callback = this.callback;

      var parser = spdy.createParser(spdy.createZLib());
      connection.pipe(parser);

      parser.on('cframe', function(cframe) {
        if (cframe.headers.type == spdy.enums.SYN_REPLY) {
          callback(null, cframe);
        }
      });

      connection.write(spdy_request, function(){});
    },
    'should get a response with no Keep-Alive': function(cframe) {
      assert.notEqual(cframe, undefined);
      assert.notEqual(cframe.data, undefined);
      assert.notEqual(cframe.data.nameValues, undefined);
      assert.equal(cframe.data.nameValues['connection'], undefined);
    }
  }
})
And the output from the test is:
➜  express-spdy git:(master) ✗ vows test/response-test.js --spec

♢ Express SPDY response

  [setup]
    ✓ establish a simple SSL express server
  [setup]
    ✓ open an SSL connection to the server
    ✓ craft a simple SPDY request
  Sending the request
    ✗ should get a response with no Keep-Alive
      » expected undefined,
        got      'keep-alive' (==) // response-test.js:74

✗ Broken » 3 honored ∙ 1 broken (0.043s)
I rather like that. I have two, separate [setup] blocks for the server and the connection. Finally, I have my test that actually tries to send the request and validate that there is not a keep-alive issued on the connection.

At this point, I am good to go. I have my necessary failing test describing the undesired behavior. Just as importantly, I have nice, readable spec output.

So how do I fix that failure? Well, first I check through express-spdy, but it is kinda tiny:
➜  express-spdy git:(master) ✗ wc -l *js
  25 express.js
   1 index.js
  67 spdy.js
  93 total
There is nothing in there except for hooking this prototype up to that method and choosing this server implementation if that argument is present. It is really, really small.

So where do I need to make the change then? Ugh. If it is in express.js, things are going to get ugly fast. I will have to pull in a lot more than 93 lines of javascript to get that working. So, mostly in an effort to avoid the mere thought of that work, I start in node-spdy. Specifically, I look through response.js in node-spdy, when what do my wondering eyes see?
var Response = exports.Response = function(cframe, c) {
// ...
  this._headers = {
    'Connection': 'keep-alive'
  };
//...
};
Aw, fer cryin' out loud. I spent the last four nights flailing through self-inflicted bad vows.js code only to find that I am testing the wrong thing?

Yup, that's exactly what I did. Commenting out the 'Connection' line in node-spdy response.js leaves me with a passing test:
➜  express-spdy git:(master) ✗ vows test/response-test.js --spec

♢ Express SPDY response

  [setup]
    ✓ establish a simple SSL express server
  [setup]
    ✓ open an SSL connection to the server
    ✓ craft a simple SPDY request
  Sending the request
    ✓ should get a response with no Keep-Alive

✓ OK » 4 honored (0.031s)
Ah well, it is not a complete waste. I have a much better grasp on vows.js topics, contexts, sub-contexts and vows. I also have a very pretty spec for express-spdy (even if it is testing underlying node-spdy functionality). Up tomorrow I will switch back to node-spdy and pursue the issue there.


Day #122

Monday, August 22, 2011

More Fun with Vows.js App Server Testing

‹prev | My Chain | next›

Up tonight, I hope to write an actual vows.js test for issue #1 in express-spdy.

Last night I found the following error when running my simple, bullet tracer test:
➜  spdybook git:(master) cd ~/express-spdy
➜  express-spdy git:(master) ✗ vows test/response-test.js 

node.js:205
throw e; // process.nextTick error, or 'error' event on first tick
^
TypeError: Object 
Ultimately, I tracked that down to using an older version of node-spdy—one that didn't quite work with node.js version 0.5.5. Currently, that version is only available in the github repository, so I replace the npm installed node-spdy with a link to my local copy:
➜  express-spdy git:(master) ✗ cd node_modules 
➜  node_modules git:(master) ✗ ls
connect-spdy  express-unstable  spdy  vows
➜  node_modules git:(master) ✗ mv spdy spdy.bak
➜  node_modules git:(master) ✗ ln -s ~/repos/node-spdy spdy
➜  node_modules git:(master) ✗ cd ..
➜  express-spdy git:(master) ✗ 
Now, when I check my vows, I find:
➜  express-spdy git:(master) ✗ vows test/response-test.js
here
· 

✓ OK » 1 honored (0.014s)
Wow. That's just plain anti-climactic. I spent way too much time yesterday trying to get that working. And the ultimate solution is so simple as to be borderline silly. Ah well, such is progress sometimes.

And that is all that the passing test is—progress—because I am not actually testing anything yet. Just logging the word "here" in the vow:
vows.describe('Express SPDY response').addBatch({
  'with an ordinary SPDY GET request': {
    topic: function() {
      var connection = tls.connect(PORT, 'localhost', options, function(){});
      connection.write(spdy_request, this.callback);
    },
    'receives a SYN_REPLY' : function() {
      console.log("here")
    }
  }
}).export(module);
So I try to actually test the response, but wind up getting:
node.js:205
throw e; // process.nextTick error, or 'error' event on first tick
^
Error: ECONNREFUSED, Connection refused
at Socket._onConnect (net_legacy.js:606:18)
at IOWatcher.onWritable [as callback] (net_legacy.js:186:12)
After much fiddling, I find that moving the connection into the vows.js topic does the trick:
vows.describe('Express SPDY response').addBatch({
  'with an ordinary SPDY GET request': {
    topic: function() {
      var callback = this.callback;
      var connection = tls.connect(PORT, 'localhost', options, function(){console.log('**** connection')});
      connection.write(spdy_request, callback);
    },
    'receives a SYN_REPLY' : function(foo) {
      console.log(foo)
      console.log("here")
    }
  }
}).export(module);
The only problem I am left with is that the server never returns and the callback is returning empty arguments:
➜  express-spdy git:(master) ✗ node test/response-test.js --spec
Express server listening on port 23432
**** connection
undefined
here
· ✓ OK » 1 honored (0.020s)
wahoo
I suppose this is why the node-spdy module proper establishes servers and makes the connections in separate vows.js batches. I will investigate that tomorrow.

Day #121

Sunday, August 21, 2011

Sometimes There is No Substitute for Debugger

‹prev | My Chain | next›

Up tonight, I'd like to get back in the swing of express-spdy things by working on the only outstanding issue in the project's github tracker.

I think it would be fairly easy to just "fix" that issue, but I would like to take this opportunity to actually start testing express-spdy. I have been in the mode of getting it to work against Chrome rather than BDDing it. I have yet to even take some time to accurately describe how it ought to behave. Now is a good time to get started on that.

I will use the very nice vows.js testing framework for this. I know that express.js proper likes to use expresso for testing, but I am more familiar with vows.

First up in my test, I have a lot of requiring to do:
var vows = require('vows'),
    assert = require('assert'),
    express = require('..'), // express-spdy
    spdy = require('spdy'),
    fs = require('fs'),
    tls = require('tls');
I need vows and assert for testing. The rest are required for establishing and configuring clients and servers for the test.

I copy a few options from the node-spdy testing framework:
// Common options

var PORT = 23456;

var options = {
  key: fs.readFileSync(__dirname + '/../node_modules/spdy/keys/spdy-key.pem'),
  cert: fs.readFileSync(__dirname + '/../node_modules/spdy/keys/spdy-cert.pem'),
  ca: fs.readFileSync(__dirname + '/../node_modules/spdy/keys/spdy-csr.pem'),
  npnProtocols: ['spdy/2'],
  debug: true
};
With that, I can establish a test express-spdy server:
// Server

var app = express.createServer(options);

app.get('/', function(req, res){
  res.send('wahoo');
});

app.listen(PORT);
That server should reply with nothing more than the string "wahoo", which should be easy enough to test. But first, I need to craft a SPDY SYN_STREAM to be used to request data from the express-spdy server:
// Client

var zlib = spdy.createZLib();

var spdy_request = spdy.createControlFrame(zlib,
  { type: spdy.enums.SYN_STREAM, streamID: 1, flags: 0 },
  { version: 'HTTP/1.1', url: '/', method: 'GET' }
);
Lastly, a vows topic that creates the connection and sends the response into the vows via the this.callback mechanism built into vows.js:
vows.describe('Express SPDY response').addBatch({
  'spdy.createServer': {
    topic: function() {
      var connection = tls.connect(PORT, 'localhost', options, function(){});
      connection.write(spdy_request, this.callback);
    },
    'foooo' : function() {
      console.log("here")
    }
  }
})
For now, I am not actually testing anything in here. I am fine just making it to that console.log statement. Unfortunately, when I run my test, I am not coming close to reaching it:
➜  express-spdy git:(master) ✗ vows test/response-test.js

node.js:205
throw e; // process.nextTick error, or 'error' event on first tick
^
TypeError: Object
Bah! I am not getting any kind of back trace in there to tell me where things are going wrong. Debugging with console.log tells me that the error is somewhere inside createControlFrame:
var spdy_request = spdy.createControlFrame(zlib,
  { type: spdy.enums.SYN_STREAM, streamID: 1, flags: 0 },
  { version: 'HTTP/1.1', url: '/', method: 'GET' }
);
But where?

To answer that, I drop down to the node debugger:
➜  express-spdy git:(master) ✗ node debug test/response-test.js
debug> run
debugger listening on port 5858connecting... ok
breakpoint #1 in #<Object>.[anonymous](exports=#<Object>, require=function require(path) {
return self.require(path);
}, module=#<Module>, __filename=/home/cstrom/repos/express-spdy/test/response-test.js, __dirname=/home/cstrom/repos/express-spdy/test), /home/cstrom/repos/express-spdy/test/response-test.js:1
(function (exports, require, module, __filename, __dirname) { var vows = require('vows'),
^
debug> step
...
debug> s
break in nvsToBuffer(zlib=#<Object>, headers=#<Object>, nvs=#<Object>), /home/cstrom/repos/express-spdy/node_modules/spdy/lib/spdy/protocol.js:132
buff.writeUInt16(nvsCount, 0, 'big');
After stepping through much of the code, I eventually see that the cause is a call to writeUInt16. Dammit. I am using the wrong freaking version of node-spdy with node 0.5.5. Ah well, that's easy enough to fix. Tomorrow.


Day #120

Saturday, August 20, 2011

Honoring Vows and Events in Node-SPDY

‹prev | My Chain | next›

I found yesterday that tests in node-spdy master were failing with the most recent version of vows.js. Since I am working with edge node-spdy, edge node.js and a version of vows.js that is less than a week old, it took some time to track down the issue.

The failure looks like:
➜  node-spdy  vows --spec ./test/spdy-basic-test.js

♢ SPDY/basic test

✗ Errored » callback not fired
    in spdy.createServer
    in SPDY/basic test
    in test/spdy-basic-test.js
✗ Errored » 1 errored ∙ 1 dropped
Since vows is meant to test a reactor pattern framework, it has the concept of testing callbacks built-in. But the thing is, the failing test is not testing a callback, it is just trying to test whether or not the createServer function returns a SPDY server thingy:
vows.describe('SPDY/basic test').addBatch({
  'spdy.createServer': {
    topic: function() {
      return spdy.createServer(options);
    },
    'should return spdy.Server instance': function (_server) {
      assert.instanceOf(_server, spdy.Server);
      server = _server;
    }
  }
}).addBatch({
//...
Clever placement of console.log statements proves that spdy.createServer() is, indeed, returning a SPDY server thingy, but vows is not even making it into the assertion function. It is failing because the supposed callback is not being fired. What gives?

Well, I tracked this down to a recent commit in vows.js: 76565ef. Specifically, the heuristic used to decide if a topic is a callback thingy changed from looking at the constructor:
// If the topic doesn't return an event emitter (such as a promise),
// we create it ourselves, and emit the value on the next tick.
if (! (topic && topic.constructor === events.EventEmitter)) {
// ... 
Instead, vows now asks if the topic is an instanceof events.EventEmitter:
// If the topic doesn't return an event emitter (such as a promise),
// we create it ourselves, and emit the value on the next tick.
if (! (topic && (topic instanceof events.EventEmitter))) {
// ... 
In node-spdy, createServer uses the Server constructor:
core.createServer = function(options, requestListener) {
  return new Server(options, requestListener);
};
Thus, the earlier version of vows would have moved right past such a topic—the constructor is Server, not events.EventEmitter. But... events.EventEmitter is part of the prototype chain by virtue of a util.inherits call:
var Server = core.Server = function(options, requestListener) {
  // ...
};
util.inherits(Server, tls.Server);
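To make the difference between the two heuristics concrete, a quick sketch of my own (not code from vows or node-spdy, and inheriting straight from EventEmitter rather than tls.Server to keep it small):
var util = require('util'),
    events = require('events');

// A constructor wired up the same way node-spdy wires its Server.
function FakeServer() {
  events.EventEmitter.call(this);
}
util.inherits(FakeServer, events.EventEmitter);

var server = new FakeServer();

// Old vows heuristic: compare constructors directly.
console.log(server.constructor === events.EventEmitter); // false
// New vows heuristic: walk the prototype chain.
console.log(server instanceof events.EventEmitter);      // true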
So, now that vows is checking the entire prototype chain (and probably rightly so) to see if the topic is an events.EventEmitter, the first node-spdy test is no longer valid:
vows.describe('SPDY/basic test').addBatch({
  'spdy.createServer': {
    topic: function() {
      return spdy.createServer(options);
    },
    'should return spdy.Server instance': function (_server) {
      assert.instanceOf(_server, spdy.Server);
      server = _server;
    }
  }
}).addBatch({
//...
But how to fix it?

I could argue that the test itself is weak and should go away. Most, if not all, of the remaining tests would fail if, somehow, spdy.createServer were to start returning a String. Still, it's a nice sanity check.

So I wrap the topic in a thin, non-events.EventEmitter wrapper. More specifically, I wrap it in an object literal:
vows.describe('SPDY/basic test').addBatch({
  'spdy.createServer': {
    topic: function() {
      return {server: spdy.createServer(options)};
    },
    'should return spdy.Server instance': function (topic) {
      server = topic.server;
      assert.instanceOf(server, spdy.Server);
    }
  }
}).addBatch({
//...
That is kind of a hack, but for a weak test, it is not that much of a stretch. Or maybe I am just rationalizing. I've been known to do that.

Anyhow, I have the test passing again:
vows runner running SPDY/basic test 
♢ SPDY/basic test

spdy.createServer
✓ should return spdy.Server instance
That is a fine stopping point for today. Up tomorrow, hopefully I can actually get back to the bug that I had hoped to investigate before I came across my latest rabbit hole.

Day #119

Friday, August 19, 2011

Broken Vows in Node-SPDY

‹prev | My Chain | next›

For some reason tonight, I give the test suite in node-spdy a try. I am greeted with the following:
➜  node-spdy  node test/spdy-basic-test.js

✗ Errored » callback not fired
in spdy.createServer
in SPDY/basic test
in undefined✗ Errored » 1 errored ∙ 1 dropped
I am unsure what could have changed to break the vows.js tests since the last release of node-spdy. Hopefully it wasn't my changes last night.

Checking things out with node 0.5.3 (instead of edge node 0.5.5), I see that the suite is still passing:
➜  node-spdy  vows --spec test/spdy-basic-test.js

♢ SPDY/basic test

spdy.createServer
✓ should return spdy.Server instance
Listening on this server instance
✓ should be successfull
Calling spdy.createZLib
✓ should return instance of spdy.ZLib
Creating new connection to this server
✓ should receive connect event
Creating parser and waiting for SYN_REPLY
✓ should end up w/ that frame
Sending control SYN_STREAM frame should be successfull and sending request body
✓ should emit data and end events on request
Creating parser and waiting for Data packet
✓ should end up w/ that frame
When calling server.close
✓ all connections will be dropped

✓ OK » 8 honored (0.038s)
Just to be 100% certain that I had not messed up earlier by installing different versions of vows locally and globally or something else along those lines, I reinstall the latest against node 0.5.5 locally:
➜  node-spdy  npm install vows
vows@0.5.10 ./node_modules/vows
└── eyes@0.1.6
...and globally:
➜  node-spdy  npm install vows -g
/home/cstrom/local/node-v0.5.5/bin/vows -> /home/cstrom/local/node-v0.5.5/lib/node_modules/vows/bin/vows
eyes@0.1.6 /home/cstrom/local/node-v0.5.5/lib/node_modules/vows/node_modules/eyes
vows@0.5.10 /home/cstrom/local/node-v0.5.5/lib/node_modules/vows
And still, when I run my tests, I find:
➜  node-spdy  vows --spec ./test/spdy-basic-test.js

♢ SPDY/basic test


✗ Errored » callback not fired
in spdy.createServer
in SPDY/basic test
in test/spdy-basic-test.js
✗ Errored » 1 errored ∙ 1 dropped
Hrm... that callback not fired message used to apply only to topics that do not return a value. There is definitely a return value in "SPDY/basic test":
vows.describe('SPDY/basic test').addBatch({
  'spdy.createServer': {
    topic: function() {
      return spdy.createServer(options);
    },
    'should return spdy.Server instance': function (_server) {
      assert.instanceOf(_server, spdy.Server);
      server = _server;
    }
  }
}).addBatch({
// ...
Even stranger is that the "callback not fired" message seems to be coming from the vows topic, not the test itself.

My next step is to boil down the batch to absolute basics:
vows.describe('SPDY/basic test').addBatch({
  'spdy.createServer': {
    topic: spdy.createServer(options),
    'should return spdy.Server instance': function (_server) {
      console.log("*****")
      assert.equal(42, 42);
    }
  }
}).export(module);
Still it fails. And still, it is not making it into the actual vow.

Next, I replace the vows being used with a local copy of the github clone:
➜  ~  cd ~/repos
➜  repos  git clone https://github.com/cloudhead/vows.git
Cloning into vows...
➜  repos  cd vows
➜  vows git:(master) npm install eyes
eyes@0.1.6 ./node_modules/eyes

➜  vows git:(master) cd ~/repos/node-spdy/node_modules 
➜  node_modules git:(master) ✗ mv vows vows.npm
➜  node_modules git:(master) ✗ ln -s ~/repos/vows
I am pretty sure that npm has arguments that would let me symlink a local repository like that, but I know I remember the ln -s arguments.
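For the record, the npm way is almost certainly npm link (assuming the npm bundled with this node has it), which would look something like:
cd ~/repos/vows
npm link            # register the local clone globally
cd ~/repos/node-spdy
npm link vows       # point node_modules/vows at that clone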

Unfortunately, I am unable to isolate the error after that.


Day #118

Thursday, August 18, 2011

Mitigation Strategy for Unstable writeUInt32 in Node.js

‹prev | My Chain | next›

Last night I found that the protocol library in node-spdy is broken on node 0.5.4. It is even worse on master / 0.5.5.

The problem occurs in writeUInt32 (and its 16-bit equivalent). In a binary protocol like SPDY, writing integers is kind of a big deal. The endianness of those integers used to be specified by a string argument ('big' or 'little'). Now it is specified in the method name itself:
Buffer.prototype.writeInt32BE = function(value, offset, noAssert) {
  //...
};
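The suffix now does the job the string argument used to do. A quick sketch of my own showing the byte-order difference (assuming a node that has the suffixed methods):
var buf = new Buffer(4);

buf.writeUInt32BE(0x12345678, 0);
console.log(buf); // <Buffer 12 34 56 78>

buf.writeUInt32LE(0x12345678, 0);
console.log(buf); // <Buffer 78 56 34 12>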
I think that my strategy for dealing with this will be to check the Buffer prototype for a writeUInt32BE method. If it is not there, I will add my own that wraps the old writeUInt32 method.

So, first up, I search and replace all instances of the UInt methods with their big- or little-endian equivalents. Thus, this:
result.writeUInt32(headers.streamID & 0x7fffffff, 0, 'big');
...Becomes:
result.writeUInt32BE(headers.streamID & 0x7fffffff, 0);
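As an aside, the & 0x7fffffff mask clears the most significant bit of the 32-bit field, since stream IDs are only 31 bits in SPDY framing. A quick sanity check of my own:
var streamID = 0x80000001;
console.log((streamID & 0x7fffffff).toString(16)); // '1'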
Since the SPDY SETTINGS frame continues to violate the spec by being little endian, I do need to be careful to replace some UInt calls with an LE equivalent:
exports.createSettingsFrame = function(zlib, settings) {
  // ...
  buff.writeUInt32LE(raw_key & 0xffffff, offset);
  // ...
}
I also have to make similar changes to the readUInt equivalents in parser.js, but, once those changes are in place, I am able to run an express-spdy site on node 0.5.5-pre:



And the SPDY tab confirms that it is, indeed, a SPDY session:



OK great. I have node-spdy working with the soon-to-be-released node 0.5.5, but what about previous versions of node like 0.5.3? (again, I am skipping 0.5.4 completely)

To handle that case, I check the Buffer prototype. If it defines writeUInt32BE, then all is well and no changes are required. If it does not, then I need to add it myself:
/**
 * Compatibility with older versions of node
 */
if (!Buffer.prototype.writeUInt32BE) {
  Buffer.prototype.writeUInt32BE = function(value, offset, noAssert) {
    this.writeUInt32(value, offset, 'big');
  };
  Buffer.prototype.writeUInt32LE = function(value, offset, noAssert) {
    this.writeUInt32(value, offset, 'little');
  };
  Buffer.prototype.writeUInt16BE = function(value, offset, noAssert) {
    this.writeUInt16(value, offset, 'big');
  };
}
If an older version of node is being used, one that does not define writeUInt32BE, I define it myself using the same method signature as node 0.5.5 (because they wouldn't change it again, would they?) and simply pass the parameters along to the writeUInt32 method that existed back then.

I do the same for the readUInt equivalents over in parser.js:
/**
 * Compatibility with older versions of node
 */
if (!Buffer.prototype.readUInt32BE) {
  Buffer.prototype.readUInt32BE = function(offset, noAssert) {
    this.readUInt32(offset, 'big');
  };
  Buffer.prototype.readUInt16BE = function(offset, noAssert) {
    this.readUInt16(offset, 'big');
  };
}
Easy peasy.

First I retry my test express-spdy app to verify that nothing broke. Nothing did break. I still have my nice SPDY sessions going on.

Now the fun part. I start up the sample express-spdy app with node 0.5.3:
➜  express-spdy-test  ~/local/node-v0.5.3/bin/node app.js
And when I access the site in Chrome... nothing. There are no errors in the console, but the browser is not doing anything other than showing the little spinny throbber.

Well, no errors is a good sign. At least nothing is crashing. So, for the most part everything is hooked up, but maybe not writing properly. Or... reading.

Ack! Dang it, this is Javascript, not Ruby and not even Coffeescript. This is Javascript and my read statements need to return values:
Buffer.prototype.readUInt32BE = function(offset, noAssert) {
  return this.readUInt32(offset, 'big');
};
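The 16-bit read shim needs the same return; for completeness, the analogous fix (a sketch, not shown in my notes above) is:
Buffer.prototype.readUInt16BE = function(offset, noAssert) {
  return this.readUInt16(offset, 'big');
};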
Better.

With that... it actually works! Until they change writeUInt32 again. Ah well, at least I have a decent mitigation strategy in place now!

Expect a new version of node-spdy to hit npm in a day or so.


Day #117

Wednesday, August 17, 2011

Express-Spdy on Node 0.5.4

‹prev | My Chain | next›

With my git-scribe changes more or less complete (in my fork), I am ready to switch focus again back to SPDY. I do not believe that this will necessitate any updates to The SPDY Book (though who knows?). This is more just clean-up and maintenance that I need to do before moving on.

Up today, I am going to finally tackle the first (and so far only) known issue affecting express-spdy: keep-alive HTTP headers. The SPDY protocol spec explicitly prohibits keep-alives. Although Chrome is not choking on these currently, it would be good to eliminate them.

But even before that, I may as well ensure that everything is still working on node 0.5.4. So I download 0.5.4 and install it per the express-spdy install instructions (skipping the openssl install because I already had that working). Along the way, I update the install instructions to reference unstable node (0.5) rather than installing from github.

With that, I am ready to load things up. Sadly, instead of seeing my nice CA signed certificate, I am greeted by a connection error. Checking the server, I see:
➜  express-spdy-test  node app.js
Express server listening on port 3000

assert.js:104
throw new assert.AssertionError({
^
AssertionError: missing or invalid endian
at Buffer.writeUInt32 (buffer.js:860:10)
at /home/cstrom/repos/express-spdy-test/node_modules/express-spdy/node_modules/spdy/lib/spdy/protocol.js:72:8
at SPDYServer.<anonymous> (/home/cstrom/repos/express-spdy-test/node_modules/express-spdy/node_modules/spdy/lib/spdy/core.js:113:13)
at SPDYServer.emit (events.js:70:17)
at SecurePair.<anonymous> (tls.js:819:14)
at SecurePair.emit (events.js:64:17)
at SecurePair.maybeInitFinished (tls.js:649:10)
at CleartextStream._push (tls.js:294:17)
at SecurePair.cycle (tls.js:617:20)
at EncryptedStream.write (tls.js:121:13)
Ack! Looks as though I shall be yak shaving today instead of investigating defects. Ah well, there is always tomorrow for defects.

To track this down, I fetch and merge the recent changes to node:
➜  node git:(master) git fetch origin
remote: Counting objects: 6833, done.
remote: Compressing objects: 100% (2047/2047), done.
remote: Total 5769 (delta 4367), reused 4951 (delta 3658)
Receiving objects: 100% (5769/5769), 12.95 MiB | 152 KiB/s, done.
Resolving deltas: 100% (4367/4367), completed with 625 local objects.
From https://github.com/joyent/node
8971b59..4cf931d  master     -> origin/master
58a1d7e..5e37e10  v0.4       -> origin/v0.4
* [new branch]      v8-3.1     -> origin/v8-3.1
From https://github.com/joyent/node
* [new tag]         v0.4.10    -> v0.4.10
* [new tag]         v0.4.9     -> v0.4.9
* [new tag]         v0.5.0     -> v0.5.0
* [new tag]         v0.5.1     -> v0.5.1
* [new tag]         v0.5.2     -> v0.5.2
* [new tag]         v0.5.3     -> v0.5.3
* [new tag]         v0.5.4     -> v0.5.4

➜  node git:(master) git merge origin/master
...
create mode 100644 tools/gyp/tools/pretty_gyp.py
create mode 100755 tools/gyp/tools/pretty_sln.py
create mode 100755 tools/gyp/tools/pretty_vcproj.py
create mode 100755 tools/gyp_node
delete mode 100644 tools/nodejs.pc.in
Wow. Didn't think it had been that long since I updated my local copy of node.

I eventually track down the stacktrace to b7c23ac3 in node.js from last week. Blah.

In that commit (and in node 0.5.4), the writeUInt32 method takes a boolean option to describe the endianness of the Buffer object being written:
Buffer.prototype.writeUInt32 = function(value, offset, bigEndian) {
  //...
  assert.ok(typeof (bigEndian) === 'boolean',
    'missing or invalid endian');
  //...
}
In previous versions of node, the endianness was set via a string:
Buffer.prototype.writeUInt32 = function(value, offset, endian) {
  //...
  assert.ok(endian !== undefined && endian !== null,
    'missing endian');

  assert.ok(endian == 'big' || endian == 'little',
    'bad endian value');
  //...
};
This is how node-spdy is invoking writeUInt32:
buff.writeUInt32(assocStreamID & 0x7fffffff, 4, 'big');
Since 0.5.4 is asserting that the third parameter is a boolean, I get the backtrace.
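In other words (my illustration, not code from node-spdy), only the boolean form gets past the 0.5.4 assertion:
var buff = new Buffer(8);

buff.writeUInt32(0x00000001, 4, true);   // fine on 0.5.4: true means big-endian
buff.writeUInt32(0x00000001, 4, 'big');  // AssertionError: missing or invalid endian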

Complicating matters even further, however, is the fact that writeUInt32 has been removed entirely from master on github (thanks to mscdex in #nodejs for pointing that out). In the next unstable version of node, it will have to be invoked via writeUInt32BE or writeUInt32LE. I'm not sure I'm a fan of including the signedness, the size, the type, and the endianness in the method name, but it is what it is. Blah.

I think my best strategy is going to be to add a compatibility layer into node-spdy to invoke the proper writeUInt method with the proper arguments. That's gonna get ugly fast, but the first thing that I will do is to explicitly NOT support express-spdy on 0.5.4. The 0.5.4 release should be the only one with the boolean version of writeUInt32. Hopefully after that, I can stick with writeUInt32BE.

I update the express-spdy README and INSTALL instructions. Next, I update the supported engines in express-spdy's package.json to indicate non-support of 0.5.4 (I will update again once 0.5.5 has been released):
//...
"engines": {
  "node": ">= 0.5.0-pre < 0.5.4"
},
//...
Lastly, I publish express-spdy to the npm registry:
➜  express-spdy git:(master) npm install .
Now, if I try to install express-spdy, I get:
➜  tmp  node --version
v0.5.4
➜  tmp  npm install express-spdy
npm ERR! Unsupported
npm ERR! Not compatible with your version of node/npm: express-spdy@0.0.5
npm ERR! Required: {"node":">= 0.5.0-pre < 0.5.4"}
npm ERR! Actual:   {"npm":"1.0.25","node":"v0.5.4"}
Last up tonight, I try out express-spdy on node 0.5.3. Again, I follow express-spdy's INSTALL instructions. With node 0.5.3, the latest express-spdy installs. More importantly, the site loads:

And I have valid SPDY going on:

Well, that went sideways pretty quickly, but hopefully I recovered enough to get express-spdy into a usable state. Up tomorrow, I think that I will install node from master so that I can get a head start on node-spdy / express-spdy on node 0.5.5.


Day #116