The node.js aesthetic (substack.net)
136 points by substack on Nov 30, 2011 | hide | past | favorite | 76 comments


>You don't need to reason about multiple instruction pointers or mutexes or re-entrant, interruptible execution because all the javascript you write just lives in a single thread.

I see... On the other hand, when I write request-handling code using (any web framework ever on any platform), you're suggesting that I should worry about mutexes and reentrancy (not to mention that handling a single request uses multiple threads?) and that these details aren't already handled by (any web framework ever)?

Interesting.


Node is not a web framework. It's something one might use to build a web framework. So it's a bit silly to say that 'any web framework ever' already deals with this for you: well, duh, but the framework author certainly had to deal with thread safety (possibly incorrectly). And don't say no framework user has ever screwed up shared state because they didn't realize how threadlocal works or that requests might be handled in another thread...


> but the framework author certainly had to deal with thread safety

Usually not. If your framework builds on the common web integration layer for the target environment, i.e. Java Servlets, WSGI in Python, etc., you're already handed a request scope within a thread. It's only web servers and containers that really have heavy lifting to do.


Okay. Clone the Rails repo and do this: git log --grep thread


To be fair, though, Django is not thread-safe. While I respect Rails for making the effort to become thread-safe (IIRC, it was a Google Summer of Code project), it doesn't seem to be a must-have for a framework.


My understanding is:

When you are using a ruby or python based web framework, each outbound request is a blocking request, i.e. you wait for the response before proceeding further (e.g. urllib.urlopen()), which causes the number of requests handled per second to go down. Using eventmachine or gevent, respectively for ruby or python, is one way to overcome this.

In node.js there is no such concept of blocking requests and everything is processed in a single thread. But you can always fork multiple processes using its cluster module: http://nodejs.org/docs/latest/api/cluster.html
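The non-blocking model described above can be sketched in a few lines (illustrative only; setTimeout stands in for any asynchronous I/O, such as a database query or an urlopen-style fetch):

```javascript
// Two "requests" started back to back; neither blocks the other.
// setTimeout stands in for any async I/O (a DB query, an HTTP fetch).
var order = [];

function handleRequest(name, delayMs, done) {
  order.push(name + ':start');
  setTimeout(function () {        // the I/O completes later...
    order.push(name + ':done');   // ...and the callback runs on the same thread
    done();
  }, delayMs);
}

var pending = 2;
function finished() {
  if (--pending === 0) {
    // The slow request did not block the fast one:
    console.log(order.join(' '));
    // a:start b:start b:done a:done
  }
}

handleRequest('a', 50, finished);
handleRequest('b', 10, finished);
```

Both handlers start immediately; completion order depends only on the I/O, not on which was started first.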

edit: Oh, I was not trying to advocate anything. node.js is just one choice, in case I didn't make myself very clear when I said "My understanding is:"


Ruby and Python (not to mention basically every other language) have a plethora of threaded, forking, and evented models for concurrency. Thin/Rack/Sinatra, for example, behaves capably in thread-per-request mode. Reactor systems are not the only choice (nor are they even preferable in many cases).


Blocking vs. non-blocking has nothing to do with the language; e.g., see Tornado for async request handling in Python.


How did you find gevent and totally miss Twisted, which is linked on the front page of nodejs.org?

There are a handful of frameworks that assume every request is fully concurrent and non-blocking. Twisted-based stuff is the most prevalent for Python, like Nevow, Athena, and the stuff baked into twisted.web. However, WSGI itself doesn't say anything about the concurrency of individual requests, and it's totally possible for WSGI requests to be multithreaded, multiprocess, or otherwise concurrent.


I think what he is getting at is that if you want to handle multiple requests within the same process (in order to have in-process shared state, so you can make your chat server without external dependencies or whatever), you either need an event loop, threads, or both.

If you have threads, you do have to worry about mutexes and reentrance.


none of that has anything to do with writing web-request-handling code, it has to do with writing custom servers.

There are plenty of asynchronous solutions for writing servers, including the very widely used Twisted platform as well as the Erlang programming language. It's a tad annoying that node.js is touted largely by front end developers as solving a supposedly previously unaddressed problem.

to wit:

> emerging set of programming conventions, philosophies, and values that I see evolving in the node.js community

no, sorry, they've already emerged, they've already evolved. Go download Twisted. Use node.js if you happen to like it better. But it's not the fricking messiah.


As someone who has battled with Twisted, I don't think this is a fair rebuttal. Twisted does _allow_ you to create servers, but it makes it pretty difficult and messy.

Node allows you to do the same thing very elegantly.

I don't think the problem is unaddressed, but node has the most elegant solution I've seen.


The asynchronous aspects were a very small piece of the article for a reason. Node is doing a lot of very interesting, important, and novel things that aren't related to asynchronous events at all. Twisted in particular suffers from too much exposed surface area and having to manage the reactor yourself, which hurts reusability a lot.


I'd prefer to see a detailed article about that.

Also I'd like to see more detail on why exactly writing custom servers is so profoundly important all of a sudden. I hardly see the advantage of even nginx over apache, though that's a different issue.


WebSockets, probably. None of the traditional server-side web frameworks (Ruby on Rails, Django, Java Servlets, ...) can deal with them, because you need to maintain an open connection with each client. That also makes asynchronous I/O important (a feature of nginx over apache).

Most WebSockets solutions are clumsy at best, because you need to run something independent from the rest of your web stack. Unless you use solutions like Wt (http://www.webtoolkit.eu/wt).


What do you mean when you say "independent of your stack"?

memcached, postgres, rabbit, etc aren't written in the language I use, but they're very much a part of my stack.

Using an external application to hold the WebSocket connections and be the middleman between your app code and the user is not automatically clumsy. Clumsy would be re-writing your app in JavaScript because you believe it's the only language suited to WebSockets.


Did you even read the line you quoted? He didn't say "emerging set of software capabilities." The programming conventions, philosophies, and values of the Node.js community are evolving. Your response doesn't make any sense.


You hear this when people praise node.js, and it might be true for all I know, but I have yet to see a concrete example comparing node.js code to the same code in another framework/language. That would be very interesting and would help non-node.js users understand the benefits.


As soon as you're doing anything beyond CRUD on a database, you will probably have to worry about them.

Unless Erlang blablabla...


When requests share resources in a threaded web framework, the broad consensus is to push the shared state into an externally managed resource. An external resource such as a database will manage the locking for you to prevent write conflicts. However, when caching values for complex interactions beyond CRUD, you may have update conflicts where a new request holds an in-memory copy of an outdated resource. This is where the problem lies.


Well, that's a cache invalidation issue. Any data that's cached may, by definition, not be the freshest version, unless you've implemented a very nice write-through setup. If you'd like multiple requests to share an in-memory-only cache, potentially using write-through, then yes, that involves synchronization issues. I wouldn't characterize them as super-tough synchronization issues, and you certainly won't have a "blocking IO" problem with an in-memory system.
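For concreteness, here is a minimal write-through cache in a single-threaded setting (the store and key names are invented); the last lines show exactly the staleness problem that appears when a write bypasses the cache:

```javascript
// In-memory write-through cache over a (simulated) backing store.
var store = {};   // stands in for a database or memcached
var cache = {};

function write(key, value) {
  store[key] = value;   // write-through: hit the backing store first...
  cache[key] = value;   // ...then the cache, so cached reads stay fresh
}

function read(key) {
  if (key in cache) return cache[key];
  return (cache[key] = store[key]);  // fill the cache on a miss
}

write('greeting', 'hello');
store.greeting = 'changed behind our back';  // a write that bypassed the cache
console.log(read('greeting'));  // 'hello' -- stale: the invalidation problem
```

As long as every writer goes through write(), cached reads are always fresh; the moment anything updates the store directly, the cache serves stale data.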

If you want to write your own caching server, fine, use node.js. But now you're writing your own cache server, you've found some problem that memcached, redis, etc. all do not solve. Is this an everyday use case ?


For me it is. However I tend to just use in-memory caches instead of a cache server, which may suffer the same issue. For standard CRUD websites this is a non-issue; for task/activity-based sites that can have long-lived tasks persisting after a request, it is very important.


Wait, how does Node solve this where others don't?


By giving a shared in-memory resource for the current state of an object that is not mutated by other threads during a single stack build/teardown. Concurrent access from threads requires locking the object until it is in a determinate state, while in Node you are guaranteed the state until the stack unwinds.
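A tiny illustration of that guarantee (an invented example): a mutation scheduled from elsewhere cannot preempt a synchronous sequence; it only runs after the current stack has unwound.

```javascript
var account = { balance: 100 };

// A mutation scheduled "concurrently": it can only run once
// the currently executing stack has fully unwound.
setTimeout(function () { account.balance = 0; }, 0);

// Synchronous read-check-act sequence: guaranteed consistent,
// because no other callback can run in the middle of it.
var before = account.balance;
account.balance -= 30;
var after = account.balance;

console.log(before, after);  // 100 70 -- the timeout has not fired yet
```

In a preemptive threaded model the read-check-act above would need a lock; here the event loop's run-to-completion semantics provide the same guarantee for free.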


But this isn't unique to Node, is it? It seems like you're saying you've added a feature when you've really removed one. You can create a single threaded event loop in any language and get the same properties, no?

I'd argue something like Clojure is actually providing a feature here, instead of taking something away from your toolkit. You can have a single view of an object for the life of, say, a request all while its root binding is actually being mutated by other threads. You won't "see" the new value until you deref the root binding on the next request. Except nobody stole threads from you and sent you a bill.


Perhaps it is taking away a "feature" in some sense, but in my view it is the logical step of not allowing concurrent access to preempt during execution. I often don't want a value changing out from under the original context because of an asynchronous task (IP addresses of internal servers changing while a script was running came up today).

There is no way preemptive access / memory contention is a feature. Clojure avoids it with largely immutable state, which can make keeping values up to date painful, although I may not be experienced enough to say much about Clojure.

For web services such as ours, where values change underneath us, it is elegant that a value stays the same through a single flow of control (until the stack unwinds). Even if it is incorrect for one part of the task as a whole, it is predictable where values can change, and dealing with errors from pointing to the wrong object/value is trivial compared to most race conditions (yes, Node has those too, before anyone jumps in).

The environment here is key, though. Node was built as a single-threaded event loop. All the bindings and libraries for Node expect this. Libgmp's love of aborting threads after a process gives it a wrong value is a good example of where a single-threaded environment fights the threaded model, and the same assumption of threads is apparent in many programming environments (.NET HTTP stack, I'm looking at you).

So in many ways, Node does not give you something that cannot be done in other environments; but in other environments there is a lot of existing code that encourages thread usage. Doing something in Twisted or the like proved difficult once I needed libraries that had been written expecting threads. The same is true in Node, but I can be confident that good libraries/bindings for Node expect to work in a single-threaded event loop. And I like the command-queue / event-loop / actor-based / reactive / whatever-you-want-to-call-it model. I like it more than anything for the lack of concurrent edits while still allowing a lot of mutability.


Ah, but node really just presents the appearance of a single thread; underneath it's an evented thread pool managed by the runtime (or some such magic).


Only one thread is used for computation (afaik); the rest are used to toss I/O onto. gevent/eventlet in Python do the same, as do other languages. Nothing unique there.


Hmm, seems a bit hand-wavy to me, but to be fair I have never tried Node.js and would probably have a hard time convincing my fingers to type javascript code for server things.

Javascript always seemed to me a language that you used, and tried to use correctly, because you had to. On a server I have the choice, right? My current choice is Python, so I read this article with some bias.

In the section "batteries not included":

> modules in the core distribution get an unfair advantage over the libraries in userspace by virtue of availability and prominence.

I would call it "unfair" if some modules had access to special backdoors and APIs, but that seems not to be the case for most modules I checked in Python. For instance, 20 minutes ago I did vim /usr/lib64/python2.6/unittest.py and could inspect the code directly. I saw no special magic that could not be provided by other modules, like nose or py.test. Moreover, the code did not look neglected, even if it did not look the most modern.

> experimenting, and iterating is much harder

Well, that's what I like the most with core modules: they don't change overnight, and, while some of them like urllib(2) may be replaced because they have a better competitor, most of them are just good old friends, like scipy, that don't need to be put upside down every month because someone found a slightly more elegant way to call two of its functions.

> "core modules"' advantage evaporates in the face of baked-in concurrent library versioning and sophisticated package management.

I was never considered a shy sysadmin back when I was one. I am actually strongly against the "tool X has misbehaved once, therefore tool X is evil and will never set foot on my machine again" philosophy. I know some guys who are. (I was also a sound engineer before, and almost all musicians I met are this way, by the way.) But still, having dealt with library version issues sometimes, I think "concurrent library versioning" and "sophisticated package management" sound awfully nightmarishly black-magic to me. I guess I would be more on the "let's understand the most of what happens and not change what don't need to" kind.


> But still, having dealt with library version issues sometimes, I think "concurrent library versioning" and "sophisticated package management" sound awfully nightmarishly black-magic to me. I guess I would be more on the "let's understand the most of what happens and not change what don't need to" kind.

Concurrent versioning lets you do the "not change what don't need to" part really, really well. Versions of packages that have been proven to work well with one another can continue to do so, because if you need a newer version of some library, both the old and the new version can live harmoniously in the same codebase. You don't need to be paranoid about upgrading modules anymore, because node and npm are already paranoid on your behalf.


because node and npm are already paranoid on your behalf.

Yes, until npm blows up in obscure ways. Which it likes to do frequently.


What do you mean by "blow up in obscure ways"?

It generally dumps a ton of colorful errors to your terminal. If you find them obscure, please be comforted by the fact that I certainly do not find them obscure, and would love to fix whatever problem they indicate. It's actually my job.

Also, npm is a software program, and not intelligent. It doesn't "like" things. To the extent that it has preferences, it loves semantic versions, accurate metadata, and especially you.



Moe,

So, when you say "obscure blowup", do you mean, "someone posted any issue whatsoever"?

Several of those are feature requests, or node-core bugs, or things that have been fixed.

I find your blowup very obscure. As for npm failing for you, gist or it didn't happen.


So, when you say "obscure blowup", do you mean, "someone posted any issue whatsoever"?

No, I mean "random malfunction without clear indication of what happened and how to fix it".

Several of those are feature requests, or node-core bugs

Might be a mindset thing. How many of the 8 bugs qualify as feature requests/node bugs for you?

gist or it didn't happen.

The gist is that npm needs to become more defensive, clearly state its own dependencies, and not barf random stack traces in the face of problems. You should also have a unix-guy fix the installer. Demanding 'sudo' is not only an embarrassing but also a potentially dangerous mistake.


> But still, having dealt with library version issues sometimes, I think "concurrent library versioning" and "sophisticated package management" sound awfully nightmarishly black-magic to me.

There isn't really much black magic to it; all the use cases are outlined in the docs: http://nodejs.org/docs/v0.6.3/api/modules.html#loading_from_...

What npm does is take advantage of the way node loads modules, processing dependencies recursively. For example, take a look at how a few global modules are installed on my system:

  $ npm ls -g
  /usr/local/lib
  ├── bootstrap-stylus@0.2.0 
  ├─┬ express@2.4.6 
  │ ├── connect@1.7.1 
  │ ├── mime@1.2.3 
  │ └── qs@0.3.1 
This means that each of express's dependencies, at the version listed, is installed into /usr/local/lib/node_modules/express/node_modules.

Also, by default npm installs into the current directory or project, which uses the same package.json format that modules themselves do. I find this to actually be the opposite of "black magic", since the same tools you use to manage your project at the top level manage all your packages and dependencies.


But the point about project-local node_modules is that it is baked into the ecosystem. Contrast that directly with the reason most people use virtualenv, which is to not pollute the global python install with project-specific dependencies. It's just really well done.

And javascript isn't that bad on the server when you're dealing with ES5 and not worrying about browser-specific js nightmare stuff.


I do have use cases for virtualenv, but only on my dev machine, to emulate a prod machine's setup. On a prod machine I will question hard every line in the install log, and virtualenv may not make the cut.

As for JS, I don't think its only problem is browser compatibility; it also has other flaws and needs some patching before being usable, I've heard.


Well, that's what I like the most with core modules: they don't change overnight, and, while some of them like urllib(2) may be replaced because they have a better competitor, most of them are just good old friends, like scipy, that don't need to be put upside down every month because someone found a slightly more elegant way to call two of its functions.

And that's how you wind up using twenty-year-old code.

Not that there's anything wrong with that, if it works. But what if there were a better way?


When we start thinking like that we start walking down the horrible path of changing things for the sake of changing them.

Who cares how old urllib is? There really isn't a ton of functionality that has changed in URLs within, say, the past 12 months that requires us to constantly reinvent the wheel.


The problem with urllib/urllib2 is that they're somewhat awkward to get started with for simple use cases, and just get worse as you get more advanced. Unfortunately, they're not quite bad enough to outweigh the standard-library advantage for most people, so they stick around. They should have been designed better years ago, but there we are: the curse of the included battery.

That said, I don't agree with the OP's implication that just about everything would be better as separate libraries. Node itself has a sizable standard library, and it's generally well-designed.

The start of a project is the riskiest time, and having to suspend work to evaluate multiple third-party libraries for every little feature is a huge distraction. 80% solutions close at hand could make the difference between a successful proof of concept and one that never gets off the ground. There are also positive network effects from having sample code written to a common API.

Python's early failure to anoint a standard baseline web framework is a case study in the risks of avoiding a batteries-included approach. PHP later rose to take that space, and then Rails ended up being written in a different language. We can't know how an alternate history would turn out, but it was clear at that time that Python's web development community was hopelessly fragmented, hit by the double whammy of not including a web framework and making it so much fun to write your own.

The OP does have a point I agree with, which is that having batteries included is less important than before, because package managers are so much better. That means we can hold built-in libraries to a higher standard.

The other part of the puzzle is a way to bless a high-quality third-party library as a baseline for each common niche that isn't already in the standard library, so developers starting a project aren't spoilt for choice.


numpy and scipy are poor examples because they're not core python modules. You still install them separately.


True. Let's say re, math, array?


Most people who just got into a new bandwagon are handwaving regardless.


When is it going to stop being a bandwagon? Do we have to wait for you to start using it?

James (the author) is probably the most prolific library author in the Node community and has a startup built on his libraries. Do you really think he's just handwaving and doesn't know what the fuck he's talking about?

Node is just 3 months younger than Redis, software which plenty of people here use, or want to use, or at the very least respect. And yet when it comes to Node, all logic seems to go out the window for whatever reason.


"""Javascript always seemed to me a language that you used, and tried to use correctly, because you had to. """

Hello, 90's called, they want their view on Javascript back.


The way modules are looked up - localized to the piece of code that needs it, inside the /node_modules directory - is one of the greatest strengths of the node ecosystem.

It's just dead simple and solves dependency conflicts like nothing I've ever seen before.


> So if a module "foo" was tested against and depends on "baz@0.1.x" and a module "bar" depends on "baz@0.2.x", then when you go to use both "foo" and "bar" in your own program, "foo" and "bar" will both use the version of "baz" that they were tested and depend against!

That sounds like a recipe for disaster if I get an object from "bar" that came from "baz@0.2.x" and try to give it to "foo" which then gives it to "baz@0.1.x".
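That failure mode is easy to reproduce in miniature (module names invented): two loaded copies of the "same" class mean an object created by one copy fails an instanceof check in the other.

```javascript
// Simulate two installed copies of baz, as npm's nested
// node_modules layout would produce: same source, loaded twice.
function makeBazModule() {
  function Thing(n) { this.n = n; }
  return {
    create: function (n) { return new Thing(n); },
    isThing: function (x) { return x instanceof Thing; }
  };
}

var baz01 = makeBazModule();  // what "foo" sees (baz@0.1.x)
var baz02 = makeBazModule();  // what "bar" sees (baz@0.2.x)

var obj = baz02.create(7);       // an object from bar's baz
console.log(baz02.isThing(obj)); // true
console.log(baz01.isThing(obj)); // false -- foo's baz rejects it
```

The two Thing constructors are distinct functions, so instanceof (and any internal identity check) fails the moment an object crosses the version boundary.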

I'm suspicious of silver bullets in general and I'm deeply suspicious of any silver bullet that claims to slay version hell. Versioning is hard.


The Law of Demeter applies here, even if traditional objects aren't being used. If the library that uses it encapsulates everything about it, or at least documents things that aren't encapsulated, it tends to work out just fine.

http://en.wikipedia.org/wiki/Law_of_Demeter


While I'm not saying that's not a real concern, I don't think I've ever seen code that would blow up from that. I think this comes from the concept of "limited surface area" that the OP outlines.


Could someone clarify something for me?

Is http.createServer robust enough for production?

It is my understanding that running it as-is in production is not a good idea. You want to at least put nginx between node and the world. If that's the case, doesn't that undermine the whole "focus on your application, not the configuration" point he's making? Sure, you can call http.createServer multiple times, but then you would still have to focus on configuration.


I don't want to sound like I'm over-qualifying things, but it really depends on the situation. For most applications http.createServer will be robust enough for production. There are node-based solutions for load balancing that would make nginx unnecessary in almost all cases.

It really comes down to tradeoffs. If your primary concern is the most simultaneous connections, then you'll need to find a special tool, and Node or most other non-haskell web frameworks probably aren't for you.

Node makes it really easy to do some things. That focus makes some other things more complicated than they should be (because it's asynchronous, you end up needing to deal with flow control, oftentimes using queues that mitigate some of the asynchronous advantages). It's like everything else with computers: there are tradeoffs. http.createServer can be used when it can be used (which happens to cover most use cases).


It is robust enough in terms of scalability, yes. Most people are just using nginx as a load balancer.

This only addresses servers that you expose to the world though - I think one of the less publicised strengths of making creating servers easy is that it allows you to make internal servers for services trivially.

And using a service oriented architecture is a great win.


To the best of my knowledge: yes. Using nginx in front of node seems to have originated as a non-root way of getting access to port 80, prior to node supporting setuid. A better way on Linux is just to use the kernel's TCP routing, for example:

    sudo iptables -A PREROUTING -t nat -i eth0 -p tcp --dport 80 -j REDIRECT --to-port 8080

However, if you're using HTTP virtual servers and you already use nginx, sure, place node behind it.


The one catch with nginx is that it doesn't support HTTP/1.1 for proxied connections (i.e. no WebSocket support). So if you want that, you have to reverse proxy with something that does, like haproxy or node-http-proxy.


> Instead of the http server being an external service that we configure to run our code, it becomes just another tool in our arsenal.

This is unfair. If I create a framework called Java.js and make it possible to create an HTTP server by running server.start(8080, aCallbackObject), does that make Java great? One day someone might make node.js "configurable" via a configuration file (one not written in JavaScript). Will having a configuration file make node.js less useful?

> Note also that the http snippet from earlier isn't inheriting from an http.Server base class or anything of the sort.

That is also unfair. You typically don't use inheritance in JavaScript because you can compose objects or clone from a prototype. There is no static analysis of classes, which is the main benefit of classical inheritance. Programming via callbacks is simply the JavaScript style. This might even change when the language provides better support for classes, or when more people write their node.js code in CoffeeScript.

> If software ecosystems are like economies then core libraries are at best government bureaucracies and at worst nationalized state corporations.

This is very subjective and I strongly disagree. There is a great advantage in using a language and framework that has a strong core library and a common way of doing things, especially when you start building teams or recruiting people to maintain legacy code. If anything, Node should include more batteries, such as a module for concurrency.

PS - Here's the thing. I am really excited about the event-driven nature of node.js and the community. However, I am new to node.js and I want to see quality critical analysis of its strengths and weaknesses. If you are going to tell me it's great, you'd better have pretty solid reasons. This article simply doesn't cut it.


> A big part of what empowers node to make these kinds of interfaces possible is its asynchronous nature. No longer do you have statelessness imposed from on high by the likes of apache or rails. You can keep intermediate state around in memory just like in any ordinary program.

Should that not be:

A big part of what empowers node to make these kinds of interfaces possible is its "stateful" nature. ...

Really, its being asynchronous is nice for performance (but only complicates the implementation). The fact that you keep state in memory is the big win for ease of implementation, and also for performance, especially if the state is not global but only relates to a single session. Of course, that is usually frowned upon by the web developer community, which believes it somehow hurts scalability (while it actually helps scalability).


I think the sessions in Node/Express are helpful when paired properly. For example, Express can use Redis as its session store. This has the benefit of making sessions available across a cluster, while still being fast. Add to this the fact that almost everything in JS serializes as itself and you get a pretty good replication mechanism.


It's frowned upon by people who need high availability with minimum complexity of hardware and software configuration.

There's very little state one can reasonably store on a single web server without interrupting sessions if the server fails, and there's almost none you can store if you can't guarantee user X will always connect to server X for the duration of the session.

Not everyone must solve these problems, but they quickly come to the forefront for anything beyond rather low-traffic sites.


I find the spinning in this article to be incredible.

The "limited surface area" is all well and fine in JS, because there is no object inheritance in JavaScript. The author tries to emphasize usability over extensibility, which is a false dilemma in my book since it's possible to code to an interface in other languages. You define a usable interface, and then everybody codes things that fit that interface. You don't even have to care about whether your objects are inherited or composed. Of course, languages with inheritance are even more reusable because it's possible to inherit from, and extend, objects which implement the given interface. This is a key tenet of design in Java and Python.

The second part of "limited surface area" talks about how namespaces and qualified imports are great. Yep. Welcome to the party, guys. You're only a couple decades late.

The "batteries not included" section is a great dig at Python, but he could have bothered to actually bring up examples. It's easy; things like asyncore are so god-awful that it's trivial to point out where Python's batteries have expired.

However, he's comparing apples and pomegranates here; Node is not a language! It's a framework. It has a large library of its own which doesn't come standard with JS. That library provides stuff which is built-in on other languages, like unit testing, cryptographic primitives, zlib, filesystem accessors, URL handlers, buffers, iterables, type checkers, and a REPL. To repeat: These are batteries which are native to other languages.

Let's go ahead and compare with Twisted, shall we? I'll omit things for which Twisted provides protocols and Node provides streams, since those are (technically) equivalent. Node doesn't appear to contain these things which Twisted provides: Common protocols for handling lines, netstrings, and prefixed strings; non-blocking stdio as a protocol, serial port as a protocol, DNS as a protocol, a DNS server, a handful of RPC protocols like XML-RPC, AMP, and PB; and full suites for: NNTP, telnet, SSH, mail, more chat protocols than I care to remember... Not to mention powerful utilities like credential handling, and enhancements to the Python standard library like object-based file and module handling. And that's just what's included in the main tarball; there's a big community of third-party code which implements whatever you might happen to need. I didn't bother to list the reverse, because there is nothing in Node which is not in Twisted.

"Core distributions with too many modules result in neglected code that can't make meaningful changes without breaking everything." Are you not aware of deprecation? Write the new code, mark the old code as broken or deprecated, wait a few years, remove the old code. This isn't hard. Of course, if Node or JS actually provided useful tools to mark things as deprecated, it might happen more often. Python's got DeprecationWarning; why doesn't Node?

The "radical reusability" section is just the author realizing that modules are awesome. Again, welcome to the party.


> Let's go ahead and compare with Twisted, shall we?

Oh goodness, yes.

How about an echo server. This one is from the twisted home page, so I am not knocking down a strawman.

    from twisted.internet import protocol, reactor
    
    class Echo(protocol.Protocol):
        def dataReceived(self, data):
            self.transport.write(data)
    
    class EchoFactory(protocol.Factory):
        def buildProtocol(self, addr):
            return Echo()
    
    reactor.listenTCP(1234, EchoFactory())
    reactor.run()

Versus in node you can do:

    var net = require('net');
    net.createServer(function (stream) {
        stream.pipe(stream)
    }).listen(5000)

This is what I mean by limited surface area. I shouldn't need to define 2 classes to write an echo server. An Echo class AND an EchoFactory? And on top of that I need to mess with a reactor? How does that pass for good API design?


The reactor is explicit. "Explicit is better than implicit." You don't have to always be in the reactor, if you don't want to. It lets people use Twisted to build GUIs and other things, without always being in the reactor's context. Twisted is a general-purpose networking library, not a tiny-web-servers-only networking library.

An example used to illustrate and educate does not necessarily result in the shortest code sample. That sample does not use twisted.protocols.wire.Echo, because it was decided that things should be explicit and obvious in the samples on the front page.

The reason for separating protocols and factories is simple: Sometimes you need to store per-connection state, sometimes you need to store per-server state. The separation permits developers to store things in factories instead of in global objects. Node doesn't have this distinction, and as a result, things like tracking all connections currently made on a server are cumbersome.

Another thing is testing. How should a person test the Node example? Every bit of the Twisted example is trivially instrumentable; I can access the protocol, the factory, the reactor. I could, if I wanted, replace the reactor with something mocked. There's no place to do that in the Node example.

I was gonna type out an IRC bot, the favorite exercise of novice coders, but I felt that would be petty, since there's no IRC library included in Node.


> tiny-web-servers-only networking library

really?


As a noder, I'm like half-way with you.

In javascript, as with other languages, you can totally use inheritance (prototypal in js) when making things. But, it's usually better to expose an object than a constructor (imo) when it comes to exports. Being expected to do something like:

    var Foo = require('foobar').Foo;

    var Bar = function (opts) {
      Foo.call(this, opts);
      this.baz = "biff";
    }

    require('util').inherits(Bar, Foo);

This sort of behavior is all well-and-good, but it should be contained because it's boilerplate-y. In javascript, at least, it's not a very good pattern, and I suspect this carries over to other environments (to an extent).

I also think that standard libraries have to strike a balance between "batteries included" and "not cluttered with a bunch of crap that was relevant in 1995", and that different standard libraries attempt this in different ways. I think python neglects the latter to supply the former, while node swings the other way and compensates by having a really nice package manager. Time will tell which is a better approach, but I'm betting on Node's model.

> The "radical reusability" section is just the author realizing that modules are awesome. Again, welcome to the party.

Modules are awesome! You sound like a python guy, meaning your module system is actually pretty good when it comes to qualified imports. Compare python imports to ruby's require, or browser-side script tags sometime, and I think you'll find that there's an awareness problem when it comes to qualified imports. :( That said, the package management side is kinda shitty for python (at least when compared to npm).


You had me right up until:

> The "limited surface area" is all well and fine in JS, because there is no object inheritance in JavaScript


Seek to the next paragraph and try again. Paragraphs are atomic; if there's a read error or other corruption on that first paragraph, the second one should still be readable. You should be able to get the entire message out if you use the error-correction channel to compensate.

JavaScript is prototyped, not inherited, and doesn't permit the creation of new types. 0/10; troll using facts next time.


JavaScript does have inheritance, and classes.

Please troll using facts next time.


perhaps you guys are arguing semantics. i think it's fair to say javascript does not have traditional inheritance or classes. crockford seems to agree [0], perhaps i don't understand the argument.

[0] http://javascript.crockford.com/javascript.html


What is a "class" in this context? Let's define it.

As a rough strawman to move forward, let's say that a "Class" is a programming object consisting of:

- A constructor method.

- A set of properties and methods

- Optionally, a parent Class from which it inherits properties and methods, which may be overridden.

And that, when invoked, a "Class" returns an Object which is said to be an "instance of" that Class. The constructor method is called in the context of the instance, and the properties and methods are inherited by the instance.

JavaScript has those things that I'm calling "Classes". In fact, every single function is a "Class" if it's just invoked with "new", and every single object can be a prototype. It's so full of classes, the only excuse for missing them is that you weren't even looking.
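As a sketch of what this comment is calling a "class" (illustrative names only): a constructor, shared methods on its prototype, and a "subclass" that overrides one of them.

```javascript
function Animal(name) { this.name = name; }      // constructor
Animal.prototype.speak = function () {           // shared method
    return this.name + ' makes a sound';
};

function Dog(name) { Animal.call(this, name); }  // "subclass"
Dog.prototype = Object.create(Animal.prototype); // inherit
Dog.prototype.speak = function () {              // override
    return this.name + ' barks';
};

var d = new Dog('Rex');
// d is an instance of both Dog and Animal, and gets the override
```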

Please see almost any JavaScript tutorial ever for an explanation of the "new" and "instanceof" operators, or any from the last 5 years for an explanation of the "Object.create" method.

Or better yet, just sit down with the ES5 spec, and actually read it before you talk about what this language does and does not have.

Additionally, if you mean something by "object inheritance" other than "objects can inherit from objects", then I don't even think we're on the same planet. That's actually something that JS has which most other "object-oriented" languages don't have.

I'm done being trolled for now, I think. Have fun, kids.


i don't even know what you're arguing with to be honest- i merely suggested you and MostAwesomeDude have different definitions for class. which is exactly what your reply suggests.

classes create new types. this is precisely why MostAwesomeDude said "JavaScript is prototyped, not inherited, and doesn't permit the creation of new types". if you want to omit that from your definition of class then fine. but you're no longer talking about the same thing as MostAwesomeDude.


We're not actually at ES5 yet, and to be honest I have not read it completely. I have read ES3 though, and I believe what they call "Classes" are the Array and the Number and the Date and the Boolean and most other globals that start with a capital, and you can find out the Class of a value by calling Object.prototype.toString on it. If you believe "Classes" to be these things, then yes, javascript has classes. It does not, however, allow you to create them yourself.

From what I have seen of ES5, which is not much, it is pushing Object.create instead of the new X notation, which you were talking about (as the functions all being classes). According to your definition, things using Object.create are no longer classes, as they do not possess constructor functions (although you could still provide one and make it call Object.create). In this fashion, javascript is moving away from whatever classes it had, and into fully prototype-based inheritance.
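A sketch of that constructor-free style: one object inherits directly from another via Object.create, with no constructor function anywhere (the `animal`/`dog` names are illustrative).

```javascript
var animal = {
    speak: function () { return this.name + ' makes a sound'; }
};

var dog = Object.create(animal); // dog delegates directly to animal
dog.name = 'Rex';
```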

Whether we think javascript has classes or not is mostly related to your definition of a 'class'. Personally, I think the "Classes" are not actually classes, just names for some native types, and the classes (which are actually classes) javascript does have (using the new X notation) should be abolished as soon as possible, as they look like a confusing attempt to make javascript less confusing for new people coming from Java or C++, by providing them with their familiar concept of classes. They merely guide people away from the powers of prototypical inheritance, instead of driving them closer.

By your definition, javascript is full of classes, but most of them are not intended to be classes, and they probably should not have been. (ever tried forgetting new when instantiating a class? that clobbers your global object massively, unless you use a detection-method to prevent it, further explained on the top answer at http://stackoverflow.com/questions/383402/is-javascript-s-ne... )
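The detection method looks roughly like this: the constructor notices it was called without `new` and re-invokes itself correctly, instead of silently setting properties on the global object (the `Point` name is illustrative):

```javascript
function Point(x, y) {
    if (!(this instanceof Point)) {
        return new Point(x, y); // forgot `new`: fix the call ourselves
    }
    this.x = x;
    this.y = y;
}

var a = new Point(1, 2);
var b = Point(3, 4); // no `new`, but still safe
```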


What's a "detection-method" ? I Googled, didn't come up with much.


but it has prototypal inheritance, which is an equally valid inheritance scheme that allows any object to act as a class.

It's a pointless argument, but many people like the way javascript inheritance works. Discounting it because it doesn't act how you want is not a fair judgement.


i am not discounting anything. i am simply suggesting i think it's fair (if not more accurate) to say javascript does not have classes.


"perhaps you guys are arguing semantics."

I think that's a given, since they're arguing about the semantics of a programming language.


It's not about language syntax, it's about culture. You can do monolithic frameworks in nodejs if you want, but the majority of nodejs programmers (and substack is one of the better examples) just implement tiny modules with "limited surface area". This is why nodejs is great. It's not because of js (although it helps), it's about people.

