{
  "name": "event-stream",
  "version": "3.0.16",
  "description": "construct pipes of streams of events",
  "homepage": "http://github.com/dominictarr/event-stream",
  "repository": {
    "type": "git",
    "url": "git://github.com/dominictarr/event-stream.git"
  },
  "dependencies": {
    "through": "~2.3.1",
    "duplexer": "~0.0.2",
    "from": "~0",
    "map-stream": "0.0.2",
    "pause-stream": "0.0.10",
    "split": "0.2",
    "stream-combiner": "0.0.0"
  },
  "devDependencies": {
    "asynct": "*",
    "it-is": "1",
    "ubelt": "~2.9",
    "stream-spec": "~0.2",
    "tape": "~0.1.5"
  },
  "scripts": {
    "test": "asynct test/",
    "test_tap": "set -e; for t in test/*.js; do node $t; done"
  },
  "testling": {
    "files": "test/*.js",
    "browsers": {
      "ie": [
        8,
        9
      ],
      "firefox": [
        13
      ],
      "chrome": [
        20
      ],
      "safari": [
        5.1
      ],
      "opera": [
        12
      ]
    }
  },
  "author": {
    "name": "Dominic Tarr",
    "email": "dominic.tarr@gmail.com",
    "url": "http://bit.ly/dominictarr"
  },
"readme": "# EventStream\n\n<img src=https://secure.travis-ci.org/dominictarr/event-stream.png?branch=master>\n\n[![browser status](http://ci.testling.com/dominictarr/event-stream.png)]\n(http://ci.testling.com/dominictarr/event-stream)\n\n[Streams](http://nodejs.org/api/strems.html \"Stream\") are node's best and most misunderstood idea, and \n_<em>EventStream</em>_ is a toolkit to make creating and working with streams <em>easy</em>. \n\nNormally, streams are only used of IO, \nbut in event stream we send all kinds of objects down the pipe. \nIf your application's <em>input</em> and <em>output</em> are streams, \nshouldn't the <em>throughput</em> be a stream too? \n\nThe *EventStream* functions resemble the array functions, \nbecause Streams are like Arrays, but laid out in time, rather than in memory. \n\n<em>All the `event-stream` functions return instances of `Stream`</em>.\n\n`event-stream` creates \n[0.8 streams](https://github.com/joyent/node/blob/v0.8/doc/api/stream.markdown)\n, which are compatible with [0.10 streams](http://nodejs.org/api/stream.html \"Stream\")\n\n>NOTE: I shall use the term <em>\"through stream\"</em> to refer to a stream that is writable <em>and</em> readable. \n\n###[simple example](https://github.com/dominictarr/event-stream/blob/master/examples/pretty.js):\n\n``` js\n\n//pretty.js\n\nif(!module.parent) {\n var es = require('event-stream')\n es.pipeline( //connect streams together with `pipe`\n process.openStdin(), //open stdin\n es.split(), //split stream to break on newlines\n es.map(function (data, callback) {//turn this async function into a stream\n callback(null\n , inspect(JSON.parse(data))) //render it nicely\n }),\n process.stdout // pipe it to stdout !\n )\n }\n```\nrun it ...\n\n``` bash \ncurl -sS registry.npmjs.org/event-stream | node pretty.js\n```\n \n[node Stream documentation](http://nodejs.org/api/stream.html)\n\n## through (write?, end?)\n\nReemits data synchronously. Easy way to create syncronous through streams.\nPass in an optional `write` and `end` methods. They will be called in the \ncontext of the stream. Use `this.pause()` and `this.resume()` to manage flow.\nCheck `this.paused` to see current flow state. (write always returns `!this.paused`)\n\nthis function is the basis for most of the syncronous streams in `event-stream`.\n\n``` js\n\nes.through(function write(data) {\n this.emit('data', data)\n //this.pause() \n },\n function end () { //optional\n this.emit('end')\n })\n\n```\n\n##map (asyncFunction)\n\nCreate a through stream from an asyncronous function. \n\n``` js\nvar es = require('event-stream')\n\nes.map(function (data, callback) {\n //transform data\n // ...\n callback(null, data)\n})\n\n```\n\nEach map MUST call the callback. It may callback with data, with an error or with no arguments, \n\n * `callback()` drop this data. \n this makes the map work like `filter`, \n note:`callback(null,null)` is not the same, and will emit `null`\n\n * `callback(null, newData)` turn data into newData\n \n * `callback(error)` emit an error for this item.\n\n>Note: if a callback is not called, `map` will think that it is still being processed, \n>every call must be answered or the stream will not know when to end. \n>\n>Also, if the callback is called more than once, every call but the first will be ignored.\n\n## mapSync (syncFunction)\n\nSame as `map`, but the callback is called synchronously. Based on `es.through`\n\n## split (matcher)\n\nBreak up a stream and reassemble it so that each line is a chunk. 
## split (matcher)\n\nBreak up a stream and reassemble it so that each line is a chunk. `matcher` may be a `String` or a `RegExp`. \n\nExample: read every line in a file ...\n\n``` js\nes.pipeline(\n  fs.createReadStream(file, {flags: 'r'}),\n  es.split(),\n  es.map(function (line, cb) {\n    //do something with the line \n    cb(null, line)\n  })\n)\n\n```\n\n`split` takes the same arguments as `String#split`, except it defaults to `'\\n'`, and the optional `limit` parameter is ignored.\n[String#split](https://developer.mozilla.org/en/JavaScript/Reference/Global_Objects/String/split)\n\n## join (separator)\n\nCreate a through stream that emits `separator` between each chunk, just like `Array#join`.\n\n(For legacy reasons, if you pass a callback instead of a string, `join` is a synonym for `es.wait`.)\n\n## replace (from, to)\n\nReplace all occurrences of `from` with `to`. `from` may be a `String` or a `RegExp`. \nWorks just like `string.split(from).join(to)`, but streaming.\n\n
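As a quick sketch (the file names here are just placeholders), a streaming search-and-replace over a file might look like this:\n\n``` js\nvar es = require('event-stream')\n  , fs = require('fs')\n\nfs.createReadStream('input.txt')\n  .pipe(es.replace('cat', 'dog')) //rewrite every occurrence of 'cat' to 'dog'\n  .pipe(fs.createWriteStream('output.txt'))\n```\n\n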
## parse\n\nConvenience function for parsing JSON chunks. For newline-separated JSON,\nuse with `es.split`.\n\n``` js\nfs.createReadStream(filename)\n  .pipe(es.split()) //defaults to lines.\n  .pipe(es.parse())\n```\n\n## stringify\n\nConvert JavaScript objects into lines of text. The text will have whitespace escaped and a `\\n` appended, so it will be compatible with `es.parse`.\n\n``` js\nobjectStream\n  .pipe(es.stringify())\n  .pipe(fs.createWriteStream(filename))\n```\n\n## readable (asyncFunction) \n\nCreate a readable stream (that respects pause) from an async function. \nWhile the stream is not paused, \nthe function will be polled with `(count, callback)`, \nand `this` will be the readable stream.\n\n``` js\n\nes.readable(function (count, callback) {\n  if(streamHasEnded)\n    return this.emit('end')\n  \n  //...\n  \n  this.emit('data', data) //use this way to emit multiple chunks per call.\n  \n  callback() // you MUST always call the callback eventually.\n             // the function will not be called again until you do this.\n})\n```\n\nYou can also pass the data and the error to the callback, \nbut you may only call the callback once; \ncalling the same callback more than once will have no effect. \n\n## readArray (array)\n\nCreate a readable stream from an Array.\n\nJust emits each item as a data event, respecting `pause` and `resume`.\n\n``` js\nvar es = require('event-stream')\n  , reader = es.readArray([1,2,3])\n\nreader.pipe(...)\n```\n\n## writeArray (callback)\n\nCreate a writable stream from a callback; \nall `data` events are stored in an array, which is passed to the callback when the stream ends.\n\n``` js\nvar es = require('event-stream')\n  , reader = es.readArray([1, 2, 3])\n  , writer = es.writeArray(function (err, array){\n      //array deepEqual [1, 2, 3]\n    })\n\nreader.pipe(writer)\n```\n\n## pipeline (stream1,...,streamN)\n\nTurn a pipeline into a single stream. `pipeline` returns a stream that writes to the first stream\nand reads from the last stream. \n\nListening for 'error' will receive errors from all streams inside the pipe.\n\n> `connect` is an alias for `pipeline`.\n\n``` js\nvar es = require('event-stream')\n  , inspect = require('util').inspect\n\nes.pipeline( //connect streams together with `pipe`\n  process.openStdin(), //open stdin\n  es.split(), //split stream to break on newlines\n  es.map(function (data, callback) { //turn this async function into a stream\n    callback(null, inspect(JSON.parse(data))) //render it nicely\n  }),\n  process.stdout // pipe it to stdout !\n)\n```\n\n## pause () \n\nA stream that buffers all chunks when paused.\n\n``` js\nvar ps = es.pause()\nps.pause() //buffer the stream, also do not allow 'end' \nps.resume() //allow chunks through\n```\n\n## duplex (writeStream, readStream)\n\nTakes a writable stream and a readable stream and makes them appear as a readable-writable stream.\n\nIt is assumed that the two streams are connected to each other in some way. \n\n(This is used by `pipeline` and `child`.)\n\n``` js\nvar cp = require('child_process')\n  , grep = cp.exec('grep Stream')\n\nes.duplex(grep.stdin, grep.stdout)\n```\n\n## child (child_process)\n\nCreate a through stream from a child process ...\n\n``` js\nvar cp = require('child_process')\n\nes.child(cp.exec('grep Stream')) // a through stream\n\n```\n\n## wait (callback)\n\nWaits for the stream to emit 'end', \nthen joins the chunks of the stream into a single string. \nTakes an optional callback, which will be passed the \ncomplete string when it receives the 'end' event.\n\nAlso emits a single 'data' event.\n\n``` js\n\nreadStream.pipe(es.wait(function (err, text) {\n  // have the complete text here.\n}))\n\n```\n",
"_id": "event-stream@3.0.16",
"dist": {
"shasum": "0b040584bf4f31ca2d250d6a2ee285fea3e8e358"
},
"_from": "event-stream@~3.0.14"
}