Generator-based control flow with good performance and no other dependencies.

Requires a version of Node.js that supports the destructuring assignment syntax.

```
npm install zco
```

```js
const co = require("zco");
//....
```
- Works with callbacks directly, avoiding callback hell.
- `this.ctx`: a global context for a single transaction.
- `defer`: a guarantee (not an ES6 Promise) that the operation defined by `defer` will be executed at the end, no matter whether an error occurred.
- Supports a consecutive error stack.
Solves: callback hell.

Many operations in Node.js are asynchronous, so it is easy to fall into callback hell. Consider callback-style code like this:
```js
const func = function (data0, callback) {
  func1(data0, function (data1) {
    func2(data1, function (data2) {
      func3(data2, function (data3) {
        setTimeout(function () {
          callback(data3);
        }, 10);
      });
    });
  });
};
```
What a mess! Let's try zco:
```js
const func = function (data0) {
  return co(function* (resume) {
    let [data1] = yield func1(data0, resume);
    let [data2] = yield func2(data1, resume);
    let [data3] = yield func3(data2, resume);
    yield setTimeout(resume, 10);
    return data3;
  });
};
```
Wow, much clearer! Full code here.

zco cooperates with callbacks directly; there is no need to wrap a callback API in a Promise. In this section we introduce two zco concepts: `future` and `handler`. We have already created a function named `func`; here is how to use it. Just invoke it directly:
```js
func("a data")((err, data3) => {
  if (err) {
    console.log(err);
  } else {
    console.log(data3);
  }
});
```
The value returned by `co()` is called a `future`, which is a function; the argument of a `future` is called a `handler`. The code above is equivalent to the code below:
```js
var future = func("a data");
var handler = (err, data3) => {
  if (err) {
    console.log(err);
  } else {
    console.log(data3);
  }
};
future(handler);
```
The first argument of a `handler` is the error; the second is the value you returned. The code below shows that we can also yield a `future` directly:
```js
co(function* () {
  let [err, data3] = yield func("a data");
  if (err) {
    console.log(err);
  } else {
    console.log(data3);
  }
})();
```
`this.ctx` is a global context for a single transaction.
```js
const a_func = function () {
  return co(function* () {
    let xxx = this.ctx.xxx; // get `xxx` from this.ctx
    return xxx;
  });
};

co(function* () {
  this.ctx.xxx = "hello world"; // assign a key-value pair to `this.ctx`
  let [err, xxx] = yield a_func(); // zco will deliver `this.ctx` automatically
  console.log(xxx); // "hello world"
})();
```
`this.ctx` works like thread-local storage in threaded programming. If you are a user of node-continuation-local-storage, there is now a much easier way.
An example scenario: a user launches a request, and a `trace_id` identifies this transaction. To serve the request you invoke many functions, and the `trace_id` must be added to every log line for later analysis. The traditional way is to treat `trace_id` as a parameter and pass it everywhere. Now we can do it in a more graceful way! Full code here.
More about `this.ctx`: it is delivered by zco automatically. When you yield a `future`, zco passes `this.ctx` along by calling `future.__ctx__(this.ctx)`. Then what about yielding an ordinary callback API?
```js
const callback_api = function (num, callback) {
  if ("function" === typeof callback.ctx) {
    callback(num + callback.ctx().base);
  } else {
    callback(num);
  }
};

co(function* (resume) {
  this.ctx.base = 1;
  let [result] = yield callback_api(10, resume);
  console.log(result); // 11
})();
```
The code above shows that you can access `ctx` by calling `resume.ctx()`. You have seen `resume` many times; it is an ordinary callback, used in place of the original callback, and it receives the data that the callback API passes to it.
`defer` solves cleanup: it is a guarantee (not an ES6 Promise) that the operation defined by `defer` will be executed at the end, no matter whether an error occurred. Use it for cleanup work such as `db.close()`:
```js
co(function* (resume, defer) {
  let db = null;
  defer(function* (inner_resume, err) {
    if (err) {
      console.log(err.message); // "Manual Error!"
    }
    db.close();
    // ... or other operations you would like to do
  });
  db = database.connect();
  throw new Error("Manual Error!");
})();
```
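For intuition, `defer` plays roughly the role that `finally` plays in synchronous code. A plain-JavaScript analogue of the block above (the `database` object and `events` array here are stand-ins invented for this sketch, not a real driver):

```js
// Synchronous analogue of the defer example, using try/catch/finally.
const events = [];
const database = {
  connect: () => ({ close: () => events.push("closed") }), // stand-in, not a real driver
};

let db = null;
try {
  db = database.connect();
  throw new Error("Manual Error!");
} catch (err) {
  events.push(err.message); // "Manual Error!"
} finally {
  if (db) db.close(); // runs no matter whether an error occurred, like defer
}
console.log(events); // [ 'Manual Error!', 'closed' ]
```

`defer` extends this guarantee across the asynchronous steps of a zco block, where a bare `try/finally` around callback-style code would not help.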
An example scenario: suppose we need to design a function that visits the main page of Google, with a maximum concurrency below 5. First we write a concurrency lock (mutex), then use it inside the function. The code might look like this:
```js
mutex.lock(); // hold the lock
// ... do the request here, and some other operations
// ... suppose code like `JSON.parse("{")` throws an error here
mutex.unLock(); // release the lock
```
But an error may happen before `mutex.unLock()`, so there is no guarantee that the lock will be released. We can solve this with `defer`:
```js
co(function* (resume, defer) {
  defer(function* () {
    mutex.unLock();
  });
  mutex.lock();
  // ... do the request here, and some other operations
  // ... suppose code like `JSON.parse("{")` throws an error here,
  // but it does not matter: the lock will be released safely
})();
```
Full code here.

If an error occurs in an asynchronous function, we lose the call stack, which makes debugging difficult.

Solves: a consecutive error stack. Code:
```js
const async_func = function (callback) {
  setTimeout(function () {
    try {
      JSON.parse("{");
    } catch (e) {
      callback(e);
    }
  }, 10);
};

const middle_call_path = function (cb) {
  return async_func(cb);
};

middle_call_path((err) => {
  console.log(err.stack);
});
```
The code above outputs:
```
SyntaxError: Unexpected end of JSON input
    at JSON.parse (<anonymous>)
    at Timeout._onTimeout (e:\GIT\test\zco.js:21:18)
    at ontimeout (timers.js:488:11)
    at tryOnTimeout (timers.js:323:5)
    at Timer.listOnTimeout (timers.js:283:5)
```
The error's stack does not show where we called `async_func` or where we called `middle_call_path`; the call stack is lost. Rewritten with zco:
```js
const async_func = function () {
  return co(function* (resume) {
    yield setTimeout(resume, 10);
    JSON.parse("{");
  });
};

const middle_call_path = function () {
  return async_func();
};

middle_call_path()((err) => {
  console.log(err.stack);
});
```
Output:
```
SyntaxError: Unexpected end of JSON input
    at JSON.parse (<anonymous>)
    at Object.<anonymous> (e:\GIT\test\zco.js:21:13) // where we call `JSON.parse`
    at middle_call_path (e:\GIT\test\zco.js:25:12)   // where we call `async_func`
    at Object.<anonymous> (e:\GIT\test\zco.js:27:1)  // where we call `middle_call_path`
```
We get the full, clear chain of function calls; try it yourself :) Full code here.

For the best performance, you can turn this feature off globally by calling `zco.__TrackCallStack(false)`; it is nearly 2x faster.
zco assumes that any operation may go wrong, so the first return value is always the error, and the second is the exact result you returned. Errors in different places have different meanings, and we can handle each in its own way. But sometimes we do not want to handle errors everywhere; in that case we use `co.brief`. Example:
```js
const co_func = function (i) {
  return co(function* () {
    return 10 * i;
  });
};

co(function* () {
  let [err1, result1] = yield co_func(1);
  if (err1) {
    throw err1; // we can throw the error, or do something else
  }
  let [err2, result2] = yield co_func(2);
  if (err2) {
    throw err2;
  }
  return result1 + result2;
})((err, result) => {
  if (err) {
    console.log(err);
  } else {
    console.log(result);
  }
});

// or in brief mode:
// just care about the results, and deal with any error at the end
co.brief(function* () {
  let result1 = yield co_func(1);
  let result2 = yield co_func(2);
  return result1 + result2;
})((err, result) => {
  if (err) { // deal with the error at the end
    console.log(err);
  } else {
    console.log(result);
  }
});
```
This API lets you set a time limit on an operation; when it times out, a timeout error is thrown.

API: `co.timeLimit(ms, future)`

Example:
```js
co.timeLimit(1 * 10, co(function* (resume) {
  yield setTimeout(resume, 2 * 10); // wait 20ms
}))((err) => {
  console.log(err.message); // "timeout"
});
```
Execute operations concurrently.

API: `co.all(future..., [timeout setting])`

Example:
```js
const co_func = function (a, b, c) {
  return co(function* (resume) {
    yield setTimeout(resume, 10); // wait 10ms
    return a + b + c;
  });
};

const generic_callback = function (callback) { // the first arg must be the callback
  callback(100);
};

co(function* () {
  let timeout = 10 * 1000; // timeout setting
  let [err, result] = yield co.all(co_func(1, 2, 3), co_func(4, 5, 6), generic_callback, timeout); // setting a timeout is supported
  console.log(err); // null
  console.log(result); // [6, 15, [100]]
})();
```
Even though we do not recommend Promises, sometimes we cannot bypass them.
```js
const promise_api = function (a, b) {
  return Promise.resolve().then(function () {
    return a + b;
  });
};

co(function* () {
  let [err, data] = yield co.wrapPromise(promise_api(1, 2));
  /**
   * You can't yield a Promise directly; that is unsafe, because some
   * callback-style APIs also return a Promise at the same time,
   * such as `pg.client.query`.
   * */
  console.log(err); // null
  console.log(data); // 3
})();
```
Results for 20000 parallel executions, 1 ms per I/O op, 2017-06-03.

With call-stack tracing turned off:

| name | timecost(ms) | memory(mb) | score(time+memory) |
| --- | --- | --- | --- |
| callback.js | 96 | 30.23828125 | 46.5068 |
| async-neo@1.8.2.js | 146 | 48.59765625 | 30.2967 |
| promise_bluebird@2.11.0.js | 509 | 84.8828125 | 10.1153 |
| co_zco_yyrdl@1.3.2.js | 579 | 88.9609375 | 9.1068 |
| co_when_generator_cujojs@3.7.8.js | 721 | 117.109375 | 7.1949 |
| async_caolan@1.5.2.js | 712 | 122.5859375 | 7.1672 |
| co_tj_with_bluebird_promise@4.6.0.js | 895 | 124.79296875 | 6.0711 |
| co_when_generator_cujojs_with_bluebird@3.7.8.js | 916 | 131.3515625 | 5.8794 |
| async_await_es7_with_native_promise.js | 964 | 166.82421875 | 5.2861 |
| promise_native.js | 949 | 179.29296875 | 5.2457 |
| co_tj_with_native_promise@4.6.0.js | 1107 | 163.2421875 | 4.8229 |
| co_when_generator_cujojs_with_native_promise@3.7.8.js | 1112 | 173.63671875 | 4.719 |
| async_await_es7_with_bluebird_promise.js | 1183 | 191.41796875 | 4.3899 |
| co_coroutine_bluebird@2.11.0.js | 3695 | 242.4296875 | 2 |
With call-stack tracing turned on:

| name | timecost(ms) | memory(mb) | score(time+memory) |
| --- | --- | --- | --- |
| callback.js | 92 | 31.1015625 | 49.8332 |
| async-neo@1.8.2.js | 166 | 47.7109375 | 28.3802 |
| promise_bluebird@2.11.0.js | 510 | 85.125 | 10.4324 |
| async_caolan@1.5.2.js | 716 | 122.328125 | 7.3841 |
| co_when_generator_cujojs@3.7.8.js | 789 | 117.17578125 | 6.9716 |
| co_tj_with_bluebird_promise@4.6.0.js | 884 | 126.046875 | 6.2992 |
| co_when_generator_cujojs_with_bluebird@3.7.8.js | 883 | 131.0234375 | 6.231 |
| co_zco_yyrdl@1.3.2.js | 1181 | 94.42578125 | 5.8436 |
| promise_native.js | 999 | 170.3125 | 5.2953 |
| async_await_es7_with_native_promise.js | 1022 | 161.47265625 | 5.2862 |
| co_tj_with_native_promise@4.6.0.js | 1089 | 162.99609375 | 5.0394 |
| async_await_es7_with_bluebird_promise.js | 1165 | 188.90625 | 4.6036 |
| co_when_generator_cujojs_with_native_promise@3.7.8.js | 1231 | 173.71875 | 4.5379 |
| co_coroutine_bluebird@2.11.0.js | 3867 | 242.61328125 | 2 |
Platform info:

- Windows_NT 10.0.14393 x64
- Node.JS 7.7.3
- V8 5.5.372.41
- Intel(R) Core(TM) i5-5200U CPU @ 2.20GHz × 4
MIT