
Commit

Merge pull request #75 from fed135/v1.13.0
v1.13.0
drawm authored Mar 25, 2019
2 parents b914527 + 8cd6dd0 commit b7b4fe7
Showing 16 changed files with 147 additions and 361 deletions.
10 changes: 3 additions & 7 deletions README.md
@@ -21,7 +21,7 @@ Want to make your app faster and don't want to spend on extra infrastructure ? [
**HA-store** is a generic wrapper for your data queries, it features:

- Smart micro-caching for 'hot' information (in-memory or using the [redis-adapter](https://github.com/fed135/ha-redis-adapter))
- Request coalescing, batching, retrying and circuit-breaking
- Request coalescing, batching and retrying
- Insightful stats and [events](#Monitoring-and-events)
- Lightweight, configurable and has **zero dependencies**

@@ -60,8 +60,7 @@ timeout | false | `null` | The maximum time allowed for the resolver to resolve.
cache | false | `{ base: 1000, step: 5, limit: 30000, curve: <function(progress, start, end)> }` | Caching options for the data
batch | false | `{ tick: 50, max: 100 }` | Batching options for the requests
retry | false | `{ base: 5, step: 3, limit: 5000, curve: <function(progress, start, end)> }` | Retry options for the requests
breaker | false | `{ base: 1000, step: 10, limit: 65535, curve: <function(progress, start, end)>, tolerance: 1, toleranceFrame: 10000 }` | Circuit-breaker options, enabled by default; triggers after the retry limit
storeOptions | false | `{ pluginFallback: true, pluginRecoveryDelay: 10000, recordLimit: Infinity }` | If the store plugin errors and `pluginFallback` is true, the Store instance will attempt to fall back to the default in-memory store. It will then attempt to recover the original store every `pluginRecoveryDelay`.
storeOptions | false | `{ pluginFallback: true, pluginRecoveryDelay: 10000, recordLimit: Infinity, dropFactor: 1, scavengeCycle: 50 }` | If the store plugin errors and `pluginFallback` is true, the Store instance will attempt to fall back to the default in-memory store. It will then attempt to recover the original store every `pluginRecoveryDelay`. `dropFactor` tunes the algorithm that marks records as relevant or extraneous: a higher value (>1) makes the marker more aggressive, while a lower value (<1) makes it more permissive. `scavengeCycle` is the delay in ms between GC cycles for the store.

*All time-based options are in ms.
*Scaling options are represented via an exponential curve, with `base` and `limit` being the two edge values and `step` being the number of steps over that curve.
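
For illustration, here is a minimal sketch of a store tuned with the new `storeOptions` fields. It assumes the factory-style setup described earlier in the README; `fetchUsersByIds` is a hypothetical data-source call, and the `resolver`/`uniqueParams` option names are taken from the wider README rather than this diff:

```js
const store = require('ha-store');

// Hypothetical batch resolver: receives the coalesced ids and the unique params,
// and is assumed to resolve to a map of id -> record.
function getUsers(ids, params) {
  return fetchUsersByIds(ids, params);
}

const users = store({
  resolver: getUsers,
  uniqueParams: ['language'],
  batch: { tick: 50, max: 100 },
  retry: { base: 5, step: 3, limit: 5000 },
  cache: { base: 1000, step: 5, limit: 30000 },
  storeOptions: {
    recordLimit: 100000, // cap on in-memory records
    dropFactor: 1,       // >1 marks records as extraneous more aggressively
    scavengeCycle: 50,   // ms between store GC cycles
  },
});
```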
@@ -74,17 +73,14 @@ Event | Description
--- | ---
cacheHit | When the requested item is present in the microcache, or is already being fetched. Prevents another request from being created.
cacheMiss | When the requested item is not present in the microcache and is not currently being fetched. A new request will be made.
cacheFull | Whenever a store set is denied because the maximum number of records was reached for that store.
cacheSkip | Whenever a store set is denied because the maximum number of records was reached for that store, or it was marked as extraneous.
coalescedHit | When a record query successfully hooks to the promise of the same record in transit.
query | When a batch of requests is about to be sent.
queryFailed | Indicates that the batch has failed. Retry policy will dictate if it should be re-attempted.
retryCancelled | Indicates that the batch has reached the allowed number of retries and is now being abandoned.
querySuccess | Indicates that the batch request was successful.
bumpCache | When a call for an item fully loaded in the microcache succeeds, its ttl gets extended.
clearCache | When an item in the microcache has reached its ttl and is now being evicted.
circuitBroken | When a batch call fails after the allowed number of retries, the circuit gets broken - all calls in the next ttl will automatically fail. It is assumed that there is a problem with the data-source.
circuitRestored | Circuit temporarily restored; a tentative request to the data-source may be sent.
circuitRecovered | The tentative request was successful and the wrapper assumes that the data-source has recovered.
storePluginErrored | The custom store has encountered an error
storePluginRestored | The custom store has been re-instantiated
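
Since the store instance is an `EventEmitter` (see `src/index.js` below), these events can be subscribed to directly. A small monitoring sketch, assuming a `users` store created as in the earlier example (the event payload shapes are not documented in this diff, so they are passed through untouched):

```js
// Surface the signals that matter operationally; payloads are treated as opaque.
['cacheSkip', 'queryFailed', 'retryCancelled', 'storePluginErrored'].forEach((event) => {
  users.on(event, (payload) => console.warn(`[ha-store] ${event}`, payload));
});
```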

11 changes: 2 additions & 9 deletions package.json
@@ -1,6 +1,6 @@
{
"name": "ha-store",
"version": "1.12.1",
"version": "1.13.0",
"description": "Efficient data fetching",
"main": "src/index.js",
"scripts": {
@@ -18,26 +18,18 @@
},
"keywords": [
"store",
"javascript",
"high",
"availability",
"network",
"node",
"optimize",
"throughput",
"retry",
"circuit",
"breaker",
"request",
"cache",
"service",
"batch",
"micro",
"latency",
"messaging",
"queue",
"web",
"entity",
"congestion",
"control"
],
@@ -48,6 +40,7 @@
"license": "Apache-2.0",
"devDependencies": {
"chai": "^4.2.0",
"heapdump": "^0.3.12",
"mocha": "^6.0.0",
"sinon": "^7.2.0"
},
91 changes: 0 additions & 91 deletions src/breaker.js

This file was deleted.

13 changes: 2 additions & 11 deletions src/index.d.ts
@@ -7,15 +7,6 @@ type GenericCurveConfig = {
curve (progress: number, start: number, end: number): number
}

type BreakerCurveConfig = {
base: number
steps: number
limit: number
curve (progress: number, start: number, end: number): number
tolerance: number
toleranceFrame: number
}

type Params = {
[key: string]: string
}
@@ -36,13 +27,13 @@ declare interface BatcherConfig {
max: number
}
retry?: GenericCurveConfig
breaker?: GenericCurveConfig
store?: any
storeOptions?: {
pluginFallback?: boolean
pluginRecoveryDelay?: number
memoryLimit?: number
recordLimit?: number
dropFactor?: number
scavengeCycle?: number
}
}

7 changes: 3 additions & 4 deletions src/index.js
@@ -2,10 +2,12 @@
* Batcher index
*/

'use strict';

/* Requires ------------------------------------------------------------------*/

const queue = require('./queue.js');
const store = require('./store.js');
const breaker = require('./breaker.js');
const {contextKey, recordKey} = require('./utils.js');
const EventEmitter = require('events').EventEmitter;
const {hydrateConfig} = require('./options');
@@ -32,14 +34,11 @@ class HaStore extends EventEmitter {
this.setMaxListeners(Infinity);
}

this.breaker = breaker(this.config, this);

this.queue = queue(
this.config,
this,
store(this.config, this),
this.config.store,
this.breaker,
);
}

31 changes: 18 additions & 13 deletions src/options.js
@@ -1,5 +1,15 @@
/**
* Options
*/

'use strict';

/* Requires ------------------------------------------------------------------*/

const {exp} = require('./utils.js');

/* Local variables -----------------------------------------------------------*/

const defaultConfig = {
batch: {
tick: 50,
@@ -17,34 +27,28 @@ const defaultConfig = {
limit: 30000,
curve: exp,
},
breaker: {
base: 1000,
steps: 10,
limit: 0xffff,
curve: exp,
tolerance: 1,
toleranceFrame: 10000,
},
};

const defaultStoreOptions = {
pluginRecoveryDelay: 10000,
pluginFallback: true,
memoryLimit: 0.9,
recordLimit: Infinity,
recordLimit: 256 * 256,
dropFactor: 1,
scavengeCycle: 50,
};

/* Methods -------------------------------------------------------------------*/

function hydrateStoreOptions(storeOptions = {}) {
return {
...defaultStoreOptions,
...storeOptions,
pluginRecoveryDelay: Number(storeOptions.pluginRecoveryDelay) || defaultStoreOptions.pluginRecoveryDelay,
pluginFallback: (storeOptions.pluginFallback === undefined) ? true : storeOptions.pluginFallback,
memoryLimit: Math.max(0, Math.min(1, Number(storeOptions.memoryLimit) || defaultStoreOptions.memoryLimit)),
recordLimit: Number(storeOptions.recordLimit) || defaultStoreOptions.recordLimit,
scavengeCycle: Number(storeOptions.scavengeCycle) || defaultStoreOptions.scavengeCycle,
dropFactor: (storeOptions.dropFactor === undefined) ? defaultStoreOptions.dropFactor : Number(storeOptions.dropFactor),
};

}

function hydrateIfNotNull(baseConfig, defaultConfig) {
@@ -72,8 +76,9 @@ function hydrateConfig(config = {}) {
batch: hydrateIfNotNull(config.batch, defaultConfig.batch),
retry: hydrateIfNotNull(config.retry, defaultConfig.retry),
cache: hydrateIfNotNull(config.cache, defaultConfig.cache),
breaker: hydrateIfNotNull(config.breaker, defaultConfig.breaker),
};
}

/* Exports -------------------------------------------------------------------*/

module.exports = {hydrateConfig, hydrateStoreOptions};
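
A small sketch of how the new store options hydrate, based on the defaults shown above (the require path assumes the repository layout in this PR):

```js
const { hydrateStoreOptions } = require('./src/options.js');

// Unspecified fields fall back to the defaults introduced in this change.
const opts = hydrateStoreOptions({ dropFactor: 2 });
// opts.dropFactor     === 2
// opts.scavengeCycle  === 50        (ms between GC cycles)
// opts.recordLimit    === 256 * 256 (65536 records)
// opts.pluginFallback === true
```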
14 changes: 5 additions & 9 deletions src/queue.js
@@ -13,7 +13,7 @@ const contextRecordKey = key => id => recordKey(key, id);

/* Methods -------------------------------------------------------------------*/

function queue(config, emitter, store, storePlugin, breaker) {
function queue(config, emitter, store, storePlugin) {

// Local variables
const contexts = new Map();
@@ -107,12 +107,11 @@ function queue(config, emitter, store, storePlugin, breaker) {
query(type, key, ids.splice(0, optimalBatchSize), context);
}

contexts.delete(context.key)
contexts.delete(context.key);
}

/**
* Main queue function
* - Checks circuit-breaker status
* - Resolves context object and deferred handlers
* - Looks-up cache
* - Prepares data-source query timer/invocation
@@ -123,7 +122,6 @@
* @param {boolean} startQueue Whether to start the queue immediately or not
*/
async function push(id, params, agg, startQueue, uid) {
if (breaker.status().active === true) return Promise.reject(breaker.circuitError);
const key = contextKey(config.uniqueParams, params);
const context = resolveContext(key, params, uid);
let entity = await lookupCache(key, id, context);
@@ -193,7 +191,9 @@
context.promises.delete(ids[i]);
}
}
if (context.promises.size === 0) contexts.delete(context.key);
if (context.promises.size === 0) {
contexts.delete(context.key);
}
}

function handleQueryCriticalError(err, override) {
@@ -241,7 +241,6 @@
}

if (context.promises.size === 0) contexts.delete(context.key);
breaker.openCircuit();
}
}

@@ -260,9 +259,6 @@
if (config.cache) {
targetStore.set(contextRecordKey(key), ids.filter(id => records[id] !== null && records[id] !== undefined), records, { step: 0 });
}
if (breaker.status().step > 0) {
breaker.closeCircuit();
}

for (let i = 0; i < ids.length; i++) {
const expectation = context.promises.get(ids[i]);
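
From the consumer side, the queue above is what coalesces and batches concurrent lookups. A rough sketch of the expected behaviour (the `get` method name and signature are assumed from the public API, not from this diff, and timings depend on the `batch.tick` window):

```js
// Requests made within the same tick window are grouped into one resolver
// call (a single `query` event), and a second request for an id already in
// flight hooks onto the existing promise (`coalescedHit`).
const a = users.get('123', { language: 'fr' });
const b = users.get('456', { language: 'fr' });
const c = users.get('123', { language: 'fr' }); // coalesced with `a`

Promise.all([a, b, c]).then(([u1, u2, u3]) => {
  // u1 and u3 resolve from the same in-flight record.
});
```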