Merge pull request #78 from fed135/v1.14.0
v1.14.0
fed135 authored Apr 8, 2019
2 parents 66241f9 + e2101f2 commit 31328c6
Showing 16 changed files with 86 additions and 127,565 deletions.
2 changes: 1 addition & 1 deletion .travis.yml
@@ -3,7 +3,7 @@ node_js:
- '10'
- '8'
sudo: false
script: npm test && npm run bench
script: npm test
jobs:
include:
- stage: npm release
10 changes: 3 additions & 7 deletions README.md
@@ -20,10 +20,10 @@ Want to make your app faster and don't want to spend on extra infrastructure ? [

**HA-store** is a generic wrapper for your data queries, it features:

- Smart micro-caching for 'hot' information (in-memory or using the [redis-adapter](https://github.com/fed135/ha-redis-adapter))
- Smart TLRU cache for 'hot' information
- Request coalescing, batching and retrying
- Insightful stats and [events](#Monitoring-and-events)
- Lightweight, configurable and has **zero dependencies**
- Lightweight, configurable, battle-tested


## Installing
@@ -57,10 +57,9 @@ resolver | true | - | The method to wrap, and how to interpret the returned data
responseParser | false | (system) | The method that formats the results from the resolver into an indexed collection. Accepts indexed collections or arrays of objects with an `id` property. Uses the format `<function(response, requestedIds, params)>`
uniqueParams | false | `[]` | The list of parameters that, when passed, generate unique results. Ex: 'language', 'view', 'fields', 'country'. These will generate different combinations of cache keys.
timeout | false | `null` | The maximum time allowed for the resolver to resolve.
cache | false | <pre>{&#13;&#10;&nbsp;&nbsp;base: 1000,&#13;&#10;&nbsp;&nbsp;step: 5,&#13;&#10;&nbsp;&nbsp;limit: 30000,&#13;&#10;&nbsp;&nbsp;curve: <function(progress, start, end)>&#13;&#10;}</pre> | Caching options for the data
cache | false | <pre>{&#13;&#10;&nbsp;&nbsp;limit: 60000,&#13;&#10;&nbsp;&nbsp;ttl: 60000&#13;&#10;}</pre> | Caching options for the data
batch | false | <pre>{&#13;&#10;&nbsp;&nbsp;tick: 50,&#13;&#10;&nbsp;&nbsp;max: 100&#13;&#10;}</pre> | Batching options for the requests
retry | false | <pre>{&#13;&#10;&nbsp;&nbsp;base: 5,&#13;&#10;&nbsp;&nbsp;step: 3,&#13;&#10;&nbsp;&nbsp;limit: 5000,&#13;&#10;&nbsp;&nbsp;curve: <function(progress, start, end)>&#13;&#10;}</pre> | Retry options for the requests
storeOptions | false | <pre>{&#13;&#10;&nbsp;&nbsp;pluginFallback: true,&#13;&#10;&nbsp;&nbsp;pluginRecoveryDelay: 10000,&#13;&#10;&nbsp;&nbsp;recordLimit: Infinity,&#13;&#10;&nbsp;&nbsp;dropFactor: 1,&#13;&#10;&nbsp;&nbsp;scavengeCycle: 50&#13;&#10;}</pre> | If the store plugin errors and `pluginFallback` is true, the Store instance will attempt to fall back to the default in-memory store, then try to recover the original store every `pluginRecoveryDelay`. `dropFactor` tunes the algorithm that marks records as relevant or not: a higher value (>1) makes the marker more aggressive, while a lower value (<1) makes it more permissive. `scavengeCycle` is the delay in ms between GC cycles for the store.

*All time-based options are in milliseconds (ms)
*Scaling options are represented via an exponential curve, with `base` and `limit` as the two edge values and `steps` as the number of events over that curve.
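Put together, a configuration using the defaults from the table above might look like the following sketch. `getItems` is a hypothetical resolver, not part of the library, and the factory call in the note below assumes the package's documented entry point:

```javascript
// Hypothetical resolver: receives the batched ids and the unique params,
// and resolves to an array of objects carrying an `id` property.
function getItems(ids, params) {
  return Promise.resolve(ids.map(id => ({ id, language: params.language })));
}

// Options mirroring the defaults shown in the table above.
const config = {
  resolver: getItems,
  uniqueParams: ['language'],
  cache: { limit: 60000, ttl: 60000 },
  batch: { tick: 50, max: 100 },
  retry: { base: 5, step: 3, limit: 5000 },
};
```

In a real application this `config` object would be handed to the store factory exported by the package, e.g. `require('ha-store')(config)`.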
@@ -73,16 +72,13 @@ Event | Description
--- | ---
cacheHit | When the requested item is present in the microcache, or is already being fetched. Prevents another request from being created.
cacheMiss | When the requested item is not present in the microcache and is not currently being fetched. A new request will be made.
cacheSkip | Whenever a store set is denied because the maximum number of records was reached for that store, or it was marked as extraneous.
coalescedHit | When a record query successfully hooks to the promise of the same record in transit.
query | When a batch of requests is about to be sent.
queryFailed | Indicates that the batch has failed. Retry policy will dictate if it should be re-attempted.
retryCancelled | Indicates that the batch has reached the allowed number of retries and is being abandoned.
querySuccess | Indicates that the batch request was successful.
bumpCache | When a call for an item fully loaded in the microcache succeeds, its ttl gets extended.
clearCache | When an item in the microcache has reached its ttl and is now being evicted.
storePluginErrored | The custom store has encountered an error.
storePluginRestored | The custom store has been re-instantiated.

You may also want to track the amount of `contexts` and `records` stored via the `size` method.

7 changes: 5 additions & 2 deletions package.json
@@ -1,6 +1,6 @@
{
"name": "ha-store",
"version": "1.13.1",
"version": "1.14.0",
"description": "Efficient data fetching",
"main": "src/index.js",
"scripts": {
@@ -50,5 +50,8 @@
"tim mulqueen <[email protected]>",
"damon perron-laurin <[email protected]>"
],
"typings": "./src/index.d.ts"
"typings": "./src/index.d.ts",
"dependencies": {
"lru-native2": "git+https://github.com/d3m3vilurr/node-lru-native.git#9343e51263abfd8879c52f4f6b913c0aff1da05a"
}
}
25 changes: 10 additions & 15 deletions src/index.d.ts
@@ -1,10 +1,10 @@
import { EventEmitter } from 'events'

type GenericCurveConfig = {
base: number
steps: number
limit: number
curve (progress: number, start: number, end: number): number
base?: number
steps?: number
limit?: number
curve? (progress: number, start: number, end: number): number
}

type Params = {
@@ -21,20 +21,15 @@ declare interface BatcherConfig {
requestedIds: string[] | number[],
params?: Params
): any
cache?: GenericCurveConfig
cache?: {
limit?: number
ttl?: number
}
batch?: {
tick: number
max: number
tick?: number
max?: number
}
retry?: GenericCurveConfig
store?: any
storeOptions?: {
pluginFallback?: boolean
pluginRecoveryDelay?: number
recordLimit?: number
dropFactor?: number
scavengeCycle?: number
}
}

declare function batcher(config: BatcherConfig, emitter: EventEmitter): {
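The net effect of these typings changes is that `cache` and `batch` now accept partial objects, leaving only the resolver mandatory. A sketch of what the relaxed shape permits (hypothetical resolvers; unspecified fields are filled in from defaults when the config is hydrated):

```javascript
// Minimal config: only the resolver is required.
const minimal = {
  resolver: ids => Promise.resolve({}),
};

// Partial cache/batch settings are now valid per the typings; the
// remaining fields fall back to their defaults at hydration time.
const partial = {
  resolver: ids => Promise.resolve({}),
  cache: { ttl: 30000 },
  batch: { max: 50 },
};
```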
10 changes: 6 additions & 4 deletions src/index.js
@@ -34,11 +34,12 @@ class HaStore extends EventEmitter {
this.setMaxListeners(Infinity);
}

this.store = this.config.cache ? store(this.config, this) : null;

this.queue = queue(
this.config,
this,
store(this.config, this),
this.config.store,
this.store,
);
}

@@ -80,11 +81,12 @@ class HaStore extends EventEmitter {
* @returns {boolean} The result of the clearing
*/
clear(ids, params) {
if (this.store === null) return true;
if (Array.isArray(ids)) {
return ids.map(id => this.clear(id, params));
}

return this.queue.store.clear(this.getKey(ids, params));
return this.store.clear(this.getKey(ids, params));
}

/**
@@ -94,7 +96,7 @@
async size() {
return {
contexts: this.queue.size(),
records: await this.queue.store.size(),
records: (this.store) ? await this.store.size() : 0,
};
}

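The guards added here let an instance created with `cache: null` skip the store entirely. The pattern, reduced to a runnable sketch (a `Map` stands in for the store; `SketchStore` is illustrative, not the library class):

```javascript
// Reduced sketch of the null-store guards: when caching is disabled,
// this.store is null and the public methods short-circuit.
class SketchStore {
  constructor(cacheEnabled) {
    this.store = cacheEnabled ? new Map() : null;
  }

  clear(key) {
    if (this.store === null) return true; // nothing to clear
    return this.store.delete(key);
  }

  async size() {
    return { records: this.store ? this.store.size : 0 };
  }
}

const noCache = new SketchStore(false);
const cached = new SketchStore(true);
cached.store.set('user::1', { id: 1 });
```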
31 changes: 3 additions & 28 deletions src/options.js
@@ -22,35 +22,13 @@ const defaultConfig = {
curve: exp,
},
cache: {
base: 1000,
steps: 5,
limit: 30000,
curve: exp,
limit: 60000,
ttl: 60000,
},
};

const defaultStoreOptions = {
pluginRecoveryDelay: 10000,
pluginFallback: true,
recordLimit: 256 * 256,
dropFactor: 1,
scavengeCycle: 50,
};

/* Methods -------------------------------------------------------------------*/

function hydrateStoreOptions(storeOptions = {}) {
return {
...defaultStoreOptions,
...storeOptions,
pluginRecoveryDelay: Number(storeOptions.pluginRecoveryDelay) || defaultStoreOptions.pluginRecoveryDelay,
pluginFallback: (storeOptions.pluginFallback === undefined) ? true : storeOptions.pluginFallback,
recordLimit: Number(storeOptions.recordLimit) || defaultStoreOptions.recordLimit,
scavengeCycle: Number(storeOptions.scavengeCycle) || defaultStoreOptions.scavengeCycle,
dropFactor: (storeOptions.dropFactor === undefined) ? defaultStoreOptions.dropFactor : Number(storeOptions.dropFactor),
};
}

function hydrateIfNotNull(baseConfig, defaultConfig) {
if (baseConfig === null) {
return null;
@@ -70,9 +48,6 @@ function hydrateConfig(config = {}) {
return {
...config,
timeout: Number(config.timeout) || null,
storeOptions: {
...hydrateStoreOptions(config.storeOptions || {}),
},
batch: hydrateIfNotNull(config.batch, defaultConfig.batch),
retry: hydrateIfNotNull(config.retry, defaultConfig.retry),
cache: hydrateIfNotNull(config.cache, defaultConfig.cache),
@@ -81,4 +56,4 @@

/* Exports -------------------------------------------------------------------*/

module.exports = {hydrateConfig, hydrateStoreOptions};
module.exports = {hydrateConfig};
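The body of `hydrateIfNotNull` is elided in this diff; under the assumption that it spreads user values over the defaults, the new cache handling behaves like this sketch:

```javascript
// Assumed shape of hydrateIfNotNull (its body is elided above):
// null disables the feature outright; otherwise user values are
// laid over the defaults.
function hydrateIfNotNull(baseConfig, defaults) {
  if (baseConfig === null) return null;
  return { ...defaults, ...(baseConfig || {}) };
}

const cacheDefaults = { limit: 60000, ttl: 60000 };

const disabled = hydrateIfNotNull(null, cacheDefaults); // caching off
const hydrated = hydrateIfNotNull({ ttl: 5000 }, cacheDefaults);
```

This is why `index.js` can use `this.config.cache ? … : null` as its only feature switch: a `null` cache survives hydration as `null`.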
22 changes: 6 additions & 16 deletions src/queue.js
@@ -13,22 +13,12 @@ const contextRecordKey = key => id => recordKey(key, id);

/* Methods -------------------------------------------------------------------*/

function queue(config, emitter, store, storePlugin) {
function queue(config, emitter, targetStore) {

// Local variables
const contexts = new Map();
const timeoutError = new Error('TIMEOUT')
const retryCurve = tween(config.retry);
let targetStore = storePlugin && storePlugin(config, emitter) || store;
emitter.on('storePluginErrored', () => {
if (config.storeOptions.pluginFallback === true) {
targetStore = store;
setTimeout(() => {
emitter.emit('storePluginRestored');
targetStore = storePlugin && storePlugin(config, emitter) || store;
}, config.storeOptions.pluginRecoveryDelay);
}
});

/**
* Attempts to read a query item from cache
@@ -39,12 +29,12 @@
* @returns {*|null} The cache result
*/
async function lookupCache(key, id, context) {
if (config.cache !== null) {
if (targetStore !== null) {
const record = await targetStore.get(recordKey(key, id));

if (record !== undefined && record !== null) {
if (record !== undefined) {
emitter.emit('cacheHit', { key, id, params: context.params });
return record.value;
return record;
}

emitter.emit('cacheMiss', { key, id, params: context.params });
@@ -256,7 +246,7 @@ function queue(config, emitter, store, storePlugin) {
const parser = config.responseParser || basicParser;
const records = parser(results, ids, context.params);

if (config.cache) {
if (targetStore !== null) {
targetStore.set(contextRecordKey(key), ids.filter(id => records[id] !== null && records[id] !== undefined), records, { step: 0 });
}

@@ -284,7 +274,7 @@
return contexts.size;
}

return { batch, push, size, retry, query, resolveContext, complete, store: targetStore };
return { batch, push, size, retry, query, resolveContext, complete };
}

/* Exports -------------------------------------------------------------------*/