From 8c90c8b76d2868d68163d5a4fb107ba981148639 Mon Sep 17 00:00:00 2001 From: Scott Twiname Date: Thu, 17 Nov 2022 14:25:04 +1300 Subject: [PATCH] Sync 20221117 (#14) * fix: incomplete ds data in same block (#1370) Co-authored-by: Orion * [SKIP CI] Prerelease * Best Block (#1308) * Draft * update changes * rebase changes * fix * fix (#1329) * disable best blocks for workers until we support it * Rename bestBlock to unfinalizedBlocks, other clean up * Clean up * Further clean up * Dedupe reindex function * Fix tests * Clean up * Update checking finalization to use parent hash * Rename logger * Refactor unfinalized blocks * Use header rather than full block, improve detecting forks * Verify unfinalized blocks when disabled. Use sorted array for storing unfinalized blocks * Clean up logs * Fix not indexing unfinalized blocks right away, exit if historical not enabled Co-authored-by: Scott Twiname * [SKIP CI] Prerelease * Fix bugs with unfinalized blocks (#1374) * Fix bugs with unfinalized blocks - Fix using wrong hash for unfinalized blocks - Fix not removing unfinalized blocks when the latest unfinalized block < finalized * Fix tests getting stuck * Fix issue finding where fork occurred * [SKIP CI] Prerelease * [release] 20221028 (#1372) * hot fix tests (#1360) * update tests: api.service.spec, init-controller.test, publish-controller.spec * update tests following comments * [SKIP CI] Prerelease * add ethereum to CLI and Validator (#1378) * Improve dictionary query (#1371) * wip * wip * wip * broken wip * wip refactoring * separate dictionaryQueryEntries * fix logic for setDictionaryQueryEntries * cleaning up * move dictionaryEntry.ts to node-core * remove comments * create class for dictionaryQueryEntries * update tests * move dictionaryQueryService into dictionaryService * refactor * update logic for useDictionary * relocated dictionaryQuery funcs * add test for dictQueryMap * tests failing on windows? * test getDictionaryQueryEntries * test fixed for getDictionaryQueryEntries * added test for sorting * conflict fixed ?
* fix conflict_2 * fix conflict_3 * add delete temp ds records back * clean up with new logic * clean up, add comments for test * fix * update logic * add generic type * [SKIP CI] Prerelease * Enable for better inheritance of generated entity models (#1377) * refactor: enable for entity inheritance * refactor: include also field getters * [SKIP CI] Prerelease * fix comments issue with new package (#1380) * fix comments issue with new package * moved yaml package * [SKIP CI] Prerelease * fix logic with reindex and unfinalized height and dynamic ds (#1382) * fix logic with reindex and unfinalized height * fix * include fix for #1379 * update polkadot to 9.7.1 (#1384) * [release] 20221107 * Fix remove alter table (#1387) * remove migrate alter table * remove * [SKIP CI] Prerelease * [release] 20221108 (#1388) * fix missing sequelize sync (#1389) * [SKIP CI] Prerelease * [release] 20221108 patch (#1390) * reindex bind (#1391) * [SKIP CI] Prerelease * [release] 20221109 (#1393) * Handle fetch errors, then retry (#1386) * add retryOnFail function * add retryOnFail * add test, fix logic * [SKIP CI] Prerelease * fix (#1395) * [SKIP CI] Prerelease * [release] 20221109 node-core (#1394) * Fix tests hanging (#1396) * Fix tests hanging * Update base docker image with newer git version * [SKIP CI] Prerelease * Add distinct query plugin (#1274) * Add distinct query plugin * Clean up log * Fix distinct not being provided to query * Uppercase enum to be consistent with other enums * Update dictionary queries to try distinct argument * [SKIP CI] Prerelease * Add query distinct dependencies (#1398) * fix missing update forked graphile dependencies * tidy up * tidy up * [SKIP CI] Prerelease * Break block dispatcher file up and move common code to base class (#1397) * [SKIP CI] Prerelease * Hot schema trigger (#1401) * implement trigger with notification * working prior clean up * refactor and clean up on async and await * clean up * clear comments * add filter * fix * fix err * [SKIP CI] Prerelease * [release] 20221115 (#1402) * [release] 20221115 * [release] 20221115 * [release] 20221115 * fix hot schema (#1404) * fix and refactor * refactor getTriggers * [SKIP CI] Prerelease * [release] 20221115 (#1408) * [release] 20221115 * [release] 20221115 * fix fetchBlock for workers (#1410) * [SKIP CI] Prerelease Co-authored-by: hariu-starfish <103621490+hariu-starfish@users.noreply.github.com> Co-authored-by: Orion Co-authored-by: Jay Ji Co-authored-by: Ben <89335033+bz888@users.noreply.github.com> Co-authored-by: Naveen V Co-authored-by: Filippo --- package.json | 2 +- packages/node/package.json | 2 +- .../blockDispatcher/base-block-dispatcher.ts | 146 +++ .../block-dispatcher.service.ts | 179 +++++++ .../node/src/indexer/blockDispatcher/index.ts | 12 + .../worker-block-dispatcher.service.ts | 242 +++++++++ .../node/src/indexer/dynamic-ds.service.ts | 46 +- packages/node/src/indexer/fetch.module.ts | 8 +- packages/node/src/indexer/fetch.service.ts | 91 +++- packages/node/src/indexer/indexer.manager.ts | 11 +- packages/node/src/indexer/project.service.ts | 36 +- .../worker/block-dispatcher.service.ts | 502 ------------------ .../node/src/indexer/worker/worker.service.ts | 1 + packages/node/src/subcommands/reindex.init.ts | 2 + .../node/src/subcommands/reindex.module.ts | 11 +- .../node/src/subcommands/reindex.service.ts | 88 ++- packages/node/src/utils/reindex.ts | 63 +++ packages/node/src/yargs.ts | 6 + test/Dockerfile | 2 +- yarn.lock | 244 ++++----- 20 files changed, 965 insertions(+), 729 deletions(-) create
mode 100644 packages/node/src/indexer/blockDispatcher/base-block-dispatcher.ts create mode 100644 packages/node/src/indexer/blockDispatcher/block-dispatcher.service.ts create mode 100644 packages/node/src/indexer/blockDispatcher/index.ts create mode 100644 packages/node/src/indexer/blockDispatcher/worker-block-dispatcher.service.ts delete mode 100644 packages/node/src/indexer/worker/block-dispatcher.service.ts create mode 100644 packages/node/src/utils/reindex.ts diff --git a/package.json b/package.json index 172126ecc6..d162cfc38d 100644 --- a/package.json +++ b/package.json @@ -33,7 +33,7 @@ "typescript": "^4.4.4" }, "resolutions": { - "@polkadot/api": "9.5.2", + "@polkadot/api": "9.7.1", "@polkadot/util": "10.1.11", "@terra-money/terra.js": "^3.0.11", "node-fetch": "2.6.7" diff --git a/packages/node/package.json b/packages/node/package.json index 31e4c8cdd6..72a9ff9c5c 100644 --- a/packages/node/package.json +++ b/packages/node/package.json @@ -26,7 +26,7 @@ "@nestjs/schedule": "^1.0.2", "@subql/common": "latest", "@subql/common-ethereum": "workspace:*", - "@subql/node-core": "1.3.3", + "@subql/node-core": "1.4.2-0", "@subql/types-ethereum": "workspace:*", "@subql/utils": "latest", "@subql/x-merkle-mountain-range": "2.0.0-0.1.2", diff --git a/packages/node/src/indexer/blockDispatcher/base-block-dispatcher.ts b/packages/node/src/indexer/blockDispatcher/base-block-dispatcher.ts new file mode 100644 index 0000000000..bd3a5111cf --- /dev/null +++ b/packages/node/src/indexer/blockDispatcher/base-block-dispatcher.ts @@ -0,0 +1,146 @@ +// Copyright 2020-2021 OnFinality Limited authors & contributors +// SPDX-License-Identifier: Apache-2.0 + +import assert from 'assert'; +import { EventEmitter2 } from '@nestjs/event-emitter'; +import { hexToU8a, u8aEq } from '@polkadot/util'; +import { getLogger, IndexerEvent, IQueue, NodeConfig } from '@subql/node-core'; +import { ProjectService } from '../project.service'; + +const logger = getLogger('BaseBlockDispatcherService'); + +export type ProcessBlockResponse = { + dynamicDsCreated: boolean; + operationHash: Uint8Array; + reindexBlockHeight: number; +}; + +export interface IBlockDispatcher { + init(onDynamicDsCreated: (height: number) => Promise): Promise; + + enqueueBlocks(heights: number[]): void; + + queueSize: number; + freeSize: number; + latestBufferedHeight: number | undefined; + + // Remove all enqueued blocks, used when a dynamic ds is created + flushQueue(height: number): void; + rewind(height: number): Promise; +} + +const NULL_MERKEL_ROOT = hexToU8a('0x00'); + +function isNullMerkelRoot(operationHash: Uint8Array): boolean { + return u8aEq(operationHash, NULL_MERKEL_ROOT); +} + +export abstract class BaseBlockDispatcher + implements IBlockDispatcher +{ + protected _latestBufferedHeight: number; + protected _processedBlockCount: number; + protected latestProcessedHeight: number; + protected currentProcessingHeight: number; + protected onDynamicDsCreated: (height: number) => Promise; + + constructor( + protected nodeConfig: NodeConfig, + protected eventEmitter: EventEmitter2, + protected projectService: ProjectService, + protected queue: Q, + ) {} + + abstract enqueueBlocks(heights: number[]): void; + abstract init( + onDynamicDsCreated: (height: number) => Promise, + ): Promise; + + get queueSize(): number { + return this.queue.size; + } + + get freeSize(): number { + return this.queue.freeSpace; + } + + get latestBufferedHeight(): number { + return this._latestBufferedHeight; + } + + set latestBufferedHeight(height: number) { + 
this.eventEmitter.emit(IndexerEvent.BlocknumberQueueSize, { + value: this.queueSize, + }); + this._latestBufferedHeight = height; + } + + protected setProcessedBlockCount(processedBlockCount: number): void { + this._processedBlockCount = processedBlockCount; + this.eventEmitter.emit(IndexerEvent.BlockProcessedCount, { + processedBlockCount, + timestamp: Date.now(), + }); + } + + // Compare it with current indexing number, if last corrected is already indexed + // rewind, also flush queued blocks, drop current indexing transaction, set last processed to correct block too + // if rollback is greater than current index flush queue only + async rewind(lastCorrectHeight: number): Promise { + if (lastCorrectHeight <= this.currentProcessingHeight) { + logger.info( + `Found last verified block at height ${lastCorrectHeight}, rewinding...`, + ); + await this.projectService.reindex(lastCorrectHeight); + this.latestProcessedHeight = lastCorrectHeight; + logger.info(`Successful rewind to block ${lastCorrectHeight}!`); + } + this.flushQueue(lastCorrectHeight); + logger.info(`Queued blocks flushed!`); //Also last buffered height reset, next fetching should start after lastCorrectHeight + } + + flushQueue(height: number): void { + this.latestBufferedHeight = height; + this.queue.flush(); + } + + // Is called directly before a block is processed + protected preProcessBlock(height: number): void { + this.currentProcessingHeight = height; + this.eventEmitter.emit(IndexerEvent.BlockProcessing, { + height, + timestamp: Date.now(), + }); + } + + // Is called directly after a block is processed + protected async postProcessBlock( + height: number, + processBlockResponse: ProcessBlockResponse, + ): Promise { + const { dynamicDsCreated, operationHash, reindexBlockHeight } = + processBlockResponse; + if (reindexBlockHeight !== null && reindexBlockHeight !== undefined) { + await this.rewind(reindexBlockHeight); + this.latestProcessedHeight = reindexBlockHeight; + } else { + if (this.nodeConfig.proofOfIndex && !isNullMerkelRoot(operationHash)) { + if (!this.projectService.blockOffset) { + // Which means during project init, it has not found offset and set value + await this.projectService.upsertMetadataBlockOffset(height - 1); + } + void this.projectService.setBlockOffset(height - 1); + } + if (dynamicDsCreated) { + await this.onDynamicDsCreated(height); + } + assert( + !this.latestProcessedHeight || height > this.latestProcessedHeight, + `Block processed out of order. Height: ${height}. 
Latest: ${this.latestProcessedHeight}`, + ); + // In memory _processedBlockCount increase, db metadata increase BlockCount in indexer.manager + this.setProcessedBlockCount(this._processedBlockCount + 1); + this.latestProcessedHeight = height; + } + } +} diff --git a/packages/node/src/indexer/blockDispatcher/block-dispatcher.service.ts b/packages/node/src/indexer/blockDispatcher/block-dispatcher.service.ts new file mode 100644 index 0000000000..04f8eaa72a --- /dev/null +++ b/packages/node/src/indexer/blockDispatcher/block-dispatcher.service.ts @@ -0,0 +1,179 @@ +// Copyright 2020-2021 OnFinality Limited authors & contributors +// SPDX-License-Identifier: Apache-2.0 + +import { Injectable, OnApplicationShutdown } from '@nestjs/common'; +import { EventEmitter2 } from '@nestjs/event-emitter'; +import { + ApiService, + getLogger, + NodeConfig, + IndexerEvent, + delay, + profilerWrap, + AutoQueue, + Queue, +} from '@subql/node-core'; +import { last } from 'lodash'; +import { IndexerManager } from '../indexer.manager'; +import { ProjectService } from '../project.service'; +import { BaseBlockDispatcher } from './base-block-dispatcher'; + +const logger = getLogger('BlockDispatcherService'); + +/** + * @description Intended to behave the same as WorkerBlockDispatcherService but doesn't use worker threads or any parallel processing + */ +@Injectable() +export class BlockDispatcherService + extends BaseBlockDispatcher> + implements OnApplicationShutdown +{ + private processQueue: AutoQueue; + + private fetching = false; + private isShutdown = false; + private fetchBlocksBatches: ApiService['api']['fetchBlocks']; + + constructor( + private apiService: ApiService, + nodeConfig: NodeConfig, + private indexerManager: IndexerManager, + eventEmitter: EventEmitter2, + projectService: ProjectService, + ) { + super( + nodeConfig, + eventEmitter, + projectService, + new Queue(nodeConfig.batchSize * 3), + ); + this.processQueue = new AutoQueue(nodeConfig.batchSize * 3); + + const fetchBlocks = this.apiService.api.fetchBlocks.bind( + this.apiService.api, + ); + if (this.nodeConfig.profiler) { + this.fetchBlocksBatches = profilerWrap( + fetchBlocks, + 'EthereumUtil', + 'fetchBlocksBatches', + ); + } else { + this.fetchBlocksBatches = fetchBlocks; + } + } + + // eslint-disable-next-line @typescript-eslint/require-await + async init( + onDynamicDsCreated: (height: number) => Promise, + ): Promise { + this.onDynamicDsCreated = onDynamicDsCreated; + const blockAmount = await this.projectService.getProcessedBlockCount(); + this.setProcessedBlockCount(blockAmount ?? 
0); + } + + onApplicationShutdown(): void { + this.isShutdown = true; + this.processQueue.abort(); + } + + enqueueBlocks(heights: number[]): void { + if (!heights.length) return; + + logger.info( + `Enqueing blocks ${heights[0]}...${last(heights)}, total ${ + heights.length + } blocks`, + ); + + this.queue.putMany(heights); + this.latestBufferedHeight = last(heights); + + void this.fetchBlocksFromQueue().catch((e) => { + logger.error(e, 'Failed to fetch blocks from queue'); + if (!this.isShutdown) { + process.exit(1); + } + }); + } + + flushQueue(height: number): void { + super.flushQueue(height); + this.processQueue.flush(); + } + + private async fetchBlocksFromQueue(): Promise { + if (this.fetching || this.isShutdown) return; + // Process queue is full, no point in fetching more blocks + // if (this.processQueue.freeSpace < this.nodeConfig.batchSize) return; + + this.fetching = true; + + try { + while (!this.isShutdown) { + const blockNums = this.queue.takeMany( + Math.min(this.nodeConfig.batchSize, this.processQueue.freeSpace), + ); + // Used to compare before and after as a way to check if queue was flushed + const bufferedHeight = this._latestBufferedHeight; + + // Queue is empty + if (!blockNums.length) { + // The process queue might be full so no block nums were taken, wait and try again + if (this.queue.size) { + await delay(1); + continue; + } + break; + } + + logger.info( + `fetch block [${blockNums[0]},${ + blockNums[blockNums.length - 1] + }], total ${blockNums.length} blocks`, + ); + + const blocks = await this.fetchBlocksBatches(blockNums); + + if (bufferedHeight > this._latestBufferedHeight) { + logger.debug(`Queue was reset for new DS, discarding fetched blocks`); + continue; + } + const blockTasks = blocks.map((block) => async () => { + const height = block.block.block.header.number.toNumber(); + try { + this.preProcessBlock(height); + + const processBlockResponse = await this.indexerManager.indexBlock( + block, + ); + + await this.postProcessBlock(height, processBlockResponse); + } catch (e) { + if (this.isShutdown) { + return; + } + logger.error( + e, + `failed to index block at height ${height} ${ + e.handler ? `${e.handler}(${e.stack ?? 
''})` : '' + }`, + ); + throw e; + } + }); + + // There can be enough of a delay after fetching blocks that shutdown could now be true + if (this.isShutdown) break; + + this.processQueue.putMany(blockTasks); + + this.eventEmitter.emit(IndexerEvent.BlockQueueSize, { + value: this.processQueue.size, + }); + } + } finally { + this.fetching = false; + } + } +} diff --git a/packages/node/src/indexer/blockDispatcher/index.ts b/packages/node/src/indexer/blockDispatcher/index.ts new file mode 100644 index 0000000000..231f984281 --- /dev/null +++ b/packages/node/src/indexer/blockDispatcher/index.ts @@ -0,0 +1,12 @@ +// Copyright 2020-2021 OnFinality Limited authors & contributors +// SPDX-License-Identifier: Apache-2.0 + +import { IBlockDispatcher } from './base-block-dispatcher'; +import { BlockDispatcherService } from './block-dispatcher.service'; +import { WorkerBlockDispatcherService } from './worker-block-dispatcher.service'; + +export { + IBlockDispatcher, + BlockDispatcherService, + WorkerBlockDispatcherService, +}; diff --git a/packages/node/src/indexer/blockDispatcher/worker-block-dispatcher.service.ts b/packages/node/src/indexer/blockDispatcher/worker-block-dispatcher.service.ts new file mode 100644 index 0000000000..c9613e964a --- /dev/null +++ b/packages/node/src/indexer/blockDispatcher/worker-block-dispatcher.service.ts @@ -0,0 +1,242 @@ +// Copyright 2020-2021 OnFinality Limited authors & contributors +// SPDX-License-Identifier: Apache-2.0 + +import assert from 'assert'; +import path from 'path'; +import { Injectable, OnApplicationShutdown } from '@nestjs/common'; +import { EventEmitter2 } from '@nestjs/event-emitter'; +import { Interval } from '@nestjs/schedule'; +import { + getLogger, + NodeConfig, + IndexerEvent, + Worker, + AutoQueue, +} from '@subql/node-core'; +import chalk from 'chalk'; +import { last } from 'lodash'; +import { ProjectService } from '../project.service'; +import { + FetchBlock, + ProcessBlock, + InitWorker, + NumFetchedBlocks, + NumFetchingBlocks, + GetWorkerStatus, +} from '../worker/worker'; +import { BaseBlockDispatcher } from './base-block-dispatcher'; + +const logger = getLogger('WorkerBlockDispatcherService'); + +type IIndexerWorker = { + processBlock: ProcessBlock; + fetchBlock: FetchBlock; + numFetchedBlocks: NumFetchedBlocks; + numFetchingBlocks: NumFetchingBlocks; + getStatus: GetWorkerStatus; +}; + +type IInitIndexerWorker = IIndexerWorker & { + initWorker: InitWorker; +}; + +type IndexerWorker = IIndexerWorker & { + terminate: () => Promise; +}; + +async function createIndexerWorker(): Promise { + const indexerWorker = Worker.create( + path.resolve(__dirname, '../../../dist/indexer/worker/worker.js'), + [ + 'initWorker', + 'processBlock', + 'fetchBlock', + 'numFetchedBlocks', + 'numFetchingBlocks', + 'getStatus', + ], + ); + + await indexerWorker.initWorker(); + + return indexerWorker; +} + +@Injectable() +export class WorkerBlockDispatcherService + extends BaseBlockDispatcher> + implements OnApplicationShutdown +{ + private workers: IndexerWorker[]; + private numWorkers: number; + + private taskCounter = 0; + private isShutdown = false; + + constructor( + nodeConfig: NodeConfig, + eventEmitter: EventEmitter2, + projectService: ProjectService, + ) { + const numWorkers = nodeConfig.workers; + super( + nodeConfig, + eventEmitter, + projectService, + new AutoQueue(numWorkers * nodeConfig.batchSize * 2), + ); + this.numWorkers = numWorkers; + } + + async init( + onDynamicDsCreated: (height: number) => Promise, + ): Promise { + if 
(this.nodeConfig.unfinalizedBlocks) { + throw new Error( + 'Sorry, best block feature is not supported with workers yet.', + ); + } + + this.workers = await Promise.all( + new Array(this.numWorkers).fill(0).map(() => createIndexerWorker()), + ); + + this.onDynamicDsCreated = onDynamicDsCreated; + + const blockAmount = await this.projectService.getProcessedBlockCount(); + this.setProcessedBlockCount(blockAmount ?? 0); + } + + async onApplicationShutdown(): Promise { + this.isShutdown = true; + // Stop processing blocks + this.queue.abort(); + + // Stop all workers + if (this.workers) { + await Promise.all(this.workers.map((w) => w.terminate())); + } + } + + enqueueBlocks(heights: number[]): void { + if (!heights.length) return; + logger.info( + `Enqueing blocks [${heights[0]}...${last(heights)}], total ${ + heights.length + } blocks`, + ); + + // eslint-disable-next-line no-constant-condition + if (true) { + /* + * Load balancing: + * worker1: 1,2,3 + * worker2: 4,5,6 + */ + const workerIdx = this.getNextWorkerIndex(); + heights.map((height) => this.enqueueBlock(height, workerIdx)); + } else { + /* + * Load balancing: + * worker1: 1,3,5 + * worker2: 2,4,6 + */ + heights.map((height) => + this.enqueueBlock(height, this.getNextWorkerIndex()), + ); + } + + this.latestBufferedHeight = last(heights); + } + + private enqueueBlock(height: number, workerIdx: number) { + if (this.isShutdown) return; + const worker = this.workers[workerIdx]; + + assert(worker, `Worker ${workerIdx} not found`); + + // Used to compare before and after as a way to check if queue was flushed + const bufferedHeight = this.latestBufferedHeight; + const pendingBlock = worker.fetchBlock(height); + + const processBlock = async () => { + try { + const start = new Date(); + const result = await pendingBlock; + const end = new Date(); + + if (bufferedHeight > this.latestBufferedHeight) { + logger.debug(`Queue was reset for new DS, discarding fetched blocks`); + return; + } + + const waitTime = end.getTime() - start.getTime(); + if (waitTime > 1000) { + logger.info( + `Waiting to fetch block ${height}: ${chalk.red(`${waitTime}ms`)}`, + ); + } else if (waitTime > 200) { + logger.info( + `Waiting to fetch block ${height}: ${chalk.yellow( + `${waitTime}ms`, + )}`, + ); + } + + // logger.info( + // `worker ${workerIdx} processing block ${height}, fetched blocks: ${await worker.numFetchedBlocks()}, fetching blocks: ${await worker.numFetchingBlocks()}`, + // ); + + this.preProcessBlock(height); + + const { dynamicDsCreated, operationHash, reindexBlockHeight } = + await worker.processBlock(height); + + await this.postProcessBlock(height, { + dynamicDsCreated, + operationHash: Buffer.from(operationHash, 'base64'), + reindexBlockHeight, + }); + } catch (e) { + logger.error( + e, + `failed to index block at height ${height} ${ + e.handler ? `${e.handler}(${e.stack ?? 
''})` : '' + }`, + ); + throw e; + } + }; + + void this.queue.put(processBlock); + } + + @Interval(15000) + async sampleWorkerStatus(): Promise { + for (const worker of this.workers) { + const status = await worker.getStatus(); + logger.info(JSON.stringify(status)); + } + } + + // Getter doesn't seem to cary from abstract class + get latestBufferedHeight(): number { + return this._latestBufferedHeight; + } + + set latestBufferedHeight(height: number) { + super.latestBufferedHeight = height; + // There is only a single queue with workers so we treat them as the same + this.eventEmitter.emit(IndexerEvent.BlockQueueSize, { + value: this.queueSize, + }); + } + + private getNextWorkerIndex(): number { + const index = this.taskCounter % this.numWorkers; + + this.taskCounter++; + + return index; + } +} diff --git a/packages/node/src/indexer/dynamic-ds.service.ts b/packages/node/src/indexer/dynamic-ds.service.ts index 5d9350e5e2..24a6c102f0 100644 --- a/packages/node/src/indexer/dynamic-ds.service.ts +++ b/packages/node/src/indexer/dynamic-ds.service.ts @@ -5,7 +5,7 @@ import assert from 'assert'; import { Injectable } from '@nestjs/common'; import { isCustomDs, isRuntimeDs } from '@subql/common-ethereum'; import { getLogger, MetadataRepo } from '@subql/node-core'; -import { cloneDeep } from 'lodash'; +import { cloneDeep, isEqual, unionWith } from 'lodash'; import { Transaction } from 'sequelize/types'; import { SubqlProjectDs, SubqueryProject } from '../configure/SubqueryProject'; import { DsProcessorService } from './ds-processor.service'; @@ -37,6 +37,20 @@ export class DynamicDsService { private _datasources: SubqlProjectDs[]; + async resetDynamicDatasource(targetHeight: number, tx: Transaction) { + const dynamicDs = await this.getDynamicDatasourceParams(); + if (dynamicDs.length !== 0) { + const filteredDs = dynamicDs.filter( + (ds) => ds.startBlock <= targetHeight, + ); + const dsRecords = JSON.stringify(filteredDs); + await this.metaDataRepo.upsert( + { key: METADATA_KEY, value: dsRecords }, + { transaction: tx }, + ); + } + } + async createDynamicDatasource( params: DatasourceParams, tx: Transaction, @@ -86,20 +100,26 @@ export class DynamicDsService { ): Promise { assert(this.metaDataRepo, `Model _metadata does not exist`); const record = await this.metaDataRepo.findByPk(METADATA_KEY); - let results = record?.value; - - if (!results || typeof results !== 'string') { - if (blockHeight !== undefined) { - results = this.tempDsRecords?.[TEMP_DS_PREFIX + blockHeight]; - if (!results || typeof results !== 'string') { - return []; - } - } else { - return []; + + let results: DatasourceParams[] = []; + + const metaResults: DatasourceParams[] = JSON.parse( + (record?.value as string) ?? '[]', + ); + if (metaResults.length) { + results = [...metaResults]; + } + + if (blockHeight !== undefined) { + const tempResults: DatasourceParams[] = JSON.parse( + this.tempDsRecords?.[TEMP_DS_PREFIX + blockHeight] ?? 
'[]', + ); + if (tempResults.length) { + results = unionWith(results, tempResults, isEqual); } } - return JSON.parse(results); + return results; } private async saveDynamicDatasourceParams( @@ -115,7 +135,7 @@ export class DynamicDsService { .then(() => { this.tempDsRecords = { ...this.tempDsRecords, - ...{ [TEMP_DS_PREFIX + dsParams.startBlock]: dsRecords }, + [TEMP_DS_PREFIX + dsParams.startBlock]: dsRecords, }; }); } diff --git a/packages/node/src/indexer/fetch.module.ts b/packages/node/src/indexer/fetch.module.ts index dc8ca8534f..6a455579e1 100644 --- a/packages/node/src/indexer/fetch.module.ts +++ b/packages/node/src/indexer/fetch.module.ts @@ -13,6 +13,10 @@ import { } from '@subql/node-core'; import { SubqueryProject } from '../configure/SubqueryProject'; import { EthereumApiService } from '../ethereum/api.service.ethereum'; +import { + BlockDispatcherService, + WorkerBlockDispatcherService, +} from './blockDispatcher'; import { DictionaryService } from './dictionary.service'; import { DsProcessorService } from './ds-processor.service'; import { DynamicDsService } from './dynamic-ds.service'; @@ -20,10 +24,6 @@ import { FetchService } from './fetch.service'; import { IndexerManager } from './indexer.manager'; import { ProjectService } from './project.service'; import { SandboxService } from './sandbox.service'; -import { - BlockDispatcherService, - WorkerBlockDispatcherService, -} from './worker/block-dispatcher.service'; @Module({ providers: [ diff --git a/packages/node/src/indexer/fetch.service.ts b/packages/node/src/indexer/fetch.service.ts index a1d87401a2..c2087a844f 100644 --- a/packages/node/src/indexer/fetch.service.ts +++ b/packages/node/src/indexer/fetch.service.ts @@ -28,9 +28,9 @@ import { range, sortBy, uniqBy } from 'lodash'; import { SubqlProjectDs, SubqueryProject } from '../configure/SubqueryProject'; import { calcInterval } from '../ethereum/utils.ethereum'; import { eventToTopic, functionToSighash } from '../utils/string'; +import { IBlockDispatcher } from './blockDispatcher'; import { DictionaryService } from './dictionary.service'; import { DynamicDsService } from './dynamic-ds.service'; -import { IBlockDispatcher } from './worker/block-dispatcher.service'; const logger = getLogger('fetch'); let BLOCK_TIME_VARIANCE = 5000; @@ -98,10 +98,10 @@ export class FetchService implements OnApplicationShutdown { private latestBestHeight: number; private latestFinalizedHeight: number; private isShutdown = false; - private useDictionary: boolean; private dictionaryQueryEntries?: DictionaryQueryEntry[]; private batchSizeScale: number; private templateDynamicDatasouces: SubqlProjectDs[]; + private dictionaryGenesisMatches = true; constructor( private apiService: ApiService, @@ -129,12 +129,17 @@ export class FetchService implements OnApplicationShutdown { await this.dynamicDsService.getDynamicDatasources(); } - // TODO: if custom ds doesn't support dictionary, use baseFilter, if yes, let - getDictionaryQueryEntries(): DictionaryQueryEntry[] { + buildDictionaryQueryEntries(startBlock: number): DictionaryQueryEntry[] { const queryEntries: DictionaryQueryEntry[] = []; - const dataSources = this.project.dataSources; - for (const ds of dataSources.concat(this.templateDynamicDatasouces)) { + // Only run the ds that is equal or less than startBlock + // sort array from lowest ds.startBlock to highest + const filteredDs = this.project.dataSources + .concat(this.templateDynamicDatasouces) + .filter((ds) => ds.startBlock <= startBlock) + .sort((a, b) => a.startBlock - b.startBlock); + 
+ for (const ds of filteredDs) { for (const handler of ds.mapping.handlers) { let filterList: SubqlHandlerFilter[]; filterList = [handler.filter]; @@ -182,10 +187,21 @@ export class FetchService implements OnApplicationShutdown { } updateDictionary(): void { - this.dictionaryQueryEntries = this.getDictionaryQueryEntries(); - this.useDictionary = - !!this.dictionaryQueryEntries?.length && - !!this.project.network.dictionary; + this.dictionaryService.buildDictionaryEntryMap( + this.project.dataSources.concat(this.templateDynamicDatasouces), + this.buildDictionaryQueryEntries.bind(this), + ); + } + + private get useDictionary(): boolean { + return ( + !!this.project.network.dictionary && + this.dictionaryGenesisMatches && + !!this.dictionaryService.getDictionaryQueryEntries( + this.blockDispatcher.latestBufferedHeight ?? + Math.min(...this.project.dataSources.map((ds) => ds.startBlock)), + ).length + ); } async init(startHeight: number): Promise { @@ -233,9 +249,11 @@ export class FetchService implements OnApplicationShutdown { const currentFinalizedHeight = await this.api.getFinalizedBlockHeight(); if (this.latestFinalizedHeight !== currentFinalizedHeight) { this.latestFinalizedHeight = currentFinalizedHeight; - this.eventEmitter.emit(IndexerEvent.BlockTarget, { - height: this.latestFinalizedHeight, - }); + if (!this.nodeConfig.unfinalizedBlocks) { + this.eventEmitter.emit(IndexerEvent.BlockTarget, { + height: this.latestFinalizedHeight, + }); + } } } catch (e) { logger.warn(e, `Having a problem when get finalized block`); @@ -255,13 +273,19 @@ export class FetchService implements OnApplicationShutdown { this.eventEmitter.emit(IndexerEvent.BlockBest, { height: this.latestBestHeight, }); + + if (this.nodeConfig.unfinalizedBlocks) { + this.eventEmitter.emit(IndexerEvent.BlockTarget, { + height: this.latestBestHeight, + }); + } } } catch (e) { logger.warn(e, `Having a problem when get best block`); } } - async startLoop(initBlockHeight: number): Promise { + private async startLoop(initBlockHeight: number): Promise { await this.fillNextBlockBuffer(initBlockHeight); } @@ -323,26 +347,32 @@ export class FetchService implements OnApplicationShutdown { Math.round(this.batchSizeScale * this.nodeConfig.batchSize), Math.min(MINIMUM_BATCH_SIZE, this.nodeConfig.batchSize * 3), ); + const latestHeight = this.nodeConfig.unfinalizedBlocks + ? 
this.latestBestHeight + : this.latestFinalizedHeight; + if ( this.blockDispatcher.freeSize < scaledBatchSize || - startBlockHeight > this.latestFinalizedHeight + startBlockHeight > latestHeight ) { await delay(1); continue; } + if (this.useDictionary) { const queryEndBlock = startBlockHeight + DICTIONARY_MAX_QUERY_SIZE; const moduloBlocks = this.getModuloBlocks( startBlockHeight, queryEndBlock, ); + try { - const dictionary = await this.dictionaryService.getDictionary( - startBlockHeight, - queryEndBlock, - scaledBatchSize, - this.dictionaryQueryEntries, - ); + const dictionary = + await this.dictionaryService.scopedDictionaryEntries( + startBlockHeight, + queryEndBlock, + scaledBatchSize, + ); if (startBlockHeight !== getStartBlockHeight()) { logger.debug( @@ -382,12 +412,13 @@ export class FetchService implements OnApplicationShutdown { this.eventEmitter.emit(IndexerEvent.SkipDictionary); } } + const endHeight = this.nextEndBlockHeight( startBlockHeight, scaledBatchSize, ); - if (this.getModulos().length === handlers.length) { + if (handlers.length && this.getModulos().length === handlers.length) { this.blockDispatcher.enqueueBlocks( this.getEnqueuedModuloBlocks(startBlockHeight), ); @@ -406,7 +437,13 @@ export class FetchService implements OnApplicationShutdown { let endBlockHeight = startBlockHeight + scaledBatchSize - 1; if (endBlockHeight > this.latestFinalizedHeight) { - endBlockHeight = this.latestFinalizedHeight; + if (this.nodeConfig.unfinalizedBlocks) { + if (endBlockHeight >= this.latestBestHeight) { + endBlockHeight = this.latestBestHeight; + } + } else { + endBlockHeight = this.latestFinalizedHeight; + } } return endBlockHeight; } @@ -419,7 +456,7 @@ export class FetchService implements OnApplicationShutdown { logger.error( 'The dictionary that you have specified does not match the chain you are indexing, it will be ignored. 
Please update your project manifest to reference the correct dictionary', ); - this.useDictionary = false; + this.dictionaryGenesisMatches = false; this.eventEmitter.emit(IndexerEvent.UsingDictionary, { value: Number(this.useDictionary), }); @@ -442,4 +479,10 @@ export class FetchService implements OnApplicationShutdown { this.updateDictionary(); this.blockDispatcher.flushQueue(blockHeight); } + + async resetForIncorrectBestBlock(blockHeight: number): Promise { + await this.syncDynamicDatascourcesFromMeta(); + this.updateDictionary(); + this.blockDispatcher.flushQueue(blockHeight); + } } diff --git a/packages/node/src/indexer/indexer.manager.ts b/packages/node/src/indexer/indexer.manager.ts index c71d3f5863..3ad81767ac 100644 --- a/packages/node/src/indexer/indexer.manager.ts +++ b/packages/node/src/indexer/indexer.manager.ts @@ -72,9 +72,11 @@ export class IndexerManager { } @profiler(yargsOptions.argv.profiler) - async indexBlock( - blockContent: EthereumBlockWrapper, - ): Promise<{ dynamicDsCreated: boolean; operationHash: Uint8Array }> { + async indexBlock(blockContent: EthereumBlockWrapper): Promise<{ + dynamicDsCreated: boolean; + operationHash: Uint8Array; + reindexBlockHeight: null; + }> { const { blockHeight } = blockContent; let dynamicDsCreated = false; const tx = await this.sequelize.transaction(); @@ -132,7 +134,7 @@ export class IndexerManager { { transaction: tx }, ); // Db Metadata increase BlockCount, in memory ref to block-dispatcher _processedBlockCount - await this.storeService.incrementBlockCount(tx); + await this.storeService.incrementJsonbCount('processedBlockCount', tx); // Need calculate operationHash to ensure correct offset insert all time operationHash = this.storeService.getOperationMerkleRoot(); @@ -165,6 +167,7 @@ export class IndexerManager { return { dynamicDsCreated, operationHash, + reindexBlockHeight: null, }; } diff --git a/packages/node/src/indexer/project.service.ts b/packages/node/src/indexer/project.service.ts index fd244c2704..2e792ff9ec 100644 --- a/packages/node/src/indexer/project.service.ts +++ b/packages/node/src/indexer/project.service.ts @@ -25,6 +25,7 @@ import { SubqueryProject, } from '../configure/SubqueryProject'; import { initDbSchema } from '../utils/project'; +import { reindex } from '../utils/reindex'; import { DsProcessorService } from './ds-processor.service'; import { DynamicDsService } from './dynamic-ds.service'; @@ -71,6 +72,11 @@ export class ProjectService { return this._startHeight; } + get isHistorical(): boolean { + return this.storeService.historical; + } + + // eslint-disable-next-line @typescript-eslint/require-await private async getExistingProjectSchema(): Promise { return getExistingProjectSchema(this.nodeConfig, this.sequelize); } @@ -111,6 +117,13 @@ export class ProjectService { await this.poiService.init(this.schema); } } + + if (this.nodeConfig.unfinalizedBlocks && !this.isHistorical) { + logger.error( + 'Unfinalized blocks cannot be enabled without historical. 
You will need to reindex your project to enable historical', + ); + process.exit(1); + } } private async ensureProject(): Promise { @@ -162,6 +175,8 @@ export class ProjectService { 'genesisHash', 'chainId', 'processedBlockCount', + 'lastFinalizedVerifiedHeight', + 'schemaMigrationCount', ] as const; const entries = await metadataRepo.findAll({ @@ -228,6 +243,9 @@ export class ProjectService { value: packageVersion, }); } + if (!keyValue.schemaMigrationCount) { + await metadataRepo.upsert({ key: 'schemaMigrationCount', value: 0 }); + } return metadataRepo; } @@ -255,7 +273,6 @@ export class ProjectService { } else { startHeight = this.getStartBlockFromDataSources(); } - return startHeight; } @@ -298,4 +315,21 @@ export class ProjectService { return Math.min(...startBlocksList); } } + + async reindex(targetBlockHeight: number): Promise { + const lastProcessedHeight = await this.getLastProcessedHeight(); + + return reindex( + this.getStartBlockFromDataSources(), + await this.getMetadataBlockOffset(), + targetBlockHeight, + lastProcessedHeight, + this.storeService, + this.dynamicDsService, + this.mmrService, + this.sequelize, + /* Not providing force clean service, it should never be needed */ + ); + } + n; } diff --git a/packages/node/src/indexer/worker/block-dispatcher.service.ts b/packages/node/src/indexer/worker/block-dispatcher.service.ts deleted file mode 100644 index 001a18542f..0000000000 --- a/packages/node/src/indexer/worker/block-dispatcher.service.ts +++ /dev/null @@ -1,502 +0,0 @@ -// Copyright 2020-2021 OnFinality Limited authors & contributors -// SPDX-License-Identifier: Apache-2.0 - -import assert from 'assert'; -import path from 'path'; -import { Injectable, OnApplicationShutdown } from '@nestjs/common'; -import { EventEmitter2 } from '@nestjs/event-emitter'; -import { Interval } from '@nestjs/schedule'; -import { hexToU8a, u8aEq } from '@polkadot/util'; -import { - ApiService, - getLogger, - NodeConfig, - IndexerEvent, - AutoQueue, - Queue, - Worker, - delay, - profilerWrap, -} from '@subql/node-core'; -import { EthereumBlockWrapper } from '@subql/types-ethereum'; -import chalk from 'chalk'; -import { last } from 'lodash'; -import { IndexerManager } from '../indexer.manager'; -import { ProjectService } from '../project.service'; -import { - FetchBlock, - ProcessBlock, - InitWorker, - NumFetchedBlocks, - NumFetchingBlocks, - GetWorkerStatus, -} from './worker'; - -const NULL_MERKEL_ROOT = hexToU8a('0x00'); - -function isNullMerkelRoot(operationHash: Uint8Array): boolean { - return u8aEq(operationHash, NULL_MERKEL_ROOT); -} - -type IIndexerWorker = { - processBlock: ProcessBlock; - fetchBlock: FetchBlock; - numFetchedBlocks: NumFetchedBlocks; - numFetchingBlocks: NumFetchingBlocks; - getStatus: GetWorkerStatus; -}; - -type IInitIndexerWorker = IIndexerWorker & { - initWorker: InitWorker; -}; - -type IndexerWorker = IIndexerWorker & { - terminate: () => Promise; -}; - -async function createIndexerWorker(): Promise { - const indexerWorker = Worker.create( - path.resolve(__dirname, '../../../dist/indexer/worker/worker.js'), - [ - 'initWorker', - 'processBlock', - 'fetchBlock', - 'numFetchedBlocks', - 'numFetchingBlocks', - 'getStatus', - ], - ); - - await indexerWorker.initWorker(); - - return indexerWorker; -} - -export interface IBlockDispatcher { - init(onDynamicDsCreated: (height: number) => Promise): Promise; - - enqueueBlocks(heights: number[]): void; - - queueSize: number; - freeSize: number; - latestBufferedHeight: number | undefined; - - // Remove all enqueued blocks, 
used when a dynamic ds is created - flushQueue(height: number): void; -} - -const logger = getLogger('BlockDispatcherService'); - -// TODO move to another file -/** - * @description Intended to behave the same as WorkerBlockDispatcherService but doesn't use worker threads or any parallel processing - */ -@Injectable() -export class BlockDispatcherService - implements IBlockDispatcher, OnApplicationShutdown -{ - private fetchQueue: Queue; - private processQueue: AutoQueue; - - private fetching = false; - private isShutdown = false; - private onDynamicDsCreated: (height: number) => Promise; - private _latestBufferedHeight: number; - private _processedBlockCount: number; - - private fetchBlocksBatches: ApiService['api']['fetchBlocks']; - private latestProcessedHeight: number; - - constructor( - private apiService: ApiService, - private nodeConfig: NodeConfig, - private indexerManager: IndexerManager, - private eventEmitter: EventEmitter2, - private projectService: ProjectService, - ) { - this.fetchQueue = new Queue(nodeConfig.batchSize * 3); - this.processQueue = new AutoQueue(nodeConfig.batchSize * 3); - - const fetchBlocks = this.apiService.api.fetchBlocks.bind( - this.apiService.api, - ); - if (this.nodeConfig.profiler) { - this.fetchBlocksBatches = profilerWrap( - fetchBlocks, - 'EthereumUtil', - 'fetchBlocksBatches', - ); - } else { - this.fetchBlocksBatches = fetchBlocks; - } - } - - // eslint-disable-next-line @typescript-eslint/require-await - async init( - onDynamicDsCreated: (height: number) => Promise, - ): Promise { - this.onDynamicDsCreated = onDynamicDsCreated; - const blockAmount = await this.projectService.getProcessedBlockCount(); - this.setProcessedBlockCount(blockAmount ?? 0); - } - - onApplicationShutdown(): void { - this.isShutdown = true; - this.processQueue.abort(); - } - - enqueueBlocks(heights: number[]): void { - if (!heights.length) return; - - logger.info(`Enqueing blocks ${heights[0]}...${last(heights)}`); - - this.fetchQueue.putMany(heights); - this.latestBufferedHeight = last(heights); - - void this.fetchBlocksFromQueue().catch((e) => { - logger.error(e, 'Failed to fetch blocks from queue'); - if (!this.isShutdown) { - throw e; - } - }); - } - - flushQueue(height: number): void { - this.latestBufferedHeight = height; - this.fetchQueue.flush(); // Empty - this.processQueue.flush(); - } - - private setProcessedBlockCount(processedBlockCount: number) { - this._processedBlockCount = processedBlockCount; - this.eventEmitter.emit(IndexerEvent.BlockProcessedCount, { - processedBlockCount, - timestamp: Date.now(), - }); - } - - private async fetchBlocksFromQueue(): Promise { - if (this.fetching || this.isShutdown) return; - // Process queue is full, no point in fetching more blocks - // if (this.processQueue.freeSpace < this.nodeConfig.batchSize) return; - - this.fetching = true; - - try { - while (!this.isShutdown) { - const blockNums = this.fetchQueue.takeMany( - Math.min(this.nodeConfig.batchSize, this.processQueue.freeSpace), - ); - // Used to compare before and after as a way to check if queue was flushed - const bufferedHeight = this._latestBufferedHeight; - - // Queue is empty - if (!blockNums.length) { - // The process queue might be full so no block nums were taken, wait and try again - if (this.fetchQueue.size) { - await delay(1); - continue; - } - break; - } - - logger.info( - `fetch block [${blockNums[0]},${ - blockNums[blockNums.length - 1] - }], total ${blockNums.length} blocks`, - ); - - const blocks = await this.fetchBlocksBatches(blockNums); - - if 
(bufferedHeight > this._latestBufferedHeight) { - logger.debug(`Queue was reset for new DS, discarding fetched blocks`); - continue; - } - - const blockTasks = blocks.map((block) => async () => { - const height = block.blockHeight; - try { - this.eventEmitter.emit(IndexerEvent.BlockProcessing, { - height, - timestamp: Date.now(), - }); - - const { dynamicDsCreated, operationHash } = - await this.indexerManager.indexBlock( - block as EthereumBlockWrapper, - ); - - // In memory _processedBlockCount increase, db metadata increase BlockCount in indexer.manager - this.setProcessedBlockCount(this._processedBlockCount + 1); - if ( - this.nodeConfig.proofOfIndex && - !isNullMerkelRoot(operationHash) - ) { - if (!this.projectService.blockOffset) { - // Which means during project init, it has not found offset and set value - await this.projectService.upsertMetadataBlockOffset(height - 1); - } - void this.projectService.setBlockOffset(height - 1); - } - - if (dynamicDsCreated) { - await this.onDynamicDsCreated(height); - } - - assert( - !this.latestProcessedHeight || - height > this.latestProcessedHeight, - `Block processed out of order. Height: ${height}. Latest: ${this.latestProcessedHeight}`, - ); - this.latestProcessedHeight = height; - } catch (e) { - if (this.isShutdown) { - return; - } - logger.error( - e, - `failed to index block at height ${height} ${ - e.handler ? `${e.handler}(${e.stack ?? ''})` : '' - }`, - ); - throw e; - } - }); - - // There can be enough of a delay after fetching blocks that shutdown could now be true - if (this.isShutdown) break; - - this.processQueue.putMany(blockTasks); - - this.eventEmitter.emit(IndexerEvent.BlockQueueSize, { - value: this.processQueue.size, - }); - } - } finally { - this.fetching = false; - } - } - - get queueSize(): number { - return this.fetchQueue.size; - } - - get freeSize(): number { - return this.fetchQueue.freeSpace; - } - - get latestBufferedHeight(): number { - return this._latestBufferedHeight; - } - - set latestBufferedHeight(height: number) { - this.eventEmitter.emit(IndexerEvent.BlocknumberQueueSize, { - value: this.queueSize, - }); - this._latestBufferedHeight = height; - } -} - -@Injectable() -export class WorkerBlockDispatcherService - implements IBlockDispatcher, OnApplicationShutdown -{ - private workers: IndexerWorker[]; - private numWorkers: number; - private onDynamicDsCreated: (height: number) => Promise; - - private taskCounter = 0; - private isShutdown = false; - private queue: AutoQueue; - private _latestBufferedHeight: number; - private _processedBlockCount: number; - - constructor( - private nodeConfig: NodeConfig, - private eventEmitter: EventEmitter2, - private projectService: ProjectService, - ) { - this.numWorkers = nodeConfig.workers; - this.queue = new AutoQueue(this.numWorkers * nodeConfig.batchSize * 2); - } - - async init( - onDynamicDsCreated: (height: number) => Promise, - ): Promise { - this.workers = await Promise.all( - new Array(this.numWorkers).fill(0).map(() => createIndexerWorker()), - ); - - this.onDynamicDsCreated = onDynamicDsCreated; - - const blockAmount = await this.projectService.getProcessedBlockCount(); - this.setProcessedBlockCount(blockAmount ?? 
0); - } - - private setProcessedBlockCount(processedBlockCount: number) { - this._processedBlockCount = processedBlockCount; - this.eventEmitter.emit(IndexerEvent.BlockProcessedCount, { - processedBlockCount, - timestamp: Date.now(), - }); - } - - async onApplicationShutdown(): Promise { - this.isShutdown = true; - // Stop processing blocks - this.queue.abort(); - - // Stop all workers - if (this.workers) { - await Promise.all(this.workers.map((w) => w.terminate())); - } - } - - enqueueBlocks(heights: number[]): void { - if (!heights.length) return; - logger.info( - `Enqueing blocks [${heights[0]}...${last(heights)}], total ${ - heights.length - } blocks`, - ); - - // eslint-disable-next-line no-constant-condition - if (true) { - /* - * Load balancing: - * worker1: 1,2,3 - * worker2: 4,5,6 - */ - const workerIdx = this.getNextWorkerIndex(); - heights.map((height) => this.enqueueBlock(height, workerIdx)); - } else { - /* - * Load balancing: - * worker1: 1,3,5 - * worker2: 2,4,6 - */ - heights.map((height) => - this.enqueueBlock(height, this.getNextWorkerIndex()), - ); - } - - this.latestBufferedHeight = last(heights); - } - - flushQueue(height: number): void { - this.latestBufferedHeight = height; - this.queue.flush(); - } - - private enqueueBlock(height: number, workerIdx: number) { - if (this.isShutdown) return; - const worker = this.workers[workerIdx]; - - assert(worker, `Worker ${workerIdx} not found`); - - // Used to compare before and after as a way to check if queue was flushed - const bufferedHeight = this._latestBufferedHeight; - const pendingBlock = worker.fetchBlock(height); - - const processBlock = async () => { - try { - const start = new Date(); - const result = await pendingBlock; - const end = new Date(); - - if (bufferedHeight > this._latestBufferedHeight) { - logger.debug(`Queue was reset for new DS, discarding fetched blocks`); - return; - } - - const waitTime = end.getTime() - start.getTime(); - if (waitTime > 1000) { - logger.info( - `Waiting to fetch block ${height}: ${chalk.red(`${waitTime}ms`)}`, - ); - } else if (waitTime > 200) { - logger.info( - `Waiting to fetch block ${height}: ${chalk.yellow( - `${waitTime}ms`, - )}`, - ); - } - - // logger.info( - // `worker ${workerIdx} processing block ${height}, fetched blocks: ${await worker.numFetchedBlocks()}, fetching blocks: ${await worker.numFetchingBlocks()}`, - // ); - - this.eventEmitter.emit(IndexerEvent.BlockProcessing, { - height, - timestamp: Date.now(), - }); - - const { dynamicDsCreated, operationHash } = await worker.processBlock( - height, - ); - // In memory _processedBlockCount increase, db metadata increase BlockCount in indexer.manager - this.setProcessedBlockCount(this._processedBlockCount + 1); - - if ( - this.nodeConfig.proofOfIndex && - !isNullMerkelRoot(Buffer.from(operationHash, 'base64')) - ) { - if (!this.projectService.blockOffset) { - // Which means during project init, it has not found offset and set value - await this.projectService.upsertMetadataBlockOffset(height - 1); - } - void this.projectService.setBlockOffset(height - 1); - } - - if (dynamicDsCreated) { - await this.onDynamicDsCreated(height); - } - } catch (e) { - logger.error( - e, - `failed to index block at height ${height} ${ - e.handler ? `${e.handler}(${e.stack ?? 
''})` : '' - }`, - ); - throw e; - } - }; - - void this.queue.put(processBlock); - } - - @Interval(15000) - async sampleWorkerStatus(): Promise { - for (const worker of this.workers) { - const status = await worker.getStatus(); - logger.info(JSON.stringify(status)); - } - } - - get queueSize(): number { - return this.queue.size; - } - - get freeSize(): number { - return this.queue.freeSpace; - } - - get latestBufferedHeight(): number { - return this._latestBufferedHeight; - } - - set latestBufferedHeight(height: number) { - this.eventEmitter.emit(IndexerEvent.BlocknumberQueueSize, { - value: this.queueSize, - }); - this._latestBufferedHeight = height; - } - - private getNextWorkerIndex(): number { - const index = this.taskCounter % this.numWorkers; - - this.taskCounter++; - - return index; - } -} diff --git a/packages/node/src/indexer/worker/worker.service.ts b/packages/node/src/indexer/worker/worker.service.ts index e6c571ee83..5236a7d1d8 100644 --- a/packages/node/src/indexer/worker/worker.service.ts +++ b/packages/node/src/indexer/worker/worker.service.ts @@ -12,6 +12,7 @@ export type FetchBlockResponse = { parentHash: string } | undefined; export type ProcessBlockResponse = { dynamicDsCreated: boolean; operationHash: string; // Base64 encoded u8a array + reindexBlockHeight: number; }; export type WorkerStatusResponse = { diff --git a/packages/node/src/subcommands/reindex.init.ts b/packages/node/src/subcommands/reindex.init.ts index 563675c037..69bdc89d9d 100644 --- a/packages/node/src/subcommands/reindex.init.ts +++ b/packages/node/src/subcommands/reindex.init.ts @@ -13,6 +13,8 @@ export async function reindexInit(targetHeight: number): Promise { await app.init(); const reindexService = app.get(ReindexService); + + await reindexService.init(); await reindexService.reindex(targetHeight); } catch (e) { logger.error(e, 'Reindex failed to execute'); diff --git a/packages/node/src/subcommands/reindex.module.ts b/packages/node/src/subcommands/reindex.module.ts index c63f5d0e40..2978562bb0 100644 --- a/packages/node/src/subcommands/reindex.module.ts +++ b/packages/node/src/subcommands/reindex.module.ts @@ -4,11 +4,20 @@ import { Module } from '@nestjs/common'; import { DbModule, MmrService, StoreService } from '@subql/node-core'; import { ConfigureModule } from '../configure/configure.module'; +import { DsProcessorService } from '../indexer/ds-processor.service'; +import { DynamicDsService } from '../indexer/dynamic-ds.service'; import { ForceCleanService } from './forceClean.service'; import { ReindexService } from './reindex.service'; @Module({ - providers: [StoreService, ReindexService, MmrService, ForceCleanService], + providers: [ + StoreService, + ReindexService, + MmrService, + ForceCleanService, + DynamicDsService, + DsProcessorService, + ], controllers: [], }) export class ReindexFeatureModule {} diff --git a/packages/node/src/subcommands/reindex.service.ts b/packages/node/src/subcommands/reindex.service.ts index 7f670d803c..27c365c95e 100644 --- a/packages/node/src/subcommands/reindex.service.ts +++ b/packages/node/src/subcommands/reindex.service.ts @@ -13,8 +13,10 @@ import { getMetaDataInfo, } from '@subql/node-core'; import { Sequelize } from 'sequelize'; -import { SubqlProjectDs, SubqueryProject } from '../configure/SubqueryProject'; +import { SubqueryProject } from '../configure/SubqueryProject'; +import { DynamicDsService } from '../indexer/dynamic-ds.service'; import { initDbSchema } from '../utils/project'; +import { reindex } from '../utils/reindex'; import { 
ForceCleanService } from './forceClean.service'; @@ -24,8 +26,7 @@ const logger = getLogger('Reindex'); export class ReindexService { private schema: string; private metadataRepo: MetadataRepo; - private specName: string; - private startHeight: number; + constructor( private readonly sequelize: Sequelize, private readonly nodeConfig: NodeConfig, @@ -33,8 +34,21 @@ export class ReindexService { private readonly mmrService: MmrService, private readonly project: SubqueryProject, private readonly forceCleanService: ForceCleanService, + private readonly dynamicDsService: DynamicDsService, ) {} + async init(): Promise { + this.schema = await this.getExistingProjectSchema(); + + if (!this.schema) { + logger.error('Unable to locate schema'); + throw new Error('Schema does not exist.'); + } + await this.initDbSchema(); + this.metadataRepo = MetadataFactory(this.sequelize, this.schema); + this.dynamicDsService.init(this.metadataRepo); + } + private async getExistingProjectSchema(): Promise { return getExistingProjectSchema(this.nodeConfig, this.sequelize); } @@ -74,57 +88,21 @@ export class ReindexService { } async reindex(targetBlockHeight: number): Promise { - this.schema = await this.getExistingProjectSchema(); - - if (!this.schema) { - logger.error('Unable to locate schema'); - throw new Error('Schema does not exist.'); - } - await this.initDbSchema(); - - this.metadataRepo = MetadataFactory(this.sequelize, this.schema); - - this.startHeight = await this.getStartBlockFromDataSources(); - - const lastProcessedHeight = await this.getLastProcessedHeight(); - - if (!this.storeService.historical) { - logger.warn('Unable to reindex, historical state not enabled'); - return; - } - if (!lastProcessedHeight || lastProcessedHeight < targetBlockHeight) { - logger.warn( - `Skipping reindexing to block ${targetBlockHeight}: current indexing height ${lastProcessedHeight} is behind requested block`, - ); - return; - } - - // if startHeight is greater than the targetHeight, just force clean - if (targetBlockHeight < this.startHeight) { - logger.info( - `targetHeight: ${targetBlockHeight} is less than startHeight: ${this.startHeight}. 
Hence executing force-clean`, - ); - await this.forceCleanService.forceClean(); - } else { - logger.info(`Reindexing to block: ${targetBlockHeight}`); - const transaction = await this.sequelize.transaction(); - try { - await this.storeService.rewind(targetBlockHeight, transaction); - - const blockOffset = await this.getMetadataBlockOffset(); - if (blockOffset) { - await this.mmrService.deleteMmrNode( - targetBlockHeight + 1, - blockOffset, - ); - } - await transaction.commit(); - logger.info('Reindex Success'); - } catch (err) { - logger.error(err, 'Reindexing failed'); - await transaction.rollback(); - throw err; - } - } + const [startHeight, lastProcessedHeight] = await Promise.all([ + this.getStartBlockFromDataSources(), + this.getLastProcessedHeight(), + ]); + + return reindex( + startHeight, + await this.getMetadataBlockOffset(), + targetBlockHeight, + lastProcessedHeight, + this.storeService, + this.dynamicDsService, + this.mmrService, + this.sequelize, + this.forceCleanService, + ); } } diff --git a/packages/node/src/utils/reindex.ts b/packages/node/src/utils/reindex.ts new file mode 100644 index 0000000000..8b672e006c --- /dev/null +++ b/packages/node/src/utils/reindex.ts @@ -0,0 +1,63 @@ +// Copyright 2020-2022 OnFinality Limited authors & contributors +// SPDX-License-Identifier: Apache-2.0 + +import { getLogger, MmrService, StoreService } from '@subql/node-core'; +import { Sequelize } from 'sequelize'; +import { DynamicDsService } from '../indexer/dynamic-ds.service'; +import { ForceCleanService } from '../subcommands/forceClean.service'; + +const logger = getLogger('Reindex'); + +export async function reindex( + startHeight: number, + blockOffset: number | undefined, + targetBlockHeight: number, + lastProcessedHeight: number, + storeService: StoreService, + dynamicDsService: DynamicDsService, + mmrService: MmrService, + sequelize: Sequelize, + forceCleanService?: ForceCleanService, +): Promise { + if (!storeService.historical) { + logger.warn('Unable to reindex, historical state not enabled'); + return; + } + if (!lastProcessedHeight || lastProcessedHeight < targetBlockHeight) { + logger.warn( + `Skipping reindexing to block ${targetBlockHeight}: current indexing height ${lastProcessedHeight} is behind requested block`, + ); + return; + } + + // if startHeight is greater than the targetHeight, just force clean + if (targetBlockHeight < startHeight) { + logger.info( + `targetHeight: ${targetBlockHeight} is less than startHeight: ${startHeight}. 
diff --git a/packages/node/src/yargs.ts b/packages/node/src/yargs.ts
index d486dde8c0..b11fd77094 100644
--- a/packages/node/src/yargs.ts
+++ b/packages/node/src/yargs.ts
@@ -200,4 +200,10 @@ export const yargsOptions = yargs(hideBin(process.argv))
       type: 'number',
       default: 100,
     },
+    'unfinalized-blocks': {
+      demandOption: false,
+      default: false,
+      describe: 'Enable to fetch and index unfinalized blocks',
+      type: 'boolean',
+    },
   });
diff --git a/test/Dockerfile b/test/Dockerfile
index 470e51c663..4045416b5a 100644
--- a/test/Dockerfile
+++ b/test/Dockerfile
@@ -1,4 +1,4 @@
-FROM node:lts-gallium
+FROM node:lts-bullseye

 WORKDIR /workdir
 COPY . .
diff --git a/yarn.lock b/yarn.lock index 1b0765c961..03f3ca7fb3 100644 --- a/yarn.lock +++ b/yarn.lock @@ -1315,7 +1315,7 @@ __metadata: languageName: node linkType: hard -"@babel/runtime@npm:^7.18.9, @babel/runtime@npm:^7.19.4, @babel/runtime@npm:^7.8.4": +"@babel/runtime@npm:^7.18.9, @babel/runtime@npm:^7.19.4, @babel/runtime@npm:^7.20.1, @babel/runtime@npm:^7.8.4": version: 7.20.1 resolution: "@babel/runtime@npm:7.20.1" dependencies: @@ -2399,74 +2399,74 @@ __metadata: languageName: node linkType: hard -"@polkadot/api-augment@npm:9.5.2": - version: 9.5.2 - resolution: "@polkadot/api-augment@npm:9.5.2" +"@polkadot/api-augment@npm:9.7.1": + version: 9.7.1 + resolution: "@polkadot/api-augment@npm:9.7.1" dependencies: - "@babel/runtime": ^7.19.4 - "@polkadot/api-base": 9.5.2 - "@polkadot/rpc-augment": 9.5.2 - "@polkadot/types": 9.5.2 - "@polkadot/types-augment": 9.5.2 - "@polkadot/types-codec": 9.5.2 + "@babel/runtime": ^7.20.1 + "@polkadot/api-base": 9.7.1 + "@polkadot/rpc-augment": 9.7.1 + "@polkadot/types": 9.7.1 + "@polkadot/types-augment": 9.7.1 + "@polkadot/types-codec": 9.7.1 "@polkadot/util": ^10.1.11 - checksum: 19a3d96d4dd8f5c8569ff972f67b4bdcdf046844f33729d959f27338223f80da25ea44894f4ba243d911f30555e61ffa188ec89a528df73542279618487e43a0 + checksum: 405b817d26ff9fd57ac7aafcea826f98e394af0bb88f49d2966f92cfd84678dec33fd554aaf2e7cffaa549276f0aa90c0290fd9e0b28762c262643653da03aac languageName: node linkType: hard -"@polkadot/api-base@npm:9.5.2": - version: 9.5.2 - resolution: "@polkadot/api-base@npm:9.5.2" +"@polkadot/api-base@npm:9.7.1": + version: 9.7.1 + resolution: "@polkadot/api-base@npm:9.7.1" dependencies: - "@babel/runtime": ^7.19.4 - "@polkadot/rpc-core": 9.5.2 - "@polkadot/types": 9.5.2 + "@babel/runtime": ^7.20.1 + "@polkadot/rpc-core": 9.7.1 + "@polkadot/types": 9.7.1 "@polkadot/util": ^10.1.11 rxjs: ^7.5.7 - checksum: 40d6da6f86b85121680992b5ebc238be72e0c5aba18ae376bd0a6c45233e4b62d716aec70b2c84ef7b9b657576d47f5624857b8e9e5e76ee7c829cd2a3c65867 + checksum: b33902b0a20acb15064bcac26e57f833ec3bc63d447d46169409474b6920e286f54c562b06fe249213df39f5f72f8907807d4b39b2e1934bdafe847971e1d860
languageName: node linkType: hard -"@polkadot/api-derive@npm:9.5.2": - version: 9.5.2 - resolution: "@polkadot/api-derive@npm:9.5.2" +"@polkadot/api-derive@npm:9.7.1": + version: 9.7.1 + resolution: "@polkadot/api-derive@npm:9.7.1" dependencies: - "@babel/runtime": ^7.19.4 - "@polkadot/api": 9.5.2 - "@polkadot/api-augment": 9.5.2 - "@polkadot/api-base": 9.5.2 - "@polkadot/rpc-core": 9.5.2 - "@polkadot/types": 9.5.2 - "@polkadot/types-codec": 9.5.2 + "@babel/runtime": ^7.20.1 + "@polkadot/api": 9.7.1 + "@polkadot/api-augment": 9.7.1 + "@polkadot/api-base": 9.7.1 + "@polkadot/rpc-core": 9.7.1 + "@polkadot/types": 9.7.1 + "@polkadot/types-codec": 9.7.1 "@polkadot/util": ^10.1.11 "@polkadot/util-crypto": ^10.1.11 rxjs: ^7.5.7 - checksum: 65898526bc35c456442c0c274b71a40fe634583b4a20743bb77fe45dc1777ffb3cd41505d1d8897e3a46cd8602fb6ff4ba795da32b0f72768d413cc469cab8c6 + checksum: 67211478ea95d4480a66162359a4651a40bf54439eb3cc0a8f8d2b214c3137ee3e89bd2ab58264e1bbd0cbcd0d06975110da9e2ce929243e4b4b70b06dc90bb7 languageName: node linkType: hard -"@polkadot/api@npm:9.5.2": - version: 9.5.2 - resolution: "@polkadot/api@npm:9.5.2" +"@polkadot/api@npm:9.7.1": + version: 9.7.1 + resolution: "@polkadot/api@npm:9.7.1" dependencies: - "@babel/runtime": ^7.19.4 - "@polkadot/api-augment": 9.5.2 - "@polkadot/api-base": 9.5.2 - "@polkadot/api-derive": 9.5.2 + "@babel/runtime": ^7.20.1 + "@polkadot/api-augment": 9.7.1 + "@polkadot/api-base": 9.7.1 + "@polkadot/api-derive": 9.7.1 "@polkadot/keyring": ^10.1.11 - "@polkadot/rpc-augment": 9.5.2 - "@polkadot/rpc-core": 9.5.2 - "@polkadot/rpc-provider": 9.5.2 - "@polkadot/types": 9.5.2 - "@polkadot/types-augment": 9.5.2 - "@polkadot/types-codec": 9.5.2 - "@polkadot/types-create": 9.5.2 - "@polkadot/types-known": 9.5.2 + "@polkadot/rpc-augment": 9.7.1 + "@polkadot/rpc-core": 9.7.1 + "@polkadot/rpc-provider": 9.7.1 + "@polkadot/types": 9.7.1 + "@polkadot/types-augment": 9.7.1 + "@polkadot/types-codec": 9.7.1 + "@polkadot/types-create": 9.7.1 + "@polkadot/types-known": 9.7.1 "@polkadot/util": ^10.1.11 "@polkadot/util-crypto": ^10.1.11 eventemitter3: ^4.0.7 rxjs: ^7.5.7 - checksum: 2f0b33148bb0d99f040647cab4c9c48594e57c7ded97486dee7a6b8ede03e7724bcf3115f65bb080a5e540a199d31f168cfc65d95af2a743b857b642ab2bbb0f + checksum: 9e3d79117cba99fb63ef4ff6e81c7caeb226a84ed2b925f5bd224ad3302fc5e86ad57834f60d888aa56b206582eb0da1bc8e956680a3050b9111aee705103827 languageName: node linkType: hard @@ -2495,125 +2495,125 @@ __metadata: languageName: node linkType: hard -"@polkadot/rpc-augment@npm:9.5.2": - version: 9.5.2 - resolution: "@polkadot/rpc-augment@npm:9.5.2" +"@polkadot/rpc-augment@npm:9.7.1": + version: 9.7.1 + resolution: "@polkadot/rpc-augment@npm:9.7.1" dependencies: - "@babel/runtime": ^7.19.4 - "@polkadot/rpc-core": 9.5.2 - "@polkadot/types": 9.5.2 - "@polkadot/types-codec": 9.5.2 + "@babel/runtime": ^7.20.1 + "@polkadot/rpc-core": 9.7.1 + "@polkadot/types": 9.7.1 + "@polkadot/types-codec": 9.7.1 "@polkadot/util": ^10.1.11 - checksum: 2045d0943cbd0b91b06c698551c5b698b4c66cbe58bd7c21e42831ac4208e6d69a98cc1a4714584598c5907898c439b8ecf815a1dc5f3569fe9c236990342ce7 + checksum: 6d1801ca8d02252bc84c78e73f2d66cbd81ee4121415e90f2bf9f31ea4c705fdf59df49177310593f6981c13114935103ea8894a9b4235285c9552ebf8e24186 languageName: node linkType: hard -"@polkadot/rpc-core@npm:9.5.2": - version: 9.5.2 - resolution: "@polkadot/rpc-core@npm:9.5.2" +"@polkadot/rpc-core@npm:9.7.1": + version: 9.7.1 + resolution: "@polkadot/rpc-core@npm:9.7.1" dependencies: - "@babel/runtime": ^7.19.4 - 
"@polkadot/rpc-augment": 9.5.2 - "@polkadot/rpc-provider": 9.5.2 - "@polkadot/types": 9.5.2 + "@babel/runtime": ^7.20.1 + "@polkadot/rpc-augment": 9.7.1 + "@polkadot/rpc-provider": 9.7.1 + "@polkadot/types": 9.7.1 "@polkadot/util": ^10.1.11 rxjs: ^7.5.7 - checksum: ec72c886cee7c2466aa513a6f792ccf1f558eac41d9a48bd6d85ab5a6a961ee807f12eafdbbe2bb8d904e6908e30502acf51975702e8f88c40e84c1f2cf07f56 + checksum: 143dfd3194a30ed31f7aad0f8d30993abb224db4aa9be5ba73303b59a733df11cce8dded04d85ea6c2ff99bd56381ebebc200882c3de6b2032f4603bfa1fd728 languageName: node linkType: hard -"@polkadot/rpc-provider@npm:9.5.2": - version: 9.5.2 - resolution: "@polkadot/rpc-provider@npm:9.5.2" +"@polkadot/rpc-provider@npm:9.7.1": + version: 9.7.1 + resolution: "@polkadot/rpc-provider@npm:9.7.1" dependencies: - "@babel/runtime": ^7.19.4 + "@babel/runtime": ^7.20.1 "@polkadot/keyring": ^10.1.11 - "@polkadot/types": 9.5.2 - "@polkadot/types-support": 9.5.2 + "@polkadot/types": 9.7.1 + "@polkadot/types-support": 9.7.1 "@polkadot/util": ^10.1.11 "@polkadot/util-crypto": ^10.1.11 "@polkadot/x-fetch": ^10.1.11 "@polkadot/x-global": ^10.1.11 "@polkadot/x-ws": ^10.1.11 - "@substrate/connect": 0.7.14 + "@substrate/connect": 0.7.16 eventemitter3: ^4.0.7 mock-socket: ^9.1.5 nock: ^13.2.9 - checksum: 6201eb0f9da6afac2a794f8569ffca187368b148eaddcd2174ff629768480a82b441db393e7a755cd6cbad1348287aaadc7bec7d0d1fbaa643a0f758c6a2da84 + checksum: 8cfca534b181f2f03f90770a09d58d135a373fcb616c5b28703e5026e1b5a52afd32e25bbd46c33e424c50c3458cc9b02a92f7d6b68347412fbb3ae6803ad2fa languageName: node linkType: hard -"@polkadot/types-augment@npm:9.5.2": - version: 9.5.2 - resolution: "@polkadot/types-augment@npm:9.5.2" +"@polkadot/types-augment@npm:9.7.1": + version: 9.7.1 + resolution: "@polkadot/types-augment@npm:9.7.1" dependencies: - "@babel/runtime": ^7.19.4 - "@polkadot/types": 9.5.2 - "@polkadot/types-codec": 9.5.2 + "@babel/runtime": ^7.20.1 + "@polkadot/types": 9.7.1 + "@polkadot/types-codec": 9.7.1 "@polkadot/util": ^10.1.11 - checksum: ee7eecda29fdc3f672c0cc3a38e00e9bbe7d13128a5b3e6df4cafe0bb0c831c615e48a7d64464d462fd0bad849dc580a9a671237e3febd0996877f1f70b0db36 + checksum: 761371937dbd31ed984ff65abeb0f6ced46f92719c75742384b23731c20223215eba99e30697b2470423990956836f17fd6a292e86d270e901c69cdf284d6596 languageName: node linkType: hard -"@polkadot/types-codec@npm:9.5.2": - version: 9.5.2 - resolution: "@polkadot/types-codec@npm:9.5.2" +"@polkadot/types-codec@npm:9.7.1": + version: 9.7.1 + resolution: "@polkadot/types-codec@npm:9.7.1" dependencies: - "@babel/runtime": ^7.19.4 + "@babel/runtime": ^7.20.1 "@polkadot/util": ^10.1.11 "@polkadot/x-bigint": ^10.1.11 - checksum: fafe3ab7c8ae2902c1df49393fdb716c7acf6c97c133c6c6bf8c97997430384f443acb4f378171fb3c34fe48fb49f1b1606ab078a6582ed1a0c25121c480c41e + checksum: bf77ab20a692eb174328dd79ea18c68f9b2199afadf6d58e0807bfaeebbf07477a902aac83d1b4f84f2464c05218749dcdc1d9bfb348290d2208555a81276d22 languageName: node linkType: hard -"@polkadot/types-create@npm:9.5.2": - version: 9.5.2 - resolution: "@polkadot/types-create@npm:9.5.2" +"@polkadot/types-create@npm:9.7.1": + version: 9.7.1 + resolution: "@polkadot/types-create@npm:9.7.1" dependencies: - "@babel/runtime": ^7.19.4 - "@polkadot/types-codec": 9.5.2 + "@babel/runtime": ^7.20.1 + "@polkadot/types-codec": 9.7.1 "@polkadot/util": ^10.1.11 - checksum: ee53d0ed7172d48b5e3982d84b47aa77b0b02f66fd415dd1f9cfbcd9dd9af2246ef4c4d9f5d9dbf8601d76ceb70c29c08abdcf082e62f05bf3648977b336bcc9 + checksum: 
78396d9a7bb8f6fe55fccef74bf4a8f38971e06af06c26e35216b2e6c191df400d795972c93914a0043f813e52ff71b94f89ebfff9b46c68ce0cff03593b9cee languageName: node linkType: hard -"@polkadot/types-known@npm:9.5.2": - version: 9.5.2 - resolution: "@polkadot/types-known@npm:9.5.2" +"@polkadot/types-known@npm:9.7.1": + version: 9.7.1 + resolution: "@polkadot/types-known@npm:9.7.1" dependencies: - "@babel/runtime": ^7.19.4 + "@babel/runtime": ^7.20.1 "@polkadot/networks": ^10.1.11 - "@polkadot/types": 9.5.2 - "@polkadot/types-codec": 9.5.2 - "@polkadot/types-create": 9.5.2 + "@polkadot/types": 9.7.1 + "@polkadot/types-codec": 9.7.1 + "@polkadot/types-create": 9.7.1 "@polkadot/util": ^10.1.11 - checksum: 3fd31a630f879675231424c6cf228d8a3fc711f2cb13e1ed8795d43d0ed43e0b873a028ccff27cfb93969eba63b8811499527edcb0b7420a46417bae827f82b5 + checksum: 7b523464a49c025434594d40875c26f47e32dfb3801d1f9b4828511805018dcc7696016d9cfd7ebc7dd96f46406935de3f76ea5dd3e1a1bbcaf75978c43c234a languageName: node linkType: hard -"@polkadot/types-support@npm:9.5.2": - version: 9.5.2 - resolution: "@polkadot/types-support@npm:9.5.2" +"@polkadot/types-support@npm:9.7.1": + version: 9.7.1 + resolution: "@polkadot/types-support@npm:9.7.1" dependencies: - "@babel/runtime": ^7.19.4 + "@babel/runtime": ^7.20.1 "@polkadot/util": ^10.1.11 - checksum: 8b1c37ab88533b23455f8e45048ec8d2b84cebe695534cc7c45aa19691dd60da921ef87e375ae028c5982fd18176060a59ff53cdaed84d7cea79312022b2e605 + checksum: 1020a53d8bda812ebeba797bc4615cffae5713acc8dc9722de3cb49f803d749a81a5bd041ebc321f229ba95d45b75fe838aa2bc049802e2808d15982ce5b514b languageName: node linkType: hard -"@polkadot/types@npm:9.5.2": - version: 9.5.2 - resolution: "@polkadot/types@npm:9.5.2" +"@polkadot/types@npm:9.7.1": + version: 9.7.1 + resolution: "@polkadot/types@npm:9.7.1" dependencies: - "@babel/runtime": ^7.19.4 + "@babel/runtime": ^7.20.1 "@polkadot/keyring": ^10.1.11 - "@polkadot/types-augment": 9.5.2 - "@polkadot/types-codec": 9.5.2 - "@polkadot/types-create": 9.5.2 + "@polkadot/types-augment": 9.7.1 + "@polkadot/types-codec": 9.7.1 + "@polkadot/types-create": 9.7.1 "@polkadot/util": ^10.1.11 "@polkadot/util-crypto": ^10.1.11 rxjs: ^7.5.7 - checksum: 1cd604e2db92292248b7061ebc185c7215a212e2875c96e8c0176e970ea8f208f86790386621a1ceab539cb7b1d74e3eb1c7ac2b653a062abec3798bbf66f149 + checksum: 345086883244f82196e7727b1a763b0c0cfc1197173f6f1f5fc6d760d55982bfed3dba3531001de79ab16f0edee78bf0f1c43cad61f64296029f053682e424ff languageName: node linkType: hard @@ -2958,9 +2958,9 @@ __metadata: languageName: node linkType: hard -"@subql/node-core@npm:1.3.3": - version: 1.3.3 - resolution: "@subql/node-core@npm:1.3.3" +"@subql/node-core@npm:1.4.2-0": + version: 1.4.2-0 + resolution: "@subql/node-core@npm:1.4.2-0" dependencies: "@nestjs/common": ^8.2.6 "@nestjs/event-emitter": ^1.3.0 @@ -2976,7 +2976,7 @@ __metadata: sequelize: 6.23.0 vm2: ^3.9.9 yargs: ^16.2.0 - checksum: ceca0574732c236c69223488047c2d16825f9d47e1d173947bf90b36471efd1a2b670aa7200e8955f573d8cc6f1f4319c8e0cd52b83e6f3e91622c74c3cf11ce + checksum: d5723394b07313fdf8496e6bd2a986992b0b370ba108f83ab7274a4e48243cddb28526e02af4d9d99b2ac5a12f381fb8bb0cd30b88ab2f2ef417c12b668bc0b5 languageName: node linkType: hard @@ -3015,7 +3015,7 @@ __metadata: "@nestjs/testing": ^8.2.6 "@subql/common": latest "@subql/common-ethereum": "workspace:*" - "@subql/node-core": 1.3.3 + "@subql/node-core": 1.4.2-0 "@subql/types-ethereum": "workspace:*" "@subql/utils": latest "@subql/x-merkle-mountain-range": 2.0.0-0.1.2 @@ -3129,24 +3129,24 @@ __metadata: languageName: 
node linkType: hard -"@substrate/connect@npm:0.7.14": - version: 0.7.14 - resolution: "@substrate/connect@npm:0.7.14" +"@substrate/connect@npm:0.7.16": + version: 0.7.16 + resolution: "@substrate/connect@npm:0.7.16" dependencies: "@substrate/connect-extension-protocol": ^1.0.1 - "@substrate/smoldot-light": 0.6.34 + "@substrate/smoldot-light": 0.7.5 eventemitter3: ^4.0.7 - checksum: b125d3f12021e570f11d7fcb59b1f4ea020884b4228c3675349e8b6ed28abcbdc404f54e43a64e6358035b57e79bb6fe0b8717c7bb2b1a610f6a4bd6f84fc72e + checksum: ad0c95b6f823f6586a5109c0edf513a4a88d2e75d9d85799d841b570aebf557f7713ca8c45e24aa29a59068f018e9c1e619beae6d7c066c02ffd813da1bf187d languageName: node linkType: hard -"@substrate/smoldot-light@npm:0.6.34": - version: 0.6.34 - resolution: "@substrate/smoldot-light@npm:0.6.34" +"@substrate/smoldot-light@npm:0.7.5": + version: 0.7.5 + resolution: "@substrate/smoldot-light@npm:0.7.5" dependencies: pako: ^2.0.4 ws: ^8.8.1 - checksum: 123c93dc8968f8efc68ce68c22fca306fc21c57f9f9f239690e1509ff7cbe5051b5e130de1683e4a9605262a83f6430a5f466ead3ff4497051e16b7dafaad20e + checksum: db059391c5e72bf4100e8d471ef20b9b0e2e4fc9eb285ba6ff64a3d2670bc998ab0e2c1fcc7fd1f5fab81035968db0a813c05b800eee01fd318cf92943f06963 languageName: node linkType: hard
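Returning to the 'unfinalized-blocks' option added to packages/node/src/yargs.ts above: a minimal sketch of how the parsed flag could be read, assuming yargs 16-style synchronous parsing (how the value is actually threaded into the node's configuration is outside this patch):

// Illustrative only; mirrors the option shape added in yargs.ts.
import yargs from 'yargs';
import { hideBin } from 'yargs/helpers';

const argv = yargs(hideBin(process.argv)).options({
  'unfinalized-blocks': {
    demandOption: false,
    default: false,
    describe: 'Enable to fetch and index unfinalized blocks',
    type: 'boolean',
  },
}).argv;

if (argv['unfinalized-blocks']) {
  // Unfinalized block indexing was requested on the CLI (defaults to false).
}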