Merge v1.4.20 into 1.5 #3908

Merged: 29 commits, Nov 13, 2023

Commits

f65322e  Bumped dependencies (#3857)  (phearnot, Jun 27, 2023)
b0fad2a  NODE-2600 Optionally override a blockchain state in /utils/script/eva…  (vsuharnikov, Jul 5, 2023)
25b60c3  NODE-2588 Qase.io integration (#3845)  (DrBlast, Jul 6, 2023)
e651239  NODE-2578 GetBlockUpdate tests (#3852)  (Kseolis, Jul 6, 2023)
51616ba  NODE-2336: Disallow FairPoSCalculator base target < 1 (#3523)  (Karasiq, Jul 6, 2023)
961e643  NODE-2476 Check asset id field size (#3716)  (ivan-mashonskiy, Jul 7, 2023)
b773589  Common isWellFormed() implementation (#3829)  (xrtm000, Jul 17, 2023)
ee95b92  Pull newer images during docker build (#3860)  (phearnot, Jul 17, 2023)
9466846  NODE-2598 Added RIDE test ids (#3856)  (Kseolis, Jul 18, 2023)
ce57e96  NODE-2599 Public key in AccountScriptInfo (#3862)  (vsuharnikov, Jul 25, 2023)
c237de9  Fixed sync call transfers validation for /utils/script/evaluate (#3863)  (xrtm000, Aug 7, 2023)
380bf96  NODE-2591 Tuple type comments parse error (#3867)  (xrtm000, Aug 15, 2023)
df946ac  NODE-2602 Remove script and Ethereum transaction size checks from UTX…  (ivan-mashonskiy, Aug 16, 2023)
e6bb22a  Fix liquid block replacing with better one (#3864)  (ivan-mashonskiy, Aug 29, 2023)
eecfa39  NODE-2601 Corrected balance snapshots (#3871)  (xrtm000, Sep 14, 2023)
e02c6da  NODE-2603 Escaping characters in decompiler (#3873)  (xrtm000, Sep 25, 2023)
7a7bae1  NODE-2592 Fix permissions change on upgrade (#3877)  (ivan-mashonskiy, Sep 25, 2023)
ef713bf  NODE-2615 Add base target field to /debug/stateHash (#3894)  (ivan-mashonskiy, Oct 13, 2023)
9c49518  NODE-2449 Run Ride on any environment (#3869)  (vsuharnikov, Oct 13, 2023)
2da7fe2  NODE-2604 New zwaves library (#3895)  (vsuharnikov, Oct 17, 2023)
e311af4  NODE-2619 Feature activation fixes (#3900)  (vsuharnikov, Oct 20, 2023)
d6f9f8f  Broadcast only last block after apply (#3878)  (ivan-mashonskiy, Oct 20, 2023)
586e0c2  Build system cleanup (#3901)  (phearnot, Oct 23, 2023)
82f19d8  Version 1.4.19 (Mainnet + Testnet) (#3902)  (phearnot, Oct 23, 2023)
89f9c51  NODE-2624 /utils/script/evaluate: do not parse bytes in expr (#3907)  (vsuharnikov, Oct 26, 2023)
069e969  NODE-2625 Fixed GRPC Reflection API (#3910)  (xrtm000, Nov 7, 2023)
7e02801  NODE-2628 Use valid default address in /utils/evaluate (#3913)  (phearnot, Nov 9, 2023)
40f15b2  Version 1.4.20 (Mainnet + Testnet) (#3914)  (phearnot, Nov 9, 2023)
e2c1e7b  Merge tag 'v1.4.20' into 1.4.20-into-1.5.x  (phearnot, Nov 9, 2023)
9 changes: 8 additions & 1 deletion .github/workflows/check-pr.yaml
@@ -27,7 +27,14 @@ jobs:
path: ~/.cache/coursier
key: coursier-cache
- name: Check PR
run: sbt --mem 6144 --batch checkPR
run: sbt --mem 6144 --batch ";checkPR;completeQaseRun"
env:
QASE_ENABLE: true
QASE_RUN_NAME: checkPR
QASE_RUN_ID: 1
QASE_PROJECT_CODE: PR
QASE_API_TOKEN: ${{ secrets.QASE_API_TOKEN }}
CHECKPR_RUN_ID: ${{ github.run_id }}
- uses: dorny/paths-filter@v2
id: filter
with:
22 changes: 22 additions & 0 deletions .github/workflows/publish-docker-node.yaml
@@ -64,12 +64,24 @@ jobs:
type=ref,event=tag
type=raw,value=latest,enable=${{ github.event.release.prerelease == false }}

- name: Extract Docker RIDE runner metadata
id: meta-ride-runner
uses: docker/metadata-action@v4
with:
images: wavesplatform/ride-runner
flavor: |
latest=false
tags: |
type=match,pattern=v(.*),group=1
type=raw,value=latest,enable=${{ github.event.release.prerelease == false }}

- name: Build and push Docker public image
id: build-and-push-public
uses: docker/build-push-action@v3
with:
context: ./docker
push: true
pull: true
tags: ${{ steps.meta-public.outputs.tags }}

- name: Build and push Docker private image
@@ -78,6 +90,16 @@
with:
context: ./docker/private
push: true
pull: true
tags: ${{ steps.meta-private.outputs.tags }}
build-args: |
NODE_TAG=${{ steps.meta-public.outputs.version }}

- name: Build and push Docker RIDE runner image
id: build-and-push-ride-runner
uses: docker/build-push-action@v3
with:
context: ./ride-runner/docker
push: true
pull: true
tags: ${{ steps.meta-ride-runner.outputs.tags }}
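
The new meta-ride-runner step reuses the same tagging rules as the node images: `type=match,pattern=v(.*),group=1` turns a release tag such as v1.4.20 into the bare version for the wavesplatform/ride-runner image, and `latest` is only applied when the release is not a prerelease. A small illustrative sketch (not part of the workflow) of what that regex rule does to a tag:

// Illustrative only: the mapping performed by `type=match,pattern=v(.*),group=1`,
// written as a plain Scala regex so the effect is easy to see.
val tagPattern = "v(.*)".r

def imageTag(gitTag: String): Option[String] = gitTag match {
  case tagPattern(version) => Some(version) // "v1.4.20" -> "1.4.20"
  case _                   => None          // non-matching refs get no version tag
}

assert(imageTag("v1.4.20").contains("1.4.20"))
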
1 change: 1 addition & 0 deletions .github/workflows/publish-node-sbt-builder.yml
@@ -23,6 +23,7 @@ jobs:
context: ./docker
file: ./docker/node-sbt-builder.Dockerfile
push: true
pull: true
tags: wavesplatform/node-sbt-builder:${{ steps.extract-versions.outputs.waves-version }}
build-args: |
WAVES_VERSION=${{ steps.extract-versions.outputs.waves-version }}
1 change: 1 addition & 0 deletions .gitignore
@@ -36,6 +36,7 @@ docker/temp

package
!src/package
!ride-runner/src/package
native

!lang/jvm/lib/*.jar
@@ -1,7 +1,5 @@
package com.wavesplatform.state

import java.io.File

import com.wavesplatform.Application
import com.wavesplatform.account.AddressScheme
import com.wavesplatform.common.state.ByteStr
@@ -13,6 +11,8 @@ import com.wavesplatform.utils.ScorexLogging
import monix.eval.Coeval
import org.openjdk.jmh.annotations.{Param, Scope, State, TearDown}

import java.io.File

@State(Scope.Benchmark)
abstract class DBState extends ScorexLogging {
@Param(Array("waves.conf"))
@@ -32,7 +32,7 @@ abstract class DBState extends ScorexLogging {

AddressScheme.current = new AddressScheme { override val chainId: Byte = 'W' }

lazy val environment = new WavesEnvironment(
lazy val environment = WavesEnvironment(
AddressScheme.current.chainId,
Coeval.raiseError(new NotImplementedError("`tx` is not implemented")),
Coeval(rocksDBWriter.height),
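
Both this benchmark and WavesEnvironmentBenchmark further down now build the environment as WavesEnvironment(...) rather than new WavesEnvironment(...). In Scala that call-site change usually means the class became a case class or gained a companion apply; a simplified, hypothetical sketch of that refactoring (not the actual Waves code, whose constructor takes many more parameters):

// Hypothetical illustration of dropping `new` at call sites.
class OldEnvironment(val chainId: Byte)          // before: a plain class, callers must write `new`
final case class WavesEnvironment(chainId: Byte) // after: companion apply, no `new` needed

val before = new OldEnvironment('W'.toByte)
val after  = WavesEnvironment('W'.toByte)        // matches the updated call sites in this diff
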
70 changes: 55 additions & 15 deletions benchmark/src/test/scala/com/wavesplatform/lang/v1/DataFuncs.scala
@@ -1,31 +1,69 @@
package com.wavesplatform.lang.v1

import java.util.concurrent.TimeUnit

import com.wavesplatform.common.utils._
import com.wavesplatform.lang.v1.DataFuncs._
import com.esaulpaugh.headlong.util.FastHex
import com.sun.org.apache.xerces.internal.impl.dv.util.HexBin
import com.wavesplatform.common.utils.*
import com.wavesplatform.lang.v1.DataFuncs.*
import com.wavesplatform.lang.v1.EnvironmentFunctionsBenchmark.randomBytes
import org.openjdk.jmh.annotations._
import org.openjdk.jmh.annotations.*
import org.openjdk.jmh.infra.Blackhole

import java.util.concurrent.TimeUnit

@OutputTimeUnit(TimeUnit.MICROSECONDS)
@BenchmarkMode(Array(Mode.AverageTime))
@Threads(1)
@Fork(1)
@Warmup(iterations = 30)
@Measurement(iterations = 30)
@Warmup(iterations = 10, time = 1)
@Measurement(iterations = 10, time = 1)
class DataFuncs {
@Benchmark
def decode64_35Kb(st: StrSt35K, bh: Blackhole): Unit =
bh.consume(Base64.decode(st.message))

@Benchmark
def decode16_32kb_bcprov(st: StrSt32K, bh: Blackhole): Unit =
bh.consume(org.bouncycastle.util.encoders.Hex.decode(st.message))

@Benchmark
def decode16_32kb_guava(st: StrSt32K, bh: Blackhole): Unit =
bh.consume(com.google.common.io.BaseEncoding.base16.decode(st.message))

@Benchmark
def decode16_32kb_commons_codec(st: StrSt32K, bh: Blackhole): Unit =
bh.consume(org.apache.commons.codec.binary.Hex.decodeHex(st.message))

@Benchmark
def decode16_32kb_web3j(st: StrSt32K, bh: Blackhole): Unit =
bh.consume(org.web3j.utils.Numeric.hexStringToByteArray(st.message))

@Benchmark
def decode16_32kb_headlong(st: StrSt32K, bh: Blackhole): Unit =
bh.consume(FastHex.decode(st.message))

@Benchmark
def decode16_32kb_jdk_hexbin(st: StrSt105K, bh: Blackhole): Unit =
bh.consume(HexBin.decode(st.message))

@Benchmark
def decode64_70Kb(st: StrSt70K, bh: Blackhole): Unit =
bh.consume(Base64.decode(st.message))

@Benchmark
def decode64_105Kb(st: StrSt105K, bh: Blackhole): Unit =
bh.consume(Base64.decode(st.message))
def decode64_105Kb_jdk(st: StrSt105K, bh: Blackhole): Unit =
bh.consume(java.util.Base64.getDecoder.decode(st.message))

@Benchmark
def decode64_105Kb_bcprov(st: StrSt105K, bh: Blackhole): Unit =
bh.consume(org.bouncycastle.util.encoders.Base64.decode(st.message))

@Benchmark
def decode64_105Kb_guava(st: StrSt105K, bh: Blackhole): Unit =
bh.consume(com.google.common.io.BaseEncoding.base64().decode(st.message))

@Benchmark
def decode64_105Kb_commons_codec(st: StrSt105K, bh: Blackhole): Unit =
bh.consume(org.apache.commons.codec.binary.Base64.decodeBase64(st.message))

@Benchmark
def decode64_140Kb(st: StrSt140K, bh: Blackhole): Unit =
@@ -95,7 +133,6 @@ class DataFuncs {
def concatr_175Kb(st: StrSt175K, bh: Blackhole): Unit =
bh.consume("q" ++ st.message)


@Benchmark
def decode58_16b(st: StrSt16b, bh: Blackhole): Unit =
bh.consume(Base58.decode(st.message))
@@ -144,23 +181,26 @@
def encode58_896b(st: StrSt896b, bh: Blackhole): Unit =
bh.consume(Base58.encode(st.bmessage))


}

object DataFuncs {
@State(Scope.Benchmark)
class StrSt8K extends StrSt(8)
@State(Scope.Benchmark)
class StrSt35K extends StrSt(35)
@State(Scope.Benchmark)
class StrSt70K extends StrSt(70)
@State(Scope.Benchmark)
class StrSt105K extends StrSt(105)
@State(Scope.Benchmark)
class StrSt32K extends StrSt(32)
@State(Scope.Benchmark)
class StrSt140K extends StrSt(140)
@State(Scope.Benchmark)
class StrSt175K extends StrSt(175)

class StrSt(size: Int) {
val message = "B" * (size * 1024)
val message = "B" * (size * 1024)
}

@State(Scope.Benchmark)
@@ -177,8 +217,8 @@ object DataFuncs {
class StrSt896b extends StrStS(896)

class StrStS(size: Int) {
val message = "B" * size
val bmessage = randomBytes(size)
val message = "B" * size
val bmessage = randomBytes(size)
}

@State(Scope.Benchmark)
@@ -193,6 +233,6 @@
class BinSt130K extends BinSt(130)

class BinSt(size: Int) {
val message = randomBytes(size * 1024)
val message = randomBytes(size * 1024)
}
}
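
The new decode16_* benchmarks above put several hex decoders (Bouncy Castle, Guava, Commons Codec, web3j, headlong's FastHex, and the JDK-internal HexBin) side by side on large string inputs, next to the existing and new Base64 variants. As a quick, illustrative companion (not part of the PR), the same libraries can be cross-checked for agreement before trusting the timing numbers; the input size and object name below are assumptions for the example:

// Illustrative cross-check: decode one upper-case hex string with several of the
// libraries benchmarked above and require that they produce identical bytes.
import java.util.Arrays

object HexDecodersAgree extends App {
  val message = "AB" * 1024 // 2 KiB of upper-case hex text (Guava's base16 expects upper case)

  val decoded: Seq[Array[Byte]] = Seq(
    org.bouncycastle.util.encoders.Hex.decode(message),
    com.google.common.io.BaseEncoding.base16.decode(message),
    org.apache.commons.codec.binary.Hex.decodeHex(message.toCharArray),
    org.web3j.utils.Numeric.hexStringToByteArray(message)
  )

  require(decoded.forall(b => Arrays.equals(b, decoded.head)), "decoders disagree")
  println(s"All decoders agree on ${decoded.head.length} bytes")
}
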
@@ -34,8 +34,8 @@ import scala.util.Random
@BenchmarkMode(Array(Mode.AverageTime))
@Threads(1)
@Fork(1)
@Warmup(iterations = 10)
@Measurement(iterations = 10)
@Warmup(iterations = 10, time = 1)
@Measurement(iterations = 10, time = 1)
class EnvironmentFunctionsBenchmark {

@Benchmark
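
The only change to EnvironmentFunctionsBenchmark is the same warm-up/measurement tuning applied to DataFuncs: `time = 1` caps each iteration at one second (for these JMH annotations the time unit defaults to seconds), instead of relying on JMH's longer default iteration length. A minimal, self-contained sketch of that configuration (illustrative class name, not from the diff):

import java.util.concurrent.TimeUnit

import org.openjdk.jmh.annotations.*
import org.openjdk.jmh.infra.Blackhole

// Ten one-second warm-up iterations and ten one-second measurement iterations.
@BenchmarkMode(Array(Mode.AverageTime))
@OutputTimeUnit(TimeUnit.MICROSECONDS)
@Threads(1)
@Fork(1)
@Warmup(iterations = 10, time = 1)
@Measurement(iterations = 10, time = 1)
class IterationTimeExample {
  @Benchmark
  def baseline(bh: Blackhole): Unit = bh.consume(System.nanoTime())
}
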
@@ -7,7 +7,6 @@ import java.util.function.Consumer

import com.google.common.primitives.{Ints, UnsignedBytes}
import com.typesafe.config.ConfigFactory
import com.wavesplatform.common.ByteStrComparator
import com.wavesplatform.common.state.ByteStr
import com.wavesplatform.database.RDB
import com.wavesplatform.settings.{WavesSettings, loadConfig}
@@ -19,6 +18,7 @@ import org.eclipse.collections.impl.utility.MapIterate
import org.openjdk.jmh.annotations.*
import org.openjdk.jmh.infra.Blackhole
import org.rocksdb.{WriteBatch, WriteOptions}
import com.wavesplatform.utils.byteStrOrdering

import scala.util.Random

@@ -123,7 +123,7 @@ object RocksDBWriteBatchBenchmark {
}

object SortedBatch {
val byteStrComparator: Comparator[ByteStr] = (o1: ByteStr, o2: ByteStr) => ByteStrComparator.compare(o1, o2)
val byteStrComparator: Comparator[ByteStr] = (o1: ByteStr, o2: ByteStr) => byteStrOrdering.compare(o1, o2)
}

object ByteArrayHashingStrategy extends HashingStrategy[Array[Byte]] {
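
In SortedBatch the comparator now delegates to byteStrOrdering from com.wavesplatform.utils (imported above) instead of the removed ByteStrComparator. For a sorted write batch this typically needs to be an unsigned, byte-wise lexicographic comparison so that keys sort the way RocksDB's default comparator orders raw bytes. A self-contained sketch using Guava's UnsignedBytes, which this file already imports; the real byteStrOrdering may be implemented differently:

import java.util.Comparator

import com.google.common.primitives.UnsignedBytes

// Local stand-in so the example compiles on its own; the real ByteStr lives in
// com.wavesplatform.common.state.
final case class ByteStr(arr: Array[Byte])

// Unsigned lexicographic comparison over the wrapped bytes.
val byteStrComparator: Comparator[ByteStr] =
  (o1: ByteStr, o2: ByteStr) => UnsignedBytes.lexicographicalComparator().compare(o1.arr, o2.arr)
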
@@ -1,8 +1,5 @@
package com.wavesplatform.state

import java.io.File
import java.util.concurrent.{ThreadLocalRandom, TimeUnit}

import cats.Id
import com.typesafe.config.ConfigFactory
import com.wavesplatform.account.{AddressOrAlias, AddressScheme, Alias}
@@ -21,6 +18,8 @@ import org.openjdk.jmh.annotations.*
import org.openjdk.jmh.infra.Blackhole
import scodec.bits.BitVector

import java.io.File
import java.util.concurrent.{ThreadLocalRandom, TimeUnit}
import scala.io.Codec

/** Tests over real database. How to test:
@@ -137,7 +136,7 @@ object WavesEnvironmentBenchmark {

val environment: Environment[Id] = {
val state = new RocksDBWriter(rdb, wavesSettings.blockchainSettings, wavesSettings.dbSettings, wavesSettings.enableLightMode)
new WavesEnvironment(
WavesEnvironment(
AddressScheme.current.chainId,
Coeval.raiseError(new NotImplementedError("`tx` is not implemented")),
Coeval(state.height),
@@ -123,7 +123,7 @@ class WavesEnvironmentRebenchmark {
@Benchmark
def biggestDataEntries(bh: Blackhole, st: St): Unit = {
val address = Recipient.Address(
ByteStr(Address.fromString("3PFfUN4dRAyMN4nxYayES1CRZHJjS8JVCHf").explicitGet().bytes)
ByteStr(Address.fromString("3PFfUN4dRAyMN4nxYayES1CRZHJjS8JVCHf", None).explicitGet().bytes)
)
val checkBinaryOrString = Random.nextBoolean()
if (checkBinaryOrString) {
@@ -139,7 +139,7 @@ object WavesEnvironmentRebenchmark {
lazy val allAliases: Vector[Alias] = {
val builder = Vector.newBuilder[Alias]
rdb.db.iterateOver(KeyTags.AddressIdOfAlias) { e =>
builder += Alias.fromBytes(e.getKey.drop(2)).explicitGet()
builder += Alias.fromBytes(e.getKey.drop(2), None).explicitGet()
}
builder.result()
}
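
Address.fromString and Alias.fromBytes are now called with an extra Option argument (None at both call sites). One plausible reading, sketched with simplified stand-ins rather than the real com.wavesplatform.account signatures, is an optional expected chain id to validate the parsed value against, with None meaning "accept whatever chain id it carries":

// Hypothetical stand-ins for illustration only; the actual Waves signatures may differ.
final case class Address(chainId: Byte, publicKeyHash: Array[Byte])

def checkChainId(parsed: Address, expected: Option[Byte]): Either[String, Address] =
  expected match {
    case Some(id) if id != parsed.chainId =>
      Left(s"address chain id ${parsed.chainId} does not match expected $id")
    case _ =>
      Right(parsed) // None: no check, as in the benchmark call sites above
  }
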