range check #131

Closed
wants to merge 66 commits into from
66 commits
b087893
range check without optimizations and random test
tnyuzg Aug 3, 2024
4556038
fmt
tnyuzg Aug 3, 2024
bd4d2f3
add random test
tnyuzg Aug 4, 2024
fdfc8df
fmt
tnyuzg Aug 4, 2024
2c3ad2e
clippy
tnyuzg Aug 4, 2024
c64fa7f
comments
tnyuzg Aug 7, 2024
5e17f98
record
tnyuzg Aug 15, 2024
600de46
backup
tnyuzg Aug 16, 2024
27fe621
logup
tnyuzg Aug 17, 2024
2742aae
fmt
tnyuzg Aug 17, 2024
44c755a
fmt
tnyuzg Aug 17, 2024
35a8668
comment
tnyuzg Aug 23, 2024
8a3e155
lookup num can be arbitrary integer
tnyuzg Aug 27, 2024
9148907
fiat shamir randomness
tnyuzg Aug 27, 2024
4976f91
enable more actions for check (#133)
serendipity-crypto Aug 5, 2024
382afad
move batch_inverse to util
tnyuzg Aug 28, 2024
ffdd477
reconstruct BitDecomposition
f7ed Sep 3, 2024
6e501a2
finish bit decomp
f7ed Sep 3, 2024
b20c967
unify transcript in FF and EF
f7ed Sep 4, 2024
d12f052
snarks for bit decomposition
f7ed Sep 4, 2024
89d1cec
reconstruct addition in Zq
f7ed Sep 4, 2024
ed8792d
add snarks for addition in zq
f7ed Sep 4, 2024
a026810
general lookup
tnyuzg Sep 4, 2024
1ea06ff
finish iop for ntt
f7ed Sep 5, 2024
375edbc
snarks for ntt
f7ed Sep 5, 2024
370ed2b
add EF for RoundIOP
f7ed Sep 6, 2024
e1d8668
snarks for round
f7ed Sep 6, 2024
ffc55c0
range check without optimizations and random test
tnyuzg Aug 3, 2024
5982049
fmt
tnyuzg Aug 3, 2024
e229c51
add random test
tnyuzg Aug 4, 2024
d15f827
fmt
tnyuzg Aug 4, 2024
afdd8f2
clippy
tnyuzg Aug 4, 2024
680d0d8
comments
tnyuzg Aug 7, 2024
c4340e9
record
tnyuzg Aug 15, 2024
70dde2f
backup
tnyuzg Aug 16, 2024
1e01b90
logup
tnyuzg Aug 17, 2024
d95f228
fmt
tnyuzg Aug 17, 2024
7495649
fmt
tnyuzg Aug 17, 2024
378b0b8
comment
tnyuzg Aug 23, 2024
57575d8
lookup num can be arbitrary integer
tnyuzg Aug 27, 2024
20964aa
fiat shamir randomness
tnyuzg Aug 27, 2024
9de6042
move batch_inverse to util
tnyuzg Aug 28, 2024
b98e9ab
general lookup
tnyuzg Sep 4, 2024
d1eb676
Merge branch 'lookup-and-rangecheck' of https://github.com/xiangxiecr…
tnyuzg Sep 7, 2024
25b529f
fix
tnyuzg Sep 7, 2024
a9281c9
reconstruct RLWE * RGSW
f7ed Sep 8, 2024
4753570
reconstruct snarks for RLWE * RGSW
f7ed Sep 8, 2024
40576e3
fmt
f7ed Sep 8, 2024
8738daf
add RLWE * RGSW example
f7ed Sep 8, 2024
fa23eca
delete dead code
tnyuzg Sep 8, 2024
ddcf6bc
rename
tnyuzg Sep 8, 2024
f097030
rename
tnyuzg Sep 8, 2024
6720eea
Merge branch 'main' into lookup-and-rangecheck
tnyuzg Sep 8, 2024
c8d8247
reconstruct Accumulator
f7ed Sep 9, 2024
5393538
check equality relations among ACC
f7ed Sep 9, 2024
0e132f9
add snarks for ACC
f7ed Sep 9, 2024
804d7d6
fmt
f7ed Sep 9, 2024
deaf0b0
check & clippy
f7ed Sep 9, 2024
336cb6a
Merge branch 'main' into add_EF
f7ed Sep 9, 2024
e5cdeee
fix
tnyuzg Sep 9, 2024
85ff8f5
for merge
tnyuzg Sep 11, 2024
658482d
Merge branch 'add_EF' into lookup-and-rangecheck
tnyuzg Sep 11, 2024
db1a962
snarky lookup
tnyuzg Sep 11, 2024
0b56d7d
fix
tnyuzg Sep 11, 2024
3747842
fix
tnyuzg Sep 11, 2024
dceff38
separate computing m
tnyuzg Sep 16, 2024
Binary file added .DS_Store
Binary file not shown.
6 changes: 3 additions & 3 deletions Cargo.toml
@@ -22,12 +22,12 @@ rand_distr = "0.4"
rand_core = "0.6.4"
rand_chacha = "0.3.1"
rayon = "1"
bytemuck = { version = "1.17", features = ["derive"] }
bytemuck = { version = "1.13", features = ["derive"] }
merlin = { version = "3.0.0", default-features = false }
serde = { version = "1.0", features = ["derive"] }
bincode = "1.3"
itertools = "0.13"
sha2 = { version = "0.10" }
itertools = "0.13.0"
sha2 = { version = "0.10.7", features = ["asm"] }

criterion = "0.5"

Binary file added algebra/.DS_Store
Binary file not shown.
4 changes: 4 additions & 0 deletions algebra/src/extension/binomial_extension.rs
@@ -168,6 +168,10 @@ impl<F: Field + BinomiallyExtendable<D> + Packable, const D: usize> Field
}

const MODULUS_VALUE: Self::Value = F::MODULUS_VALUE;

fn random<R: CryptoRng + Rng>(rng: &mut R) -> Self {
Self::from_base_fn(|_| FieldUniformSampler::new().sample(rng))
}
}

impl<F: Field + BinomiallyExtendable<D> + Packable, const D: usize> Display
4 changes: 3 additions & 1 deletion algebra/src/polynomial/multivariate/data_structures.rs
@@ -2,6 +2,8 @@

use std::{collections::HashMap, rc::Rc};

use serde::Serialize;

use crate::Field;

use super::{DenseMultilinearExtension, MultilinearExtension};
@@ -48,7 +50,7 @@ impl<F: Field> ListOfProductsOfPolynomials<F> {
}
}

#[derive(Clone, Copy)]
#[derive(Clone, Copy, Serialize)]
/// Stores the number of variables and max number of multiplicands of the added polynomial used by the prover.
/// This data structures will be used as the verifier key.
pub struct PolynomialInfo {
30 changes: 14 additions & 16 deletions algebra/src/polynomial/multivariate/multilinear/dense.rs
@@ -109,6 +109,15 @@ impl<F: Field> DenseMultilinearExtension<F> {
poly.truncate(1 << (nv - dim));
poly[0]
}

/// Convert to EF version
#[inline]
pub fn to_ef<EF: AbstractExtensionField<F>>(&self) -> DenseMultilinearExtension<EF> {
DenseMultilinearExtension::<EF> {
num_vars: self.num_vars,
evaluations: self.evaluations.iter().map(|x| EF::from_base(*x)).collect(),
}
}
}

impl<F: DecomposableField> DenseMultilinearExtension<F> {
@@ -121,19 +130,19 @@ impl<F: DecomposableField> DenseMultilinearExtension<F> {
#[inline]
pub fn get_decomposed_mles(
&self,
base_len: u32,
bits_len: u32,
base_len: usize,
bits_len: usize,
) -> Vec<Rc<DenseMultilinearExtension<F>>> {
let mut val = self.evaluations.clone();
let mask = F::mask(base_len);
let mask = F::mask(base_len as u32);

let mut bits = Vec::with_capacity(bits_len as usize);
let mut bits = Vec::with_capacity(bits_len);

// extract `base_len` bits as one "bit" at a time
for _ in 0..bits_len {
let mut bit = vec![F::zero(); self.evaluations.len()];
bit.iter_mut().zip(val.iter_mut()).for_each(|(b_i, v_i)| {
v_i.decompose_lsb_bits_at(b_i, mask, base_len);
v_i.decompose_lsb_bits_at(b_i, mask, base_len as u32);
});
bits.push(Rc::new(DenseMultilinearExtension::from_evaluations_vec(
self.num_vars,
@@ -310,17 +319,6 @@ impl<'a, F: Field> AddAssign<(F, &'a DenseMultilinearExtension<F>)>
}
}

impl<'a, F: Field> AddAssign<(F, &'a Rc<DenseMultilinearExtension<F>>)>
for DenseMultilinearExtension<F>
{
#[inline]
fn add_assign(&mut self, (f, rhs): (F, &'a Rc<DenseMultilinearExtension<F>>)) {
self.iter_mut()
.zip(rhs.iter())
.for_each(|(x, y)| *x += f.mul(y));
}
}

impl<F: Field> Neg for DenseMultilinearExtension<F> {
type Output = DenseMultilinearExtension<F>;

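For reference, the reworked `get_decomposed_mles` above takes `base_len` and `bits_len` as `usize` and splits every evaluation into `bits_len` limbs of `base_len` bits each (mask `F::mask(base_len as u32)`, least-significant limb first), producing one MLE per limb position. A plain-integer sketch of that per-value decomposition, with `u64` standing in for the field type (the helper name and integer type are illustrative, not part of this crate):

// Split each value into `bits_len` limbs of `base_len` bits, LSB-first.
// Mirrors what `get_decomposed_mles` does element-wise over an MLE's
// evaluation table using `decompose_lsb_bits_at`.
fn decompose_limbs(values: &[u64], base_len: usize, bits_len: usize) -> Vec<Vec<u64>> {
    assert!(base_len < 64, "mask must fit in u64 for this sketch");
    let mask = (1u64 << base_len) - 1;
    let mut limbs = vec![vec![0u64; values.len()]; bits_len];
    for (i, &value) in values.iter().enumerate() {
        let mut v = value;
        for limb in limbs.iter_mut() {
            limb[i] = v & mask; // extract the next `base_len` bits
            v >>= base_len;     // and drop them
        }
    }
    // If bits_len * base_len covers the value width, then for every i:
    // values[i] == sum_j limbs[j][i] << (j * base_len).
    limbs
}

Each `limbs[j]` plays the role of one decomposed MLE's evaluation vector.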
83 changes: 16 additions & 67 deletions algebra/src/utils/transcript.rs
@@ -1,18 +1,17 @@
use std::marker::PhantomData;

use rand::SeedableRng;
use rand_distr::Distribution;
use serde::Serialize;

use crate::{AbstractExtensionField, Field, FieldUniformSampler};
use crate::Field;

use super::{Block, Prg};

/// A transcript consists of a Merlin transcript and a `sampler``
/// to sample uniform field elements.
pub struct Transcript<F: Field> {
transcript: merlin::Transcript,
sampler: FieldUniformSampler<F>,
// sampler: FieldUniformSampler<F>,
_marker: PhantomData<F>,
}

@@ -22,89 +21,39 @@ impl<F: Field> Transcript<F> {
pub fn new() -> Self {
Self {
transcript: merlin::Transcript::new(b""),
sampler: FieldUniformSampler::new(),
// sampler: FieldUniformSampler::new(),
_marker: PhantomData,
}
}
}
impl<F: Field + Serialize> Transcript<F> {
/// Append the message to the transcript.
#[inline]
pub fn append_message(&mut self, msg: &[u8]) {
self.transcript.append_message(b"", msg);
}

/// Append elements to the transcript.
#[inline]
pub fn append_elements(&mut self, elems: &[F]) {
self.append_message(&bincode::serialize(elems).unwrap());
}

/// Append extension field elements to the transcript.
#[inline]
pub fn append_ext_field_elements<EF: AbstractExtensionField<F>>(&mut self, elems: &[EF]) {
let elems: Vec<F> = elems
.iter()
.flat_map(|x| x.as_base_slice())
.cloned()
.collect();
self.append_message(&bincode::serialize(&elems).unwrap());
impl<F: Field> Transcript<F> {
/// Append the message to the transcript.
pub fn append_message<M: Serialize>(&mut self, label: &'static [u8], msg: &M) {
self.transcript
.append_message(label, &bincode::serialize(msg).unwrap());
}

/// Generate the challenge bytes from the current transcript
#[inline]
pub fn get_challenge_bytes(&mut self, bytes: &mut [u8]) {
self.transcript.challenge_bytes(b"", bytes);
pub fn get_challenge_bytes(&mut self, label: &'static [u8], bytes: &mut [u8]) {
self.transcript.challenge_bytes(label, bytes);
}

/// Generate the challenge from the current transcript
/// and append it to the transcript.
pub fn get_and_append_challenge(&mut self) -> F {
pub fn get_challenge(&mut self, label: &'static [u8]) -> F {
let mut seed = [0u8; 16];
self.transcript.challenge_bytes(b"", &mut seed);
self.transcript.challenge_bytes(label, &mut seed);
let mut prg = Prg::from_seed(Block::from(seed));
let challenge: F = self.sampler.sample(&mut prg);
self.append_message(&bincode::serialize(&challenge).unwrap());

challenge
F::random(&mut prg)
}

/// Generate the challenge vector from the current transcript
/// and append it to the transcript.
pub fn get_vec_and_append_challenge(&mut self, num: usize) -> Vec<F> {
pub fn get_vec_challenge(&mut self, label: &'static [u8], num: usize) -> Vec<F> {
let mut seed = [0u8; 16];
self.transcript.challenge_bytes(b"", &mut seed);
self.transcript.challenge_bytes(label, &mut seed);
let mut prg = Prg::from_seed(Block::from(seed));

let challenge = self.sampler.sample_iter(&mut prg).take(num).collect();
self.append_message(&bincode::serialize(&challenge).unwrap());

challenge
}

/// Generate the challenge for extension field from the current transcript
/// and append it to the transcript.
#[inline]
pub fn get_ext_field_and_append_challenge<EF>(&mut self) -> EF
where
EF: AbstractExtensionField<F>,
{
let value = self.get_vec_and_append_challenge(EF::D);
EF::from_base_slice(&value)
}

/// Generate the challenge vector for extension field from the current transcript
/// and append it to the transcript.
#[inline]
pub fn get_vec_ext_field_and_append_challenge<EF>(&mut self, num: usize) -> Vec<EF>
where
EF: AbstractExtensionField<F>,
{
let challenges = self.get_vec_and_append_challenge(num * EF::D);
challenges
.chunks_exact(EF::D)
.map(|ext| EF::from_base_slice(ext))
.collect()
(0..num).map(|_| F::random(&mut prg)).collect()
}
}

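A minimal usage sketch of the relabeled transcript API above; `FF` stands for any concrete `Field` in scope and `commitment` for any `Serialize` value, both placeholders:

// Prover and verifier append the same labeled messages, so they derive
// identical Fiat-Shamir challenges from identical transcript states.
let mut prover_trans = Transcript::<FF>::new();
prover_trans.append_message(b"commitment", &commitment);
let alpha: FF = prover_trans.get_challenge(b"alpha");
let betas: Vec<FF> = prover_trans.get_vec_challenge(b"betas", 4);

let mut verifier_trans = Transcript::<FF>::new();
verifier_trans.append_message(b"commitment", &commitment);
assert_eq!(alpha, verifier_trans.get_challenge(b"alpha"));
assert_eq!(betas, verifier_trans.get_vec_challenge(b"betas", 4));

One behavioral change visible in the diff: the old `get_and_append_challenge` appended the sampled challenge back to the transcript, while the new `get_challenge`/`get_vec_challenge` only derive values from the current state via `F::random` on a PRG seeded with `challenge_bytes`.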
6 changes: 3 additions & 3 deletions algebra/tests/multivariate_test.rs
@@ -108,11 +108,11 @@ fn mle_arithmetic() {

// test decomposition of mle
{
let base_len = 3;
let base_len: usize = 3;
let base = FF::new(1 << base_len);
let basis = <Basis<FF>>::new(base_len);
let basis = <Basis<FF>>::new(base_len as u32);
let bits_len = basis.decompose_len();
let decomposed_polys = poly1.get_decomposed_mles(base_len, bits_len as u32);
let decomposed_polys = poly1.get_decomposed_mles(base_len, bits_len);
let point: Vec<_> = (0..NV).map(|_| uniform.sample(&mut rng)).collect();
let evaluation = decomposed_polys
.iter()
Binary file added lattice/.DS_Store
Binary file not shown.
Binary file added pcs/.DS_Store
Binary file not shown.
4 changes: 2 additions & 2 deletions pcs/benches/brakedown_pcs.rs
@@ -42,7 +42,7 @@ pub fn criterion_benchmark(c: &mut Criterion) {
Some(code_spec),
);

let mut trans = Transcript::<FF>::new();
let mut trans = Transcript::<EF>::new();
let mut comm = BrakedownPolyCommitment::default();
let mut state = BrakedownCommitmentState::default();
let mut proof = BrakedownOpenProof::default();
@@ -84,7 +84,7 @@
criterion_group! {
name = benches;
config = configure();
targets =criterion_benchmark
targets = criterion_benchmark
}

criterion_main!(benches);
4 changes: 2 additions & 2 deletions pcs/examples/brakedown_pcs.rs
@@ -34,7 +34,7 @@ fn main() {
);
println!("setup time: {:?} ms", start.elapsed().as_millis());

let mut trans = Transcript::<FF>::new();
let mut trans = Transcript::<EF>::new();

let start = Instant::now();
let (comm, state) =
@@ -54,7 +54,7 @@

let eval = poly.evaluate_ext(&point);

let mut trans = Transcript::<FF>::new();
let mut trans = Transcript::<EF>::new();

let start = Instant::now();
let check = BrakedownPCS::<FF, Hash, ExpanderCode<FF>, ExpanderCodeSpec, EF>::verify(
8 changes: 4 additions & 4 deletions pcs/src/lib.rs
@@ -8,12 +8,12 @@ pub mod multilinear;
/// utils, mainly used to implement linear time encodable code now
pub mod utils;

use algebra::{utils::Transcript, Field, MultilinearExtension};
use algebra::{utils::Transcript, AbstractExtensionField, Field, MultilinearExtension};

// type Point<F, P> = <P as MultilinearExtension<F>>::Point;

/// Polymomial Commitment Scheme
pub trait PolynomialCommitmentScheme<F: Field, S> {
pub trait PolynomialCommitmentScheme<F: Field, EF: AbstractExtensionField<F>, S> {
/// System parameters
type Parameters;
/// polynomial to commit
@@ -42,7 +42,7 @@ pub trait PolynomialCommitmentScheme<F: Field, S> {
commitment: &Self::Commitment,
state: &Self::CommitmentState,
points: &[Self::Point],
trans: &mut Transcript<F>,
trans: &mut Transcript<EF>,
) -> Self::Proof;

/// The Verification phase.
@@ -52,6 +52,6 @@ pub trait PolynomialCommitmentScheme<F: Field, S> {
points: &[Self::Point],
eval: Self::Point,
proof: &Self::Proof,
trans: &mut Transcript<F>,
trans: &mut Transcript<EF>,
) -> bool;
}
21 changes: 11 additions & 10 deletions pcs/src/multilinear/brakedown/mod.rs
@@ -48,7 +48,7 @@
EF: AbstractExtensionField<F>,
{
/// Prover answers the challenge by computing the product of the challenge vector
/// and the committed matrix.
/// and the commited matirx.

GitHub Actions / typos (warnings on line 51 in pcs/src/multilinear/brakedown/mod.rs): "commited" should be "committed"; "matirx" should be "matrix".
/// The computation of the product can be viewed as a linear combination of rows
/// of the matrix with challenge vector as the coefficients.
fn answer_challenge(
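As the doc comment above states, the prover's answer is challenge^T * M, i.e. a linear combination of the committed matrix's rows with the challenge entries as coefficients. A type-agnostic sketch of that combination (the function name and the flat row-major layout are assumptions, not this crate's API):

use std::ops::{AddAssign, Mul};

// Compute challenge^T * matrix for a row-major matrix with `cols` columns:
// accumulate challenge[r] * row_r over all rows r.
fn combine_rows<T>(matrix: &[T], cols: usize, challenge: &[T]) -> Vec<T>
where
    T: Copy + Default + Mul<Output = T> + AddAssign,
{
    assert_eq!(matrix.len(), challenge.len() * cols);
    let mut acc = vec![T::default(); cols];
    for (&c, row) in challenge.iter().zip(matrix.chunks_exact(cols)) {
        for (a, &m) in acc.iter_mut().zip(row.iter()) {
            *a += c * m;
        }
    }
    acc
}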
@@ -227,12 +227,12 @@
EF: AbstractExtensionField<F>,
{
/// Generate random queries.
fn random_queries(pp: &BrakedownParams<F, EF, C>, trans: &mut Transcript<F>) -> Vec<usize> {
fn random_queries(pp: &BrakedownParams<F, EF, C>, trans: &mut Transcript<EF>) -> Vec<usize> {
let num_queries = pp.num_query();
let codeword_len = pp.code().codeword_len();

let mut seed = [0u8; 16];
trans.get_challenge_bytes(&mut seed);
trans.get_challenge_bytes(b"Generate random queries", &mut seed);
let mut prg = Prg::from_seed(Block::from(seed));

// Generate a random set of queries.
@@ -244,7 +244,7 @@
}
}

impl<F, H, C, S, EF> PolynomialCommitmentScheme<F, S> for BrakedownPCS<F, H, C, S, EF>
impl<F, H, C, S, EF> PolynomialCommitmentScheme<F, EF, S> for BrakedownPCS<F, H, C, S, EF>
where
F: Field + Serialize,
H: Hash + Sync + Send,
@@ -322,19 +322,20 @@
commitment: &Self::Commitment,
state: &Self::CommitmentState,
points: &[Self::Point],
trans: &mut Transcript<F>,
trans: &mut Transcript<EF>,
) -> Self::Proof {
assert_eq!(points.len(), pp.num_vars());
// Hash the commitment to transcript.
trans.append_message(&commitment.to_bytes().unwrap());
trans.append_message(b"commitment", &commitment);
// trans.append_message(&commitment.to_bytes().unwrap());

// Compute the tensor from the random point, see [DP23](https://eprint.iacr.org/2023/630.pdf).
let tensor = Self::tensor_from_points(pp, points);

let rlc_msgs = Self::answer_challenge(pp, &tensor, state);

// Hash rlc to transcript.
trans.append_ext_field_elements(&rlc_msgs);
trans.append_message(b"rlc", &rlc_msgs);

// Sample random queries.
let queries = Self::random_queries(pp, trans);
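The `tensor_from_points` step above follows the tensor view of [DP23]: an evaluation point (p_1, ..., p_n) expands into the 2^n-entry tensor of (1 - p_i, p_i) factors, whose inner product with the evaluation table gives the multilinear evaluation. A type-agnostic sketch of that expansion (the exact row/column split and bit ordering used by this crate are assumptions):

use std::ops::{Mul, Sub};

// Entry indexed by bits (b_1, ..., b_n) equals prod_i (if b_i { p_i } else { 1 - p_i }).
fn eq_tensor<T>(one: T, point: &[T]) -> Vec<T>
where
    T: Copy + Mul<Output = T> + Sub<Output = T>,
{
    let mut tensor = vec![one];
    for &p in point {
        let mut next = Vec::with_capacity(tensor.len() * 2);
        for &t in &tensor {
            next.push(t * (one - p)); // current variable fixed to 0
            next.push(t * p);         // current variable fixed to 1
        }
        tensor = next;
    }
    tensor
}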
@@ -355,12 +356,12 @@
points: &[Self::Point],
eval: Self::Point,
proof: &Self::Proof,
trans: &mut Transcript<F>,
trans: &mut Transcript<EF>,
) -> bool {
assert_eq!(points.len(), pp.num_vars());

// Hash the commitment to transcript.
trans.append_message(&commitment.to_bytes().unwrap());
trans.append_message(b"commitment", &commitment);

let (tensor, residual) = Self::tensor_decompose(pp, points);

@@ -371,7 +372,7 @@
pp.code().encode_ext(&mut encoded_msg);

// Hash rlc to transcript.
trans.append_ext_field_elements(&proof.rlc_msgs);
trans.append_message(b"rlc", &proof.rlc_msgs);

// Sample random queries.
let queries = Self::random_queries(pp, trans);