Using large memory block, read or write is getting corrupted #35

Closed
object88 opened this issue Jan 8, 2017 · 1 comment
object88 commented Jan 8, 2017

Example code that reproduces the issue: hashmaps with large data sets see reads or writes corrupted.

Width is set to 53 (64 − 11) because large hashmaps carry significant low-order bytes, and we are attempting to use 64-bit hashes.

```go
// Excerpted from a test function: t is a *testing.T, and the snippet
// needs the math, math/rand, and time imports.
src := rand.NewSource(time.Now().UnixNano())
random := rand.New(src)

// 53-bit entries, so stored values do not align to 64-bit words.
width := uint32(64 - 11)
max := int64(math.Pow(2.0, float64(width)))

count := uint64(8)
contents := make([]uint64, count)
for i := uint64(0); i < count; i++ {
	contents[i] = uint64(random.Int63n(max))
}

m := AllocateMemories(LargeBlock, width, 8)
for k, v := range contents {
	m.Assign(uint64(k), v)
}

for k, v := range contents {
	result := m.Read(uint64(k))
	if result != v {
		t.Fatalf("At %d\nexpected %064b\nreceived %064b\n", k, v, result)
	}
}
```
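Since 53-bit entries do not divide evenly into 64-bit words, some stored values must straddle a word boundary, and mishandling the spill-over word is a plausible source of this kind of corruption. The sketch below uses hypothetical `writeBits`/`readBits` helpers (not the actual block implementation) to show the boundary handling a packed store needs:

```go
package main

import "fmt"

// writeBits stores the low `width` bits of v at bit offset index*width
// inside a []uint64 backing array. A value whose bits cross a 64-bit
// word boundary must be split across two words; omitting the second
// (spill-over) write is a classic cause of read/write corruption.
func writeBits(words []uint64, index, width uint, v uint64) {
	bit := index * width
	word, off := bit/64, bit%64
	mask := uint64(1)<<width - 1
	v &= mask
	words[word] = words[word]&^(mask<<off) | v<<off
	if off+width > 64 { // value straddles into the next word
		spill := off + width - 64
		hiMask := uint64(1)<<spill - 1
		words[word+1] = words[word+1]&^hiMask | v>>(width-spill)
	}
}

// readBits retrieves the value stored by writeBits, reassembling the
// two pieces when the value crosses a word boundary.
func readBits(words []uint64, index, width uint) uint64 {
	bit := index * width
	word, off := bit/64, bit%64
	mask := uint64(1)<<width - 1
	v := words[word] >> off
	if off+width > 64 {
		v |= words[word+1] << (64 - off)
	}
	return v & mask
}

func main() {
	const width = 53
	words := make([]uint64, 8) // 8 entries * 53 bits = 424 bits, fits in 7 words
	for i := uint(0); i < 8; i++ {
		writeBits(words, i, width, uint64(i+1)*0x123456789AB)
	}
	for i := uint(0); i < 8; i++ {
		if got, want := readBits(words, i, width), uint64(i+1)*0x123456789AB; got != want {
			fmt.Printf("index %d: expected %064b received %064b\n", i, want, got)
		}
	}
	fmt.Println("all round-trips checked")
}
```

Index 1 exercises the straddling case directly: its 53 bits occupy offsets 53–105, so 11 bits land in word 0 and the remaining 42 in word 1.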
object88 added the bug label Jan 10, 2017
object88 commented:

Fixed in #36.
