A high-performance N-API addon for RandomX hashing and share verification in Node.js mining pools.
The primary API is now a SeedPool:
- one shared RandomX cache/dataset per seed
- a small VM pool per seed for concurrency
- direct `hashAsync(seed, input)` / `verifyShare(seed, ...)` calls
That maps much better to real pool workloads than making application code manage raw context ids.
- Native Performance: N-API addon with JIT compilation and AES acceleration
- Shared Seed Resources: One RandomX cache/dataset per seed, reused across multiple VMs
- VM Pooling: Multiple VMs can share the same seed resources for concurrent pool workloads
- Pool Broker Friendly: Works well behind a local socket broker shared by many pool workers
- Pool-Ready: Designed specifically for mining pool server workloads
- Cross-Platform: Supports Linux, Windows, and macOS
- Vendored RandomX Source: Builds from pinned local RandomX source in `deps/randomx`
- JIT Compilation: Near-native speed RandomX execution
- AES Hardware: Hardware-accelerated AES when available
- Memory Optimization: Shared seed resources and optional large pages
- Seed Reuse: Current/previous `seed_hash` values stay warm without rebuilding full datasets
- Async Hashing: `hashAsync()` lets Node brokers overlap requests without blocking the event loop
```bash
# Ubuntu/Debian
sudo apt-get install build-essential cmake git

# CentOS/RHEL
sudo yum groupinstall "Development Tools"
sudo yum install cmake git

# macOS
brew install cmake git
```

Install directly from git:

```bash
npm install "git+https://github.com/nssy/node-randomx-hashing.git"
```

Or build from a local clone:

```bash
git clone https://github.com/nssy/node-randomx-hashing.git
cd node-randomx-hashing

# Build the addon
npm install
```

That's it! The build system:
- verifies `deps/randomx` exists
- configures and builds RandomX from local source
- compiles the N-API addon
- links everything together
```js
const randomx = require('node-randomx-hashing');

// Primary API: a seed pool for current + previous seed_hash
const pool = randomx.createPoolSeedPool({
  maxSeeds: 2,
  vmPoolSize: 2,
  mode: 'fast', // explicit: fast is never implied by magic
  threads: 4,   // dataset init threads
  enableHugePages: false
});

const seed = Buffer.alloc(32, 1);
const input = Buffer.from('mining-share-data');
const target = Buffer.alloc(32, 0xff);

await pool.warmSeedAsync(seed);

const result = pool.verifyShare(seed, input, target);
console.log('Share valid:', result.valid);
console.log('Hash time:', result.hashTime, 'ms');

// Async hashing, useful for broker/server patterns
const hash = await pool.hashAsync(seed, input);
console.log('Hash:', hash.toString('hex'));

// Clean up when done
pool.releaseAll();
```

Low-level API. Initialize a single RandomX VM context for a seed.
For pool workloads, prefer createSeedPool() / createPoolSeedPool().
Parameters:
- `seed` (Buffer): 32-byte RandomX seed
- `options` (Object): Configuration options
  - `mode` (string): `"light"` or `"fast"` (default: `"light"`)
  - `enableJit` (boolean): Enable JIT compilation (default: true)
  - `enableAes` (boolean): Enable AES acceleration (default: true)
  - `enableHugePages` (boolean): Enable large pages (default: false)
  - `threads` (number): Initialization threads (default: 1)
Returns: Context ID (number)
Low-level async API. Initialize a single RandomX VM context for a seed on libuv's worker pool.
For pool workloads, prefer createSeedPool() / createPoolSeedPool().
Parameters:
- `seed` (Buffer): 32-byte RandomX seed
- `options` (Object): same options as `initContext`
Returns: Promise<number>
Primary high-level API for pools and brokers.
Parameters:
- `options.maxSeeds` (number, default `2`): how many different seeds to retain
- `options.vmPoolSize` (number, default `1`): how many VMs to create per seed
- `options.mode` (string, default `"light"`): `"light"` or `"fast"`
- `options.threads` (number, default `os.cpus().length`): dataset init threads
- `options.enableJit` (boolean)
- `options.enableAes` (boolean)
- `options.enableHugePages` (boolean)
- `options.idleEvictMs` (number): evict idle seeds after this many ms
Returns: SeedPool
Convenience wrapper around createSeedPool() with the same semantics.
Use this when your workload is specifically RandomX epoch based (current + previous seed).
Important methods:
- `hash(seed, input)`
- `hashAsync(seed, input)`
- `warmSeed(seed)` / `warmSeedAsync(seed)`
- `warmSeedFromHex(seedHex)` / `warmSeedAsyncFromHex(seedHex)`
- `verifyShare(seed, input, target, expectedHash?)`
- `hashFromHex(seedHex, input)`
- `hashAsyncFromHex(seedHex, input)`
- `verifyShareFromHex(seedHex, input, target, expectedHash?)`
- `getContext(seed)` / `getContextFromHex(seedHex)` for low-level interop
- `release(seed)` / `releaseFromHex(seedHex)` / `releaseAll()`
- `getSnapshot()`
Notes:
- `hashAsync()` and `hashAsyncFromHex()` are warmup-aware and will await any pending async warmup for that seed.
- Sync `hash()` / `verifyShare()` methods do not wait for pending async warmups and may still initialize synchronously.
- `releaseAll()` is a full teardown; after calling it, the `SeedPool` instance should be considered disposed and will reject further use.
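The warmup-aware behaviour suggests a simple epoch-rotation pattern. The helper below is a hypothetical sketch (the `state` shape is invented here); it uses only the documented `warmSeedAsyncFromHex` method:

```javascript
// Hypothetical epoch handler: when a new seed_hash arrives, warm it in the
// background and track current/previous so late shares on the old seed can
// still be verified (maxSeeds: 2 keeps both resident).
async function onNewSeedHash(pool, seedHex, state) {
  if (state.current === seedHex) return state; // no epoch change
  await pool.warmSeedAsyncFromHex(seedHex);    // dataset init off the main thread
  return { previous: state.current, current: seedHex };
}
```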
Low-level API for raw context ids.
Verify a mining share against difficulty target.
Parameters:
- `contextId` (number): Context ID from `initContext`
- `input` (Buffer): Share data to hash
- `target` (Buffer): 32-byte difficulty target, little-endian
- `expectedHash` (Buffer, optional): Expected hash for validation
Returns: Object with:
- `valid` (boolean): Whether the share meets the target
- `difficulty` (number): Calculated difficulty
- `hash` (Buffer): 32-byte RandomX hash
- `hashTime` (number): Hash calculation time in ms
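For intuition, the target check can be sketched in plain JavaScript. This follows one common pool convention (hash and target compared as little-endian 256-bit integers, share accepted when hash is at most the target); the addon's exact rule may differ:

```javascript
// Compare two 32-byte little-endian values: most significant byte is the
// LAST byte, so scan from index 31 downward. Returns true when hash <= target.
function meetsTarget(hash, target) {
  for (let i = 31; i >= 0; i--) {
    if (hash[i] < target[i]) return true;
    if (hash[i] > target[i]) return false;
  }
  return true; // equal counts as meeting the target
}
```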
Low-level API for raw context ids.
Calculate RandomX hash for given input.
Parameters:
- `contextId` (number): Context ID from `initContext`
- `input` (Buffer): Data to hash
Returns: 32-byte hash (Buffer)
Calculate RandomX hash asynchronously on libuv's worker pool.
This is the preferred primitive for Node.js broker/server patterns where one process handles many concurrent requests against a shared set of RandomX VMs.
Parameters:
- `contextId` (number): Context ID from `initContext`
- `input` (Buffer): Data to hash
Returns: Promise<Buffer>
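Putting the low-level pieces together, a context's lifecycle might look like this sketch. The addon module is passed in as a parameter to keep the flow explicit; the function names (`initContextAsync`, `hashAsync`, `releaseContext`) are the ones documented above:

```javascript
// Low-level lifecycle sketch: init a context for the seed, hash on libuv's
// worker pool, and always release the context, even if hashing throws.
async function hashOnce(randomx, seed, input) {
  const contextId = await randomx.initContextAsync(seed, { mode: 'light' });
  try {
    return await randomx.hashAsync(contextId, input);
  } finally {
    randomx.releaseContext(contextId); // free the VM and its seed resources
  }
}
```

For repeated hashing against the same seed, prefer the SeedPool API, which keeps the context alive instead of paying initialization cost per call.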
Release a RandomX context and free memory.
Parameters:
contextId(number): Context ID to release
Get performance statistics.
Returns: Object with:
- `totalHashes` (number): Total hashes calculated
- `totalVerifications` (number): Total shares verified
- `activeVMs` (number): Active VM count
- `activeSeeds` (number): Active shared seed-resource count
- `averageHashTime` (number): Average hash time in ms
- `cacheHits` (number): Context cache hits
- `cacheMisses` (number): Context cache misses
Get hardware capability information.
Returns: Object with:
- `hasJit` (boolean): JIT compilation available
- `hasAes` (boolean): AES acceleration available
- `hasHugePages` (boolean): Large pages available
- `cpuCores` (number): CPU core count
- `totalMemory` (number): Total system memory
- `hugePagesAvailable` (number): Available huge pages
The build system is local-source based and builds from the vendored `deps/randomx` source tree included in the package:
- Check Vendored Source: Verifies `deps/randomx` exists
- Configure RandomX: Uses CMake with optimal performance settings
- Build RandomX: Compiles the static library from local source
- Build Addon: Compiles the N-API addon and links against RandomX
The build system automatically applies optimal settings:
- Compiler Flags: `-O3 -march=native -mtune=native`
- RandomX Features: JIT compilation + AES acceleration enabled
- Cross-Platform: Handles Linux, Windows, and macOS differences
- Dependencies: Automatically manages pthread and native build inputs
```bash
# Clean local build output
npm run clean

# Rebuild addon + native library
npm run build

# Full install/build
npm install

# Verify installation
npm run verify

# Run tests
npm test

# Run example pool server
npm run example
```

For maximum performance on Linux:
```bash
# Enable performance governor
echo performance | sudo tee /sys/devices/system/cpu/cpu*/cpufreq/scaling_governor

# Allocate huge pages (adjust count as needed)
echo 1024 | sudo tee /proc/sys/vm/nr_hugepages

# Set CPU affinity for mining processes
taskset -c 0-3 node your-pool-server.js
```

```js
// Pool server example with the primary seed-pool API
class PoolServer {
  constructor() {
    this.contexts = randomx.createPoolSeedPool({
      maxSeeds: 2,   // current + previous seed_hash
      vmPoolSize: 2, // two VMs can share one dataset for concurrent work
      mode: 'fast',
      enableHugePages: false
    });
  }

  getContext(seedHex) {
    return this.contexts.getContextFromHex(seedHex);
  }

  verifyShare(seedHex, shareData, target) {
    return this.contexts.verifyShareFromHex(seedHex, shareData, target);
  }
}
```

For real pool deployments, a useful pattern is:
- keep RandomX in one or a few broker processes
- let pool workers send hash requests over a local socket
- use `createPoolSeedPool({ maxSeeds: 2, vmPoolSize: N, mode: 'fast' })`
- call `seedPool.warmSeedAsync(seed)` when a new `seed_hash` appears so dataset init happens off the main thread
- call `seedPool.hashAsync(seed, input)` inside the broker so one process can overlap requests
See:
- `examples/pool-broker.js`
- `examples/pool-cryptonote-style.js`
- `examples/pool-server.js`
Performance on modern hardware:
| CPU | Threads | Hashes/sec | Notes |
|---|---|---|---|
| Intel i9-12900K | 1 | ~15,000 | JIT + AES + Huge Pages |
| AMD Ryzen 9 5950X | 1 | ~12,000 | JIT + AES + Huge Pages |
| Intel Xeon E5-2680 | 1 | ~8,000 | JIT + AES |
```bash
# Setup development environment
npm install

# Rebuild addon only (keep RandomX)
npm run build

# Run verification
npm run verify

# Run tests
npm test

# Clean local build output
npm run clean
```

The build system includes several automated features:
- Vendored Source Check: Fails fast if the package is incomplete and `deps/randomx` is missing
- Dependency Detection: Automatically checks for cmake and build tools
- Platform Detection: Handles Linux, Windows, macOS differences automatically
- Version Pinning: Uses the pinned vendored RandomX revision
- Error Handling: Clear error messages for missing dependencies
- Incremental Builds: Only rebuilds what's necessary
Build fails with missing `deps/randomx`:

```bash
# The package source is incomplete. Reinstall from a complete git checkout or archive.
rm -rf node_modules/randomx-hashing
npm install
```

Build fails with "cmake not found":

```bash
# Install cmake
sudo apt-get install cmake   # Ubuntu/Debian
brew install cmake           # macOS
```

Large pages not available:
- This is normal and doesn't affect functionality
- Large pages are off unless explicitly requested
- For pool brokers, start with `enableHugePages: false` and only enable it after verifying the host can satisfy the allocation reliably
- `getContextInfo(contextId).usedLargePages` is conservative; it stays `false` unless the addon can prove large-page backing
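That advice can be encoded as a small guard. The shape of `hw` follows the hardware-info fields documented above; treating "no free huge pages" as a hard no is an assumption of this sketch:

```javascript
// Conservative large-page opt-in: only enable when the host both supports
// large pages and currently reports free huge pages to back the allocation.
function shouldEnableHugePages(hw) {
  return Boolean(hw && hw.hasHugePages && hw.hugePagesAvailable > 0);
}

// e.g. createPoolSeedPool({ mode: 'fast', enableHugePages: shouldEnableHugePages(hwInfo) })
```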
```bash
# Verify installation
npm run verify

# Check hardware capabilities
node -e "console.log(require('./index').getHardwareInfo())"

# Test basic functionality
npm test
```

- Memory is securely cleared on context destruction
- No sensitive data persists after `releaseContext()`
- Validates all input parameters and buffer sizes
- Exception-safe with proper cleanup on errors
MIT License - see LICENSE file for details.
- Fork the repository
- Create a feature branch
- Add tests for new functionality
- Ensure all tests pass
- Submit a pull request
- File issues on GitHub
- Check troubleshooting section for common problems
- Review examples for usage patterns
- Verify installation with `npm run verify`