Performance and Memory Problems
Slow Decompression
Symptoms: Extraction taking excessive time
Causes and Solutions:

- Large LZX window
  - Use MSZIP instead of LZX for faster extraction
  - Reduce the LZX window size when creating archives
- Many small files
  - Extract in parallel if possible
  - Consider batch processing
- Slow I/O
  - Use an SSD instead of an HDD
  - Check free disk space and fragmentation
High Memory Usage
Symptoms: Excessive RAM consumption
Solutions:
```ruby
# Use streaming for large files so the whole payload never sits in memory
File.open('output.dat', 'wb') do |output|
  cab.extract_file('large.dat') do |chunk|
    output.write(chunk)
  end
end

# Process files individually
cab.files.each do |file|
  # Extract one at a time
  data = file.data
  File.write(file.name, data)
  data = nil # Drop the reference so the GC can reclaim it
  GC.start
end
```

Memory Allocation Failures
Error: NoMemoryError or Errno::ENOMEM
Solutions:
- Increase system memory
- Use smaller compression windows
- Close other applications
- Process in smaller batches
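"Process in smaller batches" can be done with `Enumerable#each_slice`, which bounds how many extracted payloads are alive at once. The file list and payloads here are dummy data standing in for `cab.files` and the extracted contents.

```ruby
# Dummy stand-in for cab.files; each entry represents one archive member
files = (1..10).map { |i| "file#{i}.dat" }

batches = 0
files.each_slice(3) do |batch|
  # Extract only this batch, then let its data go out of scope
  data = batch.map { |name| "payload-for-#{name}" }
  data.each { |d| d.bytesize } # ...write to disk, etc.
  batches += 1
end

puts "processed #{files.size} files in #{batches} batches"
```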
CPU Usage
Issue: High CPU usage during compression and decompression
Normal: LZX compression is CPU-intensive, so sustained high CPU is expected
Solutions:

- Use a faster algorithm (MSZIP, or no compression)
- Reduce the compression level
- Use parallel processing for multiple files
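MSZIP is DEFLATE-based, so Ruby's built-in Zlib gives a feel for the speed/ratio trade-off behind "reduce the compression level". Whether your CAB library exposes a level knob is an assumption; this sketch only demonstrates the underlying algorithm on sample data.

```ruby
require 'zlib'
require 'benchmark'

data = "performance and memory problems " * 10_000

fast = best = nil
t_fast = Benchmark.realtime { fast = Zlib::Deflate.deflate(data, Zlib::BEST_SPEED) }
t_best = Benchmark.realtime { best = Zlib::Deflate.deflate(data, Zlib::BEST_COMPRESSION) }

puts format('level 1: %d bytes in %.4fs', fast.bytesize, t_fast)
puts format('level 9: %d bytes in %.4fs', best.bytesize, t_best)

# Both settings round-trip losslessly; only size and speed differ
raise unless Zlib::Inflate.inflate(fast) == data
raise unless Zlib::Inflate.inflate(best) == data
```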
Optimization Tips
- Choose an appropriate algorithm:
  - Fast: None, LZSS
  - Balanced: MSZIP
  - Maximum compression: LZX
- Tune for your use case:
  - One-time archives: use maximum compression
  - Frequent access: use faster algorithms
- Monitor resources:
```ruby
require 'benchmark'

time = Benchmark.measure do
  cab.extract_all('output/')
end
puts "Extraction took #{time.real} seconds"
```
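Memory can be watched the same way using only the standard library: `GC.stat` exposes allocation counters you can sample before and after an operation. The extraction call is replaced here with a stand-in allocation, since this sketch does not assume any particular CAB API.

```ruby
before = GC.stat(:total_allocated_objects)

# Stand-in for cab.extract_all('output/'); any allocating work goes here
chunks = 100.times.map { |i| "chunk-#{i}" }

allocated = GC.stat(:total_allocated_objects) - before
puts "Allocated roughly #{allocated} objects during extraction"
```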