gio91ber
  • Member for 5 years, 6 months
  • Last seen more than 1 year ago
comment
Reimplementing lrzip decompress on Windows explorer using WSL
Well... I've actually benchmarked the whole thing on a 350 MB file and came to the conclusion that Cygwin is apparently better than anything else:
  • Native Linux: compression 14.500 MB/s (00:00:24.07), decompression 6.327 MB/s (00:00:55.10)
  • Cygwin: compression 15.130 MB/s (00:00:23.23), decompression 69.600 MB/s (00:00:05.53)
  • WSL2: compression 7.250 MB/s (00:00:48.97), decompression 0.436 MB/s (00:13:18.68)
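A benchmark like the one above can be reproduced with a small timing script. A minimal sketch, using a generated test file and gzip as a widely available stand-in compressor (swap in `lrzip` / `lrunzip` where installed; the 10 MB size and the `test.img` name are assumptions for illustration, not the original 350 MB input):

```shell
#!/bin/sh
# Time compression and decompression of one file and report rough MB/s.
# gzip stands in for lrzip here; replace the two gzip calls with
# lrzip / lrunzip to benchmark the tool discussed in the thread.
set -e

# Create a 10 MB test file (the original benchmark used a 350 MB file).
dd if=/dev/zero of=test.img bs=1M count=10 2>/dev/null
size_mb=10

start=$(date +%s)
gzip -kf test.img                 # compress -> test.img.gz, keep original
end=$(date +%s)
elapsed=$((end - start))
[ "$elapsed" -eq 0 ] && elapsed=1 # guard against sub-second runs
echo "Compression: about $((size_mb / elapsed)) MB/s (${elapsed}s)"

rm test.img                       # keep only the archive before decompressing
start=$(date +%s)
gzip -dk test.img.gz              # decompress -> test.img, keep archive
end=$(date +%s)
elapsed=$((end - start))
[ "$elapsed" -eq 0 ] && elapsed=1
echo "Decompression: about $((size_mb / elapsed)) MB/s (${elapsed}s)"
```

Note the one-second timer resolution: for meaningful numbers the input should be large enough that each phase takes several seconds, as with the 350 MB file in the comment.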
comment
Compress a damaged-disk with lots of duplicate files dd image for archival
@GordanBobić can I ask what a zvol is? I only know it's an emulation of a block device. Could I then save or export that zvol as an already-compressed file? Thanks in advance
comment
Compress a damaged-disk with lots of duplicate files dd image for archival
@Tetsujin yeah, Robert is surely right, and I'm also pretty sure the guy isn't even aware that a data-recovery retrieval could be done at all. I started this experiment because this recovery effort is unpaid (the guy is a friend of my father), so I don't want to put in any effort, or money (not even for a "cheap" 1 TB HDD), beyond typing some commands on a command line and letting the PC run its magic. The objective of the experiment was to get a disk image small enough to fit into the free space of one of the recovered partitions.
comment
Compress a damaged-disk with lots of duplicate files dd image for archival
@Robert unfortunately it's not paid work; it's a friend of my father who has already messed up his old PC a few hundred times. It's really more of an experiment on my side that I left running on my home PC for days
comment
Compress a damaged-disk with lots of duplicate files dd image for archival
@Tetsujin it was data recovered from a previous hard disk drive. This time, luckily, they gave the SSD directly to me, so I could first save an untouched disk image, then repair the partition table and partition headers to recover all the data: first to another volume for them to check, then in place on the SSD itself once they gave their consent
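The image-first-then-repair workflow described above can be sketched with standard tools. A minimal sketch, where `/dev/sdX` is a placeholder for the damaged drive (a small generated file stands in for the device here so the commands are runnable; the `disk.img` name is an assumption for illustration):

```shell
#!/bin/sh
# Take an untouched image of a drive before attempting any repair.
# /dev/sdX is a placeholder; a 4 MB file stands in for the real device.
set -e

dd if=/dev/zero of=fake_sdX bs=1M count=4 2>/dev/null   # stand-in "drive"
DEVICE=fake_sdX                   # on real hardware: DEVICE=/dev/sdX

# conv=noerror,sync keeps dd going past read errors and pads failed
# blocks with zeros, which matters when the source disk is damaged.
dd if="$DEVICE" of=disk.img bs=1M conv=noerror,sync 2>/dev/null

# Compress the raw image for archival. lrzip would be the on-topic
# choice for a large image with long-range redundancy; gzip is used
# here as a widely available stand-in.
gzip -kf disk.img

ls -l disk.img disk.img.gz
```

All repair attempts (partition table, partition headers) then run against copies of `disk.img` or the drive itself, with the archived image kept as the untouched fallback.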