All Questions
88 questions
0 votes · 1 answer · 216 views
Are .tar.gz files made in Linux with identical files identical to .tar.gz files made in Windows?
I've produced a .tar.gz file in Windows that contains some files to update firmware via a Linux device. The update failed to take, and a coworker mentioned the problem was because the .tar.gz did not ...
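The compressed bytes of two .tar.gz files will almost never match across tools, since gzip embeds the original file name and mtime in its header, so a more useful check is whether the extracted trees are identical. A minimal sketch, assuming the two archives are named win.tar.gz and linux.tar.gz (hypothetical names):
mkdir win linux
tar -xzf win.tar.gz -C win
tar -xzf linux.tar.gz -C linux
diff -r win linux      # compares the extracted trees file by file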
0 votes · 0 answers · 111 views
Why aren't all of my files included when I gzip a directory?
I know this is a question that has been asked before, for example here: Recursively count all the files in a directory,
but this is driving me crazy. Let me explain:
I have a directory called "...
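For what it's worth, gzip itself only compresses individual files and does not recurse into directories; wrapping the directory in tar first is the usual way to capture everything. A minimal sketch, assuming the directory is called mydir (hypothetical name):
tar -czf mydir.tar.gz mydir/
tar -tzf mydir.tar.gz | wc -l   # count the archived members to confirm nothing was skipped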
1 vote · 1 answer · 480 views
dd + gz = Even Worse /// Mount Valid Partition of Whole-Disk dd Image
I've got a 250G backup of a whole disk via dd, which was then piped through gzip, thanks to the advice of one of my -genius- friends.
Looks like the EFI boot partition was corrupted (pretty sure it's an ...
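Since gzip streams are not seekable, the image generally has to be decompressed before any partition can be mounted; after that, a loop device with partition scanning exposes the individual partitions. A rough sketch, assuming the backup is named disk.img.gz (hypothetical name) and there is room to expand it:
zcat disk.img.gz > disk.img
sudo losetup -fP --show disk.img    # prints the loop device, e.g. /dev/loop0, with partitions as /dev/loop0p1, /dev/loop0p2, ...
sudo mount /dev/loop0p2 /mnt        # p2 is only an example; pick the partition that is still intact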
1 vote · 0 answers · 318 views
gzip -d introduces wrong ASCII characters
I am struggling with the following command: gzip -d /path/to/compressed.sql.gz
I always run it inside a container from the same Docker image. It works fine on my laptop as well as in our production ...
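One way to separate real corruption from a terminal or locale issue is to test the archive and compare a content hash between the environment that works and the one that does not. A small sketch, using the path from the question:
gzip -tv /path/to/compressed.sql.gz           # integrity test of the compressed data
zcat /path/to/compressed.sql.gz | sha256sum   # compare this hash on the laptop and in production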
0 votes · 1 answer · 457 views
How do I maintain integrity on zip and unzip?
I'm taking a dd of a drive and trying to share it with a friend. It's rather big after dd (~8 GB). When I tar -czvf my-image.img I can get it down to ~1.5 GB. However, if I try to test-unzip it, I see ...
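Checksumming the image before compression and again after extraction is a straightforward way to prove integrity across the transfer. A sketch, assuming a single-member archive named my-image.tar.gz (hypothetical name):
sha256sum my-image.img                    # record the hash of the raw image
tar -czf my-image.tar.gz my-image.img
tar -xzOf my-image.tar.gz | sha256sum     # stream the member back out; the two hashes should match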
0 votes · 0 answers · 1k views
Unable to decompress file with gzip
I have a file called myfile.dat.gz that I need to decompress. I have been trying gzip -d myfile.dat.gz, but this results in the error gzip: myfile.dat.gz: invalid compressed data--format violated.
...
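Before trying to recover anything, it is worth confirming that the file really is a damaged gzip stream rather than something mislabelled. A small sketch, using the file name from the question:
file myfile.dat.gz                 # check it is actually gzip data and not, say, plain text or a bare tarball
gzip -tv myfile.dat.gz             # integrity test; repeats the same error if the data is truly corrupt
zcat myfile.dat.gz > myfile.dat    # salvages everything up to the point of corruption before failing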
0 votes · 0 answers · 517 views
How could I use logrotate and group by day a bunch of gzip files with an uncompressed extension?
Doing a test in a production environment with log rotation set to 2 days, just for testing purposes, with gzip files. Created a test file at /tmp/.
/etc/init.d/test
/tmp/test/*.gz {
daily
...
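For comparison, a complete stanza in the same spirit might look like the following; the path and rotation count are only assumptions, and nocompress avoids re-gzipping files that already end in .gz:
/tmp/test/*.gz {
    daily
    rotate 2
    missingok
    notifempty
    nocompress
}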
1 vote · 1 answer · 4k views
How to split a large gzip file into smaller gzip files without uncompressing it first?
Is there a way to split a large gzip file into smaller gzip files without using zcat?
zcat large.bed.gz | split --numeric-suffixes - -b 7M --filter='gzip > $FILE.gz' large.part.
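If the pieces do not need to be independently decompressible, the raw .gz can also be cut byte-wise with split and reassembled later with cat, which avoids zcat entirely. A sketch using the same file name:
split -b 7M --numeric-suffixes large.bed.gz large.bed.gz.part.
cat large.bed.gz.part.* > rejoined.bed.gz    # byte-identical to the original archive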
-2 votes · 1 answer · 256 views
I want to tar.gz files from the month of January
-rw-r--r--. 1 root root 4788074 Jan 1 01:50 2020-01-01-00-50-18.done.csv
-rw-r--r--. 1 root root 4842864 Jan 1 02:50 2020-01-01-01-50-18.done.csv
-rw-r--r--. 1 root root 4923486 Jan 1 03:50 ...
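Given that naming scheme, a glob on the date prefix selects January's files directly. A minimal sketch (the archive name is arbitrary):
tar -czf january-2020.tar.gz 2020-01-*.done.csv
tar -tzf january-2020.tar.gz | head     # spot-check what went into the archive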
0 votes · 1 answer · 440 views
Perform operations on the output of find command with -exec options
I want to perform operations on the files found by the find command which already has a -exec option.
find . -type f -exec zgrep -li "4168781103" {} +
Output:
./results/...
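One common pattern is to let the find ... -exec pipeline produce the file list and feed it to a further command. A sketch, where /tmp/matches/ is a hypothetical destination:
find . -type f -exec zgrep -li "4168781103" {} + | while IFS= read -r f; do
    cp -- "$f" /tmp/matches/    # copy each matching file; replace cp with whatever operation is needed
done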
0 votes · 2 answers · 1k views
How to restore the original file.gz after running a gunzip -f file.gz?
The default behavior of gunzip is to delete the .gz file after it decompresses.
Is there a way to restore the original file from the decompressed version on macOS?
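The decompressed file still contains the same data, so re-compressing it rebuilds an equivalent archive, although the new .gz may not be byte-identical to the deleted one. A sketch, with file standing in for the decompressed name:
gzip file       # produces file.gz again and removes the uncompressed copy
gzip -k file    # or keep both; -k leaves the uncompressed file in place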
1 vote · 1 answer · 2k views
DD image smaller than drive copied
I took a backup of my server's OS disk through a live USB. Using the command:
dd if=/dev/sda bs=512 count=(30 gb worth of sectors) conv=noerror,sync status=progress | gzip -c > /path/to/removable/...
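The file on the removable drive is smaller both because dd only copied the number of sectors given by count and because gzip compresses the stream; the uncompressed size can be verified without writing it back out. A sketch, with sda.img.gz as a hypothetical name for the output file:
zcat sda.img.gz | wc -c    # the byte count should equal bs * count from the dd command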
1 vote · 1 answer · 840 views
Why does gzip -c give different results than gzip?
I am writing unit tests for some code and found that a simple gzip is causing a difference in my results. Upon further investigation I found that gzip gives a different .gz file than gzip -c does. Why ...
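gzip normally embeds the original file name and timestamp in the header, while gzip -c writes to stdout and has no name to embed, so the two outputs differ even for identical input. Passing -n suppresses both fields; a sketch, with file as a placeholder name:
gzip -nc file > a.gz    # compress to stdout without storing name or timestamp
gzip -n file            # compress in place, also without name or timestamp; produces file.gz
cmp a.gz file.gz        # with -n on both sides the outputs should now be identical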
0 votes · 0 answers · 404 views
How to view specific path inside tar.gz?
tar -tf file.tar.gz lets me view inside the whole archive. How do you view a specific path without viewing the whole thing? E.g., I want to view only /path/foo/
Update:
This command almost works ...
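Passing the member path to tar restricts the listing to that subtree; note that member names inside the archive usually have no leading slash, so path/foo rather than /path/foo/. A sketch using the names from the question:
tar -tzf file.tar.gz path/foo                     # lists that member and everything under it
tar -tzf file.tar.gz --wildcards 'path/foo/*'     # GNU tar, for shell-style pattern matching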
2 votes · 1 answer · 2k views
simple way to extract tar.gz in-place with no hard-drive overhead
I'm working with a tar.gz file (196 GB). This file needs to be extracted (240 GB) in place, and I would like the total disk usage not to go over 240 GB during the process (or as close to it as possible). ...