Compressing folder without using additional space on the drive
I have an 18GB EC2 instance that has run out of space. I want to zip files and transfer them off the server to a local drive, but only about 1% free space remains, so every attempt to compress anything on the drive fails with zip I/O error: No space left on device. Is there a way I can compress everything without using additional drive space?
compression zip io
You need some drive space to write the first compressed file(s), while you still have the original file(s). But you can write to another drive, for example a USB pendrive.
– sudodus
Jan 31 at 8:25
@sudodus Are you saying I can compress an EC2 folder to a flash drive connected to my Mac?
– electrophile
Jan 31 at 8:29
I have no Mac computer, so I don't know, but it should work on a PC. Depending on the size of the folder (to be compressed) and the size of the flash drive, you might succeed directly, or you may have to compress only part of the folder in a first step. Then, after checking that the compression was successful, you can remove the original files.
– sudodus
Jan 31 at 8:47
What operating system are you running (distro and version, for example Ubuntu 18.04.1 LTS)?
– sudodus
Jan 31 at 8:58
Ubuntu 16.04 LTS
– electrophile
Jan 31 at 9:01
asked Jan 31 at 8:16
electrophile
2 Answers
You can run a command over ssh that compresses to stdout, and redirect stdout locally to a file. Something like:
ssh user@host "tar c /mydir | gzip -f" > myarchive.tar.gz
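The same pattern can be tried locally by replacing the ssh hop with a plain pipe; the directory and file names below are made up for the demonstration:

```shell
# Create a sample directory to stand in for the remote /mydir
mkdir -p demo/sub
echo "hello" > demo/file1.txt
echo "world" > demo/sub/file2.txt

# Compress to stdout and redirect to a local archive,
# exactly as the ssh command does on the receiving end
tar c demo | gzip -f > myarchive.tar.gz

# Verify the archive lists the expected files
tar -tzf myarchive.tar.gz
```

Because tar and gzip both stream, nothing is written on the source side; the only new file is the archive at the destination, which is the point of the technique.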
The following tools are available in Ubuntu, and I have checked that rsync, gzip and tar are also available in MacOS.

- rsync, which can copy files and/or directory trees locally and via a network
- gzip, which compresses single files
- tar, which can create an archive with many files and directory trees, and compress it if you specify that

Change directory

Change directory with

cd path-to-source-directory

to the directory that you want to compress.
rsync

rsync is a powerful copy tool, and it has a built-in check that the transfer is correct.
- It can copy files and/or directory trees locally and via a network
- It is often used for backup
- locally to an external drive or
- via a network connection to a server or between servers
It is straightforward to use rsync if you have Ubuntu at both ends of the connection, and I checked that there is an rsync version also in MacOS.
I like the following command line, where

- -H takes hard links into account (and avoids double transfers/copies); if there are no hard links, you should remove this option
- -a 'archive' makes a copy that suits backup or synchronizing
- -v 'verbose' creates output of all files to be copied with -n, and all files copied in the real case (without -n)
- -n makes it a 'dry run', just showing what it 'wants to do'
rsync -Havn source/ target
In your case the source is on the server, and you run via the ssh connection. So, in the client (your Mac computer), run
rsync -avn user-id@ip-address:/path-to-source-directory/ path-to-target-directory
Please notice the trailing slash after the source directory.
If it looks good, you can let it do the transfer with the following command (remove the n for 'dry run'):
rsync -av user-id@ip-address:/path-to-source-directory/ path-to-target-directory
Tips and comments

- After the transfer, you can do what you want with the copy in the target directory. I think you want to compress it, and I suggest that you use tar for that purpose and create a tarball.
- If you cannot run rsync or gzip or tar in your MacOS, you can boot your Mac computer from a USB pendrive or DVD disk with Ubuntu, and run the programs that way. (The advice to boot the computer from a USB pendrive or DVD disk with Ubuntu applies also to a computer with Windows.)
- You can read the built-in manuals, man rsync, man gzip and man tar, in your Ubuntu 16.04 LTS, and I am sure that you can find good tutorials via the internet.
gzip

Change directory with

cd path-to-source-directory

to the directory that you want to compress.

Compress single files with

gzip -c file > path-to-external-directory/file.gz

Change directory to where you want to extract the file, and run gunzip to uncompress:

cd to-where-you-want-to-extract-the-files
gunzip -c path-to-external-directory/file.gz > file
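A quick round trip of the gzip -c / gunzip -c pattern above; the file and directory names here are invented for the demonstration:

```shell
# Stand-ins for the source file and the external drive's directory
mkdir -p external extracted
echo "some log data" > report.txt

# Compress to a different directory, leaving the original in place
gzip -c report.txt > external/report.txt.gz

# Extract elsewhere and confirm the content survived the round trip
gunzip -c external/report.txt.gz > extracted/report.txt
cmp report.txt extracted/report.txt
```

Note that -c writes to stdout instead of replacing the file in place, which is why the compressed copy can land on a different drive while the original stays put.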
tar

Change directory with

cd path-to-source-directory

to the directory that you want to compress.

Compress a group of files to a 'tarball', for example

tar -cvzf path-to-external-directory/file.tar.gz file1 file2 file3

or, if there is enough space in the target partition on the external drive for the whole directory,

tar -cvzf path-to-external-directory/file.tar.gz .

The space and final dot are important.

You can 'look into' the tar file with the command

tar -tvf path-to-external-directory/file.tar.gz

Extract the compressed files from the tarball with the following commands

cd to-where-you-want-to-extract-the-files
tar -xvf path-to-external-directory/file.tar.gz
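The three tar commands above can be walked through in one round trip; all directory names here are hypothetical stand-ins:

```shell
# Stand-ins for the source directory, the external drive, and the extract target
mkdir -p srcdir external unpacked
echo "a" > srcdir/file1
echo "b" > srcdir/file2

# Create the tarball on the "external" drive from inside the source directory
# (the final dot means "everything in the current directory")
(cd srcdir && tar -cvzf ../external/file.tar.gz .)

# Look into the tarball without extracting anything
tar -tvf external/file.tar.gz

# Extract it somewhere else
(cd unpacked && tar -xvf ../external/file.tar.gz)
```

GNU tar detects the gzip compression automatically when reading, which is why -t and -x work here without an explicit -z.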
My path-to-external-directory is on my local machine. The path-to-source-directory is on the server. With gzip, I would then run it from the server and specify the local machine path? How would gzip know where my local machine is?
– electrophile
Jan 31 at 13:55
1. Is there a USB port (that you can use) on the server? In that case you can do it in the server. 2. Can you communicate via ssh? In that case you can transfer the files (or whole directory trees) via rsync and later on compress locally in the Mac.
– sudodus
Jan 31 at 14:17
I don’t have physical access to the server. I always communicate via ssh. I’ll look up rsync. Thanks.
– electrophile
Jan 31 at 15:23
@electrophile, It's a good idea to learn about rsync :-) I use it a lot. It is a powerful copy tool that works both locally and via network connections. You can read the built-in manual man rsync in your Ubuntu 16.04 LTS, and I am sure that you can find good tutorials via the internet. -- I will add a paragraph about rsync in my answer.
– sudodus
Jan 31 at 15:28
answered Jan 31 at 8:49
Eelke
edited Feb 1 at 7:10
answered Jan 31 at 9:29
sudodus