Gentoo Wiki


This article is part of the Tips & Tricks series.


No Internet connection

Updating the portage tree

If Gentoo has just been installed on the machine (especially if it was a stage3 install, such as from a LiveCD), then first of all the Portage tree needs to be updated (and perhaps the package "portage" itself) before installing or updating any other package. To accomplish this:

  1. Download a recent, up-to-date Portage snapshot at a high-bandwidth location (work, university, internet café, etc.) and burn it to a CD (or copy it to any other portable storage device available).
  2. Back on your machine, rename your /usr/portage directory (to keep it as a backup in case something goes wrong). Remember, it is important NOT to have a /usr/portage directory (or to have only an empty one) before the next step: if the files in the tarball simply overwrite the old ones, Portage will refuse to work with some packages, complaining that "A file is not listed in the Manifest".
  3. Extract the new portage snapshot:
tar -xvjf /path/to/snapshot -C /usr/
  4. Copy the /distfiles and /packages directories (with all subdirectories, if they exist) from your old snapshot directory to the new location.
  5. Extract meta information from the new snapshot, and rebuild the ebuild dependency caches:
emerge --metadata
emerge --regen

Previous versions of this wiki page omitted the metadata step, but if you regenerate the ebuilds without emerging the metadata first, the system gets corrupted (very noticeable with GNOME) and the only way to recover it is to reinstall everything.

At the end of the first emerge command you may be informed that you should update Portage itself before any other package; how to do that is covered in the next section, but run the --regen step first. Note: it may take a lot of time!

After these steps are completed, you can update or install a new package.
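The backup, extract, and copy-back pattern from the steps above can be tried safely as a miniature dry run in a throwaway directory instead of /usr. All paths and file names below are illustrative, not a real Portage tree:

```shell
# Miniature dry run of the snapshot-refresh steps, using a temp
# directory in place of /usr.  The "snapshot" here is a stand-in.
root=$(mktemp -d)
mkdir -p "$root/usr/portage/distfiles"
echo old-distfile > "$root/usr/portage/distfiles/foo-1.0.tar.bz2"

# Build a stand-in "snapshot" tarball containing a fresh portage/ tree.
mkdir -p "$root/snap/portage/app-misc"
tar -cjf "$root/snapshot.tar.bz2" -C "$root/snap" portage

# Step 2: move the old tree aside so nothing is overwritten in place.
mv "$root/usr/portage" "$root/usr/portage.old"

# Step 3: unpack the snapshot into /usr (here: $root/usr).
tar -xjf "$root/snapshot.tar.bz2" -C "$root/usr"

# Step 4: carry distfiles/ (and packages/, if present) over from the backup.
cp -a "$root/usr/portage.old/distfiles" "$root/usr/portage/"
ls "$root/usr/portage"
```

On the real system these steps would of course run against /usr itself, followed by emerge --metadata and emerge --regen as described above.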

Getting the list of packages to download

For each package you wish to emerge run this:

emerge -fp package1 package2 | sort | uniq | sed '/\(^http\|^ftp\).*/!d;s/\ .*$//g' > links.txt

If you wish to update the whole system rather than just a package:

emerge -fpu world | sort | uniq | sed '/\(^http\|^ftp\).*/!d;s/\ .*$//g' > links.txt

Or, to cover only the base system set:

emerge -fpu system | sort | uniq | sed '/\(^http\|^ftp\).*/!d;s/\ .*$//g' > links.txt
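To see what the sed filter in these pipelines does, you can feed it some fake emerge -fp output (the URLs below are made up for illustration):

```shell
# Keep only lines starting with http/ftp, strip everything after the
# first space, and de-duplicate -- just like the pipelines above.
fake_output='These are the packages that would be fetched, in order:
http://distfiles.gentoo.org/distfiles/foo-1.0.tar.bz2 12,345 kB
http://distfiles.gentoo.org/distfiles/foo-1.0.tar.bz2 12,345 kB
ftp://mirror.example.org/gentoo/foo-1.0.tar.bz2 12,345 kB'
links=$(echo "$fake_output" | sort | uniq | sed '/\(^http\|^ftp\).*/!d;s/\ .*$//g')
echo "$links"   # the two unique URLs, one per line
```

Note that the \| alternation in the sed expression is a GNU extension, which is fine on a Gentoo system.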

If you have more than one computer you can combine the links from each computer and then pipe them to sort and uniq again so that packages required by more than one computer are only downloaded once.

cat computer1_links.txt computer2_links.txt computer3_links.txt | sort | uniq > combined_links.txt
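A quick way to convince yourself that the combination step de-duplicates correctly (the file names and URLs here are illustrative):

```shell
# Two machines that both need pkg2: after combining, each URL appears once.
dir=$(mktemp -d)
printf '%s\n' 'http://mirror/pkg1.tar.bz2' 'http://mirror/pkg2.tar.bz2' > "$dir/computer1_links.txt"
printf '%s\n' 'http://mirror/pkg2.tar.bz2' 'http://mirror/pkg3.tar.bz2' > "$dir/computer2_links.txt"
cat "$dir/computer1_links.txt" "$dir/computer2_links.txt" | sort | uniq > "$dir/combined_links.txt"
cat "$dir/combined_links.txt"   # three unique URLs
```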

Now the links.txt file contains the list of URLs for the packages to install, one per row (rather than all the alternative URLs for a file on the same row, a format wget cannot understand).

Download them on a machine with an Internet connection

Once you have the list of files to download, fetch them on a different machine. Of course, you can use any download manager or other program for this task (even simply pasting each line into a browser and letting it download the files), but it is simplest to use wget. A good version for MS Windows is included with UnxUtils (a port of the GNU tools to Windows).

With wget, just do:

wget -i links.txt -nc

Option -i tells wget to read links.txt for the URLs to download; option -nc tells it not to download a file again once it has been retrieved from a working URL.

If for some reason the download stops (e.g. your dialup connection fails), note which file wget was trying to grab, and delete every line referring to that file except one from links.txt. Place that single line in another file, let's call it failed.txt (remember, it must be only one URL, so if you know in advance which server is fastest, use its URL). Once you're back online, continue downloading this same file with:

wget -i failed.txt -c

And, when it finishes, continue with wget -i links.txt -nc. Option -c tells wget to resume downloading the file from the byte where the connection stopped; unfortunately it is incompatible with option -nc, because -nc does not check the final size of the file against what it should be.

Once you've grabbed all the files, burn them to a CD (or some other storage device) and proceed to the next section.

Installing the packages

Now put the downloaded files in the /usr/portage/distfiles directory. Then emerge them:

emerge package1 package2 [...]



Getdelta

Getdelta is a wget wrapper that lets you fetch only the changes to source packages, instead of the entire tarball, when emerging, by making use of the Dynamic Deltup network.


Code: Installation
# echo "app-portage/deltup ~x86" >> /etc/portage/package.keywords
# echo "dev-util/bdelta ~x86" >> /etc/portage/package.keywords
# echo "app-portage/getdelta ~x86" >> /etc/portage/package.keywords
# emerge getdelta

If you are on amd64 (e.g. an Athlon 64), replace ~x86 with ~amd64.


Please note that only one of the following options is required.

Using .bashrc

This allows you to make use of getdelta by running "gdemerge" instead of "emerge", while still being able to fall back easily to the default fetch command when the need arises.

File: /root/.bashrc


alias gdemerge='FETCHCOMMAND="/usr/bin/ \${URI}" emerge'
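The alias works because a one-shot assignment of the form VAR=value command sets the variable only for that single command; outside gdemerge, FETCHCOMMAND keeps its usual value. A small demonstration of the mechanism (the value "getdelta" is just a placeholder):

```shell
# The prefix assignment is visible inside the command it precedes...
during=$(FETCHCOMMAND="getdelta" sh -c 'echo "${FETCHCOMMAND:-unset}"')
# ...but does not leak into the surrounding shell.
after=$(sh -c 'echo "${FETCHCOMMAND:-unset}"')
echo "during: $during"
echo "after:  $after"
```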
Using make.conf

This option changes the fetch command so that plain "emerge" makes use of getdelta.

File: /etc/make.conf


FETCHCOMMAND="/usr/bin/ \${URI}"


emerge-delta-webrsync

emerge-delta-webrsync downloads only the changes between Portage snapshots, and syncs the tree from the new snapshot built in the process.


Code: Installation
# emerge emerge-delta-webrsync

Note that emerge-delta-webrsync works by pulling patches down; as such, its first run will require a full fetch. Subsequent runs, however, will not.

If you would like emerge-delta-webrsync to show you more information about the download while it is fetching the patches, you can do the following:

1. Open /usr/bin/emerge-delta-webrsync with a text editor as root.
# nano -w /usr/bin/emerge-delta-webrsync

2. Then find the following line:
and change it to:

Now emerge-delta-webrsync will show you more information about the download of the patches, such as percent done and estimated completion.

Bandwidth Limiting

A useful wget option for dialup users is --limit-rate. It limits the bandwidth wget uses when downloading packages, so you can emerge programs and go about your other online business without it interrupting you too much.

Obviously it will take longer to download everything, but this trade-off is worth it if you must update your system, check mail, browse the web, and so on, all at the same time over a slow connection. With this option you can start an emerge and just forget about it until it's done.

To make use of this (assuming you're downloading the files on the machine with the dialup connection, not on another machine with a faster one), just update /etc/make.conf to use the following FETCHCOMMAND and RESUMECOMMAND.

File: /etc/make.conf


FETCHCOMMAND="/usr/bin/wget --limit-rate=1.5k -t 5 --passive-ftp -P \${DISTDIR} \${URI}"
RESUMECOMMAND="/usr/bin/wget --limit-rate=1.5k -c -t 5 --passive-ftp -P \${DISTDIR} \${URI}"

Also, ask local providers for a dialup flatrate (if available in your area).

See also

No Internet Connection Forum topic
Dynamic Deltup

Last modified: Thu, 10 Jul 2008 12:48:00 +0000 Hits: 27,269