<HTML dir=ltr><HEAD><TITLE>Re: [Apt-zip-devel] Exporting USEMD5SUMS</TITLE>
<META http-equiv=Content-Type content="text/html; charset=unicode">
<META content="MSHTML 6.00.2900.3132" name=GENERATOR></HEAD>
<BODY>
<DIV id=idOWAReplyText44843 dir=ltr>
<DIV dir=ltr><FONT face=Tahoma size=2><B>From:</B> Giacomo A. Catenazzi<BR><B>Sent:</B> Tue 18/03/2008 13:44<BR><B>To:</B> apt-zip-devel</FONT></DIV>
<DIV dir=ltr><FONT face=Tahoma size=2><B>Subject:</B> Re: [Apt-zip-devel] Exporting USEMD5SUMS<BR></FONT></DIV></DIV>
<DIV>
<P><FONT size=2>PEDRO MACANAS VALVERDE wrote:<BR>><BR>>> To download, the server could generate an html page with http:// links<BR>> included to the mirrors with the packages<BR>><BR>>> So, downloading would be very easy: one could also use a spider for<BR>> automation.<BR><BR>>This is the key feature/problem.<BR></FONT><FONT size=2><BR>>I don't think we could give the user links to 50-100 packages<BR>and require that the user click every link.</FONT></P><FONT size=2></FONT></DIV>
<P><FONT size=2>I agree, although for a small list it could be a solution (4-5 packages to establish the Internet connection, with some driver and daemon packages, and later update/upgrade the whole distro). </FONT></P>
<P><FONT size=2>>But doing it in such a manner is very simple on the server side.<BR><BR>>So now I see two possibilities:<BR>>- generate one (or a few) tar files.</FONT></P>
<P><FONT size=2>This can be very slow to download if it includes a lot of packages. We could, in any case, use Metalink, which lets one pause and resume interrupted downloads (there are Metalink clients for both Windows and Linux): <A href="http://en.wikipedia.org/wiki/Metalink">http://en.wikipedia.org/wiki/Metalink</A> </FONT></P>
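<P><FONT size=2>As a rough sketch of how a client could consume such a list, here is a minimal Python example that extracts the mirror URLs from a simplified Metalink-style file. The XML layout, mirror hosts, and file name below are made up for illustration; the real Metalink format has more fields (hashes, sizes, etc.):</FONT></P>

```python
# Sketch: extract download URLs from a simplified Metalink-like file.
# The XML below is an illustration, not the full Metalink specification;
# mirror hosts and the .deb name are hypothetical.
import xml.etree.ElementTree as ET

SAMPLE = """<metalink>
  <files>
    <file name="hello_2.2-1_i386.deb">
      <resources>
        <url type="http">http://mirror1.example.org/debian/hello_2.2-1_i386.deb</url>
        <url type="http">http://mirror2.example.org/debian/hello_2.2-1_i386.deb</url>
      </resources>
    </file>
  </files>
</metalink>"""

def extract_urls(xml_text):
    """Return a mapping of file name -> list of mirror URLs."""
    root = ET.fromstring(xml_text)
    result = {}
    for f in root.iter("file"):
        result[f.get("name")] = [u.text for u in f.iter("url")]
    return result

urls = extract_urls(SAMPLE)
print(urls["hello_2.2-1_i386.deb"])
```

<P><FONT size=2>A downloader could then try the mirrors in order, resuming from the first one that answers.</FONT></P>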
<P><FONT size=2>>- javascript that will download automatically the files.</FONT></P>
<P><FONT size=2>It is important to include these pause and resume buttons/options. Perhaps the user can only download the packages over more than one session or day.</FONT></P>
<P><FONT size=2>>I don't like the "spiders" idea. It will mean that user need "wget",</FONT></P>
<P><FONT size=2>No, not necessarily; one can use spiders other than wget</FONT> ( <A href="http://en.wikipedia.org/wiki/Web_crawler">http://en.wikipedia.org/wiki/Web_crawler</A> , including plug-ins for Firefox: <A href="https://addons.mozilla.org/es-ES/firefox/addon/1616">https://addons.mozilla.org/es-ES/firefox/addon/1616</A> ). In any case, we can offer more than one solution.</P>
<P>The main problem with wget is that you cannot use it on Windows without big problems (you have to install Cygwin). But a compact spider program could solve the problem (e.g. a Metalink program, which can be easily installed on Windows and on Linux, because there are lots of them for both OSes).</P>
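<P><FONT size=2>The core of such a "compact spider" is small: it only has to collect the package links from the server-generated page. A minimal Python sketch, using only the standard library (the HTML snippet and mirror host are hypothetical):</FONT></P>

```python
# Sketch: collect the .deb links from a server-generated HTML page,
# without needing wget. The page content and mirror host are made up.
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    """Gather href values of <a> tags that point at .deb files."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value and value.endswith(".deb"):
                    self.links.append(value)

PAGE = """<html><body>
<a href="http://mirror.example.org/pool/main/h/hello/hello_2.2-1_i386.deb">hello</a>
<a href="http://mirror.example.org/pool/main/w/wget/wget_1.10-2_i386.deb">wget</a>
</body></html>"""

collector = LinkCollector()
collector.feed(PAGE)
print(collector.links)
```

<P><FONT size=2>The same loop could then fetch each URL with the platform's HTTP library, so the connected machine needs nothing beyond a stock Python (or an equivalent small program).</FONT></P>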
<P><FONT size=2>>but it will be difficult for Windows users, and we want to avoid<BR>extra tools on the connected machine.</FONT></P>
<P><FONT size=2>This is a good principle, and one can try it first. One could offer the spider as an option, in a second stage.</FONT><FONT size=2><BR><BR>>But probably this will be the first step of the prototype:<BR>build a list of packages to download.<BR></FONT><FONT size=2></FONT></P>
<P><FONT size=2>I totally agree. This would include matching against the packages already installed/downloaded in the user's dpkg system. </FONT></P>
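<P><FONT size=2>The matching step above could look like the following Python sketch. The package data is invented for the example, and the comparison is plain string equality; a real tool would read the output of <TT>dpkg-query -W</TT> and use proper dpkg version ordering:</FONT></P>

```python
# Sketch: decide which packages still need downloading by comparing the
# wanted list against what dpkg already has. The data is made up; a real
# tool would parse `dpkg-query -W` output and compare versions properly.
installed = {"hello": "2.2-1", "wget": "1.10-2"}
wanted = {"hello": "2.2-1", "wget": "1.11-1", "aria2": "0.12-1"}

def to_download(wanted, installed):
    """Return packages that are missing or whose version differs."""
    return sorted(
        pkg for pkg, ver in wanted.items()
        if installed.get(pkg) != ver
    )

print(to_download(wanted, installed))  # aria2 is missing, wget is outdated
```

<P><FONT size=2>The resulting list is exactly what the server would turn into the HTML page (or tar/Metalink file) of links.</FONT></P>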
<P><FONT size=2>>The second step will be to tar the files or to execute the javascript.<BR></FONT></P>
<P><FONT size=2>Which JavaScript functionality could be used to download the packages?</FONT></P>
<P><FONT size=2>Good work. </FONT></P>
<P><FONT size=2>Pedro.</FONT></P>
<P><BR></P>
<P><FONT size=2> </FONT></P></BODY></HTML>