Thursday, September 6, 2007

[Work] wget for downloading

An excerpt from the wget manual page:

7.1 Simple Usage

* Say you want to download a URL. Just type:

wget http://fly.srk.fer.hr/


* But what happens if the connection is slow and the file is large? The connection will probably fail, perhaps more than once, before the whole file is retrieved. In this case, Wget will keep trying until it either gets the whole file or exceeds the default number of retries (20). It is easy to raise the number of tries to 45, to ensure that the whole file arrives safely:

wget --tries=45 http://fly.srk.fer.hr/jpg/flyweb.jpg


* Now let's leave Wget working in the background and writing its progress to the log file log. It is tiring to type --tries, so we shall use -t.

wget -t 45 -o log http://fly.srk.fer.hr/jpg/flyweb.jpg &


The ampersand at the end of the line makes sure that Wget works in the background. To remove the limit on the number of retries, use -t inf (see the example after this list).
* The usage of FTP is just as simple; Wget takes care of the login and password:

wget ftp://gnjilux.srk.fer.hr/welcome.msg


* If you specify a directory, Wget will retrieve the directory listing, parse it, and convert it to HTML. Try this (the second command views the result in the links text browser):

wget ftp://ftp.gnu.org/pub/gnu/
links index.html
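
To illustrate -t inf from above (the file and log names are reused from the earlier example):

wget -t inf -o log http://fly.srk.fer.hr/jpg/flyweb.jpg &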



7.2 Advanced Usage

* Do you have a file that contains the URLs you want to download? Use the -i switch:

wget -i file


If you specify - as the file name, the URLs will be read from standard input.
* Create a mirror image of the GNU web site five levels deep, with the same directory structure as the original, with only one try per document, saving a log of the activity to gnulog:

wget -r -t1 http://www.gnu.org/ -o gnulog


* The same as above, but convert the links in the HTML files to point to local files, so you can view the documents offline:

wget --convert-links -r http://www.gnu.org/ -o gnulog


* Retrieve only one HTML page, but make sure that all the elements needed for the page to be displayed, such as inline images and external style sheets, are also downloaded. Also make sure the downloaded page references the downloaded links.

wget -p --convert-links http://www.server.com/dir/page.html


The HTML page will be saved to www.server.com/dir/page.html, and the images, stylesheets, etc., somewhere under www.server.com/, depending on where they were on the remote server.
* The same as the above, but without the www.server.com/ directory. In fact, I don't want to have all those random server directories anyway—just save all those files under a download/ subdirectory of the current directory.

wget -p --convert-links -nH -nd -Pdownload \
http://www.server.com/dir/page.html


* Retrieve the index.html of www.lycos.com, showing the original server headers:

wget -S http://www.lycos.com/


* Save the server headers with the file, perhaps for post-processing.

wget --save-headers http://www.lycos.com/
more index.html


* Retrieve the first two levels of wuarchive.wustl.edu, saving them to /tmp.

wget -r -l2 -P/tmp ftp://wuarchive.wustl.edu/


* You want to download all the GIFs from a directory on an HTTP server. You tried wget http://www.server.com/dir/*.gif, but that didn't work because HTTP retrieval does not support globbing. In that case, use:

wget -r -l1 --no-parent -A.gif http://www.server.com/dir/


More verbose, but the effect is the same. -r -l1 means to retrieve recursively (see Recursive Download) with a maximum depth of 1. --no-parent means that references to the parent directory are ignored (see Directory-Based Limits), and -A.gif means to download only the GIF files. -A "*.gif" would have worked too.
* Suppose you were in the middle of downloading when Wget was interrupted, and now you do not want to clobber the files already present. Use:

wget -nc -r http://www.gnu.org/


* If you want to supply your username and password for HTTP or FTP, use the appropriate URL syntax (see URL Format).

wget ftp://hniksic:mypassword@unix.server.com/.emacs


Note, however, that this usage is not advisable on multi-user systems because it reveals your password to anyone who looks at the output of ps.

* Would you like the output documents to go to standard output instead of to files?

wget -O - http://jagor.srce.hr/ http://www.srce.hr/


You can also combine the two options and make pipelines to retrieve the documents from remote hotlists:

wget -O - http://cool.list.com/ | wget --force-html -i -
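
A sketch combining several of the options above: read URLs from a file, retry without limit, work in the background, and log to a file (the file names are illustrative):

wget -t inf -i urls.txt -o fetch.log &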

[Work] command line parser and usage



#include <stdio.h>
#include <stdlib.h>
#include <string.h>

/* Parameters filled in by the parser; the types and default values here
   are placeholders, since the originals were declared elsewhere. */
double lw = 1.0;           /* line width */
double cl = 0.0;           /* low threshold of curvature */
double cu = 0.0;           /* up threshold of curvature */
int rs_iter_max = 100;     /* maximum iteration number of ridge searching */
double beta = 0.0;         /* the beta */
double d_max_search = 0.0; /* maximum distance threshold before ridge searching */

void usage(const char *myname)
{
    fprintf(stderr, "Usage: %s infile [options]\n", myname);
    fprintf(stderr, "Options:\n");
    fprintf(stderr, "  -help          this page\n");
    fprintf(stderr, "  -lw            line width\n");
    fprintf(stderr, "  -cl            low threshold of curvature\n");
    fprintf(stderr, "  -cu            up threshold of curvature\n");
    fprintf(stderr, "  -rs_iter_max   maximum iteration number of ridge searching\n");
    fprintf(stderr, "  -beta          the beta\n");
    fprintf(stderr, "  -d_max_search  the maximum distance threshold before ridge searching\n");
    exit(1);
}

/* Each option consumes one value; the i + 1 < argc checks prevent reading
   past the end of argv when the value is missing. */
void parse_cmd(int argc, char **argv)
{
    if (argc < 2)
        usage(argv[0]);
    for (int i = 1; i < argc; i++) {
        if (!strcmp(argv[i], "-help")) {
            usage(argv[0]);
        } else if (!strcmp(argv[i], "-lw") && i + 1 < argc) {
            lw = atof(argv[++i]);
        } else if (!strcmp(argv[i], "-cl") && i + 1 < argc) {
            cl = atof(argv[++i]);
        } else if (!strcmp(argv[i], "-cu") && i + 1 < argc) {
            cu = atof(argv[++i]);
        } else if (!strcmp(argv[i], "-rs_iter_max") && i + 1 < argc) {
            rs_iter_max = atoi(argv[++i]);
        } else if (!strcmp(argv[i], "-beta") && i + 1 < argc) {
            beta = atof(argv[++i]);
        } else if (!strcmp(argv[i], "-d_max_search") && i + 1 < argc) {
            d_max_search = atof(argv[++i]);
        }
        /* Any other argument (e.g. the infile) is left for the caller. */
    }
}
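
A typical invocation might look like this (the binary name, input file, and parameter values are all illustrative):

./detector input.dat -lw 1.5 -cl 0.1 -cu 0.9 -rs_iter_max 50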

Monday, September 3, 2007

[Work] SVD example



#include <tnt.h>
#include <jama_svd.h>

// Back substitution: given the SVD A = U*diag(w)*V^T, compute the
// (least-squares) solution x = V * diag(1/w) * U^T * b.
// Singular values that are exactly zero are skipped, which yields the
// minimum-norm solution for singular systems.
void svbksb(TNT::Array2D<float> &u, TNT::Array1D<float> &w,
            TNT::Array2D<float> &v, int m, int n,
            TNT::Array1D<float> &b, TNT::Array1D<float> &x)
{
    TNT::Array1D<float> tmp(n, 0.0f);

    // tmp = diag(1/w) * U^T * b
    for (int j = 0; j < n; j++) {
        float s = 0.0f;
        if (w[j] != 0.0f) {  // only nonzero singular values contribute
            for (int i = 0; i < m; i++)
                s += u[i][j] * b[i];
            s /= w[j];
        }
        tmp[j] = s;
    }

    // x = V * tmp
    for (int j = 0; j < n; j++) {
        float s = 0.0f;
        for (int jj = 0; jj < n; jj++)
            s += v[j][jj] * tmp[jj];
        x[j] = s;
    }
}

void svd_solve(TNT::Array2D<float> A, TNT::Array1D<float> b,
               TNT::Array1D<float> &sol)
{
    int m = A.dim1();
    int n = A.dim2();

    JAMA::SVD<float> svd(A);
    TNT::Array2D<float> u;
    svd.getU(u);
    TNT::Array1D<float> w;
    svd.getSingularValues(w);
    TNT::Array2D<float> v;
    svd.getV(v);

    svbksb(u, w, v, m, n, b, sol);
}

int main(int argc, char *argv[])
{
    int n, m;
    n = m = 8;

    TNT::Array2D<float> A(m, n, 0.0f);
    TNT::Array1D<float> b(m, 0.0f);
    TNT::Array1D<float> sol(n, 0.0f);

    // here fill in A and b

    svd_solve(A, b, sol);

    return 0;
}
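
One refinement worth noting: for matrices that are singular or nearly so, Numerical Recipes (quoted below) recommends zeroing the small singular values before back substitution. A sketch that could be inserted in svd_solve right before the svbksb call; the tolerance is an illustrative assumption:

// Zero singular values far smaller than the largest one, so that
// svbksb skips the corresponding (numerically null) directions.
float wmax = 0.0f;
for (int j = 0; j < n; j++)
    if (w[j] > wmax) wmax = w[j];
const float thresh = 1.0e-6f * wmax;  // tolerance: illustrative choice
for (int j = 0; j < n; j++)
    if (w[j] < thresh) w[j] = 0.0f;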

[Work] SVD

There exists a very powerful set of techniques for dealing with sets of equations
or matrices that are either singular or else numerically very close to singular. In many
cases where Gaussian elimination and LU decomposition fail to give satisfactory
results, this set of techniques, known as singular value decomposition, or SVD,
will diagnose for you precisely what the problem is. In some cases, SVD will
not only diagnose the problem, it will also solve it, in the sense of giving you a
useful numerical answer, although, as we shall see, not necessarily “the” answer
that you thought you should get.

SVD is also the method of choice for solving most linear least-squares problems.
We will outline the relevant theory in this section, but defer detailed discussion of
the use of SVD in this application to Chapter 15, whose subject is the parametric
modeling of data.

-- from Numerical Recipes
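
In symbols, matching the svbksb code in the earlier post, the decomposition and the least-squares solution are

A = U \,\mathrm{diag}(w_j)\, V^{\top}, \qquad x = V \,\mathrm{diag}(1/w_j)\, U^{\top} b,

where 1/w_j is taken to be 0 whenever w_j = 0; that convention is exactly the w[j] != 0 guard in svbksb.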

[Work] Carsten Steger's line detector

The old link to Steger's line detector [1] is no longer available, but you can get an adapted version from GRASP.

[1] Carsten Steger, "An Unbiased Detector of Curvilinear Structures," IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 20, no. 2, pp. 113-125, Feb. 1998.

Sunday, September 2, 2007

[Work] intrinsic shape depiction

I got an idea: depict intrinsic shape using active illumination.
The contribution should be big enough for a premier CG conference.

[Work] use emacs with ssh

By using
ssh -X contact.mpi-sb.mpg.de
instead of
ssh contact.mpi-sb.mpg.de
you can open remote documents in Emacs with its graphical interface; the -X flag enables X11 forwarding, so remote windows display on your local machine.
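
For example, once logged in with -X (the file name is illustrative):

emacs ~/notes.txt &

The Emacs window is tunneled over the SSH connection and appears on your local X display.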