SEQanswers
-   Bioinformatics
-   -   Run hundreds of BWA commands without waiting

CNVboy 06-14-2011 04:02 PM

Run hundreds of BWA commands without waiting
Hi guys
I'm analyzing some high-coverage trio data, so I need to run BWA on hundreds of fastq.gz files. Obviously I should write a script to finish such a task without waiting and typing in hundreds of commands one by one, but as a beginner without coding experience, I don't know how to do it.

For example, I just put
bwa aln -t 24 index file1 > 1.sam
bwa aln -t 24 index file2 > 2.sam
bwa aln -t 24 index file3 > 3.sam

into the script, and run it... and it doesn't work at all.
I know I must be missing something, say, the path to the fastq files.

Can anyone give an example of a script for executing multiple jobs like this? thx

fpepin 06-14-2011 05:14 PM

How about giving us the error message that you're getting? What if you just run one of them in isolation? It's hard to guess what the problem is as we have no idea what your environment and files are.

Kennels 06-14-2011 05:16 PM

Without having to learn perl or another language, the simplest, but probably very clunky, way of doing it is to use a shell script.

Create a text file (e.g. with gedit would be simplest), and give it a name ending in .sh (where .sh stands for shell).

In the file at the top, put in:

#!/bin/bash

echo `bwa aln -t 24 index file1 > 1.sam`
echo `bwa aln -t 24 index file2 > 2.sam`
echo `bwa aln -t 24 index file3 > 3.sam`


This will execute each line as a step-by-step command. Make sure the quotes are backquotes, the ones generated by the key with the tilde ('~') at the top left of your keyboard. Of course typing all the lines would be a chore, so you might want to use a spreadsheet to fill in the increments.
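Or, instead of a spreadsheet, you could let the shell generate the lines for you. Just a sketch, using the same placeholder "index" and fileN names as above; adjust the range to however many files you have:

```shell
# Print one bwa line per file number; redirect this into your .sh file.
for n in $(seq 1 3); do
    echo "bwa aln -t 24 index file$n > $n.sam"
done
```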

One last thing. After you've created this file, you need to give it permission to be executable. In the terminal, cd to where the file was created, and type chmod +x followed by the script's name:


chmod +x yourscript.sh
Then, to execute:


nohup ./yourscript.sh > log.txt 2>&1 &

You could also just execute it as ./yourscript.sh, but 'nohup' keeps the script running even if you log out, and the '> log.txt' redirection captures all its output, so if you run into any errors you can trace them back in the log file.

I understand your position so I've tried my best to help out here (not sure what the general rule is here about writing code for other people, but...). Try googling shell scripts, and I'm sure you'll pick it up quickly and improve on the above too.

CNVboy 06-14-2011 05:40 PM

Many many thanks Kennels!!!

Yeah, I just made the spreadsheet, actually with 800 fastq files, and the shell script works quite well!
btw, can you give any clue about writing Perl to finish such a task? I think I can read and understand some Perl scripts, but I just can't write them myself.

thx a lot!

Kennels 06-14-2011 06:13 PM

In perl, you can probably do it with 'for' or 'while' loops using counters, and use 'filehandles' for the input/output files, but I'm still a novice at best.
As there is no simple answer to your request other than to sit down and learn it from scratch, I would recommend buying 'Beginning Perl for Bioinformatics' by James Tisdall (O'Reilly books).

Even reading the first few chapters is enough to get you through many common tasks. It sure did for me, and I'm not even finished with the book.
Good luck!

fpepin 06-14-2011 06:40 PM

I hadn't understood that the question was about the script, not bwa itself.

I haven't read that book but I've heard good things about it.

You can do it in the shell fairly easily too:

for i in file*; do
    bwa aln -t 24 index $i > $i.sam
done

Another way that I like is:

ls file* | xargs -n1 -P24 -I{} sh -c 'bwa aln index {} > {}.sam'
xargs is nice in that it can run many jobs in parallel. For example, the line above would run 24 jobs with 1 thread each, as opposed to running each sequentially with 24 threads. It makes little difference here, but it comes in really handy when working with tools that are only single-threaded. (Note the redirection has to go inside the quoted sh -c command so that each job writes its own .sam file.)
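If you want to see what -n1 and -I{} are doing before pointing it at bwa, try it with echo first (harmless illustration, no real files needed):

```shell
# xargs builds one command per input line; with -P4 up to 4 run at once.
printf 'file1\nfile2\nfile3\n' | xargs -n1 -P4 -I{} echo "aligning {}"
```

With -P the output order isn't guaranteed, which is another reason to send each real job's output to its own file.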

Both of the examples above produce a longer name for the sam file. If you prefer your original naming scheme, you can use something like:

for i in file*.sam; do
    mv $i `echo $i | sed -e 's/file//'`
done
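The same rename works without sed, using the shell's own parameter expansion (just a sketch; ${i#file} strips the leading "file" from each name):

```shell
# Rename file1.sam -> 1.sam, file2.sam -> 2.sam, etc.
for i in file*.sam; do
    mv "$i" "${i#file}"
done
```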

That being said, learning a bit of perl will come in handy.
