Like executing a command using backticks or exec() or system().
Yes, this is bad practice in almost all cases! This is especially true if you have a choice.
Executing external commands is:
Very hard to do correctly (but very easy to get sort-of-working). You can't just escape the shell args and call it a day; you have to account for potentially multiple levels of escaping for potentially different languages (see the sketch after this list).
Slow. A fork+exec to e.g. rm is easily a thousand times slower than the corresponding syscall.
A rigid, error-prone and inexpressive integration point. You typically have to convert data to flat lists of strings and back. You can't use the language's features like exception handling, nested data structures or callbacks.
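A minimal Perl sketch of that first point (the file name is made up): interpolating into a single command string goes through the shell, while the list form of system does not.

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Hypothetical attacker-influenced file name.
my $file = q{important file; echo pwned};

# DANGEROUS: the string form hands the whole line to /bin/sh -c,
# so the "; echo pwned" part runs as a second command and the file
# name is split on the space as well.
# system("rm $file");

# Safer: the list form bypasses the shell, so $file reaches rm as a
# single argument, metacharacters and all ('--' guards leading dashes).
system('rm', '--', $file);
```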
Due to this, the following are BAD reasons to call external commands:
Not knowing how to do X in your language, but knowing a shell command for it. A typical example is cp -R foo bar.
Not knowing how something works, but knowing a shell one-liner that does it. A typical example is foo *.mp4 > >(tee file).
Not wanting to learn a new API for e.g. JSON or HTTP, and instead using shell tools like jq or curl (a native alternative is sketched below).
However, if you are calling a program that does non-trivial things, that doesn't have a native library or bindings, AND that you know how to invoke with execve semantics (NOT system nor Perl's exec semantics, which invoke a shell), this is a valuable tool.
Examples of good uses of executing external commands that follow all of the above are invoking make to build a project from an installer, or running java -jar ... to start a Minecraft server.
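In Perl, the multi-argument form of system gives you those shell-free semantics: the arguments go straight to the program without /bin/sh in between. A minimal sketch (the jar name and JVM options are hypothetical):

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Multi-argument system() skips the shell entirely, so spaces or
# metacharacters in @args cannot be reinterpreted by a shell.
my @args = ('-Xmx2G', '-jar', 'minecraft_server.jar', 'nogui');   # hypothetical options
my $status = system('java', @args);
die "java failed to start: $?\n" if $status != 0;
```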
Why not?
But be careful and escape all strings inserted into the command:
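One way to do that escaping is the CPAN module String::ShellQuote (not a core module); a minimal sketch with a made-up file name, though in Perl the list form of system is usually the simpler fix:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use String::ShellQuote qw(shell_quote);   # CPAN module, not core

my $path = q{weird name; $(reboot).txt};  # hypothetical untrusted input

# shell_quote() produces a Bourne-shell-safe quoting of each argument.
my $cmd = 'ls -l ' . shell_quote($path);
my @lines = qx($cmd);
print @lines;
```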
Some operating system commands might have built-in functionality not available in the language (Perl's mkdir() lacks *ix mkdir's -p). Then again, something might be easier to do using language constructs instead of parsing output (readdir() vs ls).
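Both sides of that trade-off have core-module answers; a minimal sketch using File::Path's make_path (the -p behaviour) and opendir/readdir instead of parsing ls, with a made-up directory:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use File::Path qw(make_path);

# make_path() creates intermediate directories, like `mkdir -p`.
make_path('/tmp/example/a/b/c');          # hypothetical path

# readdir() instead of parsing `ls` output.
opendir(my $dh, '/tmp/example') or die "opendir: $!";
my @entries = grep { $_ ne '.' && $_ ne '..' } readdir($dh);
closedir($dh);
print "$_\n" for @entries;
```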
And it is important to remember that something written in Perl might be more portable to non-Unix systems than calling OS-specific external programs.
It's a great* practice. Perl and PHP are great for lots of things, but they're not great at everything, and there's a use case for using external programs and other tools in your project. But one of the things that Perl is definitely great at is gluing together input and output formats and letting you mash together several different tools into a single project, letting each part of the project do what it does best.
* by which I mean, often a great practice. Things like @files=qx(ls $dir) and @txt=qx(cat $textfile) make all right-thinking Perl programmers cringe.
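The idiomatic replacements are about as short; a minimal sketch with made-up paths:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Instead of @files = qx(ls $dir): no shell, no word-splitting surprises.
# (glob splits its pattern on whitespace; fine for simple paths.)
my $dir   = '/tmp/example';               # hypothetical directory
my @files = glob("$dir/*");

# Instead of @txt = qx(cat $textfile): open the file directly.
my $textfile = "$dir/notes.txt";          # hypothetical file
open(my $fh, '<', $textfile) or die "Cannot open $textfile: $!";
my @txt = <$fh>;
close($fh);
```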