PHP lint checking multiple files in parallel

2 November, 2011 - 00:11

When programming, lint checking your source code can be useful to detect low-level issues such as syntax errors and undefined or unused variables. The PHP command-line interpreter has an option -l/--syntax-check to perform a syntax check of the provided PHP file instead of executing it:

$ php -l file_with_syntax_error.php 
PHP Parse error:  syntax error, unexpected T_VARIABLE in file_with_syntax_error.php on line 5
Errors parsing file_with_syntax_error.php

Unfortunately, it only allows checking one file at a time: only the first file argument is considered. For example, the following does not work as expected:

$ php -l file1.php file2.php file_with_syntax_error.php 
No syntax errors detected in file1.php

If you want to lint-check multiple PHP files (e.g. a complete source tree), you have to use some kind of loop (e.g. in Bash) to check them one by one.
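Such a loop could look like the following sketch (assuming Bash and a POSIX find; the directory and file pattern are just examples):

```shell
# Sequentially lint-check every .php file under the current directory.
# -print0 together with read -d '' keeps filenames with spaces intact.
find . -name '*.php' -print0 | while IFS= read -r -d '' f; do
    php -l "$f"
done
```

This checks the files one by one, so on a large tree it only uses a single core.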

I was recently faced with this issue, but I also wanted some parallelism, to speed up the process by putting multiple cores to work instead of checking one file at a time. After googling a bit I found the following (command-line) solutions.
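One common solution along these lines (a sketch, not necessarily the exact command I ended up with) combines find with the -P option of xargs, which runs up to the given number of processes in parallel:

```shell
# Lint-check all .php files with up to 4 php -l processes at once.
# -n1 passes one file per invocation, since php -l checks only one file.
find . -name '*.php' -print0 | xargs -0 -n1 -P4 php -l
```

As a bonus, xargs exits with status 123 if any of the invocations fails, which makes this usable as a pass/fail check in a build script.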

PHP is easy or not

11 March, 2008 - 18:11

PHP is easy, unless you're paying me, in which case it's very difficult.
-- Nick Lewis


Download offline version of dynamic pages with Wget

24 November, 2006 - 10:43

Reminder mainly to myself: a short list of useful wget options for recursive downloading of dynamic (PHP, ASP, ...) webpages (because wget's man page is too long):

  • --no-clobber: do not redownload pages that already exist locally.
  • --html-extension: append the extension .html to webpages whose URL does not end in .html or .htm, but in things like .php or .php?q=boo&bar=4.
  • --recursive: turn on recursive downloading.
  • --level=3: set the recursion depth.
  • --convert-links: make the links in downloaded documents point to local files if possible.
  • --page-requisites: download embedded images and stylesheets for each downloaded html document.
  • --relative: only follow relative links, not absolute links (even if in the same domain).
  • --no-parent: do not ascend to the parent directory of the given URL while recursively retrieving.
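Putting most of these together, a typical invocation might look like this (the URL is a placeholder; adjust the depth to taste):

```shell
# Grab an offline-browsable copy of a dynamic site, 3 levels deep,
# staying below the start URL and rewriting links to local files.
wget --no-clobber --html-extension --recursive --level=3 \
     --convert-links --page-requisites --no-parent \
     http://www.example.com/
```

I left out --relative here, since it also skips absolute links within the same site; add it if you want to be stricter.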