Hi all! I could really use some help with pcntl_fork to solve the problem below, as I'm not quite sure where to start. Any help would be appreciated.
Write a CLI script in PHP that meets the following usage specs:
hammer - recursively spiders a URL (or URLs) using concurrent processes.
Usage: hammer [options...] "url" [ url2 url3 url4 ... ]
-c/--concurrency "concurrent_child_count" Maximum number of children to spawn. Default: 1
-s/--seconds "seconds_to_run" Number of seconds to run. Default: 300
"url" URL at which to start spidering.
This must use PHP's pcntl_fork to fork its children, and it must clean up the child processes properly (even if manually sent a kill signal). If given a list of URLs, each child process must pick one of those URLs to start spidering from (choosing one at random is fine). No output or files need to be saved. The intention is to put heavy load on a website without targeting specific page URLs. You may use any tools at your disposal (wget, curl, whatever).
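To show where I've gotten so far, here is a rough, untested sketch of the fork/signal-handling skeleton I have in mind. It assumes PHP 7.1+ (for pcntl_async_signals() and the rest-index argument to getopt()) and that wget lives at /usr/bin/wget; the wget call is only a stand-in for the actual spidering. Am I on the right track?

#!/usr/bin/env php
<?php
// hammer - sketch: fork up to N children that each spider a random start
// URL, refill the pool as children exit, and tear everything down on
// SIGINT/SIGTERM or when the time limit runs out.

pcntl_async_signals(true); // PHP 7.1+; lets handlers fire without declare(ticks=1)

$opts = getopt('c:s:', ['concurrency:', 'seconds:'], $optind);
$concurrency = (int)($opts['c'] ?? $opts['concurrency'] ?? 1);
$seconds     = (int)($opts['s'] ?? $opts['seconds'] ?? 300);
$urls        = array_slice($argv, $optind);

if ($urls === []) {
    fwrite(STDERR, "Usage: hammer [-c children] [-s seconds] url [url2 ...]\n");
    exit(1);
}

$children = []; // pid => true for every live child

// Kill and reap all children, then exit; registered for SIGINT/SIGTERM so
// a manual kill of the parent still cleans up.
$shutdown = function () use (&$children) {
    foreach (array_keys($children) as $pid) {
        posix_kill($pid, SIGTERM);
    }
    while (pcntl_wait($status) > 0) {
        // reap until no children are left
    }
    exit(0);
};
pcntl_signal(SIGINT, $shutdown);
pcntl_signal(SIGTERM, $shutdown);

$deadline = time() + $seconds;

while (time() < $deadline) {
    // Top the pool up to the requested concurrency.
    while (count($children) < $concurrency) {
        $pid = pcntl_fork();
        if ($pid === -1) {
            fwrite(STDERR, "fork failed\n");
            break 2;
        }
        if ($pid === 0) {
            // Child: spider one randomly chosen start URL. wget's recursive
            // mode is a placeholder; --delete-after keeps no files around.
            $url = $urls[array_rand($urls)];
            pcntl_exec('/usr/bin/wget', ['-r', '-q', '--delete-after', $url]);
            exit(1); // only reached if exec itself failed
        }
        $children[$pid] = true;
    }

    // Block until some child exits, then drop it so the pool refills.
    $pid = pcntl_wait($status);
    if ($pid > 0) {
        unset($children[$pid]);
    }
}

$shutdown(); // time's up: kill whatever is still running

One thing I'm unsure about: since the parent blocks in pcntl_wait(), the deadline is only checked when a child exits, so long-running children could push it past the time limit. Would pcntl_alarm() with a SIGALRM handler be a cleaner way to enforce the -s limit?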