Short code snippets to save precious time

As a developer, I find myself constantly writing small code snippets to help me during the day.
Most of these snippets are meant to speed up testing, development, or even research work.

PHP – a 5-minute “proxy”
As someone who constantly deals with back-end APIs and integrations, I need something that can modify server responses on the fly, so I can test my code properly (e.g. timeouts).
Today, almost any large Linux distribution lets you install the well-known LAMP stack by selecting a few packages. Using this stack, we can write the following –

&lt;?php
// Build the target URL: 'q' holds the API path, everything else is passed through as query parameters.
$url = 'http://my-awesome.api.server.com/' . $_REQUEST['q'] . '?';
foreach ($_REQUEST as $k => $v) {
        if ($k != 'q') {
                $url .= $k . '=' . urlencode($v) . '&';
        }
}
// Drop the trailing '&' (or the '?' when there were no extra parameters).
$url = substr($url, 0, -1);

// Fetch the upstream response and return it as-is.
$data = file_get_contents($url);
header('Content-Type: application/json');
echo $data;

This short snippet does the very simple job of forwarding a GET request, along with its query parameters, to some server.
This is very helpful for intercepting the response ($data) and modifying it as we please, or for adding a short sleep() to see how our integration handles a server timeout (or, in some weird cases, to check that we actually display a loading screen).
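
For example, here is a minimal sketch of the kind of tampering I mean, assuming the upstream response is JSON (the 10-second delay and the modified_by_proxy field are made up for illustration) –

// ...same forwarding code as above, up to fetching the response...
$data = file_get_contents($url);

// Simulate a slow upstream server to exercise timeout handling on the client side.
sleep(10);

// Tamper with the response before returning it, e.g. inject a marker field.
$json = json_decode($data, true);
if (is_array($json)) {
        $json['modified_by_proxy'] = true; // hypothetical field, for illustration only
        $data = json_encode($json);
}

header('Content-Type: application/json');
echo $data;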

Quick and dirty bash scraper
One time I needed to filter some information out of a very large text-based structure (JSON/XML) and download each of the findings (pictures, in this instance). Being lazy, I decided to attack this with a single line of bash –

for sagi in $(curl -s "http://my-awesome.api.server.com" | python -m json.tool | grep -ioE 'picture/[a-z0-9._-]+\.jpe?g'); do wget "$sagi"; done

This script fetched the JSON, pretty-printed it, grepped out all the picture paths, and fed each match to the for loop, which then ran wget on it. This was way shorter and easier than copy-pasting each picture…
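
For anyone who prefers to stay in PHP rather than shell out, a rough equivalent might look like the sketch below. The items and picture field names (and the flat JSON structure) are assumptions for illustration, and allow_url_fopen must be enabled –

&lt;?php
// Fetch and decode the JSON listing (structure assumed for illustration).
$json = json_decode(file_get_contents('http://my-awesome.api.server.com'), true);

foreach ($json['items'] as $item) {        // 'items' is an assumed key
        $pictureUrl = $item['picture'];    // 'picture' is an assumed key
        // Download each picture into the current directory, keeping its file name.
        copy($pictureUrl, basename($pictureUrl));
}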

More to come…