Thursday, April 30, 2015

Proxy keeping you up at night when running local Selenium tests in Chrome?

A Tale of Pesky Proxies Running Selenium / Protractor UI Tests Locally Inside Chrome On a Mac

As a tester, I face all sorts of environmental idiosyncrasies when testing products across various platforms, not just in the way those platforms behave, but also in how they retain state over time.  (This is why Sauce Labs is great -- there is no leftover gunk from a prior state, since each test environment is a freshly-baked VM, thus eliminating the old excuse, "Well, it works on my machine, so why not yours?")

However, it's not always practical to jump right into using Sauce Labs, especially when trying to develop new tests.  When I attempted to do just that and run on my own local environment, Chrome on my Mac would keep asking me for proxy credentials because of various external assets the site attempts to load over HTTPS from content providers and API services.  No matter how I specified the proxy settings to the WebDriver or as command-line arguments to Chrome in the Protractor conf.js file, that pesky prompt would still ask me for them in person.  I even tried Chrome Canary, the early-release build of Chrome aimed at Web developers, yet that one asked for my proxy credentials even more.  My other testing counterparts run Windows and did not seem to have this problem, so it was up to me alone to solve it.  It became apparent that there would be no simple way to stop this stupid little box from coming up and interrupting all my automation, no matter which switches, system settings, or configuration parameters I set.

Finally, I decided to take a different approach: redirect the external servers that would trip proxy authentication to a Node server on localhost, which would then fetch the data through the proxy properly.  I had already written some small functions along these lines to make calls directly to our own application's internal API for testing and to access some of the external services it relies upon, so it was easy to build upon that initial work to get automated Selenium tests in Protractor finally running in the browser without any annoying prompts.

Note: This involves running things as root.  If you are unable or unwilling to do this, stay tuned for a future post on how to redirect your Web traffic in a slightly different manner than described below.

The essence of the solution is to run a Node.js server locally that listens on ports 80 & 443 for the usual Web traffic that your internal application (behind the proxy) would normally send externally to another site or service.  Now, however, you will intercept all these external requests and redirect them to localhost.  Oddly, while Chrome doesn't pay attention to your system proxy settings while you are running Selenium tests, Node.js will reliably fetch whatever data you want through the proxy.

1. Edit /etc/hosts

Add the server(s) that you need to connect to but which cause proxy problems, mapping them to localhost, such as:

127.0.0.1       localhost internal-name1 internal-name2

2. Force Chrome to look at /etc/hosts

There's another obscure issue in Chrome (assuming you're also using a Mac and facing the proxy popup issue): if Chrome is set up to use a Proxy Auto-Configuration (*.pac) file, whether through system settings or some sort of override, it will ignore the hosts file altogether.  (Gee, thanks, Chrome!)  Since our systems generally come pre-configured with a PAC file, and I didn't feel like modifying my whole system just to work around something that doesn't affect my "in-person" browsing, I had to make this addition to my conf.js file for Protractor:

chromeOptions: {
    'args': [
        // This switch makes Chrome bypass the PAC file (and any proxy)
        // entirely, so name resolution falls through to /etc/hosts and the
        // local Node server picks up the traffic instead:
        '--no-proxy-server'
    ]
}
3. Make the Node.js server with Express & Router

Unfortunately, this has to be run as root because it involves listening on ports 80 & 443.  Notice where it uses $HTTP_PROXY to pull in proxy info, and right below that where it expects a private key & certificate from OpenSSL.  (There are plenty of tutorials online on how to generate those.)  Then, notice the routes toward the bottom that redirect to certain API sites/endpoints depending on the path requested from localhost.  Here is pretty much the code you'll need, with the exception of your specific proxy credentials and the exact routes you need to define:

var fs = require('fs');
var express    = require('express');        // call express
var app        = express();                 // define our app using express
var bodyParser = require('body-parser');
var http = require("http");
var https = require("https");
var HttpsProxyAgent = require('https-proxy-agent');

var agent = new HttpsProxyAgent(process.env.HTTP_PROXY);

var privateKey  = fs.readFileSync('key.pem', 'utf8');
var certificate = fs.readFileSync('cert.pem', 'utf8');

var credentials = {key: privateKey, cert: certificate};

// configure app to use bodyParser()
// this will let us get the data from a POST
app.use(bodyParser.urlencoded({ extended: true }));

// Fetch the requested resource through the proxy, then relay the
// response body back to the original (local) requester
var makeWebRequest = function(settings, newRes) {
    var externalReq = https.request(settings, function (externalRes) {
        var body = '';
        externalRes.on('data', function(data) {
            body += data;
        });
        externalRes.on('end', function() {
            console.log("Finished with request to " + settings.path);
            newRes.send(body);
        });
    });
    externalReq.on('error', function(e) {
        console.log("\033[1;31mFAILED\033[0m to make the " + settings.method + " request to " + settings.path);
        newRes.sendStatus(502);
    });
    externalReq.end();
};
// =============================================================================
var router = express.Router();          // get an instance of the express Router

// Route your external API #1
router.get('/api/v1/endpoint1/*', function(originalReq, newRes) {
    var settings = {
        host: "",                       // the real external host for API #1
        port: 443,                      // standard HTTPS port
        path: originalReq.url,
        method: originalReq.method,
        agent: agent
    };
    makeWebRequest(settings, newRes);
});

// ***************************************************************
// Route your external API #2 (hopefully they don't use exactly the same endpoint paths)
// ***************************************************************
router.get('/another_api_service/endpoint2/*', function(originalReq, newRes) {
    var settings = {
        host: "",                       // the real external host for API #2
        port: 443,
        path: originalReq.url,
        method: originalReq.method,
        agent: agent
    };
    makeWebRequest(settings, newRes);
});

// REGISTER OUR ROUTES -------------------------------
app.use('/', router);

// START THE SERVER ----------------------------------
var httpServer = http.createServer(app);
var httpsServer = https.createServer(credentials, app);

httpServer.listen(80);
httpsServer.listen(443);

console.log('Magic happens on ports 80 & 443');

Bonus for Applitools Users

While the steps above pertain to simply getting Protractor and Selenium to work without the pesky proxy popup, let's just say there was a certain level of bullsnot I was willing to tolerate until I started using Applitools to bolster the set of automated tests in the arsenal.  For those of you who aren't aware, Applitools is a sophisticated product that compares your UI against mockups, previous versions, or various other types of baselines with varying degrees of granularity, all the way from "your browser isn't rendering the anti-aliasing on the text just like Photoshop did" up to "sure, all the content inside changed, but all your DIVs & big layout pieces stayed in the same place."  It's a flexible tool, but to get started with it, that stupid proxy prompt was wasting a lot of my time and getting in my way, so it needed to die in a blaze of ignominy, hence the hack described above.  However, there is one more step for you Applitools users to heed.

If your proxy server requires credentials such as a username & password, the https.request() function provided by Node.js' HTTPS module must be modified so that it uses the proxy when Applitools features are requested, particularly when saving test results (not just baselines).  For me, this meant using the https-proxy-agent Node module rather than tunnel, and making sure to include the username & password.  Copy the following code into a new file, and name it applitools-http-proxy.js:

var HttpsProxyAgent = require('https-proxy-agent');
var agent = new HttpsProxyAgent(process.env.HTTP_PROXY);
var https = require('https');
var __request = https.request;

https.request = function (options, callback) {

    // Only requests bound for Applitools get the proxy agent attached;
    // everything else passes through to the original function untouched
    if (( || '').indexOf('applitools') > -1) {
        options.agent = agent;
    }

    return __request(options, callback);
};

Then in your Applitools test, simply use this as your first line (assuming the file you just made above is in the same path as your test):

require('./applitools-http-proxy');


It'll be interesting to see how many people have run into this issue and find this solution to be useful.  Together, we will make test automation that runs without any annoyances!

Thursday, April 16, 2015

Saucin' Up Perl with Selenium WebDriver

My foray into the world of Sauce OnDemand, made public about a month ago in an earlier blog post, landed me a spot as Presenter for April's meeting of the DFW Perl Mongers club!

But wait, you wrote all that code in JavaScript!

Yes, that's true.  And, despite Sauce Labs not really acknowledging the availability of a WebDriver module for Perl, it does indeed exist, and I found it, and wrote some nice automation in Perl to demo to the small crowd.  I even did some extra stuff they weren't anticipating -- showing off how to test mobile apps with Perl too by using Appium + Sauce Labs in order to provide an environment where the same test code can be used to test both a native Android app and a native iOS app, assuming they both had identical resource names for the graphical elements.  Unfortunately, the two APKs I had easy access to both caused a Force Close once the Android emulator in Sauce Labs started them up.  There were also some tweaks, features, and expanded capabilities I wanted to do/demonstrate in Perl, but ran out of time to incorporate them.  Oh well, I got close. :-/

Can I see the presentation too?

Yes, of course!  I have checked the presentation materials & demonstrated source code into GitHub, and the recording of the presentation has been made available on YouTube (part 1 and part 2), thanks to all meetings also being Hangouts On Air.  Since the meetings get recorded anyway, and the Perl meetups are usually small, I didn't bother ballyhooing the event to the public a whole lot beforehand.  (That way, in case I bombed, I just wouldn't tout the recording a lot.  But since it went well, yes, I'm definitely getting the word out. ;)  Watch that GitHub repo for enhancements and updates.