Proxy keeping you up at night when running local Selenium tests in Chrome?

A Tale of Pesky Proxies Running Selenium / Protractor UI Tests Locally Inside Chrome On a Mac


As a tester, I face all sorts of environmental idiosyncrasies when testing products across various platforms, not just in the way those platforms behave, but also in how they retain state over time.  (This is why Sauce Labs is great -- there is no leftover gunk from a prior state, since each test environment is a freshly-baked VM, which eliminates the old excuse, "Well, it works on my machine, so why not yours?")

However, it's not always practical to jump right into using Sauce Labs, especially when trying to develop new tests.  When I tried to do just that and run on my own local environment, Chrome on my Mac kept asking me for proxy credentials because of the various external assets the site loads over HTTPS from content providers and API services.  No matter how I specified the proxy settings to the WebDriver or as command-line arguments to Chrome in the Protractor conf.js file, that pesky prompt would still appear.  I even tried Chrome Canary, the bleeding-edge Chrome release channel aimed at developers, yet that one asked for my proxy credentials even more often.  My other testing counterparts run Windows and did not seem to have this problem, so it was up to me alone to solve it.  It became apparent that there was no simple way to stop this stupid little box from coming up and interrupting all my automation, no matter which switches, system settings, or configuration parameters I set.

Finally, I decided to take a different approach and redirect the external servers that would trip proxy authentication to a Node server on localhost that fetches the data through the proxy properly.  I had already written some small functions along these lines to call our own application's internal API directly for testing and to access some of the external services it relies upon, so it was easy to build on that initial work and finally get automated Selenium tests in Protractor working in the browser without any annoying prompts.

Note: This involves running things as root.  If you are unable or unwilling to do this, stay tuned for a future post on how to redirect your Web traffic in a slightly different manner than described below.


The essence of the solution is to run a Node.js server locally that listens on ports 80 & 443 for the usual Web traffic that your internal application (behind the proxy) would normally send externally to another site or service.  Now, though, you intercept all of these external requests and redirect them to localhost.  While Chrome tends to ignore your system proxy settings while you are running Selenium tests, Node.js, armed with a proxy agent, will reliably fetch whatever data you want.

1. Edit /etc/hosts

Map onto 127.0.0.1 the hostnames that you need to connect to but that cause proxy problems, such as:

127.0.0.1       localhost internal-name1 internal-name2 https-api-service1.com https-api-service2.com

2. Force Chrome to look at /etc/hosts

There's another obscure issue in Chrome (assuming you're also using a Mac and facing the proxy popup issue) where if Chrome is set up to use a Proxy Autoconfiguration (*.pac) file, whether through system settings or some sort of override, it will ignore the hosts file altogether, on top of ignoring your proxy settings.  (Gee, thanks, Chrome!)  Since our systems generally come pre-configured with a PAC file and I didn't feel like modifying my whole system just to work around something that doesn't affect my "in-person" browsing, I had to make this addition to my conf.js file for Protractor:

chromeOptions: {
    ...
    'args': [
        "--no-proxy-server"
    ]
}

3. Make the Node.js server with Express & Router

Unfortunately, this has to be run as root because it listens on ports 80 & 443.  Notice where it uses $HTTP_PROXY to pull in proxy info, and right below that where it expects a private key & certificate from OpenSSL.  There are plenty of tutorials online on how to generate these.  Then, notice the routes toward the bottom that redirect to certain API sites/endpoints depending on the path requested from localhost.  Here is pretty much the code you'll need, with the exception of your specific proxy credentials and the exact routes you need to define:
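
If you don't already have a key & certificate on hand, one common OpenSSL incantation looks like this (the file names match what the server code below reads; the CN value is just a placeholder, since nothing here validates it):

```shell
# Generate a self-signed private key & certificate for the local
# HTTPS listener; key.pem and cert.pem are read by the server below.
openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
    -subj "/CN=localhost" \
    -keyout key.pem -out cert.pem
```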

var fs = require('fs');
var express    = require('express');        // call express
var app        = express();                 // define our app using express
var bodyParser = require('body-parser');
var http = require("http");
var https = require("https");
var HttpsProxyAgent = require('https-proxy-agent');

var agent = new HttpsProxyAgent(process.env.HTTP_PROXY);

var privateKey  = fs.readFileSync('key.pem', 'utf8');
var certificate = fs.readFileSync('cert.pem', 'utf8');

var credentials = {key: privateKey, cert: certificate};

// configure app to use bodyParser()
// this will let us get the data from a POST
app.use(bodyParser.urlencoded({ extended: true }));
app.use(bodyParser.json());

var makeWebRequest = function(settings, newRes) {
    var externalReq = https.request(settings, function (externalRes) {
        var body = '';
        externalRes.on('data', function(data) {
            body += data;
        });
        externalRes.on('end', function() {
            console.log("Finished with request to " + settings.host + settings.path);
            newRes.json(body);
        });
    });
    externalReq.on('error', function(e) {
        console.log("\x1b[1;31mFAILED\x1b[0m to make the " + settings.method + " request to " + settings.host + settings.path);
        newRes.sendStatus(502);  // answer the original caller so it isn't left hanging
    });
    externalReq.end();
};
// ROUTES FOR OUR API
// =============================================================================
var router = express.Router();          // get an instance of the express Router

//***************************************************************
//Route your external API #1
//***************************************************************
router.get('/api/v1/endpoint1/*', function(originalReq, newRes) {
    var settings = {
        host: "https-api-service1.com",
        port: 443,  // Express requests have no .port property; the external call is HTTPS
        path: originalReq.url,
        method: originalReq.method,
        agent: agent
    };
    makeWebRequest(settings, newRes);
});

// ***************************************************************
// Route your external API #2 (hopefully they don't use exactly the same endpoint paths)
// ***************************************************************
router.get('/another_api_service/endpoint2/*', function(originalReq, newRes) {
    var settings = {
        host: "https-api-service2.com",
        port: 443,  // as above, the external call is always HTTPS
        path: originalReq.url,
        method: originalReq.method,
        agent: agent
    };
    makeWebRequest(settings, newRes);
});

// REGISTER OUR ROUTES -------------------------------
app.use('/', router);

// START THE SERVER ----------------------------------
var httpServer = http.createServer(app);
var httpsServer = https.createServer(credentials, app);

httpServer.listen(80);
httpsServer.listen(443);
console.log('Magic happens on ports 80 & 443');



Bonus for Applitools Users


While the steps above pertain to simply getting Protractor and Selenium to work in general without the pesky proxy popup, let's just say there was a certain level of bullsnot I was willing to tolerate until I started using Applitools to bolster the set of automated tests in the arsenal.  For those of you who aren't aware, Applitools is a sophisticated product that compares your UI against mockups, previous versions, or various other types of baselines with varying degrees of granularity, all the way from "your browser isn't rendering the anti-aliasing on the text just like Photoshop did" up to "sure, all the content inside changed, but all your DIVs & big layout pieces stayed in the same place."  It's a flexible tool, but to get started with it, this stupid proxy prompt was wasting a lot of my time and getting in my way immensely, so it needed to die in a blaze of ignominy, hence the hack described above.  However, there is one more step for you Applitools users to heed.

If your proxy server requires credentials, such as http://user:password@proxy.com:port, the https.request() function provided by Node.js's HTTPS module must be patched so that it uses the proxy when Applitools endpoints are called, particularly when saving test results (not just baselines).  For me, this meant using the https-proxy-agent Node module rather than tunnel, and making sure to include the username & password.  Copy the following code into a new file, and name it applitools-http-proxy.js:

var HttpsProxyAgent = require('https-proxy-agent');
var agent = new HttpsProxyAgent(process.env.HTTP_PROXY);
var https = require('https');
var __request = https.request;

https.request = function (options, callback) {

    // options.host can be undefined when callers pass hostname instead
    if ((options.host || options.hostname || '').indexOf('applitools') > -1) {
        options.agent = agent;
    }

    return __request(options, callback);

};

Then in your Applitools test, simply use this as your first line (assuming the file you just made above is in the same path as your test):

require('./applitools-http-proxy');
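
If you want to convince yourself the patch logic does what you expect before wiring it into real tests, the host-matching decision can be exercised on its own.  This standalone sketch is not the Applitools SDK; the helper function, hostnames, and dummy agent string are all made up for illustration, and nothing touches the network:

```javascript
// Standalone sketch of the monkey-patch's host check: matching hosts get
// the agent attached, everything else is left alone. A dummy string
// stands in for the real HttpsProxyAgent instance.
var pickAgent = function (options, agent) {
    if ((options.host || '').indexOf('applitools') > -1) {
        options.agent = agent;
    }
    return options;
};

console.log(pickAgent({ host: 'eyes.applitools.com' }, 'dummy-agent').agent); // 'dummy-agent'
console.log(pickAgent({ host: 'example.com' }, 'dummy-agent').agent);         // undefined
```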

It'll be interesting to see how many people have run into this issue and find this solution to be useful.  Together, we will make test automation that runs without any annoyances!
