Thursday, April 26, 2018

Making a new Transaction Family on Hyperledger Sawtooth with Docker

Are you excited to see what all the buzz is about regarding Hyperledger Sawtooth?  Are you a fan of using Docker containers to conveniently test Sawtooth in an isolated and easily-deployable environment?  Then it probably won't be long before you want to run your own smart contract code beyond the basic examples provided to you.

At the time of this writing, the Docker containers for Sawtooth only provide examples of smart contracts written in Python 3 -- no JavaScript nor Go containers are available from Sawtooth Lake.  As for transaction processors, which are the entities that actually run the smart contract code, the only ones available as Docker containers from Sawtooth Lake are:
  • sawtooth-settings-tp
  • sawtooth-intkey-tp-python
  • sawtooth-xo-tp-python
"Settings" is required in any Sawtooth deployment.  "Intkey" is an extremely simplistic transaction processor that simply stores integers on the blockchain, and "xo" allows you to play tic-tac-toe on the blockchain.  However, these are enough to help you write and deploy your own transaction processor, as long as you're comfortable with Python.
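Before writing your own transaction processor, it helps to know how a family stores state: each family claims a 6-character hex prefix and keeps entries under 70-character hex addresses.  Here is a sketch of the xo family's addressing scheme as described in the Sawtooth docs (the helper names are mine):

```python
import hashlib

def xo_prefix():
    # First 6 hex characters of the SHA-512 hash of the family name
    return hashlib.sha512('xo'.encode('utf-8')).hexdigest()[:6]

def make_xo_address(game_name):
    # Family prefix plus the last 64 hex characters of the hashed game name
    return xo_prefix() + hashlib.sha512(game_name.encode('utf-8')).hexdigest()[-64:]

print(len(make_xo_address('firstgame')))  # every state address is 70 hex chars
```

Any family you write yourself follows the same pattern with its own family name.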

Installing and Testing Hyperledger Sawtooth on Docker

Follow the instructions in the Sawtooth documentation about downloading the default YAML file, sawtooth-default.yaml.  Then run docker-compose -f sawtooth-default.yaml up.  Setting up a computer program doesn't get much simpler than that!  (Unless you're behind a corporate proxy, like me, but hopefully you can finagle your way around it.)

At this point, you can play with a fully-functioning Sawtooth system.  The first thing I wanted to try was the XO transaction family, since I thought it would be fun to play a game with it.  Note that to use the xo client, you must log into the sawtooth-shell-default Docker container:

docker exec -it sawtooth-shell-default bash 

However, there are some deficiencies in their quick & dirty example:
  • The game is instantiated by a creator, but the creator can get locked out of their own game if two other players quickly jump in
  • It doesn’t check to see if Player 1 == Player 2, thus a player can play against themselves
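The second deficiency comes down to a missing guard.  A sketch of the kind of check the take logic could perform (a hypothetical helper, not the actual xo handler code):

```python
def validate_players(player1, player2):
    """Guard missing from the xo example: reject a game where one
    signing key occupies both player seats."""
    if player1 and player2 and player1 == player2:
        return 'Invalid action: Player 1 and Player 2 must be different keys'
    return None  # no objection; the move may proceed

# One key trying to fill both seats gets rejected:
print(validate_players('02abc...', '02abc...'))
```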
Also, as I was looking to test the mechanism around sending events, I needed the ability to modify the code running in the Docker container.  However, running docker-compose down & up would nuke the changes to my container’s filesystem.

Why? docker-compose up is like docker run: it creates each container fresh from its image state.  With docker exec, you are running commands in an existing container that is accumulating uncommitted changes to its filesystem.  You can save these with docker commit.  Also, note that docker-compose down removes the created container instances, meaning that docker-compose up remakes everything from scratch.

Make a copy of the XO container for modifications

The docker commit command takes two parameters: the name of the running container, and the name of the desired image output.  You’re welcome to make changes to the XO transaction processor before running docker commit, as once you run docker-compose down & up, they’ll be nuked from the original image anyway.  But just to start with a clean template, I ran commit on an unmodified XO transaction processor container.

The command I used looked like this:

docker commit sawtooth-xo-tp-python-default hyperledger/sawtooth-xo-tp-python-mod:0.1

Now if you look at docker images, you’ll see your new image that matches the format of the original Sawtooth images.

There are two things left to do:
  1. incorporate your new image into the docker-compose file so it will be brought up with the rest of the network upon your command, and
  2. incorporate an external volume so you don’t have to continue writing docker commit each time you make a tiny change to your files.

Instantiate the image as a container with docker-compose

To do this, I simply duplicated the xo-tp-python section in the YAML file, and tweaked a couple things, as such:

    image: hyperledger/sawtooth-xo-tp-python-mod:0.1
    container_name: sawtooth-xo-tp-python-with-events
    depends_on:
      - validator
    entrypoint: xo-tp-python -vv -C tcp://validator:4004

Adding an external volume

The things that make the transaction processor tick should live within a volume you attach to your Docker container so that you can modify it at will from within the host (if you like GUI text editors) or from within the container (if you like masochism and wasting time).  If you don’t do this, you will lose all your changes, or you will have to run docker commit each time in order to save them into the image.

To do this, make a directory on your system where your volume will live.  I called mine:

Source/Sawtooth/sawtooth-xo-tp-python-with-events

Note I named it after my container so it would be easy to recall what the directory is for later.  Then, add it as a volume to your YAML file:

    image: hyperledger/sawtooth-xo-tp-python-mod:0.1
    container_name: sawtooth-xo-tp-python-with-events
      - "Source/Sawtooth/sawtooth-xo-tp-python-with-events:/root/tp-store"
      - validator
    entrypoint: xo-tp-python -vv -C tcp://validator:4004

At this point, consider commenting out the original XO transaction processor service in the YAML file to positively test your new service and avoid any interference from the original service.

Now the directory I created on the host will reside on /root/tp-store in the container.  Through the container, move all the guts of the XO transaction processor into the tp-store directory.  You should consider finding the root directory of all the code for the transaction processor so that you can copy its contents into tp-store and then delete that root (for my containers, this is shown below).  Then, save the state of the Docker container once again by running docker commit on your container.  Bring down your Sawtooth instances, and then modify your YAML file one last time to mount your host’s directory containing the transaction processor into the directory where the container will expect it once you bring the system back up. Finally, run docker-compose up and await the moment of truth!

NOTA BENE:  If you get an error as follows:

ERROR: repository hyperledger/sawtooth-xo-tp-python-mod not found: does not exist or no pull access

Or even one like this (because you’re behind a proxy that blocks Docker):

ERROR: Service 'xo-tp-with-events-python' failed to build: Get net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)

Make sure you have your repository name and tag typed in exactly as you specified them in the docker commit command!  This eluded me for a while, as I had typed “1.0” in the YAML file rather than 0.1.

Anyway, a quick search led me to the Python modules existing at /usr/lib/python3/dist-packages/sawtooth_xo .  I took this folder, moved its contents to /root/tp-store, and then committed the image before writing docker-compose down.

tp-mod$ cd /root/tp-store/
tp-mod$ mv /usr/lib/python3/dist-packages/sawtooth_xo/* .
tp-mod$ rmdir /usr/lib/python3/dist-packages/sawtooth_xo

host$ docker commit sawtooth-xo-tp-python-with-events hyperledger/sawtooth-xo-tp-python-mod:0.1
host$ <Ctrl+C on the process running docker-compose up>
host$ docker-compose -f sawtooth-default.yaml down

Now, don’t forget to update the path where the Python modules need to be remounted to in order for the application to run correctly:

      - "Source/Sawtooth/sawtooth-xo-tp-python-with-events:/usr/lib/python3/dist-packages/sawtooth_xo"
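Pieced together, the whole modified service stanza in sawtooth-default.yaml now looks roughly like this (the service key name matches the one in the proxy error message earlier; double-check the details against your own file):

```yaml
  xo-tp-with-events-python:
    image: hyperledger/sawtooth-xo-tp-python-mod:0.1
    container_name: sawtooth-xo-tp-python-with-events
      - "Source/Sawtooth/sawtooth-xo-tp-python-with-events:/usr/lib/python3/dist-packages/sawtooth_xo"
      - validator
    entrypoint: xo-tp-python -vv -C tcp://validator:4004
```

With this in place, every edit under the host directory is immediately visible inside the container.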

While you’re at it, consider making some modifications to the transaction language in order to see your changes executed, such as replacing “X” and “O” with “A” and “B”, before starting the system.

Finally, run docker-compose -f sawtooth-default.yaml up to restart your Sawtooth environment.  To test out the game quickly, enter sawtooth-shell-default with docker exec, then write:

sawtooth keygen Alice
sawtooth keygen Bob
xo -v create firstgame --username Alice --url='http://rest-api:8008'
# (Note: It’s very important not to have the trailing slash after the port number in the URL, or else the command will fail.)
xo take firstgame 5 --url='http://rest-api:8008' --username Alice
xo take firstgame 4 --url='http://rest-api:8008' --username Bob
# Etc…
xo list --url='http://rest-api:8008'

Once you run xo list, you will see the state of the board reflect your desired player symbols rather than the standard “X” and “O” if you changed them.

Deploying Live Changes to the Transaction Processor

Unfortunately, it seems as though there is no easy way to roll out changes to your transaction processor code.  The servers must be cycled with docker-compose down & up in order to load any new code.  This means it'll be tedious to debug any code relying on a particular state in the blockchain that takes a long time to set up.

Attempt #1: I tried starting up another instance of the xo-tp-python process, which showed the results locally; however, transactions on the chain showed as PENDING rather than COMPLETED.

Attempt #2: I tried writing “kill -INT 1” in order to restart the entry point process, but this simply stopped the Docker container altogether.  Upon restarting it, transactions seemed to no longer be validated at all, so I cycled docker-compose.

Attempt #3: I tried writing a Bash script that would let me send it a signal, which would run a function to stop and restart the transaction processor.  However, in experimenting with this, I was unsatisfied with Bash's interrupt handling and process management, as kill would spuriously fail to find the process ID of the process I spawned through the shell script.  Furthermore, it would only tend to work reliably after I sent a signal to the script first, and then to the spawned process, which is quite a hassle.

Attempt #4: Having modified the entrypoint setting in the YAML file not to run xo-tp-python directly, but instead to call it through a separate script, at least xo-tp-python was not running as PID #1 anymore.  This means I could safely send signals to it without stopping the container.  But, since the Bash script didn't do what I had hoped, I wrote the following Python code:

import signal
import subprocess
import sys

cmd = "xo-tp-python -vv -C tcp://validator:4004"
killCmd = ["pkill", "xo-tp-python"]

def interrupt_handler(sig, frame):
    # SIGINT: kill the running transaction processor and start a fresh one
    global p, killCmd)
    p = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)

def kill_handler(sig, frame):
    # SIGTERM: shut the transaction processor down and exit the script, killCmd)

signal.signal(signal.SIGINT, interrupt_handler)
signal.signal(signal.SIGTERM, kill_handler)

p = subprocess.Popen(cmd, shell=True, stdout=subprocess.PIPE, stderr=subprocess.STDOUT)

while True:
    signal.pause()  # block until a signal arrives; the handlers do the rest

Basically, this script restarts a process upon receiving a signal interrupt.  I experimented with it on my host machine by using the ping command rather than xo-tp-python and it worked to my satisfaction.  After installing this into my Docker image and setting up the following:

    entrypoint: /usr/bin/python3

The script (shown above) handles interrupts in order to manage the state of the transaction processor (so that the script doesn't simply die upon a signal, and so that the whole container won't come down just because the transaction processor needs to restart).  Upon configuring the YAML file as above, I was able to send signals to it and see that it restarted the transaction processor, but the main console output shown by docker-compose up revealed that in fact no transactions were processed once I killed the initial transaction processor.  As such, more research is required on this front.

NOTA BENE: When you type “exit” to leave your Docker container, the commands you ran in its terminal get written to the shell history, and so they will be baked into the image if you then run “docker commit”.  If you don’t exit cleanly, those commands never make it into the history.  You can leverage this to save useful commands into the image, but beware that sensitive information typed into the terminal can get stored the same way.

Epilogue & Sources

If I end up doing anything with sending Sawtooth events in a transaction and then subscribing to them with a separate state delta subscriber, I might write about it here.  However, I was too focused on trying to figure out how to deploy changes live, and didn't get any time to research eventing.  If you know a way to refresh a transaction processor immediately, let me know in the comments!

Thursday, March 29, 2018

Reinventing Insurance Through Ethereum and Solidity

Below is code for a very simple smart contract that would implement about 40% of a decentralized insurance policy.  The remainder of the implementation would require a user interface to allow people to subscribe, pay premiums at certain time intervals, make claims, and (especially important for decentralization) verify the claims of others.

But Don't Tell Your Insurance Agent To Go Suck It Just Yet

I've been following matters of cryptocurrency since early 2013, back when Bitcoin was a mere $73.  At that point, I wasn't willing to bet the farm, but the idea of distributed anonymous (but yet fully transparent and public) transaction ledgers for the sake of auditability and traceability was alluring, especially for the sake of tracking physical assets and eliminating other big giant institutions such as title companies.  I ended up placing in a company hackathon in 2015 (and then getting invited to a subsequent, and very exclusive, company hackathon to further refine the idea before many executives) by implementing such an idea by using colored coins, which at the time, was the state-of-the-art way to use cryptocurrency to transcend simply monetary transactions and actually track "assets" with other meanings besides their value in Bitcoin.

At this point, there were rumblings of a magical new cryptocurrency called Ethereum that would revolutionize the notion of adding metadata to transactions, and could in fact be used to implement entire systems (called smart contracts) that could run themselves in order to perform computation to facilitate deciding whether or not to perform a transaction.  At first, I thought it was too good to be true, but later on in 2016, I found myself going to local Ethereum user group meetings where folks were diving into smart contracts and discussing ways to turn entire industries upside-down by cutting out all sorts of middlemen, such as our friendly but hapless insurance agent mentioned earlier.

I tried throughout the rest of 2016 to get my hands on some real live Ether through a faucet so I could start playing with making my own real smart contracts and do some real live experiments.  Little did I know that this was not required, because browser-based clients such as Remix allow you to write smart contracts without needing so much as a testnet, but my fruitless efforts fraught with frustration over none of these faucets actually depositing ether into any of my wallets led to me abandoning Ethereum to go catch up on the evolution in machine learning that has happened since I got out of college instead.

Anyway, the chance to get in on Ethereum at $7-20 per ether is probably long gone.  However, the amazing possibilities to disrupt many different industries, as highlighted by the surge in ICOs (initial coin offerings) in late 2017, has got me and many of my peers finally getting moving on related development efforts.  And so, hot off the press, I bring you this very basic implementation of an insurance policy in Ethereum.

Do As I Say, Not As I Do

The operating principle of this contract has been highly simplified from what would really be needed in an actual insurance policy, and this was written in about an hour while I had no previous experience with Solidity, so it's important to know it might not be well-written code executing a non-well-written insurance policy.  Nevertheless, it will help illustrate some of the key concepts in coding.

The insurance policy starts life when one user instantiates the contract.  Upon instantiation, nothing happens other than the formation of the contract account.  Once the contract account is formed, users of externally-owned accounts (EOAs) -- i.e. you & me, who have wallets and control them with private keys -- can send ether to this contract account in order to subscribe to the insurance policy.  You can continue sending ether to it as long as you like, and you can keep track of how much ether you have sent.

When it comes time to make a claim, it is simple to do so -- just call the claim() function on the contract.  This allows other members of the insurance pool to verify your claim (hypothetically) and, if they approve, they can call the approveClaim() function.  Once two other members of the insurance pool have done this, and once you have paid in the minimum amount, then the contract will automatically credit your account with ether in the amount of payoutAmt.
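The payout rule boils down to a small predicate; modeled in Python for clarity (a sketch of the condition, not part of the contract):

```python
def can_pay_out(votes, amt_paid, payout_amt):
    # Mirrors the contract's payOut check: more than one approval,
    # and the claimant has paid in at least the payout amount
    return len(votes) > 1 and amt_paid >= payout_amt

print(can_pay_out(['0xAlice', '0xBob'], 12, 10))  # True: two votes, enough paid in
print(can_pay_out(['0xAlice'], 12, 10))           # False: only one approval
```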

Some things to note:

  • The contract must have a payable function in order to be able to accept ether.  To keep track of what was paid by whom, use the instance variables msg.sender and msg.value.
  • There is not really a good notion of null in Solidity.  Where there is no defined Subscriber (i.e. subscribers[address] is undefined), it tends to be just as valid to continue using the reference (whereas you would get an exception in basically any other language), but once you read the values of one of its fields, they read as 0.  This is why, if you plan to have values that could legitimately be 0, you might want to add another variable to indicate that an object in a mapping has been initialized.  For my particular case, I could assume that a user with an amtPaid (amount paid in) of 0 probably didn't exist, and even if they did, it wouldn't really matter if they initialized another one properly and went forward using that one.

pragma solidity ^0.4.0;

contract Insurance {

    uint totalPot;
    address myContract;

    struct Subscriber {
        uint payoutAmt;
        uint amtPaid;
        bool hasClaim;
        address[] votes;

    mapping(address => Subscriber) subscribers;

    function Insurance() public {
        // Nothing happens on instantiation beyond forming the contract account

    function claim() public {
        Subscriber storage sender = subscribers[msg.sender];
        sender.hasClaim = true;

    function viewClaim() public constant returns (uint signedContracts) {
        Subscriber storage sender = subscribers[msg.sender];
        if (!sender.hasClaim) return 99999999;  // only because -1 wouldn't work
        return sender.votes.length;

    function approveClaim(address claimantAddr) public {
        if (claimantAddr == msg.sender) return;  // no approving your own claim
        Subscriber storage claimant = subscribers[claimantAddr];
        if (!claimant.hasClaim) return;
        if (subscribers[msg.sender].amtPaid == 0) return;  // only members may vote
        // Remember payOut handles all testing to see if conditions are met

    // Note that paying in automatically subscribes you to the policy if you're not already joined
    function payIn() public payable {
        if (subscribers[msg.sender].amtPaid == 0) {
            subscribers[msg.sender] = Subscriber({payoutAmt: 10 ether, amtPaid: 0, hasClaim: false, votes: new address[](0)});
        Subscriber storage sender = subscribers[msg.sender];
        sender.amtPaid += msg.value;
        if (sender.hasClaim) {

    function viewPaidIn() public constant returns (uint amtPaid) {
        Subscriber storage sender = subscribers[msg.sender];
        amtPaid = sender.amtPaid;

    function payOut(address claimantAddr) private {
        Subscriber storage claimant = subscribers[claimantAddr];
        if (claimant.votes.length > 1 && claimant.amtPaid >= claimant.payoutAmt) {
            // Reset the claim state first, then credit the claimant
            claimant.votes = new address[](0);
            claimant.hasClaim = false;

What next?

This, being only 64 lines of code, written in about an hour by someone skilled at programming but with no previous Solidity experience, leads me to believe you could have a bunch of clever programmers hypothesizing over lunch about how to disrupt any given industry as a warmup for the work that actually puts food on the table, and then go on and implement it in the time between dinner and bedtime.  Sure, there are a lot of details that need to go into this to really make it polished, especially a long tail of testing to make sure the contract is bulletproof.  But if it's so easy to write an insurance policy, just imagine what else can be done with Ethereum just as quickly!

Thursday, February 1, 2018

How Old Am I, According to Spotify

I'll explain this down below.

Spotify facilitates very convenient access to lots of great music on many different types of devices.  However, being a standalone application that manages its own music in its own way means you can't interact with the songs as you could when they were in MP3 format or on vinyl.  For developers, the Spotify APIs can change the game.

A Brief History of My Music Listening

I'm a bit of an odd bird in my musical tastes, which were influenced by my mom as we listened to a lot of oldies stations driving around to various events and activities.  Where most oldies stations typically play music that is 30-40 years old, one station in Dallas (KAAM 770) plays music that is around 50 years old.  Thus, back in the 1990s, they were playing lots of big band and swing standards.  (Now they have caught up to where traditional oldies stations were back then, but playing music more along the lines of The Carpenters rather than classic rock.)  I developed an affinity for Nat King Cole, and was soon introduced by family to the music of Frank Sinatra.  Then, playing in various school bands, I took a liking to orchestral/instrumental scores with rich musical and technical content, including game show music, as I was becoming fascinated with all aspects of game show production.  Fast forward to 2007, when I saw The Jersey Boys in concert, and then Frankie Valli in person shortly thereafter, and I have been seeking my fix of 60s & 70s pop ever since.

It fits, along with my game show collection and seeking exact airdates for all my episodes, that I would like to know when all the music in my collection was recorded, and by whom exactly.

Curating a Collection, and Odd Ways To Listen

In the days of file sharing, I would come across MP3 files that often had incomplete or incorrect metadata as to their origins, especially the year but sometimes even the artist, often attributing the work of a much lesser-known artist, but sharing a similar style, to the greater-known artist.  I would do research on these songs and correct the information in programs such as JRiver Media Center that not only offered a powerful suite of music and playlist management capabilities, but also the best in digital signal processing to make your music really shine on your speakers.  Sometimes, to heighten the emotional content for me, I would pitch-bend the song, usually 1/4 to 1/2 step flat, but occasionally 1/2 step sharp.  (Too much beyond that and the timbre of voices and instruments becomes distorted, ruining the believable effect.)  On vinyl, an adjustable turntable could provide a similar effect, except JRiver was smarter in that it could adjust only the pitch, not the tempo if you didn't desire.
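For reference, the math behind those shifts: in equal temperament, a shift of n semitones multiplies every frequency by 2^(n/12), so a half step flat slows the pitch to about 94.4% of the original.  A quick sketch:

```python
# Frequency ratio for a pitch shift of n semitones in equal temperament
def pitch_ratio(semitones):
    return 2 ** (semitones / 12)

print(round(pitch_ratio(-0.5), 4))  # a quarter step flat: ~0.9715
print(round(pitch_ratio(-1), 4))    # a half step flat:    ~0.9439
```

Past a semitone or so, the ratio drifts far enough that voices and instruments start to sound noticeably off, which is why I stop around there.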

Gone are the days when people would freely trade MP3s on the Internet.  It was a temporary fix for a music industry that refused to adapt to the times, but now it is so inexpensive and convenient to maintain a subscription to a service such as Spotify.  The only problem is that you are typically constrained to using only their media player, which lacks much in the way of audiophile DSP capabilities (notably not even an equalizer in the desktop version), and the geeky curator will miss the ability to see so much data at once (and edit it) as JRiver shows.

Nevertheless, to account for the lacking feature (and satiate my curating curiosity and desire for data), I thought that, as a developer, there might be a way to gain further insight into what I've been missing.  Lo and behold, they offer an API for developers that can get you all this information, and possibly even a gateway into playing music through your own DSP tools (if Web-enabled).  Since I have had Spotify since the end of 2014 and haven't really paid much attention to my old music files since then, there are a great many new (to me) songs in my list that I don't know much about except for when I search for them on Google.

On a Quest For Information

Spotify usually provides a lot of interesting, insightful analytics as long as you have not blocked their emails.  (They really don't email you much, so if you're blocking them, you're missing out.)  One of my favorite emails from them included the following tidbit:

Spotify must think I'm an African-American at least 60 years of age.  (Or maybe a white person of around 50, since once they offered me a deal on Chicago + Doobie Brothers tickets.  Of course, I took them up on it.)

This is very cool and only the tip of the iceberg of data you could collect about yourself.  However, things like this often lead to more questions than answers, such as "What's another genre I'd like that doesn't relate except for only some esoteric way a computer would understand?" or "What's the song I really liked in my Discover Weekly but forgot to bookmark?" or "How have my listening preferences changed over time?"

Nevertheless, one thing I've understood since about 2007 is that my musical tastes fall in line chronologically with "doo-wop to disco", meaning core 1962-1978 with fringes between 1956-82 (gotta include the classic rock & roll).  Of course, this doesn't include the works of Frank Sinatra and Nat King Cole, who were often still covering songs written in the 1930s and 40s all the way into the 1960s and beyond.  But because their new recordings were often completely different interpretations and instrumentations, I am not so concerned with the song's original release date as I am with the date of those particular recordings.

As such, I was expecting to obtain data from Spotify and present it graphically in such a way that would display a big hump around 1962-78, with a short tail backward, and a long tail onward into the present.  However, this is what I got:

Not much love for the 1980s; just 60s & 70s, and mostly rehashes of the same from later years.

However, Spotify doesn't concern itself with the original release date of a song, but with the release date of an entire album.  This means that looking back on, for instance, The Voice album by Frank Sinatra, which in 1955 was already a compilation of his much earlier works (and was released once again in 1999), any song on that album (originally recorded between 1943-52) would show up as 1955 (or possibly 1999, one year after Sinatra passed away) on Spotify.  I know for a fact there are songs in my library recorded in 1958 and 1961, but the graph above doesn't show anything prior to 1962.  I haven't experimented with how it handles local files I have curated because, quite frankly, I don't want them intermingling, so this data pertains only to what I stream from their service.

So, How Old Am I, According to Spotify?

To be totally honest, I don't know, but if I had to guess, it would think I was about 70, with maybe a 50/50 chance of being white, with formative tastes appearing in the late 1950s, and staying decidedly hip until around 30, when everyone got sick of disco.  Well, I'm around 30 now, and my tastes have never been decidedly hip.

Once upon a time, Spotify made playlists for their users, consisting of music from the 1970s, 80s, and etc. that reflected music you would have listened to back then based on your current tastes.  It struggled to do much for me, but I have to give it credit for giving me a "1970s playlist" consisting of many compilation albums from the likes of Glenn Miller and other famous big bands dating back to the 1940s.

How Old Are You, According to Spotify?

If you are a developer, here is how you tell.  It's definitely cutting-edge JavaScript code (if you traveled back to 2012 to read this), but not too difficult to get working.  All you need is:

  1. A Spotify account where you have signed into the developer portal and registered an app.
  2. The example Spotify Web API app, downloadable from
  3. Your app in Spotify configured with the correct redirect URI.  If running it locally, it is http://localhost:8888/callback/ because this is already established in app.js for you.
  4. Your app.js file configured with the corresponding client ID and secret provided by the developer portal, and the redirect URI you specified earlier.

Once you have these things, incorporate the following changes into the code you just checked out from GitHub (because I'm too lazy to fork it and incorporate the changes for you to download conveniently).  Then, kick off the Node server, open your browser, navigate to your server, open your developer tools, log in, and watch the console.log statements reveal your data.  Hopefully running all these calls for your whole library won't DDoS Spotify; with only about 200 songs in my case, I didn't bother putting in any pauses.  The next step would be seeing how the age of the music you listen to has evolved over time (will I finally like 1980s music within 10 years?), but I guess that would only be interesting for me, since most people probably only listen to current music.

$ git diff
diff --git a/authorization_code/app.js b/authorization_code/app.js
index b37f9c5..1e51945 100644
--- a/authorization_code/app.js
+++ b/authorization_code/app.js
@@ -12,9 +12,9 @@ var request = require('request'); // "Request" library
 var querystring = require('querystring');
 var cookieParser = require('cookie-parser');

-var client_id = 'CLIENT_ID'; // Your client id
-var client_secret = 'CLIENT_SECRET'; // Your secret
-var redirect_uri = 'REDIRECT_URI'; // Your redirect uri
+var client_id = 'something'; // Your client id
+var client_secret = 'something else'; // Your secret
+var redirect_uri = 'http://localhost:8888/callback/'; // Your redirect uri

  * Generates a random string containing numbers and letters
@@ -44,7 +44,7 @@ app.get('/login', function(req, res) {
   res.cookie(stateKey, state);

   // your application requests authorization
-  var scope = 'user-read-private user-read-email';
+  var scope = 'user-read-private user-read-email user-library-read';
   res.redirect('' +
       response_type: 'code',
diff --git a/authorization_code/public/index.html b/authorization_code/public/index.html
index 9c57f1c..9544ba8 100644
--- a/authorization_code/public/index.html
+++ b/authorization_code/public/index.html
@@ -64,6 +64,7 @@
       (function() {

+           var years = {};
          * Obtains parameters from the hash of the URL
          * @return Object
@@ -78,6 +79,50 @@
           return hashParams;

+               function getAlbumInfo(albumIds, next) {
+                       $.ajax({
+                url: '' + albumIds.join(","),
+                headers: {
+                  'Authorization': 'Bearer ' + access_token
+                },
+                success: function(response) {
+                                 var json = eval(response);
+                                 console.log(json);
+                                 for (var a in json.albums) {
+                                       var album = json.albums[a];
+                                       var year = album.release_date.substring(0, 4);
+                                       if (years[year] === undefined) {
+                                         years[year] = 0;
+                                       }
+                                       years[year]++;
+                                 }
+                                 console.log(years);
+                                 if (next !== undefined) {
+                                       getPlaylists(next);
+                                 }
+                }
+            });
+               }
+               function getPlaylists(next) {
+                       $.ajax({
+                url: next,
+                headers: {
+                  'Authorization': 'Bearer ' + access_token
+                },
+                success: function(response) {
+                                 var json = eval(response);
+                                 console.log(json);
+                                 albumIds = [];
+                                 for (var song in json.items) {
+                                       albumIds.push(json.items[song].track.album.id);
+                                 }
+                                 console.log(albumIds);
+                                 getAlbumInfo(albumIds,;
+                }
+            });
+               }
         var userProfileSource = document.getElementById('user-profile-template').innerHTML,
             userProfileTemplate = Handlebars.compile(userProfileSource),
             userProfilePlaceholder = document.getElementById('user-profile');
@@ -109,9 +154,9 @@
                 success: function(response) {
                   userProfilePlaceholder.innerHTML = userProfileTemplate(response);
+                                 getPlaylists('');
           } else {


I didn't take fully after my mother in my musical preferences.  While I tend to enjoy disco hits with funky grooves or complex chords and progressions, her feeling on the genre is summarized by her shirt with iron-on letters reading "Disco Sucks."  I'll post a picture of it if she can find it.