Decentralized Microservices With IPFS

What is IPFS?

IPFS (the InterPlanetary File System) is a distributed file system that seeks to connect all computing devices with the same system of files. In some ways, this is similar to the original aims of the Web, but IPFS is actually more similar to a single BitTorrent swarm exchanging Git objects. (You can read more about its origins in the paper IPFS - Content Addressed, Versioned, P2P File System.)

What are microservices?

Microservices - also known as the microservice architecture - is an architectural style that structures an application as a collection of loosely coupled services, which implement business capabilities. The microservice architecture enables the continuous delivery/deployment of large, complex applications. It also enables an organization to evolve its technology stack.

[image: ipfs2]

What makes the combination of IPFS and microservices interesting?

The nice thing about IPFS is its immutable storage. IPFS works much like a Git repository: just like commit hashes, IPFS hashes will always point to the same immutable files. This makes IPFS an interesting platform for developing microservices. Similar to dependency versioning in npm (the Node package manager), you know for sure that the code at a certain hash will always do the same thing. Secondly, these services are not hosted by a single source (centralized), but in many locations worldwide (distributed), which makes them more robust: as long as at least one node keeps a copy, they remain available.
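Content addressing also makes retrieval easy to express in code: given a hash, the gateway URL is fully determined. A minimal sketch (the helper name is my own; ipfs.io is the public gateway used throughout this post):

```typescript
// Build a public-gateway URL for a content hash.
// `ipns` switches between the immutable (/ipfs/) and mutable (/ipns/) namespaces.
function gatewayUrl(hash: string, path: string = "", ipns: boolean = false): string {
  const ns = ipns ? "ipns" : "ipfs";
  return `https://ipfs.io/${ns}/${hash}${path}`;
}
```

Because the hash fully identifies the content, this URL will keep serving the same bytes for as long as anyone on the network has them.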

[image: ipfs1]

Storing files on IPFS

Before creating an IPFS microservice, let’s store a simple website on IPFS. You probably haven’t noticed, but the image above was actually uploaded to IPFS and downloaded using an IPFS gateway: https://ipfs.io/ipfs/. This gateway allows you to load decentralized files stored on IPFS in your web browser, which is pretty cool and opens up amazing possibilities for web apps.

Uploading a static website to IPFS / IPNS.

(To follow this tutorial, please install IPFS first.)

Besides simple files, you can also add entire directories to IPFS.

$ ipfs add -r .

So, I uploaded a simple static website I created a while ago to IPFS, and it can be accessed using the directory hash QmUyif9MLjMzpFeXvjLJiatCKsQJhykV1oZH7GPuXRcJjo like so:

https://ipfs.io/ipfs/QmUyif9MLjMzpFeXvjLJiatCKsQJhykV1oZH7GPuXRcJjo/en/

Now I have a simple static site hosted on IPFS. The problem, however, is that when I update my site, the hash will change, and any links I have shared will keep pointing to the old version.

That’s where IPNS comes in. It allows you to store a reference to an IPFS hash under the namespace of your IPNS hash or peerID.

$ ipfs name publish QmUyif9MLjMzpFeXvjLJiatCKsQJhykV1oZH7GPuXRcJjo

That command publishes the hash under your peerID. A few seconds later my terminal spits back the following:

Published to QmaJLU8Jb2NmpHsgCGu3EqawjqFB23FfXHvt3kWted1PdQ:/ipfs/QmUyif9MLjMzpFeXvjLJiatCKsQJhykV1oZH7GPuXRcJjo

My website is now accessible through:

https://ipfs.io/ipns/QmaJLU8Jb2NmpHsgCGu3EqawjqFB23FfXHvt3kWted1PdQ/en/

Notice the ipns in the URL, instead of ipfs. Every time I update my website and republish, IPNS will make sure anyone resolving the name gets the hash of my latest site.
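If you want to script this publish-and-update flow, the `Published to <peerID>:<path>` line is easy to parse. A hypothetical helper (name and return shape are my own, not part of the ipfs CLI):

```typescript
// Parse the output of `ipfs name publish`, e.g.
// "Published to QmaJ…:/ipfs/QmUy…" -> { peerId, target }
function parsePublish(line: string): { peerId: string; target: string } | null {
  const m = line.match(/^Published to (\w+):\s*(\/ipfs\/\w+)/);
  return m ? { peerId: m[1], target: m[2] } : null;
}
```

A deploy script could run `ipfs add -r .`, feed the new directory hash to `ipfs name publish`, and confirm success by checking this parsed result.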

Creating an IPFS Microservice

IPFS and decentralized services

If we can upload websites, we can add JavaScript code to those websites that looks at the URL and extracts additional IPFS hashes. Those can be used to access other files stored on IPFS. This allows us to create services or applications on IPFS that accept other IPFS files as inputs and do something with them.

A nice example of such a service is this IPFS video player.

https://ipfs.io/ipfs/QmVc6zuAneKJzicnJpfrqCH9gSy6bz54JhcypfJYhGUFQu/play#/ipfs/QmTKZgRNwDNZwHtJSjCp6r5FYefzpULfy37JvMt9DwvXse

The entire code of the video player can be found here.

var tf = $('#input');

function hash() {
  return window.location.hash.substring(1)
}

function render(type, path) {
  var video = $('<video>')

  video
    .attr("id", "videoid")
    .attr("width", "100%")
    .attr("height", "100%")
    .attr("controls", "controls")
    .attr("poster", path + "/poster.jpg")
    .attr("preload", "auto")
    .attr('data-setup', '{"example_option":true}')
    .addClass('video-js vjs-default-skin vjs-big-play-centered')

  if (type == "file") {
    $("<source>")
      .attr("src", path)
      .appendTo(video)
  } else {
    var formats = ['mp4', 'webm', 'ogv']
    formats.forEach(function (fmt) {
      $("<source>")
        .attr("type", "video/" + fmt)
        .attr("src", path + "/video." + fmt)
        .appendTo(video)
    })
  }

  $("div#video-div").innerHTML = "" // clear it
  $("div#video-div").append(video)

  videojs("videoid").ready(function(){
    this.play();
  });
}

var last = ""
function update(path) {
  if (last == path)
    return
  last = path

  $.get(path, function() {
    render("dir", path)
  }).fail(function() {
    render("file", path)
  })

  tf.val(path)
  window.location.hash = "#" + path
  console.log("updated to: " + path)
}

function isFile(path) {
    return path.split('/').pop().split('.').length > 1;
}

(function main() {
  tf.bind('keyup keypress blur change cut copy paste', function(event) {
    update(tf.val())
  })

  $(window).bind('hashchange', function(event) { // hashchange fires on window, not the input
    update(hash())
  })

  var text = hash()
  if (text.length < 1)
    text = ""
  update(text)
})()

If we take a look at the code, we can see that

window.location.hash.substring(1)

extracts the following string from the url: /ipfs/QmTKZgRNwDNZwHtJSjCp6r5FYefzpULfy37JvMt9DwvXse

This hash is then used to fetch the input file for the video. If we created another video file, added it to IPFS, and put the new hash in the video player URL, the player would act as a service to play that other file.
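The fragment-parsing step can be isolated into a pure function, which makes the "service input" idea explicit: anything after the # that looks like an IPFS or IPNS path is accepted. (The function name and the validation are my own sketch, not part of the player's code.)

```typescript
// Return the /ipfs/... or /ipns/... path carried in a URL fragment,
// or null when the fragment doesn't look like one.
function ipfsPathFromFragment(fragment: string): string | null {
  const path = fragment.replace(/^#/, "");
  return /^\/ip[fn]s\/\w+/.test(path) ? path : null;
}
```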

Let’s build a simple text to speech app

The following code is a simple web app that reads the URL and speaks the text defined in the say query parameter.

<!DOCTYPE html>
<html lang="en">

<head>
    <meta charset="utf-8">
    <meta http-equiv="X-UA-Compatible" content="IE=edge">
    <meta name="viewport" content="width=device-width, initial-scale=1">

    <title>text to speech render</title>
    <script src="lib/jquery.min.js"></script>
</head>

<body>
    <h1 id="text"></h1>
    <script>
        function speak(text) {
            var msg = new SpeechSynthesisUtterance();
            $('#text').text(text);
            msg.text = text;
            speechSynthesis.speak(msg);
        }

        (function main() {
            $(function() {
                if ('speechSynthesis' in window) {
                    var text = decodeURIComponent(window.location.search.split('say=')[1]);
                    speak(text);
                } else {
                    window.alert('no text to speech browser support.')
                }
            });
        })()
    </script>
</body>

</html>

It was uploaded to IPFS and can be accessed using the following link:

https://ipfs.io/ipfs/QmPdWu3ko1qAwKJe83itC2UuAzRoKjnpQy8a1p4YYJDECr/?say=hello%20world
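The split('say=') trick above works for this demo URL, but it breaks as soon as say is not the last or only query parameter. A slightly more robust extractor could use URLSearchParams, which also handles the decoding (the function name is my own):

```typescript
// Extract and decode the `say` parameter from a search string
// such as "?say=hello%20world&voice=en".
function sayParam(search: string): string | null {
  return new URLSearchParams(search).get("say");
}
```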

Setting Up a Private Npm Registry With Sinopia

When you have several front-end projects going in the same company, and you are using a component-based framework like Angular 2, it is important that you can reuse those components across all projects. It is possible to pay npm for private repositories, or to make use of npm link, but in this post, I will:

  • Implement a private npm registry using Sinopia.
  • Create a dummy Angular 2 module.
  • Publish the module to my private npm registry.
  • Access the module from a secondary application.

Installing Sinopia

Do a global install of the package:

npm install -g sinopia

Set your npm registry to your local Sinopia:

npm set registry http://localhost:4873/
# if you wish to set it back to the default npmjs:
npm set registry https://registry.npmjs.org/

Run Sinopia using:

sinopia

If everything works well, you should see that Sinopia automatically generated a config file and is now running at http://localhost:4873/.

warn  --- config file  - /Users/jestersimpps/.config/sinopia/config.yaml
warn  --- http address - http://localhost:4873/

The default configuration will allow all users to do anything. I’m going to leave it like that for now, as I want to focus on publishing and installing my own packages. If you are interested in setting up your own configuration file, you can find a full config file here.
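For reference, a minimal config.yaml sketch in the spirit of the generated default (the field names follow Sinopia's default config as I remember it; verify against the full file linked above before relying on them):

```yaml
# minimal Sinopia config sketch (assumption: matches the generated default)
storage: ./storage          # where published package tarballs are kept
uplinks:
  npmjs:
    url: https://registry.npmjs.org/
packages:
  '*':
    allow_access: $all      # anyone can install
    allow_publish: $all     # anyone can publish (fine for local use only)
    proxy: npmjs            # fall through to the public registry
```

The proxy line is what lets a private registry stay transparent: anything not published locally is fetched from npmjs and cached.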

Creating a packaged Angular 2 module

We will create an Angular module from the ground up. Cd into the folder (./dummy in my case) you wish to use for your module and generate a package.json file by typing:

mkdir dummy
cd dummy
npm init

This will ask you some questions about the library you wish to create, but don’t worry about those; you can just skip through them. Next, we have to include Angular 2 as a dependency, so we run:

npm i --save @angular/core

The --save flag adds the installed dependency to the previously created package.json file.

Once Angular 2 is installed, we will create a .ts source file at the root of our library, named after the library:

dummy.ts

This is the name we will use later on when we import our library from a secondary application:

import {dummy} from "dummy/dummy";

This main file in the root will re-export all of the files we create next that make our module work.
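As a placeholder, dummy.ts can start as small as a single exported class (the class body here is purely illustrative; a real module would re-export components, directives, and so on):

```typescript
// dummy.ts — root file of the library; everything the consumer may
// import is exported (or re-exported) from here.
export class dummy {
  // illustrative member so the class is observable from a consumer
  name(): string {
    return "dummy";
  }
}
```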

to be continued…

A Post History Scroller

Quite a few months ago I was working on a personal project called mailwall.me. The idea was a platform where you can share what you think someone will like using their mailwall.me address. As most people in the world have email, but not necessarily a Facebook, Twitter, WhatsApp, etc. account, people can email media to anyone with a mailwall.me account. The message would get parsed, the media extracted and displayed in the proper way on the platform. You could tie it up with IFTTT to mail you on social media events, and it would automatically categorize everything accordingly. This post history scroller is an artifact of that project. I was looking for a new way to create a historical overview of the messages.

[image: angular-history-scroller]

The code is a work in progress and I haven’t tested it recently. I was going to make a generic version of it, but didn’t find the time.

Getting Started With Microservices and Go

Traditionally, applications have been built as monoliths: single applications which become larger and more complex over time, limiting our ability to react to change. Go is expressive, concise, clean, and efficient. Its concurrency mechanisms make it easy to write programs that get the most out of multicore and networked machines, while its novel type system enables flexible and modular program construction. Go compiles quickly to machine code yet has the convenience of garbage collection and the power of run-time reflection. It’s a fast, statically typed, compiled language that feels like a dynamically typed, interpreted language.

From Gulp to Webpack

I recently started working on a big Angular 2 web app. The company I joined has been working on this project since October 2015. It is one of the first companies in Belgium to jump on the ng2 bandwagon, and it has been a joy and a challenge to learn and work with this framework every day.

Because gulp experience was already in house, the team had implemented a gulp build coupled with SystemJS. These days (August 2016), partly popularized by React, Webpack has become a ‘thing’. My colleagues at the company have been following the trend and have decided to switch to Webpack for their next project. Secretly, there’s a common desire to implement it in the current project as well, but… there is this tight deadline, so…

…I decided to give it a shot myself and tried to hack something together this Saturday to see how far I would get. To be honest, I didn’t know what to expect. I’ve heard all the buzz but hadn’t actually given Webpack any time, as I have been working relentlessly on my own startup krackzee over the past 8 months, which is a PHP Laravel project. So I have been ‘out’ of the front-end world for quite some time. To all non-JavaScript developers: 8 months in the JavaScript realm is like a decade in your language. (Read: A JS framework on every table by Allen Pike)

[image: javascript fatigue]

Webpack

First impressions

The installation has a lot of similarity to installing grunt or gulp. You pull in some packages, make sure you save them as devDependencies in your package.json, and create some config files to configure your build process. Here’s the interesting part though: it’s almost a plug-and-play build tool. Forget about writing those insane, customized grunt and gulp scripts. Here’s a tool that just works out of the box with minimal configuration. And it works well!

Getting started

The Angular 2 team has done a pretty good job explaining Webpack on their website. It has also become their build tool of choice. I started by basically going through their guide and adapting their implementation to our current project in a separate branch. Don’t do this! I made this mistake, and while I might have learned a bit more about what the Webpack config files do, it will fail miserably unless you have implemented Webpack before and know how everything works together. My suggestion is to start a different repo altogether with a clean version of one of the Angular 2 Webpack starters out there. Just browse GitHub a bit; there are plenty of them.

I chose the angular2-webpack-starter from AngularClass.

Clone your favorite starter

Start by cloning your favorite Webpack starter repo somewhere on your hard drive.

git clone https://github.com/AngularClass/angular2-webpack-starter.git
cd angular2-webpack-starter
npm i

If you choose the same starter as me, make sure you have Node version >= 5.0 and NPM >= 3, as mentioned in their readme.

Move your code

After getting a local version of the starter, I removed all of their awesomely perfected, genius code from the src/app folder and copied in our own. Remove the .git directory if you haven’t yet, and set up your own repo. Maybe make a first commit and push. From this moment onward there are a lot of nitty-gritty changes to the Webpack configuration, but basically you are almost done!

Reconfigure Webpack

Folders and files

Depending on how you have set up your Angular 2 application, you will have a different main.ts file (assuming you are coding TypeScript; I’ll just ignore anyone who isn’t). The main.ts file is the file that holds your main app component (everything starts from there, your JS app big bang). Webpack best practice commands you to split it up into 3 files:

entry: {

  'polyfills': './src/polyfills.browser.ts',
  'vendor': './src/vendor.browser.ts',
  'main': './src/main.browser.ts'

},

It’s pretty obvious, but: define your polyfill imports in the src/polyfills.browser.ts file, your vendor modules in the src/vendor.browser.ts file, and bootstrap your app in the src/main.browser.ts file. I just left the polyfill file alone because it was mostly the same as what we already had. Then, in the config/webpack.common.js file, set your entry points as depicted above. In the platform folder, define your environment & provider settings if you have them.

Remove all script tags in your index.html file that point to any application code. These will get injected by a Webpack plugin called HtmlWebpackPlugin. Compare the index files of the starter with your own and basically resolve any merge conflicts :)

Set your own constants in the config/webpack.common.js:

const METADATA = {
  title: 'Yourapp',
  baseUrl: '/',
  isDevServer: helpers.isWebpackDevServer()
};

If you work at a special company and have a designated designer who takes care of the scss for you, there’s a chance you might have to set up some build configuration using the ExtractTextPlugin, which enables you to use an import to define the to-be-converted scss files. (I will not cover this, as we are currently moving our styling into a separate repo.)

Environment config

Depending on your environment configuration, you will have to make changes to the config/webpack.dev.js, config/webpack.prod.js and config/webpack.test.js files. We previously set the environment in a JSON file during the gulp build. In the new Webpack configuration, I am using a constant set via the DefinePlugin that is then used in my code to determine the API endpoint and such.
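As a sketch, the relevant DefinePlugin fragment could look like this (the constant names ENV and API_URL are my own assumptions, not the starter's):

```typescript
// config/webpack.dev.js fragment — values are inlined at build time
new DefinePlugin({
  'ENV': JSON.stringify('development'),
  'API_URL': JSON.stringify('http://localhost:8080/api')
})

// application code — the declaration tells TypeScript the constant exists
declare const API_URL: string;
const endpoint = API_URL + '/users';
```

Because the values are string-replaced at compile time, each of the dev/prod/test configs can inline its own endpoint without any runtime lookup.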

Testing config

If you were already using Karma and Jasmine, just compare the starter’s config/karma.conf.js file with your own and merge in any changes. Make sure you have a karma.conf.js in the root pointing to the file in the config folder.

Troubleshooting

After all this, cross your fingers and run the Webpack dev server using npm start. In my case there were issues because we are using HTML template files for our components and had referenced them relative to our app root directory.

Reference the HTML template files relative to the current TypeScript file:

@Component({
    selector: `component1`,
    templateUrl: './component1.component.tmpl.html'
})

instead of:

@Component({
    selector: `component1`,
    templateUrl: 'app/../blablablabla/../component1.component.tmpl.html'
})

Conclusion

In memory live reloading

Make the move asap; it won’t take too much of your time! Especially if your grunt/gulp build process is currently not sporting any in-memory live reloading. The time you save with this feature alone easily makes up for the time invested in figuring out Webpack.

Lazy loading modules

Also, Webpack will pack your modules so that they can be lazy loaded (async) in your application. Often you don’t need all of your dependencies at startup, and lazy loading will speed up your application significantly. Read more about it here.

Anyways, you won’t be disappointed. Webpack is magic ;)