Creating the Coverstar Kpop Video Social Network

The experience of building the Coverstar App, a K-pop video social network app, was an exciting challenge. For a small startup with limited manpower and financial resources, choosing the right technology stack was extremely important.
The backend platform needed to be scalable, cost efficient and capable of delivering high-performance video playback. With a frontend MVP iOS app in place, and a plan for an initial social network release within 3 months, the technology stack needed to allow for rapid development.
Serverless technology (e.g. AWS Lambda) provided an environment that met the needs of the project. The choice to employ serverless technology was a game changer and helped us move at an unprecedented pace. Compared to working with containers and microservices, the simplicity of writing functions in AWS Lambda, without worrying about provisioning, scaling, or automatic scale-up and scale-down to keep costs low, was incredible.
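To give a flavor of why this is so simple, here is a minimal sketch of a Node.js Lambda handler. The handler name, event shape and `videoId` field are hypothetical illustrations, not Coverstar's actual code; Lambda simply invokes the function per request while AWS handles provisioning and scaling.

```javascript
// Minimal AWS Lambda handler sketch (hypothetical names, not the actual Coverstar code).
// Assumes an API Gateway proxy-style event with a JSON string body.
async function handler(event) {
  const body = JSON.parse(event.body || '{}');
  if (!body.videoId) {
    // Reject malformed requests early; Lambda bills only for execution time.
    return { statusCode: 400, body: JSON.stringify({ status: 'error', message: 'videoId required' }) };
  }
  // ... look up video metadata, enqueue transcoding, etc. ...
  return { statusCode: 200, body: JSON.stringify({ status: 'success', videoId: body.videoId }) };
}

module.exports = { handler };
```

There is no server to manage here: deploying this function is the whole backend unit, and concurrency scales with incoming requests.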

Below is a high level diagram of the Coverstar backend implementation:

RDBMS vs NoSQL

A primary decision that needed to be made was between using a relational database (RDBMS) and a NoSQL database system. This required us to take a closer look at the nature of the data we needed to store. We predicted that our database would mainly store user data, video and media metadata, and data from interactions between our users (e.g. messages, notifications). The nature of our data would be mostly relational, as opposed to consisting of unrelated data entries.

NoSQL would require storing a range of data redundantly and denormalized, or using a service like Amazon EMR to create relationships. Another factor was the volatility of the data schema: high volatility would favor a NoSQL database, which has no schema requirements. Considering all those factors, we decided to opt for a relational database system. We took a closer look at AWS Aurora, which has a few advantages over Postgres, but since we wanted to stay cloud-platform agnostic we picked Postgres.
Below is the database schema of the initial release. Since then, the number of tables has grown considerably.
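To make that tradeoff concrete, here is a hypothetical sketch (table and field names invented for illustration, not the real schema) of the same data modeled relationally versus denormalized, NoSQL-style:

```javascript
// Hypothetical data, invented for illustration.
// Relational style: normalized rows, related by foreign key.
const users  = [{ id: 1, name: 'alice' }];
const videos = [{ id: 10, userId: 1, title: 'Cover #1' }];

// A join resolves the relationship at query time without duplicating user data.
function videosWithAuthor(users, videos) {
  const byId = new Map(users.map(u => [u.id, u]));
  return videos.map(v => ({ ...v, author: byId.get(v.userId).name }));
}

// NoSQL style: the author's name is duplicated into every video document,
// so renaming a user means rewriting all of that user's video documents.
const videoDocs = [{ id: 10, title: 'Cover #1', author: 'alice' }];
```

With mostly relational data like ours, the normalized form avoids exactly this class of redundant updates.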

Adaptive Bitrate Streaming (ABR)

In order to deliver a great video playback experience to the global Coverstar community, slow and/or changing network conditions needed to be taken into consideration.
Two of the most popular implementations of ABR streaming are Apple’s HLS and MPEG-DASH. Both of these technologies work by encoding a source video file into multiple streams with different bitrates. Those streams are split into smaller segments, usually a few seconds in length. The video player then seamlessly switches between the streams depending on the available bandwidth and other factors. Apple’s recommended segment duration is 6s for HLS streams, but since most of our video streams are 15-30s clips, faster bitrate adaptation using 3s segments worked well for us.
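As an illustration of the multi-bitrate idea, the sketch below generates an HLS master playlist listing several variant streams; the bandwidth values, resolutions and playlist URIs are made-up examples, not our production ladder:

```javascript
// Sketch: build an HLS master playlist string for several variant streams.
// Each #EXT-X-STREAM-INF entry advertises one bitrate rendition to the player.
function masterPlaylist(variants) {
  const lines = ['#EXTM3U'];
  for (const v of variants) {
    lines.push(`#EXT-X-STREAM-INF:BANDWIDTH=${v.bandwidth},RESOLUTION=${v.resolution}`);
    lines.push(v.uri); // URI of that variant's segment playlist
  }
  return lines.join('\n');
}

// Hypothetical 3-rung bitrate ladder.
const m3u8 = masterPlaylist([
  { bandwidth: 800000,  resolution: '640x360',   uri: 'low/index.m3u8'  },
  { bandwidth: 2800000, resolution: '1280x720',  uri: 'mid/index.m3u8'  },
  { bandwidth: 5000000, resolution: '1920x1080', uri: 'high/index.m3u8' },
]);
```

The player measures throughput and switches between these variants at segment boundaries, which with 3s segments means it can react every 3 seconds of video.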

Final thoughts:

Creating a social network backend was a challenging task, and tuning the system will be a continuous effort. The services that AWS provides, including Elastic Transcoder, Lambda and SNS, have made it possible even for a small startup to create a competitive video social network. In hindsight we are very happy with the technology choices, since the system has been incredibly stable, performant and cost effective.

VIM gives you Super Powers!

Those who know me most likely know that I’m quite a VIM enthusiast.
I love the speed and flexibility of the editor and its ubiquitous availability on UNIX systems.

VIM has a steep learning curve but I guarantee you the effort pays off!
If you are starting off learning VIM, I can recommend the game VIM Adventures which is a fun way to learn VIM.

You can view my .vimrc at GitHub.

Integrate a Node.js SPA (Single Page Application) with a PHP Web App

I’ve been working on an interesting problem: seamlessly integrating an existing PHP web app with a newly built SPA running on Node.js.
There are many different approaches to solving this problem, and this solution is a fairly low-level implementation, so some familiarity with HTTP headers and cookies is required.
This post also doesn’t cover scalability, which I might talk about in a future post.

This post describes:

  1. How to implement session sharing between PHP and Node.js
  2. How to authenticate and authorize the RESTful API calls made from the browser
  3. How to secure the internal API calls made between the PHP and the Node server

1. How to implement session sharing between PHP and Node.js

Your user logs into the app, providing some credentials and a cookie or token of some sort is returned, which you use to identify that user.
Your AJAX requests to the API server carry that same logged-in token (PHPSESSID) as before. We then check that token against an internal API on the PHP server and restrict the response to just what the user is allowed to see.
The important consideration is that RESTful web services require authentication with every request.
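As a sketch of that per-request check on the Node side, an Express-style middleware could extract the PHPSESSID header and delegate validation to the PHP server. The middleware and `validateSession` names are hypothetical stand-ins, not the post's actual code:

```javascript
// Sketch: per-request session validation middleware.
// `validateSession` stands in for the internal HTTP call to the PHP server
// that checks the session id; names here are hypothetical.
function sessionMiddleware(validateSession) {
  return async function (req, res, next) {
    const sid = req.headers['phpsessid'];
    if (!sid) {
      return res.status(401).json({ status: 'error' }); // no session token sent
    }
    const valid = await validateSession(sid); // e.g. GET to the PHP server's verify endpoint
    if (!valid) {
      return res.status(401).json({ status: 'error' }); // session rejected by PHP
    }
    next(); // session is valid; continue to the API handler
  };
}
```

Because the check runs on every request, no REST call succeeds with a stale or forged session id.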

The diagrams below show the interactions between the servers:

2. How to authenticate and authorize the RESTful API calls made from the browser

Same-origin-policy and CORS

Since the PHP and the Node API server are running on different IP addresses, the HTTPS requests made using the XMLHttpRequest object are subject to the same-origin policy. This means that HTTP requests can only be made to the domain the page was loaded from.

The CORS (Cross Origin Resource Sharing) mechanism provides a way for web servers running on different IPs or domains to support cross-site access.

Client side code:

$.ajaxPrefilter(function(options, originalOptions, jqXHR) {
    options.crossDomain = true;    // jQuery expects a boolean here
    options.xhrFields = {
        withCredentials: true      // send cookies with cross-origin requests
    };
});

// Read the PHP session id from the cookie.
function getSessionId() {
    var jsId = document.cookie.match(/PHPSESSID=[^;]+/);
    if (jsId) {
        if (jsId instanceof Array)
            jsId = jsId[0].substring(10); // strip the "PHPSESSID=" prefix
        else
            jsId = jsId.substring(10);
    }
    return jsId;
}

var id = 'myRequestedDataId';
// make the ajax call
var pathAndQuery = "/api/myapi";
$.ajax({
    url: 'https://' + API_HOST + pathAndQuery,
    type: 'GET', // or POST
    beforeSend: function(request) {
        request.setRequestHeader("id", id);
        request.setRequestHeader("PHPSESSID", getSessionId());
    },
    success: function(data) {
        var json = JSON.parse(data);
        if (json.status === "success") {
            // do stuff
        }
    },
    error: function(e) {
        console.log(e);
        // navigate to the login
        window.location.href = '/index.php';
    }
});

Node.js code for implementing the CORS headers:

var express = require('express');
var cors = require('cors');

var corsOptions = {
    methods: 'GET,HEAD,PUT,PATCH,POST,DELETE',
    preflightContinue: true,
    origin: process.env.WEBORIGIN || 'https://your.phpserver.com',
    credentials: true,               // allow cookies on cross-origin requests
    allowedHeaders: 'phpsessid,id'   // custom headers sent by the client
};

var app = express();

app.use(cors(corsOptions)); // support cors requests
app.options('/api', cors(corsOptions)); // enable pre-flight requests

PHP code for verifying the session cookie:

<?php
$sid = $_REQUEST["PHPSESSID"];
$id = $_REQUEST["id"];
// Resume the session using the supplied session id.
session_id($sid);
session_start();
$cid = $_SESSION[APP]['client_id'];
if (empty($cid)) {
    // invalid session id
    $error = "Failed to match session.";
}
// continue and validate id
// ...

// Output the json response:
if (empty($error)) {
    $json = "{\"status\": \"success\"}";
} else {
    $json = "{\"status\": \"error\"}";
}
echo $json;
?>

3. How to secure the internal API calls made between the PHP and the Node server

Depending on how strong the security needs to be, there are several approaches to securing the internal APIs; some of them are:
- Restricting the IP/port to only allow access from the internal servers.
- Using TLS in combination with basic authentication. If the security needs are not as high, this is an easy way of implementing security.
- Token-based security, e.g. OAuth.
- Using client-authenticated TLS handshakes; below is a good article:
https://engineering.circle.com/https-authorized-certs-with-node-js-315e548354a2#.sakue1rg6
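As an example of the basic-auth option, here is a sketch of how the Node server could verify the Authorization header on internal calls. The credentials and helper name are invented for illustration, and this is only meaningful over TLS, since the credentials are merely base64-encoded:

```javascript
// Sketch: verify an HTTP Basic Authorization header on internal API calls.
// Hypothetical helper and credentials; use only over TLS.
function checkBasicAuth(header, expectedUser, expectedPass) {
  if (!header || !header.startsWith('Basic ')) return false;
  // Decode the base64 "user:pass" payload after the "Basic " prefix.
  const decoded = Buffer.from(header.slice(6), 'base64').toString('utf8');
  const sep = decoded.indexOf(':');
  if (sep < 0) return false;
  return decoded.slice(0, sep) === expectedUser && decoded.slice(sep + 1) === expectedPass;
}

// The PHP server would send: Authorization: Basic base64("user:pass")
const header = 'Basic ' + Buffer.from('internal:s3cret').toString('base64');
```

In production the expected credentials would come from configuration or a secrets store, never from source code, and a constant-time comparison is preferable to `===`.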

Vintage Synthesizers

I’ve been playing with the idea of making electronic music again. Back in the 80s I was fascinated by up-and-coming bands like Kraftwerk, Depeche Mode, Yello and Front 242.
I loved the new sounds that were possible using analog and digital synthesizers. I started to make music myself with a little Casio K1 sampling keyboard. Later I bought a used Waldorf PRK Processor keyboard, which I used with different MIDI expanders. The PRK keyboard’s build quality was amazing: it had a 68000 CPU and a 5 ½’’ floppy drive, and it was a monster at 84 pounds!
http://www.synthmuseum.com/ppg/ppgprk01.html

My neighbor lent me his Dave Smith Instruments Mopho Keyboard for a couple of weeks:
https://www.davesmithinstruments.com/product/mopho-keyboard/
I loved the powerful, crushing basses this thing can produce, and the arpeggiator, sequencer, LFOs and two oscillators allow for an immense range of sounds.

Other great options are the original Moog synthesizers or a Yamaha DX7, both are classics.

Below is a website that has an immense collection of vintage synthesizers, including sound samples:
http://www.vintagesynth.com/