Monday 15 January 2018

Feasibility study for soundcom

I'm completely new to feasibility studies, but I looked them up on Google to get some pointers on how to present my feasibility study for Soundcom with React Native.


Soundcom is an Android application: a powerful audio encoder which transmits data through sound. Amazing, right? But right now the application is built natively for Android. Let's examine how feasible it is to fully rebuild it as a React Native application.


Soundcom is an application which takes audio as input, encodes it, and transmits it via sound! React Native, on the other hand, is a framework which lets one build native applications using JavaScript and React. Let's discuss the features available in the current Android application and in the proposed technology.

Current features within Soundcom

 - Takes audio files as input, or records audio. This shows the application's ability to accept an audio file in any format, or to record one clearly, for later transmission.
 - Decoding and encoding logic. Here we have to think about language feasibility and availability: right now the app is Android and written in Java, a powerful language with highly available libraries.
 - Ability to transmit the encoded data. Again this concerns the core logic, and asks for the feasibility of the language.

React Native

Now, let's check out the features of React Native.

 - We can build native applications using React Native and deploy them anywhere.
 - It lets you build a mobile app using only JavaScript, which is again a powerful language, feasible for building the logic as well as an amazing front end.
 - It's a modular framework, which makes it easier to migrate to any platform or to change or enhance any module. I also feel this can be a great start towards a modular and scalable architecture for Soundcom.

Following are links to some open source libraries to encode and decode audio files (sound waves):

    - aurora.js
    - audiolib.js
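
To make the data-over-sound idea concrete, here is a rough sketch of how such encoders typically work: each byte is split into nibbles and each nibble is mapped to a tone frequency (FSK-style). This is not Soundcom's actual algorithm, and the frequencies are made up for illustration:

```javascript
// Hypothetical FSK-style mapping: each 4-bit nibble becomes one tone.
// BASE_FREQ and STEP are assumed values, not Soundcom's real parameters;
// a real data-over-sound system also adds sync markers and error correction.
const BASE_FREQ = 1000; // Hz
const STEP = 100;       // Hz between adjacent symbols

function encodeToFrequencies(bytes) {
  const freqs = [];
  for (const b of bytes) {
    freqs.push(BASE_FREQ + ((b >> 4) & 0xf) * STEP); // high nibble
    freqs.push(BASE_FREQ + (b & 0xf) * STEP);        // low nibble
  }
  return freqs;
}

function decodeFromFrequencies(freqs) {
  const bytes = [];
  for (let i = 0; i < freqs.length; i += 2) {
    const hi = Math.round((freqs[i] - BASE_FREQ) / STEP);
    const lo = Math.round((freqs[i + 1] - BASE_FREQ) / STEP);
    bytes.push((hi << 4) | lo);
  }
  return bytes;
}
```

Since this is plain JavaScript with no native dependencies, logic like this is exactly the kind of code that would port cleanly to React Native; only the audio input/output layer would need native modules.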

I would like to work on this migration in the future :)

Thursday 11 January 2018

All about Stackle API endpoints

For someone who is new and doesn't know what Stackle is, here's a small introduction.
Stackle is a web communication portal aimed at providing Open Source organizations a platform to have discussions on their GitHub projects and their issues. It provides GitHub integration which allows the administrator of an organization to create a forum thread for the particular organization. Users signing in are able to view forums of the organizations they contribute to and engage in the forum discussions. (This description is taken from the original repo on github, here is the link)

The above image is the home page of Stackle, and the following images are the profile and dashboard pages.

Right now Stackle supports 14 API calls. Let me describe each one.

1. /api/posts

GET - Makes a REST call to this endpoint to get all the posts made under an organization.

2. /api/user/post

POST - Makes a POST call to this endpoint to send a post to the feed under the organization. Since it's a POST call, one should specify the body (the content you want to post) to make a successful post in the feed.

We also have to mention the content type based on how we are going to pass the body; it can be JSON or raw data. Following is a sample body.

  {
    "title": "This is the title",
    "description": "Description about the post",
    "org_name": "Orgname",
    "tags": [],
    "repository": "URL to the repository",
    "link_issue": "URL to the Issue",
    "user": "your user id",
    "date": "date to show when posted",
    "votes": 0,
    "comments": []
  }
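
As a sketch, here is how a JavaScript client could build this POST request. The helper name is mine, the base URL is hypothetical, and the body keys follow the sample body above:

```javascript
// Build the fetch options for POST /api/user/post.
// buildCreatePostRequest is a hypothetical helper, not part of Stackle;
// the body keys follow the sample body shown in the post.
function buildCreatePostRequest(post) {
  return {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' }, // body passed as JSON
    body: JSON.stringify(post)
  };
}

const req = buildCreatePostRequest({
  title: 'This is the title',
  description: 'Description about the post',
  org_name: 'Orgname',
  tags: [],
  votes: 0,
  comments: []
});
// e.g. fetch('https://<stackle-host>/api/user/post', req)
```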

3. /api/post/:postid

GET - Makes a GET call to this endpoint, where the route param is the post id; the response will be the post feed for the given id.

DELETE - We can hit the same endpoint with the DELETE method to delete a given post based on the id.

4. /api/posts/:userId

GET - Gets the posts made by a particular user. The response is a simple filter of posts by that user. Here the route param is the user id.
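
The filtering this endpoint performs can be sketched as a plain array filter over the posts; the `user` field name follows the sample body shown earlier, and the helper name is mine:

```javascript
// Sketch of the filtering behind /api/posts/:userId: keep only the
// posts whose user field matches the requested user id.
function postsByUser(posts, userId) {
  return posts.filter(post => post.user === userId);
}

const sample = [
  { title: 'a', user: 'u1' },
  { title: 'b', user: 'u2' },
  { title: 'c', user: 'u1' }
];
// postsByUser(sample, 'u1') keeps the two posts made by 'u1'
```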

5. /api/posts/org/:orgname

GET - Gets posts related to a specific organization by name; here the route param is the organization name (not id).

6. /api/org/:orgname

GET - Gets details of a specific org; these details mainly represent the GitHub organization details. Here the route param is the organization name (not id).

7. /api/comment/:postId

POST - Posts a comment on a post; the route param is a post id and the body is the comment content.

8. /api/orgs

GET - Gets all the stacks/organizations registered with the Stackle app.

9. /api/stack/create

POST - Creates an organization or stack; this mostly requires admin privileges. Following is the body; the header should again specify the content type (raw or JSON).

name - Org_name
description - Small description about the organization
stackleUrl - Stackle URL (this will be generated by the app)
githubUrl - Github URL
created_user - userId

10. /api/delete/stack/:stackid

DELETE - Deletes the given organization. The route param is the stack id or organization id.

11. /api/subscribe

POST - Subscribes to an organization or stack for notifications. Following are the body keys.

uid - userId
stack_name - Stack name

12. /api/stack/subscribed/:userid

GET - Gets the subscribed stacks of a user.

13. /api/newuser

POST - Creates a user

Wednesday 10 January 2018

DroneSym, some best alternatives for Firebase

DroneSym is an amazing project which helps users track and locate their drones. The user is provided with a dashboard to track the drones, plus the ability to add new ones.

So, we have to deal with a real-time database, which requires constant monitoring and updating based on the drone locations. This application uses Firebase as its real-time database to track the coordinates of each drone and update its movement constantly.

Hence, I have done some research on a few database alternatives to Firebase. First, here are some of the special features of Firebase.

 - Firebase (a NoSQL database) provides real-time database sync within milliseconds.
 - It also provides the ability to build serverless apps (but in DroneSym we are using Flask).
 - Firebase also provides awesome offline sync: it uses local storage to store the data and automatically syncs to the database when back online.
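
The offline-sync behaviour described above can be sketched as a write queue that buffers updates locally and flushes them when connectivity returns. This is a deliberate simplification, not Firebase's real protocol, and all the names are mine:

```javascript
// Simplified sketch of offline sync: buffer writes (e.g. drone location
// updates) while offline, flush them to the backend once online.
class OfflineQueue {
  constructor(send) {
    this.send = send;   // function that pushes one write to the server
    this.pending = [];  // local buffer, standing in for local storage
    this.online = false;
  }
  write(update) {
    if (this.online) this.send(update);
    else this.pending.push(update);
  }
  goOnline() {
    this.online = true;
    while (this.pending.length) this.send(this.pending.shift());
  }
}
```

Real implementations also need conflict resolution when the same record was changed both locally and remotely, which is where the alternatives below differ most.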

However, the free plan for Firebase is very limited, so we should definitely think of a few alternatives. I came up with three options.

1. Cloudboost

Cloudboost has better pricing for free users or beginners who are planning to try out small projects. It provides around 500 free connections across devices, whereas Firebase provides only 100 on the free subscription. It definitely has some cool features; I'm attaching the following screenshot.

The following is the pricing model.

2. Deployd 

Deployd is an open source solution which provides ready-made, configurable resources that add common functionality to a Deployd backend and can be further customized with JavaScript events.
Following is the feature list (which I picked from their official GitHub doc, here is the link):

    - secure access to database APIs directly from untrusted clients (browser js, mobile apps, etc)
    - notify clients in realtime of events occurring within the database
    - user and session management
    - all APIs exposed over REST / HTTP
    - bundled browser JavaScript client
    - web socket authentication and session management
    - can be hosted by modern cloud platforms
    - support extension through node modules and npm

3. DeltaDB

DeltaDB is an open source, offline-first database designed to talk directly to clients, and it works great both offline and online.

Following are some amazing features comparable to Firebase (I picked this feature list from the original DeltaDB GitHub repo, here is the link):

 - Works the same whether the client is offline or online
 - NoSQL database that works in your browser and automatically syncs with the database cluster
 - Stores all data as a series of deltas, which allows for smooth collaborative experiences even in frequently offline scenarios.
 - Uses a simple last-write-wins conflict resolution policy and is eventually consistent
 - Uses a homegrown ORM to speak to underlying SQL databases. (Support for underlying NoSQL databases will be added)
 - Is fast. Clients push their deltas on to the server's queue. The server processes the queue separately and partitions the data so that clients can retrieve all recent changes very quickly.
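
DeltaDB's last-write-wins policy can be sketched in a few lines: each delta carries a timestamp, and for every attribute only the newest delta survives. The data shapes below are mine for illustration, not DeltaDB's real API:

```javascript
// Last-write-wins merge over timestamped deltas (illustrative only).
// Each delta has the shape { attr, value, ts }; the newest ts per attr wins.
function mergeDeltas(deltas) {
  const latest = {};
  for (const d of deltas) {
    if (!latest[d.attr] || d.ts > latest[d.attr].ts) latest[d.attr] = d;
  }
  // Collapse the winning deltas into a plain document.
  const doc = {};
  for (const attr of Object.keys(latest)) doc[attr] = latest[attr].value;
  return doc;
}
```

For drone positions this policy is a good fit: the most recent coordinate is exactly the one we want to keep.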

There are many other well-known alternatives, but I kept my research specific to cost-effective/free open source solutions.

Tuesday 9 January 2018

How to setup GoCloud

GoCloud is an open source project which provides unified APIs to access various actions related to Amazon Web Services and Google Cloud Platform. 

Let's take Google Cloud Platform as an example, set up our GoCloud package, and check how we can access the provided APIs end to end.

In order to configure Google Cloud Platform (GCP) to integrate with the GoCloud package, one should create a service account on GCP; the website will provide you with a JSON file containing the various credentials as keys, such as ClientID, ClientEmail, PrivateKey etc., used to authenticate through the GoCloud package. This JSON file, with the values we get from the GCP service account, has to be saved as "googlecloud.json" in the home directory. For production, we can instead export the keys within the file as environment variables.

Since it is a Go library, we can initialize the GoCloud package the following way in our code, before accessing the Google Cloud Platform APIs:

  import ""

  googlecloud, _ := gocloud.CloudProvider(gocloud.Googleprovider)

Here we create an object by calling CloudProvider (a GoCloud API that returns an object for either GCP or Amazon Web Services); in this case we pass Google Cloud Platform as the provider.

After setting up the GoCloud package, you can access various APIs provided by Google Cloud Platform and perform operations like creating clusters, deleting instances, and so on. 

Friday 5 January 2018

All about GoCloud

GoCloud is a Golang library which provides unified APIs to access various cloud providers' APIs. It mainly exposes the following five APIs for various functionalities, which we will discuss using Amazon Web Services.

  • Compute
  • Loadbalancer
  • DNS
  • Storage
  • Container

Prerequisites are the following. We need to mention the AWS credentials in gocloudconfig.json:

  {
    "AWSAccessKeyID": "xxxxxxxxxxxx",
    "AWSSecretKey": "xxxxxxxxxxxx"
  }

and set up the environment variables to create an AWS object to access the APIs provided:

  export AWSAccessKeyID="xxxxxxxxxxxx"
  export AWSSecretKey="xxxxxxxxxxxx"

Later, we need to initialize the GoCloud library, either with go get or by providing the path to the local directory.

1. Compute APIs

The Compute APIs allow us to manage the cloud provider's instances and servers: we can use them to create, stop, start, reboot and delete a server or instance of that particular provider.

We will use the Createnode API to create an instance (EC2). We pass parameters providing the image (configuration), the type of instance and the region.

  create := map[string]interface{}{
      "ImageId":      "ami-ccf405a5",
      "InstanceType": "t1.micro",
      "Region":       "us-east-1",
  }

  resp, err := amazoncloud.Createnode(create)
  response := resp.(map[string]interface{})

2. Container APIs

The Container APIs allow us to spin up containers on virtualized platforms: we can create a cluster (a group of servers or instances), delete a cluster, and create and delete services. The following example demonstrates creating a cluster with the Createcluster API. We also pass params such as the cluster name and the region where you want to spin it up.

  createcluster := map[string]interface{}{
      "clusterName": "gocloud-test",
      "Region":      "us-east-1",
  }

  resp, err := ecscontainer.Createcluster(createcluster)
  response := resp.(map[string]interface{})

3. DNS APIs

The DNS APIs provide a way to manage DNS services; here we use the Amazon Route 53 web service. Using the DNS APIs one can list, delete and create DNS zones. Following is an example of creating a DNS zone.

  createdns := map[string]interface{}{
      "name":             "",
      "hostedZoneConfig": "hostedZoneConfig",
  }

  resp, err := awsdns.Createdns(createdns)
  response := resp.(map[string]interface{})

4. Loadbalancer APIs

The Loadbalancer APIs allow one to manage load balancer services: for Amazon Web Services, to create and delete a load balancer, attach or detach a node, and list load balancers. The following example shows how to delete a load balancer using the Deleteloadbalancer API.

  deleteloadbalancer := map[string]string{
      "LoadBalancerName": "my-load-balancer",
  }

  resp, err := awsloadbalancer.Deleteloadbalancer(deleteloadbalancer)
  response := resp.(map[string]interface{})

5. Storage APIs

The Storage APIs provide a way to create, delete, attach and detach disks (storage). Following is an example of creating a disk using the Createdisk API, where we pass params such as the region, the volume size and the availability zone.

  createdisk := map[string]interface{}{
      "AvailZone":  "us-east-1a",
      "VolumeSize": 100,
      "Region":     "us-east-1",
  }

  resp, err := amazonstorage.Createdisk(createdisk)
  response := resp.(map[string]interface{})

All about NodeCloud

NodeCloud provides a unified API across cloud providers; it currently supports Amazon Web Services and Google Cloud Platform.

1. Let's start off with the compute APIs and use Amazon Web Services to test.

The first step is to install the NodeCloud package, either via NPM or by including the package from the local directory. We specify which cloud provider we are going to use with the getProvider API from NodeCloud. The credentials can be provided using environment variables.

We need to specify different options to pass to the AWS SDK, and import the compute package (EC2) before using the APIs provided by AWS. We can use the list API to check the list of AWS instances which are present.

We can also perform actions like stopping, starting and rebooting instances. Following is sample code.

const provider = require("nodecloud").getProvider("aws");
const ec2 = provider.compute();

const params = { DryRun: false };

ec2.list(params)
  .then(res => console.log(res))
  .catch(err => console.error(err));

2. Storage APIs

The prerequisites are the same; we just need to import the S3 instance from AWS to access the Storage APIs, in order to list existing buckets or even create a new one.

Here we use the bucket API and pass parameters specifying the configuration of the new bucket which we want to create. Following is a sample config taken from the NodeCloud repo.

We can also access APIs to delete buckets, upload blobs and data, and list the buckets present.

const params = {
  Bucket: "ncbucketcr",
  CreateBucketConfiguration: {
    LocationConstraint: "us-west-2"
  }
};

s3.createBucket(params)
  .then(res => console.log(`Bucket created ! ${res}`))
  .catch(err => console.error(`Oops something happened ${err}`));
3. Database APIs

Let's consider Amazon DynamoDB, which is a NoSQL database. Using the NoSQL APIs we can create an item, delete an item and update an item. The following example defines the params with the item we need to insert and the table name.

const params = {
  Item: {
    artist: {
      S: 'GG'
    }
  },
  ReturnConsumedCapacity: 'TOTAL',
  TableName: 'Test'
};

db.createItem(params)
  .then((res) => console.log(res))
  .catch((err) => console.error(err));

4. Network APIs

Let's see how we can interact with the DNS services inside AWS using the NodeCloud Network APIs. We use Amazon Route 53, which is a Domain Name System web service.

Let's use the DNS APIs to perform various actions such as creating zones, deleting zones and listing zones.

Following is the sample params object for creating a zone, which contains the caller reference and the Virtual Private Cloud region.

const params = {
  CallerReference: 'STRING_VALUE',
  DelegationSetId: 'STRING_VALUE',
  VPC: {
    VPCRegion: 'us-east-1'
  }
};

dns.createZone(params)
  .then((res) => console.log(res))
  .catch((err) => console.error(err));

Tuesday 19 December 2017

All about Kute, an amazing project under scorelab

Kute is a real-time travel tracker for users in Sri Lanka who are looking for carpooling or want to track public transport. It's a convenient application for real-time carpooling and for tracking public transport for accurate timings.

After you log in, you find the above two options. If you choose private vehicles, you can register your vehicle as shown in the following screens.


You will find the above flow to register your mobile number and vehicle name and to share your route, so that it can help people who are travelling the same way. You will also find options to simply add existing routes which you have added previously, and to find friends.

Coming to public vehicles, you get an option to share your location for a particular public transport vehicle, say a bus or train, and people who are following that vehicle can track it accurately on Google Maps and estimate the arrival or departure timings. It's a very good application for a busy working day.

To contribute to this application, you can visit the GitHub page and look up how to set up the project locally. The community is friendly and welcoming to any beginner-level help.