How to generate CSS spacing utility with PostCSS

What is PostCSS

PostCSS is a tool for transforming CSS with JavaScript. Unlike SCSS and Less, it lets you create and extend CSS with your own syntax. It unlocks limitless possibilities for what you can do with CSS.

What we're trying to achieve

We're going to recreate the Bootstrap Spacing Utility. In a nutshell, it will generate classes for margins and padding with different directions and sizes. However, I'll be using different sizes. Here's a sample of the code that will be generated:


.mt-1 {
	margin-top: 0.25rem;
}
.px-2 {
	padding-left: 0.5rem;
	padding-right: 0.5rem;
}

Because I prefer not to introduce new syntax, the plugin will look for a comment containing a special string and append the new classes after it.

The code

Let's create a file for our plugin. I'll just call mine plugin.js.


const postcss = require('postcss')

module.exports = postcss.plugin('postcss-custom', function () {
	return function (root) {
		root.walkComments(function (comment) {
			if (comment.text && comment.text.indexOf('@spacing') >= 0) {
				spacing(comment)
			}
		})
	}
})

function spacing(comment) {
	const unit = 0.25
	const directions = {
	// placed in order so that they can overwrite correctly
		t: ['top'],
		r: ['right'],
		b: ['bottom'],
		l: ['left'],
		x: ['left', 'right'],
		y: ['top', 'bottom'],
		'': [],
	}
	function generate(type, alias, direction, props, multiplier) {
		const postfix = multiplier < 0 ? 'n' + Math.abs(multiplier) : multiplier
		const rule = postcss.rule({ selector: `.${alias}${direction}-${postfix}` })
		if (direction === '') {
			rule.append(postcss.decl({
				prop: type,
				value: (multiplier * unit) + 'rem'
			}))
		} else {
		props.forEach((cur) => {
				rule.append(postcss.decl({
					prop: `${type}-${cur}`,
					value: (multiplier * unit) + 'rem'
				}))
			})
		}
		comment.after(rule)
	}

	for (let dir in directions) {
		for (let i = -2; i <= 8; i++) {
			generate('margin', 'm', dir, directions[dir], i)
			if (i >= 0) {
				generate('padding', 'p', dir, directions[dir], i)
			}
		}
	}
}

To add this plugin to PostCSS, simply add a new entry pointing to the JS file, e.g. styles/plugins/my-plugin.js, in your PostCSS configuration.
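
Assuming you load PostCSS through a postcss.config.js file, registering the plugin might look something like this (the path is only an example):


// postcss.config.js -- a minimal sketch
module.exports = {
	plugins: [
		require('./styles/plugins/my-plugin.js')(),
	],
}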

In the plugin file we need to export the plugin for PostCSS to use. When you define the plugin with postcss.plugin(), you need to give it a name. In this case it's postcss-custom.


module.exports = postcss.plugin('postcss-custom', function () {})

The function above searches the document for all comments containing the special text @spacing, then calls spacing() to generate the classes.
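
For example, given an input stylesheet containing the marker comment, the generated rules are appended right after it:


/* @spacing */
/* rules like .mt-1 and .px-2 from the sample at the top are appended here after processing */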

The important code to note is the generate() function inside spacing(). postcss.rule() creates a rule with the given selector. Then we add new declarations to that rule with .append(). Finally, we insert the rule after the comment with comment.after().

As you can see, creating a new PostCSS plugin is quite easy.

Keywords: postcss, js, css, tutorial

Are online freelancing platforms a source of cheap labour?

Freelancing platforms are great. You can find work from all over the world, work whenever you feel like it, and charge the rate you want. Right? Those who have tried freelancing platforms probably already know what their real earning potential is. There are definitely plenty of jobs to choose from, and there's no doubt you can earn money. But unless your cost of living is well below $1k a month, you're better off elsewhere.

The global market

Do you remember what happened when China opened its doors to foreign markets? Remember how everything became cheaper and companies moved their manufacturing to China? That's what happens when you compete in the global market. You're not just competing on skills, you're competing on price. There are plenty of skilled workers in developing countries who would gladly accept $1k per month for a high-skilled job. And there's very little reason for anyone to pay more for someone who lives in a developed country.

You might think that buyers would value quality over price; it's better to pay a little more for better results. Unfortunately, buyers are often terrible judges of quality. When you buy a house, you never evaluate the materials used in the building or its structural integrity. And even if you tried, you wouldn't be very good at it. The same problem plagues freelancing platforms.

Like a dating app

If only it were just about price and quality. There's a big problem when networks get large and big tech tries to optimize them: they start ranking people the way they rank products you buy online. Like on dating apps, your profile is given priority if you rank better than others. That means newcomers face a herculean task just landing their first job.

And like on dating apps, the only way a buyer can judge you is by your profile, something superficial and not representative of your real skills. There's no way for you to truly showcase yourself.

The feedback loop

Because of these two simple problems, everything gets worse. Buyers prefer cheap labour. Cheap labour gets better ratings and priority. Newcomers lower their rates just to land a job. With this influx of skilled, cheap labour, there is no incentive for buyers to pay more.

Should you use a freelancing platform

Although it all sounds doom and gloom, there's still some hope. Just as some people will pay double for a MacBook, some buyers will pay more for a service they like. Freelancing platforms are still a good way to get access to decent jobs once in a while. Perhaps it's worth a try.

Keywords: freelance, job, work

Harnessing the power of Webpack for Django development

Webpack has become the de facto JS build tool. It is used in almost every front-end project, whether it's a SPA written in React or an HTML site styled with SCSS. Because of this, backend developers who work with Django or Laravel cannot escape Webpack. If you can't beat them, join them.

How can Webpack help?

With Webpack we can automatically reload the page when our Django code changes. Similarly, we can take advantage of Webpack Hot Module Replacement (HMR) to hot reload our CSS and JS files. We achieve this by using the Webpack dev server as a proxy in front of our Django server: the dev server handles all the frontend requests and passes the rest to Django.

The Setup

Let's install all the dependencies for this to work.


npm i -D webpack webpack-cli webpack-dev-server

In your Webpack config add the following:


devServer: {
	contentBase: 'path/to/static',
	host: 'localhost',
	port: 8000,
	writeToDisk: true, // optional
	hot: true,
	index: '', // to proxy root
	proxy: { // proxy everything
		context: () => true,
		target: 'http://localhost', // address of Django server
	},
	// reload for our django files
	before(app, server) {
		const chokidar = require('chokidar') // chokidar is already a dependency of webpack-dev-server
		chokidar.watch('path/to/django/files').on('all', function(event, path) {
			server.sockWrite(server.sockets, 'content-changed');
		})
	},
}

Set contentBase to point to your Django static directory. This is where Webpack will serve all the static content and client code.

The proxy option will send every request it cannot handle to your Django server, while the dev server serves the CSS and JS files itself. This is how we achieve HMR.

In the before hook, we watch any Django files that should trigger a full page reload. Good candidates are the template files.

Run the following command to start the dev server:


webpack-dev-server --config ./webpack.config.js --mode development

You might also want to add a command to your npm scripts for easy access.
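
For example, in package.json (the script name is arbitrary):


"scripts": {
	"dev": "webpack-dev-server --config ./webpack.config.js --mode development"
}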

New Levels of Productivity

Now you can edit your JS, CSS, HTML, and Python files and have them reload automatically. We achieved that with only one Webpack config and no change to our workflow.

Keywords: django, webpack

Easiest way to reload webpack dev server when other files change

Using the webpack dev server with hot module replacement for your frontend is very convenient and productive. But if a file is not a dependency in webpack, changing it will not trigger a reload. This can be a hassle if you're editing HTML files or developing alongside a backend like Django or Laravel and would like the page to refresh when a file changes. This can be accomplished with just a few lines of code.

Add this to your webpack config:


// webpack config
devServer: {
	before(app, server) {
		const chokidar = require('chokidar');
		chokidar.watch('path/to/my-files/').on('all', function (event, path) {
			server.sockWrite(server.sockets, 'content-changed');
		});
	},
}

What does this code do?

The before() function is run by webpack when starting the dev server, and it allows you to add your own middleware. require('chokidar') imports the file-watching library used by webpack; it is similar to Node.js fs.watch() but with some nice functionality added. You're free to use a different library if you want. Next, we ask chokidar to watch some files; it accepts a wide variety of inputs such as globs and arrays. Each time a file changes, we tell webpack that content has changed, and it reloads the page for us.

That was surprisingly easy! Yet it is barely documented anywhere, and there are even entire plugins dedicated to solving this problem!

Keywords: webpack, dev, server, hot, reload, backend, file, change

Comparison of Golang Input Validator Libraries

This is a comparison of some open source Golang validation libraries on GitHub. These libraries do not handle data sanitization. Each library has its strengths and weaknesses. Let's get started.

go-playground/validator

go-playground/validator implements value validations for structs and individual fields based on tags. It's probably the oldest and most popular validation library out there. It supports complex validation methods such as cross-field comparison, i18n, and diving into nested fields. This library would be great if you have complex validation requirements.
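
Here's a minimal sketch of the tag-based style (the struct and rules are made up for illustration):


package main

import (
	"fmt"

	"github.com/go-playground/validator/v10"
)

type User struct {
	Email string `validate:"required,email"`
	Age   int    `validate:"gte=18"`
}

func main() {
	validate := validator.New()
	// Struct() walks the struct fields and applies the rules in the tags
	err := validate.Struct(User{Email: "not-an-email", Age: 16})
	fmt.Println(err)
}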

asaskevich/govalidator

asaskevich/govalidator is a package of validators and sanitizers for strings, structs and collections, and is based on validator.js. Similar to go-playground/validator, govalidator uses struct tags to define rules. It also provides individual validation functions and is a little easier to learn. This is a great alternative if you like mature and popular libraries.
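
A minimal sketch showing both styles (the struct and rules are made up for illustration):


package main

import (
	"fmt"

	"github.com/asaskevich/govalidator"
)

type User struct {
	Email string `valid:"email"`
	Name  string `valid:"required"`
}

func main() {
	// individual validation function
	fmt.Println(govalidator.IsEmail("hello@example.com")) // true

	// struct tag based validation
	ok, err := govalidator.ValidateStruct(User{Email: "not-an-email"})
	fmt.Println(ok, err)
}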

go-ozzo/ozzo-validation

go-ozzo/ozzo-validation provides configurable and extensible data validation capabilities. Unlike the previous two libraries, ozzo-validation uses programming constructs instead of struct tags to define rules. This makes it less error-prone and more readable than the other two.
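
A minimal sketch of the rule-as-code style (the struct and rules are made up for illustration):


package main

import (
	"fmt"

	validation "github.com/go-ozzo/ozzo-validation/v4"
	"github.com/go-ozzo/ozzo-validation/v4/is"
)

type User struct {
	Email string
	Age   int
}

func (u User) Validate() error {
	// rules are plain Go code, so a typo in a field name fails at compile time
	return validation.ValidateStruct(&u,
		validation.Field(&u.Email, validation.Required, is.Email),
		validation.Field(&u.Age, validation.Min(18)),
	)
}

func main() {
	fmt.Println(User{Email: "not-an-email", Age: 16}.Validate())
}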

AgentCosmic/xvalid

AgentCosmic/xvalid is a lightweight validation library that can export rules as JSON so browsers can apply the same rules. xvalid is similar to ozzo-validation in that it uses programming constructs instead of struct tags to define rules. Being able to export rules to the browser will help you keep validation rules consistent. This is a great library if you only need simple validation.

Keywords: golang, library, validator, validation, open source, comparison, alternative

Increase Golang web app testing speed by over 2 times with parallel testing and more

We're all familiar with how tests get slower as we add more over time. If you're not already using parallel testing in Go, this article will give you an example of how to implement it in a web application. In one of my projects, I managed to decrease test time from 40s to just over 10s by employing these techniques.

The Solution

The easiest way to get major speed improvements is to add t.Parallel() to every top level test function.


func TestMyFunc(t *testing.T) {
	t.Parallel() // add this line
	// ... testing code
}

There's a pretty high chance that some of your tests might break. The biggest culprit is having singletons in your code. If a test uses shared state, whether in a singleton or not, you cannot make that particular test parallel; other tests can still be parallel. Other obstacles are usually caused by bad testing practices, such as relying on a previous test's state or reusing the same database rows. You can solve this by ensuring each test is designed to run in isolation, as sketched below.
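
For example, here's a sketch of an isolated test; createTestUser is a hypothetical helper standing in for whatever your suite uses to seed data:


package app_test

import (
	"fmt"
	"testing"
	"time"
)

// createTestUser is a hypothetical helper; a real one would insert a
// fresh row into the test database and return it.
func createTestUser(t *testing.T, email string) string {
	t.Helper()
	return email
}

func TestCreateOrder(t *testing.T) {
	t.Parallel()
	// unique data per run, so parallel tests never collide on the same rows
	email := fmt.Sprintf("user-%d@example.com", time.Now().UnixNano())
	_ = createTestUser(t, email)
}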

Other Techniques

Another simple way to decrease test run time is to simply not clean up after every run; that means not deleting rows from the database and so on. If your tests are designed in isolation, this is an easy change. The other benefit of this technique is that your tests run against a "dirty" state that better mimics a production environment.
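
One way to sketch this is to make cleanup opt-in through an environment variable (CLEAN and cleanupDatabase are hypothetical):


package app_test

import (
	"os"
	"testing"
)

// cleanupDatabase is a hypothetical helper; a real one would truncate
// the test tables.
func cleanupDatabase() {}

func TestMain(m *testing.M) {
	code := m.Run()
	// skip cleanup by default; opt in with: CLEAN=1 go test ./...
	if os.Getenv("CLEAN") == "1" {
		cleanupDatabase()
	}
	os.Exit(code)
}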

You can take it a step further and break tests in smaller units like so:


func TestMyFunc(t *testing.T) {
	t.Run("t1", func (t *testing.T) {
		t.Parallel()
	})
	t.Run("t2", func (t *testing.T) {
		t.Parallel()
	})
}

This will also help during debugging because you can isolate a single subtest, e.g. -run TestMyFunc/t2.
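
For example:


go test -run TestMyFunc/t2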

You can shave off a few more seconds by running your code and database off a ramdisk. See how I run Go code from ramdisk.

Lastly, if you're not already using a tool to automatically run your code when you save a file, you should start now to save plenty of time during development.

Keywords: golang, go, test, speed, performance, parallel

How to Get File Notification Working on Docker, VirtualBox, VMWare or Vagrant

Have you tried using tools like inotify, nodemon or webpack inside Docker or VirtualBox but just couldn't get them to work? The problem lies with the virtual file system: it simply does not support the file notification API. The only existing solution out there is to use a polling method. One such program is fswatch, but it becomes slow once the number of files grows large.

xnotify solves all these problems while adding some useful features for common use cases. It works by sending file events from your host machine to your virtual machine over HTTP. It works on all popular platforms like Windows, Linux and OSX with a single binary. In addition, you can easily run commands on file change using the -- option.

Here's a real-world example of how I automatically run my Go tests and reload the server from within VirtualBox:

PowerShell script on my windows host:


.\xnotify.exe --verbose --client :8001 -i . -e "(vendor|\.git)$"

The command above watches all files in the current directory, excluding the "vendor" and ".git" folders, and sends the events to localhost:8001, the address where our next program is listening.

Bash script on my Linux VM to automatically run tests:


./xnotify --verbose --listen "0.0.0.0:8001" --base /my/project/dir --batch 50 -- go test -c app/test -- ./test.test -test.failfast

This will listen on 0.0.0.0:8001 for the file events that the Windows program sends. Then it runs the commands after the -- option, using /my/project/dir as the working directory.

Bash script on my Linux VM to automatically reload the server:


./xnotify --verbose --trigger --listen "0.0.0.0:8001" --base /my/project/dir --batch 50 -- go build cmd/main.go -- ./main

This works the same way, except that the commands after -- build and run the web server instead of the tests.

For more examples, see the README at the project page.

Keywords: file, notification, docker, virtualbox, vmware, vagrant, xnotify

Go: Automatically Run Tests or Reload Web Server on File Changes

Wouldn't it be convenient if you could automatically run your tests or reload your web server whenever you save a file? Here's a very simple way to achieve it. First download xnotify. Then run the following command in your project directory.


./xnotify -i . -- go test -c ./... -- ./test.test

That will automatically run your tests.


./xnotify -i . -- go build main.go -- ./main

That will automatically reload your web server.

What the command does is watch the current directory with -i ., then run the command that comes after --: the equivalent of running go test -c ./... && ./test.test. The -c option tells Go to compile the tests into a binary instead of running them. We need to do this so that the child process does not stay alive after the parent process is killed. Here's a more complex use case:

./xnotify -i . -e "(vendor|\.git)$" --batch 50 -- go test -c project.com/package/name -- ./test.test -test.failfast -test.run TestThisFunction

Some explanation on the code:

-e "(vendor|\.git)$" excludes "vendor" and ".git" directory.

--batch 50 will wait for 50ms before executing so that saving multiple files at once wouldn't restart too many times.

This time we pass a package name instead of the path with project.com/package/name.

-test.failfast will stop the run as soon as any test fails.

-test.run TestThisFunction will only run the test function called "TestThisFunction".

Keywords: go, automatic, cli, xnotify, test, build, server

Which subreddit deletes the most comments? Is there censorship?

If you've been a long-time reddit user, you've probably noticed that some subreddits are filled with deleted comments. Out of curiosity, I decided to do a quick analysis of just how many comments get deleted. Let's start by looking at the results.

Result

Looking at the top 10 subreddits gives us a good baseline for how many comments are usually deleted, because of the larger sample size they provide. Interestingly, r/science has an absurd number of deleted comments. In the second chart we see that the average share of deleted comments is 0.23% once r/science is excluded. That number will be useful for the next chart.

In the last chart I picked some interesting subreddits to study. The green line shows the average taken from the previous chart. Something I noticed is that the style of moderation has a bigger influence on deletions than the nature of the sub. For example, compare AskMen (0.11%) vs AskReddit (0.2%) vs AskWomen (2.53%), and relationship_advice (0.38%) vs relationships (1.32%).

Methodology

All data were pulled from the top 30 posts in each subreddit over 7 days in an ad-hoc manner. Pulls were done at least 24 hours apart to prevent any overlap.

Source Code

I'm releasing the source code so everyone can perform their own analysis. It's written in JavaScript and needs to be run in the browser console while on reddit. The main functions to run are right at the bottom.


function getPosts(subreddit, callback) {
  fetch('https://gateway.reddit.com/desktopapi/v1/subreddits/' + subreddit + '?&sort=hot')
    .then(response => {
      return response.json()
    })
    .then(data => {
      let ids = Object.keys(data.posts)
      callback(ids)
    })
    .catch(err => {
      console.error(err)
    })
}

function getComments(id, callback) {
  fetch('https://gateway.reddit.com/desktopapi/v1/postcomments/' + id + '?sort=top&depth=100&limit=100000')
    .then(response => {
      return response.json()
    })
    .then(data => {
      let stats = {
        total: 0,
        deleted: 0
      }
      for (let id in data.comments) {
        let comment = data.comments[id]
        stats.total++
        if (comment.deletedBy == 'moderator') {
          stats.deleted++
        }
      }
      callback(stats)
    })
    .catch(err => {
      console.error(err)
    })

}

function analyze(subreddit, callback) {
  getPosts(subreddit, ids => {
    var total = 0,
      deleted = 0,
      counted = 0
    for (let id of ids) {
      getComments(id, stats => {
        total += stats.total
        deleted += stats.deleted
        counted++
        if (counted == ids.length) {
          callback(subreddit, deleted, total);
        }
      })
    }
  })
}

function analyzeToJSON(subs) {
  var dict = {}

  function loop(i) {
    if (i >= subs.length) {
      console.log(JSON.stringify(dict));
      return
    }
    analyze(subs[i], (sub, deleted, total) => {
      dict[sub] = {
        deleted: deleted,
        total: total
      }
      loop(++i)
    })
  }
  loop(0)
}

function analyzeToCSV(subs) {
  let csv = ''

  function loop(i) {
    if (i >= subs.length) {
      console.log(csv);
      return
    }
    analyze(subs[i], (sub, deleted, total) => {
      csv += `${sub},${deleted},${total},${Math.round(deleted / total * 10000) / 100}%\n`;
      loop(++i)
    })
  }
  csv += 'sub,deleted,total,percent\n';
  loop(0)
}

function compile(entries) {
  let results = {}
  for (let entry of entries) {
    for (let sub in entry) {
      if (!results[sub]) {
        results[sub] = {
          deleted: 0,
          total: 0,
        }
      }
      let r = results[sub]
      let e = entry[sub]
      r.deleted += e.deleted
      r.total += e.total
    }
  }

  for (let sub in results) {
    let r = results[sub]
    r.percent = r.deleted / r.total
  }

  return results
}

function getMean(entries) {
  let total = 0,
    count = 0
  for (let sub in entries) {
    total += entries[sub].percent
    count++
  }
  return total / count
}

function getMedian(entries) {
  let list = []
  for (let sub in entries) {
    list.push(entries[sub].percent)
  }
  list.sort((a, b) => a - b)
  if (list.length % 2 === 0) {
    let a = list.length / 2,
      b = a - 1
    return (list[a] + list[b]) / 2
  } else {
    return list[(list.length - 1) / 2]
  }
}

function getStdDiv(entries) {
  let list = []
  let mean = getMean(entries)
  for (let sub in entries) {
    let d = entries[sub].percent - mean
    list.push(d * d) // squared deviation
  }
  // standard deviation is the square root of the mean squared deviation
  return Math.sqrt(list.reduce((accumulator, currentValue) => accumulator + currentValue) / list.length)
}

function getCSV(data) {
  let csv = 'sub,deleted,total,percent\n'
  for (let sub in data) {
    let r = data[sub]
    csv += `${sub},${r.deleted},${r.total},${Math.round(r.percent * 10000) / 100}%\n`;
  }
  return csv
}

function formatPercent(i) {
  return Math.round(i * 10000) / 100 + '%'
}

function report(entries) {
  const percents = Object.values(entries).map(e => e.percent)
  const max = formatPercent(Math.max.apply(null, percents))
  const min = formatPercent(Math.min.apply(null, percents))
  console.log(`Mean: ${formatPercent(getMean(entries))}, Median: ${formatPercent(getMedian(entries))}, Standard Deviation: ${formatPercent(getStdDiv(entries))}, Min: ${min}, Max: ${max}`);
  console.log(getCSV(entries));
}

var presets = {
  top10: [
    'funny', 'AskReddit', 'gaming', 'pics', 'science', 'worldnews', 'todayilearned', 'aww', 'movies', 'videos'
  ],
  custom: [
    'AskReddit', 'AskMen', 'AskWomen', 'TwoXChromosomes', 'sex', 'relationships', 'relationship_advice',
    'unpopularopinion', 'news', 'politics', 'The_Donald', 'MGTOW', 'AmItheAsshole',
  ],
}

var results = {
  top10: [{
      "funny": {
        "deleted": 3,
        "total": 5053
      },
      "AskReddit": {
        "deleted": 34,
        "total": 8687
      },
      "gaming": {
        "deleted": 2,
        "total": 3954
      },
      "pics": {
        "deleted": 4,
        "total": 3691
      },
      "science": {
        "deleted": 195,
        "total": 2786
      },
      "worldnews": {
        "deleted": 30,
        "total": 5580
      },
      "todayilearned": {
        "deleted": 1,
        "total": 4889
      },
      "aww": {
        "deleted": 5,
        "total": 3133
      },
      "movies": {
        "deleted": 10,
        "total": 5129
      },
      "videos": {
        "deleted": 1,
        "total": 4038
      }
    }, // add more results here
  ],
  custom: [{
      "AskReddit": {
        "deleted": 34,
        "total": 8723
      },
      "AskMen": {
        "deleted": 0,
        "total": 1626
      },
      "AskWomen": {
        "deleted": 37,
        "total": 2351
      },
      "TwoXChromosomes": {
        "deleted": 67,
        "total": 1872
      },
      "sex": {
        "deleted": 9,
        "total": 549
      },
      "relationships": {
        "deleted": 12,
        "total": 1276
      },
      "relationship_advice": {
        "deleted": 12,
        "total": 2684
      },
      "unpopularopinion": {
        "deleted": 5,
        "total": 2233
      },
      "news": {
        "deleted": 20,
        "total": 3252
      },
      "politics": {
        "deleted": 30,
        "total": 7723
      },
      "The_Donald": {
        "deleted": 21,
        "total": 3069
      },
      "MGTOW": {
        "deleted": 1,
        "total": 850
      },
      "AmItheAsshole": {
        "deleted": 30,
        "total": 8131
      }
    }, // add more results here
  ],
}

// analyzeToJSON(presets.top10)
// analyzeToCSV(presets.custom)
// report(compile(results.top10))
// report(compile(results.custom))

Keywords: reddit, subreddit, comments, censorship

How to Compile Go Code 40% Faster With RAM Disk

Go is already famous for its impressive compilation speed. But if you came from a scripting language with practically no compile time, you're probably still not satisfied. Here's the compile time after running go build main.go a few times to warm up the file system cache.


real	0m2.590s
user	0m2.685s
sys	 0m0.775s

It's easily 2 to 3 times slower when the files aren't cached, which can happen if your disk is experiencing a lot of usage. Here's the compile time when compiling from a RAM disk. A whopping 40% faster; almost a second off.


real	0m1.871s
user	0m2.124s
sys	 0m0.380s

Here's the bash script to get things working:


#!/bin/sh

if [ ! -d ~/ramdisk ]; then
	mkdir ~/ramdisk
fi
sudo mount -t tmpfs -o size=512M tmpfs ~/ramdisk
rsync -ah ~/go ~/ramdisk/
rsync -ah --exclude '.git' ~/path/to/project ~/ramdisk
export GOPATH=$HOME/ramdisk/go

This creates a directory under the home folder at ~/ramdisk, then mounts a 512MB RAM-backed tmpfs file system on it. The rsync calls copy all Go files and project files to the RAM disk. Finally, it sets GOPATH to the new Go path under ~/ramdisk.
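
Because the script exports GOPATH, it needs to be sourced rather than executed so the variable sticks in your shell. Assuming you saved it as ramdisk.sh (a name I made up):


. ./ramdisk.sh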

The next step is to mirror all file changes to the RAM disk instead of editing the files on it directly. This way you don't have to worry about losing your work. To do that, we need a tool that watches for file changes and automatically duplicates the changed files. You can use any tool you like, e.g. inotify, fswatch, nodemon etc. I'm going to use xnotify, a high-level tool that can also help with the build process.


./xnotify --verbose -i . --base /home/vagrant/ramdisk/project --batch 50 -- go build cmd/main.go -- ./main | xargs -L 1 ./copy.sh

copy.sh:


#!/bin/sh

NAME=$2
SRC=/path/to/project/$NAME
if [ -f "$SRC" ]; then
	echo "Copying: $NAME"
	cp "$SRC" ~/ramdisk/project/"$NAME"
fi

The command above basically copies each changed file to the RAM disk and runs go build cmd/main.go && ./main whenever a file changes. Now if we want to stop using the RAM disk, we just need to source this script:


#!/bin/sh

sudo lsof -n ~/ramdisk
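# lsof above lists any processes still using the RAM disk; stop them before unmounting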
sudo umount ~/ramdisk
rm -r ~/ramdisk
export GOPATH=$HOME/go

Keywords: go, golang, compile, speed, ram, disk, ramdisk