Binary Jam

Simon's blog, SharePoint, Arduino type things.

Quickly editing and beautifying JSON files in a folder

I had a need to batch edit a bunch of JSON files I had scraped containing my user information, each in a separate file. There was going to be a mass update of the system, so this way I could get a before and after.

Certain attributes in each user's export will have changed in every item, and they were not relevant, so when doing a bulk comparison using Code Compare these would show up as differences and had to be removed.

My files were all named out.guidofuser.json and stored in folders by group, so I needed to edit all of them, remove the attributes, and beautify them, otherwise viewing the comparison would be hard if there was a difference.

NodeJS is wonderful at doing this stuff.

A quick npm init, then this code ripped through all my JSON files, editing and beautifying them (fs and path are built into Node, so there is nothing extra to install).

var fs = require("fs");
var path = require("path");

var walkSync = function (dir, filelist) {

    var files = fs.readdirSync(dir);
    filelist = filelist || [];
    files.forEach(function (file) {
        var fPath = path.join(dir, file);
        if (fs.statSync(fPath).isDirectory()) {
            filelist = walkSync(fPath, filelist);
        } else {
            // Read and parse the file directly (require() would cache it)
            var jsFile = JSON.parse(fs.readFileSync(fPath, "utf8"));

            // Remove the attributes that change on every export
            for (var g in jsFile.groups) {
                delete jsFile.groups[g].modified;
                delete jsFile.groups[g].created;
            }
            delete jsFile.lastLogin;
            delete jsFile.loginCount;

            // Re-serialise with 4-space indents to beautify
            var json = JSON.stringify(jsFile, null, 4);
            fs.writeFileSync(fPath, json);

            filelist.push(fPath);
        }
    });

    return filelist;
};

var fl = walkSync("C:\\Users\\simon\\stg2", []);

Local hosts file use and Azure Web Apps

Sometimes you are working on a site and you need to test the real URL on it, but you are not ready to flip the DNS entry yet.

In IIS land you could add a host header and add an entry into your hosts file and it would just work.

Now enter Azure and you could have a dev site, perhaps even a deployment slot that contains your website and you want to test it, but some of that code relies on the URL, you have a URL Rewrite perhaps.

In order to add a custom domain to Azure you need to be able to make changes to the DNS server. It leads you to make either a CNAME entry or an A record / TXT entry, and it validates that before adding the hostname binding.

So how do you do a test on dev with the real URL and a local hosts-file entry?

Well it turns out that you only need the TXT entry to confirm ownership, which will then allow the addition of the Custom Domain to Azure.

My TXT DNS entry

How adding a host looks before the DNS entry. Oh well, you can spot my IP if you look hard enough. It's not a real site anyway.

Once you have added a TXT record and the propagation has happened, then even though you do not have an A record or CNAME, you will be allowed to add the custom domain. Just hit that Add hostname button and it's there.

Of course you cannot get to it unless you add an entry to your DNS server or hosts file for testing.
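For testing, the hosts-file entry is just your web app's IP against the real name. A minimal sketch (the IP address and domain below are made-up placeholders, not from the original post):

```shell
# Add to C:\Windows\System32\drivers\etc\hosts (or /etc/hosts on Linux/Mac)
# Illustrative values only - use your Azure web app's IP and your real domain
40.112.0.1    www.example.com
```

Flush your DNS cache (ipconfig /flushdns on Windows) if the browser has already cached a lookup.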

My JSLink best practice

There are lots of examples about regarding how to correctly do JSLINK stuff, and I’ve nicked ideas from all of them.

I’ve not been happy with any of them and I still wonder about mine, but this is the best I’ve come up with.

It’s MDS compliant, and it includes a routine to automatically assign a view, allowing multiple JSLinks on a page and the same JSLink applied to different parts if required.

It’s written as a module.

It’s a work in progress, it will evolve but I reckon I’m as far as I can get in this evolution.

Bits taken from Wictor Wilén, Martin Hatch, Paul Cimares.
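To make the shape concrete, here is a stripped-down sketch of that kind of module (the BJ.Demo namespace, the file path, and the template body are illustrative assumptions, not the actual module from the post):

```javascript
var BJ = BJ || {};
BJ.Demo = BJ.Demo || {};

// A trivial item template; ctx.CurrentItem carries the list item's field values.
BJ.Demo.itemTemplate = function (ctx) {
    return "<div>" + ctx.CurrentItem.Title + "</div>";
};

BJ.Demo.registerTemplates = function () {
    SPClientTemplates.TemplateManager.RegisterTemplateOverrides({
        Templates: { Item: BJ.Demo.itemTemplate },
        // Pinning a BaseViewID stops several JSLink overrides on the
        // same page from clobbering each other.
        BaseViewID: 1,
        ListTemplateType: 100
    });
};

// Guarded so the module is harmless outside a SharePoint page.
BJ.Demo.init = function () {
    if (typeof SPClientTemplates !== "undefined") {
        BJ.Demo.registerTemplates();
    }
};

// MDS compliance: re-register after every Minimal Download Strategy
// partial page load, as well as on the initial load below.
if (typeof RegisterModuleInit === "function") {
    RegisterModuleInit("~sitecollection/SiteAssets/bj.demo.js", BJ.Demo.init);
}
BJ.Demo.init();
```

The BaseViewID trick (assigning a distinct view ID per web part) is the usual way to let several JSLink files coexist on one page.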


Best video on build

Not really for anyone else but me, but if you’re interested in building a Pi with  then feel free.

Watch this video cos it’s the best I’ve seen so far.



Excellent article on IIS Export Application

Great article on creating an IIS export and configuring all the settings.

This will also help those trying to create a parameters.xml file.

How to: Create IIS Site Package with Web deploy


Not Self Hosting

Had enough of self hosting, and for $13 a year WordPress can sort it all out and I keep my domain.

So far it seems to be ok. DNS kicked in for me. Cheeky WordPress doesn’t do www; I found that out later, after I paid. But Google’s results are all being properly directed to the equivalent page on WordPress, so that’s nice.

So a saving of $11 a year is nowt really, but there’s none of the hassle of hosting and constant updates. The main reason I moved was a hack notice I got, after which I had to re-verify with Google. Who needs the hassle? Nowadays I might have 3 registered viewers and a couple of hundred search drop-ins, when the main reason for this is for me to keep a record of my most handy stuff, though I should pull my finger out and slap it all in GitHub.






BrowserSync, gulp based script, handling middleware via Corp proxy

Phew, that was a long title. So what’s this about?

I live in a land of corporate proxies with giant .pac scripts, of https services and authenticated proxies.

It …makes….all….this….js….dev… HELL.

I use browser-sync as my local testing server; it’s great. I need it to handle requests to remote APIs because of CORS and other security issues, at least until I can wrap a proxy around the remote system. Even then it’s handy to have the ability to proxy the API calls via a node server (browser-sync) for me.

This becomes an absolute bloody nightmare when you have an authenticated corporate proxy server.

None of the JS tools play nice. There is no such thing as a centralised store for proxy settings, so you have to enter them in the .rc file of every tool: git, npm, bower, and now the custom middleware. This is where Windows got it right and Linux, well, sucks. Oh, I wish I still had ISA Server’s transparent client proxy.
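For illustration, the same proxy ends up being repeated per tool, something like this (the proxy URL is a placeholder, not a real server):

```shell
# Illustrative corporate proxy URL - substitute your own
npm config set proxy       http://proxy.corp.example:8080
npm config set https-proxy http://proxy.corp.example:8080
git config --global http.proxy  http://proxy.corp.example:8080
git config --global https.proxy http://proxy.corp.example:8080
# bower reads a .bowerrc with "proxy" and "https-proxy" entries
```

And then the custom middleware below needs it yet again, via the agent.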

So the example I have here is a gulp file, that configures browser-sync to run and to call into the middleware extension to handle proxying of api calls to my remote system and for that component to play nice with the corporate proxy.

You need the agent, I tried without it and failed miserably.


var gulp = require('gulp');
var browserSync = require('browser-sync').create();
var proxy = require('http-proxy-middleware');
var HttpsProxyAgent = require('https-proxy-agent');

var proxyServer = "http://localhost:8080";   // Cos Fiddler yeh!

var jsonPlaceholderProxy = proxy('/api/', {
    target: '',                  // the remote API base URL goes here
    changeOrigin: true,
    logLevel: 'debug',
    secure: true,
    // Route the outbound request through the corporate proxy
    agent: new HttpsProxyAgent(proxyServer)
});

gulp.task('default', function () {
    browserSync.init({
        "port": 8000,
        injectChanges: true,
        "files": ["./src/**/*.{html,htm,css,js,json}"],
        "server": { "baseDir": "./src" },
        middleware: [jsonPlaceholderProxy]
    });
});










DisplayTemplate:Hide edit fields based on choice field

This is a simple display template (JSLink) that will hide fields based on the value of a choice field. I use it for pseudo content types, with the advantage of being able to switch between them.

It’s not MDS compliant.

In this example “ItemType” is the field name of the Choice field and could be “Standard” or “PopUp”

The array of hideme items are the fields that are hidden when that choice field is selected.

The hiddenItems array starts empty and collects the hidden elements so they can be unhidden on a change of type.

It relies on jQuery being in the masterpage or somewhere on the page.

You store this as a JS file in a doclib, SP folder, Site Assets, etc., then edit the form page and point the JSLink setting at that file. Remember to use the tokens ~site or ~sitecollection, as JSLink doesn’t like fixed URLs.

$(function () {
    "use strict";

    var hiddenItems = [];
    var hideMeItems = {
        Standard: ["[id^=PopUpBodyText]", "[id^=ReadMoreUrl]", "[id^=ReadMoreUrlTarget]"],
        PopUp: ["[id^=Teaser]", "[id^=TargetUrl]", "[id^=TargetUrlTarget]"]
    };

    function hideItems() {
        var selected = $('[id^="ItemType"]').val();

        // Un-hide whatever the previous selection hid
        $.each(hiddenItems, function (i, el) {
            el.show();
        });
        hiddenItems.length = 0;

        // Hide the table rows for the fields this type doesn't use
        $.each(hideMeItems[selected], function (i, sel) {
            var tr = $(sel).closest('tr');
            tr.hide();
            hiddenItems.push(tr);
        });
    }

    $('[id^="ItemType"]').on('change', hideItems);
    hideItems();
});

Adventures in BrowserSync

I’m real new to browsersync and node development. So this has been a pretty steep learning curve, but I thought I’d document something I had to figure out as the documentation and guides on the web are hard to find or just missing.

For those who don’t know and are new to this javascript lark, browsersync is a tool that runs under node to create a mini web server; it also injects javascript into your pages and notifies the browser when the file watcher sees a change to a file.

The effect of this is you can configure it, then run this thing to point at the files in the directory you are editing; it will fire up a browser and as soon as you save a file it will reload the page. A really cool feature is called hot reloading: in certain circumstances and configurations it can detect you have changed, say, an image or CSS file and will only change that item in the page, using JS to swap in the new version rather than doing a full page reload.

I’m using a modified version of browsersync called lite-server, by John Papa, just because it was the one I came across first. I’ll be honest, I’m not sure what lite-server gives me over browsersync native; it’s just where I started. That said, you will spend a lot of time reading the browsersync docs, not the lite-server page.

The main point of me writing this article was that as well as serving pages and auto-reloading, browsersync gives me the ability to handle API calls and proxy them to local files (possibly another server, but I’m not there yet).

In the framework I’m writing, to mimic the new SP Framework experience (early days though) but on legacy stuff to deliver a sandbox WSP, the example code makes a call using the SPServices library (this could be REST). That call, as you may know, has the path _vti_bin in it. So my browsersync config has code in it (the config is javascript) that can intercept this and deliver my content instead.

Below is the bs-config.js file I wrote to achieve this.

The module.exports is the standard bit that configures bs with what files to watch and how to configure the mini server.

The special part is the middleware setting. I have set it so that the 2nd slot points to my handleApiCall function (that’s the “2:” bit). The reason I left the 1st slot alone is that if you clear it, browser-sync no longer logs to the console the items it’s serving, which is handy to keep.

As you can see, the handleApiCall function is really simple: it detects “_vti_bin” in the path, reads a file from a specific place, and writes it out to the response stream along with the correct header for XML.

This could be improved, lots, it could read the request object and parse it to determine what file to send back.

Of course someone has probably already done something like this, but I needed to do something quickly and there is enough to learn.

Saying that, I will be looking into proxy-middleware, a module for express/browsersync that will likely proxy to a real server, not just my local files.


You learn there are so many OS projects out there in npm land that it’s hard to find the right thing.

// jshint node:true
function handleApiCall(req, res, next) {
    if (req.url.indexOf('_vti_bin') !== -1) {
        var fs = require("fs");
        fs.readFile("./WebComponents/.container/.mockapi/1.xml", function (err, data) {
            if (err) throw err;
            res.setHeader('Content-Type', 'text/xml');
            res.end(data);
        });
    } else {
        // Not an API call, let browser-sync serve it as normal
        next();
    }
}

module.exports = {
    'port': 8000,
    'files': [
        // watch glob reconstructed; the original list was truncated
        './WebComponents/src/**/*.{html,htm,css,js,json}'
    ],
    'server': {
        'baseDir': './WebComponents/src',
        'middleware': {
            2: handleApiCall
        }
    }
};
