Binary Jam

Simon's blog, SharePoint, Arduino type things.

Best video on Homeassistant.io build

Not really for anyone else but me, but if you're interested in building a Pi with homeassistant.io then feel free.

Watch this video 'cos it's the best I've seen so far.

Excellent article on IIS Export Application

Great article on creating an IIS export and configuring all the settings.

This will also help those trying to create a parameters.xml file.

How to: Create IIS Site Package with Web deploy

Not Self Hosting

Had enough of self-hosting; for $13 a year WordPress can sort it all out, and I keep my domain.

So it seems to be OK; DNS kicked in for me. Cheeky WordPress doesn't do www, which I found out after I'd paid, but Google's results are all being properly redirected to the equivalent page on WordPress, so that's nice.

So a saving of $11 a year is nowt really, but there's none of the hassle of hosting and constant updates. The main reason I moved was a hack notice I got, after which I had to re-verify with Google. Who needs the hassle when, nowadays, I might have 3 registered viewers and a couple of hundred search drop-ins, and the main reason for this blog is for me to keep a record of my most handy stuff? Though I should pull my finger out and slap it all in GitHub.

BrowserSync, gulp based script, handling middleware via Corp proxy

Phew, that was a long title. So what's this about?

I live in a land of corporate proxies with giant .pac scripts, of HTTPS services and authenticated proxies.

It …makes….all….this….js….dev… HELL.

I use browser-sync as my local testing server; it's great. I need it to handle requests to remote APIs because of CORS and other security issues, at least until I can wrap a proxy around the remote system; even then it's handy to have the ability to proxy the API calls via a node server (browser-sync) for me.

This becomes an absolute bloody nightmare when you have an authenticated corporate proxy server.

None of the JS tools play nice. There is no such thing as a centralised store for proxy settings, so you have to enter them in the .rc file of every tool: git, npm, bower, and now the custom middleware. This is where Windows got it right and Linux, well, sucks. Oh, how I wish I still had ISA Server's client transparent proxy.
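For reference, "entering them in every tool" means something along these lines (corpproxy:8080 is a made-up address, substitute your own):

git config --global http.proxy http://corpproxy:8080
npm config set proxy http://corpproxy:8080
npm config set https-proxy http://corpproxy:8080
# bower reads a .bowerrc containing: { "proxy": "http://corpproxy:8080" }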

So the example I have here is a gulpfile that configures browser-sync to run, calls into the middleware extension to handle proxying of API calls to my remote system, and gets that component to play nice with the corporate proxy.

You need the agent; I tried without it and failed miserably.

gulpfile.js

var gulp = require('gulp');
var browserSync = require('browser-sync').create();
var proxy = require('http-proxy-middleware');
var HttpsProxyAgent = require('https-proxy-agent');

var proxyServer = "http://localhost:8080";   // 'Cos Fiddler, yeh!

// Forward any /api/ request to the remote host, tunnelling out
// through the corporate proxy via the agent.
var jsonPlaceholderProxy = proxy('/api/', {
    target: 'https://www.binaryjam.com',
    changeOrigin: true,
    logLevel: 'debug',
    secure: true,
    agent: new HttpsProxyAgent(proxyServer)
});

gulp.task('default', function () {
  browserSync.init({
    port: 8000,
    injectChanges: true,
    files: ["./src/**/*.{html,htm,css,js,json}"],
    server: { baseDir: "./src" },
    middleware: jsonPlaceholderProxy
  });
});
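Run gulp and browser-sync comes up on http://localhost:8000 serving ./src; anything your pages request under /api/ should get forwarded to the target, tunnelled out through the corporate proxy by the agent.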

DisplayTemplate:Hide edit fields based on choice field

This is a simple display template (JSLink) that will hide fields based on a value in a choice field. I use it for pseudo content types, which has the advantage of letting you switch type.

It’s not MDS compliant.

In this example, "ItemType" is the field name of the choice field, whose value could be "Standard" or "PopUp".

Each array in hideMeItems lists the fields that are hidden when that choice value is selected.

hiddenItems starts empty and collects the hidden elements so they can be unhidden on a change of type.

It relies on jQuery being in the masterpage or somewhere on the page.

You store this as a JS file in a doc lib, SP folder, Site Assets, etc., edit the form page, and point the JSLink setting at that file. Remember to use the tokens ~site or ~sitecollection, as JSLink doesn't like fixed URLs.
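For example, the JSLink property might end up set to something like this (the file name is hypothetical):

~sitecollection/SiteAssets/hideItemTypeFields.js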

$(function () {
    "use strict";

    // Elements we have hidden, so they can be shown again on a change of type
    var hiddenItems = [];

    // Per choice value, the selectors for the fields to hide
    var hideMeItems = {
        Standard: ["[id^=PopUpBodyText]", "[id^=ReadMoreUrl]", "[id^=ReadMoreUrlTarget]"],
        PopUp: ["[id^=Teaser]", "[id^=TargetUrl]", "[id^=TargetUrlTarget]"]
    };

    $('[id^="ItemType"]').change(hideItems);
    hideItems();

    function hideItems() {
        var selected = $('[id^="ItemType"]').val();

        // Unhide whatever the previous selection hid
        $.each(hiddenItems, function () {
            this.show();
        });
        hiddenItems.length = 0;

        // Hide the whole form row (tr) for each field of the selected type
        $.each(hideMeItems[selected], function () {
            var tr = $(this).closest('tr');
            tr.hide();
            hiddenItems.push(tr);
        });
    }
});

Adventures in BrowserSync

I'm really new to browsersync and node development, so this has been a pretty steep learning curve, but I thought I'd document something I had to figure out, as the documentation and guides on the web are hard to find or just missing.

For those who don't know and are new to this javascript lark: browsersync is a tool that runs under node to create a mini web server; it also injects javascript into your pages and communicates with the server when the file watcher sees a change to a file.

The effect is that you configure it, run it pointed at the files in the directory you are editing, and it fires up a browser; as soon as you save a file it reloads the page. A really cool feature is hot reloading: in certain circumstances and configurations it can detect that you have changed, say, an image or CSS file and will change only that item in the page, using JS to swap it to the new version rather than doing a full page reload.

I'm using a modified version of browsersync called lite-server, by John Papa, just because it was the one I came across first. I'll be honest, I'm not sure what lite-server gives me over native browsersync; it's just where I started. That said, you will spend a lot of time reading the browsersync docs, not the lite-server page.

The main point of this article is that, as well as serving pages and auto-reloading, browsersync gives me the ability to handle API calls and proxy them to local files (possibly another server, but I'm not there yet).

In the framework I'm writing, to mimic the new SP Framework experience (early days though) on legacy stuff and deliver a sandbox WSP, the example code makes a call using the SPServices library (this could be REST); that call, as you may know, has the path _vti_bin in it. So my browsersync config has code in it (the config is javascript) that can intercept this and deliver my content instead.

Below is the bs-config.js file I wrote to achieve this.

The module.exports is the standard bit that configures bs with which files to watch and how to configure the mini server.

The special part is the middleware setting. I have set it so that the 2nd slot points to my handleApiCall function. The reason I use the 2nd slot (that's the "2:" bit) is that if you cleared the 1st one it would no longer log the items it's serving to the console, and that logging is handy.

As you can see, the handleApiCall function is really simple: it detects "_vti_bin" in the path, reads a file from a specific place, and writes it out to the response stream along with the correct header for XML.

This could be improved, lots; it could read the request object and parse it to determine which file to send back. There's a sketch of that after the config below.

Of course someone has probably already done something like this, but I needed to do something quickly and there is enough to learn.

Saying that, I will be looking into proxy-middleware, a module for express/browsersync that will likely proxy to a real server, not just my local files.

Alternatively https://www.npmjs.com/package/apimock-middleware.

You learn there are so many OS projects out there in npm land that it's hard to find the right things.

// jshint node:true
var fs = require("fs");

// Any request whose path contains _vti_bin gets a canned XML response;
// everything else falls through to the normal static file serving.
function handleApiCall(req, res, next) {
    if (req.url.indexOf('_vti_bin') !== -1) {
        fs.readFile("./WebComponents/.container/.mockapi/1.xml", function (err, data) {
            if (err) throw err;
            res.setHeader('Content-Type', 'text/xml');
            res.end(data.toString());
        });
    }
    else {
        next();
    }
}

module.exports = {
    'port': 8000,
    'files': [
        './WebComponents/src/**/*.*'
    ],
    'server': {
        'baseDir': './WebComponents/src',
        'middleware': { 2: handleApiCall }
    }
};
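And here's the promised sketch of that improvement: picking the mock file from the request rather than hard-coding 1.xml. The naming scheme (one mock file per SOAP operation, named from the SOAPAction header) is my own assumption, not anything browsersync gives you.

var fs = require('fs');
var path = require('path');

function handleApiCall(req, res, next) {
    if (req.url.indexOf('_vti_bin') === -1) { return next(); }

    // e.g. SOAPAction: "http://schemas.microsoft.com/sharepoint/soap/GetListItems"
    // Node lower-cases header names; strip the quotes and take the last segment.
    var action = (req.headers.soapaction || '').replace(/"/g, '');
    var operation = action.split('/').pop() || 'default';
    var mockFile = path.join('./WebComponents/.container/.mockapi', operation + '.xml');

    fs.readFile(mockFile, function (err, data) {
        if (err) { return next(); }   // no mock for this call, fall through
        res.setHeader('Content-Type', 'text/xml');
        res.end(data.toString());
    });
}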

Braindump: Adding CORS support to an old SOAP web service

This has been a bit of a nightmare; once you start, you will find hundreds of Stack Exchange articles about this and the problems you will have.

So here are some key things.

Below IE11 (maybe 10?), CORS support was provided by the XDR object, and that wasn't automatically used in libraries like jQuery, so your jQuery stuff won't work, because IE doesn't use the proper XMLHttpRequest object, or what it has is borked. Till now at least. So, IE11, right!

There is some simple code you need in your .NET project.

This extract of system.webServer is needed (don't just paste this; you have to merge it into the existing section of your web.config).

This will allow ANYONE to connect, so go and read up on what each of these attributes does. Together, they all get added to the HTTP headers returned to the client on all items, yes, including your .aspx pages; now feel free to work out how to restrict that. I'd had enough by this point, and my site only has two things on it, both needing this.

  <system.webServer>
    <httpProtocol>
      <customHeaders>
        <!-- CORS Enabled -->
        <add name="Access-Control-Allow-Origin" value="*" />
        <add name="Access-Control-Allow-Methods" value="GET,PUT,POST,DELETE,OPTIONS" />
        <add name="Access-Control-Allow-Headers" value="origin, content-type, accept" />
        <add name="Access-Control-Allow-Credentials" value="true" />
        <add name="Access-Control-Max-Age" value="31536000" />
      </customHeaders>
    </httpProtocol>
  </system.webServer>

Next you will need a global.asax (not a global.aspx.cs, as some guides call it).

The Application_BeginRequest function is one of those that gets called as part of the request lifecycle.
What we are doing here is handling the case where you are making a non-standard request, which for SOAP services will actually be "text/xml".

Having a non-standard content type initiates the part of the CORS protocol they call a pre-flight request, which asks the server, "is this allowed or what?"; you are responding with "A-OK, matey".

protected void Application_BeginRequest(object sender, EventArgs e)
{
    // Answer the CORS pre-flight (OPTIONS) request with a plain 200;
    // the custom headers from web.config do the rest.
    if (HttpContext.Current.Request.HttpMethod == "OPTIONS")
    {
        HttpContext.Current.Response.StatusCode = 200;
        HttpContext.Current.Response.End();
    }
}

That's it. That's all you need to do server-side.

A matching client-side request might look like this:
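Here serviceUrl and soapEnv stand in for your endpoint and SOAP envelope; a hypothetical pair, just to make the snippet below concrete, might be:

// Hypothetical values; your service's WSDL defines the real envelope.
var serviceUrl = "https://server.example.com/MyService.asmx";
var soapEnv =
    "<soap:Envelope xmlns:soap='http://schemas.xmlsoap.org/soap/envelope/'>" +
    "  <soap:Body><GetItems xmlns='http://tempuri.org/' /></soap:Body>" +
    "</soap:Envelope>";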

$.ajax({
    url: serviceUrl,
    type: "POST",
    dataType: "xml",
    data: soapEnv,
    crossDomain: true,
    contentType: 'text/xml; charset="utf-8"'
});

For me, all this started working. I did all my mappings, pushed my array into Knockout observables, and Chrome was working brilliantly.

Then came IE. The pain, the endless searches.

Whilst debugging this, I could see my "OPTIONS" (preflight) request being made, and IE reported no headers returned. Which was nonsense, 'cos Chrome was doing it and working.

I rewrote that C# code over and over with many alternatives. I chatted to a nodejs bloke who showed me his code, which did exactly the same (well, close enough).

That code works.

I saw articles that said the website you're connecting to has to be in the Intranet security zone. Bullshit! If that were the case, how could you connect to Yahoo or any other CORS-compliant service?

So, the one to watch when you're developing this stuff: SELF-CERT SSL.

If you have created a self-cert and you browse to the service in IE and it's got a RED untrusted cert (and it will have, to begin with), this won't work. The confusing part is that Chrome doesn't care. Also, IE issues the OPTIONS HTTP request and shows a 200 status instead of throwing an error beforehand; it has to issue a request to discover the cert isn't trusted, and I suspect that's why it doesn't show any headers in the network analysis: at that point it just goes "eeek" and stops.

So export your certificate; don't next-next-next it, export it specifically to "Trusted Root". * DO THAT AT YOUR OWN RISK: if you're not sure, don't do it, and go buy a proper cert to get this working.

Close your browser and check it worked by navigating to the service endpoint in the browser; if it's not red, it worked. If it is, go do it again, but right this time.

With all this done, your AJAX call from IE11 should work with CORS.

If it don't, well, good luck 🙂.

VS Code, Git, Gulp and SharePoint

UPDATE: Since writing this, VS Code 1.0 was released, which told me I was using the old Git package, so I updated Code and Git. The new Git has a better Windows credential manager and it's built in now; you may find the git config wincred stuff in here is not needed, so experiment and don't do that bit unless you need to. Also, this was on Windows, not Mac or Linux, so some things might be different.

UPDATE 2: I created a framework for sandbox solutions, based on gulp and git and browsersync and yeoman and npm and bower etc.; search for generator-sb-framework. It was done two weeks before they announced the SPFx framework; I was thinking along the same lines, but without access to the 365 back-end code 🙂. Mine will work on SP2010 and 2013 and 365, though. It's a work in progress; anyone who wants to help, let me know. Now on with the article.

I get asked to "knock stuff up" quite often: a single page that has to grab data and display it in a funky way, and NOT look like SharePoint whilst utilising SharePoint.

I could be deploying it to MOSS or 2010 or 2013, any of the three, so I tend to write with SPServices, as REST isn't available on all of them.

I tend to write these things as single .htm pages hosted in a doc lib, which lends itself to Angular or Knockout with Bootstrap, etc.

Of course, when you write stuff like this, what you're doing is essentially creating a mini website connecting to services that exist outside of your website; except, thanks to virtual folders like _layouts, they are actually part of your site, so you just don't have to worry about CORS.

In the past I would just create the doc lib, open the folder (WebDAV) and just edit away.

This isn't good enough though; I want my stuff under source control, natively, and you just can't work like that.

Today I took the time to investigate using VS Code, git and gulp to achieve what I wanted.

To do this stuff you will need VS Code, Git for Windows (GitHub for Windows is handy too, it has a better GUI) and Node.js installed on your machine. You need npm, but it comes with node now, so no need for an extra installation. You will also need a Git repository; I'm using VS Online, which has its own issues. Go ahead and install all that lot.

There are some things called bower, and grunt, and yeoman, and all that. Not touching that lot, not yet anyway, so don't expect it here.

To start with, create a new Visual Studio Online project: name it, pick a methodology, and pick Git as the version control.

Navigate to your project, because next I’m making it easier to connect by creating a credential.

Explore your project and you land on its code page. There are various bits on it, but first let's click Generate Git Credentials.

Then click "create a personal access token". This applies across all git projects, so if you have one already you don't have to do this again; but you do need to store that username/password combo securely, in something like LastPass or Dashlane.

These are handy 'cos you can revoke them, and they have a lifespan.

Go back to your default code page, 'cos now we want to clone the Git repository. This is easier in Git Bash, which works like Linux, so no "DIR" commands here.

Firstly, if you're in a company, chances are you have a proxy server, and you need to configure git with proxy server settings. Because git doesn't have a bypass capability, I set my proxy server to be Fiddler, and then I can control what's going on with regards to bypass lists.

So, as I work in d:\dev, I changed dir and then set my proxies.

[Screenshot: setting the git proxy config in Git Bash]
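The commands in that screenshot are along these lines (assuming Fiddler is listening on 127.0.0.1:8080, the same address as the Code settings further down):

cd /d/dev
git config --global http.proxy http://127.0.0.1:8080
git config --global https.proxy http://127.0.0.1:8080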

This will set your global git config to point at the local Fiddler address (IF YOU CONFIGURED FIDDLER LIKE THAT). Feel free to set it to whatever proxy you want.

There is a slight problem when using VS Code and git with a remote host: when you issue a remote command it needs credentials, but VS Code has no way of prompting for them, so it just hangs till you kill it.

I found that you can save your creds locally; I do that just before cloning, see the next lot of commands.
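On the Git for Windows build I had, that meant enabling the Windows credential store, the "wincred stuff" the update at the top refers to:

git config --global credential.helper wincred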

Next is to clone the git repository you just created in Visual Studio Online. This creates the local version of the repo; you check in there, then sync to the master when you want, then branch and pull and push and whatever else git does, which I'm still getting my head around.

Copy the clone command from Visual Studio's code page,

[Screenshot: the Clone URL pop-up on the VS Online code page]

don't use mine; it's easy to guess what mine would be called, you won't get access, and if you do, MS and me will be having words,

but do it in Git Bash and enter your creds when prompted. You can use any username you like, but use the special credentials you created earlier as the password; this works, other ways I've tried don't.
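The command has this shape (the URL is made up; yours comes from the Clone pop-up):

git clone https://myaccount.visualstudio.com/DefaultCollection/_git/VSCodeDemo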

Your creds are now cached in the Windows Credential Manager thing; look in Control Panel if you need to remove them.

Next let's open the folder in VS Code: File -> Open Folder, and select your VSCodeDemo folder, inside of which is a .git folder.

Once open create a test.htm file and put some words in it, anything, this is a test. Save it.

Switch to the Git tab and check those changes in locally: give it a message of "Initial test" and click the tick at the top.

[Screenshot: the commit box and tick on the VS Code Git tab]

Now to sync up to the web: look at the status bar at the bottom and click either the cloud-with-an-up-arrow icon, which means publish, or the recycle icon; you will see one of the two.

[Screenshot: the publish/sync icons in the VS Code status bar]

When you return to Visual Studio Online, your code tab now shows the branches you have published, synced, whatever that means.

That's the git integration done. Have a play: add files, roll back, etc. Next up, configuring VS Code and some handy plugins.

When running VS Code in an enterprise, if you want to use plugins you have to configure it to use a proxy server.

File -> Preferences -> User Settings

This will open two files for editing: the default settings, for reference (don't edit), and settings.json.

Here’s mine

{
    "editor.fontSize": 12,
    "editor.tabSize": 4,
    "http.proxyStrictSSL": false,
    "http.proxy": "http://127.0.0.1:8080",
    "https.proxy": "http://127.0.0.1:8080"
}

Again I'm using Fiddler as my proxy server, because I have more control.

Next, hit F1 and let's install some extensions: F1, then type "ext install";

it will show you the available extensions. Here's what I've installed:

[Screenshot: the extensions I have installed]

You may want to control settings for Code and your plugins locally, per project, so open the workspace prefs: File -> Preferences -> Workspace Settings. Again this opens a settings.json file; here's mine:

{
    "minify.minifyExistingOnSave": true,
    "indent_size": 4,
    "indent_char": " ",
    "css": {
        "indent_size": 2
    },
    "beautify.onSave": false
}

This sets my auto-minify on, to create those min.js files for me automatically. You have to force-create one first by hitting F1 and typing minify; that creates the min file, and from then on it's automatic. Play with this, 'cos if you change its settings you can minify a whole directory of JS files.

Next, on to the task runner (gulp in VS Code).

Make sure you installed Node.js and that npm works; if that's got proxy config to do, go do it (I'll let you figure it out).
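The npm side of that is along these lines, again pointing it at Fiddler:

npm config set proxy http://127.0.0.1:8080
npm config set https-proxy http://127.0.0.1:8080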

In fact go follow this guide

https://cmatskas.com/setting-up-a-gulp-task-with-visual-studio-code/

OK so that made no sense, but I guess you at least installed gulp globally.

So, back to our project. Open a Node.js command shell, change directory to our project, and run the command:

npm init

Enter the details at the prompts; you can next through most of them. Change the name to all lower case, as it complains if you don't. I'm sure these have proper uses but, for now, meh.

Now back in VSCode hit F1 and type “Configure Task Runner”

it opens the tasks.json file. Rip all that out and replace it with this (we will create the actual gulp task in a minute):

{
    "version": "0.1.0",
    "command": "gulp",
    "isShellCommand": true,
    "args": [
        "--no-color"
    ],
    "tasks": [
        {
            "taskName": "deploy",
            "isBuildCommand": true
        }
    ]
}

Save that. The task we're going to create is a robocopy, so on build it will copy the contents of our folders to another folder.

Switch back to the Node.js shell and install local versions of gulp and robocopy ('cos the globals won't work):

npm install gulp
npm install robocopy

In the root of our project, create a file "gulpfile.js":

var gulp = require('gulp'),
    fs = require("fs");

var robocopy = require('robocopy');

gulp.task('deploy', function () {
    return robocopy({
        source: 'Web',
        destination: 'd:\\tmp\\web',   // backslashes must be escaped in JS strings
        files: ['*.*'],
        copy: {
            mirror: false
        },
        file: {
            excludeFiles: ['packages.config'],
            excludeDirs: ['Forms']
        },
        retry: {
            count: 2,
            wait: 3
        }
    });
});

This creates a gulp task called deploy (the one we matched in tasks.json) that uses robocopy to copy everything under the folder Web to another folder on this machine, d:\tmp\web. In VS Code, hit Ctrl-Shift-B and that builds and copies.

In using this mechanism we have to assume some stuff: firstly, that we build our deployable content under a folder under the root called Web. We do this 'cos of all the bloody rubbish node and git pish and paff and everything else needs.

Because we have all these extra files in our project, though, and some of them are not compatible with Git, we have to exclude them from git. This is easy.

Create a file in the root of the project called  .gitignore

node_modules

Add that line to it to exclude node_modules, as that's all that's incompatible really.

And lastly, where's the SharePoint? You promised me SharePoint, Tocker!

Well, what we want is to deploy this Web folder into a document library in SharePoint (assuming we are allowed files with .js/.html/.htm extensions; if not, save everything as a .aspx with no aspx code in it).

Imagine that we have our site collection and doc library, and its full path is this:

http://binaryjam.sharepoint.com/sites/mysite/myDocLib

Well, we just have to copy to the WebDAV folder equivalent.

This is a slight change to the gulpfile.js robocopy task.

Edit that gulpfile.js file and change the destination from d:\tmp\web

to

 destination: '\\\\binaryjam.sharepoint.com\\davwwwroot\\sites\\mysite\\myDocLib',

That davwwwroot is the magic door.

That's it. Build SPAs that live in doc libs, with Git source control and minifying and all that lovely stuff.

Of course, this tasker stuff lends itself to all sorts of tasks, like less/sass compilation, or TypeScript compilation for Angular 2 (or just TypeScript). Have fun.

MDS Compliant Disp Template example

This is just a reminder for me really; take a look if you like, but it's not done. It's based on Martin Hatch's colour field example, with Wictor's MDS detection.

https://github.com/binaryjam/DisplayTemplateExample

/// <reference path="DefinitelyTyped/microsoft-ajax/microsoft.ajax.d.ts" />
/// <reference path="DefinitelyTyped/sharepoint/sharepoint.d.ts" />
/// <reference path="DefinitelyTyped/DisplayTemplateFieldColour.d.ts" />
"use strict";
//This is a work in progress, trying to come up with a kind of best practice, of best practices
//because the office pnp examples do not do things how other JS peeps might.


//Im not convinced about this either, this should be SOD'ed  / Leaving in for future reference
//(jQuery || document.write('//ajax.aspnetcdn.com/ajax/jquery/jquery-1.10.0.min.js'));

//Creates the namespace and registers it so MDS knows its there, 
Type.registerNamespace('BinaryJam.JSLink');

(function(ns) {
	
	ns.DisplayTemplateFieldColour = function () {
		
		//private members
		var overrides = {};
	  	overrides.Templates = {};
	  
	   	overrides.Templates.Fields = {
	       //Colour is the Name of our field
	       'Colour': {
	          'View': colour_FieldItemRender,
	          'DisplayForm': colour_FieldItemRender
	        }
	    };

		//Create CSS classes
		var style = document.createElement('style');
		style.type = 'text/css';
		style.innerHTML = '.binaryJam_dt_FieldColour { display:inline-block; margin:3px; width:20px; height:20px; border:1px solid black; }';
		document.getElementsByTagName('head')[0].appendChild(style);
		
		function colour_FieldItemRender(ctx) {
			if (ctx !== null && ctx.CurrentItem !== null) {

				var divStyle = "style='background-color:" + ctx.CurrentItem['Colour'] + "'";
				var html = "<div class='binaryJam_dt_FieldColour' " + divStyle + "></div>" + ctx.CurrentItem['Colour'];
				return html;
			}
		};

		function registerTemplateOverrides() {
			SPClientTemplates.TemplateManager.RegisterTemplateOverrides(overrides);
		};

		function mdsRegisterTemplateOverrides() {
			var thisUrl = _spPageContextInfo.siteServerRelativeUrl + "js/jslink/test1.js";
			RegisterModuleInit(thisUrl, registerTemplateOverrides);
		};

		//Public interface
		this.RegisterTemplateOverrides = registerTemplateOverrides;
		this.MdsRegisterTemplateOverrides = mdsRegisterTemplateOverrides;
	};

})(BinaryJam.JSLink);

//This is the "official" way to check MDS
//https://msdn.microsoft.com/en-us/library/office/dn913116.aspx
if ("undefined" != typeof g_MinimalDownload && g_MinimalDownload && (window.location.pathname.toLowerCase()).endsWith("/_layouts/15/start.aspx") && "undefined" != typeof asyncDeltaManager) {
	BinaryJam.JSLink.DisplayTemplateFieldColour.MdsRegisterTemplateOverrides();
} else {
	BinaryJam.JSLink.DisplayTemplateFieldColour.RegisterTemplateOverrides();
};
