Archive for category Programming

Show code coverage from the Jest framework in Emacs

I recently switched from Jasmine/Istanbul to the Jest framework for my project jQuery Terminal, and coverage dropped from 80-81% to about 69%, partly because of code updates and commented-out code I was still working on.

So I thought it would be cool to highlight the lines that were and were not covered by tests, and since I use Emacs as my editor, I wrote a function that does exactly that.

Here it is:

(defun root-git-repo ()
  "Return the root directory of the current git repository."
  (interactive)
  (replace-regexp-in-string "\n"
                            ""
                            (shell-command-to-string "git rev-parse --show-toplevel")))

(defun line-pos-at-line (line)
  "Return the buffer position of the beginning of LINE."
  (save-excursion
    (goto-char (point-min))
    (forward-line (1- line))
    (line-beginning-position)))

(defun coverage-mark-buffer ()
  "Highlight statements in the current buffer using Jest coverage data.
Covered statements get a green background, uncovered ones a red background."
  (interactive)
  (let* ((dir (root-git-repo))
         (json-object-type 'hash-table)
         (json-array-type 'list)
         (json-key-type 'string)
         (json (json-read-file (concat dir "/coverage/coverage-final.json")))
         (filename (buffer-file-name (current-buffer)))
         (coverage (gethash filename json))
         (statements (gethash "statementMap" coverage)))
    (save-excursion
      ;; the "s" hash maps statement id -> hit count
      (maphash (lambda (key value)
                 (let* ((statement (gethash key statements))
                        (start (gethash "start" statement))
                        (end (gethash "end" statement))
                        (start-line-pos (line-pos-at-line (gethash "line" start)))
                        (start-pos (+ start-line-pos (gethash "column" start)))
                        (end-line-pos (line-pos-at-line (gethash "line" end)))
                        (end-pos (+ end-line-pos (gethash "column" end)))
                        (color (if (= value 0) "dark red" "dark green"))
                        (face `((t (:background ,color)))))
                   (hlt-highlight-region start-pos end-pos face)))
               (gethash "s" coverage)))))

The function uses hlt-highlight-region from highlight.el by Drew Adams.

The function doesn’t check whether the coverage file exists. It assumes that you’re opening a file from a git repository and that the coverage file (coverage-final.json) is in the coverage directory (the Jest default) at the git root.
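
For reference, the part of coverage/coverage-final.json that the function reads looks roughly like this (Istanbul’s format, heavily trimmed, with a made-up path and statement): statementMap describes where each statement starts and ends, and s maps each statement id to its hit count, which is what decides the red or green face above.

{
  "/home/user/projects/jquery.terminal/js/jquery.terminal.js": {
    "path": "/home/user/projects/jquery.terminal/js/jquery.terminal.js",
    "statementMap": {
      "0": {
        "start": { "line": 10, "column": 4 },
        "end": { "line": 10, "column": 25 }
      }
    },
    "s": { "0": 5 }
  }
}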

The function is quite slow, since it has a lot to process if the file is big. You can see what it looks like on this page, generated with the Emacs library htmlize.

If you want to clear the buffer you can use this function:

(defun coverage-clear-buffer ()
  "Remove coverage highlighting from the whole buffer."
  (interactive)
  (hlt-unhighlight-region (point-min) (point-max)))

Clearing is much faster.

If you’re interested in code coverage in Emacs, you can take a look at the jest-coverage minor mode that I’ve created based on this solution.


How to create a Web Server from the Browser

In my GIT Web Terminal I use BrowserFS to write files (it’s required by isomorphic-git, which I use to create a git interface in the browser).
BrowserFS is an implementation of the Node fs module for browsers; it can use a few types of storage, and I use IndexedDB, the same as the isomorphic-git examples.

I thought it would be cool to edit the project itself and be able to view the files as I edit them in the browser (before I commit), as if they were served from a web server. It turns out this is possible with a Service Worker, since you can access IndexedDB from the worker and you can create an HTTP Response from a string or an ArrayBuffer (BrowserFS returns an ArrayBuffer from its readFile function).
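
To illustrate just that idea before the full worker code below: a fetch handler can answer a request with a Response built directly from a string (a Blob or an ArrayBuffer works the same way), with no real server involved. A minimal sketch:

// minimal sketch: answer every request with an in-memory HTML string
self.addEventListener('fetch', function(event) {
    event.respondWith(new Response('<h1>Served from the Service Worker</h1>', {
        headers: { 'Content-Type': 'text/html' }
    }));
});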

Here is the code for the service worker that serves pages and directory listings:

self.addEventListener('install', function(evt) {
    self.skipWaiting();
    self.importScripts('https://cdn.jsdelivr.net/npm/browserfs');
    BrowserFS.configure({ fs: 'IndexedDB', options: {} }, function (err) {
        if (err) {
            console.log(err);
        } else {
            self.fs = BrowserFS.BFSRequire('fs');
            self.path = BrowserFS.BFSRequire('path');
        }
    });
});

self.addEventListener('fetch', function (event) {
    event.respondWith(new Promise(function(resolve, reject) {
        function sendFile(path) {
            fs.readFile(path, function(err, buffer) {
                if (err) {
                    err.fn = 'readFile(' + path + ')';
                    return reject(err);
                }
                resolve(new Response(buffer));
            });
        }
        var url = event.request.url;
        var m = url.match(/__browserfs__(.*)/);
        function redirect_dir() {
            return resolve(Response.redirect(url + '/', 301));
        }
        if (m && self.fs) {
            var path = m[1];
            if (path === '') {
                return redirect_dir();
            }
            console.log('serving ' + path + ' from browserfs');
            fs.stat(path, function(err, stat) {
                if (err) {
                    return resolve(textResponse(error404(path)));
                }
                if (stat.isFile()) {
                    sendFile(path);
                } else if (stat.isDirectory()) {
                    if (path.substr(-1, 1) !== '/') {
                        return redirect_dir();
                    }
                    fs.readdir(path, function(err, list) {
                        if (err) {
                            err.fn = 'readdir(' + path + ')';
                            return reject(err);
                        }
                        var len = list.length;
                        if (list.includes('index.html')) {
                            sendFile(path + '/index.html');
                        } else {
                            var output = [
                                '<!DOCTYPE html>',
                                '<html>',
                                '<body>',
                                '<h1>BrowserFS</h1>',
                                '<ul>'
                            ];
                            if (path.match(/^\/(.*\/)/)) {
                                output.push('<li><a href="..">..</a></li>');
                            }
                            (function loop() {
                                var file = list.shift();
                                if (!file) {
                                    output = output.concat(['</ul>', '</body>', '</html>']);
                                    return resolve(textResponse(output.join('\n')));
                                }
                                fs.stat(path + '/' + file, function(err, stat) {
                                    if (err) {
                                        err.fn = 'stat(' + path + '/' + file + ')';
                                        return reject(err);
                                    }
                                    var name = file + (stat.isDirectory() ? '/' : '');
                                    output.push('<li><a href="' + name + '">' + name + '</a></li>');
                                    loop();
                                });
                            })();
                        }
                    });
                }
            });
        } else {
            fetch(event.request).then(resolve).catch(reject);
        }
    }));
});
function textResponse(string) {
    var blob = new Blob([string], {
        type: 'text/html'
    });
    return new Response(blob);
}

function error404(path) {
    var output = [
        '<!DOCTYPE html>',
        '<html>',
        '<body>',
        '<h1>404 File Not Found</h1>',
        `File ${path} not found in browserfs`,
        '</body>',
        '</html>'
    ];
    return output.join('\n');
}

The service worker uses the __browserfs__ marker to distinguish normal URLs from URLs that should be served from BrowserFS/IndexedDB. Everything after the marker is served from BrowserFS, so if you write a file foo you can access it using __browserfs__/foo. Here is a code sample that creates a file using BrowserFS:

fs.writeFile('/foo', 'hello world', function(err) {
   if (err) {
     console.log('Error write file');
   }
});

You can see how this works in my GIT Web Terminal: just clone any repo (preferably one that has directories) and view its files under the URL https://jcubic.github.io/git/__browserfs__/repo/path/to/file.
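
One thing not shown above: the page has to register the worker before it can intercept any requests. A minimal sketch, assuming the worker code is saved as sw.js at the site root (the filename and scope in the actual project may differ):

// register the service worker from the page; its scope determines
// which requests the fetch handler above gets to intercept
if ('serviceWorker' in navigator) {
    navigator.serviceWorker.register('sw.js').then(function(reg) {
        console.log('service worker registered with scope ' + reg.scope);
    }).catch(function(err) {
        console.log('service worker registration failed: ' + err);
    });
}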


How to create a D3 Plugin

I was learning D3 and wanted to write an attrs plugin that would allow using a plain object to set attributes on SVG DOM nodes. It seems this functionality was in D3 at some point but was removed.

I searched but was not able to find how to create plugins, until I looked at the source code of d3 transition (it’s easier to search the bundle file than the original files on GitHub).

At the end of that file was the code I needed to imitate: it turns out all a plugin has to do is add a method to the prototype.

So here is my attrs plugin:

d3.selection.prototype.attrs = function(attrs) {
  this.each(function(d, i) {
    var element = d3.select(this);
    Object.keys(attrs).forEach((key) => {
      element.attr(key, attrs[key]);
    });
  });
  return this;
};

You can use it just like the normal attr:

var g = d3.select('body')
  .append('svg')
  .attrs({width: 200, height: 200})
  .append('rect').attrs({
    x: 50,
    y: 50,
    width: 100,
    height: 100,
    stroke: 'rgb(255, 100, 100)',
    'stroke-width': 10,
    fill: 'black'
  });

It works the same way as jQuery plugins do.
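
For comparison, a jQuery plugin is created the same way, by attaching a method to $.fn (an alias for jQuery.prototype). This hypothetical counterpart is only meant to illustrate the pattern (jQuery’s own attr already accepts an object):

// hypothetical jQuery plugin defined the same way: add a method to $.fn
// and return `this` (via each) so the call stays chainable
$.fn.attrs = function(attrs) {
    return this.each(function() {
        var $element = $(this);
        Object.keys(attrs).forEach(function(key) {
            $element.attr(key, attrs[key]);
        });
    });
};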


How to add GPS coordinates to your photos using data from Google

I have a Nikon DSLR and I wanted to add GPS locations to my photos, so that when I upload them to Flickr they show up on the map and I don’t need to add the location by hand.

If you have an Android phone, you don’t need any extra GPS hardware to get GPS locations on photos taken with your camera.

The solution, if you have GPS enabled on your phone, is to take your location history from Google Maps using this URL: https://www.google.com/maps/timeline?pb (the export option is in the dropdown under the gear icon at the bottom).

You will get a JSON file with GPS coordinates and timestamps for all the locations recorded by your phone.

To extract the GPS data out of the JSON file I use this small script written in Python; it uses exiftool to write the EXIF back to the file, because PIL can only read EXIF data, not write it.

#!/usr/bin/env python

from __future__ import division
import simplejson as json
from PIL import Image
from dateutil import parser
from optparse import OptionParser
from subprocess import call
from datetime import datetime, timedelta

def get_date_taken(path):
    return Image.open(path)._getexif()[36867]

def nearest(items, pivot):
    return min(items, key=lambda x: abs(x - pivot))

def comparator(date, hours_shift = None):
    def compare(x):
        current = datetime.fromtimestamp(int(x['timestampMs']) / 1000.0)
        if hours_shift is not None:
            current = current + timedelta(seconds = hours_shift * 60 * 60)
        return abs(current - date)
    return compare

def get_gps(gps, date, hours_shift = None):
    return min(gps['locations'], key=comparator(date, hours_shift))

def parse_date(str):
    return datetime.strptime(str, "%Y:%m:%d %H:%M:%S")

def timestamp(dt, epoch=datetime(1970,1,1)):
    td = dt - epoch
    return (td.microseconds + (td.seconds + td.days * 86400) * 10**6) / 10**6

def timezone(date, hours):
    return date - timedelta(seconds = hours * 60 * 60)

if __name__ == '__main__':
    from sys import argv
    opt = OptionParser()
    opt.add_option('-l', '--location')
    opt.add_option('-t', '--timezone')
    (options, args) = opt.parse_args()
    if options.location is None or len(args) != 1:
        print "usage %s [--timezone <hours shift>] --location [History JSON File] <IMAGE FILE>" % argv[0]
    else:
        gps_list = json.loads(open(options.location).read())
        date = parse_date(get_date_taken(args[0]))
        if options.timezone is not None:
            loc = get_gps(gps_list, date, int(options.timezone))
        else:
            loc = get_gps(gps_list, date)
        found = datetime.fromtimestamp(
            int(loc['timestampMs']) / 1000.0
        )
        print "%s == %s" % (date, found)
        call([
            'exiftool',
            '-m',
            '-GPSLatitude=%s' % str(int(loc['latitudeE7']) / 1e7),
            '-GPSLongitude=%s' % str(int(loc['longitudeE7']) / 1e7),
            args[0]
        ])

To use it you need to open a terminal and execute gps.py --location <PATH TO JSON FILE> <JPEG FILE>.

The only issue I’ve found is that PIL can’t extract EXIF from RAW/NEF files, so the script can only read from JPEGs. You can still write EXIF to RAW/NEF files while reading the creation time from the JPG, which works if you shoot in both JPG and RAW like I do.


How to simulate errors for HTTP requests in an R application

I was working on a Shiny R application and the app was using xmlParse(path) to parse a URL from the web API I was using. I needed to add error handling to respond to 404 and 500 errors, so I changed my code to use the httr library. That way I could use a two-step process for getting the XML from the API.

My code looked like this:

      res <- GET(url)
      if (res$status_code == 500) {
        message <- "Error From API" 
        simpleFatalError(message)
        stop(message)
      } else if (res$status_code == 404) {
        message <- paste("Study", .session$sid, "not found")
        simpleFatalError(message)
        stop(message)
      } else if (res$status_code != 200) {
        message <- paste(
          "Unknown response code",
          res$status_code,
          "for Study",
          .session$sid
        )
        simpleFatalError(message)
        stop(message)
      }
      studyXML <- tryCatch(
        xmlRoot(xmlTreeParse(
          content(res, "text", encoding = "UTF-8"),
          useInternalNodes=TRUE
        )),
        error = function(e) {
           simpleFatalError("parse error")
        }
      )

Now I had the problem of how to simulate a 500 error, because the API error that needed handling had already been fixed by the API team. So what I did was install Fiddler (I work on Windows; if you work on Linux, Burp Suite would be the better choice).

I used FiddlerScript, which seemed like the simpler solution because it was essentially just one line of code:

    static function OnBeforeResponse(oSession: Session) {
        if (m_Hide304s && oSession.responseCode == 304) {
            oSession["ui-hide"] = "true";
        }
        if (oSession.url.Contains("study.xml")) { // this was part of the url that returned xml from API
            oSession.oRequest.FailSession(500, "Blocked", "Fiddler blocked this file");
        }
    }

But then I had the problem that Fiddler was not capturing my backend requests. It turned out that I needed to enable the proxy for httr, which is as simple as:

res <- GET(path, use_proxy("127.0.0.1", 8888))

Then I was able to test my connection to the API and get the proper error in the Shiny app.


How to automate uploading SSL certificates from Let’s Encrypt to DirectAdmin

On my shared hosting (where I have my personal website and my Polish blog) I have access to the DirectAdmin panel, which has an option to set an SSL certificate, but it doesn’t support Let’s Encrypt out of the box.

So I thought I would create a Python script that uploads certificates for all of my domains accessible from DirectAdmin and uploads the HTTP challenges that Let’s Encrypt uses to confirm that I’m in control of each domain.

The script can be found in a GitHub gist.

You need to update the cert.py script to include your username/password, the FTP host, and the URL of the DirectAdmin interface. In my case I was using a static URL that redirected to a different port and a hosting-provider domain; I used that URL because the real server and domain change from time to time. That’s why the script uses a recursive function that sends HEAD requests to get the real URL after the redirect.

The gist includes a bash script that can be used in cron (you need to put in your domains and subdomains). If you execute the script for the first time and have not used certbot before, you will need to provide your email and agree to the TOS (using command-line options).

The cert.py script is used in two modes: one, when the certbot environment variables are set, uploads a challenge to FTP; the other sets the SSL certificate for each domain using DirectAdmin.

Instructions on how to install certbot can be found on certbot.eff.org. After installation you can find short docs in the certbot man page that explain each command-line option.

I’ve tested the script only on GNU/Linux. For Windows/macOS it will require modifications, like the path to the certificates generated by Let’s Encrypt.


Console.log for all or selected function calls with stack trace

I needed to debug some function calls: I needed to know which function is called and when. I decided not to use the debugger and step through the code, because that would be slow, so I wrote a function that logs all function calls and includes a stack trace if needed. The function can be called in a few ways:

  1. globals('createUser', true)

    will log calls to one function and include a stack trace; if the second argument is omitted or false, the stack trace is not printed

  2. globals(['createUser', 'deleteUser'], true)

    will log both functions and include stack traces

  3. globals({'createUser': true, 'deleteUser': false})

    will log both, but only the first one will have a stack trace

  4. globals()

    will log all global function calls

  5. globals(true)

    will log all global function calls and include a stack trace for every function

Here is the function itself:

// log calls to global (window) functions; `arg` selects which functions to log,
// `stack` controls whether a stack trace is printed for each call
function globals(arg, stack) {
  var show_stack;
  // globals(true): a single boolean is treated as the stack flag for all functions
  if (typeof arg == 'boolean') {
    stack = arg;
    arg = undefined;
  }
  var is_valid_name;
  if ($.isPlainObject(arg)) {
    var keys = Object.keys(arg);
    is_valid_name = function(name) { return keys.indexOf(name) != -1; };
    show_stack = function(name) { return arg[name]; };
  } else {
    show_stack = function() { return stack; };
    if (arg === undefined) {
      is_valid_name = function() { return true };
    } else if (arguments[0] instanceof Array) {
      is_valid_name = function(name) { return arg.indexOf(name) != -1; };
    } else {
      is_valid_name = function(name) { return arg == name; };
    }
  }
  // wait for DOM ready so the application's global functions are already defined
  document.addEventListener("DOMContentLoaded", function() {
    Object.keys(window).forEach(function(key) {
      var original = window[key];
      if (typeof original == 'function' &&
          !original.toString().match(/\[native code\]/) &&
          'globals' != key && is_valid_name(key)) {
        window[key] = function() {
          var args = [].map.call(arguments, function(arg) {
            if (arg instanceof $.fn.init) {
              return '[Object jQuery]';
            } else if (arg === undefined) {
              return 'undefined';
            } else {
              return JSON.stringify(arg);
            }
          }).join(', ');
          console.log(key + '(' + args + ')');
          if (show_stack(key)) {
            console.log(new Error().stack);
          }
          return original.apply(this, arguments);
        };
        // just in case some code parse function as strings
        window[key].toString = function() {
          return original.toString();
        };
      }
    });
  });
}

This function can easily be extended to log method calls on any object: just use the object instead of window. Here is an example of a higher-order function that creates such debug functions, with a few improvements:

// higher-order version: returns a globals-like logger bound to `object`,
// with `name` used as a prefix in the log output
function debug(object, name) {
  var fn = function(arg, stack) {
    var show_stack;
    if (typeof arg == 'boolean') {
      stack = arg;
      arg = undefined;
    }
    var is_valid_name;
    if ($.isPlainObject(arg)) {
      var keys = Object.keys(arg);
      is_valid_name = function(name) { return keys.indexOf(name) != -1; };
      show_stack = function(name) { return arg[name]; };
    } else {
      show_stack = function() { return stack; };
      if (arg === undefined) {
        is_valid_name = function() { return true };
      } else if (arguments[0] instanceof Array) {
        is_valid_name = function(name) { return arg.indexOf(name) != -1; };
      } else {
        is_valid_name = function(name) { return arg == name; };
      }
    }
    document.addEventListener("DOMContentLoaded", function() {
      var functions = Object.keys(object).filter(function(name) {
        return typeof object[name] == 'function';
      });
      functions.forEach(function(key) {
        var original = object[key];
        var str = original.toString();
        if (!str.match(/\[native code\]/) && !str.match(/<#debug>/) &&
            key != 'debug' && !original.__debug &&
            is_valid_name(key)) {
          object[key] = function() {
            var args = [].map.call(arguments, function(arg) {
              if (arg instanceof HTMLElement) {
                return '[NODE "' + arg.nodeName + '"]';
              } else if (arg instanceof $.fn.init) {
                return '[Object jQuery]';
              } else if (arg === undefined) {
                return 'undefined';
              } else if (arg === window) {
                return '[Object Window]';
              } else if (arg == document) {
                return '[Object document]';
              } else {
                return JSON.stringify(arg);
              }
            }).join(', ');
            console.log((name?name + '.': '') + key + '(' + args + ')');
            if (show_stack(key)) {
              console.log(new Error().stack);
            }
            return original.apply(this, arguments);
          };
          object[key].toString = function() {
            return str;
          };
          object[key].__debug = true;
          object[key].prototype = original.prototype;
          for (var i in original) {
            if (original.hasOwnProperty(i)) {
              object[key][i] = original[i];
            }
          }
        }
      });
    });
  };
  fn.toString = function() {
    return '<#debug>';
  };
  return fn;
}

You can debug all jQuery methods using:

debug(jQuery, '$')();

The second argument is a label for the log output.

The returned function accepts the same arguments as the first globals function. You can call debug with the window object to get the equivalent of the original function, so logging only one function with a stack trace looks like this:

var globals = debug(window);
globals('addExcludedWarning', true)

Another cool idea is to include the time of each function invocation; that is left as an exercise. Hint: you can use the performance.now() function to check the current time in milliseconds.
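
As a starting point for that exercise, here is a minimal sketch of a timing wrapper; the helper name is just for illustration and it mirrors the wrapping done above:

// hypothetical helper: wrap a function so every call logs how long it took
function timed(name, original) {
    return function() {
        var start = performance.now();
        var result = original.apply(this, arguments);
        // performance.now() returns milliseconds with sub-millisecond precision
        console.log(name + ' took ' + (performance.now() - start).toFixed(2) + 'ms');
        return result;
    };
}

// usage: window.createUser = timed('createUser', window.createUser);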
