New year lighthouse!

Google Chrome / Lighthouse graphic

Happy new year (albeit a somewhat belated greeting on our part). Let’s hope 2019 brings us all some reprieve from the unrelenting bad news out there! They say that the key to happiness is dealing with what you can control, and so to that end we’re going to spend this blog post re-visiting an old tool that you may have ignored for a wee while – I know I have, given that my day-to-day work rarely includes front-end web development.

So: Lighthouse, a collection of open-source tools aimed at checking web pages for things like performance, accessibility, search engine optimisation (urgh) and general best-practice-type-stuff. The Lighthouse tooling was added to Google Chrome some time back; certainly version 3 has been there since Chrome v69.

Lighthouse is triggered by simply opening DevTools in Chrome and choosing to run an audit on a given website. After it has done its thing, you are presented with something like this:

Screenshot of Lighthouse tools report

Once the report screen is rendered, you can dive into all of the stats to your heart’s content. Clicking on each top-level circle provides a list of passes and failures, with onward links to aid remediation. Even in 2019, many websites fall down on their accessibility scores – running Lighthouse on your own site is a sobering experience.
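If clicking through DevTools doesn’t suit – say, you want audits running in a CI pipeline – the same engine is available as an npm package. A minimal sketch, assuming Node and Chrome are installed (the output path is just an example):

```shell
# Install the Lighthouse CLI globally (one-off)
npm install -g lighthouse

# Audit a page headlessly and write an HTML report
lighthouse https://example.com \
  --only-categories=performance,accessibility \
  --output=html --output-path=./report.html \
  --chrome-flags="--headless"
```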

2019: we’ve got lots to do, and I bet you do too. But what a nifty DevTool.

Migrating IBM Quickr

IBM Lotus Quickr icon

For the last eighteen months or so we’ve been engaged in a project to provide two things for a client:

  1. Migrate multiple IBM Quickr places to our platform, and
  2. Develop a new web application to replace IBM Quickr, both for the migrated content and for new places as required.

This is not an insignificant piece of work: Quickr was a content management platform that had a lot of investment from Lotus and IBM at various points, and it was around for a long time. Furthermore, once we had won the work and we were in the throes of advanced planning, we were also looking to undertake a massive re-write of the entire Via migration utility.

A re-write? Really??

Those who have been with us for a while will recall the web-based migration tool – still available on our site – which provides an excellent demonstration of part of what our platform does. However, if you are serious about bulk-migrating data – multi-NSF applications, mapping security fields, custom content, driving through schema changes and the like – then it’s our Java-based migration utility you need.

When we started to build Via over five years ago, we embarked upon an ambitious development programme: not only would we code the API layer, we would also build migration tooling ready for day one. This was admirable but misguided, because the upshot was that we were building direct-to-MongoDB migration code alongside the abstraction layer of our API. In short, the migration utility did not use our own API, which was clearly undesirable for all manner of reasons.

Opting to re-work the migration utility was therefore The Right Thing To Do. We simply had to mitigate that technical debt, take a deep (collective) breath and crack on. The Quickr migration project clearly underlined the need for a flexible, speedy migration path.

Almost every piece of code in the migration utility has been re-designed and re-written. We have multiple “importer” classes which can handle selection criteria at the form, document, view, database, directory and, of course, Quickr level. A “bootstrap” class orchestrates the import process, handling everything from dynamic configuration loading to the different directory types involved in migrating user profiles. We can even merge multiple Quickr places into one.
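To make the shape of that design concrete, here is a deliberately simplified Python sketch. The real utility is Java, and every class name below is invented for illustration – the point is the pattern: each importer knows how to select and read one kind of source, and a bootstrap object orchestrates them against a single target.

```python
class Importer:
    """Base class: selects source items and yields documents to migrate."""
    def select(self, criteria):
        raise NotImplementedError


class QuickrImporter(Importer):
    """Hypothetical importer for one or more Quickr places."""
    def __init__(self, places):
        self.places = places  # e.g. a list of Quickr place names

    def select(self, criteria):
        # Yield every document from each place matching the criteria
        for place in self.places:
            if criteria(place):
                yield {"place": place, "kind": "quickr-document"}


class Bootstrap:
    """Loads configuration and drives the importers in order."""
    def __init__(self, importers):
        self.importers = importers

    def run(self, criteria=lambda _: True):
        migrated = []
        for importer in self.importers:
            migrated.extend(importer.select(criteria))
        return migrated
```

Merging multiple places into one target then falls out naturally: hand several places (or several importers) to the same bootstrap run.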

Want to know more? Fancy migrating from Quickr, or even Domino.Doc? (Yes, that too!) Drop us a line.

Engage 2018 - not long now!

There is much scurrying around in LDC towers as we prep for moving to Rotterdam en masse for Engage next week.

This year will be a rarity for us because we are hoping to actually spend some time appreciating the conference itself and talking to people, as there are no sessions to present or production go-lives to handle.

We do have a giveaway prepared, as anyone who has been watching our tweets will know, but you will have to wait a few more days to see what it is!

If you want one, you will have to come to the stall and talk to us — and we hope you do! The whole team has been heads-down for the past year, with some of us not attending any community conferences or even seeing daylight in that time.

So drop by and let’s have a chin-wag and a beer!

Engage 2018!

In just a few weeks we’ll be at this year’s Engage event in Rotterdam: we’re so looking forward to it. Every year Theo, Hilde and team put together an excellent event in the most amazing venues. This year is no different: Engage takes place on water — the SS Rotterdam to be specific.

We love sponsoring Engage — in recent years it’s become our main conference — and as ever we’d be delighted to talk to you. Do pop by and see us: we’ll be near the entrance to the product showcase, at stand #3. We also have some rather nifty give-aways this year, so keep an eye on our Twitter feed.

GDPR: one month to go

The European General Data Protection Regulation, GDPR, comes into force in a month’s time. The GDPR is a massive overhaul and extension of existing data protection and privacy laws in the EU, the first in over twenty years.

We talk about the GDPR, naturally, in the context of Europe and the European Union. However, the GDPR has a far wider impact, although the reaction from some organisations and individuals to date would have you believe otherwise. Any organisation that has dealings, directly or indirectly, with EU citizens needs to consider the following questions:

  • Can you search across all relevant applications in your IT landscape?
  • Can you structure, save, and refer back to these search queries?
  • Do you know what sort of data you hold on individuals?
  • Do you know what constitutes “personally identifiable” information?
  • Who is your nominated data controller?
  • Are you in a position to process “forget me” requests, and information requests from data subjects?
  • What are your data disposal policies, and how are they managed?
  • Do you understand how that data is being used?
  • Have you got agreement from those individuals to hold their data?
  • What security measures do you have in place for data, personal and otherwise?
  • Data about a subject has to be portable: is it relatively straightforward for you to move such data from your current systems?
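Several of those questions — searching across applications, and servicing subject-access and “forget me” requests — boil down to the same mechanical problem. A toy Python sketch, with invented store names and a made-up record layout, of what such a query has to do:

```python
from typing import Dict, List

def find_subject(stores: Dict[str, List[dict]], email: str) -> Dict[str, List[dict]]:
    """Subject-access request: gather every record held on one person,
    keyed by the application ("store") it came from."""
    return {
        name: [r for r in records if r.get("email") == email]
        for name, records in stores.items()
    }

def forget_subject(stores: Dict[str, List[dict]], email: str) -> Dict[str, List[dict]]:
    """'Forget me' request: remove the same records, in place."""
    for name in list(stores):
        stores[name] = [r for r in stores[name] if r.get("email") != email]
    return stores
```

The hard part in practice is not this loop, of course – it is knowing what the stores are, and being able to run the same saved query against all of them.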

If you are concerned about any of these issues, or worry about vendor support for GDPR and related data protection legislation, contact us.