Mark, Julian and Ben will all be at IBM ConnectED from Friday evening this week. If you’d like to say hello, talk about LDC Via, enter our competition (come back in a couple of days to find out about that) or simply share a beer / some ludicrous sugary confection (in the case of the Wookiee), we’d be delighted to see you.
As you probably know, the conference itself is taking place in the Swan this year. Mark will be around a lot, although staying off-site, whilst the other two reprobates are residing in the Swan itself, so will no doubt be irritatingly easy to find.
In addition to general lingering and chatting, you will be able to find Mark at his splendidly useful “Beyond The Every Day” session, “1 App, 2 Developers, 3 Servers: Getting the Same Application to Run on Different Servers”, which is happening at 3.45pm in Mockingbird 1–2 on Tuesday—here’s a session link for those with ConnectED site access: https://portal.ibmeventconnect.com/wps/myportal/connected/site/Sessions/SessionFinder/detail/00071
Additionally, Mark has somehow become an IBM Champion (no we don’t know how either), and as such you will be able to find him at the Leadership Alliance events looking out of place and desperate to talk tech with anyone.
If you need to know what we look like (oof), take a look-see here. Smooth.
(For those wondering why Matt is not with us this week… he claims to have a very good reason, but we’re unconvinced ;-))
One of the key use cases for LDC Via that we see is for archiving data for successful IBM Domino applications. We all know that Domino has limitations: when a database gets too big, it can really affect performance… but there’s so much good stuff in Domino–like security and integration with email–that it would be a shame to have to migrate your whole application to a completely different platform.
What if there was a “middle ground”? How about moving older or inactive data to a different storage area, but one that still maintains document-level security? You may be surprised to learn how straightforward it is to integrate XPages with other platforms via a REST API.
By way of illustration, we’ve created a sample Domino application that connects to the LDC Via service, obtains a list of LDC Via-supplied databases that the current user can see, and lets the user browse those databases (an individual MongoDB database contains one or more “collections”, and each collection contains documents). Hopefully you can see by looking at the code that makes up the sample app just how easy this is. The main component that drives it all is a managed bean that pulls data from our REST services:
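As a rough illustration of the pattern (not the sample app’s actual code), a managed bean along these lines can pull JSON from a REST service. The endpoint path, `apikey` header and JSON shape below are assumptions made for the sketch, not the documented LDC Via API:

```java
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.ArrayList;
import java.util.List;
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Minimal sketch of a managed bean that pulls a database list over REST.
// The endpoint path, auth header and JSON field names are illustrative
// assumptions, not the real LDC Via API.
public class ViaDatabaseBean {
    private final String baseUrl; // e.g. "https://example.ldcvia.com" (hypothetical)
    private final String apiKey;  // per-user key, assumed header-based auth

    public ViaDatabaseBean(String baseUrl, String apiKey) {
        this.baseUrl = baseUrl;
        this.apiKey = apiKey;
    }

    // GET the raw JSON for the databases the current user can see.
    public String fetchDatabaseListJson() throws Exception {
        HttpURLConnection conn = (HttpURLConnection)
                new URL(baseUrl + "/1.0/databases").openConnection();
        conn.setRequestProperty("apikey", apiKey); // hypothetical header name
        conn.setRequestProperty("Accept", "application/json");
        StringBuilder body = new StringBuilder();
        try (BufferedReader in = new BufferedReader(
                new InputStreamReader(conn.getInputStream(), "UTF-8"))) {
            String line;
            while ((line = in.readLine()) != null) {
                body.append(line);
            }
        }
        return body.toString();
    }

    // Pull the "name" values out of a JSON array such as
    // [{"name":"unpsampler.nsf"},{"name":"demo.nsf"}]. A real bean would
    // use a JSON library; a regex keeps this sketch self-contained.
    public static List<String> databaseNames(String json) {
        List<String> names = new ArrayList<>();
        Matcher m = Pattern.compile("\"name\"\\s*:\\s*\"([^\"]+)\"").matcher(json);
        while (m.find()) {
            names.add(m.group(1));
        }
        return names;
    }
}
```

From an XPage, the bean’s properties can then be bound to repeat controls in the usual way.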
Using this code combined with a single XPage, we can retrieve the aforementioned database list, together with lists of collections within those databases, and for each collection the documents it contains (all based on what the user is permitted to see):
The screenshot shows the list of migrated NSF files (“databases”) that our test user has access to. We’ve selected the database called unpsampler.nsf, and then a collection therein called “Document”. This collection contains 88 documents accessible to the user, and using the API we can then view the first 30 documents, paging through the rest in much the same way you would with a Domino view.
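Paging of this kind usually comes down to simple offset arithmetic on the REST call. A sketch of the pager logic, assuming hypothetical `count`/`start` query parameters rather than the service’s documented ones:

```java
// Sketch of the arithmetic behind "view the first 30 documents, paging
// through the rest". The count/start parameter names are assumptions.
public class DocumentPager {
    private final int total;    // documents visible to the user, e.g. 88
    private final int pageSize; // documents per page, e.g. 30

    public DocumentPager(int total, int pageSize) {
        this.total = total;
        this.pageSize = pageSize;
    }

    // Number of pages needed to show every visible document.
    public int pageCount() {
        return (total + pageSize - 1) / pageSize; // ceiling division
    }

    // Query string for a zero-based page, ready to append to the
    // collection URL, e.g. "?count=30&start=60" for page 2.
    public String queryFor(int page) {
        return "?count=" + pageSize + "&start=" + (page * pageSize);
    }
}
```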
This is a very simple demo, and it wouldn’t take much effort to customise the code to fit your specific requirements. From the end user’s point of view there need be no difference at all in their experience, except that your XPages application is hopefully more responsive as you reduce the amount of live data you’re working with!
There is a phrase regarding security on websites that terrifies developers: “It’s not a question of whether you get hacked, but when.” It’s popular with the media of course—doom-laden articles will always sell, and strictly speaking this terrifying sentence is correct—but only for a given value of ‘correct’.
For example, I don’t think anyone could stop the NSA getting access to private data if the NSA had a thousand years and warehouses of dedicated brute-force script kiddies at their disposal (no doubt backed by law). What we can do, however, is make cracking data so bloody hard that the whole process becomes untenable. For LDC Via, we consider the “base data type” for all migrations to be financial transactions from large corporates—in other words, we do not play fast and loose with this stuff.
As we built LDC Via we had to consider the usual suspects in terms of attack vectors, and the defensive layers we need to provide. This is compounded by the fact that IBM Domino has always provided such good, simple security. We provide various options when using LDC Via, from simple data conversion to fully-converted applications with reasonably complex security requirements—and so there are appropriate decisions to be made.
Thankfully, different members of LDC take care of different areas of the application, and this enables us to impose a global security policy with each member taking care of a number of the constituent layers.
We break down the defensive layers into two categories:
- “The Usual Suspects”
- “Just For Us”
### “The Usual Suspects”
These are the layers that all web apps have to deal with. Quite a lot of them are irrelevant for on-premises solutions, but take on huge importance in our cloud offering.
Ports and firewalls. An easy one—only those ports absolutely necessary are opened. By default this means SSL (port 443), with the option of port 80 if non-secure transactions are required. That is it: no database ports, no remote-access ports, etc. We separate dedicated client servers from each other with firewalls as well, so there is no inter-communication between your dedicated boxes and those used by our other clients.
Operating system. We use a hardened Linux build for our production servers, with proper patching and all the things that keep administrators happy.
Hardware. Where our hosting providers support it, we offer hardware encryption. Now that most decent physical storage is based on solid-state technology, we have found that hardware encryption is not the performance horror it once was. If this is a requirement for your implementation, please tell us.
NSA. A new player in the security arena is the National Security Agency in the US. Their curious assertion that they have the right to take any data they deem fit for their purposes from anywhere on the planet means that even if your data is not stored in the continental USA, if the hosting provider has an office registered there then the NSA can pressure that organisation to provide access to your data (with the hosting provider and LDC Via legally obliged not to tell you about it).
Well, we are registered in the UK, and we have access to multiple hosting providers, both local and international, to meet the different data security needs of clients.
Databases. All databases offer a variety of security options, and the methods we implement will vary according to the back-end server involved. In our current offering (based on MongoDB) we use both standard MongoDB users and roles for security (rather than those wide-open “service accounts” that so often lead to wholesale database leaks in the case of a security breach).
However, that’s not enough: MongoDB only really offers granular security down to a “collection” level, so that leads us on to the next level of security…
### “Just For Us”
Readers and authors. One of the core tenets of IBM Domino security, and something that is a devil to reproduce in other systems: readers and authors fields, controlling document- and field-level access to data. Very few systems and databases provide document-level security, so to effect it we have constructed a “data wrapper” around every attempt to access the database from the application (a Java driver wrapper is also provided). The metadata stored in the database is checked against the requesting user, and the appropriate level of access is granted on that basis. This includes group access as well as individual rights.
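In outline, a readers check of this kind can be a small pure function: a document is visible if its readers metadata is empty, or if it names the user directly or one of the user’s groups. The names below are illustrative, not the wrapper’s actual API:

```java
import java.util.Collection;

// Sketch of a document-level readers check of the kind a data wrapper
// performs. Class and parameter names are illustrative assumptions.
public class ReaderCheck {

    // True if the requesting user may read the document.
    public static boolean canRead(Collection<String> docReaders,
                                  String userName,
                                  Collection<String> userGroups) {
        if (docReaders == null || docReaders.isEmpty()) {
            return true; // no readers restriction: visible to everyone
        }
        for (String entry : docReaders) {
            if (entry.equalsIgnoreCase(userName)) {
                return true; // named individually
            }
            for (String group : userGroups) {
                if (entry.equalsIgnoreCase(group)) {
                    return true; // named via group membership
                }
            }
        }
        return false;
    }
}
```

Authors fields work along the same lines, gating writes rather than reads.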
Configurable security. At any time, administrators can modify which fields in a collection are used to define security, and administrators, of course, will always be able to see all documents (you won’t lose anything!).
So there you go, some of the background to security in LDC Via. As Mark says, it’s like arguing about politics: never-ending, often quite heated, but one day we will build a better world (if the rest of the buggers stop talking rubbish).
We’ve spoken about the whys and wherefores of LDC Via already, but some of the use cases we mentioned in our last post probably bear some expansion. So let’s do that.
1. Re-homing historical data from servers that are being retired
A number of organisations maintain Domino infrastructure purely so that they can reference data held thereon—no new development, no real activity, but they need to “keep the lights on”. These servers may be poorly-supported, may not have robust back-up, and can even be running out-dated or un-patched infrastructure (I think many of us have heard about Domino 6 or 7 boxes still running in production). So why not kick those boxes into touch and serve up your data in a modern, cost-effective manner (with support)?
2. Archiving data from Notes & Domino applications that have grown too large for Domino
There are all manner of tips ’n’ tricks to make your applications scale. Ultimately however, there are some limitations in our venerable platform, and it is not the solution for every hosting use-case.
The mark of a successful Domino application is one that has lasted years and become overloaded with documents. Why not give the old app a new lease of life? Move historical data off to a platform like LDC Via, and let your Domino application’s view indices breathe once more.
(Incidentally, if this use-case is of especial interest, do look out for our LDC Via-XPages integration demo on this very site.)
3. Integrating Domino-based data with other applications
Domino can be a bit of a silo. Sure, there are ways of surfacing its data in other environments, but that can get tricky fast. Pushing selected data to LDC Via, with its RESTful API layer and pleasing user interface, could be a solution here. All modern application platforms play well with RESTful systems, and Via provides a platform for customisation and extension as you see fit.
4. Selectively making Domino-based data available outside the firewall without exposing your Domino infrastructure
5. Migration of Domino applications to other stacks for increased interoperability and performance
Current environments include the MEAN stack, which has a vibrant user base. Why not take advantage of those skills and opportunities? It means granting a new generation of developers access to your Domino data.
6. Exposing ‘IBM Notes’ application data to non-Notes clients (e.g. web and mobile)
As per item #4, a lot of corporate data may only be accessible via IBM Notes clients inside the firewall. You could web-enable such applications in a bid to move away from fat clients. Or perhaps your organisation has a target platform for projects that doesn’t include Domino? Well, LDC Via could be an option.
You’ve heard us talk a little about the Why of LDC Via and hopefully that’s got you interested. But we’re very aware that what we’ve created is a significant change from running your Domino server.
What we need are people to kick the tyres of our NSF migration tools, our admin interfaces, security and also the development API. If you think you’d like to be involved then get in contact with us (some of you already have, thank you!). If you’ve been involved with a beta program before then the process will be familiar: we’ll get you set up with our beta environment, help you get started, and then let you play. In return we’ll be asking you to give us feedback, as brutally honest as possible, about the good, the bad and the things that need to be changed before LDC Via is production-ready.
In return for all of your effort you will get our eternal gratitude and very preferential pricing when we switch from beta to live.
Here’s a little more detail about the three main areas of LDC Via we want you to look at:
First we have the migration process. This is split into two parts: our online tool, which talks directly to your internet-facing Domino server, or the on-premises tool, which acts as a conduit between your internal Domino server and our cloud-based servers. We want to be able to migrate your full Domino databases and make sure that we are getting 100% fidelity. We’ve obviously done significant testing of this ourselves already, but we can always do more.
Second we have the admin interfaces. These allow you to see your data. Think of the main admin screen as the equivalent of an “All Documents” view combined with the “Document Properties” dialog box (only more useful, hopefully!). You can manage user accounts and security for your organisation too, which includes modifying document-level security settings.
Finally, for the developers out there, we have our API. This is what allows you to create new applications, or to integrate with your existing applications and access your data in whatever way you need. Here we want to make sure that the APIs we have created are sufficient, and to identify new APIs that will make your life easier.
So if all this sounds like it’s something you want to get involved with, please let us know and we’ll add you to the list. We aim to be starting the process in the next few weeks.