Friday, March 8, 2013

New Developments with eVSM

I just realized I've been working on eVSM for almost a year and haven't really talked about any of the changes we've made in that time.  Part of the reason is that I haven't wanted to talk about upcoming features until they were released and well documented.

I now realize many features have been in use for at least three months, if not longer.

The first thing I'd like to talk about is one of the flagship functions of eVSM, the eVSM Calculator.  This tool has been around since the early versions of eVSM, and has grown new features throughout its existence.  What started as a simple function for exporting shape data fields to Excel and pulling user equations back in has grown into a powerful tool that writes the equations for you.

The complexity of the code behind the calculator grew quite a lot, since the original base code was never really designed to handle some of the functionality it came to support.  This ultimately resulted in long run times for calculating a map.  While probably good for the coffee, water bubbler, and gossip industries, we realized the calculator was getting too slow.

So we rewrote the tool from scratch, which is usually a bad idea.  But in our case, we treated the old calculator as a ten-year-old prototype that happened to work wonderfully in production.

The old calculator used VBA collections to hold much of the map's data in memory, which made things really hard to debug.  I did modify the collection code to store each key inside the collection as well, so I could at least see what data was where.  In the end, though, I designed a simple object model using VBA classes, which gives us "early binding" on many of the properties we want to know about our eVSM shapes (tag ID, paths present, variables present).

The overall hierarchy now has a Map class at the top level, which holds all the data for a single Visio page.  Each Map holds a list of all the "Row" centers, which are individual process (or other type) blocks; each of these is stored as a Tag object.  The Tag object keeps pointers to the center shape and the tag shape, and holds a list of Paths and Data.  The Data collection is a list of NVU objects.
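For illustration, here's a rough sketch of that hierarchy.  The production code is VBA class modules, but the shape of it translates to JavaScript classes; only the names taken from above (Map, Tag, Paths, Data, NVU, center/tag shapes) are real, and everything else (constructor arguments, the findTag helper) is my own invention for the sketch.

```javascript
class NVU {                          // one Name / Value / Unit entry on a tag
  constructor(name, value, unit) {
    this.name = name;
    this.value = value;
    this.unit = unit;
  }
}

class Tag {                          // one "Row" center: a process (or other) block
  constructor(id, centerShape, tagShape) {
    this.id = id;
    this.centerShape = centerShape;  // pointer to the Visio center shape
    this.tagShape = tagShape;        // pointer to the Visio tag shape
    this.paths = [];                 // Paths present on this block
    this.data = [];                  // the Data collection: a list of NVU objects
  }
}

class Map {                          // all the data for a single Visio page
  constructor(page) {
    this.page = page;
    this.tags = [];                  // every Row center on the page
  }
  findTag(id) {                      // hypothetical lookup helper
    return this.tags.find((t) => t.id === id);
  }
}

// Build a tiny in-memory map the way the calculator would after one read
// of the Visio page.
const vsm = new Map('Page-1');
const tag = new Tag('VSM001', null, null);  // shape pointers omitted in the sketch
tag.data.push(new NVU('Cycle Time', 30, 'sec'));
vsm.tags.push(tag);
```

Once the page has been read into objects like these, everything downstream works against memory instead of making repeated calls into Visio.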

By putting all the page data into this object model, we found we could read the Visio page just once and keep all the data we wanted in memory, in very convenient and (relatively) efficient objects.  The other big performance gain came from using batch array techniques to talk to the Excel application object as few times as possible.  We found we could make our VBA code as inefficient as we wanted, as long as we minimized the number of calls to the Excel and Visio application objects.

Overall we saw at least a 5x speed improvement while maintaining the functionality of the tool.  We did have to find and fix many bugs, most of which had probably been encountered once before, while writing the old calculator.  But I'm happy we rewrote from scratch, as this new object model has enabled us to do some great things, which I'll have to cover in other posts.

Thursday, February 14, 2013

Server Load Balancing Simulation

I read an interesting (to me) post about scaling a web application on the Heroku stack, and how they ended up having capacity issues that came not from the servers, but from the load-balancing router.

Heroku recently changed the router from selecting the next available server to selecting a server from the pool at random and passing the request along without blocking.  Each server then maintains its own queue.  Heroku's justification was that for a non-blocking server (like Node.js) this wouldn't be an issue, and should improve utilization.

But the folks at RapGenius use Ruby on Rails, which is a blocking server, meaning each server can process just one request at a time.

So I was reading the article, and they were complaining that they now needed many more servers to meet the demand on their website.  I was starting to get antsy when they seemed to be speculating, but then they showed results from a simulation they ran in R, and that made me really happy.  Especially since they were able to gather statistics on their request processing times and fit a distribution to them.

I can't help but think how easy this model would be to build, run, and analyze in any of the discrete event simulation tools we use, rather than in R.  Then again, R is free to use and our favorite discrete event simulators are not.  Still, I wonder how many of these startups would ever consider using something like Simio to model their website infrastructure and ultimately help themselves scale up.
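To make the routing effect concrete, here's a toy version of such a simulation, sketched in JavaScript rather than R or a commercial tool.  Every parameter (ten servers, exponential arrivals, uniform 50-150 ms service times, the seed) is invented for the sketch, not taken from the RapGenius data.

```javascript
// Compare Heroku-style random routing against "next available server" routing
// for blocking (one-request-at-a-time) servers.

// Small seeded LCG so runs are repeatable.
function makeRng(seed) {
  let s = seed >>> 0;
  return () => {
    s = (s * 1664525 + 1013904223) >>> 0;
    return s / 4294967296;
  };
}

function simulate(pickServer, { servers = 10, requests = 20000, seed = 42 } = {}) {
  const rng = makeRng(seed);
  const freeAt = new Array(servers).fill(0);  // time each server's queue drains
  let clock = 0;
  let totalWait = 0;
  for (let i = 0; i < requests; i++) {
    clock += -Math.log(1 - rng()) * 12;       // exponential interarrivals, mean 12 ms
    const service = 50 + rng() * 100;         // service time: 50-150 ms (made up)
    const s = pickServer(freeAt, rng);
    const start = Math.max(clock, freeAt[s]); // wait in that server's own queue
    totalWait += start - clock;
    freeAt[s] = start + service;
  }
  return totalWait / requests;                // mean queueing delay per request
}

// Random routing: the router may hand a request to an already-busy server.
const randomPick = (freeAt, rng) => Math.floor(rng() * freeAt.length);
// "Intelligent" routing: send the request to the server that frees up soonest.
const nextAvailable = (freeAt) => freeAt.indexOf(Math.min(...freeAt));

const waitRandom = simulate(randomPick);
const waitSmart = simulate(nextAvailable);
```

With blocking servers at high utilization, the random router should show a clearly higher mean queueing delay than the next-available router, which is the effect RapGenius measured.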

Here's the post.

Monday, December 17, 2012

Etch a Sketch Charting


Every metrics board on every shop floor needs one of these.  Don't like the latest metric?  Shake it until you forget...

Someone should hook one of these up to MT Connect.

Lifehacker Link

Friday, December 14, 2012

eVSM User Group

There is a user group for eVSM on LinkedIn where I and the other developers of the software are members.  We actively follow the group, and it's a great place to reach us for help with mapping problems, as well as to submit feature requests.  We also form feature review groups, which help drive improvements in the software.

Even though it's on LinkedIn, it's an open group, meaning anyone can at least view the contents without needing a LinkedIn account.

eVSM User Group on LinkedIn

As an aside, if you have any suggestions for eVSM you can also submit them by using the Contact Me link on the side of this page...

Thursday, December 13, 2012

Player Piano and MT Connect

I recently read Player Piano, by Kurt Vonnegut. The book is about a society that has had almost all manufacturing automated away, and the skilled labor relegated to fixing potholes, essentially being paid to do nothing.

The beginning of the book goes over how things are made.  At one point, bright young engineers devised a way to record the movements of master machinists and play them back over and over, obviating the need for machinists.  Aside from the question of how you'd make a new product or change a process without a machinist to record, the idea must have been intriguing in 1952 when the book came out.

The really interesting part, though, is Vonnegut's description of the status display in the office of a factory (which had about three people working, total).  The display simply shows a brass plaque for each machine, with a status lamp that flashes red when there's a problem.

The point of the book has little to do with how the factories worked, instead dealing with the resentment between the few people who still had meaningful jobs and those who didn't.  It did, however, get me thinking about what the point of MT Connect is.  If you think you need MT Connect in order to keep all your machines working all the time, then that is completely misguided.

I suggest instead you read The Goal and put some thought into which machines need to be fully utilized. Spoiler: it's only a few of the many.

Which leads me to an encouraging thought: you don't need to connect all your machines in order to get value from MT Connect.  You don't need a massive investment in Adapter installation and IT infrastructure.  You could instead find your bottleneck resource(s) and monitor those, to start correlating downtime with its root causes.  Then you can fix those root causes and increase throughput, without worrying about the 97% (made-up statistic) of machines where it doesn't matter whether they're working 100% of the time.

So the question I have now is what would you use to monitor bottleneck machines with MT Connect? Would it be as simple as a lamp for each machine that matters? Really, it could just be a smartphone app that shows a list of machines and flashes red for problems. Ideally it would route alerts to the right people automatically and keep some statistics on performance over time.

If no one beats me to it, maybe I'll build the app. As an aside, it would probably be easy to build using Firebase, which is excellent.

In conclusion, read The Goal and Player Piano and you'll be better off than you were.

Thursday, April 26, 2012

Now For Something Not Entirely Different

I've quit my job at the CT Center for Advanced Technology (CCAT) after over six years.  CCAT was the first real job I had out of college, and there I learned an incredible number of skills and worked on incredible projects with incredible people.

I couldn't help it.

I've been given the opportunity to join the folks at eVSM as a consulting developer, helping them add new features to their product, which, if you've read much of this blog, you'll know I'm a pretty big fan of.  We'll also get into some new areas which I can't discuss, but it's very exciting for me.

The reason I couldn't help but leave CCAT is that the eVSM gig allows me to be my own boss and take on contract work with anyone I want to work with.  The upshot is that I'm already contracting with CCAT on some MT Connect work, and I look forward to helping them on interesting projects in the future.  I'll post a link when this first contract is done.

At the moment I'm pretty tied up with CCAT and eVSM work, but over time as I get acclimated to working in this manner (read: when I want, where I want, and what I want) I hope to create some great software products that will help people in manufacturing (and healthcare, services, etc...) be great at their jobs.  I also hope some of the five people who read this blog might consider taking on my services at some point.

This probably means I'm not going to be doing much of anything with QUEST, though I expect to be doing something with simulation very soon (probably with Simio).  I also plan on continuing my participation in the MT Connect standardization process, because I truly think it will be a game changer in the field of Industrial Engineering.

Thursday, March 29, 2012

MT Connecting to Twitter

Google Apps Script is awesome.  I've been using Google Apps for years, but until recently (when adding a snooze function to Gmail) I had never used or really noticed Google Apps Script.  I'm going to abbreviate it to GAS, which isn't great, but I didn't pick the name.

So, GAS is basically the VBA of Google Docs, in that it's designed to help you integrate many of Google's services together.  I think GAS is even better than VBA because, for one, you get to write JavaScript, which, outside the browser context, is a joy to use.  Being able to add new methods and properties all willy-nilly is pretty great for someone who's been stuck with VBA for going on nine years.

Another reason to love GAS (sorry) is that your scripts live "in the cloud", and can be run at periodic intervals.  Combine that with Google Docs' spreadsheet module, and you can do some interesting data collection without having to run software on your own computer.

And now I'll start getting to the point.  Because MT Connect is built on the same technologies as the web, we can use many of GAS's built-in XML tools to easily parse an Agent's data stream.  Also, remember how I said JavaScript lets you add new properties to objects willy-nilly?  Well, the XML library exposes XML attributes and child nodes directly as object properties, rather than making you call routines to check for them and get their values (I'm looking at you, VBA).

So, to show how cool this is, I put together a GAS app in a Google Docs spreadsheet that runs a function every minute to grab the current power status of our Yasda H40i milling machine and store the latest value in a spreadsheet cell.  Any time that value changes, the script calls a function to send a tweet from the Yasda machine's Twitter account (@CCATYasda5) announcing the new power status.  Ridiculous?  Absolutely.
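For the curious, the core poll-and-tweet-on-change logic looks roughly like this.  This is a plain-JavaScript sketch, not the actual GAS code: fetchXml, readCell, writeCell, and sendTweet are hypothetical stand-ins for the real UrlFetchApp, SpreadsheetApp, and Twitter OAuth calls, and the PowerStatus element name, cell address, and agent URL are assumptions for the sketch.

```javascript
// Pull one data item out of an MT Connect Agent's XML response.  A real GAS
// script would use the XML service; a regex is enough for a one-item sketch.
function extractPowerStatus(xmlText) {
  const m = /<PowerStatus[^>]*>([^<]*)<\/PowerStatus>/.exec(xmlText);
  return m ? m[1] : null;
}

// Runs once a minute from a time-driven trigger.  Dependencies are injected
// so the logic is easy to test outside of GAS.
function checkYasda(deps) {
  const { fetchXml, readCell, writeCell, sendTweet } = deps;
  const status = extractPowerStatus(fetchXml('http://agent.example.com/current'));
  if (status && status !== readCell('B1')) {   // only act when the value changes
    writeCell('B1', status);                   // remember the latest value
    sendTweet('Yasda H40i power status is now: ' + status);
  }
}
```

The spreadsheet cell doubles as the script's memory between runs, which is what lets a once-a-minute trigger detect changes without any other storage.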

At some point I may dig deeper into this tutorial and see if I can set up a UI where you can simply enter a few items (agent URL, device UUID, and data item(s) to monitor) and have it authenticate into your own Twitter account.  But the fact of the matter is, Google Apps Script made it very easy for me to pull in XML data over the web, look up the previous status information, and handle authentication with Twitter, with very little effort on my part, all for free, running on the web 24x7, which is why Google Apps Script is awesome.