So here's a video showing the new ProModel translator I wrote for CMSD. It also shows a little bit of the CMSD Viewer program I've been working on. Best watched in HD so you can read the text and follow along...
Thursday, April 22, 2010
Core Manufacturing Simulation Data
I'm not sure if I've mentioned this yet, but I'm pretty sure I haven't. A little over 3 years ago I attended Winter Sim 2006 in Monterey, CA (nice place, by the way) and got introduced to the Core Manufacturing Simulation Data (CMSD) specification from NIST. This spec was meant to act as a neutral data format for holding manufacturing simulation data.
I watched its progress over the years, hoping at some point to make my past work translating Value Stream Maps into QUEST models generically available to any simulation package.
Cut to March of 2009, and the spec seemed to be getting pretty close to becoming a standard. NIST had submitted the spec to SISO for review and an eventual vote, and my boss saw a presentation by Swee Leong from NIST and was impressed enough to let me start working on a CMSD translator for Value Stream Mapping.
So I got started right away, first stepping back and looking at what this translator should be. I should mention that the CMSD format is technically set to be a UML standard, but along with that will come an XML schema. So anyways, long story short, I hadn't done much work with parsing XML files, and had some experience kludging together solutions to build XML files, and didn't like the idea of implementing a parser/writer for each translator I might write.
So I decided to build a centralized CMSD XML parser/generator, using Microsoft's built-in DOM libraries to do the actual file I/O from/to XML files, while I wrote the code to translate XML elements into instances of CMSD classes. I built that central tool as a COM/ActiveX class in VB6 (I know, could I possibly use something older?). To read a CMSD file I just create a new CMSD object and tell it to read the file, and I get all the CMSD data in a nice, easy-to-program-against object model. And I only have to write and maintain one XML I/O code base, which makes me happy.
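To give a rough idea of what that central parser does (the real thing is a VB6/COM class using Microsoft's DOM libraries, so this is just a Python sketch, and the element names `Resource`, `Identifier`, and `Name` are assumptions, not the actual CMSD schema): walk the XML document and hydrate plain objects you can program against.

```python
import xml.etree.ElementTree as ET

class Resource:
    """Plain object holding one CMSD-style resource record."""
    def __init__(self, identifier, name):
        self.identifier = identifier
        self.name = name

def load_resources(xml_text):
    """Parse <Resource> elements out of a CMSD-like document into objects."""
    root = ET.fromstring(xml_text)
    resources = []
    for elem in root.iter("Resource"):
        resources.append(Resource(elem.findtext("Identifier"),
                                  elem.findtext("Name")))
    return resources

doc = """<CMSDDocument>
  <Resource><Identifier>R1</Identifier><Name>Lathe</Name></Resource>
  <Resource><Identifier>R2</Identifier><Name>Mill</Name></Resource>
</CMSDDocument>"""

loaded = load_resources(doc)
```

The payoff is the same as described above: every translator consumes objects like `loaded`, and only this one module ever touches the XML.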
So I called this class the CMSD API (very original, I know) and started writing translators. To cut my teeth on the standard, I started off just building something to build a QUEST model from a CMSD API object, just using a simple XML file that NIST had provided to me to demonstrate the XML format. That one translator has turned into about a dozen different translators, some proofs of concept, and some potentially useful tools.
So here, after six paragraphs, is where I get to my point. We're reaching the stage where these translators are ready for some public consumption. They're hopefully useful enough that people can use them, though because the CMSD standard isn't really set in stone yet, I'll refrain from calling them CMSD-based tools; rather they're just tools that utilize a custom XML format.
Okay, so seven paragraphs in, and my point is that I'm looking for simulation users interested in trying this software out. I can't guarantee I can take everyone, and ideally you should be a user of more than one of the following tools:
1. DELMIA QUEST
2. eVSM
3. ProModel
4. Arena
5. Excel
6. Simio
7. FlexSim
Now, if you only use one of the above-listed simulation packages, that's okay. I've also been working on a standalone CMSD editor that hopefully, in the future, can take the place of a simulation package's model-building interface (using the simulation package itself just as a solver). Also, if you're good at programming in VBA and would like to build your own translators with my CMSD API, let me know, as I'd like feedback on the quality of the API's methods.
So if you're interested in joining a beta test, contact me. In the meantime you can see the documentation for these tools grow as I write it (thank you, Google Docs):
Documentation
Thursday, April 15, 2010
OR Exchange
I've been a member at StackOverflow since it launched to the public. For those who don't know it, it's basically a question-and-answer site built the way a question-and-answer site should be built: every page deals with questions and answers (rather than a traditional forum site, which is made for threaded discussion and gets question-and-answer use shoehorned in). They've built a fantastic community, which means you can go on there, ask a programming question, and get an answer in a pretty respectable amount of time.
About 6 months ago they launched StackExchange, which was a service that let people set up their own question-answer sites, with the same look and feel as StackOverflow, but you were responsible for getting your community in order. Apparently a lot of the sites that were created never really got going very well, and that's led the StackExchange people to change the rules a bit. From now on, if you want to start a StackExchange site, you have to put together a proposal convincing them why they should build it, and then you have to rally enough users to initialize the site with questions and answers, and hopefully get enough momentum going that they can launch your site and you end up with a community where you can get answers to your niche questions.
The point of this post, then, is to talk about OR-Exchange. OR-Exchange is a StackExchange site built for answering Operations Research problems, which involves a lot of math programming, but simulation is definitely a part of that. Unfortunately, the StackExchange people don't seem to think there's enough participation in the site, so it's got 3 months left before it's shut down (call it an early-July termination date).
So I encourage you, if you're a simulation user (and why wouldn't you be if you're reading this, unless you're my mom (who doesn't read this, by the way)) to visit OR-Exchange.com and join the community, and maybe ask some general purpose simulation questions (if you know the answer, you're allowed to answer your own questions, too).
Wednesday, March 31, 2010
Interpreting BCL Error Codes
From time to time I'll have to write an SCL macro that executes some BCL statements. It's usually pretty straightforward, just issue the command using the BCL() function in an SCL macro, and it'll return a numeric error code, including 0 for a normal return.
Sometimes though, especially when working with user-provided data, these commands can fail (often due to invalid characters or some such thing). When a BCL command fails, the BCL() function returns a non-zero number and an error message gets printed to the screen for the user to see.
However, a lot of the SCL macros I write are run pretty much autonomously, so anything printed to the screen gets lost to me. That's why I usually have a function for logging messages to a text file, for analysis after the fact, to see why some macro failed.
So, getting to my point, finally, when I am using the BCL statement, I like to have my own routine (named exec_bcl_cmd or something) that wraps all this stuff together: executing the BCL statement, evaluating any error code return, and logging that to a text file. I've created an example SCL file that shows most of this, just a procedure named exec_bcl_cmd, that executes a bcl statement string, then converts the bcl return code integer into a more descriptive error message (pulled from the bclerr.inc file).
You can download it here.
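The real routine is written in SCL against QUEST's BCL() function, but the pattern itself is generic: execute the command, translate the numeric return code into a readable message, and append the result to a log file. Here's a minimal Python sketch of that pattern; the error-code table and the `runner` callback are stand-ins (the real codes come from QUEST's bclerr.inc include file).

```python
import datetime

# Hypothetical code-to-message table; the real mapping is pulled from
# QUEST's bclerr.inc file.
BCL_ERRORS = {
    0: "OK",
    1: "syntax error",
    2: "unknown element",
}

def run_bcl(command, log_path, runner):
    """Execute a BCL-style command via `runner`, translate its numeric
    return code into a message, and append the result to a log file."""
    code = runner(command)
    message = BCL_ERRORS.get(code, "unrecognized error code %d" % code)
    stamp = datetime.datetime.now().isoformat(timespec="seconds")
    with open(log_path, "a") as log:
        log.write("%s | %s | %s\n" % (stamp, command, message))
    return code

# Usage with a stand-in runner; in QUEST this would be the BCL() call.
code = run_bcl("CREATE MACHINE 'M1'", "bcl.log", runner=lambda cmd: 0)
```

Because the wrapper both checks the code and logs it, an autonomous macro run leaves a trail you can read after the fact, which is the whole point of the exercise.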
Wednesday, March 24, 2010
Name Change
If anyone even reads this to notice, I've changed the name of this blog from just DELMIA QUEST to DELMIA QUEST and Manufacturing Simulation. I did this because, for the last year or so, I've been dedicating a lot less time to QUEST than I previously did, and a lot more time on using Value Stream Mapping for building simulation models, as well as some other production design/analysis tools.
I'm going to continue focusing a lot of content ("a lot" is used loosely here meaning there's not much content to this blog anyway) on QUEST, but I also want to be able to talk a bit more about the other stuff I do, and a lot of that applies in some way to QUEST, ultimately.
How to Populate a Value Stream Map with Simulation Data
So if you look back in the archives of this blog, you might see I've done some work taking data from Value Stream Maps (VSMs) and using it to build QUEST models semi-automatically. With pretty much any simulation package, a user is presented with an interface that exposes every option imaginable, and some people get overwhelmed by that many dialog options at once. With VSM, a user only enters the data they have available. In essence, when you're building a VSM with the intent of using it for simulation, you're only putting in the data you have and ignoring all the other options you don't have information for, through a much simpler interface (the VSM software).
It's also important to note that there are two different world views for manufacturing simulation packages, really, that I've seen: resource-based, and process-based. In a resource-based world view, you see all your machines on a virtual floor with some labor objects, maybe. Parts or whatever you call them enter at a source and jump around machines and modeling elements corresponding to, for the most part, physical objects. A process-based view is different, in that you basically get a flow chart where each block usually represents a process or a decision or something. The part item arrives at a source, again, but now moves between processes in sequence through the flow chart. A process can require different resources, so that parts end up getting blocked like in real life. Essentially both world views contain the same information and provide the same outputs, but just go about getting the outputs in different ways.
I'll tell you right now that a VSM pretty much takes on a process-based world view, except that people usually name their process boxes after the resource where the process is done. So there's sort of an implicit definition there, saying that we're doing a process, and that this process requires a resource based on the name of the process.
You may or may not know about QUEST's world view, so I'll go over that quickly: it's a resource-based view, with a construct called a "process" that holds attributes on the process, like what labor resources it requires and what the cycle time is and all that. There is also a "part class" construct that holds some attributes, including the sequence of processes the part needs to be "completed". A machine in QUEST can be told what processes it's able to perform, so that when a part arrives at it, it decides what process to do on that part, and we can then route the part on to its next process (whatever machine that may be at).
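That QUEST data model, as just described, can be sketched in a few dictionaries (this is an illustrative Python sketch, not QUEST's actual API; all the names here are made up): processes carry attributes like cycle time, part classes carry a process sequence, and machines declare which processes they can perform, which is enough to route a part.

```python
# Hypothetical model data illustrating QUEST's resource-based view.
processes = {
    "weld":  {"cycle_time": 30.0},
    "paint": {"cycle_time": 45.0},
}
part_classes = {
    "frame": {"sequence": ["weld", "paint"]},
}
machines = {
    "weld_cell":   {"can_do": {"weld"}},
    "paint_booth": {"can_do": {"paint"}},
}

def next_machine(part_class, step):
    """Route a part to any machine capable of its next process."""
    process = part_classes[part_class]["sequence"][step]
    for name, machine in machines.items():
        if process in machine["can_do"]:
            return name
    return None
```

So `next_machine("frame", 0)` finds a machine that can weld, and step 1 finds one that can paint, which mirrors how a part moves through its process sequence.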
So a few years ago a client presented us with a VSM they had created that detailed the process flow for a line they wanted to simulate. It contained some cycle times, but it was, otherwise, pretty much just a process sequence.
So I had to figure out how to take this information in an eVSM file (which has an automatic export to Excel) and turn it into a QUEST model. It was pretty easy to do a one-to-one build of processes in QUEST: one process for each process block in the VSM. But the missing piece of data, then, was how to tie a process to one or more machines in a QUEST model.
To tie a VSM process to a QUEST machine resource, I simply had to add a tag onto a process with the ID of the machine to attach it to (a many processes to one machine relationship). So the actual SCL code to do this consisted of reading through the Workstation column in the Excel file, and for each unique value in that column, just build a machine with that name. Then read through every process (row) and create a process in QUEST with that name. Then, look at the Workstation column for that process row, and assign the process to that machine (the Workstation value can actually be a comma-delimited list to specify that multiple machines can handle the process).
The next challenge was process sequencing. I could have assumed that the Excel output from eVSM was in order of the process sequence, but that'd be pretty limiting to a simulation user. eVSM requires that you provide a tag shape to each process shape, and the tag value must be unique for each process. So, I required that the sequence be encoded through just specifying a process' next operation in the sequence (using that next operations' tag text) as a process attribute. Then, the SCL to build the QUEST model just has to set the process sequence for a part class to the sequence of operations in that Excel file. This way of creating the sequence also allowed me to look at each individual process, and find whatever processes were feeding into it. So if I found two or more processes that both output to a single process, that single process must be an assembly operation, and require a part from each of them before running the process.
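The sequencing and assembly-detection logic above can also be sketched briefly (again a Python illustration with made-up tag names, not the SCL): each process records the tag of its next operation, so walking those links recovers the routing, and any process fed by two or more predecessors is an assembly point.

```python
# Next-operation links keyed by process tag; None marks the last step.
# P1 and P2 both feed P3, so P3 should be detected as an assembly.
next_op = {"P1": "P3", "P2": "P3", "P3": "P4", "P4": None}

def assembly_points(next_op):
    """Processes receiving parts from two or more upstream processes."""
    counts = {name: 0 for name in next_op}
    for successor in next_op.values():
        if successor is not None:
            counts[successor] += 1
    return {name for name, n in counts.items() if n >= 2}

def walk_sequence(start, next_op):
    """Follow next-operation links to recover a process sequence."""
    seq, current = [], start
    while current is not None:
        seq.append(current)
        current = next_op[current]
    return seq
```

Here `walk_sequence("P1", next_op)` yields the routing for a part starting at P1, and `assembly_points` flags P3 as needing a part from each upstream feeder before it can run.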
That brings me to my next, and final data requirement: the part type. Like I said, QUEST likes to have a part class for each unique part type in a system (there are exceptions of course, but I won't get into them). Each part type (in the type of model we're building here) should then have a sequence of operations for the purpose of routing and process execution. So to specify the part type for a process, I just required there be a Product attribute on each process, and the SCL chunks through the data and identifies the sequence of operations and assembly points and all that automatically.
But, to close out, we really need three pieces of information to successfully build a (however basic) QUEST model: what are our part types, what are their process sequences, and where do those processes get done. And a VSM is perfectly adequate at providing that information. I meant to be a lot more succinct in my explanation of how the conversion works, but I'm not sure it's possible. I'll give it a shot again some other time.
Thursday, February 18, 2010
Winter Sim Presentation
This past December, I was fortunate enough to present some of the work I've been doing extracting data from Value Stream Maps and turning it into simulation data. I started off using QUEST as the simulation tool for the data, and last March (2009) I got the green light to start implementing the Core Manufacturing Simulation Data (CMSD) format. CMSD is meant to be a generic simulation-data interchange format, sort of like STEP is for CAD.
So, here's the presentation I made at Winter Sim. If you want to talk about it, you can contact me here.