By John Evans, Jan 31, 2012
KeyCreator Compare software from Kubotek is a sub-set of KeyCreator, the company’s direct modeling platform. The software allows you to compare parts or assemblies to find differences that may have developed over time.
The company outlines the benefits as follows:
I arranged to get the software, and after using the tool for a few days I began to see its benefits. While the functions are there for anyone to use the tool in numerous ways, the biggest benefit I can see is determining specific versioning differences in libraries, such as Vault, and in designed components being sent to simulation and FEA (finite element analysis) programs. Clients often want to know exactly which features were modified or eliminated in the simplification process, and this tool clearly provides that capability.
KeyCreator Compare has a list price of $2,495, but it must operate inside KeyCreator, which brings the total cost to $6,195. Kubotek software carries an annual maintenance fee of 22% of the list price. Kubotek recently announced an introductory price of $1,495, available to early adopters for a limited time.
|Figure 1: Report generated by KeyCreator’s translator.|
The KeyCreator MCAD program offers an extensive library of file conversions, including CATIA, Inventor, and STEP formats, which makes the Compare utility useful to almost anyone. Every non-native model selected is translated in a similar manner and loaded into a separate window. The importation and conversion process is automated, and the translator generates a report after each conversion is complete. See Figure 1.

I receive many simulation projects in STEP format, and so this software makes it easy to analyze the differences between the components I simplified and those that were originally issued. In my review, I routinely compared STEP and Inventor formats. KeyCreator did not, however, care for Autodesk’s Fusion 2012 .dwg format, and would not open it.
|Figure 2: Entering file names of models to be compared.|
Compare evaluates the contents of two separate files after you fill in a simple dialog that contains separate fields for the IS and WAS models. A button near the top of the dialog allows you to map the file names to those of the models that are already loaded, if desired. See Figure 2.
|Figure 3: Toolbar controls the comparison process. The pull-down menu shows in gray the numerous options that are not included in this version of the software.|
Once the models are loaded, the validation is run. The toolbar offers simple but effective tools to set up the views and validation process. See Figure 3.
|Figure 4: The "is" model is shown on the left, the "was" model on the right.|
Once Validate Parts is picked, the software analyzes the differences, and loads the results in windows positioned to the side and bottom of the views. See Figure 4.
On the far right is the Difference Results window, which lists only the differences, if any, that are present in the assembly or part files. As each difference is picked, it is highlighted in the graphic views. Different pages are present in the window, depending on the settings and components being analyzed.
Along the bottom lies the Assembly Tree Compare, where all components are loaded with basic information and the results of the comparison between them. In Figure 4, the damper is present only in the assembly on the right, which is reinforced in both the Difference Results window and the Assembly Tree Compare.
You can apply tolerances to the analysis, specifying the allowances for model precision and variation. I used the default of 0.001 units.
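Kubotek doesn’t document the comparison algorithm, but the role of the tolerance is easy to illustrate. As a rough sketch only (the function names and the vertex-by-vertex approach are my own assumptions, not KeyCreator’s implementation), a tolerance-based geometry comparison might look like this:

```python
# Hypothetical sketch: flag model vertices with no counterpart within a
# tolerance. This is NOT KeyCreator's algorithm -- just an illustration of
# why the tolerance (default 0.001 units) matters for translated geometry.

def within_tolerance(p, q, tol=0.001):
    """True if two 3D points differ by no more than tol on every axis."""
    return all(abs(a - b) <= tol for a, b in zip(p, q))

def unmatched_vertices(is_model, was_model, tol=0.001):
    """Return vertices of is_model with no counterpart in was_model."""
    return [p for p in is_model
            if not any(within_tolerance(p, q, tol) for q in was_model)]

# Translation between CAD formats often perturbs coordinates slightly; a
# loose tolerance absorbs that noise, a tight one flags it as a difference.
was = [(0.0, 0.0, 0.0), (10.0, 0.0, 0.0)]
is_ = [(0.0005, 0.0, 0.0), (10.0, 0.0, 0.002)]
print(unmatched_vertices(is_, was))  # -> [(10.0, 0.0, 0.002)]
```

Here the first vertex drifts by 0.0005 units and passes, while the second drifts by 0.002 and is reported, which is why the tolerance setting directly trades sensitivity against false positives from format conversion.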
|Figure 5: Mousing over differences displays a brief report in a tooltip|
Mouse-over investigation is one of my favorite features. This permits you to move the mouse over parts or assemblies, visually inspecting the differences that exist in the compared model views. When the cursor passes over an identified difference, a tooltip appears with a summary of the part and its difference. See Figure 5. When the mouse passes over positively checked components or features, nothing happens, leaving your attention to the negative details.
The Examine View functionality seemed weak and troublesome at first, but I went back and gave it another look. The approach is to combine the two parts in the coordinate space they occupy as they are evaluated.
|Figure 6: Adjusting the transparency of overlaid models.|
What this view does for you is evaluate the differences together. It is often difficult to envision just how components differ when the differences are displayed in separate views. In this dialog they are overlaid and highlighted in opposing colors, making the differences appear as offsets. I find this much easier to visualize, as you can see in Figure 8, later in the “Compare in Action” section of this review. At the top of the view are two sliders that allow you to adjust the transparency of the objects, from shaded to wireframe. See Figure 6. With a little adjustment, it was easy for me to distinguish the surface mismatches without the error overlay.
The Orient Copy command can be picked from the Compare pull-down menu, which helps users orient components to a coincident spatial relationship. See Figure 3. This is quite handy when validating components that have been developed by two different companies, where the coordinate bases are likely to be different.
The tool automates the collection of a base point and two coordinate axes within each of the Is and Was graphics windows. While I initially found the selection process a bit clumsy, after getting used to it (which really only took 20 minutes of paying attention), the process went well, and the wizard-like step-by-step automation made it easy to get comfortable with.
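The geometry behind this kind of alignment is worth a quick sketch: a base point plus two axis directions define a coordinate frame, and mapping one frame onto the other yields the rigid transform that overlays the models. This is only an illustration of the general technique, not Kubotek’s implementation, and all names here are my own:

```python
# Hypothetical sketch of an "Orient Copy"-style alignment: build a frame
# from a base point and two picked axes, then compute the transform that
# carries one frame onto the other. (Illustration only, not Kubotek code.)
import numpy as np

def frame(origin, x_axis, y_axis):
    """Build a 4x4 frame matrix from a base point and two axis directions."""
    x = np.asarray(x_axis, float); x /= np.linalg.norm(x)
    y = np.asarray(y_axis, float)
    y = y - np.dot(y, x) * x          # make y exactly orthogonal to x
    y /= np.linalg.norm(y)
    z = np.cross(x, y)                # third axis completes the frame
    m = np.eye(4)
    m[:3, 0], m[:3, 1], m[:3, 2], m[:3, 3] = x, y, z, origin
    return m

def align(was_frame, is_frame):
    """Transform that carries points from 'was' space into 'is' space."""
    return is_frame @ np.linalg.inv(was_frame)

# A part modeled 5 units away along X, with the same orientation:
was_f = frame([5, 0, 0], [1, 0, 0], [0, 1, 0])
is_f  = frame([0, 0, 0], [1, 0, 0], [0, 1, 0])
T = align(was_f, is_f)
p = T @ np.array([6.0, 0.0, 0.0, 1.0])   # point at x=6 in 'was' space
print(p[:3])                              # lands at x=1 in 'is' space
```

Picking a base point and two axes is the minimum input that fully pins down position and orientation, which is presumably why the wizard asks for exactly those three selections in each window.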
|Figure 7: Dialog box for selecting configuration options.|
The analysis section is quite customizable. Picking Configuration Options from the toolbar opens the configuration dialog, which contains numerous options for the analysis. By default, the analysis looks for feature changes; however, turning on functions such as volume forces the application to populate the differences dialog with the additional factors. See Figure 7. These options were all off by default, which is probably a good thing, as it reduced the comparison time.
While I didn’t have time to check my theory, I wondered if careful selection might ease up one methodology and tighten up another, allowing substantial flexibility to this tool.
|Figure 8: Even the subtlest differences are highlighted.|
Compare is quite well suited for part changes, making short work of rooting out the slightest difference in a single part file. I was able to visualize minute differences between versions of the same designs in Autodesk Vault quite easily. This was actually kind of nice.
As an analyst, using the tool to delineate the steps associated with reducing the complexity of models being introduced into simulation environments was the best thing by far. It is often quite difficult to convey the differences in the analyzed components to engineers, and this tool provided a great way to do that.
By the nature of the reduced complexity, the validation process did get quite lengthy. This particular mechanical event simulation subject went through extensive simplification, resulting in 1,579 geometrical differences. See Figure 8. The downside is that I found this many differences difficult to visualize individually.
The Walk feature allows you to step through all the differences that are discovered, automatically zooming in to each item and orienting the view normal to the feature in question. I found Walk useful for viewing every detail; while lengthy, it made reviewing the smaller details easier.
I decided to go one step further, and analyze the differences between similar components that were not mine. On GrabCAD recently, I had been struggling to determine the validity of some components, for I was faced with numerous versions of what should be the same model. KeyCreator Compare came to the rescue.
|Figure 9: Once components are aligned correctly in the same coordinate system, then the differences can be correctly displayed.|
While the tool functioned as advertised, I did take exception to a few items that bear mentioning. There were some problems with the display, most likely related to my hardware. I have never had problems with OpenGL or DirectX on my computer’s NVIDIA Quadro graphics card. On various designs, however, Compare experienced instability, the worst of which occurred using Direct3D with hardware support. The views became unstable, and often could not display all the components in the Is or Was assemblies. (Kubotek does not recommend Direct3D for the current release. Disabling the ‘Quick Move’ options in the Graphics Card section of the configuration dialog helps the supported OpenGL performance.) Picking Examine View usually caused a crash to desktop with any video card setting. I had no idea why, but after many attempts I narrowed the problem down to not having a view window active when the toolbar is picked. I don’t recommend trying this, unless you want to quit the application very quickly.
I didn’t care for how the conversion process stopped after each file was converted, which left the remaining conversions waiting for me to return to the computer. If you are converting large assemblies in the background (which might take a couple of minutes each), you’d hope the process would complete batch-style by the time you return to the application. (I can’t stand waiting that long, so I find something else to do in the meantime.) This is a minor inconvenience, and perhaps there is a toggle in the settings to avoid it.
The validation of part files seems to be related to the spatial similarity between the two models. When I moved a part in space and then revalidated the parts, the entire component was flagged as an error. It seems the software is not able to reconcile the part features when they are not in precisely the same spatial orientation, something that does not sit well with me. While the Orient Copy tool does make orienting the Is and Was components easier, I think some research in this area would pay off, and might even sway a few prospective clients off the fence.
Reporting features in the Compare menu are all grayed-out. Kubotek explained that these features do not come with Compare, because they are reserved for the higher end analysis packages. That’s a shame, because a reporting function would seem to be quite useful. (The workaround suggested by Kubotek is to take screen grabs, and then print them out.)
One other thing that irritated me was that I had a hard time finding help. A search of the KeyCreator help system for the keyword ‘compare’ got me nowhere. Kubotek later explained that there are two ways to get help. One is to select Compare Parts from the Compare menu, and then click the Help button in the dialog box; the other is to review the KeyCreator Test Drive Guide at KeyCreator University: go to support.kubotekusa.com/support/kcu/keytutorials.asp and click on Comparison Suite.
Installation was simple, with nothing outstanding. I also had to install KeyCreator, because Compare runs inside the MCAD program.
Licensing comes in four options, one of which is a license file that is generated by Kubotek using the MAC (media access control) address of your computer’s network card. I hated this option, but understood the need.
The license number I was issued initially did not work. I emailed the support team, and after testing it themselves, they sent a new license file to me very quickly. Sure enough, it worked fine the second time around.
One thing that I think we will see more of in the future is the ability to capture existing physical features and compare them to the originally designed models. The capability is there, and has been for years; the problem is that it has been extremely expensive and quite difficult to achieve good results. Various companies are beginning to develop this functionality. KeyCreator Compare is well positioned to do just that, working with any model presented to it, freeing companies from being tied to expensive software that is specific to the hardware being utilized, or that performs only the comparison.
With the tolerancing and feature toggles in the configuration, this application could possibly be tuned to sniff out differences between surfaces generated from point clouds and original solid models. The main issue is the need to have the existing model oriented exactly to the designed model.
KeyCreator Compare works quite simply, and the interface is straightforward. You compare two files, and the differences are identified. I think there is significant merit to the tool, and its usefulness is easily understood.
|John Evans has 30 years of experience in the aerospace industry, including mechanical engineering, design, fabrication, and CNC manufacturing processes. He expanded into MEP and civil engineering 18 years ago. He is certified in AutoCAD Civil 3D and Autodesk Inventor. Along with providing data management for a civil engineering firm in northwest Florida, John works as a design consultant for Autodesk digital prototyping, and has joined forces with an emerging clean tech developer. John continues to explore the Autodesk design industry on the Design & Motion blog. John has been a regular contributor of Civil 3D and Inventor articles to AUGIWorld Magazine, and now serves as its manufacturing content editor. He has presented at Autodesk University. He speaks English and Japanese. You can contact John at firstname.lastname@example.org .|