Source: Plm Mechanic Blog

Discovery: Agile PLM 9.3.2 and EDQP Integration (Part 4 - Conclusion)

This is the fourth and final blog in a series about Agile 9.3.2's integration with Oracle's Enterprise Data Quality for Product Data (EDQP). The series walks through the Agile PLM 9.3.2 and EDQP Integration White Paper to explore the integration's setup and capabilities. This edition focuses on section 4 of the whitepaper.

Introduction: Discovery: Agile PLM 9.3.2 and EDQP Integration (Part 1)
Section 2: Discovery: Agile PLM 9.3.2 and EDQP Integration (Part 2)
Section 3: Discovery: Agile PLM 9.3.2 and EDQP Integration (Part 3)

Section 4: Enriching the Agile PLM Dataset with EDQP

The main objectives of section 4 are to:

- Validate the necessary attributes exist in the database
- Set up the semantic model
- Use AutoBuild to create a data lens and evaluate it in Knowledge Studio
- Enrich the Agile data
- Import the data back into Agile

Prior to starting the series, Agile PLM 9.3.2 and EDQP's Admin and Transform servers were installed, so their installation will not be covered here. An entire blog series could be written on each product, and that is beyond the intended scope of this series.

Validate the necessary attributes exist in the database

This blog, and the whitepaper, are based on the demo database provided to Oracle business partners. Prior to writing this blog I downloaded the demo database, upgraded it to 9.3.2 and reran the SQL scripts as outlined in the prior blog. This step verifies that the content in the database is as expected:

1. Log into the Agile JavaClient
2. Navigate to Admin tab | Settings | Data Settings | Classes and open the Capacitors subclass
3. Click the User Interface tab
4. Double-click the Capacitor Attributes (page 3) row
5. Click the Attributes: Page Three tab
6. Verify the following attributes are defined: Material, Package, Temperature Characteristics, Tolerance, Capacitance and Voltage

To eliminate possible issues, I renamed a couple of my attributes to ensure they aligned with the documentation, as they contained additional text. The picture below shows what I started with; unlike the whitepaper, I only had 52 capacitors - but that should be enough to test with. If you do not have the demo database, the whitepaper has a screenshot that shows the APIName and type, so you should be able to reproduce the subclass structure without any issues.

Set up the semantic model

Earlier in the series, we installed the Excel add-in for EDQP; now it's time to use it.

1. Start Excel. By default a new workbook should open, but if for some reason yours does not, create a new workbook
2. Navigate to the Add-Ins tab | DataLens Tools dropdown | Set Transform Server
3. Log into EDQP and accept the server group
4. Select Job Options from the DataLens Tools dropdown
5. Select APLM_CREATE_SEMANTIC_MODEL from the DSA list and click OK
6. Run the job and enter Capacitor in the popup dialog to select all the objects of type Capacitor

At this point I received errors stating that a database connection could not be established. The documentation was actually incorrect: the database connection name set up earlier should be APLM_CONNECTOR, not PLM_CONNECTOR!

With that fixed, the Agile data will be populated in Excel and is ready for AutoBuild (a quick sanity check on the extracted columns is sketched below).
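As an aside that is not part of the whitepaper, a quick way to confirm that the extracted rows actually carry the attributes we validated in the JavaClient is to compare the worksheet's header row against the expected names. The minimal sketch below assumes the populated sheet has been saved as a CSV file; the filename and the exact header spellings are my assumptions, so adjust them to match your workbook.

```python
import csv

# Page Three attributes the whitepaper expects on the Capacitors subclass.
EXPECTED_ATTRIBUTES = {
    "Material",
    "Package",
    "Temperature Characteristics",
    "Tolerance",
    "Capacitance",
    "Voltage",
}

def check_capacitor_columns(csv_path):
    """Report which expected capacitor attributes are missing from the
    header row of a CSV export of the populated worksheet."""
    with open(csv_path, newline="") as handle:
        header = next(csv.reader(handle))
    missing = EXPECTED_ATTRIBUTES - {cell.strip() for cell in header}
    if missing:
        print("Missing attribute columns:", ", ".join(sorted(missing)))
    else:
        print("All expected capacitor attributes are present.")

# Hypothetical filename for a CSV export of the semantic-model worksheet.
check_capacitor_columns("capacitor_semantic_model.csv")
```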
Create a data lens using AutoBuild and evaluate it in Knowledge Studio

AutoBuild is a great feature for speeding up the creation of a data lens. It can build out terms and phrases and put together some classification rules. The usage and capability of Knowledge Studio is outside the scope of this blog, but it is a very interesting tool and AutoBuild helps one get started.

Next to the DataLens Tools dropdown in Excel is a button for AutoBuild. With the results above loaded in the workbook, click the button. A series of dialog boxes will appear; in this exercise, we only care about the first and the last. On the first dialog, select the "Generate a new DataLens" radio button, then click Next to transition through the remaining dialogs until a name can be entered for the data lens. Give the lens a name - it looks like you must select a name from the dropdown, but it is actually a combo box that will accept user input. Click Finish.

In Knowledge Studio, you will notice some terms in orange and some in white. The terms in white are recognized and the ones in orange are not. Watch the video to see how to add an alias to CER so that ceramic becomes a recognized term. There is a lot of capability built into Knowledge Studio, and recognizing terms and phrases is just the beginning; after that comes standardization - it is a very powerful tool!

The exercise is meant to help you understand how one could start manipulating Agile data. The example data lens APLM_Capacitors was deployed in the earlier blogs, so this particular lens will not serve a purpose here.

Enrich the Agile data

There are several steps to enriching the Agile data: extract the data, update it, and prepare it for import. Here are the steps:

1. Extracting the data

1.1 Open Job Options
1.2 Select the APLM_CREATE_PRODUCTION_BATCH DSA and click OK
1.3 When prompted for the subclass(es) and the last update date, I entered "Capacitor|2000-JAN-01 00:00:00" and clicked OK
1.4 Run the DSA; Excel will extract the Agile data and create a tab called 20-Batch_Creation_Details
1.5 Make a note of the value in the Job Id column - you will need it for the next step

2. Updating the data

2.1 With the newly extracted data selected (tab 20-Batch_Creation_Details), open Job Options
2.2 Select the APLM_CLEANSE_PRODUCTION_BATCH DSA
2.3 Run the job and enter the Job Id from step 1.5 above. This is where the custom data lens gets called to transform the data. The gap in information in the whitepaper is that a mature version of the AutoBuild lens created above could eventually make its way into this step rather than the supplied lens (see picture below)

The result of the job is a set of new tabs in the worksheet. If you look at the new content, you will see that it is comprised of name-value pairs in adjacent columns rather than having an attribute name as the column heading and its value as the cell value. Luckily there is a utility to correct this (a rough sketch of the reshaping appears after the steps below).

3. Preparing for import

After transformation, the last step is to prepare the data to be reimported into Agile.

3.1 Click on the new worksheet created by the cleansing DSA
3.2 Notice that around column F is where attributes start being displayed as name-value pairs - this is where we will have the Add-In start its correction
3.3 Right-click on any cell and select DataLens Services | Group Records into Worksheets
3.4 Enter column F
3.5 Select a filename for the new content
3.6 Open the new spreadsheet and notice that the attribute names are now column headings. This data should be ready for import!
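To make the reshaping concrete, here is a minimal sketch of the kind of pivot that Group Records into Worksheets performs. It is my own illustration, not Oracle code: it assumes the first five columns are fixed fields and that from column F onward the cells alternate attribute name and attribute value, which matches what I saw in the cleansed worksheet; the sample rows are made up.

```python
def pivot_name_value_pairs(rows, first_pair_col=5):
    """Turn rows whose trailing columns alternate attribute-name /
    attribute-value into dicts keyed by attribute name.

    first_pair_col is the zero-based index where the pairs start
    (column F in the cleansed worksheet)."""
    pivoted = []
    for row in rows:
        record = {"_fixed": row[:first_pair_col]}
        pairs = row[first_pair_col:]
        # Walk the remaining cells two at a time: name, then value.
        for name, value in zip(pairs[0::2], pairs[1::2]):
            if name:
                record[name.strip()] = value
        pivoted.append(record)
    return pivoted

# Made-up example of two cleansed rows.
rows = [
    ["100-001", "Capacitor", "", "", "", "Capacitance", "10uF", "Voltage", "16V"],
    ["100-002", "Capacitor", "", "", "", "Capacitance", "22uF", "Tolerance", "10%"],
]
for record in pivot_name_value_pairs(rows):
    print(record)
```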
Import the data back into Agile

I decided not to go into the Agile import itself because it is such a common task, but at this stage the hard part is done. Refer to the Agile Import/Export guide if you have questions on this step.

Final Thoughts on the Integration

Overall, I really like what I have seen of EDQP. It is a niche in and of itself, and even though I have worked with it before, I am sure I have only scratched the surface of its capabilities. When I titled this blog "Discovery" I truly meant it: I did not know where the series would lead and did very little homework first. After going through the document I am left feeling satisfied but a little empty inside. I really had hopes for a tighter coupling with EDQP, such as leveraging the extension framework for real-time validation or updating of attributes, but that capability isn't there. Instead we are left to either write our own solution or use batch processing. In the meantime, I am thankful for the tools that Oracle did provide and am still hopeful that future releases will have a tighter integration.

What are your thoughts?

The Video

Here is the video for section 4 of the whitepaper:

References

Agile documentation: http://docs.oracle.com/cd/E28664_10/otn/docset.html
EDQP documentation: http://docs.oracle.com/cd/E35636_01/index.htm
Agile website: http://www.oracle.com/us/products/applications/agile/overview/index.html
EDQ website: http://www.oracle.com/us/products/middleware/data-integration/enterprise-data-quality/overview/index.html
