Tuesday, June 12, 2012

Free June 13 Webinar on Protecting B2B Data in Automotive Environments


/PRNewswire/ -- The need to protect automotive business data by modernizing B2B infrastructures to meet today's complex global collaboration demands will be the topic of a free one-hour webinar at 1 pm EDT on Wednesday, June 13. The discussion – "Secure and Protect Your Business Assets" – will be led by representatives of the Automotive Industry Action Group (AIAG) and global business integration software provider SEEBURGER.
Presenters will discuss risks to data security associated with technology challenges in emerging markets such as Brazil, Russia, India and China (BRIC), plus infrastructure challenges including dated technologies like FTP, lack of standards-based security, and OEM-to-OEM product lifecycle management (PLM) data exchange.
Also covered will be strategies for mitigating those risks.
Webinar presenters will be Akram Yunas, Program Manager for AIAG and former executive director of the Automation Society, and Brian Jolley, Information Technology Specialist for SEEBURGER, who has specialized in automotive information technology business solutions for the past 14 years.
Registration is available online at http://www.aiag.org/ or by calling AIAG at 248-358-3003.
About SEEBURGER
SEEBURGER is a global provider of business integration and secure managed file transfer (MFT) solutions that streamline business processes, reduce operational costs, facilitate governance and compliance, and provide visibility to the farthest edges of the supply chain to maximize ERP effectiveness and drive new efficiencies. All solutions are delivered on a unified, 100% SEEBURGER-engineered platform that lowers the total cost of ownership and reduces implementation time. With more than 25 years in the industry, SEEBURGER today is ranked among the top business integration providers by industry analysts, serves thousands of customers in more than 50 countries and 15 industries, and has offices in Europe, Asia Pacific and North America. For more information, visit http://www.seeburger.com/ or blog.seeburger.com/

Read more here: http://www.sacbee.com/2012/06/11/4552941/free-june-13-webinar-on-protecting.html#storylink=cpy

Monday, June 11, 2012

PLM and Multi-Tier Strategies

PLM and Single Point of Truth. You have probably heard about that before. I tried to address this topic in the past. Navigate to a few of my old posts about it – PLM and Single Point of Truth and PLM and Single Point of Disagreement. Earlier this year, I came back to this topic in my write-up PLM and “The Whole Truth” Problem, which raised a few comments about the cost of integration and a “single PLM option” as a cost-effective alternative. Interestingly enough, the topic of using multiple systems isn’t unique to the PLM space. I’ve been reading the CloudAve blog post The Rise of Two Tier ERP and Larry Ellison’s NetSuite Intentions. Here is an interesting passage:
Essentially it’s a further nod by NetSuite to the notion of two-tier ERP, the idea that organizations can continue to use their existing ERP systems at a corporate level, but enable individual business units to innovate with secondary solutions. It’s a smart idea and one which is a natural fit for NetSuite that had traditionally had a hard time selling into the largest corporates who were generally seen as invested in one or other of the large ERP vendors. At the event NetSuite was keen to tell attendees about the case studies of large corporates who have moved to NetSuite for individual business units, all tied to traditional ERP solutions at the corporate level.
Even if ERP and local accounting are a somewhat different topic, I can see a clear trend toward “optimizing” the IT environment rather than unifying everything under a single umbrella.
PDM/PLM multi-tier optimization
The idea of a multi-tier strategy for PLM implementation sounds like something we might see more of in the near future. The main reason is the same – optimization. With a large number of IT systems already implemented, companies can think about combining systems to re-use existing assets and implementations and to optimize cost. The cloud can play an additional role in this optimization by providing companies an easy path to missing functionality or to coverage of remote and geographically separated divisions.
What is my conclusion? How to optimize IT assets? I believe we will be hearing about it more and more these days. With the cost of global deployment skyrocketing, companies will be looking at how to leverage multi-tier strategies to optimize future PLM deployments and implementations. Large PLM vendors should be aware, since this provides an opportunity for niche players and startups. Just my thoughts…

Source: http://beyondplm.com/2012/06/09/plm-and-multi-tier-strategies/

Kenesto vs PLM 360: Apples to Apples?

 

We have built a product which works the way people want it to without forcing them to change how they work and which delivers technology in precisely the way people want to access it. With Kenesto 2012 we have reached an objective no PLM system has achieved before: practically zero implementation costs for deployment or training.

Autodesk PLM360 vs. Kenesto. Data management is a difference.

Randal Newton of GraphicSpeak coined an interesting term – unPLM. Navigate to the article to read more. Randal draws some parallels between Autodesk PLM 360 and Kenesto. He speaks about the number of processes / solutions as one of the differences, but not the only one. Here is an interesting passage from the article:
The first product that comes to mind when looking at Kenesto is Autodesk 360 PLM, the cloud-based product lifecycle management system introduced earlier this year. Look past the cloud deployment, the openness about pricing, and the browser-based work environment and there are many differences. Autodesk 360 is aimed at the traditional PLM market, engineering departments; Kenesto is looking at wider deployment in the enterprise.

The comparison between “traditional PLM market, engineering department” and “wider enterprise deployment” is something that caught my attention. I navigated to the previous GraphicSpeak article – Autodesk launches cloud-based PLM. Here is a quote with examples of PLM 360 apps:
Autodesk says 360 is made up of more than 140 apps so far, which can either be used as-is or modified by users. The apps so far fall into ten categories: Quality, Supplier Management, Engineering, Program Management, Service and Support, Operations, Sales and Marketing, Manufacturing, Executives, IT professionals.

It looks like PLM 360 is clearly focused on covering all organizational activity, not stopping at the level of PLM for the engineering department. While Kenesto is focusing on business automation, PLM 360 provides much more by allowing users to manage data and processes beyond engineering. In my view, flexible and adaptable process management is probably the differentiation. Kenesto could provide it on top of Autodesk PLM 360 in order to serve a broader set of end users in the organization, focusing on processes rather than on data management. For the moment, PLM 360 cannot support such a level of flexibility, and that creates an opportunity Kenesto can use. The right question to ask is – for how long?
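Just to make the distinction a bit more tangible, here is a small conceptual sketch (in Python, with invented names and steps – it is not based on how Kenesto or Autodesk PLM 360 are actually implemented) of a process definition that routes an opaque payload without owning the data behind it:

```python
# Conceptual sketch only: a minimal, data-agnostic approval process. The names
# and steps are invented; this is not how Kenesto or Autodesk PLM 360 work.
from dataclasses import dataclass, field


@dataclass
class ProcessDefinition:
    """An ordered list of approval steps; nothing is assumed about the data being routed."""
    name: str
    steps: list


@dataclass
class ProcessInstance:
    """A running process that routes an opaque payload (a document link, an ERP record id, ...)."""
    definition: ProcessDefinition
    payload: dict
    current: int = 0
    history: list = field(default_factory=list)

    def approve(self, user: str) -> str:
        """Record approval of the current step and return the next step (or 'completed')."""
        step = self.definition.steps[self.current]
        self.history.append((step, user, "approved"))
        self.current += 1
        return self.definition.steps[self.current] if self.current < len(self.definition.steps) else "completed"


# Usage: the process only references data, it does not own or manage it.
ecr = ProcessDefinition("Engineering Change Request",
                        ["engineering review", "cost review", "final sign-off"])
run = ProcessInstance(ecr, payload={"item": "bracket rev B", "link": "http://example.com/doc/123"})
print(run.approve("alice"))   # -> cost review
print(run.approve("bob"))     # -> final sign-off
print(run.approve("carol"))   # -> completed
```

The point of the sketch is the decoupling: the process engine only holds references to data that lives somewhere else, which is exactly where a data-management backbone like PLM 360 would come in.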

What is my conclusion?  

Process automation is an important function of many systems and applications. It can be part of PLM, CRM, ERP and other solutions. From that standpoint, Kenesto can potentially serve a much broader audience. However, PLM 360 manages data. Kenesto does not (at least, for the moment). So, Kenesto can come into play when data-management problems are already solved (or are less important). Just my thoughts…

Source : http://beyondplm.com/2012/06/08/kenesto-vs-plm-360-apples-to-apples/

 

PLM 12: 9th International Conference on Product Lifecycle Management


The 9th International Conference on Product Lifecycle Management (PLM) will be held in Montreal from July 9 to 11, 2012 at École de technologie supérieure (ÉTS). For the first time in North America, researchers, developers, and users will gather to share and update their knowledge of PLM. Industry Day, held on Tuesday, July 10, will be of particular interest to current and potential users of PLM, who will have an opportunity to learn and benefit from the experience of major companies such as Pratt & Whitney, Bombardier Aerospace, Hydro-Québec, Johnson & Johnson, and others (see the agenda for Industry Day below).

http://www.plm-conference.org/images/uploaded/file/invitation%20PLM12%20eng.pdf


 Timing is Everything


I think a lot about time. It's incessant, unstoppable and irreversible. It's integral to the measurement of distance, whether it's the ping of sonar, the blip of radar, or the clocking of GPS data. And it's often the denominator in measures of performance, be it the 40-yard dash or the quarterly accounting period. I measure things with time every chance I get, because it is such a universal benchmark. There's no English time vs. Metric time, and no Euro time vs. Dollar time. And everybody knows the meaning of seconds, minutes, hours, days, weeks, etc.

 At the recent Indianapolis 500, I sat in Turn 1 with a stopwatch hung around my neck, and watched an interesting element of the greatest spectacle in motor racing. Coming at us was an anachronistic squadron of warplanes, 2 P-51s, an F-16 and a Warthog. I couldn't help but wonder how they timed the exact moment of the fly-over to the final stanza of the Star Spangled Banner.  As the planes roared (and rumbled) over the speedway, I noticed the sign on top of the jumbotron - "Official Timekeeper: TAG Heuer".  At that moment, I was reminded of the role Swiss timekeepers played in mass production and the imperative of product data management brought about by an industrial revolution that started over a century ago.
 Long before there was a Silicon Valley in California, there was Watch Valley in Switzerland.  Central Europe had advanced metallurgy, mechanical engineering and manufacturing technology to the point that the world's first pocket watch, the "Nuremberg Egg", had become a popular jewelry fashion statement as well as an instrument of timekeeping.  Swiss craftsmen flourished in the watchmaking trade, because it suited their need to be productive with indoor activities when they were snowed-in during the Alpine winters.  Demand for watches was increasing dramatically, but the craft-mode production techniques were not keeping up.
Around the period when chronograph watches were introduced by the likes of Heuer, a new approach emerged in the manufacturing industry in which "interchangeable parts" were adopted, especially in the making of firearms. The watchmakers adopted this principle for their products too, realizing that it was key to mass production. With interchangeable parts made to a precise specification, the watchmakers could maintain quality, engage sub-suppliers, allow for best-of-breed specialization, and ramp up production. Documentation became paramount, and tolerances really mattered. With detailed and unambiguous design specifications, the various pieces of a watch movement, from escapement wheel to winding wheel, from hairspring to mainspring, could be outsourced to a plethora of craftsmen scattered across the Swiss countryside. The entirety of Switzerland became noted for its mastery of precision, and as we all know, the root of precision is in the specifications.
 Nature drove the release control cycle. In the Autumn, the various watchmakers would come to a place like La Chaux-de-Fonds to get their finalized drawings and specifications, and then return to their various villages and towns for a long and hopefully productive winter, during which time the design was frozen.  After the thaw in the Spring, they would meet in Geneva and submit their components to the final assembly process and the now famous Geneva watch show.  The designer/brand-owner with the best specs and tolerances had the best opportunity to achieve the highly desirable "Chronometer" certification, indicating that they were the world's best devices for the measurement of time.
Which brings us back to timekeeping in general. Indy cars race against time. So do companies. Competing in the time domain is like playing on a level playing field. Everybody has the same amount of time. But with effective, comprehensive Product Data Management and related Product Lifecycle Management, some companies do better with their time allotment than others. They consume fewer development resources, have more predictability, and are able to take more market share. Be one of those companies - improve the collaboration process, promote design re-use, and shorten development cycles to accelerate product launch and gain market share. Eliminate the physical configuration control board (CCB) "signing party" and replace it with a virtual CCB. Prepare to design anywhere - build anywhere. Use time to your advantage. You can't buy more time - so use it effectively to optimize your development process, clarify your communications with production, put discipline into your product plans, and help your company run like a Swiss watch.

Source : http://www.zerowait-state.com/blog/491-zws-plm-blog-timing-is-everything

PLM, Semantics Technology and Data Federation

 
I’m in a deep technological mood these days. As you probably noticed, I’m attending the Semantic Technology & Business conference in beautiful, but cold, San Francisco. SemTech 2012 spans an interesting technological space covering a variety of topics related to data, data management, big data, semantics, linked data and the semantic web. The environment of the conference and some of the presentations made me think about modern trends in data management related to data federation. It probably goes a bit beyond the technological level of this blog, but I found it interesting and insightful.
Distributed Data Architecture
Our world is getting more and more distributed. The time when you were able to concentrate data in a single computer and/or database is almost history. We are moving towards something bigger that can scale to the level of the web. The following two examples show a potential role of semantic technologies in support of a federated data environment:
Andrew Sunderland of Spry Inc. presented enterprise data management options. Here is an interesting quote explaining his presentation:
Companies are looking for methods to quickly expose data sources for federated data access, while at the same time developing a robust, executable enterprise ontology. Data profiling tools can be leveraged to profile data sources and bootstrap ontologies and mappings. This talk will showcase how Spry is leveraging these tools to quickly expose data sources while developing an enterprise ontology.
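To give a rough flavor of what “exposing a data source” through an ontology mapping can look like – this is my own illustrative sketch, not Spry’s tooling, and the namespace, classes and sample record below are all invented – a few lines of Python with rdflib can lift a plain record into RDF triples:

```python
# Illustrative sketch only: lift a plain record from some source system into RDF
# using a tiny hand-made ontology. The namespace, classes and the sample record
# are hypothetical; real tools would generate such mappings from data profiling.
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.com/ontology/")   # invented enterprise ontology namespace

def expose_record(record: dict) -> Graph:
    """Map one row from a legacy source (e.g. a PDM table) into RDF triples."""
    g = Graph()
    g.bind("ex", EX)
    part = EX[f"part/{record['part_number']}"]
    g.add((part, RDF.type, EX.Part))
    g.add((part, EX.partNumber, Literal(record["part_number"])))
    g.add((part, EX.description, Literal(record["description"])))
    g.add((part, EX.suppliedBy, EX[f"supplier/{record['supplier_id']}"]))
    return g

# Example: one record pulled from a (hypothetical) legacy database
g = expose_record({"part_number": "P-1042", "description": "Bracket, left", "supplier_id": "S-7"})
print(g.serialize(format="turtle"))
```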

Another example is coming from FluidOps – Transformation of Enterprise Data Islands into Linked and Living Knowledge, a presentation about the Information Workbench environment from FluidOps. The discussion focused on the transformation of enterprise data islands into linked and living knowledge and elaborated on the costs and benefits of managing information in a unified semantic space.
The following picture shows the Information Workbench architecture and the role of semantic technologies in achieving data unification.

Data Federation and Asymmetric Computing
I had a chance to attend the presentation of Bryan Thompson of Systap discussing the bigdata® architecture. His presentation was focused on the computing side of distributed data environments and federation. The following slide presents the role of RDF and graph as a unified model for heterogeneous data sources.
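The appeal of the graph model is that triples coming from very different sources merge naturally. Here is a minimal sketch of that idea (the data and vocabulary are invented for illustration; this is not the bigdata® architecture itself): once two sources are expressed as RDF, a single SPARQL query can span both.

```python
# Sketch: merge triples originating from two heterogeneous sources (say, an
# engineering BOM and a quality system) into one graph and query across them.
# Data and vocabulary are invented for illustration.
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.com/ontology/")

bom = Graph()        # triples lifted from source 1 (e.g. a BOM export)
bom.add((EX["part/P-1042"], RDF.type, EX.Part))
bom.add((EX["part/P-1042"], EX.usedIn, EX["assembly/A-9"]))

quality = Graph()    # triples lifted from source 2 (e.g. a quality database)
quality.add((EX["part/P-1042"], EX.openIssues, Literal(2)))

unified = bom + quality   # graph union: one logical graph over both sources

results = unified.query("""
    PREFIX ex: <http://example.com/ontology/>
    SELECT ?part ?assembly ?issues WHERE {
        ?part a ex:Part ; ex:usedIn ?assembly ; ex:openIssues ?issues .
    }
""")
for part, assembly, issues in results:
    print(part, assembly, issues)
```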

How is that related to PLM?
Now, you can ask me – how is this related to the PDM, engineering and manufacturing world? Here is my take. The IT infrastructure of manufacturing companies is extremely complicated these days. It includes existing data management and enterprise systems, content and document management vaults, unmanaged files and other data sources. Nowadays, the cloud and the web are emerging as additional places companies target for data. The overall environment is global and distributed. Existing PLM systems are striving towards centralizing data into a single database. That single-database architecture might not be sufficient, the cost of data transition might be too high, and cloud and globalization add another dimension of complexity. Distributed and federated data management, capable of scaling to the level of the web – logically and physically – can be an interesting platform option to explore.
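As a purely hypothetical illustration of the direction, SPARQL 1.1 federation lets a single query span several endpoints – say, a PDM vault and an ERP system, each exposing a SPARQL endpoint. The endpoint URLs and the vocabulary below are invented, and running this requires endpoints that actually support the SPARQL 1.1 SERVICE keyword:

```python
# Hypothetical illustration of SPARQL 1.1 federation across two endpoints.
# The endpoint URLs and the ex: vocabulary are invented; this is not a
# specific vendor's architecture.
from SPARQLWrapper import SPARQLWrapper, JSON

pdm = SPARQLWrapper("http://pdm.example.com/sparql")  # hypothetical PDM endpoint
pdm.setReturnFormat(JSON)
pdm.setQuery("""
    PREFIX ex: <http://example.com/ontology/>
    SELECT ?part ?cost WHERE {
        ?part a ex:Part ; ex:partNumber ?pn .
        # federated call-out to a second (hypothetical) endpoint, e.g. ERP
        SERVICE <http://erp.example.com/sparql> {
            ?order ex:refersToPart ?pn ; ex:cost ?cost .
        }
    }
""")

for binding in pdm.query().convert()["results"]["bindings"]:
    print(binding["part"]["value"], binding["cost"]["value"])
```

Whether a manufacturing company would actually expose its systems this way is an open question; the sketch only shows that the building blocks for web-scale federation already exist.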
What is my conclusion? History has given us many examples of large companies that missed new technological trends and lost their leadership positions as a result. At the same time, we can see how web companies built their infrastructure and disrupted many existing domains. What will be the technological foundation that can support the challenges manufacturing and engineering companies are facing today? What the role of semantic data technologies will be in the future of these systems is the right question to ask these days.

Source : http://beyondplm.com/2012/06/07/plm-semantics-technology-and-data-federation/

Important PLM Blogs:

TeamCenter Blogs:

http://teamcenter.blog.com/2012/06/05/open-question-and-answers-in-teamcenter/