As the 3D product development technology industry is now around 30 years old, it’s a good time to take a look back at how far we’ve come. Reviewing developments is Gavin Quinlan, managing director of Honeycomb Solutions, which provides Irish companies with access to best practice product development technology and solutions.
Designers and engineers can now create intelligent three-dimensional models that define the form of a product, how it functions, how it interacts with other products and the user, and how it is manufactured. Alongside this evolution of the tools designers and engineers use, there are still challenges that need to be solved if we are to make the most of the technology.
Data re-use & workflow
The first relates to who uses 3D design and engineering tools and, to a lesser extent, when. Despite the advances, 3D design tools require a high level of proficiency to use them effectively. When design is at the formative stages, an engineer doesn't need the heavy overhead of parametrically driven modelling tools, preferring something more fluid and suited to capturing ideas and concepts quickly and efficiently, perhaps pulling in existing products or sub-systems where appropriate and adapting them to a new application. It's here that direct modelling, with its drag-and-drop approach to geometry manipulation and its lesser reliance on complex processes, is king.
Today, these models are then used as reference for the CAD-literate user to develop a more comprehensive product model that better represents the manufacturable form, often requiring creation from scratch. Much effort goes into adding intelligence and design intent that can be updated and maintained as the design progresses, and then re-used. However, consider what happens if the original engineer needs to make a change, or if the same, easier-to-use tools are used downstream in the process, in manufacturing for example. Using traditional tools, all of that intelligence and design intent would be lost, meaning more rework and greater potential for error. Different users require different tools, but there's a huge disconnect between the systems most appropriate for each stage and the flow of data between them.
Externally sourced data
Another issue in many organisations, given the prevalence of outsourcing and supply chain-based working practices, is that data is often non-native. It is common for a supplier to carry out not only production and assembly, but also development work. Because this data arrives in non-native formats, it can be difficult for those who must consolidate it into a final master model.
Alongside problems in pure geometry creation and editing, another potential bottleneck is managing the flow of data. While the use of data management, workflow and engineering change order (ECO) techniques is growing in the design world, their extension into the wider enterprise and beyond (often referred to as Product Lifecycle Management) is reserved for either large multi-national organisations or those with highly complex products or traceability requirements. But this isn't to say the benefits of central data storage and access, automated routing of data through engineering change orders, or simply ensuring anyone involved in a product's development has access to the latest data, aren't applicable to all. They are, but are perceived as too costly for small and medium-sized companies.
Structure driven design
Products in many industries are now highly modular, either holistically or at the sub-system level. Common sub-assemblies are regularly pulled together and combined with custom components to meet a customer's needs or to achieve specific functional and performance targets.
This gives those tasked with managing the process headaches when trying to rationalise BOM data, whether as designed, as manufactured or as found in service. Surely a better way would be an environment in which the customer's requirements are entered as needed, then the system pulls together the required data, assembles it and gives the designer and engineer the tools to quickly create the custom components or adapt existing parts?
A storm approaches
PTC is currently working on the next generation of its products. Pro/ENGINEER has become an industry standard and the benchmark against which many others are judged, even though it is two decades since its ground-breaking initial release. Alongside Pro/ENGINEER, PTC recently acquired CoCreate, developer of the CoCreate modelling system, which itself spun out of its early days as the in-house design tool at Hewlett-Packard. Even with this wealth of tools, it is clear that disconnected, disparate point solutions can't continue as users look to increase their efficiency across all areas. Enter Project Lightning, the code name for PTC's next-generation platform. This will see the benefits of direct, freeform modelling combined with the power of the parametric modelling tools found in Pro/ENGINEER, built onto a common data model. It will allow an organisation to deploy the appropriate solutions required by the various members of a team and ensure that data integrity is maintained. The conceptual engineer will be able to define a concept with easy-to-use tools, then pass it on to the team responsible for more structured CAD work. The model will retain all of that formative work, enhanced with a more complete definition using parametric modelling tools.
Of course, product development is not a one-way process, so the common data model will allow users of direct modelling tools to make changes where needed while maintaining all the information and intellectual property already in the part. Add to this the ability to work with data from any source, and to ensure the work your organisation carries out is done in a controlled, managed, flexible environment using industry-leading technology, and you have something that looks set to revolutionise how products are developed once again.