I admire PTC for bringing the ‘Wow!’ factor back into engineering software.
It has done this by investing, over a number of years, in technologies to support the new world of the ‘Internet of Things’ (IoT).
PTC has acquired companies to obtain technology, people, know-how, and customer bases in:
smart product and sensor connectivity
plant-floor machine connectivity
The result is the ability to offer and demonstrate tools to build new digital ways of interacting with real-world objects.
This allows PTC to present a vision, and support it with examples. The demonstrations show how engineering of smart connected products will evolve in the IoT era. And it helps that the demos are interactive, visual, and easy to understand.
At PTC’s ThingEvent in Boston in January 2016, PTC’s view of its role in the IoT was clear – provide the tools to enable the digital and physical worlds to work together.
The company used the event to show how recently acquired Vuforia would become part of this offer.
Smart Connected Products
PTC is not alone in seeing how improvements in the technology and economics of sensors, electronics, and communications are allowing products of all descriptions to be re-invented as smart, connected products, part of the IoT.
But it is the first of the engineering software giants to invest, through a series of acquisitions, in growing its IoT business, and the first with product (not lab) demonstrations of new ways of interacting with connected ‘Things’.
PTC can demonstrate the technologies and explain how they fit into its vision. Across the engineering software industry, the broad IoT concept is widely discussed and generally understood, but PTC is unique in demonstrating the vision live.
Do you want to see sensor readings from an instrumented bicycle?
PTC can demonstrate live readings not just on a dashboard, but in real time, superimposed on live video of the moving bike, or driving an animation of the bike’s CAD model.
At ThingEvent, the focus on Vuforia came through engineering use cases for augmented reality (AR).
Vuforia includes software that connects into a camera video input processing stack.
It converts the camera into a digital eye, scanning the field of view and recognizing the objects it has been taught about. This allows other software to use Vuforia to identify objects in the camera’s image stream, and then respond to the identity, position, orientation and movement of the object.
This is the first step to AR.
The next step is to overlay additional information onto the image, and Vuforia technology also supports alignment of the overlay with the recognized object, handling camera movement, object movement, and occlusion of the object by something else in the scene.
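The recognize-then-overlay loop described above can be sketched in a few lines. To be clear, this is not the Vuforia API; every name here (`Detection`, `place_overlay`, the occlusion flag) is an illustrative assumption about how such a pipeline behaves.

```python
# Hypothetical sketch of a recognize-then-overlay AR loop.
# None of these names come from the Vuforia SDK; they are illustrative stubs.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Detection:
    object_id: str   # which taught object was recognized
    x: float         # position of the object in the frame (pixels)
    y: float
    angle: float     # in-plane orientation (degrees)
    occluded: bool   # is something in front of the object?

def place_overlay(detection: Detection, label: str) -> Optional[dict]:
    """Align an information overlay with a recognized object.

    Returns the overlay's draw parameters, or None when the object is
    hidden and the overlay should be suppressed.
    """
    if detection.occluded:
        return None
    return {
        "label": label,
        "x": detection.x,          # overlay tracks the object's position...
        "y": detection.y - 20,     # ...offset slightly above it
        "angle": detection.angle,  # ...and rotates with it
    }

# Simulate two frames: the object moves, then something steps in front of it.
frames = [
    Detection("bike-001", x=320, y=240, angle=0.0, occluded=False),
    Detection("bike-001", x=340, y=238, angle=2.5, occluded=True),
]
overlays = [place_overlay(d, "Engine temp: 92 C") for d in frames]
```

The key behavior, and the source of the ‘wow’, is that the overlay is recomputed per frame from the detection, so it stays glued to the object as the camera and object move, and disappears when the object is occluded.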
Customers Can Demonstrate
At ThingEvent, PTC put customers on stage to showcase their use of PTC’s IoT and Vuforia-based augmented reality technologies.
Some of the demos depended on a new graphical emblem called a ‘Vu Mark’.
The Vu Mark is a bar code with style. Its boundary pattern can, like a bar code, contain product information such as model, serial number and so on. But, unlike a bar code, the design is quite flexible, allowing use of a brand image in the central part of the design.
In some environments, the Vu Mark would have been worthy of its own product launch, but at ThingEvent it was just part of the story.
Vu Marks were used to simplify object recognition, and also to let the camera determine object details that may not be apparent from the visible geometry.
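How a mark like this bridges the physical and digital can be sketched simply: decode the payload carried in the mark, then use it as a key into the connected datastore. The `model:serial` payload format and the lookup table below are my own assumptions for illustration, not PTC’s actual encoding.

```python
# Illustrative sketch: using a decoded mark payload to recover product
# details the camera cannot see. Payload format and data are invented.

def parse_mark(payload: str) -> dict:
    """Split a hypothetical 'model:serial' payload from the mark."""
    model, serial = payload.split(":")
    return {"model": model, "serial": serial}

# Details not visible from geometry alone, keyed by serial number.
PRODUCT_DB = {
    "SN-0042": {"firmware": "2.1.7", "last_service": "2015-11-03"},
}

def object_details(payload: str) -> dict:
    """Combine what the mark says with what the datastore knows."""
    mark = parse_mark(payload)
    return {**mark, **PRODUCT_DB.get(mark["serial"], {})}

info = object_details("M450:SN-0042")
```

Two physically identical units thus resolve to different service histories, which is exactly the detail ‘not apparent from the visible geometry’.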
The first customer example was maintenance for a smart connected motorbike.
PTC customer KTM supports its racing teams with advanced engineering tools, and demonstrated the use of a camera and screen of a tablet computer to:
View and identify the motorbike
Select, from the connected datastore, the right set of diagnostic sensor readings for this bike
Identify and report anything unexpected in the sensor readings
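The ‘report anything unexpected’ step above can be sketched as a simple range check of each diagnostic channel against the expected envelope for this bike. The channel names and ranges are invented for illustration; a real system would pull them from the connected datastore.

```python
# Minimal sketch of flagging unexpected diagnostic readings.
# Channel names and expected ranges are invented for illustration.

EXPECTED_RANGES = {
    "oil_temp_c":     (60, 120),
    "battery_v":      (11.5, 14.8),
    "exhaust_temp_c": (200, 750),
}

def flag_unexpected(readings: dict) -> list:
    """Return the channels whose latest reading falls outside its range."""
    flagged = []
    for channel, value in readings.items():
        lo, hi = EXPECTED_RANGES[channel]
        if not (lo <= value <= hi):
            flagged.append(channel)
    return flagged

latest = {"oil_temp_c": 131, "battery_v": 12.6, "exhaust_temp_c": 410}
problems = flag_unexpected(latest)
```

In the AR demo, the flagged channels are what get highlighted on the live image, turning a table of numbers into a pointer at the physical component.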
They also use augmented reality to help a technician find and fix a problem by:
Highlighting potential root-cause components. This is done by overlaying a colored component representation onto the video (yes, as the tablet moved and the view changed, the highlights held their positions exactly on the real components)
Allowing the technician to step through a troubleshooting guide one step at a time, using overlays of instructions, and highlighting relevant attachment points and layout of the 3D assembly in the engine area of the motorbike.
Here’s a PTC video showing where the ‘WOW’ factor comes from:
Sysmex then showed the technology applied to a machine from its blood-analysis product line, and Schneider Electric used a power-management unit from its Micro Data Center line.
Like KTM, both Sysmex and Schneider Electric used a hand-held tablet computer with graphics overlaid on the live image of the product.
The ‘demonstrator-technician’ was able to call up product information, and step-by-step procedures to find or solve a problem.
It worked. Wow again.
The Root of the Wow Factor
Of course, live graphics will always attract more attention than the razor-sharp administration of engineering data and process management in a PLM system.
Yes, managed access, version control, and the ability to make sure everyone is always looking at consistent information are truly vital for the efficient operation of every engineering team in every organization. But only quality engineers say ‘wow’, even to globally distributed data management.
Think about the CEO of an engineering team’s organization.
The CEO doesn’t pitch razor-sharp admin as a differentiator or source of value when talking to investors or prospects, or when trying to recruit the best and brightest engineers. The CEO wants to talk about new sources of value, and how this value is delivered to customers.
This is the root of the Wow factor that PTC is generating.
It’s also my defense against any accusation of naivety that my use of the word “Wow” will bring. It’s not just the new capabilities, but it’s their scope, and also the fact that these capabilities are or will be delivered by software products that can be deployed by most manufacturers.
These functions extend beyond the impact of better in-house engineering, and create opportunities for direct, new customer added-value – initially in service, and potentially leading to new ways for users to interact not only with products, but also with products-in-the-context-of-their-surroundings.
The silos are changing, opening up whole new application areas.
Service is Just the Beginning
The demonstrations are proof-of-concept that visual interaction with smart connected products can deliver a new channel of communication.
It offers engineers and others across development, production and in-service phases of a product’s lifecycle the opportunity to find new ways of doing things, and ways of doing new things.
The business case that is the ‘low-hanging fruit’ for IoT and AR is to support maintenance, repair, and service, so it’s no surprise that the examples developed by PTC’s customers have this as the central part of their scope. Sysmex, for example, pointed to the possibility that the system gave enough guidance to allow the user to perform some service procedures which currently require a technician visit.
This is something customers want, so it may be a chance to reduce costs and improve customer satisfaction.
But rethinking service is just the beginning.
For many years, engineers have relied on their own senses, plus perhaps some separate instrumentation readings and reports from users of their products. Yes, it’s early days for the AR view in design engineering outside the high-end systems used in specific sectors such as automotive, and in training for process-plant operators.
But it is reasonable to extrapolate from PTC’s customer demonstrations, and look forward to the chance to see sensor data, design and maintenance information, as well as system diagrams and simulation – all in the context of a live image of a product or prototype.
Design engineers who use this technology for product development will be the first to see how the buyers and users of their products might benefit from an AR view of product information.
Engineering managers trying to firefight a problem will definitely want the AR view of a product to be available for the technical meeting that will diagnose the problem – they’ll expect to be able to overlay CAD models, fault reports, sensor readings, and simulations.
Teams developing the new version of a product will want to integrate their drawings, sketches and models for the new version, comparing geometries, performance, stress and so on with the real data from the existing product.
And it’s not just the engineers. A sales rep will want to help a customer visualize how an optional add-on would fit into the proposed installation. Promotional marketing will welcome these new views of products and product plans. The CEO will want the images added into the company presentation.
Of course, it’s not a one-way street; there are plenty of questions left to answer:
Will glasses and goggles ever catch on for hands-free working?
Is the development of AR overlays a new and extra cost, or will tech pubs handle it within existing budgets?
Will anyone solve the problem of making documents (and AR overlays) that can handle variable experience levels of technicians?
Will service engineers have access to enough bandwidth at customer sites?
If you are an innovator, you’ve already thought about this, and you probably have reports and recommendations from your own research group.
If you are an early adopter, you need to arrange a demo now; you need to show this capability to retain your reputation for being at the leading edge.
If you prefer to be in the early majority, make sure your data gathering is sensitive to signs that your customers have started thinking about products that work this way.
If your place on the technology adoption chart is in the late majority, then look again soon; this technology could trigger significant change in short timescales.