Theorem Solutions Unveils Extended Reality Offering
In this release, augmented reality users can now benefit from new embedded QR Code features, which allow for model loading.
Theorem Solutions
August 4, 2023
Theorem Solutions has released TheoremXR 2023.2, the latest version of its suite of extended reality (XR) products. The suite targets engineering and manufacturing users working with spatial computing.
This release introduces enhancements such as Time and Motion capabilities, specifically designed for mixed and virtual reality users. Additionally, augmented reality users can now benefit from new embedded QR Code features, which allow for model loading.
The integration of Time and Motion capabilities into TheoremXR for virtual and mixed reality devices enables users to optimize existing processes, helping businesses improve efficiency, productivity, and safety in their operations.
“Users can now leverage the power of VR and MR to conduct accurate time and motion studies, enabling them to identify bottlenecks, streamline workflows, and improve process efficiency through simulation,” says Ryan Dugmore, consultancy director at Theorem Solutions.
Users can create virtual environments that closely resemble real-world settings, allowing for realistic simulations and accurate measurements of tasks and movements. By tracking and analyzing the time taken for different motions and paths taken, users can identify areas for improvement and implement changes to optimize processes and reduce inefficiencies.
These features apply in a range of circumstances. Path analysis lets users compare multiple ways of moving around a space to determine the most efficient route. It can also reveal whether an environment needs to be redesigned: if operators have to move around an object, or shuttle back and forth between two workstations, Time and Motion users can analyze whether design changes would boost efficiency. The same principle applies when planning a new layout for a factory or other workspace, where potential layouts can be compared to determine which would be most efficient to work in.
Proximity Grab, Path Tracking, and Ghost Path Review further enhance the user experience and deliver valuable insights as part of the Time and Motion capabilities.
Proximity Grab mimics real-world interaction by having users pick up nearby objects rather than manipulating them from a distance, improving object manipulation, realism, collaborative design, and accessibility in VR or MR environments. Its intuitive, direct manipulation is essential for capturing accurate Time and Motion timings, and it also provides a more immersive and engaging user experience, opening new possibilities for spatial computing applications.
The Path Tracking feature enables businesses to optimize movement along a specific path, improving safety planning, coordination, and operator training. This offers valuable insights, reduces risks, and increases overall efficiency. Users can compare paths using path analysis (e.g., how many objects were moved, and how much time it took) to streamline their processes.
The Ghost Path Review feature provides users with visual references of their previous paths supporting improved navigation, training, process optimization, collaboration, and spatial awareness within VR or MR environments. Ghost Path Review adds an extra layer of analysis.
Comparing paths from different angles using the bird's-eye view capabilities provides context, resulting in more intelligent decision making, the company notes.
More product details are available on the company's website.
Sources: Press materials received from the company and additional information gleaned from the company’s website.
About the Author
DE Editors
DE’s editors contribute news and new product announcements to Digital Engineering.
Press releases may be sent to them via DE-Editors@digitaleng.news.