how randstad digital helped.
The Randstad Digital team was called upon to resolve this data connection and set the system up for continued success: a short project with long-lasting positive effects on the organization.
The initial focus was on ensuring that any new data generated would be compatible with the existing software for overlay, and that it would be easy to change, trim, and analyze before being exported back into the legacy format.
The team then determined that two separate tools were needed to solve the two overarching problems: first, a tool for converting and manipulating data, and second, a tool to improve the existing scripts used to generate the plot points for old sensor data.
a three-pronged plan of action:
interface
Identify the best solution for the GUI to be used across both tools.
converter
Create the solution to manage the differences between the new files and the old ones.
manipulation
Find a way to manipulate and visualize the data for optimal analysis and insight.
the right GUI
Using MATLAB’s intuitive features, the team was able to:
- rename sensor data to predefined names so that old and new data matched
- provide better data readability for the in-house team going forward
- trim files where needed
- calculate baseline values from a set of newly defined scoring
- display the data of multiple sensors concurrently in one dataset for cross-sensor analysis
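The actual tools were built in MATLAB; as an illustrative sketch only, the renaming and cross-sensor consolidation steps might look like the following in Python with pandas. The channel names, column names, and function names here are hypothetical, not taken from the project.

```python
import pandas as pd

# Hypothetical mapping from raw channel names to the predefined
# names agreed with the in-house team (illustrative only).
CHANNEL_NAMES = {
    "ch_001": "wheel_speed",
    "ch_002": "brake_pressure",
}

def standardize(df: pd.DataFrame) -> pd.DataFrame:
    """Rename raw channel columns to the predefined names so that
    old and new data use matching labels."""
    return df.rename(columns=CHANNEL_NAMES)

def combine_sensors(frames: list) -> pd.DataFrame:
    """Join several sensor tables on their timestamp column so that
    multiple sensors can be displayed in one dataset for
    cross-sensor analysis."""
    out = frames[0]
    for df in frames[1:]:
        out = out.merge(df, on="timestamp", how="outer")
    return out.sort_values("timestamp").reset_index(drop=True)
```

An outer join is used so that a sensor sampled at a different rate still contributes all of its samples, with gaps left where another sensor has no reading at that timestamp.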
translation and consolidation
The old data files were structured as a two-column table with timestamps in one and the sensor values in the other. The new files had only one column with the data values, each connected to a description of measurement frequency, start time, and end time.
Based on the frequency and start time, the Randstad Digital data scientists were able to generate an extra first column of timestamps for the new data, meaning the old table could be overlaid on the new table without any loss of context or insight.
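The idea can be sketched briefly. Given a start time and a measurement frequency, each sample sits 1/frequency after the previous one, which is enough to rebuild the legacy two-column layout. This is an illustrative Python (pandas) sketch, not the team's MATLAB implementation, and the column names are assumptions.

```python
import pandas as pd

def add_timestamps(values, start_time, frequency_hz):
    """Reconstruct the timestamp column the legacy format expects.

    The new files store only the sensor values plus a measurement
    frequency and a start time; spacing samples 1/frequency apart
    from the start time recreates the old two-column table.
    """
    timestamps = pd.date_range(
        start=start_time,
        periods=len(values),
        freq=pd.Timedelta(seconds=1 / frequency_hz),
    )
    return pd.DataFrame({"timestamp": timestamps, "value": values})
```

For example, three values recorded at 10 Hz from a given start time produce timestamps spaced 100 ms apart, matching the structure of the old files.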
primed for continued success
The most important part of the tool was the manipulation capability. A new script needed to be defined that achieved the following:
- Rename the columns generated by the conversion to predefined names, making the data more intuitive for users and easier to export to a new file or location.
- Trim and visualize the data. In the corresponding tab, the team could add to, trim, mark, and correlate data points in time with locations on the map.
There was still a requirement to make the data easier for the team to analyze. Fortunately, the new data format included GPS data points, so the team created a map view for more precise and accurate analysis. Trimming features were also added: for example, data can now be trimmed to a selected part of the track, chosen directly from the map.
The map itself was implemented in JavaScript, which offered the best compatibility with the Google Maps APIs. This integration brought further UX benefits, such as automatic zoom and restoring the initial view after point selection.
who supported on this project?
Our team of data engineering experts can design, implement, or continuously manage your technical data infrastructure and pipelines so that you can better ingest, store, process, and transform your data. Improve your handling of data, unlock better analytical decision-making, and drive your business forward.
set up your team for data-driven success with the right processes, tools and infrastructure, today.