
Industry solutions


DigitalGate offers algorithm development solutions customized for embedded systems, so that the required performance can be met on resource-constrained platforms. By providing flexible solutions, we ensure that our customers can easily port the implemented algorithms to new hardware platforms, enabling them to adapt to ever-changing market needs and industry standards.
We have deep expertise in implementing algorithms along the entire processing pipeline, from low-level data filtering and enhancement up to high-level data fusion and dynamic modeling of systems.
With a deep understanding of the low-level internals of various embedded platforms, we implement low-level filtering algorithms that give the high-level algorithms the support they require, exposing fast and performant APIs for signal filtering and denoising as well as various feature detectors.
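As a minimal sketch of the kind of denoising building block such an API might expose, the example below implements a first-order IIR low-pass (exponential moving average) filter; the class name, the alpha value, and the sample data are illustrative assumptions, not an actual product interface.

```cpp
// Minimal sketch of a first-order IIR low-pass (exponential moving average) filter,
// one possible building block of a low-level denoising API. The class name and
// parameters are illustrative, not a real product API.
#include <cstdio>

class LowPassFilter {
public:
    explicit LowPassFilter(float alpha) : alpha_(alpha), state_(0.0f), initialized_(false) {}

    // Feed one raw sample, get one smoothed sample back.
    float update(float sample) {
        if (!initialized_) {          // seed the filter with the first sample
            state_ = sample;
            initialized_ = true;
        } else {
            state_ += alpha_ * (sample - state_);  // y[n] = y[n-1] + a*(x[n] - y[n-1])
        }
        return state_;
    }

private:
    float alpha_;        // smoothing factor in (0, 1]; smaller = stronger smoothing
    float state_;        // last filter output
    bool  initialized_;
};

int main() {
    LowPassFilter filter(0.2f);
    const float noisy[] = {1.0f, 1.3f, 0.8f, 1.1f, 5.0f, 1.0f, 0.9f};  // 5.0 is a spike
    for (float x : noisy) {
        std::printf("raw=%.2f  filtered=%.2f\n", x, filter.update(x));
    }
    return 0;
}
```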
We develop image clustering algorithms based on image similarities, using techniques such as K-Means, Mean Shift, DBSCAN, and hierarchical clustering. As a result, the higher-level algorithms in the pipeline can reliably process the data further.
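To illustrate the assignment/update loop at the heart of K-Means, the sketch below clusters scalar pixel intensities into K groups; the toy data, the crude initialization, and the choice of K are assumptions made purely for demonstration.

```cpp
// Illustrative K-Means on grayscale pixel intensities (1D samples).
#include <cstdio>
#include <vector>
#include <cmath>

// Cluster scalar samples into k groups; returns the final centroids.
std::vector<float> kmeans1d(const std::vector<float>& samples, int k, int iterations) {
    std::vector<float> centroids(k);
    for (int c = 0; c < k; ++c)                      // crude init: spread over the sample range
        centroids[c] = samples[c * samples.size() / k];

    std::vector<int> assignment(samples.size(), 0);
    for (int it = 0; it < iterations; ++it) {
        // Assignment step: attach each sample to the nearest centroid.
        for (size_t i = 0; i < samples.size(); ++i) {
            float best = std::fabs(samples[i] - centroids[0]);
            assignment[i] = 0;
            for (int c = 1; c < k; ++c) {
                float d = std::fabs(samples[i] - centroids[c]);
                if (d < best) { best = d; assignment[i] = c; }
            }
        }
        // Update step: move each centroid to the mean of its members.
        std::vector<float> sum(k, 0.0f);
        std::vector<int>   count(k, 0);
        for (size_t i = 0; i < samples.size(); ++i) {
            sum[assignment[i]] += samples[i];
            ++count[assignment[i]];
        }
        for (int c = 0; c < k; ++c)
            if (count[c] > 0) centroids[c] = sum[c] / count[c];
    }
    return centroids;
}

int main() {
    // Toy "image": dark, mid, and bright pixel intensities.
    std::vector<float> pixels = {12, 15, 10, 130, 128, 135, 250, 245, 240};
    std::vector<float> centroids = kmeans1d(pixels, /*k=*/3, /*iterations=*/10);
    for (float c : centroids) std::printf("centroid: %.1f\n", c);
    return 0;
}
```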
We implement various low-level feature detectors for a large array of signals, ranging from 1D signals up to 2D and 3D signals such as images and point clouds. Our team has successfully developed detectors for features as varied as HOG, SIFT, Viola-Jones, corners, and edges. Moreover, our algorithms are optimized for real-time embedded applications and run reliably on resource-constrained embedded platforms.
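As one elementary example of a low-level detector of the edge kind mentioned above, the sketch below computes a per-pixel Sobel gradient magnitude on a grayscale image; Sobel is chosen here only for brevity, and the image data is a made-up 5x5 step edge.

```cpp
// Minimal edge-detection sketch: 3x3 Sobel gradient magnitude on a row-major grayscale image.
#include <cstdio>
#include <cstdint>
#include <vector>
#include <cmath>

// Compute per-pixel gradient magnitude with 3x3 Sobel kernels (image border left at 0).
std::vector<float> sobelMagnitude(const std::vector<uint8_t>& img, int width, int height) {
    std::vector<float> mag(img.size(), 0.0f);
    for (int y = 1; y < height - 1; ++y) {
        for (int x = 1; x < width - 1; ++x) {
            auto p = [&](int dx, int dy) { return (float)img[(y + dy) * width + (x + dx)]; };
            float gx = -p(-1,-1) - 2*p(-1,0) - p(-1,1) + p(1,-1) + 2*p(1,0) + p(1,1);
            float gy = -p(-1,-1) - 2*p(0,-1) - p(1,-1) + p(-1,1) + 2*p(0,1) + p(1,1);
            mag[y * width + x] = std::sqrt(gx * gx + gy * gy);
        }
    }
    return mag;
}

int main() {
    // 5x5 toy image with a vertical step edge in the middle.
    const int W = 5, H = 5;
    std::vector<uint8_t> img = {
        0, 0, 255, 255, 255,
        0, 0, 255, 255, 255,
        0, 0, 255, 255, 255,
        0, 0, 255, 255, 255,
        0, 0, 255, 255, 255,
    };
    std::vector<float> mag = sobelMagnitude(img, W, H);
    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) std::printf("%6.0f ", mag[y * W + x]);
        std::printf("\n");
    }
    return 0;
}
```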
We implement all the low-level layers of computer vision systems so that frames and images are acquired and streamed through the entire image-processing pipeline. We do this in accordance with strict application requirements and embedded hardware constraints, thus overcoming the limitations imposed by low-power image-processing hardware platforms.
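One common pattern for decoupling acquisition from processing on such platforms is a fixed-capacity frame ring buffer, sketched below under stated assumptions: the Frame layout, capacity, and drop-on-full policy are illustrative choices, not a description of any specific product implementation.

```cpp
// Conceptual sketch: fixed-capacity frame ring buffer between acquisition and processing.
#include <array>
#include <cstdint>
#include <cstdio>
#include <vector>

struct Frame {
    uint32_t sequence;            // monotonically increasing frame counter
    std::vector<uint8_t> pixels;  // raw pixel payload (would map to a DMA buffer on target)
};

template <size_t Capacity>
class FrameRing {
public:
    bool push(const Frame& f) {   // called from the acquisition side
        if (count_ == Capacity) return false;     // ring full: drop or apply back-pressure
        slots_[(head_ + count_) % Capacity] = f;
        ++count_;
        return true;
    }
    bool pop(Frame& out) {        // called from the processing side
        if (count_ == 0) return false;            // nothing to process yet
        out = slots_[head_];
        head_ = (head_ + 1) % Capacity;
        --count_;
        return true;
    }
private:
    std::array<Frame, Capacity> slots_;
    size_t head_ = 0;
    size_t count_ = 0;
};

int main() {
    FrameRing<4> ring;
    for (uint32_t i = 0; i < 3; ++i)
        ring.push(Frame{i, std::vector<uint8_t>(16, 0)});
    Frame f;
    while (ring.pop(f))
        std::printf("processing frame %u (%zu bytes)\n", f.sequence, f.pixels.size());
    return 0;
}
```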
We develop object tracking algorithms that ensure a reliable object detection framework by compensating for irregularities in the detection algorithms, such as occlusion, missed detections, and false detections. Our solutions provide models for tracking objects in image coordinates as well as in 3D coordinates. We employ state-of-the-art approaches such as feature-based tracking, optical-flow-based tracking, 3D estimators, and dynamic models, built on the Bayesian framework, for tracking various types of dynamic and static objects.
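A standard instance of such a Bayesian estimator is the Kalman filter; the sketch below shows the predict/update cycle of a constant-velocity filter for one coordinate of a tracked object. The noise values and detection positions are made-up numbers used only to demonstrate the cycle.

```cpp
// Sketch: constant-velocity Kalman filter for one coordinate of a tracked object.
#include <cstdio>

struct Kalman1D {
    // State: position p and velocity v; 2x2 covariance P stored element-wise.
    double p = 0.0, v = 0.0;
    double P00 = 1.0, P01 = 0.0, P10 = 0.0, P11 = 1.0;
    double q = 0.01;   // process noise
    double r = 0.25;   // measurement noise (position only)

    void predict(double dt) {
        p += v * dt;                                        // x = F x
        double nP00 = P00 + dt * (P10 + P01) + dt * dt * P11 + q;
        double nP01 = P01 + dt * P11;
        double nP10 = P10 + dt * P11;
        double nP11 = P11 + q;
        P00 = nP00; P01 = nP01; P10 = nP10; P11 = nP11;     // P = F P F^T + Q
    }

    void update(double z) {                                 // position measurement
        double s = P00 + r;                                 // innovation covariance
        double k0 = P00 / s, k1 = P10 / s;                  // Kalman gain
        double y = z - p;                                   // innovation
        p += k0 * y;
        v += k1 * y;
        double nP00 = (1 - k0) * P00;
        double nP01 = (1 - k0) * P01;
        double nP10 = P10 - k1 * P00;
        double nP11 = P11 - k1 * P01;
        P00 = nP00; P01 = nP01; P10 = nP10; P11 = nP11;     // P = (I - K H) P
    }
};

int main() {
    Kalman1D kf;
    const double detections[] = {1.0, 2.1, 2.9, 4.2, 5.0};  // noisy positions, 1 s apart
    for (double z : detections) {
        kf.predict(1.0);
        kf.update(z);
        std::printf("estimate: pos=%.2f vel=%.2f\n", kf.p, kf.v);
    }
    return 0;
}
```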
We assist the development of object detection algorithms by building the required tools for simulation, data labeling, and training, so that optimal deep learning models can be implemented and trained on custom sensor data. This gives our customers the support they need to successfully develop and deploy their object detection algorithms on an embedded platform.
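As a small, hedged illustration of what one piece of such labeling tooling might look like, the sketch below writes axis-aligned bounding-box annotations to a CSV file that a training pipeline could consume; the file layout and field names are assumptions, not a specific tool's format.

```cpp
// Illustrative labeling-tool fragment: write bounding-box annotations to CSV.
#include <cstdio>
#include <string>
#include <vector>

struct BoxAnnotation {
    std::string image;   // image file name
    std::string label;   // class name, e.g. "pedestrian"
    int x, y, w, h;      // box position and size in pixels
};

bool writeAnnotations(const std::string& path, const std::vector<BoxAnnotation>& boxes) {
    std::FILE* f = std::fopen(path.c_str(), "w");
    if (!f) return false;
    std::fprintf(f, "image,label,x,y,w,h\n");
    for (const BoxAnnotation& b : boxes)
        std::fprintf(f, "%s,%s,%d,%d,%d,%d\n",
                     b.image.c_str(), b.label.c_str(), b.x, b.y, b.w, b.h);
    std::fclose(f);
    return true;
}

int main() {
    std::vector<BoxAnnotation> boxes = {
        {"frame_0001.png", "pedestrian", 120, 48, 32, 96},
        {"frame_0001.png", "vehicle",    300, 60, 128, 72},
    };
    return writeAnnotations("annotations.csv", boxes) ? 0 : 1;
}
```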
By implementing signal processing algorithms for various types of MEMS sensors, such as accelerometers, gyroscopes, and IMUs, we ensure that our customers have the framework they need to implement motion detection features for their embedded systems, such as shock detection, motion/movement detection, and gesture control, on a custom embedded platform with limited resources and strict low power consumption requirements.
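As a minimal sketch of the shock-detection idea, assuming a 3-axis accelerometer reporting in g, the example below flags a shock when the acceleration magnitude deviates from 1 g by more than a threshold; the threshold and sample values are illustrative only.

```cpp
// Sketch: simple shock detection on 3-axis accelerometer samples.
#include <cstdio>
#include <cmath>

struct AccelSample { float x, y, z; };   // acceleration in g

// Returns true when the sample looks like a shock event.
bool isShock(const AccelSample& s, float thresholdG) {
    float magnitude = std::sqrt(s.x * s.x + s.y * s.y + s.z * s.z);
    return std::fabs(magnitude - 1.0f) > thresholdG;   // ~1 g when the device is at rest
}

int main() {
    const AccelSample samples[] = {
        {0.02f, -0.01f, 1.00f},   // resting
        {0.10f,  0.05f, 1.02f},   // mild motion
        {1.80f, -0.90f, 2.40f},   // impact
    };
    for (const AccelSample& s : samples)
        std::printf("shock=%s\n", isShock(s, 0.8f) ? "yes" : "no");
    return 0;
}
```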
We develop camera calibration algorithms that compute the intrinsic and extrinsic parameters of the vision system. We integrate several calibration methods, such as pattern-based and feature-based calibration, online and offline camera calibration, stereo camera calibration, and camera rig calibration for 360-degree panoramic image stitching.
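To show where the calibrated intrinsic parameters fit, the sketch below applies the standard pinhole model to project a 3D point in camera coordinates to pixel coordinates; the focal lengths and principal point are placeholder values, not calibration output, and lens distortion is omitted for brevity.

```cpp
// Sketch: pinhole projection using intrinsic parameters (distortion omitted).
#include <cstdio>

struct Intrinsics {
    double fx, fy;   // focal lengths in pixels
    double cx, cy;   // principal point in pixels
};

struct Pixel { double u, v; };

// u = fx * X/Z + cx,  v = fy * Y/Z + cy
Pixel project(const Intrinsics& K, double X, double Y, double Z) {
    return { K.fx * X / Z + K.cx, K.fy * Y / Z + K.cy };
}

int main() {
    Intrinsics K = {800.0, 800.0, 320.0, 240.0};   // e.g. a 640x480 sensor
    Pixel p = project(K, 0.5, -0.2, 2.0);          // point 2 m in front of the camera
    std::printf("u=%.1f v=%.1f\n", p.u, p.v);
    return 0;
}
```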
Our team implements high-level data fusion algorithms for various automotive applications. We address all aspects of a sensor fusion algorithm, from the data acquisition step, where we implement reliable and scalable interfaces, up to the implementation of estimators for various types of processes, such as object trackers and dynamic models for 3D objects, ultimately providing the fused data for further processing.
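As an elementary illustration of one fusion step, the sketch below combines two independent measurements of the same quantity, such as the range to an object reported by two different sensors, by inverse-variance weighting; the sensor names and variance values are assumptions made for the example.

```cpp
// Sketch: inverse-variance fusion of two independent measurements.
#include <cstdio>

struct Estimate {
    double value;      // measured quantity
    double variance;   // sensor noise variance for this measurement
};

// Fused estimate: weight each input by the inverse of its variance.
Estimate fuse(const Estimate& a, const Estimate& b) {
    double wa = 1.0 / a.variance;
    double wb = 1.0 / b.variance;
    return { (wa * a.value + wb * b.value) / (wa + wb), 1.0 / (wa + wb) };
}

int main() {
    Estimate radar  = {25.4, 0.10};   // range in metres, low noise
    Estimate camera = {26.1, 0.90};   // range in metres, higher noise
    Estimate fused  = fuse(radar, camera);
    std::printf("fused range=%.2f m (variance=%.3f)\n", fused.value, fused.variance);
    return 0;
}
```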