Signal Processing

Navigating the complex world of radio waves for effective, reliable communication

Signal processing reveals all kinds of valuable information hidden in data from a variety of sources – from audio signals, video and images to sensor systems.

It might involve collecting information from the sensors in a fitness tracker and reporting back on sleep patterns, for example, or converting the information from ultrasound, radar and cameras on an autonomous vehicle into data needed to control navigation.

In a factory setting, signal processing is typically used to collect and analyse the data from sensors attached to production line machines to detect when the equipment needs remedial maintenance work.
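As a rough illustration of the idea (a generic sketch, not a Plextek implementation), a machine could be flagged for attention when the short-term vibration energy from a sensor drifts well above its long-term baseline; the one-second window and the threshold factor below are placeholders rather than tuned parameters.

```python
# Illustrative only: flag a machine for maintenance when the rolling RMS of its
# vibration signal rises well above the long-term baseline. The one-second window
# and the threshold factor are placeholder values, not tuned parameters.
import numpy as np

def rolling_rms(signal: np.ndarray, window: int) -> np.ndarray:
    """RMS of the signal over a sliding window."""
    mean_square = np.convolve(signal ** 2, np.ones(window) / window, mode="valid")
    return np.sqrt(mean_square)

def needs_maintenance(vibration: np.ndarray, sample_rate: int, factor: float = 3.0) -> bool:
    """True if the most recent vibration energy exceeds `factor` times the baseline."""
    rms = rolling_rms(vibration, window=sample_rate)   # one-second windows
    baseline = np.median(rms)                          # robust long-term level
    return bool(rms[-1] > factor * baseline)
```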

Signal processing techniques also remove unwanted disturbances, outliers and artefacts from ‘noisy’ real-world data, resulting in cleaner and more reliable datasets – essential for accurate modelling, predictions and other advanced data analysis tasks.
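A minimal example of this kind of clean-up, assuming uniformly sampled data and using standard SciPy routines, might combine a median filter to suppress isolated outliers with a zero-phase low-pass filter to remove broadband noise; the 5-sample kernel and 5 Hz cut-off are illustrative values only.

```python
# Illustrative clean-up sketch using standard SciPy routines: despike with a median
# filter, then apply a zero-phase Butterworth low-pass. The 5-sample kernel and the
# 5 Hz cut-off are placeholder values chosen for the example only.
import numpy as np
from scipy.signal import butter, filtfilt, medfilt

def clean_signal(x: np.ndarray, sample_rate: float, cutoff_hz: float = 5.0) -> np.ndarray:
    despiked = medfilt(x, kernel_size=5)                          # suppress isolated outliers
    b, a = butter(4, cutoff_hz / (sample_rate / 2), btype="low")  # 4th-order low-pass
    return filtfilt(b, a, despiked)                               # zero-phase filtering
```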


Real-world challenges

Dismounted Position & Navigation Sensor (DPNS)

To address the challenge of navigating when Global Navigation Satellite System (GNSS) signals, such as GPS, are unavailable, Plextek developed a boot-mounted inertial navigation system. The device contains inertial sensors and a microprocessor running signal processing algorithms that process the sensor signals to estimate the motion of the boot. This in turn is used to provide accurate position estimation in the absence of GNSS.

The signal processing algorithms filter out unwanted noise, estimate biases, and automatically detect different phases of the user’s gait to derive an accurate measurement – all within the meagre resources of a small microprocessor.
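By way of illustration only (the sketch below is a textbook approach, not the DPNS algorithm), stance phases in a foot-mounted inertial unit are often detected by looking for windows where the accelerometer magnitude stays close to gravity and the gyroscope is nearly still; those stationary intervals can then anchor zero-velocity updates that bound drift.

```python
# Textbook sketch, not the DPNS algorithm: detect the stance phase of gait from a
# foot-mounted IMU by finding samples where acceleration magnitude is close to
# gravity and angular rate is small. Thresholds are illustrative and sensor-specific.
import numpy as np

GRAVITY = 9.81  # m/s^2

def stance_phase(accel: np.ndarray, gyro: np.ndarray,
                 acc_tol: float = 0.5, gyro_tol: float = 0.3) -> np.ndarray:
    """Boolean mask of likely stationary samples.

    accel: (N, 3) specific force in m/s^2; gyro: (N, 3) angular rate in rad/s.
    """
    acc_mag = np.linalg.norm(accel, axis=1)
    gyro_mag = np.linalg.norm(gyro, axis=1)
    return (np.abs(acc_mag - GRAVITY) < acc_tol) & (gyro_mag < gyro_tol)
```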

Ubiquitous Radar Platform

Plextek’s PLX-U16 radar platform incorporates an advanced signal processing chain that takes in raw radar signal data and outputs detections with estimates of target location, velocity and even classification. The processing chain contains numerous signal processing steps such as multi-dimensional Fourier transforms, windowing operations and interpolation. Constant False Alarm Rate (CFAR) algorithms dynamically adjust detection thresholds as noise levels vary, ensuring the false alarm rate remains consistent. A track-before-detect methodology is used, where candidate detections are fed into a tracker that observes how those detections evolve over time. Detections which evolve along credible trajectories are retained and identified as genuine targets. This track-before-detect approach enables the use of a lower detection threshold, which increases detection probability without compromising the false alarm rate.
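To give a flavour of how an adaptive threshold works (a generic cell-averaging CFAR sketch, not the PLX-U16 processing chain), the noise level around each cell under test can be estimated from neighbouring training cells, with guard cells excluded, and the detection threshold scaled with that estimate.

```python
# Generic cell-averaging CFAR sketch, not the PLX-U16 chain: estimate the local
# noise level from training cells either side of the cell under test (skipping
# guard cells) and scale the detection threshold with that estimate, so the false
# alarm rate stays roughly constant as the noise floor changes.
import numpy as np

def ca_cfar(power: np.ndarray, guard: int = 2, train: int = 8, scale: float = 4.0) -> np.ndarray:
    """Boolean detection mask over a 1-D range profile of power values."""
    n = len(power)
    detections = np.zeros(n, dtype=bool)
    for i in range(guard + train, n - guard - train):
        leading = power[i - guard - train : i - guard]
        lagging = power[i + guard + 1 : i + guard + 1 + train]
        noise = np.mean(np.concatenate([leading, lagging]))  # local noise estimate
        detections[i] = power[i] > scale * noise             # adaptive threshold
    return detections
```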


Key skills

  • Expertise in real-time system design

    Specialising in the design of systems, such as radar, that require non-stop operation over extensive periods.

  • Reliability engineering

    Ensuring high system reliability for critical applications, essential in fields such as autonomous vehicle navigation and industrial monitoring.

  • Expenditure optimisation

    Delivering complex signal processing solutions within strict budgetary limits, optimising expenditure without sacrificing performance.

  • Intelligence at the edge

    Building systems that understand their surroundings, allowing autonomous operation where communication with central processing is limited or unavailable.

  • Python code implementation

    Employing Python for data processing pipelines, providing practical code examples for the software development community.

  • Signal interpretation

    Translating complex raw data from varied sources into actionable insights, crucial for making informed decisions.

  • Noise filtering techniques

    Applying sophisticated signal processing methods to minimise unwanted disturbances and artefacts, resulting in cleaner and more dependable datasets.

  • Predictive maintenance

    Using signal analysis to forecast when equipment requires maintenance, significantly reducing downtime and operational expenses by detecting issues before they cause failure.

  • Low size, weight and power (SWAP) operation

    Obtaining optimal processing performance per watt when power budgets are highly constrained.


When someone says ‘signal processing’, many people instantly think of complex mathematics or hard-to-program computer systems and then run a mile. It is better to think of signal processing as the enabling technology that allows us to interpret, change and generate information. Signal processing allows us to understand the output of sensors and receivers, and to extract and derive meaning from what would otherwise just be a mess of data, giving us useful outputs. It drives our modern world.

Dr Peter Debenham, Senior Consultant


What sets us apart when it comes to signal processing?

Our signal processing expertise enables us to transform complex data into actionable insights with precision and clarity. Our areas of expertise include:

  • Clean datasets
  • Continuous operation
  • Data modelling
  • Data processing pipeline
  • Image processing
  • Modelling and predictions
  • Noise reduction
  • Non-expert operation
  • Predictive maintenance
  • Python code programming
  • Radar systems
  • Real-time data capture
  • Reliability
  • Low SWAP
  • Sensor systems
  • Signal extraction
  • Video analysis
Contact Plextek

If you have a question, or even just an idea, get in touch.

Related Technical Papers

Sensing Auditory Evoked Potentials with Non-Invasive Electrodes and Low-Cost Headphones

This paper presents a sensor for measuring auditory brainstem responses to help diagnose hearing problems away from specialist clinical settings, using non-invasive electrodes and commercially available headphones. The challenge of reliably measuring low-level electrical signals in the presence of significant noise is addressed via a precision analogue processing circuit, which includes a novel impedance measurement approach to verify good electrode contact. Results are presented showing that the new sensor was able to reliably sense auditory brainstem responses using non-invasive electrodes, even at lower stimulus levels.
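As a hedged illustration of one standard technique for recovering small evoked responses from noise (not necessarily the exact processing used in the paper), stimulus-locked epochs can be averaged coherently, reducing uncorrelated noise by roughly the square root of the number of repetitions.

```python
# Hedged sketch of coherent (stimulus-locked) averaging, a standard way to recover
# a small evoked response buried in noise; it is not necessarily the processing
# described in the paper. Averaging N epochs reduces uncorrelated noise by ~sqrt(N).
import numpy as np

def average_epochs(recording: np.ndarray, stimulus_onsets: np.ndarray, epoch_len: int) -> np.ndarray:
    """Average fixed-length epochs aligned to each stimulus onset (sample indices)."""
    epochs = [recording[start : start + epoch_len]
              for start in stimulus_onsets
              if start + epoch_len <= len(recording)]
    return np.asarray(epochs).mean(axis=0)
```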

GPU Computing

Power limits restrict CPU speeds, but GPUs offer a solution for faster computing. Initially designed for graphics, GPUs now handle general computing, thanks to advancements by NVIDIA, AMD and Intel. With hundreds of cores, GPUs significantly outperform CPUs in parallel processing tasks. Modern supercomputers, like Titan, utilise thousands of GPU cores for immense speed. NVIDIA’s CUDA platform simplifies GPU programming, making it accessible for parallel tasks. While GPUs excel at parallelisable problems, they can be limited by data transfer rates and algorithm design. NVIDIA’s Tesla GPUs provide high performance in both single and double precision calculations. Additionally, embedded GPUs like the NVIDIA Jetson TX2 deliver powerful, low-power computing for specialised applications. Overall, GPUs offer superior speed and efficiency for parallel tasks compared to CPUs.
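As a simple illustration (assuming an NVIDIA GPU and the CuPy library, which mirrors much of the NumPy API and is not part of the paper), the same batched FFT can be dispatched to the GPU by swapping the array module, which is exactly the kind of data-parallel workload where GPUs tend to win.

```python
# Simple illustration, assuming an NVIDIA GPU and the CuPy package (not part of the
# paper): CuPy mirrors much of the NumPy API, so the same batched FFT can be run on
# the GPU by swapping the array module.
import numpy as np
import cupy as cp

signals = np.random.randn(4096, 1024).astype(np.float32)   # 4096 signals, 1024 samples each

spectra_cpu = np.fft.rfft(signals, axis=1)                  # CPU, NumPy

signals_gpu = cp.asarray(signals)                           # copy to GPU memory
spectra_gpu = cp.fft.rfft(signals_gpu, axis=1)              # GPU, CuPy
spectra_back = cp.asnumpy(spectra_gpu)                      # copy back to host

assert np.allclose(spectra_cpu, spectra_back, atol=1e-3)    # same maths, different device
```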