A Quick Demo of Synchronizing Driving Data
This is a quick demo of how car vibration and steering events can be used to synchronize driving data. The video itself is a visualization of the data streams we're working with. The audio you're hearing in the background, besides my voice, comes from a shotgun microphone placed behind the rear right tire. The middle column has three images, each from a different webcam: front, dashboard, and face. The dashboard video has an overlaid steering wheel icon that visualizes the position of the steering wheel as reported by the CAN network.
The top left image shows the dense optical flow in the video of the forward roadway, which provides valuable information about the movement of the car. The bottom left shows our location on a map; we're in beautiful Cambridge, Massachusetts. The remaining plots show the ten-second window around the current measurement of various sensors: on the left, the horizontal optical flow in the front video and the steering wheel position; on the right, the audio energy from the shotgun microphone, the y component of the optical flow from the three webcams, and the z-axis of the accelerometer.
The goal is to synchronize all these sensors, either online or offline as a post-processing step. We do this by synchronizing the video of the forward roadway with the CAN network by looking at steering events. When you make a turn, like the one coming up here, the horizontal optical flow will be negative for a right turn and positive for a left turn. The turn coming up here is a left turn, so in the top left image the dense optical flow will light up in the same color, indicating a positive value. This allows us to determine the optimal shift for synchronizing the steering wheel signal with the forward video.
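To make the steering-event signal concrete, here is a minimal sketch (not the code released with the paper) of how a per-frame horizontal optical flow signal could be extracted from the forward video using OpenCV's Farneback dense optical flow; the file path and parameters are illustrative assumptions.

```python
# Sketch: mean horizontal optical flow per frame as a steering-event signal.
# Positive values correspond to left turns, negative values to right turns.
import cv2
import numpy as np

def horizontal_flow_signal(video_path):
    cap = cv2.VideoCapture(video_path)
    ok, prev = cap.read()
    prev_gray = cv2.cvtColor(prev, cv2.COLOR_BGR2GRAY)
    signal = []
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        # Dense (Farneback) optical flow between consecutive frames.
        flow = cv2.calcOpticalFlowFarneback(prev_gray, gray, None,
                                            0.5, 3, 15, 3, 5, 1.2, 0)
        # flow[..., 0] is the horizontal (x) component; its mean over the frame
        # swings positive during left turns and negative during right turns.
        signal.append(float(flow[..., 0].mean()))
        prev_gray = gray
    cap.release()
    return np.array(signal)
```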
To compute this optimal shift, we use the cross-correlation function: the lag at which the cross-correlation is maximized determines the shift. In the same way, we synchronize the rest of the sensors with the video of the forward roadway using vibration events. The right column shows five plots: the audio energy, the y component of the optical flow for the three webcams, and the z-axis of the accelerometer, each capturing the vibration of the car caused by the road.
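As a rough illustration of the cross-correlation step, the following sketch estimates the offset between two already-resampled sensor streams (for example, the steering wheel angle from CAN and the horizontal flow signal above) as the lag that maximizes their cross-correlation; the function and variable names are my own, not from the released source.

```python
# Sketch: estimate the offset between two 1-D sensor streams sampled at rate
# `fs` by finding the lag that maximizes their cross-correlation.
import numpy as np
from scipy.signal import correlate, correlation_lags

def optimal_shift(reference, other, fs):
    # Normalize so amplitude differences don't bias the correlation peak.
    a = (reference - reference.mean()) / reference.std()
    b = (other - other.mean()) / other.std()
    xcorr = correlate(a, b, mode="full")
    lags = correlation_lags(len(a), len(b), mode="full")
    best_lag = lags[np.argmax(xcorr)]  # lag (in samples) at which the correlation peaks
    return best_lag / fs               # convert to seconds (sign follows scipy's lag convention)
```

For instance, calling `optimal_shift(flow_signal, steering_signal, fs=30.0)` on two streams resampled to 30 Hz would give the estimated offset in seconds; the sign of the result indicates which stream leads the other.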
Steering and vibration provide signals that can be used for passive synchronization, resulting in a synchronized dataset. This is important both for the analysis of driver behavior and for the design of data systems that use decision fusion to make real-time predictions based on multiple sensor streams. The paper, along with a sample dataset and source code, is available in the description.
"WEBVTTKind: captionsLanguage: enthis is a quick demo of how car vibration and steering events can be used to synchronize driving data the video itself is a visualization of the data streams we're working with the audio you're hearing in the background besides my voice is on my shotgun microphone placed behind the rear right tire the middle column has three images each from a different webcam front dashboard and face the dashboard video has an overlaid steering wheel icon that is visualizing the position of the steering wheel as supported by the cam Network the top left image shows the dense optical flow in the video of the forward roadway the bottom left just shows our location and a map we're in beautiful Cambridge Massachusetts and the rest are plots showing the tenth second window around the current measurement of various sensors on the left are the horizontal optical flow in the front video and the steering wheel position on the right are the audio energy from the shotgun microphone the Y component of the optical flow from the three webcams and finally the z axis of the accelerometer what we would like to do is to synchronize all these senses either online or offline as a post processing step we do this by for synchronizing the video of the forward roadway with a can network by looking at steering events when you make a turn like the one coming up here the horizontal optical flow will be negative if it's a right turn and positive it's a left turn the coming up here is a left turn and you will see in the top left image the dense optical flow will light up all the same color it will be a positive value since the left turn we can then determine the optimal shift the synchronization between a steering wheel and the forward video by computing the cross correlation function the maximum value for the cross correlation function to determine a shift and the same way we synchronized the rest of the sensors with the video of the forward roadway using vibration events on the right are five plots showing the audio energy the white component of the optical flow for the three webcams the z-axis of the accelerometer each capturing the vibration of the car caused by the road a few examples are coming up shortly here so steering and vibration gives us a signal that we can use for passive synchronization the result is a synchronized data set which is important both for the analysis of driver behavior and for the design of a data systems they use decision fusion to make real-time prediction based on multiple sensor streams the paper along with a sample data set and source code are available in the descriptionthis is a quick demo of how car vibration and steering events can be used to synchronize driving data the video itself is a visualization of the data streams we're working with the audio you're hearing in the background besides my voice is on my shotgun microphone placed behind the rear right tire the middle column has three images each from a different webcam front dashboard and face the dashboard video has an overlaid steering wheel icon that is visualizing the position of the steering wheel as supported by the cam Network the top left image shows the dense optical flow in the video of the forward roadway the bottom left just shows our location and a map we're in beautiful Cambridge Massachusetts and the rest are plots showing the tenth second window around the current measurement of various sensors on the left are the horizontal optical flow in the front video and the steering wheel position on 
the right are the audio energy from the shotgun microphone the Y component of the optical flow from the three webcams and finally the z axis of the accelerometer what we would like to do is to synchronize all these senses either online or offline as a post processing step we do this by for synchronizing the video of the forward roadway with a can network by looking at steering events when you make a turn like the one coming up here the horizontal optical flow will be negative if it's a right turn and positive it's a left turn the coming up here is a left turn and you will see in the top left image the dense optical flow will light up all the same color it will be a positive value since the left turn we can then determine the optimal shift the synchronization between a steering wheel and the forward video by computing the cross correlation function the maximum value for the cross correlation function to determine a shift and the same way we synchronized the rest of the sensors with the video of the forward roadway using vibration events on the right are five plots showing the audio energy the white component of the optical flow for the three webcams the z-axis of the accelerometer each capturing the vibration of the car caused by the road a few examples are coming up shortly here so steering and vibration gives us a signal that we can use for passive synchronization the result is a synchronized data set which is important both for the analysis of driver behavior and for the design of a data systems they use decision fusion to make real-time prediction based on multiple sensor streams the paper along with a sample data set and source code are available in the description\n"