
Ford ADAS

Client: Ford
Agency: ANML
Role: Development Consultant
Technologies: React, Redux, MQTT
overview

Building the next generation of Advanced Driver Assistance Systems

With stiff competition from Google and Apple, Ford wanted to learn how to better leverage a car’s sensors to give drivers more assistance during guided navigation. I built React-based rapid prototypes in collaboration with D-Ford’s UX team for use inside car bucks and R&D vehicles, enabling D-Ford to run user tests with real customers.
background

Before we get started, what the heck is a buck?

A car buck, as used in this project, is essentially a skeleton of the car with three screens: one for the instrument cluster, one for the center stack, and a final large screen that shows the “drive”. UX researchers at D-Ford use this apparatus to test new versions of the software in a safe, controlled environment.
challenges

Solving anxiety in the newest way to cruise.

BlueCruise is Ford’s active driving assistance system; on certain highways it allows drivers to take their hands off the wheel and be navigated to their destination. While this is a very powerful feature, it can also produce anxiety-inducing situations if not handled properly. How do you tell a driver at 60 mph that they are in a BlueCruise zone? How much notice should a driver get before entering or engaging a zone? How much ambient traffic should be visualized, and on which display? The primary goal of this project was to help answer these questions.
approach

Gathering data, so that we can gather other data.

Using an old Ford Explorer tricked out with the latest sensors, we recorded the drive for use in the buck. MQTT, an IoT messaging standard, was used to record almost everything the car did: from speed and lane changes to the curvature of the road, all of this data would be consumed by the instrument cluster and referenced by the center-stack video.
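As an illustration of how such a recording can be structured (the topic layout and payload fields here are my assumptions, not the actual schema used on the project), each telemetry sample might be wrapped in a small timestamped envelope before being published over MQTT:

```javascript
// Hypothetical encoding of one telemetry sample for MQTT recording/playback.
// The topic structure and payload fields are illustrative assumptions.
function encodeTelemetry(signal, value, timestampMs) {
  return {
    topic: `drive/telemetry/${signal}`, // e.g. drive/telemetry/speed
    payload: JSON.stringify({ value, t: timestampMs }),
  };
}

// A speed sample and a lane-change event share the same envelope, so a
// consumer like the instrument cluster can replay them in timestamp order.
const speedSample = encodeTelemetry('speed', 96.5, 12340);
const laneSample = encodeTelemetry('laneChange', 'left', 12410);
```

Keeping every signal in one uniform envelope is what lets a single recorded drive feed both the cluster code and the reference videos.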
To create the screens, we used two different methods. The first relied on 3D artists and After Effects animators, who referenced the recorded drive to determine how the center stack inside the car would behave. The second was a React-based instrument cluster: working closely with the UX team, I produced new versions of the cluster daily, which allowed for more iterations in less time.
features

The process of keeping code and videos in sync.

While the instrument cluster was built in code, the drive and the center stack were videos, and all three needed to stay synchronized to create a realistic driving experience in the buck. I used MQTT over WebSockets so that three browsers, one on each screen, could react to events at the same time.
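A minimal sketch of the drift-correction side of that kind of sync, assuming an mqtt.js-style client over WebSockets (the broker URL, topic name, message shape, and threshold are all assumptions for illustration):

```javascript
// Assumed wiring, shown for context only:
//   const client = mqtt.connect('ws://localhost:9001');
//   client.subscribe('buck/sync');
//   client.on('message', (topic, msg) => {
//     const { playheadSec } = JSON.parse(msg.toString());
//     const seekTo = computeSeek(video.currentTime, playheadSec);
//     if (seekTo !== null) video.currentTime = seekTo;
//   });

// Only seek the video when it has drifted past a threshold, so playback
// stays visually smooth while remaining in step with the code-driven cluster.
function computeSeek(localPlayheadSec, masterPlayheadSec, thresholdSec = 0.25) {
  const drift = Math.abs(localPlayheadSec - masterPlayheadSec);
  return drift > thresholdSec ? masterPlayheadSec : null; // null = no seek
}
```

The threshold matters: seeking on every message would cause constant stutter, while never seeking would let the videos slowly fall out of step with the cluster.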
When working within the cluster, my top priority was keeping the app easy to refactor, since changes landed daily, if not hourly, during the rapid prototyping phase. I also had to balance frame rates against precise notifications that transitioned smoothly between states, with those transitions triggered by logged events that were recorded at varying frequencies.
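One common way to smooth over events recorded at varying frequencies is to interpolate between the two most recent samples on every animation frame. This is a simplified sketch of that idea, not the production code:

```javascript
// Linearly interpolate a signal between two timestamped samples so the
// cluster can animate at a steady frame rate even when telemetry events
// arrive at uneven intervals. `prev` and `next` are { t, value } samples
// with `t` in milliseconds.
function interpolate(prev, next, nowMs) {
  if (nowMs <= prev.t) return prev.value;   // before the window: clamp
  if (nowMs >= next.t) return next.value;   // after the window: clamp
  const f = (nowMs - prev.t) / (next.t - prev.t);
  return prev.value + f * (next.value - prev.value);
}

// Halfway between a 0 km/h sample at t=0 and a 10 km/h sample at t=100 ms:
// interpolate({ t: 0, value: 0 }, { t: 100, value: 10 }, 50) -> 5
```

Clamping at both ends keeps the display stable when events stall, at the cost of a brief freeze until the next sample arrives.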