Ford Motor Co. is showing off a series of research projects at the mammoth CES 2016 – formerly the Consumer Electronics Show – in Las Vegas this week aimed at connecting aerial drones and homes to vehicles through the OEM’s SYNC onboard communications system.
First, Ford is working with drone maker DJI as part of the DJI Developer Challenge to create drone-to-vehicle communications using its SYNC AppLink or OpenXC protocols.
The goal of this project is to develop a surveying system for the United Nations Development Program that would allow first responders to quickly deploy drones to survey and map areas hit by earthquakes, tsunamis or other natural disasters – all from the cab of an F-150.
Developers are tasked with creating software that would allow an F-150 and a drone to communicate in real time, with the challenge winner receiving $100,000.
Ford said this rapidly deployable U.N. surveying system would ideally work like this: In a disaster, an emergency response team would drive an F-150 as far as possible into an emergency zone, with the driver using the SYNC touch screen to identify a target area and launch a drone by accessing an app projected through Ford SYNC AppLink. The drone would follow a flight path over the zone, capturing video and creating a map of survivors with associated close-up pictures of each.
Using the driver’s smartphone, the F-150 would establish a real-time link among the drone, the truck and the cloud so that vehicle data could be shared, Ford noted. Data would be relayed to the drone so the driver could continue to a new destination, and the drone would catch up and dock with the truck, the OEM added.
Developers will be able to use vehicle data available through SYNC AppLink or the OpenXC platform to create a seamless drone-to-vehicle communications experience, Ford said.
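To give a sense of what that vehicle data looks like, the OpenXC platform streams newline-delimited JSON messages, each with a signal "name" and "value". The sketch below is a hypothetical illustration, not Ford's or DJI's implementation: it parses such a stream and pulls out the truck's last reported GPS fix, the kind of value a drone app might use as a docking target. The signal names "latitude" and "longitude" follow the OpenXC message format; everything else here is assumed for illustration.

```python
import json

def last_known_position(raw_stream):
    """Return (lat, lon) from the most recent latitude/longitude messages
    in a newline-delimited stream of OpenXC-style JSON vehicle messages."""
    position = {"latitude": None, "longitude": None}
    for line in raw_stream.splitlines():
        try:
            msg = json.loads(line)
        except ValueError:
            continue  # skip malformed lines in the stream
        # Each OpenXC message carries a signal name and its current value
        if msg.get("name") in position:
            position[msg["name"]] = msg["value"]
    return position["latitude"], position["longitude"]

# Example stream a SYNC AppLink or OpenXC app might receive from the truck
sample = "\n".join([
    '{"name": "vehicle_speed", "value": 42.0}',
    '{"name": "latitude", "value": 42.293}',
    '{"name": "longitude", "value": -83.238}',
])
print(last_known_position(sample))  # → (42.293, -83.238)
```

In the disaster scenario Ford describes, a relay like this would run continuously, forwarding the updated position to the drone so it can catch up with the moving truck.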
The OEM hopes this project will eventually enable drone-to-vehicle applications in agriculture, forestry, construction, bridge inspection, search and rescue, and other work environments in which vehicles are space-, height- or terrain-limited.
Ford also noted at CES 2016 that it’s exploring linking smart devices like Amazon Echo and Wink to its vehicles to allow consumers to control lights, thermostats, security systems and other systems in their homes from their car, and to stop, start, lock, unlock and check their vehicle’s fuel range from the couch.
Ford said it is experimenting with its new SYNC Connect technology to link vehicles with the Amazon cloud-based voice service Alexa, which would allow customers to access their vehicle from inside their home.
Using Amazon Echo, a hands-free speaker and voice command device that interfaces with Alexa, Ford owners could request assistance with various functions of their car, including starting or stopping the vehicle, locking or unlocking the doors, checking the range and charge status of an electric vehicle, and checking fuel level and miles-to-empty, among others.
Alexa would also provide a link to a variety of Internet-enabled smart devices – such as lights, home security systems and automatic garage doors – via a steering wheel-mounted voice recognition button, allowing the driver to make requests of connected smart devices or of Alexa functions such as weather reports, music and shopping lists.
Ford said it is also working with Wink, the smart home platform that brings together smart home devices from many different companies. Compatibility with SYNC AppLink would enable Wink users to easily control and automate their smart home devices from the dashboard of their car or through voice control, the OEM noted.
Finally, Ford said at CES 2016 that it is tripling its fleet of fully autonomous Ford Fusion Hybrid test vehicles – adding 20 of them to expand its self-driving test fleet to 30 vehicles in all – and plans to use a new-generation sensor technology as well.
Those latest self-driving hybrid sedans represent Ford’s third-generation autonomous vehicle development platform as well, noted Raj Nair, the OEM’s executive VP for global product development and chief technical officer, in a statement.
Ford added that its fully autonomous cars will take to the streets of California this year, while testing will continue at its proving grounds, on public roads in Michigan, and at Mcity – a 32-acre, full-scale simulated urban environment located at the University of Michigan. Ford noted that the objective of its self-driving vehicle fleet is to test many of the computing and sensor components required to achieve fully autonomous driving capability, as defined by SAE International Level 4, which does not require the driver to intervene and take control of the vehicle.
Last summer, Jim McBride, Ford’s technical leader for autonomous vehicles, said the OEM transitioned from the research phase of development to the advanced engineering phase and that its third-generation self-driving vehicles represent a key step in the OEM’s Ford Smart Mobility strategy – the plan to take Ford to the next level in connectivity, mobility, autonomous vehicles, the customer experience, and data and analytics.
McBride noted that the OEM’s third-generation autonomous Fusion Hybrid sedans will have supplemental features and duplicate wiring for power, steering and brakes – acting as “backups,” if needed – as well as the latest LiDAR [Light Detection And Ranging] sensors from Velodyne, dubbed the Solid-State Hybrid Ultra PUCK Auto for their hockey puck-like appearance.
He said Solid-State Hybrid Ultra PUCK Auto sensors boast a longer range of 200 meters [over 600 feet], making them the first auto-specific LiDAR sensors capable of handling different driving scenarios.
“Adding the latest generation of computers and sensors, including the smaller and more affordable Solid-State Hybrid Ultra PUCK Auto sensors helps bring Ford ever closer to having a fully autonomous vehicle ready for production,” McBride added.