LiDAR Steering SmartCAR

Automotive Robot with LiDAR Sensor, Radar Sensor, and Steering System

  • Adopts Arduino, an open-source hardware platform, to control robot subsystems such as motors and sensors.
  • LiDAR sensor configuration for autonomous driving.
  • Training in the Robot Operating System (ROS), a robotics middleware.
  • Training in simultaneous localization and mapping (SLAM).
  • Obstacle detection using multiple ultrasonic sensors.
  • Speed measurement with LiDAR.
  • Line-tracing drive using infrared sensors.
  • Drive control using a DC motor with a built-in encoder.
  • Provides a Java-based OpenCV solution so Android can be used for vision robot research.
  • Intelligent control using accelerometer and gyroscope sensors.
  • Uses smartphones and tablets as the robot’s brain.
  • Provides the AndroX Studio™ integrated development environment for robotic system service development.

Introduction

LiDAR STEERING SmartCAR is developed to support research on ICT convergence services using intelligent mobile robots and the training of highly skilled engineers. With its LiDAR sensor and steering system, it is an educational device for learning about LiDAR, various sensors, autonomous driving, ROS (Robot Operating System), and SLAM (simultaneous localization and mapping). Designed so that a smartphone or PC can serve as the robot’s brain for high-performance vision processing, it combines data from its acceleration, magnetic, and gyroscope sensors, 12 ultrasonic sensors, and 8 infrared sensors with vision, and can be used to develop innovative autonomous navigation algorithms and application services for mobile robots.

Features

  • This is a mobile robot with a LiDAR sensor for autonomous driving. It includes collision-avoidance and position-tracking examples, so you can learn about ROS and SLAM.
  • With the Arduino integrated development environment, anyone can quickly and easily implement firmware for electronic device control. The Arduino IDE is based on the Processing/Wiring environment, which is effective for developing interactive objects, makes the microcontroller easy to operate, and allows simple programming over USB.
  • By supporting an ADK (Accessory Development Kit) based development environment, Google’s smart-device peripheral design platform, you can quickly and easily develop applications that work with smart devices running the Google Android platform.
  • With 12 ultrasonic sensors and 8 infrared sensors, the robot can avoid obstacles and carry out missions along a given route (see the obstacle-avoidance sketch after this list).
  • By incorporating acceleration and gyroscope sensors, you can develop an intelligent robot that travels autonomously by sensing and evaluating its own acceleration, vibration, shock, and motion.
  • The DC geared motor has a built-in encoder, so the motor’s operating state can be monitored and its rotation direction and speed calculated (see the encoder sketch after this list).
  • Accurate steering control is possible with a servo motor, which turns the front-wheel axis to set the direction of travel.
  • A built-in Bluetooth module enables remote control based on the SPP profile from a PC, notebook, smartphone, tablet, or any other device that supports Bluetooth communication.
  • Smartphones and tablets can be used as the brain of the mobile robot, enabling mobile-robot-based ICT convergence services that take advantage of the high-performance processors and Wi-Fi connectivity these devices provide.
  • We provide AndroX Studio™, an integrated development environment for Android-based robot image processing and high-end service development.
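
As a rough illustration of the ultrasonic and steering features above, the following Arduino-style sketch reads a single HC-SR04-type ultrasonic sensor and turns a steering servo away from nearby obstacles. The pin numbers, distance threshold, and servo angles are assumptions made for this example, not the SmartCAR's documented wiring or API.

  // Minimal obstacle-avoidance sketch (pin assignments are illustrative).
  // Reads one HC-SR04-style ultrasonic sensor and steers away from
  // obstacles with a servo; the real SmartCAR wiring may differ.
  #include <Servo.h>

  const int TRIG_PIN = 7;      // ultrasonic trigger (assumed)
  const int ECHO_PIN = 8;      // ultrasonic echo (assumed)
  const int SERVO_PIN = 9;     // steering servo signal (assumed)

  Servo steering;

  void setup() {
    pinMode(TRIG_PIN, OUTPUT);
    pinMode(ECHO_PIN, INPUT);
    steering.attach(SERVO_PIN);
    steering.write(90);        // 90 degrees = wheels straight
  }

  // Return the distance in centimetres from one ultrasonic ping.
  long readDistanceCm() {
    digitalWrite(TRIG_PIN, LOW);
    delayMicroseconds(2);
    digitalWrite(TRIG_PIN, HIGH);
    delayMicroseconds(10);
    digitalWrite(TRIG_PIN, LOW);
    long duration = pulseIn(ECHO_PIN, HIGH, 30000UL);  // echo time in us (0 on timeout)
    return duration / 58;      // about 58 us per cm, round trip
  }

  void loop() {
    long distance = readDistanceCm();
    if (distance > 0 && distance < 30) {
      steering.write(120);     // obstacle ahead: steer away (angle is illustrative)
    } else {
      steering.write(90);      // path clear: keep straight
    }
    delay(50);
  }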
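
The encoder-based speed measurement mentioned above can be sketched in the same style. This example assumes a quadrature encoder on interrupt-capable pins and an illustrative resolution of 360 counts per revolution; the SmartCAR's actual pin mapping and encoder resolution may differ.

  // Sketch of speed and direction estimation from a quadrature encoder
  // (pins and counts-per-revolution are assumptions for the example).
  const int ENC_A = 2;                 // encoder channel A (interrupt pin)
  const int ENC_B = 3;                 // encoder channel B
  const long COUNTS_PER_REV = 360;     // illustrative encoder resolution

  volatile long encoderCount = 0;

  void onEncoderA() {
    // The level of channel B at the rising edge of A gives the direction.
    if (digitalRead(ENC_B) == HIGH) {
      encoderCount++;                  // forward
    } else {
      encoderCount--;                  // reverse
    }
  }

  void setup() {
    pinMode(ENC_A, INPUT_PULLUP);
    pinMode(ENC_B, INPUT_PULLUP);
    attachInterrupt(digitalPinToInterrupt(ENC_A), onEncoderA, RISING);
    Serial.begin(115200);
  }

  void loop() {
    static long lastCount = 0;
    noInterrupts();
    long count = encoderCount;         // copy the shared counter atomically
    interrupts();
    long delta = count - lastCount;    // counts accumulated in ~100 ms
    lastCount = count;

    // revolutions per second = counts / (counts per revolution * interval)
    float rps = (float)delta / COUNTS_PER_REV / 0.1f;
    Serial.print("speed [rev/s]: ");
    Serial.println(rps);               // sign indicates rotation direction
    delay(100);
  }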

Block Diagram

Integrated Development Environment AndroX Studio

Configuration and Name

Hardware Specifications

Software Specifications

ROS

The Robot Operating System (ROS) is robotics middleware, i.e. a collection of software frameworks for robot software development. Although ROS is not an operating system, it provides services designed for a heterogeneous computer cluster, such as hardware abstraction, low-level device control, implementation of commonly used functionality, message passing between processes, and package management.
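
As a minimal sketch of the message-passing model described above, the following ROS 1 (roscpp) node publishes a text message once per second. The node name and topic name are arbitrary choices for the example.

  // Minimal ROS 1 (roscpp) node: publishes a string message at 1 Hz
  // on an illustrative topic, showing publish/subscribe message passing.
  #include <ros/ros.h>
  #include <std_msgs/String.h>

  int main(int argc, char **argv) {
    ros::init(argc, argv, "smartcar_talker");          // node name (assumed)
    ros::NodeHandle nh;
    ros::Publisher pub = nh.advertise<std_msgs::String>("chatter", 10);

    ros::Rate rate(1);                                 // 1 Hz
    while (ros::ok()) {
      std_msgs::String msg;
      msg.data = "hello from SmartCAR";
      pub.publish(msg);                                // delivered to any subscriber
      ros::spinOnce();
      rate.sleep();
    }
    return 0;
  }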

SLAM

Simultaneous localization and mapping (SLAM) is a concept used in robotics and related fields. It is a technique in which a mobile robot moves around an arbitrary space, explores its surroundings, and builds a map of the space while estimating its own current position.
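
A full SLAM implementation is beyond a short example, but the localization half can be hinted at with a simple dead-reckoning pose update from odometry. The sketch below is only an illustration; a real SLAM stack would fuse such odometry with LiDAR scan matching and map building.

  // Dead-reckoning pose update from odometry: an illustration of the
  // localization component that a SLAM system refines with sensor data.
  #include <cmath>
  #include <cstdio>

  struct Pose {
    double x;      // metres
    double y;      // metres
    double theta;  // heading in radians
  };

  // Advance the pose by a measured travel distance and heading change
  // (simple approximation: apply the heading change, then move).
  Pose integrateOdometry(Pose p, double distance, double dTheta) {
    p.theta += dTheta;
    p.x += distance * std::cos(p.theta);
    p.y += distance * std::sin(p.theta);
    return p;
  }

  int main() {
    Pose pose{0.0, 0.0, 0.0};
    // Example: drive 0.5 m straight, then 0.5 m while turning 0.1 rad.
    pose = integrateOdometry(pose, 0.5, 0.0);
    pose = integrateOdometry(pose, 0.5, 0.1);
    std::printf("x=%.3f y=%.3f theta=%.3f\n", pose.x, pose.y, pose.theta);
    return 0;
  }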