LiDAR SmartCAR

Automotive Robot with LiDAR Sensor

  • Adopts Arduino, an open hardware platform, to control robot subsystems such as motors and sensors.
  • LiDAR sensor configuration for autonomous driving.
  • Training in the Robot Operating System (ROS), a robot middleware.
  • Simultaneous localization and mapping (SLAM) training.
  • Obstacle detection using multiple ultrasonic sensors.
  • Line tracer driving using infrared sensors (see the sketch after this list).
  • Drive unit control using DC encoder motors.
  • Provides a Java-based OpenCV solution so that Android can be used for vision robot research.
  • Intelligent control using accelerometer and gyroscope sensors.
  • Using smartphones and tablets as robots’ brains.
  • C programming support using CodeVision.
  • Provides the AndroX Studio™ integrated development environment for robotic system service development.
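
As a simple illustration of the line-tracer item above, the sketch below reads two infrared line sensors and adjusts the two drive-side PWM outputs accordingly. It is only a minimal Arduino-style C++ sketch: the pin numbers, sensor polarity, and motor wiring are assumptions for illustration and do not come from the LiDAR SmartCAR documentation.

    // Minimal line tracing sketch (illustrative only; pin numbers are assumptions).
    const int IR_LEFT  = 2;          // left infrared line sensor (digital)
    const int IR_RIGHT = 3;          // right infrared line sensor (digital)
    const int MOTOR_LEFT_PWM  = 5;   // left drive motor speed (PWM)
    const int MOTOR_RIGHT_PWM = 6;   // right drive motor speed (PWM)

    void setup() {
      pinMode(IR_LEFT, INPUT);
      pinMode(IR_RIGHT, INPUT);
      pinMode(MOTOR_LEFT_PWM, OUTPUT);
      pinMode(MOTOR_RIGHT_PWM, OUTPUT);
    }

    void loop() {
      bool leftOnLine  = digitalRead(IR_LEFT)  == LOW;  // LOW assumed to mean "line detected"
      bool rightOnLine = digitalRead(IR_RIGHT) == LOW;

      if (leftOnLine && rightOnLine) {        // centered on the line: go straight
        analogWrite(MOTOR_LEFT_PWM, 150);
        analogWrite(MOTOR_RIGHT_PWM, 150);
      } else if (leftOnLine) {                // drifted right: slow the left wheel to turn left
        analogWrite(MOTOR_LEFT_PWM, 80);
        analogWrite(MOTOR_RIGHT_PWM, 150);
      } else if (rightOnLine) {               // drifted left: slow the right wheel to turn right
        analogWrite(MOTOR_LEFT_PWM, 150);
        analogWrite(MOTOR_RIGHT_PWM, 80);
      } else {                                // line lost: stop
        analogWrite(MOTOR_LEFT_PWM, 0);
        analogWrite(MOTOR_RIGHT_PWM, 0);
      }
    }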

Introduction

LiDAR SmartCAR was developed to support research on ICT convergence services using intelligent mobile robots and the training of highly skilled engineers. Equipped with a LiDAR sensor, it is an educational device for learning about LiDAR, a variety of sensors, autonomous driving, the Robot Operating System (ROS), and simultaneous localization and mapping (SLAM). Designed so that a smartphone or PC can serve as the robot’s brain for high-performance vision processing, it combines vision with data from its acceleration, magnetic, and gyroscope sensors as well as 12 ultrasonic sensors and 8 infrared sensors. It can be used to develop innovative autonomous navigation algorithms and application services for mobile robots.

Features

  • A mobile robot that drives autonomously using a LiDAR sensor. It includes collision avoidance and position tracking examples, so you can learn about ROS and SLAM (see the ROS sketch after this list).
  • With the integrated development environment, anyone can quickly and easily implement firmware for electronic device control. The Arduino IDE is based on the Processing/Wiring environment, which is effective for developing interactive objects, makes the microcontroller easy to operate, and allows easy programming over USB.
  • By supporting an ADK-based electronic device development environment (the Accessory Development Kit, Google’s platform for designing smart device peripherals), you can quickly and easily develop applications that work with smart devices running the Android platform.
  • With 12 ultrasonic sensors and 8 infrared sensors, the robot can avoid obstacles and carry out missions on a given route (see the obstacle avoidance sketch after this list). Built-in acceleration and gyroscope sensors make it possible to develop intelligent robots that travel autonomously by detecting and evaluating their own acceleration, vibration, shock, and motion.
  • Two of the four independently driven DC geared motors have built-in encoders that report the motor’s operating status, so the direction and speed of rotation can be calculated (see the encoder sketch after this list).
  • A built-in Bluetooth communication module enables remote control over the Serial Port Profile (SPP) from a PC, notebook, smartphone, tablet, or any other device that supports Bluetooth (see the remote control sketch after this list).
  • Smartphones and tablets can be used as the brain of the mobile robot, enabling mobile robot-based ICT convergence services that leverage the high-performance processors and Wi-Fi connectivity those devices provide.
  • We provide AndroX Studio™, an integrated development environment for Android-based robot image processing and high-end service development.
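
The collision avoidance and position tracking examples mentioned above ship with the product. As a rough sketch of how LiDAR data is typically consumed on the ROS side, the roscpp node below subscribes to a sensor_msgs/LaserScan topic and warns when any return comes closer than a threshold. The /scan topic name and the 0.3 m threshold are assumptions, not the product’s actual example code.

    // Minimal roscpp node: warn when the LiDAR sees an obstacle closer than 0.3 m.
    // The /scan topic name and the distance threshold are assumptions.
    #include <ros/ros.h>
    #include <sensor_msgs/LaserScan.h>

    void scanCallback(const sensor_msgs::LaserScan::ConstPtr& scan) {
      float nearest = scan->range_max;
      for (float r : scan->ranges) {
        if (r >= scan->range_min && r < nearest) {
          nearest = r;                       // track the closest valid return
        }
      }
      if (nearest < 0.3f) {
        ROS_WARN("Obstacle at %.2f m - stop or replan", nearest);
      }
    }

    int main(int argc, char** argv) {
      ros::init(argc, argv, "collision_monitor");
      ros::NodeHandle nh;
      ros::Subscriber sub = nh.subscribe("/scan", 10, scanCallback);
      ros::spin();                           // process incoming scans until shutdown
      return 0;
    }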
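
For ultrasonic obstacle avoidance, the Arduino-style sketch below polls a single HC-SR04-class ultrasonic sensor and stops the drive motors when something is closer than 20 cm. The real robot carries 12 such sensors; the pin numbers, threshold, and motor outputs here are assumptions for illustration only.

    // Minimal obstacle check with one ultrasonic sensor (pins are assumptions).
    const int TRIG_PIN = 8;            // ultrasonic trigger pin
    const int ECHO_PIN = 9;            // ultrasonic echo pin
    const int MOTOR_LEFT_PWM  = 5;
    const int MOTOR_RIGHT_PWM = 6;

    void setup() {
      pinMode(TRIG_PIN, OUTPUT);
      pinMode(ECHO_PIN, INPUT);
      pinMode(MOTOR_LEFT_PWM, OUTPUT);
      pinMode(MOTOR_RIGHT_PWM, OUTPUT);
    }

    long readDistanceCm() {
      digitalWrite(TRIG_PIN, LOW);
      delayMicroseconds(2);
      digitalWrite(TRIG_PIN, HIGH);    // 10 us pulse starts a measurement
      delayMicroseconds(10);
      digitalWrite(TRIG_PIN, LOW);
      long echoUs = pulseIn(ECHO_PIN, HIGH, 30000UL);  // time out after 30 ms
      return echoUs / 58;              // roughly 58 us of echo time per centimeter
    }

    void loop() {
      long distance = readDistanceCm();
      if (distance > 0 && distance < 20) {   // obstacle within 20 cm: stop
        analogWrite(MOTOR_LEFT_PWM, 0);
        analogWrite(MOTOR_RIGHT_PWM, 0);
      } else {                               // path clear: drive forward
        analogWrite(MOTOR_LEFT_PWM, 150);
        analogWrite(MOTOR_RIGHT_PWM, 150);
      }
      delay(50);
    }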
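
The motor encoders are the basis for speed and direction feedback. The sketch below shows the usual quadrature reading pattern: channel A triggers an interrupt, channel B gives the sign, and the count change over a fixed window yields RPM. The pin assignments and the counts-per-revolution value are assumptions and must be matched to the actual motors.

    // Quadrature encoder reading on one motor (pins and CPR are assumptions).
    const int ENC_A = 2;               // encoder channel A (interrupt-capable pin)
    const int ENC_B = 4;               // encoder channel B
    const long COUNTS_PER_REV = 360;   // encoder counts per wheel revolution (assumed)

    volatile long encoderCount = 0;

    void onEncoderA() {
      // On each rising edge of A, channel B tells us the direction of rotation.
      if (digitalRead(ENC_B) == HIGH) {
        encoderCount++;                // forward
      } else {
        encoderCount--;                // reverse
      }
    }

    void setup() {
      pinMode(ENC_A, INPUT_PULLUP);
      pinMode(ENC_B, INPUT_PULLUP);
      attachInterrupt(digitalPinToInterrupt(ENC_A), onEncoderA, RISING);
      Serial.begin(115200);
    }

    void loop() {
      static long lastCount = 0;
      noInterrupts();                  // read the shared counter atomically
      long count = encoderCount;
      interrupts();
      long delta = count - lastCount;  // counts accumulated in the last 100 ms
      lastCount = count;

      float rpm = (delta / (float)COUNTS_PER_REV) * 600.0f;  // 100 ms window -> x600 for RPM
      Serial.print("direction: ");
      Serial.print(delta >= 0 ? "forward" : "reverse");
      Serial.print("  speed: ");
      Serial.print(rpm);
      Serial.println(" rpm");
      delay(100);
    }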
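
Because the Bluetooth SPP link appears to the microcontroller as an ordinary serial stream, remote control usually reduces to parsing simple commands. The sketch below reacts to single-character drive commands; the serial port, baud rate, command letters, and drive() helper are assumptions rather than the product’s actual protocol.

    // Minimal SPP remote control: single-character drive commands over a serial
    // Bluetooth module (port, baud rate, and command set are assumptions).
    const int MOTOR_LEFT_PWM  = 5;
    const int MOTOR_RIGHT_PWM = 6;

    void drive(int leftSpeed, int rightSpeed) {
      analogWrite(MOTOR_LEFT_PWM, leftSpeed);
      analogWrite(MOTOR_RIGHT_PWM, rightSpeed);
    }

    void setup() {
      pinMode(MOTOR_LEFT_PWM, OUTPUT);
      pinMode(MOTOR_RIGHT_PWM, OUTPUT);
      Serial.begin(9600);              // Bluetooth module assumed wired to the hardware serial port
    }

    void loop() {
      if (Serial.available() > 0) {
        char command = Serial.read();
        switch (command) {
          case 'F': drive(150, 150); break;  // forward
          case 'L': drive(80, 150);  break;  // turn left
          case 'R': drive(150, 80);  break;  // turn right
          case 'S': drive(0, 0);     break;  // stop
          default:  break;                   // ignore unknown commands
        }
      }
    }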

Block Diagram

Integrated Development Environment AndroX Studio

Configuration and Name

Hardware Specifications

Software Specifications