A multi-disciplinary turnkey laboratory that can accelerate research, diversify teaching, and engage students from recruitment to graduation.
The Quanser Self-Driving Car Studio is the ideal platform for investigating a wide variety of self-driving topics in teaching and academic research in an accessible and relevant way. Use it to jump-start your research or give students authentic hands-on experience learning the essentials of self-driving. The studio brings you the tools and components you need to test and validate dataset generation, mapping, navigation, machine learning, and other advanced self-driving concepts at home or on campus.
Product Details
- Overview
- Features
- Specifications
- Workstation Configuration
At the center of the Self-Driving Car Research Studio is the QCar, an open-architecture scaled model vehicle powered by an NVIDIA® Jetson™ TX2 supercomputer and equipped with a wide range of sensors, cameras, encoders, and user-expandable IO.
Relying on a set of software tools including Simulink®, Python™, TensorFlow, and ROS, the studio enables researchers to build high-level applications and reconfigure low-level processes that are supported by pre-built modules and libraries. Using these building blocks, you can explore topics such as machine learning and artificial intelligence training, augmented/mixed reality, smart transportation, multi-vehicle scenarios and traffic management, cooperative autonomy, navigation, mapping and control, and more.
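As a concrete illustration of how these building blocks compose, here is a minimal lane-following loop in Python. The `QCarIO` object and its `read_csi_frame()` and `write_command()` methods are hypothetical placeholders for whichever low-level interface you actually use (QUARC, the Quanser APIs, or ROS topics), and the gains are arbitrary tuning values; only the OpenCV perception step relies on a real, documented library.

```python
# Minimal lane-following sketch (illustrative only).
# QCarIO, read_csi_frame(), and write_command() are hypothetical placeholders
# for the actual low-level interface (QUARC, Quanser APIs, or ROS topics).
import cv2
import numpy as np

STEERING_GAIN = 0.004   # rad per pixel of lateral offset (assumed tuning value)
CRUISE_THROTTLE = 0.08  # normalized throttle command (assumed tuning value)

def lane_offset_px(frame_bgr):
    """Estimate the lane-center offset (pixels) from a forward camera frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    _, mask = cv2.threshold(gray, 200, 255, cv2.THRESH_BINARY)  # bright lane markings
    roi = mask[int(mask.shape[0] * 0.6):, :]                    # look at the road just ahead
    cols = np.where(roi.any(axis=0))[0]
    if cols.size == 0:
        return 0.0                                              # no markings found: hold course
    lane_center = 0.5 * (cols.min() + cols.max())
    return lane_center - roi.shape[1] / 2.0

def control_loop(car):
    """Read a frame, compute steering from the lateral offset, and command the vehicle."""
    while True:
        frame = car.read_csi_frame()                 # hypothetical camera read
        steering = -STEERING_GAIN * lane_offset_px(frame)
        car.write_command(throttle=CRUISE_THROTTLE,  # hypothetical actuator write
                          steering=float(np.clip(steering, -0.5, 0.5)))
```

The same structure carries over to ROS, where the camera read becomes a topic subscription and the actuator write becomes a published command message.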
| Specification | Details |
|---|---|
| Dimensions | 39 x 21 x 21 cm |
| Weight (with batteries) | 2.7 kg |
| Power | 3S 11.1 V LiPo (3300 mAh) with XT60 connector |
| Operation time (approximate) | ~2 h 11 min (stationary, with sensor feedback); ~30 min (driving, with sensor feedback) |
| Onboard computer | NVIDIA® Jetson™ TX2 |
| | CPU: 1.2 GHz quad-core ARM Cortex-A57 64-bit + 1.2 GHz dual-core NVIDIA Denver2 64-bit |
| | GPU: 256-core NVIDIA Pascal™ architecture, 1.3 TFLOPS (FP16) |
| | Memory: 8 GB 128-bit LPDDR4 @ 1866 MHz, 59.7 GB/s |
| LIDAR | 2k–8k resolution, 10–15 Hz scan rate, 12 m range |
| Cameras | Intel D435 RGBD camera |
| | 360° 2D coverage from 4x CSI cameras with 160° FOV wide-angle lenses, 21–120 fps |
| Encoders | 720-count motor encoder (pre-gearing) with hardware digital tachometer |
| IMU | 9-axis IMU (gyroscope, accelerometer, magnetometer) |
| Safety features | Hardware "safe" shutdown button |
| | Auto power-off to protect batteries |
| Expandable IO | 2x SPI |
| | 4x I2C |
| | 40x GPIO (digital) |
| | 4x USB 3.0 ports |
| | 1x USB 2.0 OTG port |
| | 3x serial |
| | 4x additional encoders with hardware digital tachometer |
| | 4x unipolar analog inputs, 12-bit, 3.3 V |
| | 2x CAN bus |
| | 8x PWM (shared with GPIO) |
| Connectivity | Wi-Fi 802.11a/b/g/n/ac, 867 Mbps, with dual antennas |
| | 2x HDMI ports for dual-monitor support |
| | 1x 10/100/1000 BASE-T Ethernet |
| Additional QCar features | Headlights, brake lights, turn signals, and reverse lights (with intensity control) |
| | Dual microphones |
| | Speaker |
| | LCD diagnostic monitoring with battery voltage and custom text support |
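To show how a spec like the 720-count motor encoder is typically used, the short sketch below converts raw encoder counts into an approximate vehicle speed. The 720 counts/rev figure is taken from the table above; the gear ratio and wheel radius are placeholder assumptions, not published QCar values, and should be replaced with measurements from your own vehicle.

```python
# Convert motor encoder counts to approximate vehicle speed.
# 720 counts/rev (pre-gearing) is from the spec table; GEAR_RATIO and
# WHEEL_RADIUS_M are placeholder assumptions, not published QCar values.
import math

ENCODER_COUNTS_PER_REV = 720
GEAR_RATIO = 10.0          # motor revolutions per wheel revolution (assumed)
WHEEL_RADIUS_M = 0.033     # wheel radius in meters (assumed)

def vehicle_speed(counts_delta: int, dt_s: float) -> float:
    """Estimate forward speed (m/s) from encoder counts accumulated over dt_s seconds."""
    motor_rev = counts_delta / ENCODER_COUNTS_PER_REV
    wheel_rev = motor_rev / GEAR_RATIO
    return wheel_rev * 2.0 * math.pi * WHEEL_RADIUS_M / dt_s

# Example: 1800 counts in 0.1 s -> 2.5 motor rev -> 0.25 wheel rev -> ~0.52 m/s
```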
Vehicles
- QCar (single vehicle or vehicle fleet)
Ground Control Station
- High-performance computer with RTX graphics card with Tensor AI cores
- Three monitors
- High-performance router
- Wireless gamepad
- QUARC Autonomous license
Studio Space
- Set of reconfigurable floor panels with roadway patterns
- Set of traffic signs
Supported Software and APIs
- QUARC Autonomous Software License
- Quanser APIs
- TensorFlow
- TensorRT
- Python™ 2.7 & 3
- ROS 1 & 2
- CUDA®
- cuDNN
- OpenCV
- DeepStream SDK
- VisionWorks®
- VPI™
- GStreamer
- Jetson Multimedia APIs
- Docker containers with GPU support
- Simulink® with Simulink Coder
- Simulation and virtual training environments (Gazebo, QuanserSim)
- Multi-language development supported with Quanser Stream APIs for inter-process communication
- Unreal Engine
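Since the studio pairs ROS 1 & 2 with OpenCV and the other libraries listed above, a short ROS 2 node illustrates how those pieces typically fit together on the onboard computer. This is a generic sketch: the topic name `/qcar/csi_front/image_raw` is an assumed placeholder rather than a documented QCar topic, and the node does nothing QCar-specific beyond consuming camera images.

```python
# Minimal ROS 2 + OpenCV sketch: subscribe to a camera topic and run edge detection.
# The topic name '/qcar/csi_front/image_raw' is an assumed placeholder; use whatever
# topic your camera driver actually publishes.
import rclpy
from rclpy.node import Node
from sensor_msgs.msg import Image
from cv_bridge import CvBridge
import cv2

class EdgeViewer(Node):
    def __init__(self):
        super().__init__('edge_viewer')
        self.bridge = CvBridge()
        self.sub = self.create_subscription(
            Image, '/qcar/csi_front/image_raw', self.on_image, 10)

    def on_image(self, msg: Image):
        # Convert the ROS image message to an OpenCV frame and run a simple perception stage.
        frame = self.bridge.imgmsg_to_cv2(msg, desired_encoding='bgr8')
        edges = cv2.Canny(frame, 100, 200)
        self.get_logger().info(f'edge pixels: {int((edges > 0).sum())}')

def main():
    rclpy.init()
    rclpy.spin(EdgeViewer())
    rclpy.shutdown()

if __name__ == '__main__':
    main()
```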
Get Started Right Now
Even if you just have an idea or question, we are more than happy to connect with you.
Contact Us