Sensor Taxonomy: Proprioception vs. Exteroception
Robot sensors divide into two fundamental categories. Understanding this split is the first step in building your sensor suite because each category answers a different question: "What is the robot doing?" versus "What is the world doing?"
Proprioceptive sensors measure the robot's internal state -- joint angles, joint velocities, motor currents, end-effector forces. They tell the controller where the robot is and how much effort it is exerting. Without proprioception, no closed-loop control is possible.
Exteroceptive sensors measure the external environment -- object positions, surface geometry, proximity, temperature, sound. They enable perception-driven behaviors like obstacle avoidance, object grasping, and scene understanding.
| Category | Sensor Type | Measures | Typical Update Rate | Price Range |
|---|---|---|---|---|
| Proprioceptive | Encoder (incremental) | Joint angle, velocity | 1-10 kHz | $15-$200 |
| Proprioceptive | Encoder (absolute) | Joint angle (no homing) | 1-10 kHz | $50-$500 |
| Proprioceptive | IMU | Orientation, angular rate, acceleration | 100-8000 Hz | $2-$2,000 |
| Proprioceptive | Force/Torque (F/T) | 6-axis wrench at wrist | 100-7000 Hz | $1,500-$12,000 |
| Proprioceptive | Motor current sensor | Torque estimate (via current) | 1-10 kHz | Built-in |
| Exteroceptive | RGB Camera | 2D color image | 30-160 Hz | $50-$2,000 |
| Exteroceptive | Depth Camera (RGB-D) | 3D point cloud + color | 30-90 Hz | $200-$1,500 |
| Exteroceptive | 2D LiDAR | Planar distance scan | 5-40 Hz | $100-$1,200 |
| Exteroceptive | 3D LiDAR | 3D point cloud (long range) | 10-20 Hz | $400-$75,000 |
| Exteroceptive | Tactile Sensor | Contact pressure, slip, texture | 30-1000 Hz | $200-$5,000 |
| Exteroceptive | Proximity (ToF/IR) | Distance to nearest object | 50-100 Hz | $5-$50 |
Encoders: The Foundation of Robot Control
Every servo-driven robot joint requires an encoder. Encoders convert mechanical rotation into digital position data that the motor controller uses for closed-loop position and velocity control. Without accurate encoder feedback, the position and velocity loops cannot close and the joint cannot track a commanded trajectory.
Incremental vs. Absolute Encoders
Incremental encoders output pulse trains (A/B quadrature signals) as the shaft rotates. The controller counts pulses from a known reference (home position) to determine angle. After power loss, the robot must re-home -- driving each joint to a limit switch or index pulse. Common choices:
- Broadcom AEDT-9810 (optical): 5000 CPR (20,000 counts/rev after quadrature decoding), $45. Standard for research arms.
- CUI AMT102 (capacitive): Configurable 48-4096 CPR, $30. Compact, no alignment needed, good for custom builds like OpenArm.
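To make the counts-per-revolution arithmetic concrete, here is a minimal Python sketch of quadrature decoding and count-to-angle conversion. The 5000 CPR value matches the optical encoder above; the decode table is the generic A/B Gray-code sequence, so check your encoder's datasheet for its direction convention.

```python
# Sketch: quadrature decoding and count-to-angle conversion.
CPR = 5000                    # encoder lines per revolution (e.g. a 5000 CPR disk)
COUNTS_PER_REV = CPR * 4      # 4x decoding: every A/B edge is counted

# (prev_AB, new_AB) -> count delta; any other pair means a missed edge.
_TRANSITIONS = {
    (0b00, 0b01): +1, (0b01, 0b11): +1, (0b11, 0b10): +1, (0b10, 0b00): +1,
    (0b00, 0b10): -1, (0b10, 0b11): -1, (0b11, 0b01): -1, (0b01, 0b00): -1,
}

def decode_step(prev_ab: int, new_ab: int) -> int:
    """Return +1 (one direction), -1 (the other), or 0 (no/invalid change)."""
    return _TRANSITIONS.get((prev_ab, new_ab), 0)

def counts_to_angle_deg(counts: int) -> float:
    """Convert an accumulated quadrature count to a shaft angle in degrees."""
    return 360.0 * counts / COUNTS_PER_REV

print(counts_to_angle_deg(20_000))   # one full revolution -> 360.0
print(360.0 / COUNTS_PER_REV)        # resolution -> 0.018 deg per count
```

This is why a "5000 CPR" encoder is quoted as 20,000 counts/rev: the controller counts all four signal edges per line.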
Absolute encoders report the true shaft angle immediately on power-up, eliminating homing. They use either optical code disks or magnetic Hall-effect sensing. More expensive but essential for collaborative robots where the arm could be manually moved while powered off.
- RLS Orbis (magnetic, rotary): 14-bit (16,384 positions per revolution = 0.022 degree resolution), SPI/SSI output, $85. Used in many cobot joints.
- US Digital MA3 (magnetic): 12-bit, analog or PWM output, $55. Simple integration for prototype arms.
- Renishaw RESOLUTE (optical, linear/rotary): 32-bit, 1 nm resolution, $800+. Industrial-grade for CNC and high-precision robotics.
Decision rule: Use absolute encoders for any arm that will be deployed in a collaborative or unattended setting. Use incremental encoders when cost is the primary constraint and homing on startup is acceptable (research prototypes, competition robots).
Inertial Measurement Units (IMUs)
An IMU combines accelerometers and gyroscopes (and sometimes magnetometers) in a single package. In robotics, IMUs serve three roles: base orientation estimation for legged/mobile robots, vibration monitoring for predictive maintenance, and complementary sensing for SLAM when fused with visual or LiDAR odometry.
IMU Comparison Table
| Model | Accel Range | Gyro Range | Gyro Bias Stability | ODR (max) | Interface | Price | Best For |
|---|---|---|---|---|---|---|---|
| Bosch BMI088 | ±24g | ±2000°/s | ~2°/hr | 1600 Hz | SPI/I2C | $8 | Drones, legged robots |
| TDK ICM-42688-P | ±16g | ±2000°/s | ~1.5°/hr | 8000 Hz | SPI/I2C | $6 | VIO/SLAM, high-rate control |
| Analog Devices ADIS16470 | ±40g | ±2000°/s | ~0.36°/hr | 2000 Hz | SPI | $180 | Industrial robots, precise dead-reckoning |
| VectorNav VN-100 | ±16g | ±2000°/s | ~0.5°/hr | 800 Hz | SPI/UART/USB | $800 | Navigation-grade AHRS, AGVs |
| Unitree G1 built-in | ±16g | ±2000°/s | ~3°/hr | 400 Hz | SDK | Included | Humanoid balance control |
Key specification: gyro bias stability. This measures how much the gyroscope output drifts over time with no rotation. A roughly constant 2 deg/hr bias integrates to about 2 degrees of heading error per hour of uncorrected dead-reckoning. For mobile robots running SLAM, aim for <1 deg/hr. For arm-mounted vibration monitoring, the BMI088 at $8 is sufficient.
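The arithmetic behind that rule of thumb is simple. A minimal sketch, assuming the bias stays roughly constant over the interval (real bias wanders, so this is an order-of-magnitude estimate):

```python
# Sketch: translating a gyro bias-stability spec into expected heading error.
def heading_drift_deg(bias_deg_per_hr: float, minutes: float) -> float:
    """Heading error accumulated by integrating a constant gyro bias."""
    return bias_deg_per_hr * minutes / 60.0

# After 30 minutes of uncorrected integration:
print(heading_drift_deg(2.0, 30))    # BMI088-class (~2 deg/hr) -> 1.0 deg
print(heading_drift_deg(0.36, 30))   # ADIS16470-class (~0.36 deg/hr)
```

Half an hour between loop closures is enough for a consumer-grade IMU to drift a full degree, which is why SLAM pipelines correct orientation with exteroceptive updates rather than trusting the gyro alone.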
The Unitree G1 humanoid (available for lease from SVRC at $2,500/mo) includes a built-in IMU accessible via the Unitree SDK that provides pre-filtered orientation quaternions for balance control.
Force/Torque Sensors
A 6-axis force/torque (F/T) sensor mounted between the robot's wrist flange and the gripper measures interaction forces and torques in all directions. This enables force-controlled tasks like polishing (constant force), peg-in-hole insertion (compliance), and safe human interaction (force limiting).
F/T Sensor Comparison
| Sensor | Force Range (Fx/Fy) | Torque Range (Tz) | Resolution | Rate | Interface | Price |
|---|---|---|---|---|---|---|
| ATI Gamma | ±65 N | ±5 Nm | 1/80 N | 7000 Hz | EtherCAT/Ethernet | $8,000-$12,000 |
| ATI Mini45 | ±290 N | ±10 Nm | 1/16 N | 7000 Hz | EtherCAT/Ethernet | $6,000-$9,000 |
| OnRobot HEX-E | ±200 N | ±10 Nm | 0.2 N | 100 Hz | USB/Tool I/O | $3,500-$5,000 |
| Bota SensONE | ±1000 N | ±40 Nm | 0.3 N | 800 Hz | EtherCAT/USB | $2,500-$4,000 |
| Robotous RFT40 | ±200 N | ±6 Nm | 0.1 N | 1000 Hz | RS485/USB | $1,500-$2,500 |
ATI Gamma is the gold standard in research labs. Its 7 kHz sample rate and sub-Newton resolution enable high-bandwidth force control loops. However, it requires a dedicated NetBox DAQ ($2,000+) and an EtherCAT master, adding integration complexity.
OnRobot HEX-E is significantly easier to integrate -- it plugs directly into UR, ABB, and Fanuc cobots via built-in tool I/O adapters. The 100 Hz sample rate is sufficient for assembly and polishing tasks but too slow for high-bandwidth impedance control.
Budget alternative: If your application only needs collision detection (not force control), use motor current-based torque estimation. The OpenArm 101 provides joint torque estimates from motor current at no additional sensor cost, sufficient for simple contact detection and gravity compensation.
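A minimal sketch of current-based torque estimation and contact detection. The motor constant, gear ratio, and thresholds below are illustrative placeholders, not actual OpenArm 101 parameters:

```python
# Sketch: joint torque estimate from measured motor current.
# All constants are illustrative; take them from your motor/gearbox datasheet.
K_T = 0.068        # motor torque constant, Nm/A
GEAR_RATIO = 9.0   # gearbox reduction
EFFICIENCY = 0.85  # rough gearbox efficiency

def joint_torque_nm(current_a: float) -> float:
    """Estimate output torque at the joint from motor phase current."""
    return K_T * current_a * GEAR_RATIO * EFFICIENCY

def contact_detected(current_a: float, expected_torque_nm: float,
                     threshold_nm: float = 0.5) -> bool:
    """Flag contact when the estimate deviates from the model prediction
    (e.g. gravity + friction model) by more than a tuned threshold."""
    return abs(joint_torque_nm(current_a) - expected_torque_nm) > threshold_nm
```

The estimate is blind to friction and backlash downstream of the motor, which is why this approach suffices for collision detection but not for fine force control.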
Vision Sensors: Cameras, Depth, and LiDAR
Vision is the richest exteroceptive modality. For a deep dive into camera selection, see our dedicated Robot Vision Systems Guide. Here is a summary of the decision framework:
Camera Selection by Application
- Manipulation / data collection: GigE Vision camera (Basler ace2 a2A1920, $650) with hardware trigger for deterministic frame delivery. See our Camera Setup for Teleoperation guide for the full 3-camera configuration.
- Mobile robot navigation: Stereo camera (Intel RealSense D455, $350) or 2D LiDAR (RPLIDAR A1, $100) for SLAM. Add a 3D LiDAR (Ouster OS0-128, $3,500) for outdoor or high-speed environments.
- Quality inspection: Machine vision camera (FLIR Blackfly S, $500-$1,500) with fixed focal length lens and ring light for consistent imaging conditions.
- Budget research: USB webcam (Logitech BRIO, $200) is acceptable for single-camera setups at <15 fps where latency jitter is tolerable.
Depth Sensing Technologies
Structured light (original Microsoft Kinect, Intel RealSense SR305): Projects a known IR pattern and computes depth from the pattern's distortion. Works indoors, struggles in sunlight and with reflective surfaces. Range 0.3-3 m, suited to manipulation tasks.
Time-of-flight (ToF) (Azure Kinect DK ToF mode, PMD Flexx2): Measures photon round-trip time. Faster than structured light, less susceptible to texture-less surfaces. Range up to 5 m indoors.
Stereo vision (RealSense D435/D455, ZED 2i): Two calibrated cameras compute disparity. The RealSense models are active stereo, adding an IR projector so texture-less surfaces still yield matches indoors; passive stereo degrades on texture-less surfaces. Works outdoors. Range up to 20 m with a 12 cm baseline.
LiDAR (Velodyne VLP-16, Ouster OS0): Laser-based time-of-flight with mechanical or solid-state scanning. Range 10-100+ m, works in all lighting. Essential for outdoor mobile robots and autonomous vehicles.
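For stereo, the disparity-to-depth relationship explains both the range figures and why accuracy falls off with distance. A sketch using an assumed 700 px focal length and a 12 cm baseline (illustrative numbers, not exact intrinsics of any camera above):

```python
# Sketch: pinhole stereo depth and its first-order error growth with range.
def depth_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth from disparity: Z = f * B / d."""
    return focal_px * baseline_m / disparity_px

def depth_error_m(z_m: float, focal_px: float, baseline_m: float,
                  disp_err_px: float) -> float:
    """Depth error for a given disparity error: dZ = Z^2 * dd / (f * B)."""
    return z_m ** 2 * disp_err_px / (focal_px * baseline_m)

print(depth_m(700.0, 0.12, 4.2))               # 20 m at ~4 px disparity
print(depth_error_m(1.0, 700.0, 0.12, 0.25))   # ~3 mm at 1 m
print(depth_error_m(20.0, 700.0, 0.12, 0.25))  # ~1.2 m at 20 m
```

Because depth error grows with the square of range, a quarter-pixel matching error that is negligible at arm's length becomes meter-scale at 20 m, which is why long-range outdoor sensing pushes you toward LiDAR.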
Tactile Sensors for Manipulation
Tactile sensors are the fastest-growing sensor category in robotics. They provide contact information that cameras cannot: grip force magnitude, contact geometry at sub-millimeter resolution, slip detection, and surface texture classification. For detailed product comparisons, see our Tactile Sensor Comparison guide.
When to Add Tactile Sensing
- Required: Handling fragile objects (fruit, electronics, glass), deformable objects (fabric, cables), or objects with uncertain weight/friction.
- Recommended: Any dexterous manipulation task where grip success rate needs to exceed 95%. Tactile feedback enables reactive grasp adjustment that vision alone cannot provide.
- Optional: Simple pick-and-place of rigid, known objects where a parallel-jaw gripper with a single binary contact switch is sufficient.
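As a rough illustration of the reactive grasp adjustment mentioned above: slip typically appears as a sudden change in total contact pressure between consecutive taxel frames. A toy detector, with an assumed threshold that would need tuning per sensor:

```python
# Sketch: naive slip detection over a sequence of tactile taxel frames.
# Each frame is a 2-D grid (list of rows) of pressure values.
def total_pressure(frame):
    """Sum pressure over all taxels in one frame."""
    return sum(sum(row) for row in frame)

def slip_detected(frames, threshold=0.15):
    """Flag slip when frame-to-frame total pressure changes by more than
    `threshold` (a fraction of current grip force; an assumed tuning value)."""
    for prev, cur in zip(frames, frames[1:]):
        p_prev, p_cur = total_pressure(prev), total_pressure(cur)
        if p_prev > 0 and abs(p_cur - p_prev) / p_prev > threshold:
            return True
    return False
```

Production-grade detectors use the spatial pattern and high-frequency vibration content of the array, not just the total, but the control-loop structure is the same: detect, then tighten the grip.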
SVRC stocks Paxini tactile sensor arrays that integrate with the OpenArm and Orca Hand. Paxini sensors provide 16x16 taxel arrays at 100 Hz with ROS2 drivers, priced at approximately $800 per fingertip unit. Contact our team for evaluation units.
Sensor Fusion: Combining Multiple Modalities
No single sensor provides complete state information. Sensor fusion combines data from multiple sensors to produce more accurate and robust estimates than any individual sensor alone. The two most common frameworks in robotics are the Kalman filter and the particle filter.
Extended Kalman Filter (EKF) Basics
The EKF is the workhorse of sensor fusion in robotics. It maintains a probabilistic estimate of the robot state (position, velocity, orientation) and updates it whenever a new sensor measurement arrives. The key idea: each sensor has different noise characteristics, and the EKF optimally weights them based on their reliability.
A typical mobile robot EKF fuses:
- IMU (prediction step): High rate (200+ Hz), provides angular velocity and acceleration. Integrates to estimate position/orientation but drifts over seconds.
- Wheel odometry (update step): 50-100 Hz, provides velocity. Accurate short-term but suffers from wheel slip.
- Visual odometry or LiDAR scan matching (update step): 10-30 Hz, provides absolute position correction. Eliminates drift but computationally expensive and intermittent.
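To make the predict/update cycle concrete, here is a 1-D linear Kalman filter, the special case the EKF reduces to when the models are linear. IMU acceleration drives the high-rate prediction; an intermittent position fix (standing in for scan matching or visual odometry) corrects the accumulated drift. All noise values are illustrative:

```python
# Sketch: 1-D Kalman filter, state = [position, velocity].
import numpy as np

class Kf1D:
    def __init__(self, dt):
        self.x = np.zeros(2)                  # state estimate [pos, vel]
        self.P = np.eye(2)                    # state covariance
        self.F = np.array([[1, dt], [0, 1]])  # constant-velocity motion model
        self.B = np.array([0.5 * dt**2, dt])  # how acceleration enters the state
        self.Q = 1e-3 * np.eye(2)             # process noise (tuned)
        self.H = np.array([[1.0, 0.0]])       # we observe position only
        self.R = np.array([[0.05]])           # measurement noise

    def predict(self, accel):
        """High-rate step driven by IMU acceleration."""
        self.x = self.F @ self.x + self.B * accel
        self.P = self.F @ self.P @ self.F.T + self.Q

    def update(self, pos_meas):
        """Low-rate correction from an absolute position measurement."""
        y = pos_meas - self.H @ self.x                # innovation
        S = self.H @ self.P @ self.H.T + self.R       # innovation covariance
        K = self.P @ self.H.T @ np.linalg.inv(S)      # Kalman gain
        self.x = self.x + (K @ y).ravel()
        self.P = (np.eye(2) - K @ self.H) @ self.P

kf = Kf1D(dt=0.005)           # 200 Hz IMU
for _ in range(200):          # 1 s of pure prediction
    kf.predict(accel=1.0)     # constant 1 m/s^2
kf.update(pos_meas=0.5)       # position fix anchors the drifting estimate
```

The EKF used in practice follows exactly this structure, but linearizes nonlinear motion and measurement models (e.g. orientation on SO(3)) around the current estimate at every step.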
In ROS2, the standard sensor fusion package is robot_localization, which implements both EKF and UKF (Unscented Kalman Filter) nodes:
```bash
# Launch the EKF node for IMU + odometry fusion
ros2 launch robot_localization ekf.launch.py
```

Configure which state components each sensor contributes in ekf.yaml:

```yaml
odom0: /wheel/odometry    # provides x, y, yaw velocity
imu0: /imu/data           # provides roll, pitch, yaw, angular vel, linear accel
odom0_config: [false, false, false,   # x, y, z position
               false, false, false,   # roll, pitch, yaw
               true,  true,  false,   # x, y, z velocity
               false, false, true,    # roll, pitch, yaw velocity
               false, false, false]   # x, y, z acceleration
```
Sensor Fusion for Manipulation
For arm manipulation, sensor fusion often combines:
- Joint encoders + F/T sensor: Encoder positions feed forward kinematics; F/T sensor provides wrench at the end-effector. Together they enable impedance control -- the robot behaves like a spring-damper system, compliant in force while tracking position.
- Vision + tactile: Camera detects object pose for approach; tactile sensor provides closed-loop feedback during grasp and manipulation. This is the architecture used in SVRC's data collection pipelines for learning contact-rich policies.
- Vision + F/T + proprioception: The full multi-modal stack for complex tasks like cable routing, food preparation, or assembly. Requires careful timestamp synchronization -- see our camera setup guide for synchronization methods.
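A sketch of the encoder + F/T combination for impedance-style control: a task-space spring-damper law, a crude force limit driven by the F/T reading, and the Jacobian-transpose mapping to joint torques. All gains, limits, and the Jacobian are illustrative, and a real controller would add gravity compensation:

```python
# Sketch: task-space impedance law with an F/T-based force limit.
import numpy as np

def impedance_wrench(x_des, x, xdot, K, D):
    """Spring-damper wrench toward x_des: w = K (x_des - x) - D xdot."""
    return K @ (x_des - x) - D @ xdot

def force_limit(wrench_cmd, f_measured, f_max=15.0):
    """Scale the commanded wrench down when the F/T sensor reports a
    contact force above f_max (a simple safety layer; limit is assumed)."""
    mag = np.linalg.norm(f_measured)
    return wrench_cmd if mag <= f_max else wrench_cmd * (f_max / mag)

def joint_torques(jacobian, wrench):
    """Map the task-space wrench to joint torques: tau = J^T w."""
    return jacobian.T @ wrench

# Example: 10 cm position error along x with stiffness 100 N/m -> 10 N pull.
K = np.diag([100.0, 100.0])   # task-space stiffness, N/m
D = np.diag([20.0, 20.0])     # task-space damping, Ns/m
w = impedance_wrench(np.array([0.1, 0.0]), np.zeros(2), np.zeros(2), K, D)
tau = joint_torques(np.eye(2), force_limit(w, np.array([5.0, 0.0])))
```

The stiffness matrix K is the knob: low stiffness along the contact direction gives compliance for insertion, high stiffness elsewhere keeps tracking accuracy.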
Sensor Selection by Application
Use the following decision matrix to determine which sensors your specific application requires. "Required" means the application will not function without it. "Recommended" means significant performance improvement. "Optional" means nice-to-have.
| Application | Encoders | IMU | F/T | Camera | LiDAR | Tactile | Est. Sensor Budget |
|---|---|---|---|---|---|---|---|
| Pick-and-place (rigid) | Required | -- | Optional | Required | -- | -- | $500-$1,500 |
| Assembly/insertion | Required | -- | Required | Recommended | -- | Recommended | $3,000-$8,000 |
| Deformable handling | Required | -- | Required | Required | -- | Required | $5,000-$12,000 |
| Indoor mobile robot | Required | Required | -- | Recommended | Required | -- | $500-$2,000 |
| Outdoor AMR / delivery | Required | Required | -- | Required | Required | -- | $4,000-$15,000 |
| Humanoid (Unitree G1) | Built-in | Built-in | Recommended | Built-in | Optional | Recommended | $1,000-$5,000 (add-ons) |
| Data collection (IL) | Required | Optional | Recommended | Required | -- | Recommended | $2,000-$6,000 |
ROS2 Sensor Integration Quick Reference
Most sensors publish data on standardized ROS2 message types. Knowing which message type to expect simplifies integration and lets you use existing visualization and processing tools.
| Sensor | ROS2 Message Type | Typical Topic Name | Key Driver Package |
|---|---|---|---|
| RGB Camera | sensor_msgs/Image | /camera/color/image_raw | usb_cam, pylon_ros2 |
| Depth Camera | sensor_msgs/PointCloud2 | /camera/depth/points | realsense2_camera |
| IMU | sensor_msgs/Imu | /imu/data | imu_filter_madgwick |
| F/T Sensor | geometry_msgs/WrenchStamped | /ft_sensor/wrench | ros2_ati_netft |
| 2D LiDAR | sensor_msgs/LaserScan | /scan | rplidar_ros, sllidar_ros2 |
| 3D LiDAR | sensor_msgs/PointCloud2 | /lidar/points | ouster_ros, velodyne |
| Joint State | sensor_msgs/JointState | /joint_states | ros2_control |
| Tactile Array | sensor_msgs/Image (pressure map) | /tactile/pressure_image | paxini_ros2 |