E&RT Systems Public Page
CSCI P545  Computer Science Department, School of Informatics, Indiana University Thu Jun 23 14:27:04 EDT 2011 [SDJ]

P545 Embedded & Real-Time Systems – ERTS

The ERTS robotic vehicle is developed by and for our Embedded & Real-Time Systems course. Its overall functional goal is autonomous real-world navigation, but the real mission of the ERTS project is to serve as a platform for collaborative research in areas ranging from cognitive robotics (e.g. human-robot interaction) to safety-critical systems (e.g. formal implementation synthesis), as well as other focus areas (e.g. programming languages, robotic vision) at Indiana University.

The ERTS mission demands a great deal of flexibility, relative simplicity, and open visibility. In cognitive robotics, for example, we are deploying large existing experimental systems for human-robot interaction. Typically, these systems regard the vehicle as a black-box object with relatively high-level functionality. Often, researchers also need project-specific sensory capabilities. We need to configure ERTS and integrate research systems very quickly in order to encourage early engagement.

In contrast, embedded systems research may delve deeply into the underlying architecture, down to the level of fundamental software and hardware components. As dictated by the research objective, ERTS must be configured "vertically" to instantiate an abstract design model while exposing underlying subsystems and hardware resources.

Most students and researchers have minimal experience with embedded technologies, yet they have only a few months to reach a productive level of development. For this reason—and because we have a bottom-up teaching philosophy—we prefer to postpone the use of complex tool chains, specialized design environments, and language-specific interfaces. Because the core ERTS development group is small, we must also keep the ERTS code base small and generic.

One avenue we are exploring to address these challenges is a homogeneous resource-access interface through the underlying file system. The file-space "API" exploits the operating system's directory services to bind resources in its global, hierarchical name space. The result is a shared, language-independent image region through which ERTS components communicate. It is hardly a new idea, arising in its purest form in Bell Labs' Plan 9 operating system.
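
As an illustration of the file-space access pattern, the C sketch below reads a GPS fix and issues a throttle command by opening ordinary files. The pathnames /erts/gps/position and /erts/drive/throttle are invented for the example and do not correspond to the actual ERTS name space; the point is only that, because the interface is the file system itself, the same pattern works from C, Scheme, Java, shell scripts, or any other language.

    /* Minimal sketch of the file-space access pattern.  The paths below
     * are hypothetical illustrations, not the actual ERTS mount points. */
    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        double lat = 0.0, lon = 0.0;

        /* Read the current GPS fix by reading an ordinary file. */
        FILE *gps = fopen("/erts/gps/position", "r");
        if (gps == NULL) {
            perror("gps");
            return EXIT_FAILURE;
        }
        if (fscanf(gps, "%lf %lf", &lat, &lon) != 2) {
            fprintf(stderr, "malformed GPS record\n");
            fclose(gps);
            return EXIT_FAILURE;
        }
        fclose(gps);

        /* Command the vehicle by writing an ordinary file. */
        FILE *throttle = fopen("/erts/drive/throttle", "w");
        if (throttle == NULL) {
            perror("throttle");
            return EXIT_FAILURE;
        }
        fprintf(throttle, "%d\n", 20);   /* e.g., 20 percent throttle */
        fclose(throttle);

        printf("position: %f %f\n", lat, lon);
        return EXIT_SUCCESS;
    }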

On the systems side, we are exploring tools and methods for synchronous, or "time-triggered," software design. This aspect of our research is also reflected in the file system interface through the Synchronous-Sequential File System (SySeFiS), a globally synchronized mode for networked file transactions.
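
To make the time-triggered style concrete, the following sketch shows the fixed-period read-compute-write loop that a globally synchronized file interface is intended to support. The 100 ms frame and the empty step() body are placeholders chosen for illustration; this is a sketch of the discipline, not the SySeFiS implementation.

    /* Sketch of a time-triggered read-compute-write loop.  The 100 ms
     * period and the empty step() are placeholders, not SySeFiS itself. */
    #define _POSIX_C_SOURCE 200112L
    #include <time.h>

    #define PERIOD_NS 100000000L        /* 100 ms frame */

    static void step(void)
    {
        /* Read all inputs for this frame, compute, write all outputs.
         * In ERTS these reads and writes would go through the file
         * interface described above. */
    }

    int main(void)
    {
        struct timespec next;
        clock_gettime(CLOCK_MONOTONIC, &next);

        for (;;) {
            step();

            /* Advance to the next frame boundary and sleep until it,
             * so every iteration starts on the same fixed schedule. */
            next.tv_nsec += PERIOD_NS;
            if (next.tv_nsec >= 1000000000L) {
                next.tv_nsec -= 1000000000L;
                next.tv_sec += 1;
            }
            clock_nanosleep(CLOCK_MONOTONIC, TIMER_ABSTIME, &next, NULL);
        }
        return 0;
    }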

Information, Announcements, Links
Photo
[HTM] Photo Gallery
[HTM] ERTS Test Field, April 3, 2007
Video
[MP4] (5 Mb) Drive-by-voice simulation (Nov. 27, 2011)
(Shenshen Han)

Shenshen Han's P545 project, a prototype drive-by-voice component. A higher resolution version is posted on YouTube.

[WMV] (21 Mb) Drive-by-voice demonstration (Apr. 14, 2011)
(Paul W. Schermerhorn)

The control architecture shown in the demo was a variant of the DIARC robot architecture, implemented in the ADE distributed agent infrastructure. ADE provides mechanisms to facilitate communication and reliability, and it accesses the GPS, compass, and vehicle control components via ERTS's SyncFS file system. DIARC and ADE were developed at the HRI Labs at IU and Tufts and by Thinking Robots, Inc.

The robot architecture was run on two laptop computers, one on the cart and one with the operator, communicating via a standard wifi network. The main DIARC components used here were:

  • a dialogue management component that maintains models of each agent participating in the dialogue, including belief-tracking mechanisms that allow it to detect and correct inconsistencies in its own and others' beliefs (e.g., deducing that the operator must believe that the transport had a goal at the start of the demo)
  • an action management component responsible for dispatching and monitoring the actions undertaken in pursuit of the agent's goals
  • a navigation component that uses the heading and location readings obtained by the cart interface component to determine any adjustments to throttle and steering radius that are required to make progress toward the goal location (the bearing/steering geometry is sketched just after this list)
  • speech input and output components that provide text of the operator's speech to the dialogue component and translate the text generated by the dialogue component into audible speech
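
The geometry involved in that navigation step is roughly as sketched below: compute the bearing from the current GPS position to the goal waypoint, fold its difference from the compass heading into a signed error, and map the error to a steering correction. The coordinates, gain, and limits are made-up illustration values; this is not the DIARC navigation component's code.

    /* Rough sketch of the bearing/steering geometry described above.
     * Illustration only; not the DIARC navigation component. */
    #include <math.h>
    #include <stdio.h>

    #ifndef M_PI
    #define M_PI 3.14159265358979323846
    #endif

    /* Initial bearing (degrees clockwise from north) from point 1 to point 2. */
    static double bearing_deg(double lat1, double lon1, double lat2, double lon2)
    {
        double p1 = lat1 * M_PI / 180.0, p2 = lat2 * M_PI / 180.0;
        double dl = (lon2 - lon1) * M_PI / 180.0;
        double y  = sin(dl) * cos(p2);
        double x  = cos(p1) * sin(p2) - sin(p1) * cos(p2) * cos(dl);
        return fmod(atan2(y, x) * 180.0 / M_PI + 360.0, 360.0);
    }

    int main(void)
    {
        /* Hypothetical current position, compass heading, and goal waypoint. */
        double lat = 39.1720, lon = -86.5230, heading = 90.0;
        double goal_lat = 39.1725, goal_lon = -86.5220;

        double target = bearing_deg(lat, lon, goal_lat, goal_lon);

        /* Heading error folded into [-180, 180); positive means steer right. */
        double error = fmod(target - heading + 540.0, 360.0) - 180.0;

        /* Simple proportional steering command, clamped to +/- 30 degrees. */
        double steer = fmax(-30.0, fmin(30.0, 0.5 * error));

        printf("bearing %.1f  error %.1f  steer %.1f\n", target, error, steer);
        return 0;
    }
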
[AVI] (5.3 Mb) Simulation, GPS following with basic obstacle avoidance (Dec. 17, 2008)
[MP4] (13 Mb) GPS tour, high perspective (Nov. 9, 2007)
[MP4] (14 Mb) GPS tour, ground-level (Nov. 9, 2007)
[MP4] (12 Mb) GPS tour w/ safety-driver (Nov. 9, 2007)
[MP4] (6 Mb) Vehicle walk-around (Nov. 9, 2007)
Info
[HTM] IUCS Autonomous Golf Cart Project repository
[PDF] ERTS Architecture [B. Himebaugh, 2008]
[PDF] Poster, SOI Open House, September 2008
[PDF] CVE'09 presentation, October 2008
[PDF] IRVMCS'09 presentation, June 2009
Field Trials
[HTM] Field Trial Safety Rules
[HTM] 2009 Class Project Field Trials, December 7, 2009
[HTM] 2008 Class Project Field Trials, December 17, 2008
[HTM] 2007 Class Project Field Trials, April 27, 2007

About P545
CSCI P545 Catalog Description. Design and implementation of purpose-specific, locally distributed software systems. Models and methods for time-critical applications. Real-time operating systems. Testing, validation, and verification. Safety-critical design. Related topics, such as resiliency, synchronization, sensor fusion, etc. Lecture and laboratory.
Laboratory Description. The course laboratory is a golf car modified for computer control and under development to serve as a research platform. The class project goals include:
  • implementing frameworks for autonomous vehicle navigation, for instance using the Global Positioning System (GPS);
  • implementing frameworks for local tactical guidance, such as obstacle avoidance;
  • exploring embedded-system design methods and configurable architectures in support of advanced robotics research;
  • engaging others working in areas such as vision, artificial intelligence, situated cognition, and learning.
ERTS Vehicle Description.
  • Modified EZGO® golf car
  • Computer-controlled actuators; all mechanical controls intact
  • On-board LAN of Linux compute nodes
  • SySeFiS Synchronous-Sequential File System interface
  • TCP/IP bridge to higher-level experimental systems
  • Language-independent prototype development
  • Configurable sensor data network: GPS, compass, SICK LMS-100 laser ranging sensor, IMUs, vision, etc.
Prior course projects and development goals
  • Fall 2009: see the Year of ERTS announcement.
  • Fall 2008
    • Conversion from a client/server interface to a file-system interface
    • Re-implementation of GPS following (4 weeks)
    • Obstacle detection using short-range laser range-finder
    • Preliminary obstacle avoidance Scheme API.
  • Fall 2007
    • Refinement of GPS navigation. Path planning.
    • Architecture. On-board LAN. Sensor platform/networking.
  • Spring 2007
    • System development & testing. Synchronous communication infrastructure.
    • Basic GPS navigation. Point-to-point waypoint following; turning maneuvers; situation-based speed control.


©2009 Steven D. Johnson