
Robots Rescue Friday Dinner

The following vignette is inspired by recent advancements in robotic food preparation and delivery. I, for one, would welcome our new robot sous-chefs.

Alone in her office, Jennifer was growing weary. The conference call had been plodding along for several hours. It was a painful, tedious call. The CFO had just used the word “candidly”. It occurred to her that he used that word often, and it annoyed her now. Maybe she was just bothered because it was 5:30pm on a Friday. Oh no. Dinner. It was her responsibility to provide dinner for her family of four tonight, and this lingering conference call was posing a challenge. It was even too late for a takeout plan. Deep breath, no problem…she would turn to her new best friend for this situation, a company called RobotoDine.

As the call carried on and the folks from operations started describing their definition of a severity one vs. severity two problem, Jennifer clicked the RobotoDine app on her mobile phone. She continued to commit an appropriate level of attention to the call as the app opened and displayed the specials. Chicken Marsala caught her eye. She used to make that entree for her husband. He told her he would never tire of it. But that was three promotions and two children ago, when she had the time to do things like prepare dinners. Using the app, she added two orders of the Chicken Marsala and considered the options for her younger, more selective eaters. Jennifer quickly found previous orders and selected two grilled cheese dinners. She decided to include the french fries. It was Friday after all. But in a commitment to promoting healthy dinners, she also added two cups of freshly cut fruit. She placed the order and was prompted for a delivery time. Doing some quick estimates on the longevity of the call and the drive home, she selected a 6:00-6:15 delivery time. Delivery to the driveway — that was a very nice aspect of the service.

Back on site at RobotoDine, Jennifer’s order landed in the system. The facility was divided into two main rooms separated by a smaller corridor. One room contained the ingredients that needed to be kept cold in refrigeration appliances and the other room was dedicated to the cooking. Robot arms in the middle corridor moved ingredients from the cool room to the cooking room as necessary. Sliding compartments that opened into the corridor from each room enabled the transfer. The hot room contained the range tops and super-hot ovens that could cook meals very quickly. The old adage that states “if you can’t stand the heat, stay out of the kitchen” would be very relevant to this room, which typically reached temperatures over 100 degrees. The robots and equipment could tolerate that temperature far better than a human could, and the heat prevented prepared foods from cooling excessively. The absence of humans allowed for some other conveniences. For example, the pots and pans did not require the awkward and space-consuming handles that humans require.

A variety of robot structures carried out their coordinated instructions. Manipulator arms moved ingredients across staging areas and ovens, chopped ingredients on demand, sliced and hammered meats, dispensed sauces and other ingredients, and stirred dishes. Parallel robots placed ingredients into the appropriate preparation and cooking areas. Similar to the scene of an automated warehouse, four-wheeled rovers moved meals from one staging area to another. The operation borrowed the trade secrets of the best restaurants, staging and pre-cooking food as appropriate and acceptable.

The Chicken Marsala preparation was in progress. The mushrooms, which like most of the produce in the facility were supplied by a local vertical farm, had been chopped. The pounded and floured chicken breasts were placed in a skillet and timed using previous recordings based on weight and cooking temperatures. The chicken temperature would be verified with a precision three-prong thermometer and the results would be added to the algorithms for cooking estimates. The angel hair pasta was placed in boiling water that was periodically filtered and re-circulated. When the dish was completed it was packaged. When Jennifer’s complete order had been cooked and packaged, the meals were placed into two containers: one for the hot meals and one for the fruit cups and side salads that were prepared on demand by robotic appliances RobotoDine had bought from another robot manufacturer. The biodegradable containers were specially designed to keep foods hot and cold while minimizing the “soggifying” effect that some meals were prone to.

More RobotoDine robots placed these dishes into their appropriate destinations. Some were quickly cooled and frozen for sale while others, like Jennifer’s order, were loaded into autonomous electric vehicles. The small car with Jennifer’s order was loaded to its appropriate capacity and then left the facility. Jennifer received a text message indicating: “Your order is on its way. Estimated time of delivery: 6:05PM”. RobotoDine used mapping and traffic report web services to support its delivery time estimates. They also utilized a fleet management platform to keep track of their vehicles and let customers know when to expect arrivals. When Jennifer arrived at home she read the message and smiled with relief. She opened the RobotoDine app, checked the map, and saw that the delivery vehicle was just down the street. She walked from her garage to the driveway and waited for its arrival. The car emitted a cheerful chirp as it entered the driveway. It navigated to the top of the driveway and stopped so the dinner boxes could be removed.

Jennifer received a text message indicating her order had been delivered and she clicked the “I got it!” button. Several minutes later, the entire family thoroughly enjoyed their meals, Jennifer enjoying the convenience as much as the quality. Her husband indicated that the Chicken Marsala would never match hers but she wasn’t sure if he was being sincere or strategically diplomatic. Some time later, Jennifer would rate her meal and experience using the app, which subsequently provided her with an account credit towards her next order. She looked with interest at the promotional banner announcing the upcoming automated breakfast diner from the company behind RobotoDine.

Back at the RobotoDine facility, an empty delivery car returned to be stocked for its next mission. From the top of the building a drone quadcopter departed with a small container carrying two dozen Buffalo Wings. It was Friday.

Is This Robot Worth It?

Eight inches of new-fallen snow covers my driveway. As the final flakes drop, I ponder how and when I will address the situation. Suddenly, there is a knock at my front door. I open it and a man with a snow shovel says, “Hello. I will shovel your driveway for $40 if that offer pleases you.” Well, it does. I agree to the terms and he begins the task. When he finishes, I gladly pay him the amount he proposed. Putting the money in his pocket he then says, “For another $40 I will come into your house and dance while playing my accordion.” I try to present a polite expression as I answer, “I’m sorry…it’s nothing against you or accordions, it’s just that it’s not worth it to me.” He looks towards the ground pensively and nods. Then he turns to me and says somewhat contemplatively, “This is really how things are with service robotics, right?”
My eyes widen, “Yes!”, I respond. “You’re exactly right!”
Then I suddenly awake from the dream. Somewhat disoriented. Enlightened. A little hungry…

In order to maintain the content on my website, The Robot Database, I follow the news on robotics daily. There seems to be consensus on the prediction that the market for service robotics is poised for tremendous growth over the foreseeable future. The term “service robotics” is a catch-all category involving robots other than the manipulator arms and similar structures that weld, paint, palletize, build circuit boards, etc. This latter group of machines is referred to as “industrial robots”. I expect the definition of service robots to evolve and be refined over time. Right now the category is being subclassed somewhat informally into field robotics, healthcare robotics, consumer robotics, etc. I wouldn’t be surprised if the term “service robots” becomes somewhat meaningless in the not-too-distant future due to the scope of applications it eventually covers.

Robotics is not a fad market. Consumers and businesses will eternally be attracted to the promise that robots deliver. They afford us more quality time by performing work we don’t want to do. They save corporations money by automating manual labor, often more effectively. But in order to fulfill the optimistic growth projections, service robots must match market demands in capability and price. I see the value of service robots falling into three levels: utility value, niche value, and novelty value.

The robots that succeed in providing utility value have the widest market and will be appropriately priced. This market includes products like robot vacuum cleaners and personal assistants or home automators. Companies like iRobot had a good year in 2016 as more people are sold on the value of robot vacuum cleaners. As for personal assistants and home automation, I’m pretty sure I placed my order for an Amazon Echo the day I found out about it. I was working on a software application that would help me manage my calendar and to-do list and could perform tasks like reporting the weather. When I saw a product that could provide these services being sold for under $200, the value proposition was very compelling to me. I haven’t bought a robot vacuum cleaner yet, but that is mostly because my house is too cluttered for the robot to be reasonably effective. There are other robots in this space that will be very interesting to watch. One is a lawn maintenance and snow removal machine from the Kobi Company priced at around $4000. Others are laundry-focused robots such as Laundroid and FoldiMate.

Robots that provide niche value have a smaller market but will typically be priced much higher due to the specialized capability they provide. These robots are often in markets that offer high-priced services to their end customers. Robot surgical systems are a good example of this category. These devices can cost several million dollars and incur ongoing service contracts on top of that. But the success of the incumbent companies in this space and the emergence of new competitors validate the market. There are at least two companies creating bricklaying robots. Fastbrick Robotics recently put a $2 million price tag on its flagship robot. These robots are not as mature in the market as their surgical counterparts. That being said, if they prove capable, they will disrupt the industry. At the time of this writing, Fastbrick Robotics is trading around $0.10 a share (#wishICouldInvestOnTheAustralianExchange…). Other robots that may provide niche value include mobile security robots, germ-zapping robots, and the impressive humanoids and quadrupeds from companies like Boston Dynamics that haven’t yet seemed to find a reasonable market outside of research.

Some robots are intended for novelty value. Cozmo, the intelligent little truck from Anki, sold very successfully in the 2016 holiday season for about $180. I saw some videos and read reports on Cozmo and it sounded really neato. Now I am not ready to spend $180 on “neato” but it’s clear that other people are. The robots that provide novelty value will require a market with enough disposable income to get past the price tag. So that “purchase threshold” will be driven by how much novelty the robot provides and how expensive it is. I said that some robots are intended to provide novelty value because some robots that are intended for utility value or niche value may have to settle for novelty value if they can’t provide the appropriate capability for the cost. Interesting examples of robots that currently straddle the niche/novelty divide are the humanoid robots from Aldebaran and SoftBank such as Nao and Pepper. The price of these robots runs into the thousands of dollars but they cannot perform any type of labor that a human can. Don’t get me wrong…if somebody handed me a Nao robot and said, “It’s all yours…no strings attached…”, I would be incredibly happy and would probably be up all night just trying to make it do silly things like dance and play the accordion. But due to their sticker price I can’t justify a purchase because, for my needs, it just isn’t worth it. However, these robots are being applied to niche applications including concierge services, education, and even helping children with autism. Most of these applications are experimental so I believe it’s fair to say that the question of “niche or novelty” is still open.

Some closing thoughts on service robots and their value… Recently my seven-year-old son asked me if I could play a game with him and I explained I had to first finish washing the dishes. A bit disappointed, he asked me if there was a robot that could do all the housework and how much it would cost. His question caused me to ask myself: how much would I pay for a robot that could do all my chores effectively? After considering it for a bit I arrived at the qualitative answer: a lot. Seriously, a whole lot of money. I’m talking about “nice car” kind of money. Things aren’t very different for robots than they are for any sellable product. The market is just not well-tested for many of these revolutionary machines, so the companies making them may not have a good idea of the marketable price. In order to successfully sell their robot products, these companies will need to understand the capabilities and level of value they offer, and price them appropriately. Like many people interested in these developments, I’m betting on the eventual ubiquity of service robots. It will be interesting to see just how and when the events occur.

The Farm of the Near Future

The morning sun began to creep gently over the east side of the corn field. The farm’s control system orchestrated a discussion between the weather sensors on the premises and the web services providing the daily forecast. It made the decision to turn on the irrigation for the full cycle today.

A small flock of geese settled into the soybean field looking for a meal. Their arrival was greeted with the shrill kree-arrrr call of a red-tailed hawk. The cry sounded again and it was clear the hawk was approaching. The geese burst from the ground in a panic as a small drone quadcopter flew into view and blared one last hawk call from its speakers.

Back in the corn fields, a small rover robot made its morning rounds, probing the soil and visually inspecting the crops. All data was sent back to the control system via its wifi connection. The rover stopped suddenly in its six-wheeled tracks and zoomed in on one particular cornstalk. It took several photos and sent an urgent message to the system along with the GPS coordinates of the problem.
Back at his desk, the farm assistant received the photo images. He recognized the problem as some type of blight but was not familiar with this particular strain. He called his supervisor who said she was not familiar with this blight either. She suggested they put in a call to have it analyzed by an expert.
An hour later, a small driverless car arrived at the farm. The trunk of the car opened and a bipedal robot emerged and walked towards the farmhouse. The supervisor and assistant led the robot to the infected corn. When they reached the site of the infected cornstalk, the robot focused the high-resolution cameras on the problematic area. A voice emitted from the robot’s speakers: “Yep, I know what this is and I know what you have to do…” The two farm employees took on smiles of relief.
Far away from this cornfield, in Yankton, South Dakota, Bob sat in the small room of his house that served as his office. Using the program on his laptop computer, he maneuvered his mechanical avatar at the farm to take some more pictures of the cornstalks in this area. Bob accessed the farm’s control system and began reviewing relevant data. This assignment was typical for him. He knew a lot about raising corn. The circumstances that moved him from Iowa to Yankton had made it difficult to find employment in his area of expertise, but that changed when he learned of the opportunity to use teleoperated robots as his feet on the street…or in the fields, as it were. Now he was quite busy, especially in this season, and living quite well. He was looking forward to opportunities to consult for farmers in emerging markets across the world. He would be part of a group that would trial these services while employing language translation. In short order, Bob would deliver instructions to this farm and the blight that could have been catastrophic would be eradicated.
At the edge of the cornfield several deer approached the crops for a free meal. The thunderous bark of a Doberman instantly scattered them. Still barking, the drone quadcopter rose above the cornstalks and ensured the interlopers were driven far away.

Peace of Mind on Vacation

Like most people interested in the advancements in robotics, I am very interested in JIBO, which has been receiving a lot of attention recently. JIBO is an example of a product in the market niche known as social robots. It seems to me that vendors in this space often emphasize the personal or emotional value these robots bring to their owners. Honestly, I’m an overwhelmed hard-working guy who would just love to see these things help me out. If I can love a shop vac that performs above and beyond expectations, I may just go out and get a heart tattoo on my well-intentioned bicep for a personal robot who can take a load off Daddy. The story below depicts a scenario in which a JIBO armed with appropriate third-party hardware and applications provides exceptional emotional value to a typical parent of a family.

The long drive was over and the smell of salt water filled the air. After a brief evaluation, Walter was happy with the beach house. He had rented it sight-unseen from the owners. It was slightly smaller than he had expected but he thought it should be perfect for his family, which included his wife and two young boys. There were, however, a couple of somewhat troubling thoughts on his mind that had to be reconciled before he could really settle peacefully into vacation mode. While his wife and sons sat on the second-story deck and watched the ocean waves roll in, Walter broke away to make a phone call.
“We left in such a hurry”, he spoke into his smartphone, “there are a couple of things I want to check.”
“I am here to help”, answered a pleasant but somewhat inhuman voice.
Walter continued, “For starters I’m not sure I locked the door to the deck. Could you send me a picture of the door handle?”
“I’m on my way”, the voice answered.
“Thanks, JIBO”, Walter replied.
Back at Walter’s home, the small robot with a conical head wheeled towards the deck door. The robot was mounted on a small wheeled platform that allowed it to navigate throughout the first floor. Small LEDs on the platform blinked to indicate functional behavior of various on-board sensors.
“Also, I wanted to set the thermostat. I think I did but I’m not positive. What’s the temperature in the house?”
“For the past twelve hours the house has maintained an average temperature of 77 degrees.” The temperature sensor included in the mobile base allowed the robot to take the reading.
“Great, that’s what it should be.” Walter had been talking about getting smart thermostats for quite a while. It was time to make the move. “You know what, see if you can find me a high-rated smart thermostat for around $100. Send me a form with five options and I’ll pick the one I want you to buy.” This occasion would not be the first time that Walter had JIBO do the legwork for an online purchase. Although Walter never felt overwhelmed by doing this kind of shopping, he enjoyed the incredible simplicity of just receiving a display with suitable options and clicking “go” without the hassle of searching and price comparing. Walter briefly recalled times when one of the boys’ friends’ birthdays was approaching and JIBO conveniently presented (and reminded) him with gift options for the upcoming party. JIBO integrated with Amazon and other e-commerce sites and could make purchases on behalf of Walter. A smartphone app managed the communication between the two of them for these transactions and other similar ones.
“Your parents called this morning”, JIBO offered without a leading request. He had been programmed to alert Walter of these events.
Walter was a little alarmed; calls from his parents were infrequent. “Could you play me the message?”
This capability was provided by a simple app that did not require any additional recurring fees. JIBO could make phone calls as his platform integrated with the house landline phone. He simply had to dial into the voicemail system, find the appropriate message, make an audio recording of it, and play it back for Walter.
“This will take a little while,” JIBO reminded Walter. “I’m fast but it takes some time to navigate your voicemail service.”
“Don’t worry, you’re not getting fired today”, Walter assured the intelligent robot. “And I got the photo of the deck door. It’s locked, thanks.”
“I have sent you five options for smart thermostat purchases”, JIBO informed Walter.
“I wish I could multitask like you”, Walter said.
“Perhaps you need a firmware upgrade.”
“I taught you that one.” Walter jogged his brain for anything else that would prevent him from relaxing. “The security system is on?”
“I can confirm that the security system is activated and that there have been no alerts since the time of your departure. I will continue to turn on the ZigBee® lights at random night time hours. I can now play back the voicemail message from your parents’ call.”
The message from Walter’s mother indicated that one of his cousins had just had a baby girl. Walter was relieved to hear there was no emergency and would make the appropriate follow up calls when he could.
Walter saw that it was almost sunset. He felt at ease now and knew he could call JIBO whenever he had to. It was nice to have a reliable helper that didn’t sleep.
“Thanks for everything, JIBO”, Walter said. “Recharge when you have to and don’t throw any wild parties while we’re gone.”
“Noted. I will remind the toaster oven.” JIBO answered.
“I taught you that one.”

Categorizing Robots

I’ve started working on an open source project that involves the design of a database schema to support the collection and searching of information related to robotics. I ran into a challenge that is fundamental to the database’s design and value: how to construct and populate categories for the robots. Naturally, I searched the web for relevant resources that could provide insight into these values. Although I am admittedly much more of an application designer than a robot expert, I was not comfortable with the way information was organized in many cases.

Some examples of the items that confused me:

  • One prominent, robotics-focused website listed these values as “market segments” (or at least implied that categorization):
    Airborne, Education & Research, Engineering, Humanoid, Industrial, Manipulator, Maritime, Medical & Assistive, Mobile, Software. But I would argue that the values: Humanoid, Manipulator, Mobile, and Software do not really identify market segments (a group of individuals or companies that comprise a categorical market for your products and services).
  • I’ve also seen content that differentiated between tethered vs. autonomous robots. This differentiation seems to address the mobility of robots, but I think the “autonomous” term should be used to describe how well a robot can “think and act on its own” regardless of whether or not it is tethered. And if I had a humanoid robot in my house that could walk around but required a very long tether for power and/or communication, I would consider that robot mobile.
  • Some terms like “field robotics” seem ubiquitous and important but I’m having trouble finding a structured home for them…

Rather than elaborate with more examples, I would like to (humbly and carefully) propose a categorization scheme that I believe would effectively support a user’s search for robots based on defining criteria. I am very much interested in getting feedback from others on this effort.

For starters, I am currently using these properties to categorize a robot:

  • Structure Type
  • Category
  • Market Segment
  • Applications
  • Features
  • Qualifications

I’ll explain what each of these properties means and what the expected values would be.

Structure Type

  • This property defines the basic physical structure of the robot.
  • For ease of entry and searching, I will attempt to maintain structure types as a flat list without a hierarchy.
  • A robot in the robot database should be associated to exactly one structure type.
  • Proposed values for this property include:
    • Humanoid
    • Rover
    • UAV/Drone
    • Manipulator/Arm
    • Aquatic Submersible
    • Aquatic Surface Vessel
    • Exoskeleton
  • I know this categorization isn’t perfect. How do you classify a wheeled robot that has an articulated arm? I would classify it as a Rover, since that is the primary defining structure, with a feature of an arm (more on that when I cover features). How would you classify Baxter on a wheeled platform? I would classify it as a Manipulator/Arm with a mobile feature for similar reasons.

Category

  • This property defines the general area or domain for the robot.
  • The category will be maintained as a flat list without a hierarchy.
  • A robot in the robot database should be associated to exactly one category.
  • Proposed values for this property include:
    • Field Robotics
    • Medical Assistive
    • Industrial
  • On a side note, when designing applications in general, I try to avoid or qualify properties with names like “type” or “category” because all of these properties are designed for typing and categorizing. However, I will use them when no better terms come to mind.

Market Segment

  • The market segment indicates the target group of persons or companies that would utilize the robot.
  • For ease of entry and searching, I will attempt to maintain market segment as a flat list without a hierarchy.
  • A robot in the robot database can be associated to many market segments.
  • The proposed list of market segments includes:

    • Aerospace
    • Agriculture
    • Defense
    • Healthcare
    • Manufacturing
    • Private Consumer
    • Retail
    • Transportation

I expect I will find some reason to modify this list as I collect data.
Some examples related to these market segment values:

  • I would indicate that robots that automate warehouses belong in the Retail market segment (maybe others). I would include “Warehouse automation” as one of their applications, but more on applications later…
  • I would say that autonomous automobiles primarily apply to the Private Consumer and Transportation market segments. Note that since market segments categorize the target market, the transportation market segment in this case would identify the likes of cab companies that might want an autonomous automobile. If we were discussing a robotic bus, I would argue that the transportation market would make sense but the private consumer would not, since the average person is not in the market for a bus (as they may be for an automobile).

Applications

  • Applications will identify specific uses for a robot. These values will typically line up with market segments but the database will not link applications to market segments at this point — just to robots.
  • Applications will be managed as a flat list without a hierarchy.
  • A robot in the robot database can be associated to many applications.
  • I imagine the list of values could become extensive. Here is an abbreviated list of proposed values:
    • Aerial photography
    • Autonomous driving
    • Building construction
    • Bomb disposal
    • Caretaking
    • Electronic component production (IC, PCB, etc)
    • Education
    • Entertainment
    • Floor vacuuming
    • Fruit-picking
    • Home/business security
    • Lawn maintenance
    • Medical surgery support
    • Mining
    • Packaging
    • Painting
    • Personal mobility assistance
    • Personal service assistance
    • Research
    • Rescue support
    • Soldering
    • Surveillance
    • Telepresence
    • Transport/haulage
    • Warehouse automation
    • Welding
  • Note that with this approach “Agriculture” would not be a good application as it is too broad. The model prefers more specific values like “Weed control”, “Fruit-picking”, “Pest control”, and so forth.

Features

  • Features will be used to identify certain characteristics of a robot.
  • These features will be geared toward supporting useful robot searches; values like “Elegant” and “Plastic” are not appropriate for this list.
  • A robot in the robot database can be associated to many features.
  • I imagine the list of values will also become extensive. Here is an abbreviated list of proposed values:
    • Articulated arm
    • Autonomous
    • Bipedal
    • Collaborative
    • Differential drive
    • Hexapod
    • Mobile
    • Quadruped
    • Swarm
    • Tethered
    • Treaded

Qualifications

  • An all-encompassing definition of what constitutes a robot seems to be elusive. There are however certain characteristics that exemplify what many will consider to be a robot. This property presents a predefined set of criteria to help determine how the individual robot measures against qualifications.
  • The database defines these qualifications to be evaluated per robot:
    • Has Sensors (does the machine have environmental sensors?)
    • Has Actuators (does the machine include motors and devices that move?)
    • Sensor/Actuator Controlled (does the machine include logic to drive actuators based on sensor input?)
    • Programmable (can the machine be programmed with instructions?)
    • Autonomous (does the machine employ algorithms to handle challenges without user-intervention?)
  • Every robot may be rated for each of these qualifications with one of these values:
    • Demonstrates
    • Partially demonstrates
    • Does not demonstrate
  • We can evaluate the robot-ness of machines using this scale. Take these examples:
    • Remote controlled car
      • Has Sensors : Partially demonstrates (has an RC receiver)
      • Has Actuators : Demonstrates (has motors that drive wheels)
      • Sensor/Actuator Controlled : Partially demonstrates (user sending RC signals drives motors)
      • Programmable : Does not demonstrate
      • Autonomous : Does not demonstrate
    • da Vinci Surgical System (best guesses…)
      • Has Sensors : Demonstrates
      • Has Actuators : Demonstrates
      • Sensor/Actuator Controlled : Partially demonstrates
      • Programmable : Does not demonstrate
      • Autonomous : Does not demonstrate
    • Mars Curiosity Rover
      • Has Sensors : Demonstrates
      • Has Actuators : Demonstrates
      • Sensor/Actuator Controlled : Demonstrates
      • Programmable : Demonstrates
      • Autonomous : Partially demonstrates

Finally, I plan on including some fields in the robot table that are commonly used to describe a robot, including physical dimensions, weight, and degrees of freedom.
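To make this concrete, below is a sketch of how a single robot’s record might look once these properties are populated. This is a hypothetical entry with made-up values, expressed as JSON for readability; the actual schema would normalize these lists into their own tables.

 {
   "name": "ExampleBot 100",
   "structureType": "Rover",
   "category": "Field Robotics",
   "marketSegments": ["Agriculture"],
   "applications": ["Fruit-picking", "Weed control"],
   "features": ["Autonomous", "Articulated arm", "Mobile"],
   "qualifications": {
     "hasSensors": "Demonstrates",
     "hasActuators": "Demonstrates",
     "sensorActuatorControlled": "Demonstrates",
     "programmable": "Demonstrates",
     "autonomous": "Partially demonstrates"
   },
   "dimensionsCm": { "length": 120, "width": 80, "height": 90 },
   "weightKg": 40,
   "degreesOfFreedom": 6
 }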

Once again, the structuring and population of all the classification properties described above are intended to best support categorical searches for robots. I kept that goal in mind when writing this content. That being said, I am very interested in feedback from anybody, especially experts in robotics.

Thanks!

 

In the Den of the Lionfish

Armed with a fresh cup of Colombian coffee, Ethan opened his laptop and reviewed the latest data. The numbers were looking very good. He optimistically envisioned a lot of smiles and nods from the audience at the upcoming presentation. His brief daydream was snapped by an on-screen message from Unit 7. Ethan clicked the alert and a screen popped up to display the image Unit 7 was sending from its underwater location in the nearby reef. Ethan reviewed the message and easily recognized that it was indeed a lionfish. Unit 7 claimed that there were no divers in the vicinity and sent a panoramic image to support that claim. Ethan had seen enough information from the robotic submarine and clicked the confirmation button on the application page. According to Ethan’s recent numbers, that action would seal the fate of this particular lionfish.

[Image: Common lionfish]

Unit 7 was one of many robotic submarines operating off the coast in an effort to control the invasive lionfish population. The environmental and economic effects of the species had reached critical levels. The robots communicated using on-board underwater modems that sent their signals to local buoys, which then relayed the communication to the larger wireless network. Human operators received and sent signals to the units over the same network. The swarm was coordinated by a centralized server in order to optimize coverage of the area.
Upon receiving confirmation of its current mission, Unit 7 locked in on the image of the lionfish and began its pursuit. Although the lionfish is a relentless predator and impervious to the attacks of most other predators, it is not a very quick or agile swimmer. Advantage: the robotic submarine units. As Unit 7 closed within four feet of its target, it algorithmically analyzed the swimming pattern of its target and fired the small harpoon with a burst of compressed air. The tethered harpoon pierced the fish. As the fish struggled, Unit 7 moved towards the surface and relayed an indication of its achievement. The message was a request for a rendezvous with a differently purposed robotic vehicle in the area: one that would transport the captured fish to the base station.

Upon rendezvous, Unit 7 and the transport carried out their aquatic choreography. The transport spotted the captured fish in tow and extended a sinewy hand that collapsed around the fish almost like a mechanical Venus flytrap leaf. With a quick signal to the harpoon, the barb was retracted and the submarine reeled in the tether carrying it. The transport’s end effector pivoted towards the surface and tossed the fish into its central storage unit, designed to adequately preserve it. Although the fin rays of the fish are venomous, it is edible if prepared properly. Several local charities were benefiting from the captured fish.
The fleet of robotic submarines was part of a larger effort that utilized mobile robots to control invasive species across the country. When the public was made aware of the efforts, there was some level of trepidation towards the thought of semi-autonomous robots hunting their prey amid the residents. It took some trials, time, and numbers to substantiate the proclaimed safety of the robots in general. One of the most attractive aspects of the robot programs was the “global off switch”. More organic control methods involved the introduction of species and substances, but those approaches ran the risk of introducing a new imbalance to the environment. Conversely, once their goal numbers were met, the robots would be switched off and removed with no ongoing impact or evidence of their historic presence.
This was the second killing for Unit 7 in the past hour. It returned to the floating base station and docked for a well-deserved inductive recharge.
Back at his desk, Ethan received a text message from his friend, Thomas, who was managing a team of land-based robots that were hunting Burmese pythons in the Miami-Dade area. The message included a picture of Mongoose Nine with its latest capture.

Simple tutorial on rosbridge and roslibjs

This tutorial demonstrates how to create a simple web page that communicates with ROS using rosbridge and roslibjs.
No previous knowledge of ROS is really necessary to try out the code in this tutorial but I will not go into great detail on the concepts of ROS and its commands and utilities.

Configuring your system for the tutorial

This section will step you through the installations required to carry out the demo.

1. Install ROS

If you do not have ROS installed on your computer, install it following the instructions here.
I am using the hydro release on my computer. Other releases will probably work but the names and structures of topics and messages might be slightly different. Also, this tutorial will use the turtlesim demo program so you will need that installed. The desktop-full installation of ROS includes this demo. I chose to set my environment as indicated in section 1.6 to add the ROS environment variables to all new terminal windows. Finally, this tutorial will assume you are running ROS on Linux.

2. Install rosbridge

Open up a terminal window and type

 sudo apt-get install ros-hydro-rosbridge-suite

Detailed instructions can be found here, but note that they relate to the groovy release, not hydro.

Turtlesim introduction

Before I get into the details of rosbridge, I will use the turtlesim demo to introduce some fundamental ROS concepts and commands. If you encounter problems in this section, your computer is probably not configured correctly. Refer back to the installation links for details on setting everything up.

1. Run roscore

Open up a terminal window and type

 roscore

This command starts the ROS master program, which runs the services and coordinates communication between publishing and subscribing nodes. More detail on those terms and concepts will follow.
You can minimize this terminal window after you start roscore.

2. Run the turtlesim simulator window

Open up a terminal window and type

 rosrun turtlesim turtlesim_node

This command will launch the application that displays the simulated turtle robot. ‘turtlesim’ is the name of the ROS package and ‘turtlesim_node’ is the name of the application within the package that will be executed. The turtle icon in the window is essentially listening for movement message instructions.
You can minimize the terminal window used to launch the simulator.

3. Run the turtlesim control window

Open a terminal window and type

 rosrun turtlesim turtle_teleop_key

This command will run the ‘turtle_teleop_key’ application node within the ‘turtlesim’ package. In order to send commands to the ROS master, this terminal will need to have focus as you type the left, right, up and down arrow keys.

4. See the list of ROS topics

Open a terminal window and type

 rostopic list

This command will list the currently available topics driven by roscore and the nodes that were launched. Topics are essentially channels that node applications can publish and subscribe to. Pub/sub messaging is a common model in software systems. I will not go into the ideas in detail but will quickly say that the concept is fundamental to ROS and very relevant to a system where sensors, actuators, and so forth may be interdependent and interchanged.
One of the topics that you should see displayed by this command will be of particular interest to us. Its fully qualified name is:
/turtle1/cmd_vel

5. Find the message type for the relevant topic

In the same terminal window type

 rostopic info /turtle1/cmd_vel

This command will display information about the topic including the type of messages that will be published and consumed. Messages are objects in the sense that they can be composed of primitive values and other structures containing primitive values. The message type for the /turtle1/cmd_vel topic is indicated as
geometry_msgs/Twist

6. Investigate the message structure

In the same terminal window type

 rosmsg show geometry_msgs/Twist

You will see this output

 geometry_msgs/Vector3 linear
   float64 x
   float64 y
   float64 z
 geometry_msgs/Vector3 angular
   float64 x
   float64 y
   float64 z

This output indicates that the geometry_msgs/Twist message structure is composed of two structures of another ROS-type: geometry_msgs/Vector3. The properties of this type within the geometry_msgs/Twist type are named linear and angular.
If you run this command in the terminal window

 rosmsg show geometry_msgs/Vector3

you will see that the type is composed of three float64 properties named x, y, and z.

7. Monitor messages sent to the relevant topic

In the same terminal window type

 rostopic echo /turtle1/cmd_vel

This command will display information related to the messages published to the named topic, in this case: /turtle1/cmd_vel
The command will run in the terminal window until it is terminated with a Ctrl+C.
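
Incidentally, while the echo command is running you can publish a message to the topic yourself instead of using the teleop window. Open yet another terminal window and type

 rostopic pub -1 /turtle1/cmd_vel geometry_msgs/Twist -- '[2.0, 0.0, 0.0]' '[0.0, 0.0, 1.8]'

The -1 flag publishes a single message and exits; the two arrays are the linear and angular vectors in YAML form. You should see the turtle move and the message appear in the echo window.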

8. Run the demo

Finally, time to take the turtle for a ride. The three relevant terminal windows for this step are the simulator display (step 2), the control window (step 3), and the window that will echo messages sent to the relevant topic (step 7). Make sure all three windows are visible on your desktop.
Click in the control window to give it focus. Then use the arrow keys to rotate and move the turtle.
Observe the output in the topic echo window. Note how the values change depending on the keys you press. For example, if you press the up arrow you should see this output:

 linear:
   x: 2.0
   y: 0.0
   z: 0.0
 angular:
   x: 0.0
   y: 0.0
   z: 0.0

Before we start on the next section, which investigates how rosbridge works, I will summarize the important points of this section:

  • The roscore master was started in order to manage the communication of messages between publishing and subscribing nodes
  • The simulator window node was launched as a subscriber to the topic relevant to the demo
  • A terminal window was opened to publish messages to the topic relevant to this demo

Controlling turtlesim from a web page

In this section we will build a minimal html page to control the turtle in the simulator.
The section will use rosbridge, which includes a set of tools that provide a JSON API for communication with the ROS server. I should point out that I am fairly new to ROS in general. One of the first things I learned was that node applications were typically written in C++ or Python: two languages that I am not proficient in. So I was interested in the idea of rosbridge, which would allow ROS communication using tools like JavaScript over WebSocket. This section will also use the ROS JavaScript library, roslibjs. Much of what I am writing in this section is based on what I learned in this tutorial.
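
To give a feel for what that JSON API looks like under the hood, a publish operation on our topic travels over the WebSocket as something like the message below. This is my sketch of the rosbridge protocol; roslibjs builds and sends these messages for us, so we never have to write them by hand.

 {
   "op": "publish",
   "topic": "/turtle1/cmd_vel",
   "msg": {
     "linear": { "x": 2.0, "y": 0.0, "z": 0.0 },
     "angular": { "x": 0.0, "y": 0.0, "z": 0.0 }
   }
 }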

1. Launch rosbridge

Open a terminal window and type this command

 roslaunch rosbridge_server rosbridge_websocket.launch

This command will run rosbridge and open a WebSocket on port 9090 that our web page will use to communicate with ROS.

2. Create an HTML control panel file

This web page is intended to demonstrate how roslibjs and rosbridge can be used to communicate with ROS. The page will not employ best practices like the use of style sheets or JavaScript libraries like jQuery. I will annotate the web page with comments that will explain the important lines.

<!DOCTYPE html>
<html>
<head>
<!-- Based on demo found here:
http://wiki.ros.org/roslibjs/Tutorials/BasicRosFunctionality
-->

<!--
The next two lines bring in the JavaScript files that support rosbridge integration.
-->
<script type="text/javascript" src="http://cdn.robotwebtools.org/EventEmitter2/current/eventemitter2.min.js"></script>
<script type="text/javascript" src="http://cdn.robotwebtools.org/roslibjs/current/roslib.min.js"></script>

<script type="text/javascript">

// These lines create a Ros object connected to the rosbridge server running on the local computer on port 9090
var rbServer = new ROSLIB.Ros({
    url : 'ws://localhost:9090'
});

// This function is called upon the rosbridge connection event
rbServer.on('connection', function() {
    // Write appropriate message to #feedback div when successfully connected to rosbridge
    var fbDiv = document.getElementById('feedback');
    fbDiv.innerHTML += "<p>Connected to websocket server.</p>";
});

// This function is called when there is an error attempting to connect to rosbridge
rbServer.on('error', function(error) {
    // Write appropriate message to #feedback div upon error when attempting to connect to rosbridge
    var fbDiv = document.getElementById('feedback');
    fbDiv.innerHTML += "<p>Error connecting to websocket server.</p>";
});

// This function is called when the connection to rosbridge is closed
rbServer.on('close', function() {
    // Write appropriate message to #feedback div upon closing connection to rosbridge
    var fbDiv = document.getElementById('feedback');
    fbDiv.innerHTML += "<p>Connection to websocket server closed.</p>";
});

// These lines create a topic object as defined by roslibjs
var cmdVelTopic = new ROSLIB.Topic({
    ros : rbServer,
    name : '/turtle1/cmd_vel',
    messageType : 'geometry_msgs/Twist'
});

// These lines create a message that conforms to the structure of the Twist defined in our ROS installation
// It initializes all properties to zero. They will be set to appropriate values before we publish this message.
var twist = new ROSLIB.Message({
    linear : {
        x : 0.0,
        y : 0.0,
        z : 0.0
    },
    angular : {
        x : 0.0,
        y : 0.0,
        z : 0.0
    }
});

/* This function:
 - retrieves numeric values from the text boxes
 - assigns these values to the appropriate values in the twist message
 - publishes the message to the cmd_vel topic.
 */
function pubMessage() {
    /**
    Set the appropriate values on the twist message object according to values in text boxes
    It seems that turtlesim only uses the x property of the linear object 
    and the z property of the angular object
    **/
    var linearX = 0.0;
    var angularZ = 0.0;

    // get values from text input fields. Note for simplicity we are not validating.
    linearX = Number(document.getElementById('linearXText').value);
    angularZ = Number(document.getElementById('angularZText').value);

    // Set the appropriate values on the message object
    twist.linear.x = linearX;
    twist.angular.z = angularZ;

    // Publish the message 
    cmdVelTopic.publish(twist);
}
</script>
</head>

<body>
<form name="ctrlPanel">
<p>Enter positive or negative numeric decimal values in the boxes below</p>
<table>
 <tr><td>Linear X</td><td><input id="linearXText" name="linearXText" type="text" value="1.5"/></td></tr>
 <tr><td>Angular Z</td><td><input id="angularZText" name="angularZText" type="text" value="1.5"/></td></tr>
</table>
<button id="sendMsg" type="button" onclick="pubMessage()">Publish Message</button>
</form>
<div id="feedback"></div>
</body>
</html>
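
To try the page out, save the markup above to a file (for example, turtlesim_control.html; the name is up to you), make sure roscore, the turtlesim node, and the rosbridge server are all still running, and open the file in a web browser. You should see the connected message appear in the feedback area, and clicking the Publish Message button should move the turtle.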

What Did We Do?

So what did we accomplish in this tutorial? Something pretty cool in my opinion: we created a new controller for the existing turtlesim node without modifying that code at all. The decoupled publish/subscribe approach that ROS supports made this accomplishment possible. I could argue that the simple controller we created is superior in some ways to the command window that comes with the complete ROS installation:

  • It seems that the arrow keys always send a 2 or -2. We can send any values using our web page to make the movements greater or finer grained.
  • As much as I tried, I could not send linear and angular values in the same message by pressing the keys simultaneously. We can do that with the web page which allows the turtle to travel in arc paths.

Of course we only published a message in this tutorial. I should point out that there is much more you can do with roslibjs including:

  • Subscribing to topics in order to receive messages (see the sketch after this list)
  • Utilizing services hosted within ROS
  • Retrieving a list of current topics within the ROS server
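
Subscribing looks very much like the publishing we did above. Here is a minimal sketch, not wired into the page above, that listens to the /turtle1/pose topic that turtlesim publishes and logs the turtle’s position:

// Create a topic object for the pose topic that turtlesim publishes
var poseTopic = new ROSLIB.Topic({
    ros : rbServer,
    name : '/turtle1/pose',
    messageType : 'turtlesim/Pose'
});

// The callback fires for every message published to the topic
poseTopic.subscribe(function(message) {
    console.log('Turtle at x: ' + message.x + ', y: ' + message.y);
});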

Next Steps

So what’s next? I think I’m going to get myself one of those Baxter robots for $25K, build the appropriate web application and never wash dishes again. Ok, maybe not yet…soon, but not just yet. There are probably a couple of other tracks I can progress on first.
Implementation on Raspberry Pi

I have another long-term goal to build a disruptively affordable mobile robot platform and implement the first one as an outdoor rover. I imagine that the robot will be controlled by an SBC like a Raspberry Pi and involve an Arduino board. I have heard that some people have found it challenging to run ROS on the Raspberry Pi, but it looks like there have been some successes as well. I imagine I would start by just running the minimal amount of ROS on the Raspberry Pi and use my desktop computer for development and debugging. I imagine I could install Apache or Tomcat on the Raspberry Pi, but it may make sense to build a lightweight HTTP server using Node.js and libraries like socket.io. I also want to try to use Cylon.js for tasks like communicating with the Arduino.

Better UI

Ok, I feel like we’re pretty close friends now so I will tell you this: the web page built in this tutorial is not all that attractive or slick. There are a lot of options for improving it:

  • jQuery UI has a number of great widgets
  • jQuery mobile makes it very easy to develop applications for mobile devices
  • I know some great developers that are favoring Ember.js in order to create ambitious web applications

Looking forward to seeing what others do with rosbridge and roslibjs. Many thanks to everyone involved in these projects.


Robotic Explorations

The following story depicts a fictitious (at this point in time and as far as I know) company. If you happen to create a company based on this story please offer me at least a free tour when you make the inevitable fortune.

[Image: Outback view from Chambers Pillar]

I experience one of those brief heart-stopping moments as the email appears in my Inbox. It is the name of the sender that catches my attention: Robotic Explorations. This email is the online invitation to my first self-guided tour. And I am going to explore the desolate reaches of the Australian Outback.

I had done one group tour in a South American rainforest with Robotic Explorations previously. The price was much lower mainly due to the larger number of virtual tourists, but after taking that tour I was determined to put myself in the driver’s seat and make my own path. But I am getting ahead of myself…I should probably explain what Robotic Explorations is all about.

As they say on their website, they offer their customers a unique virtual touring experience aided by a semi-autonomous robot. Basically they have a fleet of these very mobile rover-type robots. They are equipped with some kind of hybrid engine that provides lots of mileage very quietly (which is important when you want to spy on the local wildlife). Their website provides the interface to your robot which their local team deploys to your starting spot. Below is a picture of the interface you access from the web page.

[Image: rover control panel]

So I’m getting all ready to start my trip. I chose a 24-hour time period. I have my computer hooked up to some nice speakers so I can get a good listen into the environment, and I bought a Google Chromecast so I can see what the robot is seeing on my HD big screen.

Ok, I just got the message: “Congratulations, your rover is activated. Happy trails!” And the picture is coming in…wow, so cool. I am virtually in the Australian Outback. I was wondering about bandwidth issues, especially since these tours are often in the middle of nowhere. Apparently Robotic Explorations has addressed that challenge in its touring areas. I have heard explanations ranging from line of sight laser signals beamed from towers to hovering broadband repeaters…whatever it is, the picture and sound quality are great.

The robot is asking me where we should go. Let me ask him for a quick video pan of the area. Alright, we are going to head northwest. The terrain map indicates some type of small forest. Maybe we will see some cool creatures. So I’ll just click on that area and my robotic ambassador will route there as best as he can.

My control panel just flashed a “Rough terrain…” indicator. Uh oh…our first setback. It seems that the rover has tumbled down a hill and is on its side. The panel indicates: “Rover overturned; activating outriggers…”. I can see by the camera picture that he is righting himself, and now we are back on our wheels again. The panel displays “Reattempting previous route with increased torque.” Very carefully the robot ascends the hill and reaches more level ground. Fantastic effort.

I have the side cameras activated as we are driving in case something catches my eye. Wow, something just caught my eye. Not an animal but a really great view of the landscape. Let me stop him now and take a snapshot. Forgot to mention, my trip is linked to my Facebook account so I can post these photos as I take them. The application also puts a pin in my map where I took the snapshot. I’m adding the caption: “Enjoying the late afternoon view of the Australian Outback (with a beer).” Ok, onward…

Hold on, I just saw an alert icon on my display. The laser sweep picked up some movement a little to our east. Let’s check it out. I just clicked on the alert icon where it appeared in the map view and the rover is now re-routing and approaching. One nice thing about their app: you can have different web URLs for your control panel vs. your camera views. So I have the camera views on my big screen and I’m actually using their iPad app to control the robot. The rover is getting close to the source of the motion so I got the message: “Switching to stealth mode”. I clicked “Ok”. I think this runs the motor entirely on battery to make him real quiet on approach. I think I see something. I’m stopping the rover and zooming in. It’s a bunch of birds around a small pond. Strange looking birds, cool. Going to snap another photo for Facebook. Just noticed my friend posted a comment on my last picture: “What the hell are you doing in the Australian Outback?”

I’m back…I’ve been a little lazy about updating this post because I’m really enjoying just wandering around and feeling like I’m out there in the great abandon. Another movement alert up ahead. It’s horses…three small horses wandering by. I click on a horse in the camera display and select “Track with rover”. This action makes the rover follow them from an adjustable offset. We follow them for a ways as they head towards a large rock formation. They enter a narrow ravine between two steep rocks and we continue after them. The robot’s camera adjusts for the dimmer light and activates the picture stabilization. Suddenly one of the horses lets out a snort and they race off. I don’t think we will be able to catch them. I zoom in on the satellite picture to see if we can make it through this ravine. The rover’s LIDAR is optimistic about a way out so we continue.

My attention is drawn to the left side camera. What I first thought was just discolorations on the rock walls that line this ravine are actually…paintings. They remind me of prehistoric cave paintings. It seems to depict some sort of creature with claws being held off by people.
[Image: rock painting]
It’s a little difficult to describe. I’ll take some photos and post on Facebook to see what other people think of it. I remember that my contact at Robotic Explorations indicated my rover was being deployed to a very remote part of the Outback seldom hiked by people due to its desolation. My rover was delivered to its starting point by a small helicopter. Could it be that my photographs are the first to be taken by an outsider? Am I the first visitor to this land to make this discovery? Although my photos will record the latitude and longitude, I am definitely marking this point with a trip pin.

It’s getting dark now. I think I am going to take a look at the sky with the panoramic. The robot camera is adjusting for the dim light. Now I can see stars…lots of stars. Another photo opp: a beautiful night in the Australian Outback. What’s that? Rover picked up a significant sound reading not too far off. I click on the direction to indicate my approval. Let’s go.

Once again we go into stealth mode as we approach objects in motion. Too dark so night vision is enabled. Not as brilliant as day time shots but what are you going to do? Ooh, I just saw the reflection of some glowing eyes (enough moonlight for that I guess). Wow, about four…nope, at least eight dog things. Maybe these are dingoes. Did I mention I’m not an expert on the Australian Outback? Doesn’t matter to me; makes it even more interesting and surprising in some ways. Yeah, I’m seeing a pack of these dogs eating something on the ground. Looks like feathers, some kind of very big bird. I’m going to tell Rover to move in very slowly. This warrants video; I have enough allowance for some footage on this trip. Audio is good too. Can hear them yelping at each other as they compete for the best eating spots. After a while, they disperse into the darkness.

I should mention I really can’t go anywhere I want. The map has boundaries, I guess limited to where Robotic Explorations has worked out touring rights. If I try to go beyond the boundaries, the robot won’t let me. But it doesn’t matter, the area is bigger than what I could probably see in a year. Speaking of that, this tour route (the route of points to where I actually go) will be saved with my account. So if I do happen to want to come back here, I can overlay previous trips to revisit key pinned places or make sure I explore new areas.

I’m only a couple of hours into this trip but I can’t explain how cool it is to feel like I’m exploring this part of the world “on my own”. I should mention that Robotic Explorations has another option that is more expensive but sounds unbelievably cool. You can rent a rover that is also a docking station for a UAV drone. The drone uses the fuel-powered rover as a charging station when it has to. Basically you can release the drone and get an aerial view (and photos and videos) from where you are. You control the altitude with the control panel and click the destination as well – just click on the map and the drone will go there. There is also a feature to track an object that the drone camera recognizes, so the drone will follow your target with its camera if it moves. With one click you can return the drone to the docking station rover. I have also heard rumors of plans for submarine tours in the future. Going to start saving my pennies for the next trip.

So how should I answer all these “where ARE you??!!’ comments popping up in Facebook?

( 8^])X

Photos that were not purchased were taken from Wikimedia Commons or shared by friends of mine.
Wireframe mockups were developed with draw.io.