Fun with Computer Vision and JavaFX 2.2

Experimenting with computer vision used to be a hassle every time I tried it, mainly because of poor camera APIs, interfaces and drivers. Historically, the Video4Linux (V4L) driver in Linux was anything but stable with PCI capture cards, and plugging in a USB camera usually resulted in a broken system or, at best, nothing at all, at least when I tried it.

But nowadays almost every USB camera I plug in just works, both in Ubuntu and Mac OS X. Amazing!

So, I decided to build something fun around computer vision.

The hardware I used was an old kit robot arm powered by five R/C servos and one of those modern USB web cameras that actually work. I replaced the robot arm's gripper with the (working) camera and started building some kind of robot application in IntelliJ. The idea was to make the robot arm behave like a pet, using the extra degrees of freedom that the otherwise unnecessary servos and joints of the arm provide. With those extra servos it seemed possible to have the robot express emotions such as curiosity, defensiveness or indifference.

I started by downloading and compiling OpenCV. After some fighting with OpenCV and JavaCV I was able to run the Java demo application, built the way I wanted it with Maven. I went with the face detection algorithm.
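
To give an idea of what that first step looks like, here is a minimal face-detection loop. The post used JavaCV on top of OpenCV; the sketch below uses OpenCV's own bundled Java bindings instead (assuming OpenCV 3 or later built with Java support), and the cascade file path and camera index are assumptions.

```java
import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.core.MatOfRect;
import org.opencv.core.Rect;
import org.opencv.objdetect.CascadeClassifier;
import org.opencv.videoio.VideoCapture;

public class FaceDetectLoop {
    public static void main(String[] args) {
        // Native OpenCV library must be on java.library.path
        System.loadLibrary(Core.NATIVE_LIBRARY_NAME);

        // Haar cascade shipped with OpenCV - adjust the path to your install
        CascadeClassifier faceDetector =
                new CascadeClassifier("haarcascade_frontalface_default.xml");
        VideoCapture camera = new VideoCapture(0); // first USB camera

        Mat frame = new Mat();
        MatOfRect faces = new MatOfRect();
        // Grab a limited number of frames for the example
        for (int i = 0; i < 300 && camera.read(frame); i++) {
            faceDetector.detectMultiScale(frame, faces);
            for (Rect face : faces.toArray()) {
                // Center of the detected face - this is what a tracking mode would steer towards
                double cx = face.x + face.width / 2.0;
                double cy = face.y + face.height / 2.0;
                System.out.printf("face at (%.0f, %.0f)%n", cx, cy);
            }
        }
        camera.release();
    }
}
```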

A serious robot application should of course not be a desktop application but an embedded realtime application. I made it a desktop application anyway, because that makes it easier to integrate the visual display and controls.

I selected JavaFX 2.2 for the user interface and at the same time learned how to CSS-style a JavaFX application. That was straightforward, and the result looked nice.
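
For reference, attaching a stylesheet in JavaFX is essentially a one-liner on the Scene. The stylesheet name and style class below are made up for the example, not taken from the actual application.

```java
import javafx.application.Application;
import javafx.scene.Scene;
import javafx.scene.control.Button;
import javafx.scene.layout.StackPane;
import javafx.stage.Stage;

public class StyledRobotUi extends Application {
    @Override
    public void start(Stage stage) {
        Button searchButton = new Button("Search");
        searchButton.getStyleClass().add("mode-button"); // hypothetical style class defined in the CSS

        StackPane root = new StackPane();
        root.getChildren().add(searchButton);

        Scene scene = new Scene(root, 400, 300);
        // robot.css is a hypothetical stylesheet on the classpath next to this class
        scene.getStylesheets().add(getClass().getResource("robot.css").toExternalForm());

        stage.setScene(scene);
        stage.show();
    }

    public static void main(String[] args) {
        launch(args);
    }
}
```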

At a late-evening hacking event with the CAG Java group we later built the foundation of the robot application: two tick frequencies, one for driving the servo controller (MiniSCC) and one for vision processing, a design for managing robot arm poses, and of course a brain module.
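
The post does not show the actual code, but a two-tick design along those lines could be sketched like this. The tick periods and class names are my assumptions, not values from the real application.

```java
import java.util.concurrent.Executors;
import java.util.concurrent.ScheduledExecutorService;
import java.util.concurrent.TimeUnit;

public class RobotTicker {
    // The actual tick periods are not stated in the post; these are placeholder values
    private static final long SERVO_TICK_MS = 20;    // fast loop: push positions to the servo controller
    private static final long VISION_TICK_MS = 200;  // slow loop: grab a frame and run face detection

    private final ScheduledExecutorService scheduler = Executors.newScheduledThreadPool(2);

    public void start(Runnable servoTick, Runnable visionTick) {
        scheduler.scheduleAtFixedRate(servoTick, 0, SERVO_TICK_MS, TimeUnit.MILLISECONDS);
        scheduler.scheduleAtFixedRate(visionTick, 0, VISION_TICK_MS, TimeUnit.MILLISECONDS);
    }

    public void stop() {
        scheduler.shutdownNow();
    }
}
```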

The robot can be controlled manually from the UI using sliders, or with buttons that select fixed poses, or it can be put into one of three auto modes:

Searching – The robot looks in random directions at random intervals until a face is detected and is reasonably stable in the view.

Tracking – The servos try to follow a selected face in the camera view until the robot loses it.

Self aware – A mode that is not completely coded yet. However, there is a simplified version in place right now which automatically switches between the other two modes (a sketch of how that switching could look follows below). For human safety reasons I am planning to hard-wire the Self aware mode to be automatically deactivated after a few minutes, as a dead man's handle. Just in case.
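
To make the mode handling concrete, here is one way the switching could be sketched. The mode names come from the list above, but the logic, the class and method names, and the timeout value are my assumptions rather than the actual brain module.

```java
public class Brain {
    public enum Mode { SEARCHING, TRACKING, SELF_AWARE }

    // "A few minutes" in the post; the exact value here is an assumption
    private static final long SELF_AWARE_TIMEOUT_MS = 3 * 60 * 1000;

    private Mode selectedMode = Mode.SEARCHING;
    private long selfAwareStartedAt;

    /** Called when the user picks a mode in the UI. */
    public void select(Mode mode) {
        selectedMode = mode;
        if (mode == Mode.SELF_AWARE) {
            selfAwareStartedAt = System.currentTimeMillis();
        }
    }

    /** Called on every vision tick; returns the behavior the servos should run right now. */
    public Mode currentBehavior(boolean faceStableInView) {
        if (selectedMode == Mode.SELF_AWARE) {
            // Dead man's handle: drop back to plain searching after the timeout
            if (System.currentTimeMillis() - selfAwareStartedAt > SELF_AWARE_TIMEOUT_MS) {
                selectedMode = Mode.SEARCHING;
                return Mode.SEARCHING;
            }
            // Simplified self-aware behavior: track when a face is stable, otherwise search
            return faceStableInView ? Mode.TRACKING : Mode.SEARCHING;
        }
        return selectedMode;
    }
}
```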

About Daniel Marell

IT Consultant at CAG Contactor. Software developer and architect focused on Java EE/SE.