A generic discussion about Linux, NI and the role of digital control

The thread about «Plite» has been moved here: Plite - a new software for kite

This thread is more free-form from now on

Machine vision support would be a powerful feature to monitor system state and support airspace sense-and-avoid requirements. NASA LaRC did a nice proof-of-concept.

A METAR parser and system logic to land and relaunch according to weather events is another operationally advanced and powerful feature.

This project starts out more as a worthy conceptual test-bed than a product idea.

It’s not a product. Just a springboard for anyone who would like to do AWE at small scale with little effort.

It could become part of a product if someone coupled it with a kite and sensors/actuators.

Machine vision probably requires more hardware. The METAR parser would be simple to implement on top of this (10 minutes of coding…)

Getting a complete system is the main problem. This software just acts as a starting point for those who can’t be bothered to do the quite difficult low-level interfacing.

Machine vision mostly just needs a camera, provided central processing resources are ample enough to run the vision software.

METAR parsing is simple. The complex part is how to interpret the data for a fly/no-fly response.
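To make that split concrete: the parsing half really is a few lines, while the interpretation half is where the judgement lives. A minimal Python sketch of the idea (wind group only; the helper names and all thresholds below are made up for illustration, not operational values):

```python
import re

# Matches a METAR wind group such as "18012G25KT" or "VRB05KT".
WIND_GROUP = re.compile(r"\b(\d{3}|VRB)(\d{2,3})(?:G(\d{2,3}))?KT\b")

def parse_wind(metar: str):
    """Extract (direction, speed_kt, gust_kt) from a raw METAR string."""
    m = WIND_GROUP.search(metar)
    if not m:
        return None
    direction, speed, gust = m.group(1), int(m.group(2)), m.group(3)
    return direction, speed, int(gust) if gust else None

def fly_decision(metar: str, max_speed=20, max_gust=25) -> str:
    """Crude fly/no-fly logic: command a landing above hypothetical wind limits."""
    wind = parse_wind(metar)
    if wind is None:
        return "NO-FLY"  # unreadable report: fail safe
    _, speed, gust = wind
    if speed > max_speed or (gust or 0) > max_gust:
        return "LAND"
    return "FLY"

print(fly_decision("KJFK 251951Z 18012KT 10SM FEW250 24/13 A3012"))    # FLY
print(fly_decision("KJFK 251951Z 18022G30KT 10SM FEW250 24/13 A3012")) # LAND
```

The regex part is the ten-minute job; choosing `max_speed` and `max_gust` for a real kite, and deciding what a missing or stale report should mean, is the operationally hard part.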

Both of these capabilities are also useful for pilot-supervised flight with ground-based control (no comm-link dependence).

Here’s the Linux-compatible vision software NASA used (NI RIO), developed in Austin. Many AWE players use NI embedded control-


A compatible ruggedized camera-


You could maybe use that, and then use «Plite» if you need to interface any RC gear or load sensors

The NI platform also natively supports servos and sensors, like BeagleBone. It’s not so much a question of which digital platform controls an AWES, but which AWES architecture is most worth controlling.

Get the AWES architecture right, and even a flawed digital control implementation won’t matter much. Get the architecture wrong, and even a perfect digital implementation won’t save it.

This sentence should be taught in universities involved in AWES.

Let anyone prove that the NI system is suitable for AWE at mini scale (or any scale). Anyone other than me, that is. Why would you sell NI in this thread at all? LabVIEW is Windows-based, and I would think it would not run well on a BeagleBone.

Also - what would be the cost of the NI system? Engineering costs?

So, let anyone choose their tools of choice. If you are keen on BeagleBone and Linux, «Plite» may be a good place to be.

NASA LaRC found NI suitable at mini-scale, and Makani and others at larger scales. I’m not sure what proof-of-suitability standard you propose to apply equally to BeagleBone. It’s true that NI and BeagleBone are not very compatible out of the box. I like them both, for their respective advantages.

Austin’s Silicon Labs embedded controllers have been surging as well in recent years, with strong Windows/Linux cross-support. Austin’s famous digital-controls culture began in the ’70s, with TI and Motorola microcontrollers programmed in assembler. NI emerged from this community. I was there. KiteShip recruited me for my connections to this background. Our software culture happily emulates Linux on Windows and Windows on Linux, as needed.

For pioneering R&D, we eat anything on the menu, and clean-up later :slight_smile:

The simple question is: how much competency and money is required to experiment with digital control in AWE? Right now many, many things are still untried. One hurdle is getting the software to a point where an algorithm could be implemented.

«Plite» is right now a hobbyist project, and it will remain so. I have had a few experiences with NI, and none were good. This is coming from a software developer. For someone less into software, NI may be a good choice. But the NI hardware is pretty bulky, I believe, leading to quite large initial designs.

The BeagleBone Blue (with built-in wireless) gives us a CPU capable of running a real operating system (as opposed to Arduino or other bare-metal C++ systems). That means WiFi, file storage, and the option of running any Linux-supported software. It has two PRUs, separate real-time processor cores that handle timing-critical work without going through the operating system. It has a battery plug for 2S LiPo batteries and a 5V bus, soldered directly to pins that take standard hobby servos. The 5V rail is of course software-switched on and off.

To create this from nothing takes a lot of effort. My goal is to reuse this capable and cheap design, adding in particular support for load sensors, which I find useful for AWE purposes.
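On the load-sensor point: most cheap load cells go through an amplifier board (HX711-style) that hands you raw counts, so the software side is largely a two-point linear calibration. A minimal sketch, with all readings and names hypothetical:

```python
class LoadCell:
    """Two-point linear calibration for a load-cell amplifier (HX711-style).

    raw_zero: raw counts with no load; raw_ref: raw counts with a known
    reference force applied. All example values here are made up.
    """

    def __init__(self, raw_zero: int, raw_ref: int, ref_newtons: float):
        self.raw_zero = raw_zero
        # Newtons per raw count, from the two calibration points.
        self.scale = ref_newtons / (raw_ref - raw_zero)

    def force(self, raw: int) -> float:
        """Convert a raw reading to force in newtons."""
        return (raw - self.raw_zero) * self.scale

# Example calibration: 8400 counts unloaded, 92400 counts at 50 N.
cell = LoadCell(raw_zero=8400, raw_ref=92400, ref_newtons=50.0)
print(round(cell.force(50400), 2))  # 25.0 (halfway between the two points)
```

In a real tether-tension setup you would also want averaging and an overload check, but the core conversion stays this simple.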

The good news is that very little money or competency is required to experiment in digital control in AWE.

The bad news is that it will take billions of dollars and decades of work to perfect full automation of industrial-scale AWES. This is evident not just from current AWES MTBF statistics, but from similar automation cases like driverless vehicles, which currently require driver intervention every few kilometers, even after decades of R&D. Makani just spent around USD 300M to prove that lots of money was not enough, yet.

The reasons for slow progress are the same in essence. Discrete sensors tend to present a very reduced “soda straw” picture of system state. Top-down AWES control logic is too crude. Machine learning is too opaque for aviation safety.

It does not matter whether the control engineer trains with NI or BeagleBoard, Linux or Windows CE, or any of the other embedded control environments; none of them is going to “solve” AWE’s current race to best define an optimal physical architecture while automation in general matures elsewhere.

The best a software architect can do right now is define class-based AWES application meta-logic, and pseudo-code that in principle can run on any OS. Playing with sensors, servos, comm-links, and kites is more student-level or hobbyist participation in AWE: welcome, but not the key challenge.

For “advanced” active-control research examples, an AWES METAR interpreter could begin by watching barometric data for a sudden drop in pressure, or a WEA interpreter could watch for warnings, to trigger a landing command. That’s a far higher level of system programming than debugging any chosen hardware/OS platform.
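A pressure-drop trigger like that is pure logic and platform-independent, so it can be sketched without touching any hardware stack. A minimal example, with made-up window and threshold values:

```python
from collections import deque

class PressureWatch:
    """Command a landing on a sudden barometric pressure drop.

    Hypothetical policy: flag if the current reading sits more than
    `drop_hpa` below the maximum seen in the last `window` samples
    (e.g. one sample per minute).
    """

    def __init__(self, window: int = 30, drop_hpa: float = 3.0):
        self.samples = deque(maxlen=window)
        self.drop_hpa = drop_hpa

    def update(self, pressure_hpa: float) -> bool:
        """Feed a new sample; return True if a landing should be commanded."""
        self.samples.append(pressure_hpa)
        return max(self.samples) - pressure_hpa > self.drop_hpa

watch = PressureWatch(window=30, drop_hpa=3.0)
readings = [1013.2] * 10 + [1012.0, 1010.5, 1009.8]  # squall-line-style drop
alerts = [watch.update(p) for p in readings]
print(alerts[-1])  # True: about 3.4 hPa below the window maximum
```

The hard system-programming questions start after this sketch: what window and threshold match real gust fronts, and how the trigger composes with METAR/WEA inputs into one fly/no-fly decision.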

Machine vision is going to be a very powerful AWES state-estimation tool. NASA LaRC using NI LabVIEW-RIO for student AWE collaboration is just one success case (nominal sweeping). Surely there are machine-vision hackers adding vision functionality to BeagleBoard. Even though both platforms are from Texas, I do not recommend either for AWE. When the time comes, let the CAN bus prevail.

Finally, “Linux vs NI” is not a well-formed topic; NI has a full pure Linux product line, or you can kludge Linux with other supported OSes across a system architecture, as Makani seems to have done (with custom FPGA logic as well)-