Friday, June 13, 2008
So, I've been working on the autopilot for the last few days, now that my undergraduate classes are over. Since my last post, I've managed to make a fair bit of progress.
A brief rundown of changes to my autopilot:
--lowered the saturations to ~+/- 40-50 degrees for each control surface and 0.1-0.5 throttle
--added an airspeed hold function based on the altitude hold
--created a desired heading -> desired bank angle function
--created a desired GPS location -> desired heading function
Bumping the saturations down is helping a fair bit. The airplane doesn't try to do too many stupid things now--like trying to deflect the elevator so far that it goes unstable. But it still has this funny problem where the steady-state aileron command during level flight is nonzero (and negative). I haven't figured out what's causing it yet...but it's annoying. Fortunately, it's not affecting the results too much (I think). The airplane model still reacts in a fairly sensible-looking manner.
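For anyone wondering, "saturation" here just means clamping each command to a hard limit before it goes to the model. A minimal sketch of the idea, using the rough ranges listed above (my own illustration, not our actual Simulink saturation blocks):

```cpp
#include <algorithm>

// Clamp a raw controller output to an actuator's allowed range.
// The limits below mirror the rough ranges mentioned above; the function
// itself is just an illustration, not our Simulink saturation blocks.
double saturate(double command, double lo, double hi) {
    return std::min(std::max(command, lo), hi);
}

// Example:
//   saturate(-72.0, -45.0, 45.0)  -> -45.0 (elevator, degrees)
//   saturate(0.85, 0.1, 0.5)      ->  0.5  (throttle fraction)
```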
The airspeed hold is a bit of a question mark. We need to figure out how best to integrate it into our command structure--for example, giving it priority when a speed command is issued.
Jerry worked out the basic heading -> desired bank angle algorithm earlier in the month, but I wasn't able to incorporate it into the autopilot until this afternoon. We ran into a problem getting it to turn left, however; it would simply refuse to do so, and would dive into the ground when given a command to turn in that direction. Several hours of debugging later, I noticed that our preset range for sideslip angle was wrong; it was [0 10] degrees...and not something like [-20 20]. Changing that fixed our problem.
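For the curious, mapping heading error to a bank-angle command boils down to a proportional gain plus a limit. This is only a rough sketch of the idea--not Jerry's actual algorithm--and the gain and bank limit are made-up numbers:

```cpp
#include <cmath>

// Map a heading error to a desired bank angle (degrees).
// A rough sketch only: the gain and bank limit are placeholders,
// not the values in our autopilot.
double headingToBank(double desiredHeadingDeg, double currentHeadingDeg) {
    // Wrap the error into [-180, 180) so the airplane takes the short way around.
    double error = std::fmod(desiredHeadingDeg - currentHeadingDeg + 540.0, 360.0) - 180.0;

    const double kP = 1.0;          // deg of bank per deg of heading error (assumed)
    const double maxBankDeg = 30.0; // bank limit (assumed)

    double bank = kP * error;
    if (bank > maxBankDeg)  bank = maxBankDeg;
    if (bank < -maxBankDeg) bank = -maxBankDeg;
    return bank;
}
```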
I also worked out the math for the GPS coordinates -> desired heading function. Given a starting GPS location and GPS coordinates for a waypoint, I can issue a command to track that point. The airplane will then turn and attempt to reach that point. Here's a nifty plot of my initial test of this function:
The airplane starts at the red circle and attempts to reach the black diamond. As you can see, it misses the waypoint and gets confused; subsequently, it begins to circle about that point in an attempt to reach it.
Very rudimentary...but it's a great proof of concept. I'm pretty excited about it, anyway. The next step from here is to set up some more control logic so that the airplane has a preset default command in the event that no waypoints can be found. For example, it could just level itself out after reaching a waypoint. But perhaps my only disappointment is that I'm graduating; next year, I'll have to leave this work to someone else. Maybe I should try to finish as much of this as I can....
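Since I mentioned the GPS coordinates -> desired heading math above, here's roughly what it amounts to: a bearing calculation from the current fix to the waypoint. This is my own illustration using a flat-earth approximation (fine over waypoint-scale distances), not the exact code in the model:

```cpp
#include <cmath>

// Desired heading (degrees, 0 = north, clockwise positive) from the current
// GPS fix to a waypoint. Flat-earth approximation, adequate for waypoints a
// few kilometers apart. Illustrative only.
double desiredHeadingDeg(double latDeg, double lonDeg,
                         double wpLatDeg, double wpLonDeg) {
    const double kDegToRad = 3.14159265358979323846 / 180.0;
    // Longitude lines converge toward the poles, so the east-west distance
    // per degree of longitude shrinks by cos(latitude).
    double dEast  = (wpLonDeg - lonDeg) * std::cos(latDeg * kDegToRad);
    double dNorth = (wpLatDeg - latDeg);
    double headingDeg = std::atan2(dEast, dNorth) / kDegToRad;  // (-180, 180]
    return (headingDeg < 0.0) ? headingDeg + 360.0 : headingDeg;
}
```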
Monday, May 19, 2008
Visions of the Future
So, since I completed the maiden voyages of the UCLA AIAA blimp project, I've been hard at work designing an autopilot for the UCLA AIAA Autonomous UAV project.
It was slow going, and it's only recently that I've made some progress. In fact, we've developed the skeleton for the prototype version of the autopilot. Demonstration images will be forthcoming as we develop the control logic and fine-tune the control systems.
Introduction
Our autopilot makes extensive use of PID (proportional, integral, derivative) controllers. A [haphazard] method of multiple loop closure was used to develop the control systems. PID loops were used on many of the primary aircraft states (e.g. yaw rate, pitch rate and angle, altitude). These loops fed into commands for the aileron, elevator, rudder, and throttle.
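For readers who haven't met PID controllers before, each loop is conceptually tiny. A minimal discrete-time sketch (my own, not pulled from our Simulink blocks; the gains in the usage comment are placeholders):

```cpp
// Minimal discrete-time PID controller. Illustrative only; our actual
// controllers live in Simulink blocks, and these gains are placeholders.
struct Pid {
    double kp, ki, kd;      // proportional, integral, derivative gains
    double integral = 0.0;  // accumulated error
    double prevError = 0.0; // error from the previous step

    // error = setpoint - measurement; dt = time step in seconds
    double update(double error, double dt) {
        integral += error * dt;
        double derivative = (error - prevError) / dt;
        prevError = error;
        return kp * error + ki * integral + kd * derivative;
    }
};

// Example of closing one loop: pitch angle error -> elevator command.
//   Pid pitchLoop{0.5, 0.1, 0.05};
//   double elevatorCmd = pitchLoop.update(pitchCmd - pitchMeasured, 0.02);
```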
The autopilot was designed in Simulink with the aid of the Aerosim Blockset. This third-party blockset for Simulink provides six-degree-of-freedom models for aircraft dynamics, including provisions for modeling the dynamics of piston engines. It also comes with demo models and programs showcasing how to use the blockset.
Current Situation
Currently, the autopilot is capable of responding to commands changing the airplane's bank angle and altitude. In other words, we are capable (in theory) of responding to commands to change heading and altitude. These are the basic elements--as far as I can tell, anyway--needed to perform the waypoint navigation tasks required for the AUAV contest we are currently registered for.
Setting up Aerosim to model our airplane turned out to be quite difficult. The chief difficulty lay in the fact that there was little data for us to use to model the engine and propeller. Fortunately, a makeshift solution was provided with the software. One of the demo aircraft included with the blockset was the Aerosonde, a UAV remarkably similar to the one we designed, with an engine of comparable size. But from what I read of the design, the engine used in that aircraft was heavily modified. Thus, using the data for this engine will almost certainly throw off any performance results from the Aerosim model. We decided to use these data anyway, however, for the purposes of getting the autopilot design moving.
In developing the controls, I made extensive use of a set of notes from an MIT course on aircraft dynamics and controls. Jon provided those, so he has the citation. They were very helpful in understanding what I was looking to do with each control system. Implementing a prototype lateral (heading) controller proved to be rather straightforward. Longitudinal (pitch and altitude) control, on the other hand, turned out to be much more difficult. Given a fixed throttle, the elevator had trouble holding altitude alone. I was forced to add a throttle element into the altitude control system (contrary to the notes I was using). This allowed the control system to adjust the throttle setting to a more "manageable" level at which the elevator could effectively contribute to maintaining level flight. Unfortunately, motor dynamics are slow, which causes the model to take much longer to respond to altitude commands than to heading commands.
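Structurally, the altitude hold now looks something like the sketch below: the altitude error drives both a pitch/elevator path and a throttle path. This is a simplified rendering of the idea, not the actual Simulink model, and every gain and limit is a placeholder.

```cpp
// Simplified altitude-hold structure: altitude error feeds both the
// elevator (through a pitch command) and the throttle. My own sketch of
// the idea described above; all gains and limits are placeholders.
struct AltitudeHold {
    double kAltToPitch = 0.05;    // deg of pitch per meter of altitude error
    double kPitchToElev = -1.5;   // deg of elevator per deg of pitch error
    double kAltToThrottle = 0.01; // throttle per meter of altitude error
    double throttleTrim = 0.3;    // nominal cruise throttle

    void update(double altErrorMeters, double pitchDeg,
                double& elevatorDeg, double& throttle) const {
        double pitchCmd = kAltToPitch * altErrorMeters;            // outer loop
        elevatorDeg = kPitchToElev * (pitchCmd - pitchDeg);        // inner loop
        throttle = throttleTrim + kAltToThrottle * altErrorMeters; // throttle assist
        // Saturation would clamp both outputs to the actuator limits here.
        if (throttle < 0.1) throttle = 0.1;
        if (throttle > 0.5) throttle = 0.5;
    }
};
```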
Next, we'll think about how we can issue speed hold commands that do not conflict with the altitude control and destabilize the airplane. We'll also look, as mentioned earlier, at developing the control logic that will be used to generate the appropriate commands. For example, when issuing a heading change command, we may want to have the aircraft step through incremental heading changes whose size depends on how large the difference between the current and desired heading is.
The responses are rather sensitive, however. We need to do some detailed analysis of the system to see where it fails. We would also like to find out its responses to some complex signals that are more representative of what we would ask it to do. Concurrently, I suppose, we will have to develop the control logic determining the commands issued to the autopilot. The end product, ideally, will be an interface for inputting a set of GPS coordinates and altitudes which will be translated into commands such that the autopilot can navigate a set of waypoints. If it works in simulation, we'll go out and test it. We're still waiting on the sensor fusion, though, so flight testing is still a long ways away.
But speaking of testing....Jon and I have both purchased an Easy Star. I'm expecting mine--the kit version, actually--this week. Ideally, it would be a good platform on which to mount the prototyped version of our electronics and control systems. We could then go out to a large park (possibly the RC airfield in Van Nuys) and test out our integrated systems. In the end, though, we'll have to test it in the contest airplane, and that'll be a whole different ballgame. I'll need to get a better feel for the tuning of our control system before we do that.
Anyway, I'm rather proud of what we've accomplished so far. It's a whole lot more than a lot of students at UCLA ever set out to do in college. I'll be a little more relieved if we see some positive, concrete results, though. But we have some hurdles to jump before we reach that point. Building the contest plane, for example. There are some manufacturing and design problems that cropped up there...but those shall wait for another post. Wish us luck!
Friday, February 29, 2008
Late February Updates
Well, there's a lot to cover. First off, we've decided on the necessity of an avionics test bed. After looking at a professor's thesis papers on an autonomous plane he worked on--and noting that they had a plane dedicated to testing avionics that crashed three times--it seemed natural that we should make one as well. Considering our inexperience and how much we would like our competition plane not to crash, we're going to make a sort of bare-bones skeleton plane. We're not kidding about the skeleton part, either; it'll probably be a carbon fiber tube with a foam wing and tail attached, an electric motor, some batteries, and a box to hold the avionics we're testing. It should be straightforward to design and build. Having that done early will give us extra time to work on our competition plane.
Looking into the camera transmitter part of things, it looks like one of us will need an amateur radio license to legally handle the power that we'll be transmitting. I found this site that talks about the steps to take to get a license. I also found out that this book is apparently the Bible for studying for the test. Hopefully that book will arrive soon and the license not too long after.
On the electronics side of things, I've got a few parts ordered and coming in (hopefully I can pick them up tomorrow): voltage regulators, 16 MHz crystals for the Arduino ATmega168's, and an AVR programmer, among other things. The programmer is cool because it's a first step toward moving away from the Arduino and programming directly for the microcontroller. Once those IMU components come in, I can start testing them integrated with the multiplexing scheme. I also need to test the timing in the multiplexing scheme by doing some pinging.
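The timing test I have in mind is just a round-trip ping: the collector sends a query byte to a buffer microcontroller and measures how long the reply takes. A rough Arduino-style sketch of that test (the baud rate, query/reply bytes, and timeout are assumptions, and this isn't tested hardware code):

```cpp
// Rough Arduino-style sketch for timing the multiplexed serial link:
// send a query byte to a buffer ATmega168 and measure the round trip.
// The baud rate, query/reply bytes, and timeout are assumptions.

const byte QUERY = '?';
const byte REPLY = '!';

void setup() {
    Serial.begin(19200);   // link to the buffer microcontroller (assumed rate)
}

void loop() {
    unsigned long start = micros();
    Serial.write(QUERY);

    // Wait (with a timeout) for the buffer's one-byte acknowledgment.
    while (!Serial.available() && micros() - start < 50000UL) { }

    if (Serial.available() && Serial.read() == REPLY) {
        unsigned long roundTripMicros = micros() - start;
        // In the real test this would be logged back to the PC for inspection.
        (void)roundTripMicros;
    }
    delay(100);
}
```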
For the data link, we're reevaluating our options again. Before, I was really set on the Digi XStream OEM RF Modem at a 19200 baud rate. It gave us a range of about 7 miles, but the kicker was its $150 price tag. The XBee-PRO 802.15.4 modules sport a range of about 1 mile with a much nicer price tag of $32. However, the trick is that we still don't have a concrete maximum flight range. The rules keep talking about staying within the airfield, but the airfield is about a mile by a mile (one square mile). If we're in the center, then a 1-mile range will probably do fine. If we're positioned at the edge of the airfield, then we're going to need a range of greater than 1 mile. Which is it? Well, I've got an email on its way to the contest contact to see if I can get a clearer picture. That's what's holding up the RF modem selection, along with antenna selection.
We still don't know what camera we want. Today we were looking at camera transmitters to see if we could find a range. Of course, bad news. The most powerful 2.4 GHz transmitters we've seen so far (of what little we've looked at) are about 1 Watt with a range of 300 meters (less than a third of a mile). That's making us think we'd want to send the video over 900 MHz to get more range. However, I don't want a 900 MHz camera transmitter and a 900 MHz data link on the same plane; the interference might be intolerable. This gives the XBee-PRO data link some more credibility, since it operates at 2.4 GHz instead. Before we start making conclusions, though, we haven't actually found a 900 MHz camera transmitter with the range we want yet. Well, time will tell.
As for the camera gimbal? Unfortunately, I wasted time wondering whether we could get away with a single-axis gimbal; the contest requires that we be able to see within a 60-degree cone below the plane in all directions, which basically means we need a two-axis gimbal. We've been looking at the Lynxmotion one as a quick and simple solution to the problem.
We've looked more into the need for a pitot-static system. From what we can tell, if we want airspeed relative to the plane, the pitot-static system is unbeatable in its reliability. It'll be costly, though, with the tube itself costing $80 and the pressure transducers another $35 to $65 each (we need two). We've already considered software methods, anemometers, hot-wire anemometers, and turbine meters. One of my professors says that a pitot-static tube will probably be what we end up with anyway. Finding the pressure transducers has been an uphill battle, but lucky for us, Matt found a company named Servoflo that sells them in a beautiful DIP package. The problem is that we haven't figured out how to order from them yet.
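For context, the reason the pitot-static tube is so attractive is that airspeed falls straight out of the measured pressure difference via Bernoulli's equation. A quick sketch of the conversion the flight computer would eventually do (standard low-speed formula; the function and the sea-level density choice are my own illustration):

```cpp
#include <cmath>

// Indicated airspeed from a pitot-static pressure difference (Bernoulli):
//   v = sqrt(2 * (p_total - p_static) / rho)
// Standard relation for low-speed (incompressible) flow; rho is taken as
// sea-level standard air density here.
double airspeedFromPressure(double totalPressurePa, double staticPressurePa) {
    const double rho = 1.225;  // kg/m^3, sea-level standard air density
    double dynamicPressure = totalPressurePa - staticPressurePa;
    if (dynamicPressure < 0.0) dynamicPressure = 0.0;  // guard against sensor noise
    return std::sqrt(2.0 * dynamicPressure / rho);     // m/s
}
```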
Anyway, I think that's a pretty good update for now. Time to crash.
Monday, February 25, 2008
Multiplexing Scheme Works
Well, apparently that multiplexing scheme I was merely hoping would work actually does work. It manages to pick up data from two ATmega168's acting as buffers for other sensor data. One buffers the GPS data and the other buffers ADC conversions of soon-to-be accelerometer and rate-gyro data (just ordered an ADXL330 and three ADXRS300's today). A sensor-collector ATmega168 queries each buffer for data. If data is unavailable, it simply moves on to the next sensor. If data is available, it reads it and echoes it back to the computer for debugging purposes. Eventually the sensor collector will filter the signal and pre-process it for the flight computer in another ATmega168. Why so many ATmega168's? Well, the Arduino makes it so easy to work with the ATmega168 that I figured it'd be better to just go with what we know works. Anyway, check out the pictures:
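For the record, the collector side of that query loop looks roughly like the sketch below (Arduino-style C++). The mux select pin, query byte, and timing are my own placeholders rather than the real firmware:

```cpp
// Rough sketch of the sensor collector's polling loop (Arduino-style C++).
// It points the multiplexed UART at each buffer ATmega168 in turn, asks for
// data, and echoes anything it gets back for debugging.
// Mux pin, query byte, and delays are placeholders, not the real firmware.

const int MUX_SELECT_PIN = 7;   // hypothetical: selects GPS buffer vs. ADC buffer
const byte QUERY = '?';

void setup() {
    pinMode(MUX_SELECT_PIN, OUTPUT);
    Serial.begin(19200);
}

void pollBuffer(int channel) {
    digitalWrite(MUX_SELECT_PIN, channel);  // route the UART to this buffer
    Serial.write(QUERY);
    delay(5);                               // give the buffer time to answer

    // If the buffer had nothing, just move on; otherwise echo what arrived.
    while (Serial.available()) {
        Serial.write(Serial.read());        // echo for debugging
    }
}

void loop() {
    pollBuffer(LOW);   // GPS buffer
    pollBuffer(HIGH);  // accelerometer / rate-gyro buffer
    delay(50);
}
```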
Thursday, February 14, 2008
Updated Components Diagram
This is still preliminary, but it's been updated to show more detail and some new decisions. Instead of the IMU package, it looks like we're going to go for the raw components: one triple-axis accelerometer (the one used in Wiimotes, actually) and three rate gyros. We'll have to write our own firmware to integrate the gyro rates and come up with actual position and orientation in conjunction with the GPS module. This came up after a discussion with our advisor Damian Toohey. It'll be more work and a bit riskier, but it's also probably more fun and we'll get a lot more from it. I also detailed how I plan on using multiple microcontrollers. This still needs to be proven out, since there are concerns about communication speed involving both the multiplexers and controller-to-controller communication.
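The core of that firmware is numerically integrating each gyro rate and nudging the result toward the accelerometer's gravity-derived angle so the integration doesn't drift off. A bare-bones sketch of the idea for a single axis (a simple complementary filter; the 0.98 blend factor and the names are mine, not a decided design):

```cpp
#include <cmath>

// Bare-bones attitude estimate for one axis: integrate the rate gyro and
// blend in the accelerometer's gravity-derived angle to cancel drift.
// A simple complementary filter; the 0.98 blend factor is an assumption.
struct AxisEstimator {
    double angleDeg = 0.0;

    // gyroRateDps: rate gyro output in deg/s
    // accelAngleDeg: angle inferred from the accelerometer (gravity vector)
    // dt: time step in seconds
    void update(double gyroRateDps, double accelAngleDeg, double dt) {
        double integrated = angleDeg + gyroRateDps * dt;      // gyro path
        angleDeg = 0.98 * integrated + 0.02 * accelAngleDeg;  // drift correction
    }
};

// Example: pitch from the pitch-rate gyro and an accelerometer-derived angle.
//   double accelPitchDeg = std::atan2(-ax, az) * 180.0 / 3.14159265358979;
//   pitchEstimator.update(gyroPitchRate, accelPitchDeg, 0.01);
```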
Friday, February 8, 2008
Components Diagram (Preliminary)
Hey guys, I couldn't sleep so I updated the components diagram a bit. It's still preliminary, but it's very close to being complete. I'll be proving out the multiplexing scheme for the UART soon. If that works, we might very well be able to pull this off with just an ATmega168 microcontroller. Will we want to? Not quite sure yet. It'll be interesting, that's for sure!
Things that need more design and/or selection: fail-safe system (hard line? relay?), main board selection, camera w/ transceivers, pitot-static system, RC receiver mixing with controller
Things that need proof of concept: UART multiplexing scheme (includes data buffering with microcontroller), range finder (sonic Maxbotix LV-EZ1), magnetic compass, IMU package, servo controller
Things that need research: video capture directly in ground station (possibly using DirectShow)
Well, the above is an incomplete list; it's just all I can think of for now. Great progress has been made in computer to electronics communication, GPS interfacing, and command-query programming. Details to follow, hopefully.
Wednesday, February 6, 2008
GPS Logger Works!
Hey everyone,
So despite not having had any sleep the night before, and it being about 10:42 pm, I managed to get my programming on. I spruced up the ground station interface a bit and added the ability to log received GPS coordinates. The program then plots the GPS coordinate history on top of a satellite view that I took screenshots of from Google Maps. If you can make it out, that bottom-left window shows Engineering IV (practically my current residence). The red and blue +'s are calibration marks. They are actually off for some reason; the math is right but the image scaling is wrong. Anyway, those green +'s are recorded GPS positions. It was basically me walking around out there in the 50-degree cold. I'm actually pretty happy with the precision from this EM-406A GPS module. It's probably a perfectly fine candidate for use in the plane. Anyway, there's more feature building to be done on the ground station program. However, I think it might be time to move onto some controls stuff... Either way, pics of the progress below.
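For anyone curious about the plotting, mapping a GPS fix onto the screenshot is just a linear interpolation between two calibration points whose lat/lon and pixel coordinates are both known--which is exactly the step where my scaling bug probably lives. A sketch of the mapping (my own illustration, not the actual ground station code):

```cpp
// Map a GPS fix onto a satellite-image screenshot by interpolating between
// two calibration points with known lat/lon and pixel coordinates.
// Illustration only; not the actual ground station code.
struct CalPoint {
    double lat, lon;   // degrees
    double px, py;     // pixel coordinates on the screenshot
};

void gpsToPixel(double lat, double lon,
                const CalPoint& a, const CalPoint& b,
                double& px, double& py) {
    // Pixels per degree along each axis, derived from the two calibration marks.
    double pxPerLon = (b.px - a.px) / (b.lon - a.lon);
    double pyPerLat = (b.py - a.py) / (b.lat - a.lat);  // usually negative: y grows downward

    px = a.px + (lon - a.lon) * pxPerLon;
    py = a.py + (lat - a.lat) * pyPerLat;
}
```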