Posted on by

Welcome back to my series on How to make a space mission! Previously, we talked about how making a space mission has never been easier and my preferred method for coming up with a space mission idea. Some of our more enterprising readers might’ve already come up with a few ideas for space missions!

Now, ideas are great, but execution of an idea is what creates success. A bad idea with great execution will always beat a great idea with terrible execution. But to execute on an idea, you need two resources: time and money. While the focus of this blog post will be how to get money to fund your space mission idea, it is worth remembering that money can buy time in the form of partners, subordinates, and contractors.

Whether or not it’s the root of all evil, money remains a useful way to get things done. Source: centonomy.com

 

Before we get into the meat of funding sources and how to get them, we first need to discuss how you will execute your space mission idea. Generally, your mission will fall into one of the three categories below:

  1. Amateur
    • This refers to a space mission that you plan to execute without payment and without expecting outcomes for anyone other than yourself. Developing an amateur space mission can be considered a hobby, like amateur radio.
  2. Research
    • A research space mission plans to produce outcomes that expand the sphere of human knowledge. This can be done by individuals, but is most often done through research institutions, such as universities and companies.
  3. Commercial
    • A commercial entity conducting a space mission is expected at some point to generate revenue greater than the costs incurred. Our previous example of taking images of cars from space and selling them to the government would fall under this category.

At BLUEsat, we fall under the Amateur and Research categories. This means there are avenues of funding that a commercial entity can pursue but we cannot. On the other hand, it gives us the ability to be less strict with our resources, bringing in students a company might not hire and turning them into space engineers by the time they finish. There are pros and cons to whichever category or categories you choose, so make sure you think carefully about yourself and your ideas and choose wisely!

Once you have made your choice(s), you can consider the following funding opportunities. Note that my list is not exhaustive; there are likely other opportunities out there. Nonetheless, I hope the following is informative. The funding sources I have considered are:

  1. Yourself – applicable to all three categories
  2. Sponsorship – applicable to amateur and research
  3. Grants – applicable to research and commercial
  4. Investment – applicable to commercial
  5. Bootstrapping – applicable to commercial

Now let’s discuss these funding sources in a little more detail.

Self Funding

Self funding is simultaneously the easiest and hardest funding method we’ll discuss today. It’s easy because it has the fewest administrative barriers of all the funding methods. If you want to spend your own money on items related to your space mission idea, all you have to do is open your wallet. There’s no one else to tell you what to do, and no one is going to expect you to produce funding request forms or documentation. You’re accountable to no one but yourself.

But that self-accountability makes it hard as well. After all, this is your money that you worked hard to get. Every dollar of your money that you spend on a space mission is a dollar less that could go to credit card repayments, to that trip to Europe you’ve been saving up for, or even to a smashed avo on toast you wanted for lunch. Surely you should keep your savings invested wisely in a bank or a safe fund, and not waste it on frivolous things like a space mission. Wait, frivolous? Is spending money on achieving your dreams really that frivolous?

These are difficult questions that you’ll have to ask yourself if the time comes to invest some of your own money in a space mission or any other venture. Their difficulty also makes it clear why getting money from others is so hard: they’re asking themselves these exact same questions!

Sponsorship

When we think of sponsors, the sports-minded among us think of sports events and TV shows with announcements saying: “This program is proudly brought to you by this company and that company.” Sponsorship, in essence, is the provision of money or resources in exchange for advertising the sponsor, who may also derive other benefits from the relationship. Sponsorship works best with highly visible activities that are followed by the demographics sponsors are interested in. And as it just so happens, space activities almost never fail to generate significant public interest!

BLUEsat’s biggest sponsor is UNSW Engineering. In return for their monetary and in-kind support, BLUEsat performs regular outreach events to appeal to the next generation of UNSW students, conducts high-visibility activities (such as balloon launches and rover competitions) that produce positive PR for UNSW, and gives UNSW students an incredible educational experience they can’t get anywhere else.

Getting sponsorship is a difficult process, but the rewards are obvious. You need to develop your space mission idea to a high level of detail, then be able to communicate to potential sponsors exactly what you need and what benefits they will receive in return. There isn’t any real formalised process for getting sponsorship, so just pick up the phone and start calling!

Grants

The line separating grants and sponsorship is not completely clear to me, personally. If I had to draw one, I would say that unlike sponsors, grant providers do not expect a direct return, though they will likely stipulate how the money should be spent. Grant providers typically want their money to go to activities they believe society will benefit from.

An example of this is the grant BLUEsat received under the Science and Engineering Student Competition Sponsorship Program from the NSW Government to participate in the 2016 European Rover Competition (ERC). While the NSW Government did not directly benefit from this, they understand that encouraging BLUEsat’s participation in competitions like the ERC helps produce skilled engineers and future taxpayers capable of contributing to the industries of tomorrow.

BLUEsat OWR ERC Team. Back row (left to right): myself, Timothy Chin, Denis Wang, Simon Ireland, Nuno Das Neves, Helena Kertesz. Front row (left to right): Harry J.E. Day, Seb Holzapfel. Centre: BLUEtongue Mars Rover w/ NSW Government Logo
BLUEsat’s attendance of the 2016 European Rover Competition was supported by a grant from the NSW Government.

 

Applying for grants will typically involve a formalised process of filling out forms and justifying how the grant money will be spent. Grants are typically awarded on a competitive basis, so if you’re interested in one, make sure your application is top notch by developing your space mission idea as much as you can.

Investment

Angel investors can help entrepreneurs execute on their business ideas by providing seed funding. Source: entrepreneur.com

 

Investment is a funding source that is only really available to commercial entities. Groups invest in companies because they anticipate the value of their investment rising (capital growth) and/or income from profits (dividends). If your space mission idea is not intended to eventually produce revenue greater than the expenditure required to start it, then you cannot qualify for investment. While there are many loss-making companies that receive significant investment (Twitter, Uber, Tesla), they are able to keep receiving it because they have convinced investors that they will experience significant capital growth and will produce dividends in the future.

Investment money is most often provided in exchange for a portion of a company (equity). For example, suppose Company X has managed to persuade investors that they have a shot at great success down the road. Investors will assess Company X’s records and business plans and ultimately decide to provide an appropriate sum of money in return for a sizeable portion of the business. In the seed investment round, when a business is just starting out, 20-30% of equity is usually exchanged for anywhere from a few hundred thousand dollars to several million, depending on the perceived potential of the business model and the team running it.
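To make the arithmetic concrete, here is a quick sketch in Python. The figures are hypothetical, chosen only to illustrate how investment, equity, and valuation relate:

```python
def seed_round(investment, equity_fraction):
    """Back out the valuations implied by a seed investment.

    investment: cash provided by the investors (dollars)
    equity_fraction: share of the company given up (0-1)
    """
    post_money = investment / equity_fraction  # company value implied after the round
    pre_money = post_money - investment        # company value implied before the round
    return pre_money, post_money

# Hypothetical deal: $500k for 25% of Company X
pre, post = seed_round(500_000, 0.25)
# implies a $1.5M pre-money and $2.0M post-money valuation
```

These implied valuations fall straight out of the deal terms; the perceived potential of the business is what moves both numbers up or down during negotiation.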

Just like all the other funding sources, investment isn’t easy to get! While a well developed idea is all you’ll need for the previous funding sources, for investment you must show that there are customers waiting to pay for the solution your space mission is selling. This can be in the form of letters of interest, letters of intent, or other official documentation. Applying to space-friendly accelerator programs, such as MoonshotX, the Founder Institute, or UNSW’s own Textbook Ventures, can help as well. But the best proof of all is getting customers to actually pay! Which brings us to…

Bootstrapping

Bootstrapping is an odd term, originally referring to the impossibility of lifting your entire body into the air by pulling on the straps of your boots. Nowadays it refers to companies that have taken no significant outside investment and have grown using only revenue from selling products or services. A bootstrapped business is funded entirely by its customers. The humble lemonade stall is a common example of a bootstrapped business.

It goes without saying that only a commercial space mission has the potential to bootstrap. And considering the significant costs involved in building space hardware and launching it, even with the CubeSat revolution, bootstrapping a space mission is incredibly challenging. After all, a commercial space mission must be in space to create revenue. But without revenue, a space mission cannot be bootstrapped to space! Quite a catch-22. While bootstrapping has its place, it can rarely substitute for the initial funding required to get things going. That said, a space business with early paying customers is far more likely to receive investment.

 

We now have an idea of what the sources of funding for a space mission are. Now we just need to go and get them! Polish your ideas with as much detail as you can furnish, and start probing! Ask everyone who you believe might be relevant for advice, resources, or money. After all, you never get what you don’t ask for.

However, as you progress in your efforts, you will be met with frustration as people ignore you or turn you down for no good reason. How is it that the Elon Musks of the world can offer up an idea and immediately have investors lining up behind them? What do they have that you don’t? My answer to this is that they have experience, knowledge, and connections that you and I can barely dream of. If summed up in one word, I would say that they have credibility.

Join me in Part 4, where we’ll discuss strategies for how you can go about gaining credibility in your field.


Posted on by

It’s one thing to design a satellite or rover, but without manufacturing you’re dead in the water. Over the years at BLUEsat, the problem, or more specifically the cost, of manufacturing has been a recurring issue for our mechanical engineering teams. It’s not unusual for the bill for manufacturing our designs to come in at several times our material costs, not to mention long lead times, a lack of quality control, and no second chances once the part comes in.

Late last year the society decided that enough was enough and purchased a CNC router. At its core, a CNC router is a simple machine: a rapidly spinning cutting tool is mounted on driven guide rails that control its position in space. Combined with computer control, this lets a CNC router cut out almost any geometry we choose.

BLUEsat’s CNC Router

The process for making a part on the CNC has three stages:

  1. Model the part in CAD (we use Autodesk Inventor).
  2. Create a tool path using CAM software (we use HSM).
  3. Secure your material to the CNC router, load the tool path, and begin cutting.

One of the parts we made recently was an aluminium work holding jig. The model is shown below. This part has some complex features, such as bottom rails, notched sides, counterbored holes and raised supports. Making this part by hand would take days and a very competent machinist, and we have access to neither.

Jig Plate CAD model

Using this model, a tool path was developed with CAM software. The program does most of the heavy lifting, but the user must define the position of each feature, the speed the machine moves at, and how fast the tool spins. These speeds are very important to the quality of the final piece and must be tailored to each feature. Below is an example of what the tool path looks like on the computer: red lines indicate the machine is moving without cutting, blue lines show where it is cutting.

 

Jig Plate CAM operations
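The spindle and feed speeds defined in the CAM stage can be estimated from standard machining relations. Here is a rough sketch in Python; the surface speed and chip load values are illustrative textbook-style figures for cutting aluminium, not our actual settings:

```python
import math

def spindle_rpm(surface_speed_m_min, tool_diameter_mm):
    # Standard relation: RPM = 1000 * Vc / (pi * D), with Vc in m/min, D in mm
    return (1000 * surface_speed_m_min) / (math.pi * tool_diameter_mm)

def feed_rate_mm_min(rpm, flutes, chip_load_mm):
    # Feed rate = RPM * number of cutting edges * chip removed per tooth
    return rpm * flutes * chip_load_mm

# Illustrative numbers: a 6mm two-flute end mill in aluminium
rpm = spindle_rpm(150, 6)               # roughly 8000 RPM
feed = feed_rate_mm_min(rpm, 2, 0.02)   # roughly 320 mm/min
```

In practice numbers like these are only a starting point; they get tuned down for machine rigidity and adjusted per feature for surface finish, which is exactly the tailoring mentioned above.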

Finally, with our tool path created, we were ready to set up the CNC itself. The material needs to be secured to the surface of the bed to prevent any movement during the cutting operation. This can be done in a number of ways, such as using a machine vice or work holding clamps. For this piece, we started with work holding clamps and then secured it using holes drilled into the material itself.

Now onto the fun part, the cutting. The tool path is loaded onto the CNC and the machine is set to run. Generally, we do a single operation at a time. This gives us time to clean up after each cut and inspect if it was successful. Here are a few videos of cutting.

All up, this part took 6hrs to machine. That included the setup, cutting and cleaning up of the part. Below is the final part:

Completed Jig Plate
Bottom View


Using our CNC has allowed for rapid prototyping of parts, drastically reduced lead times and most importantly, cut manufacturing costs by an order of magnitude.


Posted on by


At the start of semester we ran a number of seminars on different skills that we use at BLUEsat. In the first of these videos, former Rover Software Team Lead Harry J.E. Day gives an introduction to the Robot Operating System (ROS), covering how to set up ROS catkin workspaces and write basic publishers and subscribers.

You will need the vm image from here: http://bluesat.com.au/vm-image/

The slides from this presentation can be found here: http://bluesat.com.au/an-introduction…

Errata:

  • Some slides in the recording refer to “std_msg”; there should be an ‘s’ on the end (i.e. “std_msgs”).
  • On the slide “A Basic Node – CMakeLists.txt (cont)” there should only be one ‘o’ in “node”.
  • In step 4 of the “Publisher (cont)” section there should be an ‘e’ on the end of “pub_node”.
  • The person on the last slide was the first robotics COO at BLUEsat, not the CTO.

These have all been corrected in the slides.


Posted on by

Welcome back to my series on How to make a space mission! Last time we talked about how doing space activities has never been easier. CubeSats are making spacecraft cheaper and easier to make. Companies like Spaceflight and Nanoracks are making launch opportunities easier to access. And companies like SpaceX and Rocket Lab are reducing the costs of launch. As happened with the internet, opportunities for science and business are appearing in areas no one could have reasonably expected. For example, who would have expected people to pay to have their ashes put in space? That’s why this is the time to be thinking of ideas for space missions.

Here’s how I try to come up with ideas:

  1. Identify a problem;
  2. Understand the problem;
  3. Establish possible solutions; and
  4. Find the best solution.

As simple as they may sound, these steps are sufficient to build a really strong idea for a space mission. That said, this is by no means an easy process. The more time you put into these steps, the stronger your idea will be. Even if your idea turns out to be unfeasible right now, it just might be achievable in a few short years. And if it doesn’t turn out to be feasible? That’s failure, right? It is, but under the fail-fast approach, failing early in the brainstorming phase is best. You don’t want to spend months or years developing software or hardware, only to find out that it’s not possible or that no one’s interested in it!

Let’s dive in.

1) Identifying a problem

Wait a minute, why are we talking about problems? Why aren’t we talking about ideas and solutions? Well, it turns out that engineers, scientists, and startup founders all agree that the problem is the first thing that needs to be identified when trying to build something. It is the first step in the engineering design process, the scientific method, and in the lean startup approach. The engineering design process is shown below. Being an engineer, it is the process I’m most familiar with.

Steps of the Engineering Design Process
The engineering design process. Source: www.sciencebuddies.org

 

Figuring out the problem you’re trying to solve is probably the most important step. People with money are incredibly stingy folks, whether they be investors, grant providers, or otherwise. No matter how cool or amazing your solution is, they won’t care about it if you can’t persuade them that the problem you’re solving is important.

Fortunately (or unfortunately), problems aren’t hard to come by. You can read about problems all day on the internet, often in news articles and blogs. Simply asking someone about their day might be enough for you to hear three or four problems. And since you, the reader, are part of several demographics, your problems might well be problems worth solving.

2) Understanding the problem

This step is where I try to understand what solving the problem actually requires. In engineering, we call this the “Specify Requirements” step. At the same time, I aim to determine whether the problem can even theoretically be solved within the limits of natural laws and the resources we are capable of gathering. For example, no matter how much various groups might demand faster-than-light travel, we simply do not have any techniques for making a warp drive or hyperdrive! A more grounded example might be the following.

The government has found a need to track individual cars for what they assure you are perfectly non-dystopian reasons. To do this, we require a telescope in space capable of resolving objects 1m across or smaller. This is our requirement. Simple, right?

Now, assuming our space telescope is at a 500km altitude and needs to resolve objects 1m in size, a bit of trigonometry shows that this corresponds to an angular size of 0.4 arcseconds (or about 0.0001 degrees). Due to something called the diffraction limit, there is a floor on how small an object a given telescope can see. The rule generalises as: the bigger the telescope, the smaller the things it can see. We can see this relationship below.

The relationship between telescope diameter (vertical column) and angular resolution (horizontal column). Source: en.wikipedia.org

 

Assuming the government wants us to take pictures of cars in visible light, this means that for a resolution of 0.4 arcsecs we need a telescope 16 inches (41cm) in diameter. Considering CubeSats are generally made of 10cm cubes, fitting such a large telescope into a CubeSat would be quite a tall order! This lets us rule out this idea for CubeSats. That said, a larger satellite could quite easily take images of sufficient resolution.
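The numbers above can be reproduced with a little trigonometry and the Rayleigh diffraction criterion. A quick sketch in Python, assuming observation in red light at roughly 650nm (a shorter wavelength would shrink the required aperture somewhat):

```python
import math

ALTITUDE_M = 500e3     # orbit altitude: 500km
OBJECT_M = 1.0         # size of object to resolve on the ground
WAVELENGTH_M = 650e-9  # assumed observation wavelength (red light)

# Angular size of a 1m object seen from 500km up
theta_rad = math.atan(OBJECT_M / ALTITUDE_M)
theta_arcsec = math.degrees(theta_rad) * 3600  # ~0.41 arcseconds

# Rayleigh criterion: theta = 1.22 * lambda / D, solved for the aperture D
aperture_m = 1.22 * WAVELENGTH_M / theta_rad   # ~0.4m, far wider than a 10cm CubeSat
```

Running this gives an angular size of about 0.41 arcseconds and a required aperture of roughly 40cm, matching the figures quoted above.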

This step is a hard one, and will require significant research and review of scientific and commercial principles to get through. But again, the more time you spend here, the stronger your case!

3) & 4) Establishing possible solutions and picking the best one

Now we finally move into the design phase! These two steps are where things can get reeeeally complicated very quickly. Coming up with solutions may require some serious imagination and creativity. Picking the best solution is harder still, and may require some serious engineering chops and commercial considerations. As such, I won’t go into very much depth for these steps in this blog post.

While working through the problem in the previous step, a number of solutions hopefully came to mind already. Indeed, we already considered one possible solution: 40cm telescope satellites at a 500km altitude. But what about other solutions? Why not just have a few drones flying around to take pictures? How about a plane? Or high altitude balloons? Considering my bias towards space, you can guess which solution I would pick! Here’s the justification:

  • A well made satellite will produce images for years and years at a time with a single investment. The other solutions require regularly purchasing flights, fuel, or balloons.
  • The satellite can take images of almost any place in the world without any additional investment, allowing you to make your business global as soon as your satellite launches. In comparison, the other solutions can only take images locally.

There are more issues that I haven’t covered. Nor have I produced any proof for the above statements. The reason for this is simple – I’m only writing a blog post, not proposing an actual mission! In the course of your own efforts, you will need to produce numbers through engineering and market analysis to back your assertions. These will be covered in Part 4 of this series.

 

So we didn’t go into very much depth at all! “Where is the space engineering?” you may ask. What was the point of all this? Well, my dear space-loving reader, it turns out that if you’ve done these steps to a reasonable level of detail, you’ve qualified yourself to take the next step – raising funds!

At BLUEsat, we’re in the midst of developing our own space mission. We’ve named it GreenSat. Creative, right? It’s to be a platform for agricultural and biological experiments in space, with the goal of enabling agriculture in space. Right now, we’re working on step 3 and heading towards step 4. We’ll be taking our work to the International Astronautical Congress, where we will present our ideas to an international audience. Through this, we’ll hopefully be able to get GreenSat funded and launched.

Join me in Part 3 where we’ll discuss the various avenues that now exist to raise money for space missions!


Posted on by

One of the most surprising things about our experience at the European Rover Challenge last year was how incredibly close we came to total failure. Two days before the competition began, while we were in Poland, our control board failed. In addition to porting our entire embedded codebase to Arduino in two days, we had to fix the overcurrent protection mechanism on the rover’s claw. This was a critical system, since it prevents the claw servo from overheating when trying to pick up objects. Before we developed our original software solution, a large number of servos had been destroyed by overheating. Due to calibration errors we’d made during the port to Arduino, that software solution no longer worked and we had to think of something else.

Seb Holzapfel and I realised that a hardware solution would also solve this problem. We designed the circuit shown below. It consists of an op-amp, a diode, a MOSFET and a few resistors. It was designed so that when a large current flows through the 100mΩ sense resistor, the PWM signals are cut off from the claw. This causes the servo motor in the claw to stop drawing current, preventing overheating.
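The trip point of a circuit like this falls straight out of Ohm's law across the sense resistor. A quick sketch in Python; the 2A current limit here is a hypothetical figure for illustration, not our actual threshold:

```python
SHUNT_OHMS = 0.1       # the 100 milliohm sense resistor
CURRENT_LIMIT_A = 2.0  # hypothetical servo current limit

def shunt_voltage(current_a):
    # Voltage across the shunt grows linearly with load current (Ohm's law)
    return current_a * SHUNT_OHMS

# The op-amp compares the shunt voltage against this reference; above it,
# the MOSFET cuts the PWM signal and the servo stops drawing current.
v_ref = shunt_voltage(CURRENT_LIMIT_A)  # 0.2V reference for a 2A limit
```

Keeping the shunt small (100mΩ) means it wastes little power in normal operation while still giving the op-amp a measurable voltage to compare against.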

But why was this worth writing about? Well, we had to build this in a very short period of time and we didn’t really have the correct spare parts on hand. We only had a few op-amps, some jumper cables, some veroboard and a few resistors. This wasn’t enough to build the circuit shown above. We had to improvise. I realised that since our control boards had all failed, we could, in fact, harvest them for the parts we needed. Fortunately, after doing a quick stocktake of the parts on the old control boards, I determined that all the parts we would need were present. We just had to salvage them.

 

Seb is shown above trying, ultimately unsuccessfully, to fix one of our control boards.

 

 

While everyone else was out testing the rover, and after we had ported the code to Arduino successfully, Seb and I found a bit of spare time on our hands: about two hours. We got to work. I desoldered parts from the dead control boards with a hot air gun, while Seb put those parts together into the monstrosity you see below.

It isn't pretty, but it worked.

We then tested it using a coil of wire as a load, verified that it worked, and deployed it onto the rover. Despite being built in just an afternoon, it actually worked better than the previous software solution when we tested it with the rover. And with this “solution”, we came 9th.

And that’s how we built a critical system in just 2 hours from parts we salvaged from dead control boards.

 

BLUEsat OWR ERC Team.
Back row (left to right): myself, Timothy Chin, Denis Wang, Simon Ireland, Nuno Das Neves, Helena Kertesz.
Front row (left to right): Harry J.E. Day, Seb Holzapfel

Posted on by

In a previous post we covered the software to perform detumbling – the first function of an Attitude Determination + Control System (ADCS). We now move onto the second (and more interesting) function of an ADCS – to point yourself in a certain direction. This functionality is critical when you have any direction-specific equipment on your satellite – whether you have a parabolic antenna for providing the internet or a space laser for destroying the internet, you need your equipment to be pointing in the right direction for it to work properly.

Now in order to point, you must be able to work out what angle you’re currently sitting at – this is normally done by using a magnetometer (an electronic compass). However, in our setup there was a lot of magnetic interference from the motor, making the magnetometer very inaccurate in calculating the platform’s angle. Thus we have to make use of the other sensors on board – a gyro and an accelerometer (the latter being fairly useless for measuring rotation).

The Problem

Imagine that you’re driving a windowless tram, and someone tells you that there are five workers chained to the tracks exactly 100m ahead. Now it just so happens that you’ve brought along your favourite pair of bolt-cutters, but you also happen to be super lazy and would rather drive the tram to them instead of walking.

Bonus points if you use a bang-bang controller
The Trolley Problem for Engineers

 

As you look down at your odometer, you remember your old physics teacher going on about how to calculate your distance from your velocity (remember: distance = velocity ⨉ time). Unfortunately, this formula only works for constant velocities, and the accelerator is way too touchy to keep a constant speed. What do you do?

The Solution

Just like dealing with incriminating evidence, this problem can be solved by chopping it into tiny pieces. Let’s say for the first second of our journey, we recorded our average velocity – say 2m/s. Then we know that we’ve travelled 2m down the road (using our handy formula d = v ⨉ t). Similarly for the next second if we measure our average velocity to be 3m/s, then we know we’ve travelled 3m. So in total we’ve travelled 2m + 3m = 5m. It turns out we can calculate our position by repeating this process until we arrive at the workers.

Now in the case of our ADCS, our trusty gyro measures angular velocity. Angular velocity formulas work the same as linear ones, so we can actually use the same approach to work out the angle of our platform with ease. For example, if after 0.1s we measured our angular velocity to be 2°/s, then 0.1s later we measured it to be 3°/s, then our current angle would be (2⨉0.1 + 3⨉0.1)° = 0.5° anticlockwise from where we started (positive angles are anticlockwise by convention).

In code, this process (also known as ‘integration’) is simple. If every 5ms a new gyro measurement is taken, then the following line can be used to calculate the new platform angle.

currAng = oldAng + angVel * 0.005;
    //where angVel is the latest unbiased gyro measurement
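Run over a whole stream of gyro samples, the same idea is just a running sum. A quick Python sketch using the numbers from the worked example above:

```python
def integrate_angle(start_angle_deg, samples_deg_per_s, dt):
    """Accumulate angle from angular velocity samples taken every dt seconds."""
    angle = start_angle_deg
    for ang_vel in samples_deg_per_s:
        angle += ang_vel * dt  # d = v * t for each small time slice
    return angle

# 2 deg/s for 0.1s, then 3 deg/s for 0.1s -> 0.5 deg total
angle = integrate_angle(0.0, [2.0, 3.0], 0.1)
```

The smaller the time slices (i.e. the faster the gyro is sampled), the closer this running sum gets to the true angle.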

The Controller

Now that we have a way to calculate the angle of our ADCS, we can reuse our proportional controller from our detumbling code:

Output = K ⨉ error

(where K is some tuned constant, and error was the difference between the current value and the target value.)

Now it’s possible to use this same controller to control our angle, but variety is the spice of life so let’s go for something a bit fancier – a ‘proportional derivative’ (PD) controller. In math-speak, a PD controller looks something like this:

Output = K ⨉ error + C ⨉ error’

(where K and C are two constants.)

The little apostrophe indicates a derivative (rate of change) – in this case it’s the derivative of the error, or how fast the error is changing. Remember that the error = current angle – target angle. The target angle is usually constant, so the rate of change of the target angle is 0 (because it’s not changing). Thus

error’ = current angle’

Now we’re just left with the rate of change of the current angle (how fast the angle of the platform is changing) – sound familiar? The rate of change of the current angle is the same as the angular velocity, i.e. what we originally got from the gyro!

Putting this all together, in code-speak our PD controller is given by:

output = k * posnErr + c * angVel;
    //where posnErr = targetAng - currAng
    //and angVel = angular velocity from gyro 

Tuning K and C is simply a matter of trial and error, finding the pair of values that minimises the time to reach the target angle without overshooting it.
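To get a feel for how the two gains interact, here is a toy simulation in Python of a PD-controlled platform. The unit moment of inertia and ideal actuator are simplifying assumptions, the gains are illustrative, and the sign of the damping term follows this toy model's conventions rather than any particular gyro wiring:

```python
def simulate_pd(k, c, target_deg, start_deg, dt=0.01, steps=1000):
    """Simulate a PD-controlled platform with unit moment of inertia."""
    angle, ang_vel = start_deg, 0.0
    for _ in range(steps):
        posn_err = target_deg - angle
        output = k * posn_err - c * ang_vel  # proportional term drives, derivative term damps
        ang_vel += output * dt               # control torque -> angular acceleration (inertia = 1)
        angle += ang_vel * dt                # integrate angular velocity to get angle
    return angle

# With k = c = 4 (critically damped for this model), the platform
# settles on the 90 degree target within a few seconds of simulated time
final_angle = simulate_pd(k=4.0, c=4.0, target_deg=90.0, start_deg=0.0)
```

Raising K alone makes the response faster but oscillatory; the C term bleeds off that oscillation, which is exactly the trade-off you are tuning when minimising settling time without overshoot.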

The Radio

Now it’s a bit boring if your satellite can only point in a predetermined direction – what’s really fun is being able to point it wherever you want while it’s in operation. A simple potentiometer on a separate Arduino allows us to digitise angles:

angle = analogRead(A0)/1024.0*(2*PI);
    //assuming your pot rotates a full 360deg

So now all we need to do is somehow transmit this angle to the Arduino on-board the reaction wheel system – to do this we need to utilise RF (Radio Frequency), the black magic of electrical engineering.
In the realm of RF, things behave in strange ways. Radio signals vary in range depending on the time of day, straight wires transfer more power than bendy ones, and you can even use funny-shaped wires to improve your signal quality. It takes mad skills to truly harness the power of RF, and many believe that RF engineers are actually wizards.

Luckily for us, these wizards also sell radio modules that can interface easily with our Arduinos, such as the NRF24L01 chip. After wiring up a module to both the on-board and remote Arduinos (example), we can transmit data using TMRh20’s RF24 library and the following code:

On the transmitter side:

RF24 radio(9, 10);              //set pins 9 and 10 to be CE and SCN respectively
const byte rxAddr[6] = "00001"; //set address of radio
radio.begin();                  //initialise radio
radio.setRetries(15, 15);       //if message not received, wait 4ms ((15+1)*250us) before retrying, retry 15x before giving up
radio.openWritingPipe(rxAddr);  //open a pipe for writing
radio.stopListening();          //stop listening for RF messages, switch to transmit mode

float angle = analogRead(A0)/1024.0*(2*PI); //read in angle of potentiometer
radio.write(&angle, sizeof(angle));         //transmit angle as a radio message

On the receiver side:

RF24 radio(18, 19);
const byte rxAddr[6] = "00001";
radio.begin();
radio.openReadingPipe(0, rxAddr);  //open a pipe for reading
radio.startListening();            //switch to receive mode, start listening for messages
if (radio.available()){            //if there's an incoming message,
    float rx;
    radio.read(&rx, sizeof(rx));   //store it in the variable rx
}

These code snippets transmit the angle of the potentiometer from the transmitting Arduino to the receiving Arduino (the one on the ADCS). This lets us update the target angle on the on-board Arduino, giving us run-time control of its position.

Here’s a demonstration of this whole thing in action:

 

 

(As always, code is available here)


Posted on by

Did you want to be an astronaut growing up? Were your lofty ambitions brought down as you got older?

I’m here today to tell you to aim high once again – to aim for space. Maybe not as high as actually personally going to space, but you can get pretty close thanks to advancements in miniature spacecraft. It has never been easier to send something you built yourself to space. While it’s still a lot of work, the rewards are incredible.

In recent years, increasing numbers of small satellites have been launched by people and organisations that historically had no ability to reach space. The most common architecture for these small satellites is the CubeSat. CubeSats are built with commercial off-the-shelf parts and can be developed by individuals or small teams in the space of a few years. They are launched into space by hitchhiking on the backs of larger satellites. These advances mean that CubeSats have become as much as 1000 times cheaper than traditional satellites. This cost decrease has enabled the rise and growth of NewSpace startups such as Planet, which has grown to a valuation of over a billion dollars in five years.

The first pair of Planet’s Dove CubeSats being deployed from the International Space Station.

 

Here’s what you’ll need to get started on developing your own CubeSat mission:

  1. An idea;
  2. Some money; and
  3. A few skills.

It doesn’t sound like much, does it? Let’s go into a bit more depth.

The Idea

The idea you come up with will be what your bit of space hardware does once it’s up there – in other words, its mission. Satellites are the invisible MVPs of today’s world, taking care of weather forecasts, global navigation, communications and much more. If you want to send some hardware up there, in the form of a satellite or otherwise, you will first need to find a problem to solve with it.

There are over 2000 operational satellites in space today, all doing their part for us. However, the small satellites and hosted payloads you or I can send up will not be doing the same work as the bigger billion-dollar satellites. I mention this because the key to finding and building on a good idea isn’t sitting around and thinking really hard. To build a solid idea, you will have to read widely, speak to the people whose problem you’re looking to solve, and listen carefully to their feedback.

Money

While money isn’t as big an issue nowadays as it once was thanks to the NewSpace revolution, reaching space is still an expensive ordeal. You will most likely need hundreds of thousands of dollars to pay for construction, testing, launch, and operations.

Now, there is a way to reverse this problem entirely, and instead make money from your space mission. The way to do this is to go back to your idea and to ask: Is this something people would pay for? Am I tackling a big enough pain point for people? While this is not the traditional way, you and I are even less likely to find success begging NASA or ESA for money.

Skills

Now here is where we at BLUEsat come in! As engineers with few ideas and little money, skills are where we try to excel.

Some serious engineering ability is still needed nowadays to reach space. But with open source architectures and modular off-the-shelf parts becoming more readily available, the level of knowledge needed has dropped considerably. A bit of background on the basics of spacecraft engineering, electrical engineering and coding is all you’ll need to get started. Learning the rest will happen automatically as you design and build.

This is more or less how BLUEsat approaches spacecraft engineering. Students joining BLUEsat aren’t equipped with encyclopedic knowledge of how spacecraft are built and how they work. We simply teach our members the basics, install some software for them and point them towards some problem that we would like to solve. Every one of our senior members has started from such humble origins and slowly googled and built their way to greater understanding.

Members of BLUEsat’s ground station team messing about with RF electronics.

So why am I telling you this?

At BLUEsat, our Orbital Systems Division is hard at work on a number of projects. We have recently put together a team to work on developing a mission for our own CubeSat, and we need your help. No matter your year or degree, we will gladly take you in and help build your space engineering capabilities. We meet at Electrical Engineering (G17) room 419 every Saturday between 10:30AM and 5PM. Feel free to pop in and say hi.

I’ll see you folks in Part 2, where we talk a little more about how to come up with space mission ideas.


Posted on by

BUCKLE UP EVERYONE WE’RE GOING TO GO ON A WILD AND EXHILARATING JOURNEY INVOLVING SPREADSHEETS AND LOTS OF MEETINGS

BLUEsat does a lot of cool stuff. Robots, satellites and radios are all super cool. They get you engaged, using practical skills and building something physical that you can show off.

TO DO ALL OF THAT, YOU NEED MONEY

Soft drinks paid for a surprising amount of the robot

Before you can even start on figuring out how to allocate money to all the people who need it, you need to work out a rough budget for the project itself. As a non-technical member, I have no idea how much stuff costs. It therefore falls on team leads and the CTO to give me the numbers that we need to work with.

Batteries are expensive

I speak with the team leads to work out a reasonable amount that can be allocated to each project, based on funding from previous years. Team leads then return a budget, and we discuss which parts are essential, working towards a target amount that everyone is happy with.

It’s at this point that we start stressing about how we’re going to afford all of this.

After figuring out roughly how much money we’ll need for the year, the question of how we’re going to get it suddenly becomes very important. Traditionally, BLUEsat has gotten a significant chunk of its funding from the University. In more recent years, our operations have grown and we’ve started working on two projects in tandem. This naturally increases costs. To keep up with our increasing capacity to churn through cash, we’ve started to seek sponsors from outside the university.
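In spreadsheet terms, the whole exercise boils down to a roll-up: sum each team’s requested items, compare the total against expected funding, and haggle over the gap. A toy sketch of that arithmetic, where every figure and item name is made up for illustration:

```python
# hypothetical line items submitted by each team lead
requests = {
    "rover":     {"batteries": 800, "motors": 1200, "chassis": 600},
    "satellite": {"radio": 900, "solar panels": 700},
}
expected_funding = 4000  # hypothetical: uni grants plus external sponsors

# total per project, then the shortfall we need sponsors (or soft drinks) to cover
totals = {project: sum(items.values()) for project, items in requests.items()}
shortfall = sum(totals.values()) - expected_funding
print(totals, shortfall)
```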

BLUEsat is currently sponsored by:

Platinum Sponsors:

NSW Government Logo  UNSW

Silver Sponsors:

Arc Clubs Logo

Bronze Sponsors:

ServoCity

Once all of that is done, we’ve got our budget for the year! Wouldn’t life be nice if nothing unplanned ever happened?

 

 

 


Posted on by

The Waterfall Plot

Want to get your hands metaphorically dirty with some BLUEsat projects but don’t have enough cash to fund both your HECS debt and your rover? This is a project so simple that even an arts undergraduate can complete it. We will be transforming the radio signals that exist all around you into a graph known as a Waterfall Plot. It will look something like this:


Leave this running on your computer long enough for your mates to walk by and they’ll think you’re tapping into Russian communications – then land yourself an internship at Telstra.

Technically you can tap into Russian communications (that part is not a joke), but other practical and less subversive applications include checking the signal strength of your network, interpreting packet radio, and listening to the Triple J Hottest 100 (as shown in the diagram).

 

So let’s get started!

 

What you will need

You will need these to get started:

  • SDR (software defined radio)
  • Antenna
  • GNU Radio
  • Python
  • A computer with at least one USB port

Software defined radios (SDRs) are extremely handy pieces of equipment thanks to their size, cost and effectiveness. They connect via USB and only require an antenna. We found a source that sells the model we will be using for only $20, which you can find here.

GNU Radio is a Python-based open-source graphical tool for creating signal flow graphs and generating flow-graph source code. We will be using GNU Radio to communicate with our SDR, and it has the potential to do much, much more. You can download the software here: www.gnuradio.org

Since GNU Radio runs on Python, it kind of makes sense to have Python installed. But what is Python? No, it is not malware, so rest assured it won’t swallow up your operating system. Python is a widely used programming language, and the one we will be using in this project. Make sure you get the right version by checking whether your downloaded copy of GNU Radio runs on 64-bit or 32-bit Python. You can download it here: www.python.org

Here is an image of the SDR:

An SDR Compatible with GNU Radio

 

GNU Radio

Assuming that we have been successful up to this point in purchasing the equipment, downloading GNU Radio and setting up, we can begin creating our program.

A template that we will be using can be downloaded through this link. This will save you time learning how to configure GNU Radio; in your spare time, however, you may learn how to add more powerful tools to improve on and diversify from this template. The file will look like this when opened:

The two main blocks that allow this program to function are the source block and the sink block. The source block funnels the data from the SDR into GNU Radio, and the sink block compiles the information to be displayed on a custom GUI. You may tweak the template as you gradually gain a better understanding of how the program works, including adding an audio sink – which isn’t hard, but that’s homework for you to figure out.

The last step is to compile and run the program, either by clicking the ‘play’ button or by pressing F5 if you can’t find it. This will create a new window with the waterfall plot showing all the receivable frequencies in your range. The frequency slider on the bottom lets you adjust the centre frequency you want to listen to.
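Under the hood, a waterfall plot is just a spectrogram: the incoming samples are chopped into blocks, each block is FFT’d, and the resulting rows of power values are stacked over time and coloured. Here is a minimal numpy sketch of that maths, using a synthetic tone in place of real SDR samples (GNU Radio’s sink does all of this, plus windowing and averaging, for you):

```python
import numpy as np

def waterfall_rows(samples, fft_size=256):
    """Split complex samples into blocks, FFT each, return power in dB (one row per time step)."""
    n_rows = len(samples) // fft_size
    blocks = samples[:n_rows * fft_size].reshape(n_rows, fft_size)
    spectrum = np.fft.fftshift(np.fft.fft(blocks, axis=1), axes=1)  # centre frequency in the middle
    return 20 * np.log10(np.abs(spectrum) + 1e-12)                  # small offset avoids log(0)

# synthetic "signal": a single tone at a quarter of the sample rate
t = np.arange(4096)
rows = waterfall_rows(np.exp(2j * np.pi * 0.25 * t))
# every row peaks at the same frequency bin, which draws a vertical streak on the plot
```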

So now you have your own cheap and miniature device for frequency capture! But now it is time to test it out on bigger and much more expensive equipment, like maybe a 2m antenna on top of the Electrical Engineering building…

Join BLUEsat to participate in bigger and better projects than this by contacting us. Happy tapping into communications in the meantime!

Former BLUEsat President Tom Dixon with a ground station antenna (do not hold operational antennas this way!)


Posted on by

In our last article, as part of our investigation into different Graphical User Interface (GUI) options for the next European Rover Challenge (ERC), we looked at a proof of concept for using QML and Qt5 with ROS. In this article we will continue with that proof of concept by creating a custom QML component that streams a ROS sensor_msgs/Image topic, and adding it to the window we created in the previous article.

Setting up our Catkin Packages

  1. In qt-creator reopen the workspace project you used for the last tutorial.
  2. For this project we need an additional ROS package for our shared library that will contain our custom QML Video Component. We need this so the qt-creator design view can deal with our custom component. In the project window, right click on the “src” folder, and select “add new”.
  3. Select “ROS>Package” and then fill in the details so they match the screenshot below. We’ll call this package “ros_video_components” and  the Catkin dependencies are “qt_build roscpp sensor_msgs image_transport” The QT Creator Create Ros Package Window
  4. Click “next” and then “finish”
  5. Open up the CMakeLists.txt file for the ros_video_components package, and replace it with the following file.
    ##############################################################################
    # CMake
    ##############################################################################
    
    cmake_minimum_required(VERSION 2.8.3)
    project(ros_video_components)
    
    ##############################################################################
    # Catkin
    ##############################################################################
    
    # qt_build provides the qt cmake glue, roscpp the comms for a default talker
    find_package(catkin REQUIRED COMPONENTS qt_build roscpp sensor_msgs image_transport)
    include_directories(include ${catkin_INCLUDE_DIRS})
    # Use this to define what the package will export (e.g. libs, headers).
    # Since the default here is to produce only a binary, we don't worry about
    # exporting anything.
    catkin_package(
        CATKIN_DEPENDS qt_build roscpp sensor_msgs image_transport
        INCLUDE_DIRS include
        LIBRARIES RosVideoComponents
    )
    
    ##############################################################################
    # Qt Environment
    ##############################################################################
    
    # this comes from qt_build's qt-ros.cmake which is automatically
    # included via the dependency call in package.xml
    find_package(Qt5 COMPONENTS Core Qml Quick REQUIRED)
    
    ##############################################################################
    # Sections
    ##############################################################################
    
    file(GLOB QT_RESOURCES RELATIVE ${CMAKE_CURRENT_SOURCE_DIR} resources/*.qrc)
    file(GLOB_RECURSE QT_MOC RELATIVE ${CMAKE_CURRENT_SOURCE_DIR} FOLLOW_SYMLINKS include/ros_video_components/*.hpp)
    
    QT5_ADD_RESOURCES(QT_RESOURCES_CPP ${QT_RESOURCES})
    QT5_WRAP_CPP(QT_MOC_HPP ${QT_MOC})
    
    ##############################################################################
    # Sources
    ##############################################################################
    
    file(GLOB_RECURSE QT_SOURCES RELATIVE ${CMAKE_CURRENT_SOURCE_DIR} FOLLOW_SYMLINKS src/*.cpp)
    
    ##############################################################################
    # Binaries
    ##############################################################################
    add_library(RosVideoComponents ${QT_SOURCES} ${QT_RESOURCES_CPP} ${QT_FORMS_HPP} ${QT_MOC_HPP})
    qt5_use_modules(RosVideoComponents Quick Core)
    target_link_libraries(RosVideoComponents ${QT_LIBRARIES} ${catkin_LIBRARIES})
    target_include_directories(RosVideoComponents PUBLIC include)
    
    

    Note: This code is based on the auto-generated CMakeLists.txt file provided by the qt-create ROS package.
    This is similar to what we did in the last example, but with a few key differences:

    catkin_package(
        CATKIN_DEPENDS qt_build roscpp sensor_msgs image_transport
        INCLUDE_DIRS include
        LIBRARIES RosVideoComponents
    )
    

    This tells catkin to export the RosVideoComponents build target as a library to all dependencies of this package.

    Then in this section we tell catkin to make a shared library target called “RosVideoComponents”, rather than a ROS node, linking the C++ source files with the Qt MOC/header files and the qt resources.

    add_library(RosVideoComponents ${QT_SOURCES} ${QT_RESOURCES_CPP} ${QT_FORMS_HPP} ${QT_MOC_HPP})
    qt5_use_modules(RosVideoComponents Quick Core)
    target_link_libraries(RosVideoComponents ${QT_LIBRARIES} ${catkin_LIBRARIES})
    target_include_directories(RosVideoComponents PUBLIC include)
    
  6. Next we need to fix our package.xml file: the qt-creator plugin has a bug where it puts all the ROS dependencies in a single build_depend and run_depend tag, rather than listing them separately. You need to separate them like so:
      <buildtool_depend>catkin</buildtool_depend>
      <build_depend>qt_build</build_depend>
      <build_depend>roscpp</build_depend>
      <build_depend>image_transport</build_depend>
      <build_depend>sensor_msgs</build_depend>
      <build_depend>libqt4-dev</build_depend>
      <run_depend>qt_build</run_depend>
      <run_depend>image_transport</run_depend>
      <run_depend>sensor_msgs</run_depend>
      <run_depend>roscpp</run_depend>
      <run_depend>libqt4-dev</run_depend>
    
  7. Again we need to create src/ resources/ and include/ros_video_components folders in the package folder.
  8. We also need to make some changes to our gui project to depend on the library we generate. Open up the CMakeLists.txt file for the gui package and replace the following line:
    find_package(catkin REQUIRED COMPONENTS qt_build roscpp sensor_msgs image_transport)

    with

    find_package(catkin REQUIRED COMPONENTS qt_build roscpp sensor_msgs image_transport ros_video_components)
  9. Then add the following lines to the gui package’s package.xml,
    <build_depend>ros_video_components</build_depend>
    <run_depend>ros_video_components</run_depend>
    

Building the Video Streaming Component

When we are using the rover the primary purpose the GUI serves in most ERC tasks is displaying camera feed information to users. Thus it felt appropriate to use streaming video from ROS as a proof of concept to determine if QML and Qt5 would be an appropriate technology choice.

We will now look at building a QML component that subscribes to a ROS image topic, and displays the data on screen.

  1. Right click on the src folder of the ros_video_components folder, and select “Add New.”
  2. We first need to create a class for our qt component, so select “C++>C++ Class”
  3. We’ll call our class “ROSVideoComponent” and it has the custom base class “QQuickPaintedItem.” We’ll also need to select that we want to “Include QObject” and adjust the path of the header file so the compiler can find it. Make sure your settings match those in this screenshot:
    Qt Creator C++ Class Creation Dialouge
  4. Open up the header file you just created and update it to match the following
     
    #ifndef ROSVIDEOCOMPONENT_H
    #define ROSVIDEOCOMPONENT_H
    
    #include <QQuickPaintedItem>
    #include <ros/ros.h>
    #include <image_transport/image_transport.h>
    #include <sensor_msgs/Image.h>
    #include <QImage>
    #include <QPainter>
    
    class ROSVideoComponent : public QQuickPaintedItem {
        // this marks the component as a Qt Widget
        Q_OBJECT
        
        public:
            ROSVideoComponent(QQuickItem *parent = 0);
    
            void paint(QPainter *painter);
            void setup(ros::NodeHandle * nh);
    
        private:
            void receiveImage(const sensor_msgs::Image::ConstPtr & msg);
    
            ros::NodeHandle * nh;
            image_transport::Subscriber imageSub;
            // these are used to store our image buffer
            QImage * currentImage;
            uchar * currentBuffer;
            
            
    };
    
    #endif // ROSVIDEOCOMPONENT_H
    

    Here, QQuickPaintedItem is a Qt class that we can override to provide a QML component with a custom paint method. This will allow us to render our ROS video frames.
    The header file also declares a setup function, which we use to initialise our ROS subscriptions (since we don’t control where this class’s constructor is called), as well as our conventional ROS subscriber callback.

  5. Open up the ROSVideoComponent.cpp file and change it so it looks like this:
     
    #include <ros_video_components/ROSVideoComponent.hpp>
    
    ROSVideoComponent::ROSVideoComponent(QQuickItem * parent) : QQuickPaintedItem(parent), currentImage(NULL), currentBuffer(NULL) {
    
    }
    

    Here we use an initialiser list to call our parent constructor, and then initialise our currentImage and currentBuffer pointers to NULL. The latter is very important, as we use it to check whether we have received any ROS messages.

  6. Next add a “setup” function:
    void ROSVideoComponent::setup(ros::NodeHandle *nh) {
        image_transport::ImageTransport imgTrans(*nh);
        imageSub = imgTrans.subscribe("/cam0", 1, &ROSVideoComponent::receiveImage, this, image_transport::TransportHints("compressed"));
        ROS_INFO("setup");
    }
    

    This function takes in a pointer to our ROS NodeHandle and uses it to create a subscription to the “/cam0” topic. We use image_transport, as recommended by ROS for video streams, and direct it to call the receiveImage callback.

  7. And now we implement said callback:
    void ROSVideoComponent::receiveImage(const sensor_msgs::Image::ConstPtr &msg) {
        // check to see if we already have an image frame, if we do then we need to delete it
        // to avoid memory leaks
        if(currentImage) {
            delete currentImage;
        }
    
        // allocate a buffer of sufficient size to contain our video frame
        uchar * tempBuffer = (uchar *) malloc(sizeof(uchar) * msg->data.size());
        
        // and copy the message into the buffer
        // we need to do this because the QImage api requires the buffer we pass in to continue to exist
        // whilst the image is in use, but the msg and its data will be lost once we leave this context.
        memcpy(tempBuffer, msg->data.data(), msg->data.size());
        
        // we then create a new QImage; the code below matches the spec of an image produced by the ros gscam module
        currentImage = new QImage(tempBuffer, msg->width, msg->height, QImage::Format_RGB888);
        
        ROS_INFO("Received");
        
        // free the previous frame's buffer (it was allocated with malloc, so free() rather than delete),
        // then keep a pointer to the new one so it can be freed when the next frame arrives
        if(currentBuffer) {
            free(currentBuffer);
        }
        currentBuffer = tempBuffer;
        
        // And re-render the component to display the new image.
        update();
    }
    
  8. Finally we override the paint method
    
    void ROSVideoComponent::paint(QPainter *painter) {
        if(currentImage) {
            painter->drawImage(QPoint(0,0), *(this->currentImage));
        }
    }
    
  9. We now have our QML component, and you can check that everything is working as intended by building the project (the hammer icon in the bottom corner of the IDE, or using catkin_make). In order to use it we must add it to our QML file, but first, since we want to be able to use it in qt-creator’s design view, we need to add a plugin class.
  10. Right click on the src folder and select “Add New” again.
  11. Then select “C++>C++ Class.”
  12. We’ll call this class OwrROSComponents, and use the following settings:OwrROSCOmponents class creation dialouge
  13. Replace the header file so it looks like this
    #ifndef OWRROSCOMPONENTS_H
    #define OWRROSCOMPONENTS_H
    
    #include <QQmlExtensionPlugin>
    
    class OWRRosComponents : public QQmlExtensionPlugin {
        Q_OBJECT
        Q_PLUGIN_METADATA(IID "bluesat.owr")
    
        public:
            void registerTypes(const char * uri);
    };
    
    #endif // OWRROSCOMPONENTS_H
    
  14. Finally make the OwrROSComponents.cpp file look like this
    #include "ros_video_components/OwrROSComponents.hpp"
    #include "ros_video_components/ROSVideoComponent.hpp"
    
    void OWRRosComponents::registerTypes(const char *uri) {
        qmlRegisterType<ROSVideoComponent>("bluesat.owr",1,0,"ROSVideoComponent");
    }
    
  15. And now we just need to add it to our QML and application code. Let’s do the QML first. At the top of the file (in edit view) add the following line:
    import bluesat.owr 1.0
    
  16. And just before the final closing bracket, add this code to place the video component below the other image:
    ROSVideoComponent {
       // @disable-check M16
       objectName: "videoStream"
       id: videoStream
       // @disable-check M16
       anchors.bottom: parent.bottom
       // @disable-check M16
       anchors.bottomMargin: 0
       // @disable-check M16
       anchors.top: image1.bottom
       // @disable-check M16
       anchors.topMargin: 0
       // @disable-check M16
       width: 320
       // @disable-check M16
       height: 240
    }
    

    This adds our custom “ROSVideoComponent”, whose type we just registered in the previous steps, to our window.

    Note: the @disable-check M16 comments prevent qt-creator from getting confused by our custom component, which it doesn’t detect properly. This is an unfortunate limitation of using cmake (catkin) rather than qt’s own build system.

  17. Then because Qt’s runtime and qt-creator use different search paths we also need to register the type on the first line of our MainApplication::run() function
    qmlRegisterType<ROSVideoComponent>("bluesat.owr",1,0,"ROSVideoComponent");
    
  18. Finally we need to add the following lines to the end of our run function in main application to connect our video component to our NodeHandle
    ROSVideoComponent * video = this->rootObjects()[0]->findChild<ROSVideoComponent*>(QString("videoStream"));
    video->setup(&nh);
    

    And the relevant #include

    #include <ros_video_components/ROSVideoComponent.hpp>
    
  19. To test it, publish a video stream using your preferred ROS video library.
    For example, if you have the ROS gscam library set up and installed, you could run the following to stream video from a webcam:

    export GSCAM_CONFIG="v4l2src device=/dev/video0 ! videoscale ! video/x-raw,width=320,height=240 ! videoconvert"
    rosrun gscam gscam __name:=camera_1 /camera/image_raw:=/cam0

Conclusion

So in our previous post we learnt how to set up Qt and QML in ROS’s build system and get it all working with the Qt-Creator IDE. This time we built on that system to develop a widget that takes ROS video data and renders it to the screen, demonstrating how to integrate ROS’s message system into a Qt/QML environment.

The code in this tutorial forms the basis of BLUEsat’s new rover user interface, which is currently in active development. You can see the current progress on our GitHub, where a number of additional widgets should be added in the near future. If you want to learn more about the kind of development we do at BLUEsat, or are a UNSW student interested in joining, feel free to send an email to info@bluesat.com.au.

Acknowledgements

Some of the code above is based on a Stack Overflow answer by Kornava about how to create a custom image rendering component, which can be found here.