Tuesday, May 17, 2016

Raspberry Pi Quadcopter (Almost 1 year update)

So I haven't been working on the Quad very much. I took it to college with me, but I couldn't find the time, space, or motivation to make any significant progress with it. The problem I was having last winter break, I believe, was that the code from PiStuffing worked, but the Quad just kept accelerating to its maximum thrust, and blew off its propeller caps. At first, I thought this might be a problem with the PID gains, because the guy who built the code probably had a quad with different specs and mass properties. I thought I would have to somehow find the right gains for my design.

Then just recently, I had a different idea. I realized that the commands we were giving the quad were just simple velocity commands, so the gains had nothing to do with the constant max thrust. Even with the wrong gains, we would still see a constant thrust (PWM signal) rather than an increasing one when the velocity was set to a constant. The problem is not with the code; it is with the experiment. When testing, I did not want to destroy my ceiling, or fly into international airspace, so I had tethered the quad to the ground using thread, with some slack.

When I ran the tests, the incorrect gains caused the quad to thrust up a little too much, but because of the thread, the quad could not rise effectively. The PID took the small measured velocity as its input, and since the setpoint demanded more velocity, it increased the thrust as its output. Once the thread was pulled taut, the measured velocity was pinned at zero, so the PID just kept increasing the thrust until the max thrust was reached.
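A minimal sketch of what I think was happening (this is not the actual PiStuffing code; the gains, time step, and thrust limit are made-up values): with the thread taut, measured velocity stays at zero while the setpoint stays positive, so the integral term winds up and the output climbs until it hits the PWM ceiling.

```python
# Toy velocity PID whose "plant" is tethered: measured velocity never
# changes, the error never closes, and the integral term accumulates
# until the output saturates at max thrust.

class PID:
    def __init__(self, kp, ki, kd, out_max):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.out_max = out_max
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, setpoint, measured, dt):
        error = setpoint - measured
        self.integral += error * dt
        derivative = (error - self.prev_error) / dt
        self.prev_error = error
        out = self.kp * error + self.ki * self.integral + self.kd * derivative
        return min(out, self.out_max)   # clamp at the PWM ceiling

pid = PID(kp=0.5, ki=1.0, kd=0.0, out_max=100.0)
# Setpoint of 1.0 m/s, but the tether keeps measured velocity at 0.0.
outputs = [pid.update(setpoint=1.0, measured=0.0, dt=0.1) for _ in range(2000)]
print(outputs[0], outputs[-1])  # starts small, ends clamped at max thrust
```

Note that no choice of gains fixes this: as long as the tether holds velocity at zero, any nonzero integral gain will eventually drive the output to the ceiling.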

I need to find a more effective way to test the quad.

SOLIDWORKS and Matlab

Robotics is an expensive hobby. With all of the electronic parts, the structural components, the power supplies, and the testing equipment, it is hard to find an easy way to build some of the ideas I had in mind. For example, I wanted to prove to myself that I could fully design and build a robotic arm, but I had no idea where to start, since randomly buying servos and parts and hoping it works is not good engineering practice, nor is it cost efficient.

I talked to one of my professors, Dr. Panagiotis Artemiadis, about using some of his EMG devices to control a simple robotic arm, as an interesting project. He agreed to help me (and I am extremely thankful), but first I have to have a robotic arm to use. I decided that instead of spending so much time and money on building a real arm, why not design one in SOLIDWORKS and simulate the motion and control systems in Matlab? I learned both of these programs in my first two years of college. So I created a very rudimentary robotic arm in SOLIDWORKS, with only two degrees of freedom, meaning two links.

Once I had that assembly made, after a little bit of online research, I installed the SimMechanics toolbox into Matlab, and the SimMechanics Link into SOLIDWORKS. Then I exported the assembly as an XML file through the SimMechanics Link, which seems to be the only way to actually import a model into Matlab/Simulink. When I opened the XML model of the arm in Simulink, the program had automatically created a project with all of the individual parts from my assembly, with all the relations put in.

When I ran the simulation, the output was a 10 second video of the arm starting in one orientation and then moving under the effects of gravity and the relations and geometries of the parts. Because it was just a two joint arm, it moved like a double pendulum, with very complex motion.

I realized that the Simulink model did not take into account any collisions between the parts, as the arms swung freely through the base, despite both being solid parts. I am now going to focus on adding angle constraints to prevent part collision. I will also add a PID control loop, with a step function and joint actuators at the hinges, to be able to control the angle and measure the torque necessary to hold up the arm.

That max torque will tell me how powerful a servo will be necessary to power the arm without breaking.
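As a sanity check on whatever the simulation reports, the worst-case holding torque can also be estimated by hand. This sketch uses made-up link masses and lengths (my real arm will differ) and assumes uniform links held horizontal, which is the worst case for a static hold.

```python
# Back-of-the-envelope servo sizing for a two-link arm.
# All masses/lengths below are assumed values, not my actual design.
g = 9.81              # gravity, m/s^2
m1, L1 = 0.20, 0.25   # link 1 (shoulder-to-elbow) mass in kg, length in m
m2, L2 = 0.15, 0.20   # link 2 (elbow-to-tip) mass in kg, length in m

# With the arm horizontal, each uniform link's weight acts at its midpoint.
# Shoulder servo carries link 1 at L1/2 and link 2 at L1 + L2/2.
shoulder_torque = g * (m1 * L1 / 2 + m2 * (L1 + L2 / 2))
# Elbow servo only carries link 2 at L2/2.
elbow_torque = g * (m2 * L2 / 2)

print(f"shoulder holding torque: {shoulder_torque:.3f} N*m")
print(f"elbow holding torque:    {elbow_torque:.3f} N*m")
```

A servo's rated stall torque should comfortably exceed the static number, since accelerating the arm or carrying a payload adds torque on top of just holding it up.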

Friday, March 18, 2016

Does technology drive new product development?

Whenever I see a new piece of tech come out to the market and get mad publicity, it seems like there is always some crazy breakthrough research behind it. How do I go about coming up with creative ideas for a useful, successful product if I don't have the research to back it?

My current creative process involves taking technologies I found on the internet, mashing them together with preexisting ideas from my own experience, and, part by part, working the idea into a product. Then, when I check online whether anything like this already exists, it turns out that it does, and it is 100 times better than the one I thought of.

Either that happens, or my idea for a cool invention is just so completely bizarre and unnecessary that I can't justify turning it into a reality.

In the past, people with truly revolutionary ideas, like Tesla, or Google, had original ideas that were ahead of their time. Maybe taking inspiration from current new tech is the wrong approach. Maybe the best way to do this is by changing the way I come up with new ideas. Maybe I can try avoiding new tech altogether for a while and see what happens. I draw inspiration from it, but if I stop that, perhaps I can focus solely on looking for real world problems that need to be solved. That is, of course, what most experienced people suggest.


Monday, March 14, 2016

EEG vs EMG

In the last post, which was a while back, I was discussing some ideas I had about the EEG headset and future technologies. The obvious problem is that the tech is so underdeveloped, due to our minimal knowledge of the human brain, that we can't hope to create any worthwhile robotic applications in the near future.

When I was discussing this, a friend of mine pointed me toward another, similar tech called electromyography, or EMG. It uses actual muscular stimulation as the data input, instead of extrapolated neural signal patterns, which currently have a very low degree of accuracy. With EMG, you place electrodes, similar to EEG electrodes, at specific locations on the muscle you are targeting. The sensor detects the electrical signals sent to the muscle through the nerves. This signal has a direct influence on the tension or relaxation of the muscle, so there is almost no way to interpret it incorrectly.
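A toy version of the standard way a raw EMG trace is turned into an on/off muscle signal: rectify it, smooth it into an envelope, and threshold. The signal here is synthetic (I don't have real electrode data yet), and the window size and threshold are arbitrary values chosen for this sketch.

```python
import random

random.seed(0)
# Synthetic "EMG": low-amplitude noise at rest, higher-amplitude noise
# while the muscle is tense (samples 300-699). Real EMG is messier, but
# the rectify -> smooth -> threshold pipeline is the same.
signal = [random.gauss(0, 0.5 if 300 <= i < 700 else 0.05)
          for i in range(1000)]

# Step 1: full-wave rectification (EMG is zero-mean, so take magnitudes).
rectified = [abs(s) for s in signal]

# Step 2: smooth into an amplitude envelope with a moving average.
def moving_average(xs, window):
    out, acc = [], 0.0
    for i, x in enumerate(xs):
        acc += x
        if i >= window:
            acc -= xs[i - window]
        out.append(acc / min(i + 1, window))
    return out

envelope = moving_average(rectified, window=50)

# Step 3: threshold the envelope to get a binary "muscle active" signal.
active = [e > 0.2 for e in envelope]

print("tense around sample 500:", active[500])
print("relaxed at sample 100:  ", active[100])
```

In a real device the same boolean could gate a command, e.g. "clench forearm to close the gripper."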

Myo is a product already available for purchase that uses this tech for basic audio/video control, such as pause, play, fast forward, and rewind: https://www.myo.com. It is an armband that detects gestures and relays them to any device with Bluetooth. It is also open to developers, and people like https://www.youtube.com/watch?v=nDeOFxhH5lY have used it in basic robotics as well.

I want to go a step further and use EMG tech to develop a fully prosthetic robotic arm, or a robotic sleeve that slips onto the arm.

The device will be an extension of the user. There are endless possibilities. Spider-Man webs shooting from one end? A specific hand pattern to unlock things, improving security? A hydraulic shoulder extension to punch harder? It'll be like Inspector Gadget! This is within our current capabilities as engineers.

Wednesday, January 27, 2016

Brainwave Reading Applications

As mentioned last time, this new technology has limitless applications. It is one of the first steps toward integrating technology into the human body.

The mind is a highly versatile and flexible tool. Techniques such as hypnosis, as well as experiences like learning a new skill or habit, show that the brain is not completely hardwired, and can be rearranged to link one mental thought pattern (a trigger) to a completely unrelated action. This means we can link different thoughts with the specific effects they cause through the EEG headset. This has been most prominently used in prosthetics. New experiments have shown that artificial limbs can be controlled by thoughts through the headset, in a natural, fluid motion. Obviously, you're not using the same neural pathways to control the artificial hand as the ones everyone else uses; however, the different pattern becomes second nature after constant use. That is how we learn.

A direct extension of this is the use of EEG headsets to control a full robotic body!

-Jump cut to the scene from James Cameron's Avatar (2009)- and now replace the blue humanoids with the robots from I, Robot. HOW COOL WOULD THAT BE!!

Of course, this level of thought reading is probably hundreds of years in the future, but the implications are so beautiful. Currently, the commercially available headsets have a very narrow range of functions. The MindWave specifically reads raw data, attention, relaxation, blinks, and a broad spectrum of brain wave frequencies. While this relatively mediocre technology reads a large bandwidth, it cannot effectively read more detailed fluctuations/patterns.

I was looking at the problem of controlling an RC car with this tech. There are four basic commands needed to operate it: forward acceleration, negative acceleration, turn left, and turn right. The headset only reads two mental states: attention and relaxation.

Some people have approached this problem by only using the headset for part of the controls. For example, use a joystick for controlling direction, and the headset for moving forward or backward. I don't know about you, but this seems pretty boring and unnecessary to me. I want full control. For this, I think we need to experimentally find new mental states/brain wave patterns to use.
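One hypothetical workaround, sketched below, for squeezing four commands out of two readable states plus blinks (this is not a real MindWave driver, just the mapping logic; the attention threshold of 70 is an arbitrary assumption): blinks cycle through the four commands, and a high attention reading fires the currently selected one.

```python
# Hypothetical headset-only control scheme for the four RC car commands.
COMMANDS = ["forward", "backward", "left", "right"]

class MindControlMapper:
    def __init__(self, attention_threshold=70):
        self.threshold = attention_threshold
        self.index = 0   # currently selected command (starts on "forward")

    def on_blink(self):
        # Each detected blink cycles to the next command.
        self.index = (self.index + 1) % len(COMMANDS)
        return COMMANDS[self.index]

    def on_attention(self, level):
        # Fire the selected command only while concentration is high;
        # relaxing below the threshold stops the car.
        if level >= self.threshold:
            return COMMANDS[self.index]
        return None

mapper = MindControlMapper()
print(mapper.on_blink())        # blink once: now "backward" is selected
print(mapper.on_attention(85))  # concentrating: fires "backward"
print(mapper.on_attention(40))  # relaxed: no command
```

The obvious downside is latency: blinking through a menu is slow, which is exactly why finding genuinely distinct mental states would be the better solution.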

To be continued...


Monday, January 25, 2016

TELEPATHY!!

Recently, while I was sitting in the university library, letting my thoughts wander and thinking about the technology of the future, I remembered a TED talk I had watched a couple of years ago about a headset that could read your thoughts! It was pretty far developed at the time, but I never heard of it actually taking off. So I looked into it again, and what I found was a gold mine!

There are currently so many businesses and research labs designing and working with small EEG (electroencephalogram) scanners, in the form of a headset or a cap, that read your thoughts. The scanners read a wide range of electrical signals from your brain, and neuroscience has correlated the different frequencies with different thought patterns.

Currently, the most common applications for these devices are treating mental health through biofeedback and helping the disabled. However, several low cost scanners have also been put on the market for enthusiasts and developers.

The potential for this tech is endless. It can be integrated into robotics, the internet of things, and data analytics, just to name a few. The best part is that while the actual devices are a minimum investment of $100, the software to operate them is virtually free, making it a developer's dream!

NeuroSky is one of the products I was looking into. They have a cheap EEG headset with Bluetooth capabilities, and furthermore, have made it a point to invite developers to use their product. While I don't doubt that they have some selfish motives for this (publicity and increased market size), giving people a chance to experiment and play with it is a big step in pushing technological development forward.

I am eager to invest in a NeuroSky headset and experiment with it. In my next post, I'll explain some of the ideas I have for this technology and its future, and how I plan to use it.

Saturday, January 9, 2016

Raspberry Pi Quadcopter Version 4

I went to college after the props shattered, and I took the Pi with me to continue my analysis of how I could tune the gains more easily.

I am a part of the Micro Air Vehicles (MAV) Club at ASU, and I took my problem, as well as my designs, to some of the members there. I got some pretty good feedback from them: first of all, the frame was too wide, so the props were washing back onto the quad, reducing the effectiveness of the motors and causing further turbulence that the PID was too slow to correct for. Essentially, I was pushing against myself.

After I came home for winter break 2015, I continued my search for a better design. I found a blog called PiStuffing. Some guy was using the Pi to build a quad, just like me, and although it was a work in progress, his code was a million times more sophisticated than the chicken scratch I had thrown together. Still, I found that I had the right idea, so I began to look at his GitHub repository and analyze his code. I adopted the files he had created and started experimenting. It took almost the whole break to understand it, because I started out having no idea how to even begin, but eventually I got it.

I bought strips of balsa wood, which is light, and new propellers. I once again mounted everything and began testing by tying the four corners of the quad to chairs, so that it would not flip over or shoot up into the sky. I learned how to debug code using print statements to narrow down the location of the problem.

I found some flaws in the code which were causing the quad to just go full thrust instantly, or perhaps that is due to my design. It turns out that the latest version is lighter than anticipated, so it takes less thrust to lift.

That is the point where I am right now. It may work reasonably, now that I have found a way to test it. I am going back to college now, but I will for sure continue trying to get it to fly in the future.

Quadcopter V.4