Articles by tag: software

    Making a Ports Map

    Making a Ports Map By Omar and Jayesh

    Task: Create a list of motors/servos and what ports they're connected to

    Very often, when we disconnect a motor or servo (maybe by accident), we forget what port it came from. This happened to us just today when we unplugged the servo that lifts the trough. Because of this hassle, we decided to write out a list of all the motors and servos on our robot and the ports they're connected to, so we can refer back to it if we ever forget where we unplugged something from.


    This didn't take us much time at all and was easy to do, so we may want to make more documents like this in the future to keep ourselves organized. Organization is definitely something we need to work on for this next competitive year, and even small things like this help.

    Programming our New Robot

    Programming our New Robot By Tycho, Caitlin, Ethan, and Jayesh

    Task: Program our new mecanum wheel driving platform

    Now that our new robot has been built on a mecanum wheel platform, we can start writing our drive code and figure out how to make our robot perform three basic motions: driving forwards and backwards, moving side to side, and rotating. We decided that, in order to best understand our robot, how it moves, and our code, we would write our drive code through trial and error. However, we did reference guides written by various other FTC and FRC teams when we got stuck on something and needed to figure out where to start in solving the problem.


    In order to drive our mecanum wheels properly, we first need to discuss how each wheel is placed on the robot, and how each wheel needs to move with respect to the others in order to move in a given direction. Each wheel has small rollers that point 45 degrees off of the larger wheel itself. To set up mecanum wheels properly, the rollers on each wheel have to point towards the center of the robot.

    This is important because if these wheels are not pointing in the proper direction, the rollers will begin to fight against each other, causing strange driving patterns that aren't very useful. We learned this the hard way: when reconstructing the wheel mounts, the positions of two wheels on the robot were flipped, causing the robot to drive in circles when we tried to drive sideways.

    The main reason we decided to go with mecanum wheels is that they open up so many different ways to navigate the field. Robots that properly use mecanum wheels can not only drive forwards and backwards and turn, but also move side to side. These three types of movement can be mixed with each other to do even cooler things, like moving diagonally or strafing, which is when a robot moves in an arc while moving sideways. Of course, we cannot use mecanum wheels to their full potential if we do not first understand the three basic types of movement. Driving forwards and backwards is pretty simple, and it's the same as on any other robot: all of the wheels have to move in the same direction. Turning also remains unchanged from other platforms: the left side of the robot has to move one way, and the right side has to move the other.

    However, moving side to side is not as intuitive as the others. To move side to side, the wheels on each side have to move opposite each other. For example, if I wanted the robot, viewed from a top-down perspective, to drive to the right, the wheels on the left side of the robot would have to drive away from each other, while the wheels on the right side would have to drive towards each other. The rollers spin away from the center and to the right on the left side of the robot, and towards the center and to the right on the right side. The forwards and backwards components of the wheels and rollers cancel each other out, and the robot moves to the right.
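
    In code, that wheel-direction reasoning reduces to the standard mecanum mixing equations. The sketch below is a minimal illustration of them, assuming variable names and wheel signs that match the layout described above - it is not our actual drive code.

        // Minimal sketch of the standard mecanum mixing equations.
        // forward, strafe, and rotate are each in the range [-1, 1];
        // positive strafe means "to the right" as described above.
        double frontLeft  = forward + strafe + rotate;
        double frontRight = forward - strafe - rotate;
        double backLeft   = forward - strafe + rotate;
        double backRight  = forward + strafe - rotate;

        // Scale down so no wheel is asked for more than full power.
        double max = Math.max(Math.abs(frontLeft), Math.max(Math.abs(frontRight),
                     Math.max(Math.abs(backLeft), Math.abs(backRight))));
        if (max > 1.0) {
            frontLeft /= max; frontRight /= max;
            backLeft /= max; backRight /= max;
        }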

    Mecanum Driving

    Mecanum Driving By Tycho

    Task: Code driving with mecanum wheels

    Today, I wrote the whole code for controlling our mecanum wheels. It is entirely from scratch, and works perfectly right off the bat. This code allows us to strafe, move backwards and forwards, and rotate, all in one method.
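
    As a rough illustration of what "in one method" looks like, a drive method along these lines would take all three motion inputs at once and apply the mixed powers. This is a hedged sketch, not the actual code; the method and motor names are made up for illustration.

        // Hedged sketch of a single drive method; names are assumptions.
        void mecanumDrive(double forward, double strafe, double rotate) {
            double fl = forward + strafe + rotate;
            double fr = forward - strafe - rotate;
            double bl = forward - strafe + rotate;
            double br = forward + strafe - rotate;
            // Normalize only if some wheel would exceed full power.
            double max = Math.max(1.0, Math.max(Math.abs(fl), Math.max(Math.abs(fr),
                         Math.max(Math.abs(bl), Math.abs(br)))));
            motorFrontLeft.setPower(fl / max);
            motorFrontRight.setPower(fr / max);
            motorBackLeft.setPower(bl / max);
            motorBackRight.setPower(br / max);
        }

        // In teleop it might be fed straight from the gamepad sticks:
        // mecanumDrive(-gamepad1.left_stick_y, gamepad1.left_stick_x, gamepad1.right_stick_x);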


    We still have a lot of coding to do, as we're currently working on a particle-launching system. As well, we need to consider autonomous soon.

    Autonomous Setup Options

    Autonomous Setup Options By Tycho

    Task: Create a basic autonomous

    Autonomous is one of the things we tend to be weak on every year, and this year we really want to get to Super Regionals. So, to start off this year's autonomous, we first mapped out a potential path for the robot on the field. We then followed up with programming, using our existing methods like driveForward and driveCrab. Now we have a basic autonomous program in which we can push the cap ball and attempt to shoot the vortex.
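
    A routine built from those methods might look something like the sketch below. The signatures, powers, distances, and the launchParticle helper are illustrative assumptions, not our calibrated values.

        // Hedged sketch of a basic autonomous built from the driveForward
        // and driveCrab methods named above; all values are placeholders.
        public void basicAutonomous() {
            driveForward(0.5, 1.2); // (power, meters) - approach the center vortex
            launchParticle();       // hypothetical helper for our shooter
            driveForward(0.5, 0.8); // keep going to push the cap ball
            driveCrab(0.5, 0.6);    // crab sideways toward a parking spot
        }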


    We still have a long way to go in working on our autonomous - we need to be more accurate in shooting the vortex, we would like to hit the beacons, and we want to get parking points as well.

    Combining TeleOp and Autonomous

    Combining TeleOp and Autonomous By Tycho

    Task: Combine TeleOp and Autonomous code

    Today, I combined the autonomous and teleop code so that we can demo both more easily. As well, during testing we can now switch between them seamlessly, which makes our testing much more efficient. The most important part of this code is that we can configure the autonomous before we launch - telling the robot how many balls we have, how many to shoot, which side of the field the robot is on, and other pertinent options.
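
    As a hedged sketch of what that pre-match configuration could look like, something along these lines would run during init, before start is pressed; the field names and button mapping here are illustrative assumptions, not our actual bindings.

        // Hedged sketch of pre-launch autonomous configuration.
        int numBalls = 2;       // particles preloaded in the robot
        int numToShoot = 2;     // how many to launch in autonomous
        boolean isBlue = true;  // which alliance we are on

        // Polled in init_loop, before the match starts:
        if (gamepad1.dpad_up)   numToShoot = Math.min(numToShoot + 1, numBalls);
        if (gamepad1.dpad_down) numToShoot = Math.max(numToShoot - 1, 0);
        if (gamepad1.x) isBlue = true;   // X selects blue
        if (gamepad1.b) isBlue = false;  // B selects red
        telemetry.addData("Alliance", isBlue ? "Blue" : "Red");
        telemetry.addData("Shots", numToShoot + "/" + numBalls);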


    We still need more code and fixes - our robot keeps having random errors while launching. As well, we have intermittent lag.

    Fixing Faulty Encoder

    Fixing Faulty Encoder By Tycho and Jayesh

    Task: Fix a faulty encoder on our robot

    This shows a test of our encoder issues. About a month ago, we noticed strange behavior in our autonomous code when the robot was moving forward at low speed: it would curve to the right when we were telling it to go straight. We probably would have noticed the problem earlier if we had any kind of subtlety in our driving. But we didn't, partly because the problem goes away when driving at full speed.

    We suspected that the problem was in the encoder feedback of the front left drive wheel. In this video you can see how it ramps up to full speed much faster than the other motors. Here we are driving the robot with encoder PID active. But when driving backward, the motor/wheel behaves properly. This indicated to us that one channel of the encoder was working normally while the other was skipping ticks. But the problem could be in the encoder, the encoder cable, or the motor controller.

    Since our motors take some time to change out and we were heading off to the Arkansas State Championship, we decided to simply turn off PID control. The problem goes away when we are driving the motors with open-loop power levels, showing that the encoder was the culprit and that motor imbalance was not an issue. The problem with just turning off PID control is that we were still getting bad odometry, since our method of calculating distance traveled is based on averaging the encoders across the four motors. So we had to adjust our target distances in autonomous by trial and error instead of proper calculations.
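
    For reference, the distance averaging works roughly like the sketch below; the motor names and the ticks-per-meter constant are illustrative assumptions, not our actual calibration.

        // Hedged sketch of averaged-encoder odometry.
        static final double TICKS_PER_METER = 2500.0; // placeholder calibration
        double avgTicks = (motorFrontLeft.getCurrentPosition()
                         + motorFrontRight.getCurrentPosition()
                         + motorBackLeft.getCurrentPosition()
                         + motorBackRight.getCurrentPosition()) / 4.0;
        double distanceMeters = avgTicks / TICKS_PER_METER;
        // A dead encoder channel pins one term near zero and drags the
        // whole average down, which is why our odometry went bad.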

    Here we are trying to identify the source of our encoder issues. We swapped the encoder cable and the ports on the motor controller, but the problem stayed with the front left motor. That told us that the motor's encoder itself was the issue. We confirmed through telemetry that the encoder was giving no ticks when driving forward, but worked fine in reverse. So we replaced the whole motor, and this time it sped up faster than the other motors in both directions. That really mystified us for a bit. It turns out the substitute motor had an encoder that was dead on both channels. What were the odds of that? We'd simply pulled a motor from our inventory and assumed it would be OK, but it must have been one that hadn't needed to be encoder-controlled. We finally found a third motor, tested its encoder this time before putting it on the robot, and all is well now. We can now drive the robot under PID control, and it drives as expected at various power levels. We still need to re-calibrate our blended travel distances with the working encoder, but we feel much better about our performance. We can even reliably use run-to-position commands if we want, though our navigation code doesn't require that.

    This just shows the proper behavior of the robot after our encoder troubles were resolved:

    Return to Machine Vision

    Return to Machine Vision By Tycho

    Task: Prepare to reintegrate machine vision

    A year and a half ago, while the new Android-based platform was still in pre-launch, we were the first team to share a machine vision testbed on the FTC Forums. That color-blob tracker was implemented with OpenCV on Android, but on a different low-level control system and robotics framework. Then we integrated OpenCV into our implementation of ftc_app, which was in turn based on the great work of rgatkinson's team at Swerve Robotics. Our main game repo for FIRST RES-Q was also open-sourced, and we gained a lot of experience using it. But we had many issues that prevented full use of the capability. There were problems with the control system that created extremely variable loop times, which really challenged our custom PID implementation. On top of that, we found that at many tournaments the beacons were not working, or the lighting was so bright that our camera was being flooded by the white shell of the beacons. That made it an unreliable solution.

    So this year we switched to the Modern Robotics color sensor as a slightly more reliable method of detecting the beacon color up close. This also gave us the ability to add color sensors to both sides of the robot, so we no longer have to turn around when on the blue alliance. And we found that our navigation based on calibrated odometry was good enough to get us into position without color tracking.
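
    Reading that sensor in ftc_app is simple; a minimal sketch, assuming a sensor configured under a hypothetical device name, might look like:

        // Hedged sketch: comparing red vs. blue on a Modern Robotics
        // color sensor; the "beaconColor" device name is an assumption.
        ColorSensor beaconSensor = hardwareMap.colorSensor.get("beaconColor");
        boolean seeingRed = beaconSensor.red() > beaconSensor.blue();
        telemetry.addData("beacon side", seeingRed ? "red" : "blue");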

    But now we need to go ahead and try to re-integrate our previous machine vision code and see if we can improve on the situation. We also need to at least try out Vuforia's object tracking capabilities, even though we've set that as a lower priority because we know specular reflections are likely to be a problem under the varying lighting conditions at different competition venues. We've noticed this problem at a couple of venues already, due to the marker placement behind the planar polycarb of the border walls. So we don't plan to rely on this as a primary means of navigation and positioning, but we should try it out and see how robust it might be.

    We still want to use machine vision to track beacons and particles. We are hoping to track particles so we can create an autonomous behavior that triggers during teleop: a particle near the front of our particle collector would be automatically approached and pulled in without operator intervention. This should help, since picking up particles on the far side of the robot from the drivers is very difficult because of blocked sight lines. We also want to use color tracking to make sure we don't pick up opposing alliance particles.

    Research / References

    I checked out Vuforia, and there is no ability to track based on color. So we need to use OpenCV again, but when Vuforia is present it also locks access to the camera. Fortunately, there is now a way to get a frame from Vuforia and reformat the image data for OpenCV's use.
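
    Roughly, that approach looks like the sketch below; this is a minimal illustration assuming an already-initialized VuforiaLocalizer named vuforia, not our final code.

        // Hedged sketch: pulling an RGB565 frame out of Vuforia so OpenCV
        // can use it. Assumes `vuforia` is an initialized VuforiaLocalizer.
        Vuforia.setFrameFormat(PIXEL_FORMAT.RGB565, true); // request RGB565 frames
        vuforia.setFrameQueueCapacity(1);

        // take() blocks until a frame arrives (can throw InterruptedException)
        VuforiaLocalizer.CloseableFrame frame = vuforia.getFrameQueue().take();
        for (int i = 0; i < frame.getNumImages(); i++) {
            Image img = frame.getImage(i);
            if (img.getFormat() == PIXEL_FORMAT.RGB565) {
                Bitmap bm = Bitmap.createBitmap(img.getWidth(), img.getHeight(),
                                                Bitmap.Config.RGB_565);
                bm.copyPixelsFromBuffer(img.getPixels());
                Mat mat = new Mat();
                Utils.bitmapToMat(bm, mat); // org.opencv.android.Utils
                // mat is now ready for OpenCV color tracking
            }
        }
        frame.close();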

    I plan to write another post documenting the actual steps we went through. Stay tuned. If Vuforia proves troublesome, we might revert to getting our image from a camera preview just like last year, though that would mean messing around with the Android manifest and the layouts in the main FtcRobotController folder.

    Inspire Award

    Inspire Award By Tycho, Jayesh, Caitlin, Omar, Max, Darshan, Evan, Ethan, Janavi, and Charlotte

    1st Place at North Texas Regional Championship

    Iron Reign members left to right are Ethan Helfman (Build, Communications), Janavi Chada (Programming, Communications), Tycho Virani (Programming Lead, Main Driver), Jayesh Sharma (Business Lead, Build, Communications), Darshan Patel (Build), Caitlin Rogers (Communications Lead, Logistics, Business) and Charlotte Leakey (Programming, Logistics), with Evan Daane (from BTW, Build, Photography) in repose. Not shown: Max Virani (Design Lead, Programming), Omar Ramirez (Build Lead) and Rohit Shankar (Programming).

    Wow, we did it. I mean, we were going for it, but wow - we did it! Out of 118 teams competing in our region, we took 1st Place Inspire (the top award) at our regional championship! We finally earned the coveted Inspire banner, something we've been building toward for 7 years, ever since we started as an FLL team.

    Our total awards included Inspire 1st, Finalist Alliance 2nd, Motivate 2nd, Connect 3rd, Innovate 3rd.

    Not going to Disney World yet

    We are now qualified for the Texas State UIL Robotics Championship and the 12-state South Super Regional, and we are preparing with the goal of making it to the World Championship. We have an extended season, and while some of us have been to Super Regionals before, this is the first time the whole team gets to go. Our coffers are empty, and we need a whole new round of fundraising to keep up the progress for the extended season. We need your help! Please consider contributing to support our extended season and help us represent North Texas at Supers.

    In case you don't know how the game works, each match is broken into a 30-second autonomous phase followed by a 2-minute driver-controlled period. Two alliances of two robots each compete in every match. Here is our division-winning match with alliance mates Technibots. Autonomous:

    And Tele-Op:


    Vuforia

    Vuforia By Janavi and Tycho

    Task: Use Vuforia to enhance autonomous

    We use Vuforia and OpenCV to autonomously drive our robot to the beacon and then press the button corresponding to our alliance's colour. We started by getting the robot to recognize the image below the beacon and keep it within its line of vision. Vuforia uses the phone's camera to inspect its surroundings and locate target images. When an image is located, Vuforia is able to determine the position and orientation of the image relative to the camera.
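
    In ftc_app, that pose comes back from the trackable's listener. A minimal sketch, assuming trackable is a VuforiaTrackable we have already activated:

        // Hedged sketch: reading a target's pose from Vuforia.
        OpenGLMatrix pose =
                ((VuforiaTrackableDefaultListener) trackable.getListener()).getPose();
        if (pose != null) { // null when the image is not in view
            VectorF translation = pose.getTranslation();
            telemetry.addData("x (mm)", translation.get(0));
            telemetry.addData("y (mm)", translation.get(1));
            telemetry.addData("z (mm)", translation.get(2));
        }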

    To start setting up our robot's vision, we watched Team 3491 FixIt's videos on Vuforia to help us understand how to set it up. After finishing the code for following the image, we went to test it out. We found out that we had accidentally coded the robot to follow the picture by moving up and down, because we had set the phone up for portrait mode instead of landscape. After fixing that, we tested the robot again, and it ended up attacking Tycho by running at him and the image at full speed. Apparently, we had accidentally told the robot to go much farther than it was supposed to by placing a parenthesis in the wrong spot. We tested the code one more time, only this time I held the picture while standing on top of a chair. Luckily, the robot worked this time and was able to follow the image both ways.


    We would like to explore the uses of Vuforia+OpenCV. We are considering using it to determine particle color as well as using it to view the images beneath the beacons.


    OpenCV

    OpenCV By Ethan and Tycho

    Task: Implement OpenCV in autonomous

    Last year, we gained some experience with OpenCV pressing the beacons, and this year we decided to do the same. We use OpenCV in conjunction with Vuforia to find the color we are looking for on the beacon: first it detects the search pattern in the camera view with Vuforia, then it isolates that area and finds the side of the beacon with the correct color. Our code is based on that of FTC team 3491, FixIt.
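
    A minimal sketch of the color-isolation step, assuming beaconRegion is an RGB Mat already cropped to the beacon using Vuforia's fix on the image (the name and HSV thresholds are illustrative):

        // Hedged sketch: decide which half of the beacon is blue.
        Mat hsv = new Mat();
        Imgproc.cvtColor(beaconRegion, hsv, Imgproc.COLOR_RGB2HSV);
        Mat blueMask = new Mat();
        Core.inRange(hsv, new Scalar(100, 100, 50), new Scalar(130, 255, 255), blueMask);

        int width = blueMask.cols();
        Mat left  = blueMask.submat(0, blueMask.rows(), 0, width / 2);
        Mat right = blueMask.submat(0, blueMask.rows(), width / 2, width);

        // Whichever half contains more blue pixels is the blue side.
        boolean blueOnLeft = Core.countNonZero(left) > Core.countNonZero(right);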

    In a previous post, we talked about our OpenCV and Vuforia research, as well as how we planned to use them in the robot game. And now, I am happy to say, we have partially implemented it in our autonomous. We can now figure out which side of the beacon a specified color is on, using our new phones, the Galaxy S5. (Side note - do not update them to Android Lollipop if you want to use OpenCV; it will not work.)


    In the future, we would like to implement OpenCV so that we can detect the color of a particle during autonomous and pick it up to shoot it. However, we probably don't have time to program it before Super Regionals.