Articles by tag: software

    Balancing and PID

    Balancing and PID By Tycho

    Task: Test and improve the PID system and balance code

    We're currently testing code to give Argos a balancing system so that we can demo it. This is also a test of the PID support in the new REV Robotics Expansion Hubs, which we plan to switch to this season if they prove reliable. Example code is below.

    public void BalanceArgos(double Kp, double Ki, double Kd, double pwr, double currentAngle, double targetAngle)
     {
         //sanity check - exit balance mode if we are out of recovery range

         if (isBalanceMode()){ //only balance in the right mode

             setHeadTilt(nod);

             //servo steering should be locked straight ahead
             servoSteerFront.setPosition(0.5);
             servoSteerBack.setPosition(0.5);

             balancePID.setOutputRange(-.5, .5);
             balancePID.setPID(Kp, Ki, Kd);
             balancePID.setSetpoint(staticBalance);
             balancePID.enable();
             balancePID.setInput(currentAngle);
             double correction = balancePID.performPID();

             //log a CSV line of the PID internals for tuning
             logger.UpdateLog(Long.toString(System.nanoTime()) + ","
                     + Double.toString(balancePID.getDeltaTime()) + ","
                     + Double.toString(currentAngle) + ","
                     + Double.toString(balancePID.getError()) + ","
                     + Double.toString(balancePID.getTotalError()) + ","
                     + Double.toString(balancePID.getDeltaError()) + ","
                     + Double.toString(balancePID.getPwrP()) + ","
                     + Double.toString(balancePID.getPwrI()) + ","
                     + Double.toString(balancePID.getPwrD()) + ","
                     + Double.toString(correction));

             timeStamp = System.nanoTime();
             motorFront.setPower(correction);
         }
     }

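    For reference, here is a minimal sketch of the math a performPID()-style update does. This is not our actual PIDController class (which also tracks delta time from real clock readings and clamps output to the range set by setOutputRange()); the class and method names here are illustrative only.

    ```java
    // Illustrative-only PID update, showing the P, I, and D terms behind
    // a performPID()-style call. Not our real controller class.
    public class MiniPID {
        private final double kP, kI, kD;
        private final double setpoint;
        private double totalError = 0;  // integral accumulator
        private double prevError = 0;   // for the derivative term

        public MiniPID(double kP, double kI, double kD, double setpoint) {
            this.kP = kP; this.kI = kI; this.kD = kD;
            this.setpoint = setpoint;
        }

        // input = current angle, dt = seconds since the last update
        public double update(double input, double dt) {
            double error = setpoint - input;
            totalError += error * dt;                     // I: accumulated error
            double deltaError = (error - prevError) / dt; // D: rate of change
            prevError = error;
            return kP * error + kI * totalError + kD * deltaError;
        }
    }
    ```

    The logged columns in the code above (error, total error, delta error, and the three power terms) correspond directly to the intermediate values in this update, which is what makes the CSV log useful for tuning Kp, Ki, and Kd.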

    REV Robot Reveal

    REV Robot Reveal By Tycho, Austin, Charlotte, Omar, Evan, and Janavi

    Argos V2 - a REV Robot Reveal

    This video was pulled from Argos visits to: The NSTA STEM Expo in Kissimmee FL, in the path of eclipse totality in Tennessee, and in North Texas at The Dallas Makerspace, The Southwest Center Mall, Southside on Lamar and the Frontiers of Flight Museum. We hope you find it interesting:

    Machine Vision Goals – Part 1

    Machine Vision Goals – Part 1 By Tycho

    We’ve been using machine vision for a couple of years now and have a plan to use it in Relic Rescue for a number of things. I mostly haven’t gotten to it because college application deadlines have a higher priority for me this year. But since we already have experience with color blob tracking in OpenCV and Vuforia tracking, I hope this won’t be too difficult. We have 5 different things we want to try:

    VuMark decode – this is obvious since it gives us a chance to regularly get the glyph crypto bonus. From looking at the code, it seems to be a single line different from the Vuforia tracking code we’ve already got. It’s probably a good idea to signal the completed decode by flashing our lights or something like that. That will make it more obvious to judges and competitors.

    Jewel Identification – most teams seem to be using the REV color sensor on their jewel displacement arm. We’ll probably start out doing that too, but I’d also like to use machine vision to identify the correct jewel. Just because we can. Judging by the arrangement, we should be able to get both the jewels and the Vuforia target in the same frame at the beginning of autonomous.

    Alignment – it is not legal to extend a part of the robot outside of the 18” dimensions during match setup. So we can’t put the jewel arm out to make sure it is between the jewels. But there is nothing preventing us from using the camera to assist with alignment. We can even draw on the screen where the jewels should appear, like inside the orange box below. This will also help with Jewel ID – we won’t have to hunt for the relevant pixels – we can just compare the average hue of the two regions around the wiffle balls.
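    The region comparison could be as simple as the sketch below. Everything here is hypothetical – the class name, the sampling of hue values from the two regions, and the OpenCV-style 0–179 hue range are assumptions, not our actual code. Red wraps around the ends of the hue scale, so the sketch shifts hues by 90 before averaging.

    ```java
    // Hypothetical sketch of the jewel-ID idea: given hue samples from the two
    // screen regions where the jewels should sit, decide which side is red.
    // Assumes OpenCV-style hues in 0-179, where red is near 0/179 and blue ~120.
    public class JewelId {
        // Average hue of a region, shifted by 90 (mod 180) so red's wrap-around
        // at 0/179 lands mid-range (~90) instead of splitting across the ends.
        static double avgShiftedHue(int[] hues) {
            double sum = 0;
            for (int h : hues) sum += (h + 90) % 180;
            return sum / hues.length;
        }

        // True if the left region looks redder than the right region:
        // after the shift, red clusters near 90, blue lands near 30.
        static boolean leftIsRed(int[] leftHues, int[] rightHues) {
            double leftDist = Math.abs(avgShiftedHue(leftHues) - 90);
            double rightDist = Math.abs(avgShiftedHue(rightHues) - 90);
            return leftDist < rightDist;
        }
    }
    ```

    Because we only need to know which of the two regions is redder, a relative comparison like this avoids picking a fixed hue threshold that might break under different field lighting.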

    Autonomous Deposition – this is the most ambitious use for machine vision. The dividers on the cryptoboxes should make pretty clear color blob regions. If we can find the center points between these regions, we should be able to code an automatically centering glyph-depositing behavior.

    Autonomous glyph collection – OK, this is actually harder. Teams seem to spend most of their time retrieving glyphs, and most of that time seems to go to getting the robot and the glyphs square with each other. Our drivers have a lot of trouble with this even though we have a very maneuverable mecanum drive. What if we could create a behavior that automatically aligns the robot to a target glyph on approach? With our PID routines we should be able to do this pretty efficiently. The trouble is we need to figure out the glyph orientation by analyzing frames on approach, and that probably means shape analysis – something we’ve never done before. If we get to this, it won’t be until pretty late in the season. Maybe we’ll come up with a better mechanical approach to aligning glyphs with our bot and this won’t be needed.

    Tools for Experimenting

    Machine vision folks tend to think about image analysis as a pipeline that strings together different image processing algorithms in order to understand something about the source image or video feed. These algorithms are often things like convolution filters that isolate different parts of the image. You have to decide which stages to put into a pipeline depending on what that pipeline is meant to detect or decide. To make it easier to experiment, it’s good to use tools that let you create these pipelines and play around with them before you try to hard-code it into your robot.

    I've been using a tool called ImagePlay (http://imageplay.io/). It's open source and based on OpenCV. I used it to create a pipeline that has some potential to help navigation in this year's challenge. Since ImagePlay is open source, once you have a pipeline you can figure out the calls it makes to OpenCV to construct the stages. It's based on the C++ implementation of OpenCV, so we’ll have to translate that to Java for Android. It has a very nice pipeline editor that supports branching. The downside is that the tool is buggy and doesn't have anywhere near the number of filters and algorithms that RoboRealm supports.

    RoboRealm is what we wanted to use. We’ve been pretty closely connected with the Dallas Personal Robotics Group (DPRG) for years, and Carl Ott is a member who has taught a couple of sessions on using RoboRealm to solve the club’s expert line following course. Based on his recommendation we contacted the RoboRealm folks, and they gave us a 5-user commercial license. I think that’s valued at $2,500. They seemed happy to support FTC teams.

    RoboRealm is much easier to experiment with, and they have great documentation, so we now have an improved pipeline. It's going to take more work to figure out how to implement that pipeline in OpenCV, because it’s not always clear what a particular stage in RoboRealm does at a low level. But this improved pipeline isn’t all that different from the ImagePlay version.

    Candidate Pipeline

    So here is a picture of a red cryptobox sitting against a wall with a bunch of junk in the background. This image ended up upside down, but that doesn’t matter for just experimenting. I wanted a challenging image, because I want to know early if we need to have a clean background for the cryptoboxes. If so, we might need to ask the FTA if we can put an opaque background behind the cryptoboxes:

    Stage 1 – Color Filter – this selects only the reddest pixels

    Stage 2 – GreyScale – Don’t need the color information anymore, this reduces the data size

    Stage 3 – Flood Fill – This simplifies a region by flooding it with the average color of nearby pixels. It’s the same thing as the posterize effect in Photoshop, and it also tends to remove some of the background noise.
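    To make the idea concrete, here is a rough, hypothetical version of that effect on a row of greyscale pixels. A real flood fill averages connected neighborhoods; the uniform quantization below is a simplification of the same "collapse nearby values into a few levels" idea.

    ```java
    // Simplified posterize: collapse 0-255 grey values into a few levels.
    // Nearby values snap to the same level, which flattens regions and
    // suppresses small background variations (noise). Illustrative only.
    public class Posterize {
        static int[] posterize(int[] grey, int levels) {
            int step = 256 / levels;
            int[] out = new int[grey.length];
            for (int i = 0; i < grey.length; i++) {
                // snap each pixel to the middle of its band
                out[i] = Math.min(255, (grey[i] / step) * step + step / 2);
            }
            return out;
        }
    }
    ```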

    Stage 4 – Auto Threshold – Turns the image into a B/W image with no grey values based on a thresholding algorithm that only the RoboRealm folks know.

    Stage 5 – Blob Size – A blob is a set of connected pixels with a similar value. Here we are limiting the output to the 4 largest blobs, because normally there are 4 dividers visible. In this case there is an error: the small blob on the far right is classified as a divider even though it is just some other red thing in the background. That happened because the leftmost divider was mostly cut out of the frame and wasn’t lit very well, so it got erased by this pipeline and freed up a slot for the background blob.

    Stages 6 & 7 – Moment Statistics – Moments are calculations that can help to classify parts of images. We’ve used Hu Moments since our first work with machine vision on our robot named Argos. They can calculate the center of a blob (center of gravity), its eccentricity, and its area. Here the center of gravity is the little red square at the center of each blob. Now we can calculate the midpoint between each blob to find the center of a column and use that as a navigation target if we can do all this in real-time. We may have to reduce image resolution to speed things up.
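    The midpoint calculation at the end is simple once we have the divider centroids. A sketch of that step follows – the input format (centroid x coordinates in pixels) and the class name are assumptions for illustration:

    ```java
    import java.util.Arrays;

    // Sketch of the navigation-target step: given the x coordinates of the
    // divider blob centroids (centers of gravity), compute the midpoint
    // between each adjacent pair. Each midpoint is the screen-space center
    // of a cryptobox column we could steer toward.
    public class ColumnTargets {
        static double[] columnCenters(double[] dividerX) {
            double[] sorted = dividerX.clone();
            Arrays.sort(sorted);  // put dividers in left-to-right order
            double[] centers = new double[sorted.length - 1];
            for (int i = 0; i < centers.length; i++) {
                centers[i] = (sorted[i] + sorted[i + 1]) / 2.0;
            }
            return centers;
        }
    }
    ```

    With 4 dividers detected this yields 3 column centers, matching the 3 columns of a cryptobox; how fast we can run the full pipeline per frame is still the open question.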

    Working on Autonomous

    Working on Autonomous By Tycho

    Task: Create a temporary autonomous for the bot

    We attempted to create an autonomous for our first scrimmage. It aimed to make the robot drive forward into the safe zone. However, we forgot to align the robot, and it failed at the scrimmage.

    Instead of walking through the code as usual, we’ve documented its main functions with comments so that anyone can understand what it does without prior knowledge of coding.

     public void autonomous2 (){
    
            switch(autoState){
                case 0: //strafe sideways to line up
                    if (robot.driveStrafe(false, .60, .35)) {
    
                        robot.resetMotors(true);
                        autoState++;
                    }
                        break;
                case 1: //drive forward toward the safe zone
                    if (robot.driveForward(false, .25, .35)) {
                        autoTimer = futureTime(1f);
                        robot.resetMotors(true);
                        autoState++;
                    }
    
                    break;
                case 2: //toggle the glyph grip
    
                    robot.glyphSystem.ToggleGrip();
                    autoTimer = futureTime(1f);
    
                    robot.resetMotors(true);
                    autoState++;
                    break;
                case 3: //back off of the balance stone
                    if (robot.driveForward(true, .10, .35)) {
                        autoTimer = futureTime(3f);
                        robot.resetMotors(true);
                        autoState++;
                    }
                    break;
                case 4: //re-orient the robot
                    autoState++;
                    break;
                case 5: //drive to proper crypto box column based on vuforia target
                    autoState++;
                    break;
                case 6: // turn towards crypto box
                    autoState++;
                    break;
                case 7: //drive to crypto box
                    autoState++;
                    break;
                case 8: //deposit glyph
                    autoState++;
                    break;
                case 9: //back away from crypto box
                    autoState++;
                    break;
            }
        }
    

    Adding Code Fixes to the Robot

    Adding Code Fixes to the Robot By Tycho

    Task: Add code updates

    These commits add the following functionality:

    • Pre-game logic - joystick control
    • Fix PID settings
    • Autonomous resets motor
    • Jewel Arm functionality
    • Autonomous changes
    • Tests servos

    These commits improve quality of life for our drivers, help our robot run more smoothly in both autonomous and TeleOp, let us score the jewels, and let us test servos.

    Jewel Arm


    package org.firstinspires.ftc.teamcode;
    
    import com.qualcomm.robotcore.hardware.NormalizedColorSensor;
    import com.qualcomm.robotcore.hardware.Servo;
    
    /**
     * Created by 2938061 on 11/10/2017.
     */
    
    public class JewelArm {
    
        private Servo servoJewel;
        private NormalizedColorSensor colorJewel;
        private int jewelUpPos;
        private int jewelDownPos;
    
        public JewelArm(Servo servoJewel, NormalizedColorSensor colorJewel, int jewelUpPos, int jewelDownPos){
            this.servoJewel = servoJewel;
            this.colorJewel = colorJewel;
            this.jewelUpPos = jewelUpPos;
            this.jewelDownPos = jewelDownPos;
        }
    
        public void liftArm(){
            servoJewel.setPosition(ServoNormalize(jewelUpPos));
        }
        public void lowerArm(){
            servoJewel.setPosition(ServoNormalize(jewelDownPos));
        }
    
        public static double ServoNormalize(int pulse){
            double normalized = (double)pulse;
        return (normalized - 750.0) / 1500.0; //convert MR servo controller pulse width (750-2250) to a 0-1 scale
        }
    
    }
    

    Autonomous

    		public void autonomous(){
            switch(autoState){
                case 0: //scan vuforia target and deploy jewel arm
                    robot.jewel.lowerArm();
                    if (autoTimer == 0) autoTimer = futureTime(1.5f); //arm the timer once, not every loop
                    if(autoTimer < System.nanoTime()) {
                        relicCase = getRelicCodex();
                        jewelMatches = robot.doesJewelMatch(isBlue);
                        autoTimer = 0; //clear so the next timed state can re-arm
                        autoState++;
                    }
                    break;
                case 1: //small turn to knock off jewel
                    if ((isBlue && jewelMatches)||(!isBlue && !jewelMatches)){
                        if(robot.RotateIMU(10, .5)){
                            robot.resetMotors(true);
                        }
                    }
                    else{
                        if(robot.RotateIMU(350, .5)){
                            robot.resetMotors(true);
                        }
                    }
                    break;
                case 2: //lift jewel arm
                    robot.jewel.liftArm();
                    if (autoTimer == 0) autoTimer = futureTime(1.5f); //arm the timer once, not every loop
                    if(autoTimer < System.nanoTime()) {
                        jewelMatches = robot.doesJewelMatch(isBlue);
                        autoTimer = 0;
                        autoState++;
                    }
                    break;
                case 3: //turn parallel to the wall
                    if(isBlue){
                        if(robot.RotateIMU(270, 2.0)){
                            robot.resetMotors(true);
                            autoState++;
                        }
                    }
                    else{
                        if(robot.RotateIMU(90, 2.0)){
                            robot.resetMotors(true);
                            autoState++;
                        }
                    }
                    break;
                case 4: //drive off the balance stone
                    if(robot.driveForward(true, .3, .5)) {
                        robot.resetMotors(true);
                        autoState++;
                    }
                    break;
                case 5: //re-orient robot
                    if(isBlue){
                        if(robot.RotateIMU(270, 1.0)){
                            robot.resetMotors(true);
                            autoState++;
                        }
                    }
                    else{
                        if(robot.RotateIMU(90, 1.0)){
                            robot.resetMotors(true);
                            autoState++;
                        }
                    }
                    break;
                case 6: //drive to proper crypto box column based on vuforia target
                    switch (relicCase) {
                        case 0:
                            if(robot.driveForward(true, .5, .35)) {
                                robot.resetMotors(true);
                                autoState++;
                            }
                            break;
                        case 1:
                            if(robot.driveForward(true, .75, .35)) {
                                robot.resetMotors(true);
                                autoState++;
                            }
                            break;
                        case 2:
                            if(robot.driveForward(true, 1.0, .35)) {
                                robot.resetMotors(true);
                                autoState++;
                            }
                            break;
                    }
                    break;
                case 7: //turn to crypto box
                    if(isBlue){
                        if(robot.RotateIMU(315, 1.5)){
                            robot.resetMotors(true);
                            autoState++;
                        }
                    }
                    else{
                        if(robot.RotateIMU(45, 1.5)){
                            robot.resetMotors(true);
                            autoState++;
                        }
                    }
                    break;
                case 8: //deposit glyph
                    if(robot.driveForward(true, 1.0, .50)) {
                        robot.resetMotors(true);
                        robot.glyphSystem.ReleaseGrip();
                        autoState++;
                    }
                    break;
                case 9: //back away from crypto box
                    if(robot.driveForward(false, .5, .50)){
                        robot.resetMotors(true);
                        autoState++;
                    }
                    break;
                default:
                    robot.resetMotors(true);
                    autoState = 0;
                    active = false;
                    state = 0;
                    break;
            }
        }
        public void autonomous2 (){
    
            switch(autoState){
                case 0: //scan vuforia target and deploy jewel arm
                    robot.jewel.lowerArm();
                    if (autoTimer == 0) autoTimer = futureTime(1.5f); //arm the timer once, not every loop
                    if(autoTimer < System.nanoTime()) {
                        relicCase = getRelicCodex();
                        jewelMatches = robot.doesJewelMatch(isBlue);
                        autoTimer = 0; //clear so the next timed state can re-arm
                        autoState++;
                    }
                    break;
                case 1: //small turn to knock off jewel
                    if ((isBlue && jewelMatches)||(!isBlue && !jewelMatches)){
                        if(robot.RotateIMU(10, .5)){
                            robot.resetMotors(true);
                        }
                    }
                    else{
                        if(robot.RotateIMU(350, .5)){
                            robot.resetMotors(true);
                        }
                    }
                    break;
                case 2: //lift jewel arm
                    robot.jewel.liftArm();
                    if (autoTimer == 0) autoTimer = futureTime(1.5f); //arm the timer once, not every loop
                    if(autoTimer < System.nanoTime()) {
                        jewelMatches = robot.doesJewelMatch(isBlue);
                        autoTimer = 0;
                        autoState++;
                    }
                    break;
                case 3: //turn parallel to the wall
                    if(isBlue){
                        if(robot.RotateIMU(270, 2.0)){
                            robot.resetMotors(true);
                            autoState++;
                        }
                    }
                    else{
                        if(robot.RotateIMU(90, 2.0)){
                            robot.resetMotors(true);
                            autoState++;
                        }
                    }
                    break;
                case 4: //drive off the balance stone
                    if(robot.driveForward(true, .3, .5)) {
                        robot.resetMotors(true);
                        autoState++;
                    }
                    break;
                case 5: //re-orient robot
                    if(isBlue){
                        if(robot.RotateIMU(270, 1.0)){
                            robot.resetMotors(true);
                            autoState++;
                        }
                    }
                    else{
                        if(robot.RotateIMU(90, 1.0)){
                            robot.resetMotors(true);
                            autoState++;
                        }
                    }
                    break;
                case 6: //drive to proper crypto box column based on vuforia target
                    switch (relicCase) {
                        case 0:
                            if(robot.driveStrafe(true, .00, .35)) {
                                robot.resetMotors(true);
                                autoState++;
                            }
                            break;
                        case 1:
                            if(robot.driveStrafe(true, .25, .35)) {
                                robot.resetMotors(true);
                                autoState++;
                            }
                            break;
                        case 2:
                            if(robot.driveStrafe(true, .50, .35)) {
                                robot.resetMotors(true);
                                autoState++;
                            }
                            break;
                    }
                    break;
                case 7: //turn to crypto box
                    if(isBlue){
                        if(robot.RotateIMU(215, 1.5)){
                            robot.resetMotors(true);
                            autoState++;
                        }
                    }
                    else{
                        if(robot.RotateIMU(135, 1.5)){
                            robot.resetMotors(true);
                            autoState++;
                        }
                    }
                    break;
                case 8: //deposit glyph
                    if(robot.driveForward(true, 1.0, .50)) {
                        robot.resetMotors(true);
                        robot.glyphSystem.ReleaseGrip();
                        autoState++;
                    }
                    break;
                case 9: //back away from crypto box
                    if(robot.driveForward(false, .5, .50)){
                        robot.resetMotors(true);
                        autoState++;
                    }
                    break;
                default:
                    robot.resetMotors(true);
                    autoState = 0;
                    active = false;
                    state = 0;
                    break;
            }
        }
    

    Code Fixes and Readability

    Code Fixes and Readability By Tycho

    Task: Make the code more readable

    So, we can't include all the code changes we made today, but they all involved cleaning up our code: removing functions we didn't use, refactoring, adding comments, and making everything more readable for the tournament. We had almost 80k deletions and 80k additions. This marks a turning point in the readability of our code, so that less experienced team members can read it. We went through methodically and commented each function and method for future readability, since we will have to pass the codebase on to next year's team.