
CSC**7 – Introduction To Mobile Robotics

Assignment 1, 15 points. Due: Jan 31, 2024, at 6pm ET

Overview: In this assignment you will write a feedback controller that enables a ground robot, which obeys differential drive dynamics, to follow a wall. The purpose of this assignment is to help you develop experience with the following concepts:

• Robot Operating System: its architecture and its publisher-subscriber model, which abstracts away the details of distributed computation and message passing from the robot programmer. You will also gain familiarity with ROS's main visualization tool, rviz.

• Controlling the yaw of a ground robot via PID control

• Processing 2D laser measurements

• The Gazebo simulator, which is currently one of the most popular simulators in the robotics community.

Setting up VNC:

If you are planning on using the lab machines for this assignment, you will have to use VNC to access the lab PCs since you will need a GUI to run the simulations. If you plan on working from your local machine please skip this section and go to "Setting Up ROS".

We have posted a shared Google spreadsheet containing the assignment of MCS lab machines to students, so please refer to Quercus for instructions on how to get a lab PC for yourself, if you need one. You can also work from your own machine, but note the requirements for setting up ROS listed in the following sections. We might not be able to provide support or advice for the starter code if you run it on machines that do not satisfy those specs.

VNC is already installed on all the systems, but you need to set up your credentials so you can run your own VNC server that only you can access.

1. ssh into your account on your assigned lab PC

2. Type vncpasswd to create a password for yourself to access the PC

3. run vncserver with the following command so it creates the required directories.

vncserver

4. Then kill the server with

vncserver -kill :*

5. Use your favourite text editor to open the .vnc/xstartup file and put the following text in it:

#!/bin/sh

unset SESSION_MANAGER

exec /etc/X11/xinit/xinitrc

6. Now start a vncserver again with the same command:

vncserver


 7. In the output of the command you just ran, make note of the :1 (or whatever number it is, it will be :x where x is an integer)

Here is an example of the output where x=2:

UTORid@dh2026pc09:~$ vncserver

New 'dh2026pc09.utm.utoronto.ca:2 (UTORid)' desktop at :2 on machine dh2026pc09.utm.utoronto.ca

Starting applications specified in /student/UTORid/.vnc/xstartup

Log file is /student/UTORid/.vnc/dh2026pc09.utm.utoronto.ca:2.log

Use xtigervncviewer -SecurityTypes VncAuth -passwd /student/UTORid/.vnc/passwd :2 to connect to the VNC server.

Now on your desktop:

1. Install a VNC viewer (I installed realvnc-vnc-viewer, but any viewer should work)

2. Start an ssh tunnel which the vncserver will go through. We are doing this because the VNC connection by itself is unencrypted and insecure, so it's important that all the packets for your VNC connection pass through the secure ssh tunnel. To start the tunnel, use the command:

ssh -L yyyy:localhost:5**x -C UTORid@[your assigned lab pc].utm.utoronto.ca

where yyyy is a port on your workstation that is not already in use, and x is the :x number that was shown to you when you started vncserver. The '-C' flag enables compression, which should help your remote instance be more responsive.

3. Once you log in, leave the window open and start the VNC viewer on your local PC. The address to connect to is

localhost:yyyy

4. It will prompt for the vncpasswd that you created, and you should see your desktop.

If you run into trouble at any of the above steps, you may want to go through the relevant sections of the following VNC guides to see if they solve your problem before you ask for help.

For Ubuntu 18:

www.digitalocean.com/community/tutorials/how-to-install-and-configure-vnc-on-ubuntu-18-04

For Ubuntu 20:

www.digitalocean.com/community/tutorials/how-to-install-and-configure-vnc-on-ubuntu-20-04

For Ubuntu 22:

www.digitalocean.com/community/tutorials/how-to-install-and-configure-vnc-on-ubuntu-22-04

For Mac OS and Windows:

www.linode.com/docs/applications/remote-desktop/install-vnc-on-ubuntu-18-04/


Setting up ROS: There are multiple ways to install ROS. If you are planning to work from the MCS lab machines in Deerfield Hall (e.g. DH2020, DH2010, DH2026), Ubuntu and ROS are already installed and available to you. If you want to work from your own machine, we assume you are running Ubuntu 20.04 on a computer on which you have sudo access; in that case, please make sure the following are installed:

• Ubuntu 20.04. NOTE: Ubuntu 22.04 is by default not compatible with precompiled ROS 1 noetic packages, so it will not work. Compiling ROS noetic from source should work, but it is quite a long process, and it is not recommended. If you have Ubuntu 22.04 on your personal desktop then you might need to follow these instructions to install ROS 1 precompiled packages https://robostack.github.io/index.html, but we have not tested this. If this does not work, you might need to run ROS nodes inside a Docker container, as shown here https://roboticseabass.com/2021/04/21/docker-and-ros/.

• ROS 1 (noetic) http://wiki.ros.org/noetic/Installation/Ubuntu. If you have trouble installing ROS 1 please post on Piazza.

• ROS 1 simulation dependencies (if you didn't do a full install of ROS you can get them from sudo apt-get install ros-noetic-simulators)

• Any video screen recorder for assignment submission (e.g. recordmydesktop https://wiki.ubuntu.com/ScreenCasts/RecordMyDesktop)

Get and run the starter code: For the purposes of this assignment we have created a few simulated worlds consisting of sequences of walls, as well as a simulated ground robot. A screenshot of what the Gazebo simulation environment looks like is shown in Fig. 1.

  Figure 1: Gazebo environment with walls only on one side of the robot.

We are also providing starter code for this assignment. This starter code is provided in the form of ROS packages in your workspace. Make sure the following lines are at the end of your /home/[username]/.bashrc file:

export ROS_HOME=~/.ros

source /opt/ros/noetic/setup.bash

source ~/csc**7_ws/devel/setup.bash

Then run the following command; it adds the paths to catkin and ROS for your bash terminal, allowing you to use them:

source ~/.bashrc

Then actually create your workspace:

mkdir -p ~/csc**7_ws/src

cd ~/csc**7_ws/src

catkin_init_workspace

In csc**7_ws/src, download the starter code:

git clone https://github.com/florianshkurti/csc**7_winter24.git

Then compile it; note that you have to run catkin_make twice, as it fails the first time:

cd ~/csc**7_ws

catkin_make

source ~/.bashrc


If this last command results in errors, you might need to install additional packages. Please post your questions on Piazza if this is the case and you don't know what to do. On the MCS lab machines this will most likely not be an issue. If you are working from a personal laptop or desktop, however, you might need to install the following packages:

sudo apt-get install ros-noetic-control-toolbox ros-noetic-joystick-drivers

sudo apt-get install ros-noetic-realtime-tools ros-noetic-ros-control

sudo apt-get install ros-noetic-ros-controllers ros-noetic-gazebo-ros-control

If you still get errors after installing them please post a question on Quercus or email us as soon as possible. If compilation goes smoothly, move on to the following.

Your assignment is run using three commands, each in its own terminal. First you create the simulation environment, then you populate it with a virtual Husky robot, and finally you run your code, which pilots the Husky around the environment. Here is what the commands look like at a glance; they are explained in further detail, with some more setup information and checks, below:

roslaunch wall_following_assignment gazebo_world.launch  world_name:=[world_name]

roslaunch wall_following_assignment husky_follower.launch

roslaunch wall_following_assignment wall_follower_[language].launch

Here is a step-by-step walkthrough of the commands. Bring up a world with walls in the Gazebo simulator:

roslaunch wall_following_assignment gazebo_world.launch  world_name:=walls_one_sided

Bring up the robot (Husky) in that world:

roslaunch wall_following_assignment husky_follower.launch

The only error that should appear after these two commands is that no joystick can be found. If another error is printed, please let us know. If these two commands go well, the Gazebo window shown in Fig. 1 above should appear. Then launch rviz, which is the default visualization system for ROS:

rosrun rviz rviz

And then go to File > Open config and select the config file csc**7_ws/src/wall_following_assignment/resources/csc**7.rviz. You should see what's shown in Fig. 2.

The rainbow-colored line is actually a set of points detected by the simulated 2D laser. The other line and frames represent the tree of reference frames that the system is aware of. If all of this goes well then you will be ready to proceed to the next section, which is the essence of the assignment, and to write your controller code to make the robot move. At this point, you can run a simple last check:

rosrun teleop_twist_keyboard teleop_twist_keyboard.py  \

   cmd_vel:=/husky_1/husky_velocity_controller/cmd_vel

This command brings up a node publishing on the topic /husky_1/husky_velocity_controller/cmd_vel and enables you to control the robot through the keyboard. If you can successfully move the Husky around using the keyboard through the command-line interface shown in Fig. 3, then you are ready to proceed with writing your PID controller.

Figure 2: Rviz visualization for the Husky robot in Fig 1. Overlaid on the robot you can see the various reference frames from ROS's tf system.

Implement and test a wall-following controller (15 pts) The input to the robot will be laser scan messages from the robot's simulated laser scanner. See here http://docs.ros.org/api/sensor_msgs/html/msg/LaserScan.html for the ROS message definition. Also, make sure you have understood these ROS tutorials about the publisher-subscriber model of distributed computing: http://wiki.ros.org/ROS/Tutorials. We have provided starter code for Python and C++ in the files

csc**7_ws/src/csc**7_fall22/wall_following_assignment/python/wall_follower.py

csc**7_ws/src/csc**7_fall22/wall_following_assignment/src/wall_follower_node.cpp

csc**7_ws/src/csc**7_fall22/wall_following_assignment/include/wall_following_assignment/pid.h
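
Before editing the starter files, it may help to see how the fields of a LaserScan message map to points around the robot. Here is a minimal Python sketch (not part of the starter code); the field names follow the sensor_msgs/LaserScan definition linked above.

# Minimal sketch: convert a sensor_msgs/LaserScan into Cartesian (x, y) points
# in the laser frame. Not part of the starter code.
import math

def scan_to_points(scan):
    """Return a list of (x, y) points in the laser frame, skipping invalid returns."""
    points = []
    for i, r in enumerate(scan.ranges):
        # NaN/inf readings and out-of-range returns fail this check and are skipped.
        if not (scan.range_min <= r <= scan.range_max):
            continue
        theta = scan.angle_min + i * scan.angle_increment
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points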

Choose your language and edit the appropriate files. Specifically:

[Part A, 2 pts] Compute the cross-track error based on each incoming laser scan and publish it under the topic name /husky_1/cte with the ROS message type std_msgs/Float**, which can be found here http://wiki.ros.org/std_msgs. Tutorials for how to write your own publisher in Python can be found at http://wiki.ros.org/ROS/Tutorials/WritingPublisherSubscriber%28python%29.
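
To make the expected structure concrete, here is a minimal Python sketch of the idea behind Part A. It is not the starter code: the scan topic name, the parameter name desired_distance, the left-side minimum-range heuristic, and the choice of Float64 are all assumptions for illustration; adapt them to the starter code and to the exact Float** type named above.

#!/usr/bin/env python
# Sketch of Part A (illustrative only). Assumptions: the wall distance is taken
# as the closest valid return on the robot's left, the desired distance comes
# from a hypothetical private parameter, and std_msgs/Float64 stands in for
# the Float** type required by the handout.
import rospy
from sensor_msgs.msg import LaserScan
from std_msgs.msg import Float64

class CrossTrackErrorNode(object):
    def __init__(self):
        self.desired_distance = rospy.get_param("~desired_distance", 1.0)
        self.cte_pub = rospy.Publisher("/husky_1/cte", Float64, queue_size=1)
        # Remap "scan" to the Husky's actual laser topic in your launch file.
        self.scan_sub = rospy.Subscriber("scan", LaserScan, self.scan_callback)

    def scan_callback(self, scan):
        # Distance to the wall: closest valid return at positive bearing angles
        # (the robot's left in the laser frame). One simple choice among many.
        left = [r for i, r in enumerate(scan.ranges)
                if scan.angle_min + i * scan.angle_increment > 0.0
                and scan.range_min <= r <= scan.range_max]
        if not left:
            return
        wall_distance = min(left)
        # Cross-track error: deviation from the desired offset to the wall.
        self.cte_pub.publish(Float64(data=wall_distance - self.desired_distance))

if __name__ == "__main__":
    rospy.init_node("cross_track_error_node")
    CrossTrackErrorNode()
    rospy.spin()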

[Part B, 7 pts] Populate the PID class based on the provided API. For each incoming laser scan, issue an angular velocity command to the robot at the topic /husky_velocity_controller/cmd_vel, based on the output of the PID controller. You need to follow the wall on the LEFT of the robot, as it is placed in its initial configuration.
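
The PID class you need to fill in is defined by the starter code (pid.h / the Python starter file), so follow that API. Purely as a reminder of the computation involved, here is a generic Python sketch; the gains, the constant forward speed, and the sign convention of the steering command are assumptions you will have to work out for your own setup.

# Generic PID sketch; the starter code's actual class API may differ, so adapt
# the names rather than copying this verbatim.
from geometry_msgs.msg import Twist

class PID(object):
    def __init__(self, kp, ki, kd):
        self.kp, self.ki, self.kd = kp, ki, kd
        self.integral = 0.0
        self.prev_error = None

    def step(self, error, dt):
        """Return the control output for the current error and time step dt (seconds)."""
        self.integral += error * dt
        derivative = 0.0 if self.prev_error is None else (error - self.prev_error) / dt
        self.prev_error = error
        return self.kp * error + self.ki * self.integral + self.kd * derivative

def make_cmd(pid, cross_track_error, dt, forward_speed=1.0):
    # Constant forward speed; the controller only sets the yaw rate. Whether the
    # sign of the error or of angular.z needs flipping depends on your error
    # definition and on the wall being on the robot's left.
    cmd = Twist()
    cmd.linear.x = forward_speed
    cmd.angular.z = pid.step(cross_track_error, dt)
    return cmd

In your scan callback you would then publish the resulting Twist message on the cmd_vel topic named above.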

[Part C, 2 pts]

Create a dynamic reconfigure server for tweaking your controller's parameters in real time. Follow the instructions presented here: http://wiki.ros.org/dynamic_reconfigure/Tutorials. Don't forget to make your cfg file executable, according to the instructions above (a minimal cfg sketch is shown after this part). Add at least three parameters for the PID gains and run

rosrun rqt_reconfigure rqt_reconfigure

to tweak the PID parameters manually. A set of parameters is considered good enough and the run is considered successful if the robot does not collide with the wall and completes at least 3/4 of a full circuit around the left wall. NOTE: If you have build or import errors during this part let us know on Piazza.
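
As referenced above, here is a minimal sketch of what such a cfg file could look like, following the dynamic_reconfigure tutorial. The file name (e.g. cfg/WallFollower.cfg) and the parameter names, defaults, and ranges are illustrative, not prescribed by the handout.

#!/usr/bin/env python
# Illustrative dynamic_reconfigure cfg file (e.g. cfg/WallFollower.cfg, chmod +x).
# Parameter names, defaults, and ranges are placeholders -- pick your own.
PACKAGE = "wall_following_assignment"
from dynamic_reconfigure.parameter_generator_catkin import *

gen = ParameterGenerator()
#        name  type      level  description          default  min   max
gen.add("kp",  double_t, 0,     "Proportional gain", 1.0,     0.0,  20.0)
gen.add("ki",  double_t, 0,     "Integral gain",     0.0,     0.0,  5.0)
gen.add("kd",  double_t, 0,     "Derivative gain",   0.1,     0.0,  10.0)
exit(gen.generate(PACKAGE, "wall_follower", "WallFollower"))

As the tutorial describes, you would then register the cfg in CMakeLists.txt via generate_dynamic_reconfigure_options and, in your node, create a dynamic_reconfigure.server.Server with the generated WallFollowerConfig type and a callback that copies config.kp, config.ki, and config.kd into your controller (names here assume the sketch above).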


Figure 3: Command-line interface for the node teleop_twist_keyboard.py, which allows you to drive the robot using your keyboard. If you can do this successfully then you are ready to proceed with designing the controller.


[Part D, 4 pts] For each of the two simple wall worlds in wall_following_assignment/worlds/, namely walls_one_sided.world and walls_two_sided.world, do the following:

Launch your wall following controller like this:

   roslaunch  wall_following_assignment wall_follower_python.launch

   or

   roslaunch  wall_following_assignment wall_follower_cpp.launch

according to your language of choice.

• (1pt/world) Use a desktop recording program such as recordMyDesktop or something similar to record a video of your robot while it is following the wall. Be sure to include any failure cases.

• (1pt/world) Record the cross-track error published by your node as follows:

rosbag record /husky_1/cte

and after your robot's run is done, convert the recorded messages in the bag into a text file like so (a small sketch for plotting the result follows the commands below):

        rostopic echo -b file.bag -p /husky_1/cte  > cross_track_error.csv

        rosbag play file.bag
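
If you want to sanity-check the recorded error before submitting, here is a small optional Python sketch for plotting it. It assumes the CSV produced by rostopic echo -p has a header row followed by two columns: a %time stamp (assumed to be in nanoseconds) and field.data.

# Optional sanity check: plot cross_track_error.csv produced by the command above.
# Assumes two columns: %time (nanoseconds) and field.data.
import csv
import matplotlib.pyplot as plt

times, errors = [], []
with open("cross_track_error.csv") as f:
    reader = csv.reader(f)
    next(reader)                                # skip the header row
    for row in reader:
        times.append(float(row[0]) * 1e-9)      # nanoseconds -> seconds
        errors.append(float(row[1]))

t0 = times[0]
plt.plot([t - t0 for t in times], errors)
plt.xlabel("time (s)")
plt.ylabel("cross-track error (m)")
plt.show()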

[Optional Part E, 1 bonus pt] Implement and evaluate the PID self-tuning algorithm that was mentioned in class. Plot the cross-track error as a function of epochs (or iterations), where an epoch is a round of evaluation of a PID parameter setting on the simulator.
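
If you attempt this, one common self-tuning scheme is coordinate descent over the gains ("twiddle"); the algorithm presented in class may differ, so treat the following Python sketch only as a template. The evaluate_run function is a hypothetical helper that runs one simulated episode with the given gains and returns a cost such as the accumulated absolute cross-track error.

# Coordinate-descent ("twiddle") sketch for PID self-tuning. evaluate_run is a
# hypothetical helper: it runs one episode in the simulator with the given gains
# and returns a scalar cost (lower is better). history collects the best cost
# after each parameter update, which you can plot against epochs.
def twiddle(evaluate_run, gains=(1.0, 0.0, 0.1), deltas=(0.5, 0.05, 0.1), tol=1e-3):
    gains, deltas = list(gains), list(deltas)
    best_cost = evaluate_run(gains)
    history = [best_cost]
    while sum(deltas) > tol:
        for i in range(len(gains)):
            gains[i] += deltas[i]
            cost = evaluate_run(gains)
            if cost < best_cost:
                best_cost = cost
                deltas[i] *= 1.1              # improvement: keep it, grow the step
            else:
                gains[i] -= 2 * deltas[i]     # try the opposite direction
                cost = evaluate_run(gains)
                if cost < best_cost:
                    best_cost = cost
                    deltas[i] *= 1.1
                else:
                    gains[i] += deltas[i]     # revert and shrink the step
                    deltas[i] *= 0.9
            history.append(best_cost)
    return gains, history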

[Optional Part F, 2 bonus pts] Evaluate your wall-following ROS node on one of the real racecars that we have available for this course at UTM, shown in Fig. 4.

Note that these racecars as well as a technician to support their operation are only available at UTM. If you are interested in completing this bonus question, please reach out to our robotics lab technician at UTM, Ruthrash Hari (ruthrash.hari@mail.utoronto.ca).

Submission Instructions Assignment submissions will be done on Quercus. You will submit a zip file containing the following:

1. Your csc**7_winter24/wall_following_assignment directory for Parts A, B, and C.

2. Two videos, one demonstrating the robot's navigation in the world walls_one_sided.world and another for the world walls_two_sided.world, as explained in Part D. The videos should be named as follows:

         FirstName_LastName_StudentNumber_walls_one_sided.[mp4/avi]

         FirstName_LastName_StudentNumber_walls_two_sided.[mp4/avi]

Each video should not exceed 10MB.

3. Similarly, two CSV files from Part D. They should be named:

         FirstName_LastName_StudentNumber_walls_one_sided.csv

         FirstName_LastName_StudentNumber_walls_two_sided.csv

Figure 4: CSC**7 Racecar

The point of including a video of your controller in your submission is not just for us to easily examine your code. It’s also so that you can easily show your work later on to classmates/coworkers/employers. It becomes part of your portfolio. It is also worth noting that, due to the ROS abstraction layer, if you want, you could run your feedback controller on a real Husky robot, without major modifications.

4. If you’ve done the optional bonus question E, please submit the resulting figure as:

         FirstName_LastName_StudentNumber_bonus_question_PID_autotuning.[png/jpg]

5. If you’ve done the optional bonus question F, please submit a video (under 20MB) showing the racecar doing wall-following on the 3rd floor of Deerfield Hall at UTM:

         FirstName_LastName_StudentNumber_bonus_question_Racecar.[mp4/avi]

We expect your code to run on the MCS Lab machines. If it does not, you will lose marks. We expect to be able to compile it using catkin_make. Once we compile your code we will run the following:

roslaunch wall_following_assignment gazebo_world.launch  world_name:=[world_name]

roslaunch wall_following_assignment husky_follower.launch

roslaunch wall_following_assignment wall_follower_[language].launch


We will test your code on other worlds similar to the one provided. We will also test it using different desired distances from the wall. The default desired distance is 1 meter, and the default forward speed is 1 m/s. We will test your code with different parameters in that neighbourhood. Your code will also be examined for correctness, style, and efficiency. We recommend that you come by during office hours or email us if you are unsure of your implementation.